hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
b8d3d6eef9923c53e2c72ef3ffa4d51959b6e188 | 263 | py | Python | run_perf_benchmarks.py | alirezajahani60/FabFlee | e2cfdb6efc758281e123f6acc1b06f93176dd756 | [
"BSD-3-Clause"
] | null | null | null | run_perf_benchmarks.py | alirezajahani60/FabFlee | e2cfdb6efc758281e123f6acc1b06f93176dd756 | [
"BSD-3-Clause"
] | null | null | null | run_perf_benchmarks.py | alirezajahani60/FabFlee | e2cfdb6efc758281e123f6acc1b06f93176dd756 | [
"BSD-3-Clause"
] | null | null | null | from base.fab import *
from plugins.FabFlee.FabFlee import *
@task
def flee_get_perf(results_dir):
print("{}/{}".format(env.local_results,results_dir))
my_file = open("{}/{}/perf.log".format(env.local_results,results_dir), 'r')
print(my_file.read())
| 29.222222 | 79 | 0.703422 | 39 | 263 | 4.512821 | 0.589744 | 0.170455 | 0.159091 | 0.238636 | 0.352273 | 0.352273 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114068 | 263 | 8 | 80 | 32.875 | 0.755365 | 0 | 0 | 0 | 0 | 0 | 0.076046 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.285714 | 0 | 0.428571 | 0.285714 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b8dcb2e38617c441c3331cf21108a3eb3fba7a49 | 3,094 | py | Python | test_main.py | zenranda/proj10-gcalfinal | ee32beb3ef570b23883d41f84394b28818e5a07c | [
"Artistic-2.0"
] | null | null | null | test_main.py | zenranda/proj10-gcalfinal | ee32beb3ef570b23883d41f84394b28818e5a07c | [
"Artistic-2.0"
] | 2 | 2021-02-08T20:17:57.000Z | 2021-04-30T20:38:59.000Z | test_main.py | zenranda/proj10-gcalfinal | ee32beb3ef570b23883d41f84394b28818e5a07c | [
"Artistic-2.0"
] | null | null | null | ###
#Various nose tests. If you want to adapt this for your own use, be aware that the start/end block list has a very specific formatting.
###
import get_freebusy
import arrow
from operator import itemgetter
from pymongo import MongoClient
import secrets.admin_secrets
import secrets.client_secrets
MONGO_CLIENT_URL = "mongodb://{}:{}@localhost:{}/{}".format(
secrets.client_secrets.db_user,
secrets.client_secrets.db_user_pw,
secrets.admin_secrets.port,
secrets.client_secrets.db)
try:
dbclient = MongoClient(MONGO_CLIENT_URL)
db = getattr(dbclient, secrets.client_secrets.db)
collection = db.dated
base_size = collection.count() #current size of the db, for comparison later
except:
print("Failure opening database. Is Mongo running? Correct password?")
sys.exit(1)
def test_free_times(): #Given a sample list, check to see if it's getting free/busy blocks correctly
ranges = [['2016-11-20T08:30:00-08:00', '2016-11-20T010:30:00-08:00'], ['2016-11-20T11:00:00-08:00', '2016-11-20T15:00:00-08:00'], ['2016-11-20T16:30:00-08:00', '2016-11-20T19:00:00-08:00'], ['2016-11-24T13:30:00-08:00', '2016-11-24T16:00:00-08:00'], ['2016-11-21T15:00:00-08:00', '2016-11-21T18:30:00-08:00']]
start = '2016-11-20T8:00:00-08:00'
end = '2016-11-23T20:00:00-08:00'
assert get_freebusy.get_freebusy(ranges, start, end) == [['At 2016-11-20 from 08:00:00 to 08:30:00', 'At 2016-11-20 from 10:30:00 to 11:00:00', 'At 2016-11-20 from 15:00:00 to 16:30:00', 'At 2016-11-20 from 19:00:00 to 20:00:00', 'At 2016-11-21 from 08:00:00 to 15:00:00', 'At 2016-11-21 from 18:00:00 to 20:00:00', 'At 2016-11-24 from 08:00:00 to 13:30:00', 'At 2016-11-24 from 16:00:00 to 20:00:00'], ['At 2016-11-20 from 08:30:00 to 10:30:00', 'At 2016-11-20 from 11:00:00 to 15:00:00', 'At 2016-11-20 from 16:30:00 to 19:00:00', 'At 2016-11-21 from 15:00:00 to 18:00:00', 'At 2016-11-24 from 13:30:00 to 16:00:00']]
ranges = []
start = '2016-11-20T12:00:00-08:00'
end = '2016-11-23T20:00:00-08:00'
assert get_freebusy.get_freebusy(ranges, start, end) == [[], []]
def test_overlap(): #tests if the program can handle dates that overlap/intersect
ranges = [['2016-11-22T11:00:00-08:00', '2016-11-22T16:00:00-08:00'], ['2016-11-23T12:00:00-08:00', '2016-11-23T15:30:00-08:00']]
start = '2016-11-20T8:00:00-08:00'
end = '2016-11-23T20:00:00-08:00'
assert get_freebusy.get_freebusy(ranges, start, end) == [['At 2016-11-22 from 08:00:00 to 11:00:00', 'At 2016-11-22 from 16:00:00 to 20:00:00', 'At 2016-11-23 from 08:00:00 to 11:00:00', 'At 2016-11-23 from 18:30:00 to 20:00:00'], ['At 2016-11-22 from 11:00:00 to 16:00:00', 'At 2016-11-23 from 11:00:00 to 18:30:00']]
def test_db():
assert collection != None
collection.insert({"type" : "freebusy", "entry" : [["entry 1"], ["entry 2"]]})
assert base_size < collection.count()
collection.remove({"entry" : [["entry 1"], ["entry 2"]]})
assert base_size == collection.count()
| 55.25 | 624 | 0.649968 | 578 | 3,094 | 3.430796 | 0.230104 | 0.08472 | 0.060514 | 0.085729 | 0.533535 | 0.474029 | 0.378215 | 0.297529 | 0.279375 | 0.239032 | 0 | 0.290058 | 0.170976 | 3,094 | 56 | 625 | 55.25 | 0.483041 | 0.101487 | 0 | 0.128205 | 0 | 0 | 0.509205 | 0.19514 | 0 | 0 | 0 | 0 | 0.153846 | 1 | 0.076923 | false | 0.025641 | 0.153846 | 0 | 0.230769 | 0.025641 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b8e81060803693ffd42ace6d2aecd7a9dd90f046 | 417 | py | Python | testing/resources/test_g.py | tongni1975/processing.py | 0b9ad68a1dc289d5042d1d3b132c13cc157d3f88 | [
"Apache-2.0"
] | null | null | null | testing/resources/test_g.py | tongni1975/processing.py | 0b9ad68a1dc289d5042d1d3b132c13cc157d3f88 | [
"Apache-2.0"
] | 1 | 2021-06-25T15:36:38.000Z | 2021-06-25T15:36:38.000Z | testing/resources/test_g.py | tongni1975/processing.py | 0b9ad68a1dc289d5042d1d3b132c13cc157d3f88 | [
"Apache-2.0"
] | null | null | null | import processing.opengl.PGraphics3D
def setup():
size(100, 100, P3D)
def draw():
# check that "g" is defined and is the expected type
assert(isinstance(g, processing.opengl.PGraphics3D))
# check that the alias cameraMatrix->camera is working as expected
g.camera(0, 0, -10, 0, 0, 0, 0, 1, 0)
assert(g.cameraMatrix.m03 == 0)
assert(g.cameraMatrix.m23 == -10)
print 'OK'
exit()
| 26.0625 | 70 | 0.654676 | 62 | 417 | 4.403226 | 0.548387 | 0.029304 | 0.197802 | 0.14652 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.079511 | 0.215827 | 417 | 15 | 71 | 27.8 | 0.755352 | 0.275779 | 0 | 0 | 0 | 0 | 0.006689 | 0 | 0 | 0 | 0 | 0 | 0.3 | 0 | null | null | 0 | 0.1 | null | null | 0.1 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
7710dc16a8fbe11c81dbff2a20f74da32953814d | 1,550 | py | Python | solutions/python3/problem1265.py | tjyiiuan/LeetCode | abd10944c6a1f7a7f36bd9b6218c511cf6c0f53e | [
"MIT"
] | null | null | null | solutions/python3/problem1265.py | tjyiiuan/LeetCode | abd10944c6a1f7a7f36bd9b6218c511cf6c0f53e | [
"MIT"
] | null | null | null | solutions/python3/problem1265.py | tjyiiuan/LeetCode | abd10944c6a1f7a7f36bd9b6218c511cf6c0f53e | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
1265. Print Immutable Linked List in Reverse
You are given an immutable linked list, print out all values of each node in reverse with the help of the following
interface:
ImmutableListNode: An interface of immutable linked list, you are given the head of the list.
You need to use the following functions to access the linked list (you can't access the ImmutableListNode directly):
ImmutableListNode.printValue(): Print value of the current node.
ImmutableListNode.getNext(): Return the next node.
The input is only given to initialize the linked list internally.
You must solve this problem without modifying the linked list.
In other words, you must operate the linked list using only the mentioned APIs.
Constraints:
The length of the linked list is between [1, 1000].
The value of each node in the linked list is between [-1000, 1000].
Follow up:
Could you solve this problem in:
Constant space complexity?
Linear time complexity and less than linear space complexity?
"""
"""
This is the ImmutableListNode's API interface.
You should not implement it, or speculate about its implementation.
"""
class ImmutableListNode:
def printValue(self) -> None: # print the value of this node.
pass
def getNext(self) -> 'ImmutableListNode': # return the next node.
pass
class Solution:
def printLinkedListInReverse(self, head: 'ImmutableListNode') -> None:
if head is None:
return
self.printLinkedListInReverse(head.getNext())
head.printValue()
| 29.245283 | 116 | 0.74129 | 218 | 1,550 | 5.270642 | 0.426606 | 0.078329 | 0.067885 | 0.020888 | 0.038294 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014412 | 0.194194 | 1,550 | 52 | 117 | 29.807692 | 0.905524 | 0.682581 | 0 | 0.181818 | 0 | 0 | 0.094444 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.272727 | false | 0.181818 | 0 | 0 | 0.545455 | 0.363636 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
77135615dccca76a8c5274c97ffda5de511d3e32 | 87 | py | Python | Python/Sum/main.py | drtierney/hyperskill-problems | b74da993f0ac7bcff1cbd5d89a3a1b06b05f33e0 | [
"MIT"
] | 5 | 2020-08-29T15:15:31.000Z | 2022-03-01T18:22:34.000Z | Python/Sum/main.py | drtierney/hyperskill-problems | b74da993f0ac7bcff1cbd5d89a3a1b06b05f33e0 | [
"MIT"
] | null | null | null | Python/Sum/main.py | drtierney/hyperskill-problems | b74da993f0ac7bcff1cbd5d89a3a1b06b05f33e0 | [
"MIT"
] | 1 | 2020-12-02T11:13:14.000Z | 2020-12-02T11:13:14.000Z | num1 = input()
num2 = input()
num3 = input()
print(int(num1) + int(num2) + int(num3))
| 14.5 | 40 | 0.609195 | 13 | 87 | 4.076923 | 0.461538 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 0.172414 | 87 | 5 | 41 | 17.4 | 0.652778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.25 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
7730282673237879a35fb5efc177b9a2f6881b87 | 514 | py | Python | cheers/settings/prod.py | bahattincinic/cheers | 4443b23ad752c233743d71d1e035b757583a05f3 | [
"MIT"
] | 3 | 2019-03-12T03:38:13.000Z | 2021-03-15T16:48:49.000Z | cheers/settings/prod.py | bahattincinic/cheers | 4443b23ad752c233743d71d1e035b757583a05f3 | [
"MIT"
] | null | null | null | cheers/settings/prod.py | bahattincinic/cheers | 4443b23ad752c233743d71d1e035b757583a05f3 | [
"MIT"
] | 2 | 2022-01-05T11:43:42.000Z | 2022-03-16T00:05:19.000Z | from .base import *
import os
import dj_database_url
ALLOWED_HOSTS = ['*']
DEBUG = False
MIDDLEWARE += [
'whitenoise.middleware.WhiteNoiseMiddleware'
]
INSTALLED_APPS = [
'whitenoise.runserver_nostatic',
] + INSTALLED_APPS
DATABASES = {
'default': dj_database_url.config()
}
EMAIL_USE_TLS = True
EMAIL_HOST = os.environ.get('EMAIL_HOST')
EMAIL_HOST_USER = os.environ.get('EMAIL_HOST_USER')
EMAIL_HOST_PASSWORD = os.environ.get('EMAIL_HOST_PASSWORD')
EMAIL_PORT = os.environ.get('EMAIL_PORT')
| 17.133333 | 59 | 0.743191 | 66 | 514 | 5.454545 | 0.469697 | 0.15 | 0.133333 | 0.188889 | 0.175 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.134241 | 514 | 29 | 60 | 17.724138 | 0.808989 | 0 | 0 | 0 | 0 | 0 | 0.258755 | 0.138132 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.052632 | 0.157895 | 0 | 0.157895 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
7732a52cf70bb1c65299ac307a32800ed068e230 | 854 | py | Python | src/7/accessing_variables_defined_inside_a_closure/example2.py | tuanavu/python-gitbook | 948a05e065b0f40afbfd22f697dff16238163cde | [
"MIT"
] | 14 | 2017-05-20T04:06:46.000Z | 2022-01-23T06:48:45.000Z | src/7/accessing_variables_defined_inside_a_closure/example2.py | tuanavu/python-gitbook | 948a05e065b0f40afbfd22f697dff16238163cde | [
"MIT"
] | 1 | 2021-06-10T20:17:55.000Z | 2021-06-10T20:17:55.000Z | src/7/accessing_variables_defined_inside_a_closure/example2.py | tuanavu/python-gitbook | 948a05e065b0f40afbfd22f697dff16238163cde | [
"MIT"
] | 15 | 2017-03-29T17:57:33.000Z | 2021-08-24T02:20:08.000Z | # Example of faking classes with a closure
import sys
class ClosureInstance:
def __init__(self, locals=None):
if locals is None:
locals = sys._getframe(1).f_locals
# Update instance dictionary with callables
self.__dict__.update((key,value) for key, value in locals.items()
if callable(value) )
# Redirect special methods
def __len__(self):
return self.__dict__['__len__']()
# Example use
def Stack():
items = []
def push(item):
items.append(item)
def pop():
return items.pop()
def __len__():
return len(items)
return ClosureInstance()
if __name__ == '__main__':
s = Stack()
print(s)
s.push(10)
s.push(20)
s.push('Hello')
print(len(s))
print(s.pop())
print(s.pop())
print(s.pop())
| 20.829268 | 73 | 0.580796 | 106 | 854 | 4.358491 | 0.471698 | 0.051948 | 0.058442 | 0.060606 | 0.058442 | 0.058442 | 0 | 0 | 0 | 0 | 0 | 0.008333 | 0.297424 | 854 | 40 | 74 | 21.35 | 0.761667 | 0.139344 | 0 | 0.107143 | 0 | 0 | 0.027397 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.214286 | false | 0 | 0.035714 | 0.107143 | 0.428571 | 0.178571 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
7759ab5bb6b2419c0cf09ba0f8c0454651c021e4 | 3,618 | py | Python | src/morphforge/simulation/neuron/core/neuronsimulationenvironment.py | mikehulluk/morphforge | 2a95096f144ed4ea487decb735ce66706357d3c7 | [
"BSD-2-Clause"
] | 1 | 2021-01-21T11:31:59.000Z | 2021-01-21T11:31:59.000Z | src/morphforge/simulation/neuron/core/neuronsimulationenvironment.py | mikehulluk/morphforge | 2a95096f144ed4ea487decb735ce66706357d3c7 | [
"BSD-2-Clause"
] | null | null | null | src/morphforge/simulation/neuron/core/neuronsimulationenvironment.py | mikehulluk/morphforge | 2a95096f144ed4ea487decb735ce66706357d3c7 | [
"BSD-2-Clause"
] | null | null | null | #!/usr/bin/python
# -*- coding: utf-8 -*-
# ---------------------------------------------------------------------
# Copyright (c) 2012 Michael Hull.
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions
# are met:
#
# - Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# - Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in
# the documentation and/or other materials provided with the
# distribution.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
# HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
# ----------------------------------------------------------------------
from morphforge.core import PluginDict
from morphforge.simulation.base import SimulationEnvironment
from morphforge.simulation.base import CurrentClampStepChange
from morphforge.simulation.base import VoltageClampStepChange
from morphforge.simulation.neuron.core import NEURONSimulationSettings
from morphforge.simulation.neuron.networks import NEURONGapJunction
from morphforge.simulation.neuron.core import NEURONCell
from morphforge.simulation.neuron.core import NEURONSimulation
class NEURONEnvironment(SimulationEnvironment):
_env_name = "NEURON"
def Simulation(self, **kwargs):
return NEURONSimulation(environment=self, **kwargs)
def Cell(self, **kwargs):
return NEURONCell(**kwargs)
def SimulationSettings(self, **kwargs):
return NEURONSimulationSettings(**kwargs)
channels = PluginDict()
presynapticmechanisms = PluginDict()
synapse_psm_template_type = PluginDict()
currentclamps = PluginDict()
voltageclamps = PluginDict()
@classmethod
def Channel(cls, chltype, **kwargs):
chl = cls.channels.get_plugin(chltype)
return chl(**kwargs)
@classmethod
def SynapticTrigger(cls, triggertype, **kwargs):
trigger_functor = cls.presynapticmechanisms.get_plugin(triggertype)
return trigger_functor(**kwargs)
def PostSynapticMechTemplate(self, psm_type, **kwargs):
tmpl_functor = self.synapse_psm_template_type.get_plugin(psm_type)
return tmpl_functor(**kwargs)
def CurrentClamp(self, protocol=CurrentClampStepChange, **kwargs):
current_clamp = self.currentclamps.get_plugin(protocol)
return current_clamp(**kwargs)
def VoltageClamp(self, protocol=VoltageClampStepChange, **kwargs):
voltage_clamp = self.voltageclamps.get_plugin(protocol)
return voltage_clamp(**kwargs)
def GapJunction(self, **kwargs):
return NEURONGapJunction(**kwargs)
def Synapse(self, **kwargs):
from morphforge.simulation.neuron.networks import NEURONSynapse
return NEURONSynapse(**kwargs)
| 37.6875 | 75 | 0.725263 | 399 | 3,618 | 6.518797 | 0.423559 | 0.048443 | 0.073818 | 0.05767 | 0.189927 | 0.132257 | 0.052288 | 0.052288 | 0.052288 | 0.052288 | 0 | 0.001654 | 0.164456 | 3,618 | 95 | 76 | 38.084211 | 0.85875 | 0.408789 | 0 | 0.046512 | 0 | 0 | 0.002848 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.232558 | false | 0 | 0.209302 | 0.093023 | 0.837209 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
91f45538afa3b794621cc7c469da195bbca2956a | 627 | py | Python | samples/cordic/cordic_golden.py | hj424/heterocl | e51b8f7f65ae6ad55c0c2426ab7192c3d8f6702b | [
"Apache-2.0"
] | 236 | 2019-05-19T01:48:11.000Z | 2022-03-31T09:03:54.000Z | samples/cordic/cordic_golden.py | hj424/heterocl | e51b8f7f65ae6ad55c0c2426ab7192c3d8f6702b | [
"Apache-2.0"
] | 248 | 2019-05-17T19:18:36.000Z | 2022-03-30T21:25:47.000Z | samples/cordic/cordic_golden.py | hj424/heterocl | e51b8f7f65ae6ad55c0c2426ab7192c3d8f6702b | [
"Apache-2.0"
] | 85 | 2019-05-17T20:09:27.000Z | 2022-02-28T20:19:00.000Z | import numpy as np
golden = np.array([
[100.0, 100.0],
[206.226840616, 179.610387213],
[1190.25124092, 1197.15702025],
[1250.76639667, 1250.3933971],
[1261.76760093, 1250.17718583],
[1237.4846285, 1237.56490579],
[1273.56730356, 1266.82141705],
[1272.899992, 1259.92589118],
[1.17000308922e-06, 1.21115462165e-06],
[4.69048419035e-08, 5.61093645301e-08],
[1.50244060584e-09, 2.44292250731e-09],
[8.47391624349e-11, 1.15593790738e-10],
[5.10649970307e-12, 4.80114236959e-12],
[8.34326950279e-13, 4.1368839091e-13],
[3.66142109259e-14, 4.95319932219e-14],
[8.20801944862e-15, 4.94154683061e-14]])
| 31.35 | 41 | 0.700159 | 87 | 627 | 5.045977 | 0.735632 | 0.018223 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.710432 | 0.113238 | 627 | 19 | 42 | 33 | 0.079137 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.055556 | 0 | 0.055556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
91f4996456aabf6bbe1ac697a26d604a9883879d | 98 | py | Python | src/game_client/conf.py | adapiekarska/network-pong | c6a88b66570f26aea9c9976eb16953c480b846ec | [
"MIT"
] | 2 | 2018-11-14T17:25:24.000Z | 2019-12-09T17:57:30.000Z | src/game_client/conf.py | adapiekarska/network-pong | c6a88b66570f26aea9c9976eb16953c480b846ec | [
"MIT"
] | null | null | null | src/game_client/conf.py | adapiekarska/network-pong | c6a88b66570f26aea9c9976eb16953c480b846ec | [
"MIT"
] | null | null | null | """
User configuration file for the client.
"""
SERVER_ADDRESS = "127.0.0.1"
SERVER_PORT = 50000
| 14 | 39 | 0.704082 | 15 | 98 | 4.466667 | 0.866667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.13253 | 0.153061 | 98 | 6 | 40 | 16.333333 | 0.674699 | 0.397959 | 0 | 0 | 0 | 0 | 0.176471 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
91f69a518f7a745cba0d44a46ab85227b8ebc8dd | 636 | py | Python | 8. str_range/test_solution.py | dcragusa/WeeklyPythonExerciseB2 | a7da3830e27891060dcfb0804c81f52b1f250ce8 | [
"MIT"
] | null | null | null | 8. str_range/test_solution.py | dcragusa/WeeklyPythonExerciseB2 | a7da3830e27891060dcfb0804c81f52b1f250ce8 | [
"MIT"
] | null | null | null | 8. str_range/test_solution.py | dcragusa/WeeklyPythonExerciseB2 | a7da3830e27891060dcfb0804c81f52b1f250ce8 | [
"MIT"
] | null | null | null | from solution import str_range
def test_same_start_end():
r = str_range('a', 'a')
assert iter(r) == r
assert ''.join(list(r)) == 'a'
def test_simple():
r = str_range('a', 'c')
assert ''.join(list(r)) == 'abc'
def test_simple_with_step():
r = str_range('a', 'c', 2)
assert ''.join(list(r)) == 'ac'
def test_simple_with_negativestep():
r = str_range('c', 'a', -2)
assert ''.join(list(r)) == 'ca'
def test_hebrew():
r = str_range('א', 'ז', 2)
assert ''.join(list(r)) == 'אגהז'
test_same_start_end()
test_simple()
test_simple_with_step()
test_simple_with_negativestep()
test_hebrew()
| 18.171429 | 37 | 0.606918 | 99 | 636 | 3.616162 | 0.292929 | 0.134078 | 0.125698 | 0.209497 | 0.195531 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005837 | 0.191824 | 636 | 34 | 38 | 18.705882 | 0.690661 | 0 | 0 | 0 | 0 | 0 | 0.034591 | 0 | 0 | 0 | 0 | 0 | 0.272727 | 1 | 0.227273 | false | 0 | 0.045455 | 0 | 0.272727 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
91fa4adf813afeff4ee8cff082ebb2bd99d4723f | 269 | py | Python | Python3/Coursera/003_quadratic_roots/solution.py | neon1ks/Study | 5d40171cf3bf5e8d3a95539e91f5afec54d1daf3 | [
"MIT"
] | null | null | null | Python3/Coursera/003_quadratic_roots/solution.py | neon1ks/Study | 5d40171cf3bf5e8d3a95539e91f5afec54d1daf3 | [
"MIT"
] | null | null | null | Python3/Coursera/003_quadratic_roots/solution.py | neon1ks/Study | 5d40171cf3bf5e8d3a95539e91f5afec54d1daf3 | [
"MIT"
] | null | null | null | import sys
import math
if __name__ == '__main__':
a = int(sys.argv[1])
b = int(sys.argv[2])
c = int(sys.argv[3])
d = b * b - 4 * a * c
x1 = (-b + math.sqrt(d)) / (2 * a)
x2 = (-b - math.sqrt(d)) / (2 * a)
print(int(x1))
print(int(x2))
| 19.214286 | 38 | 0.472119 | 48 | 269 | 2.479167 | 0.416667 | 0.151261 | 0.252101 | 0.168067 | 0.201681 | 0.201681 | 0 | 0 | 0 | 0 | 0 | 0.053476 | 0.304833 | 269 | 13 | 39 | 20.692308 | 0.582888 | 0 | 0 | 0 | 0 | 0 | 0.02974 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.181818 | 0 | 0.181818 | 0.181818 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6221e9086b65f59966870eca97102d109aabb9a1 | 3,458 | py | Python | RestPy/ixnetwork_restpy/testplatform/sessions/ixnetwork/vport/interface/dhcpv4discoveredinfo/dhcpv4discoveredinfo.py | ralfjon/IxNetwork | c0c834fbc465af69c12fd6b7cee4628baba7fff1 | [
"MIT"
] | null | null | null | RestPy/ixnetwork_restpy/testplatform/sessions/ixnetwork/vport/interface/dhcpv4discoveredinfo/dhcpv4discoveredinfo.py | ralfjon/IxNetwork | c0c834fbc465af69c12fd6b7cee4628baba7fff1 | [
"MIT"
] | null | null | null | RestPy/ixnetwork_restpy/testplatform/sessions/ixnetwork/vport/interface/dhcpv4discoveredinfo/dhcpv4discoveredinfo.py | ralfjon/IxNetwork | c0c834fbc465af69c12fd6b7cee4628baba7fff1 | [
"MIT"
] | null | null | null |
# Copyright 1997 - 2018 by IXIA Keysight
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"),
# to deal in the Software without restriction, including without limitation
# the rights to use, copy, modify, merge, publish, distribute, sublicense,
# and/or sell copies of the Software, and to permit persons to whom the
# Software is furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
from ixnetwork_restpy.base import Base
from ixnetwork_restpy.files import Files
class DhcpV4DiscoveredInfo(Base):
"""The DhcpV4DiscoveredInfo class encapsulates a required dhcpV4DiscoveredInfo node in the ixnetwork hierarchy.
An instance of the class can be obtained by accessing the DhcpV4DiscoveredInfo property from a parent instance.
The internal properties list will contain one and only one set of properties which is populated when the property is accessed.
"""
_SDM_NAME = 'dhcpV4DiscoveredInfo'
def __init__(self, parent):
super(DhcpV4DiscoveredInfo, self).__init__(parent)
@property
def Gateway(self):
"""(Read only) A learned/allocated IPv4 Gateway address for this interface on the router that connects to the network segment on which the source host is located.
Returns:
str
"""
return self._get_attribute('gateway')
@property
def Ipv4Address(self):
"""(Read only) A learned/allocated IPv4 address for this interface,
Returns:
str
"""
return self._get_attribute('ipv4Address')
@property
def Ipv4Mask(self):
"""(Read only) A 32-bit address mask used in IP to indicate the bits of an IP address that are being used for the subnet address.
Returns:
number
"""
return self._get_attribute('ipv4Mask')
@property
def IsDhcpV4LearnedInfoRefreshed(self):
"""(Read Only) When true, the DHCPv4 discovered information is refreshed automatically.
Returns:
bool
"""
return self._get_attribute('isDhcpV4LearnedInfoRefreshed')
@property
def LeaseDuration(self):
"""(Read Only) The user-specified value and the lease timer (from the DHCP Server) are compared. The lowest value is used as the release/renew timer. After this time period has elapsed, the address will be renewed.
Returns:
number
"""
return self._get_attribute('leaseDuration')
@property
def ProtocolInterface(self):
"""(Read only) An Ixia protocol interface that is negotiating with the DHCP Server.
Returns:
str(None|/api/v1/sessions/1/ixnetwork/vport?deepchild=interface)
"""
return self._get_attribute('protocolInterface')
@property
def Tlv(self):
"""(Read only) Type Length Value for DHCPv4.
Returns:
list(dict(arg1:number,arg2:str))
"""
return self._get_attribute('tlv')
| 34.58 | 217 | 0.738577 | 468 | 3,458 | 5.401709 | 0.440171 | 0.03481 | 0.033228 | 0.060918 | 0.089003 | 0.079114 | 0.026108 | 0 | 0 | 0 | 0 | 0.01073 | 0.19144 | 3,458 | 99 | 218 | 34.929293 | 0.893419 | 0.703297 | 0 | 0.259259 | 0 | 0 | 0.123986 | 0.032445 | 0 | 0 | 0 | 0 | 0 | 1 | 0.296296 | false | 0 | 0.074074 | 0 | 0.703704 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
6224e1d3f02d7b9dda37a271e14789ceeccd2dd5 | 574 | py | Python | code_hashers/attendant.py | ksajan/iis-ms-del | 6339f639d674fedb88454b43dcd64493be2a4558 | [
"MIT"
] | 2 | 2019-12-24T13:32:22.000Z | 2019-12-26T11:26:08.000Z | code_hashers/attendant.py | ksajan/iis-ms-del | 6339f639d674fedb88454b43dcd64493be2a4558 | [
"MIT"
] | 1 | 2019-12-26T07:53:34.000Z | 2019-12-26T07:53:34.000Z | code_hashers/attendant.py | ksajan/iis-ms-del | 6339f639d674fedb88454b43dcd64493be2a4558 | [
"MIT"
] | 35 | 2019-12-22T05:05:43.000Z | 2019-12-22T07:16:56.000Z | class ParkingLot:
def __init__(self, username, latitude, longitude, totalSpace, costHour):
self.username = username
self.latitude = latitude
self.longitude = longitude
self.totalSpace = totalSpace
self.availableSpace = totalSpace
self.costHour = costHour
def getSpace(self):
return self.availableSpace
def setBook(self):
self.availableSpace -= 1
class signUp:
def __init__(self, username, password):
self.username = username
self.password = password
# def getDetails():
| 24.956522 | 76 | 0.651568 | 55 | 574 | 6.654545 | 0.327273 | 0.131148 | 0.060109 | 0.103825 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002387 | 0.270035 | 574 | 22 | 77 | 26.090909 | 0.871122 | 0.029617 | 0 | 0.125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0.125 | 0 | 0.0625 | 0.4375 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
622f91861f9e766601a659b3e6368f910237afb0 | 159 | py | Python | Source Codes Testing/list1.py | urstrulykkr/Python | 098ed5d391f0e62d4950ca80cc57a032c65d1637 | [
"MIT"
] | null | null | null | Source Codes Testing/list1.py | urstrulykkr/Python | 098ed5d391f0e62d4950ca80cc57a032c65d1637 | [
"MIT"
] | null | null | null | Source Codes Testing/list1.py | urstrulykkr/Python | 098ed5d391f0e62d4950ca80cc57a032c65d1637 | [
"MIT"
] | null | null | null | lst1=list()
lst1.append('K')
lst1.append('A')
lst2=['U', 'S', 'H', 'I', 'K']
print(lst1+lst2)
print(lst2[0] +lst2[1]+lst1[1])
for i in lst1+lst2:
print(i)
| 14.454545 | 32 | 0.578616 | 31 | 159 | 2.967742 | 0.483871 | 0.217391 | 0.282609 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.100719 | 0.125786 | 159 | 10 | 33 | 15.9 | 0.561151 | 0 | 0 | 0 | 0 | 0 | 0.044025 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.375 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
623f19da861cce44fb9bf0964c673c92b5ac9b2f | 713 | py | Python | learn_big_data_on_aws/config.py | MacHu-GWU/learn_big_data_on_aws-project | 0db78c35a1712fdd905763fd299663982e44601c | [
"MIT"
] | null | null | null | learn_big_data_on_aws/config.py | MacHu-GWU/learn_big_data_on_aws-project | 0db78c35a1712fdd905763fd299663982e44601c | [
"MIT"
] | null | null | null | learn_big_data_on_aws/config.py | MacHu-GWU/learn_big_data_on_aws-project | 0db78c35a1712fdd905763fd299663982e44601c | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from s3pathlib import S3Path
class Config:
aws_profile = "aws_data_lab_sanhe"
aws_region = "us-east-2"
# where you store data, artifacts
bucket = "aws-data-lab-sanhe-for-everything-us-east-2"
# s3 folder for data lake
dataset_prefix = "poc/2022-02-26-learn_big_data_on_aws/dataset"
# s3 folder for athena results
athena_result_prefix = "athena/results"
# glue catalog database name
dbname = "poc"
@property
def s3path_dataset_prefix(self):
return S3Path(self.bucket, self.dataset_prefix)
@property
def s3path_athena_result_prefix(self):
return S3Path(self.bucket, self.athena_result_prefix)
config = Config()
| 24.586207 | 67 | 0.695652 | 98 | 713 | 4.857143 | 0.5 | 0.081933 | 0.113445 | 0.063025 | 0.151261 | 0.151261 | 0.151261 | 0 | 0 | 0 | 0 | 0.033569 | 0.206171 | 713 | 28 | 68 | 25.464286 | 0.80742 | 0.186536 | 0 | 0.133333 | 0 | 0 | 0.228223 | 0.151568 | 0 | 0 | 0 | 0 | 0 | 1 | 0.133333 | false | 0 | 0.066667 | 0.133333 | 0.8 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
6244eae08f282f33089c904acc046111406cab02 | 234 | py | Python | ex15.py | phyupyarko/python-exercises | f231ca8c8c1f2614bb166cc72ce45860eff88c1d | [
"MIT"
] | null | null | null | ex15.py | phyupyarko/python-exercises | f231ca8c8c1f2614bb166cc72ce45860eff88c1d | [
"MIT"
] | null | null | null | ex15.py | phyupyarko/python-exercises | f231ca8c8c1f2614bb166cc72ce45860eff88c1d | [
"MIT"
] | null | null | null | from sys import argv
script, filename=argv
txt = open(filename)
print(f"Here's your file {filename}:")
print(txt.read())
print("Type the filename again:")
file_again = input("> ")
txt_again = open (file_again)
print(txt_again.read())
| 23.4 | 38 | 0.726496 | 37 | 234 | 4.486486 | 0.513514 | 0.156627 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115385 | 234 | 9 | 39 | 26 | 0.801932 | 0 | 0 | 0 | 0 | 0 | 0.230769 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.111111 | 0 | 0.111111 | 0.444444 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
6250a79068b77c4892032d50b57910bd5cac5d15 | 42,245 | py | Python | uhd_restpy/testplatform/sessions/ixnetwork/topology/ospfv3_c029fd7cd4a9e9897b7b4e4547458751.py | Vibaswan/ixnetwork_restpy | 239fedc7050890746cbabd71ea1e91c68d9e5cad | [
"MIT"
] | null | null | null | uhd_restpy/testplatform/sessions/ixnetwork/topology/ospfv3_c029fd7cd4a9e9897b7b4e4547458751.py | Vibaswan/ixnetwork_restpy | 239fedc7050890746cbabd71ea1e91c68d9e5cad | [
"MIT"
] | null | null | null | uhd_restpy/testplatform/sessions/ixnetwork/topology/ospfv3_c029fd7cd4a9e9897b7b4e4547458751.py | Vibaswan/ixnetwork_restpy | 239fedc7050890746cbabd71ea1e91c68d9e5cad | [
"MIT"
] | null | null | null | # MIT LICENSE
#
# Copyright 1997 - 2020 by IXIA Keysight
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"),
# to deal in the Software without restriction, including without limitation
# the rights to use, copy, modify, merge, publish, distribute, sublicense,
# and/or sell copies of the Software, and to permit persons to whom the
# Software is furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
from uhd_restpy.base import Base
from uhd_restpy.files import Files
class Ospfv3(Base):
"""Ospfv3 Interface level Configuration
The Ospfv3 class encapsulates a list of ospfv3 resources that are managed by the user.
A list of resources can be retrieved from the server using the Ospfv3.find() method.
The list can be managed by using the Ospfv3.add() and Ospfv3.remove() methods.
"""
__slots__ = ()
_SDM_NAME = 'ospfv3'
_SDM_ATT_MAP = {
'Active': 'active',
'AdjSID': 'adjSID',
'AreaId': 'areaId',
'AreaIdIp': 'areaIdIp',
'AuthAlgo': 'authAlgo',
'BFlag': 'bFlag',
'ConnectedVia': 'connectedVia',
'Count': 'count',
'DeadInterval': 'deadInterval',
'DemandCircuit': 'demandCircuit',
'DescriptiveName': 'descriptiveName',
'EnableAdjSID': 'enableAdjSID',
'EnableAuthentication': 'enableAuthentication',
'EnableBfdRegistration': 'enableBfdRegistration',
'EnableFastHello': 'enableFastHello',
'EnableIgnoreDbDescMtu': 'enableIgnoreDbDescMtu',
'Errors': 'errors',
'ExternalCapability': 'externalCapability',
'GFlag': 'gFlag',
'HelloInterval': 'helloInterval',
'HelloMultiplier': 'helloMultiplier',
'InstanceId': 'instanceId',
'Key': 'key',
'LFlag': 'lFlag',
'LinkMetric': 'linkMetric',
'LocalRouterID': 'localRouterID',
'Multiplier': 'multiplier',
'Name': 'name',
'NetworkType': 'networkType',
'NssaCapability': 'nssaCapability',
'Ospfv3IfaceState': 'ospfv3IfaceState',
'Ospfv3NeighborState': 'ospfv3NeighborState',
'PFlag': 'pFlag',
'Priority': 'priority',
'Router': 'router',
'SaId': 'saId',
'SessionInfo': 'sessionInfo',
'SessionStatus': 'sessionStatus',
'StackedLayers': 'stackedLayers',
'StateCounts': 'stateCounts',
'Status': 'status',
'TypeAreaId': 'typeAreaId',
'V6': 'v6',
'VFlag': 'vFlag',
'Weight': 'weight',
}
def __init__(self, parent):
super(Ospfv3, self).__init__(parent)
@property
def Connector(self):
"""
Returns
-------
- obj(uhd_restpy.testplatform.sessions.ixnetwork.topology.connector_d0d942810e4010add7642d3914a1f29b.Connector): An instance of the Connector class
Raises
------
- ServerError: The server has encountered an uncategorized error condition
"""
from uhd_restpy.testplatform.sessions.ixnetwork.topology.connector_d0d942810e4010add7642d3914a1f29b import Connector
return Connector(self)
@property
def LearnedInfo(self):
"""
Returns
-------
- obj(uhd_restpy.testplatform.sessions.ixnetwork.topology.learnedinfo.learnedinfo_ff4d5e5643a63bccb40b6cf64fc58100.LearnedInfo): An instance of the LearnedInfo class
Raises
------
- ServerError: The server has encountered an uncategorized error condition
"""
from uhd_restpy.testplatform.sessions.ixnetwork.topology.learnedinfo.learnedinfo_ff4d5e5643a63bccb40b6cf64fc58100 import LearnedInfo
return LearnedInfo(self)
@property
def Active(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): Activate/Deactivate Configuration
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['Active']))
@property
def AdjSID(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): An Adjacency Segment Identifier (Adj-SID) represents a router adjacency in Segment Routing
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['AdjSID']))
@property
def AreaId(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): OSPFv3 Area ID for a non-connected interface, displayed in Interger format
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['AreaId']))
@property
def AreaIdIp(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): OSPFv3 Area ID for a non-connected interface, displayed in IP Address format
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['AreaIdIp']))
@property
def AuthAlgo(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): Authentication Algorithms
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['AuthAlgo']))
@property
def BFlag(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): B Flag: Backup Flag: If set, the Adj-SID refers to an adjacency that is eligible for protection
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['BFlag']))
@property
def ConnectedVia(self):
"""DEPRECATED
Returns
-------
- list(str[None | /api/v1/sessions/1/ixnetwork/topology/.../*]): List of layers this layer is used to connect with to the wire.
"""
return self._get_attribute(self._SDM_ATT_MAP['ConnectedVia'])
@ConnectedVia.setter
def ConnectedVia(self, value):
self._set_attribute(self._SDM_ATT_MAP['ConnectedVia'], value)
@property
def Count(self):
"""
Returns
-------
- number: Number of elements inside associated multiplier-scaled container object, e.g. number of devices inside a Device Group.
"""
return self._get_attribute(self._SDM_ATT_MAP['Count'])
@property
def DeadInterval(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): Dead Interval
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['DeadInterval']))
@property
def DemandCircuit(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): Option bit 5
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['DemandCircuit']))
@property
def DescriptiveName(self):
"""
Returns
-------
- str: Longer, more descriptive name for element. It's not guaranteed to be unique like -name-, but may offer more context.
"""
return self._get_attribute(self._SDM_ATT_MAP['DescriptiveName'])
@property
def EnableAdjSID(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): Makes the Adjacency Segment Identifier (Adj-SID) available
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['EnableAdjSID']))
@property
def EnableAuthentication(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): Enable Authentication
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['EnableAuthentication']))
@property
def EnableBfdRegistration(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): Enable BFD Registration
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['EnableBfdRegistration']))
@property
def EnableFastHello(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): Enable Fast Hello
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['EnableFastHello']))
@property
def EnableIgnoreDbDescMtu(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): Ignore DB-Desc MTU
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['EnableIgnoreDbDescMtu']))
@property
def Errors(self):
"""
Returns
-------
- list(dict(arg1:str[None | /api/v1/sessions/1/ixnetwork//.../*],arg2:list[str])): A list of errors that have occurred
"""
return self._get_attribute(self._SDM_ATT_MAP['Errors'])
@property
def ExternalCapability(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): Option bit 1
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['ExternalCapability']))
@property
def GFlag(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): G-Flag: Group Flag: If set, the G-Flag indicates that the Adj-SID refers to a group of adjacencies where it may be assigned
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['GFlag']))
@property
def HelloInterval(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): Hello Interval
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['HelloInterval']))
@property
def HelloMultiplier(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): Hello Multiplier
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['HelloMultiplier']))
@property
def InstanceId(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): Instance ID
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['InstanceId']))
@property
def Key(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): Key
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['Key']))
@property
def LFlag(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): L-Flag: Local Flag. If set, then the value/index carried by the SID has local significance
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['LFlag']))
@property
def LinkMetric(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): Link Metric
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['LinkMetric']))
@property
def LocalRouterID(self):
"""
Returns
-------
- list(str): Router ID
"""
return self._get_attribute(self._SDM_ATT_MAP['LocalRouterID'])
@property
def Multiplier(self):
"""
Returns
-------
- number: Number of layer instances per parent instance (multiplier)
"""
return self._get_attribute(self._SDM_ATT_MAP['Multiplier'])
@Multiplier.setter
def Multiplier(self, value):
self._set_attribute(self._SDM_ATT_MAP['Multiplier'], value)
@property
def Name(self):
"""
Returns
-------
- str: Name of NGPF element, guaranteed to be unique in Scenario
"""
return self._get_attribute(self._SDM_ATT_MAP['Name'])
@Name.setter
def Name(self, value):
self._set_attribute(self._SDM_ATT_MAP['Name'], value)
@property
def NetworkType(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): Network Type
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['NetworkType']))
@property
def NssaCapability(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): Option bit 3
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['NssaCapability']))
@property
def Ospfv3IfaceState(self):
"""
Returns
-------
- list(str[backup | down | dr | drOther | pointToPoint | unrecognized | waiting]): Logs additional information about the Interface State
"""
return self._get_attribute(self._SDM_ATT_MAP['Ospfv3IfaceState'])
@property
def Ospfv3NeighborState(self):
"""
Returns
-------
- list(str[attempt | down | exchange | exStart | full | init | loading | multiNeighbor | none | twoWay]): Logs additional information about the Neighbor State
"""
return self._get_attribute(self._SDM_ATT_MAP['Ospfv3NeighborState'])
@property
def PFlag(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): P-Flag:Persistent Flag: If set, the SID is persistently allocated. The SID value remains consistent across router restart and session/interface flap
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['PFlag']))
@property
def Priority(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): Priority (when DR/BDR)
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['Priority']))
@property
def Router(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): Option bit 4
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['Router']))
@property
def SaId(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): Security Association ID
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['SaId']))
@property
def SessionInfo(self):
"""
Returns
-------
- list(str[ifaceSessInfoAllNbrIn2Way | ifaceSessInfoAllNbrInattempt | ifaceSessInfoAllNbrInDown | ifaceSessInfoAllNbrInExchange | ifaceSessInfoAllNbrInExStart | ifaceSessInfoAllNbrInInit | ifaceSessInfoAllNbrInLoading | ifaceSessInfoFsmNotStarted | ifaceSessInfoSameNbrId | iPAddressNotRcvd | none]): Logs additional information about the session state
"""
return self._get_attribute(self._SDM_ATT_MAP['SessionInfo'])
@property
def SessionStatus(self):
"""
Returns
-------
- list(str[down | notStarted | up]): Current state of protocol session: Not Started - session negotiation not started, the session is not active yet. Down - actively trying to bring up a protocol session, but negotiation is didn't successfully complete (yet). Up - session came up successfully.
"""
return self._get_attribute(self._SDM_ATT_MAP['SessionStatus'])
@property
def StackedLayers(self):
"""
Returns
-------
- list(str[None | /api/v1/sessions/1/ixnetwork/topology/.../*]): List of secondary (many to one) child layer protocols
"""
return self._get_attribute(self._SDM_ATT_MAP['StackedLayers'])
@StackedLayers.setter
def StackedLayers(self, value):
self._set_attribute(self._SDM_ATT_MAP['StackedLayers'], value)
@property
def StateCounts(self):
"""
Returns
-------
- dict(total:number,notStarted:number,down:number,up:number): A list of values that indicates the total number of sessions, the number of sessions not started, the number of sessions down and the number of sessions that are up
"""
return self._get_attribute(self._SDM_ATT_MAP['StateCounts'])
@property
def Status(self):
"""
Returns
-------
- str(configured | error | mixed | notStarted | started | starting | stopping): Running status of associated network element. Once in Started state, protocol sessions will begin to negotiate.
"""
return self._get_attribute(self._SDM_ATT_MAP['Status'])
@property
def TypeAreaId(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): Area ID Type
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['TypeAreaId']))
@property
def V6(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): Option bit 0
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['V6']))
@property
def VFlag(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): V-Flag: Value flag. If set, then the SID carries an absolute value label value
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['VFlag']))
@property
def Weight(self):
"""
Returns
-------
- obj(uhd_restpy.multivalue.Multivalue): Weight of the SID for the purpose of load balancing
"""
from uhd_restpy.multivalue import Multivalue
return Multivalue(self, self._get_attribute(self._SDM_ATT_MAP['Weight']))
def update(self, ConnectedVia=None, Multiplier=None, Name=None, StackedLayers=None):
"""Updates ospfv3 resource on the server.
This method has some named parameters with a type: obj (Multivalue).
The Multivalue class has documentation that details the possible values for those named parameters.
Args
----
- ConnectedVia (list(str[None | /api/v1/sessions/1/ixnetwork/topology/.../*])): List of layers this layer is used to connect with to the wire.
- Multiplier (number): Number of layer instances per parent instance (multiplier)
- Name (str): Name of NGPF element, guaranteed to be unique in Scenario
- StackedLayers (list(str[None | /api/v1/sessions/1/ixnetwork/topology/.../*])): List of secondary (many to one) child layer protocols
Raises
------
- ServerError: The server has encountered an uncategorized error condition
"""
return self._update(self._map_locals(self._SDM_ATT_MAP, locals()))
def add(self, ConnectedVia=None, Multiplier=None, Name=None, StackedLayers=None):
"""Adds a new ospfv3 resource on the server and adds it to the container.
Args
----
- ConnectedVia (list(str[None | /api/v1/sessions/1/ixnetwork/topology/.../*])): List of layers this layer is used to connect with to the wire.
- Multiplier (number): Number of layer instances per parent instance (multiplier)
- Name (str): Name of NGPF element, guaranteed to be unique in Scenario
- StackedLayers (list(str[None | /api/v1/sessions/1/ixnetwork/topology/.../*])): List of secondary (many to one) child layer protocols
Returns
-------
- self: This instance with all currently retrieved ospfv3 resources using find and the newly added ospfv3 resources available through an iterator or index
Raises
------
- ServerError: The server has encountered an uncategorized error condition
"""
return self._create(self._map_locals(self._SDM_ATT_MAP, locals()))
def remove(self):
"""Deletes all the contained ospfv3 resources in this instance from the server.
Raises
------
- NotFoundError: The requested resource does not exist on the server
- ServerError: The server has encountered an uncategorized error condition
"""
self._delete()
def find(self, ConnectedVia=None, Count=None, DescriptiveName=None, Errors=None, LocalRouterID=None, Multiplier=None, Name=None, Ospfv3IfaceState=None, Ospfv3NeighborState=None, SessionInfo=None, SessionStatus=None, StackedLayers=None, StateCounts=None, Status=None):
"""Finds and retrieves ospfv3 resources from the server.
All named parameters are evaluated on the server using regex. The named parameters can be used to selectively retrieve ospfv3 resources from the server.
To retrieve an exact match ensure the parameter value starts with ^ and ends with $
By default the find method takes no parameters and will retrieve all ospfv3 resources from the server.
Args
----
- ConnectedVia (list(str[None | /api/v1/sessions/1/ixnetwork/topology/.../*])): List of layers this layer is used to connect with to the wire.
- Count (number): Number of elements inside associated multiplier-scaled container object, e.g. number of devices inside a Device Group.
- DescriptiveName (str): Longer, more descriptive name for element. It's not guaranteed to be unique like -name-, but may offer more context.
- Errors (list(dict(arg1:str[None | /api/v1/sessions/1/ixnetwork//.../*],arg2:list[str]))): A list of errors that have occurred
- LocalRouterID (list(str)): Router ID
- Multiplier (number): Number of layer instances per parent instance (multiplier)
- Name (str): Name of NGPF element, guaranteed to be unique in Scenario
- Ospfv3IfaceState (list(str[backup | down | dr | drOther | pointToPoint | unrecognized | waiting])): Logs additional information about the Interface State
- Ospfv3NeighborState (list(str[attempt | down | exchange | exStart | full | init | loading | multiNeighbor | none | twoWay])): Logs additional information about the Neighbor State
- SessionInfo (list(str[ifaceSessInfoAllNbrIn2Way | ifaceSessInfoAllNbrInattempt | ifaceSessInfoAllNbrInDown | ifaceSessInfoAllNbrInExchange | ifaceSessInfoAllNbrInExStart | ifaceSessInfoAllNbrInInit | ifaceSessInfoAllNbrInLoading | ifaceSessInfoFsmNotStarted | ifaceSessInfoSameNbrId | iPAddressNotRcvd | none])): Logs additional information about the session state
- SessionStatus (list(str[down | notStarted | up])): Current state of protocol session: Not Started - session negotiation not started, the session is not active yet. Down - actively trying to bring up a protocol session, but negotiation is didn't successfully complete (yet). Up - session came up successfully.
- StackedLayers (list(str[None | /api/v1/sessions/1/ixnetwork/topology/.../*])): List of secondary (many to one) child layer protocols
- StateCounts (dict(total:number,notStarted:number,down:number,up:number)): A list of values that indicates the total number of sessions, the number of sessions not started, the number of sessions down and the number of sessions that are up
- Status (str(configured | error | mixed | notStarted | started | starting | stopping)): Running status of associated network element. Once in Started state, protocol sessions will begin to negotiate.
Returns
-------
- self: This instance with matching ospfv3 resources retrieved from the server available through an iterator or index
Raises
------
- ServerError: The server has encountered an uncategorized error condition
"""
return self._select(self._map_locals(self._SDM_ATT_MAP, locals()))
def read(self, href):
"""Retrieves a single instance of ospfv3 data from the server.
Args
----
- href (str): An href to the instance to be retrieved
Returns
-------
- self: This instance with the ospfv3 resources from the server available through an iterator or index
Raises
------
- NotFoundError: The requested resource does not exist on the server
- ServerError: The server has encountered an uncategorized error condition
"""
return self._read(href)
def get_device_ids(self, PortNames=None, Active=None, AdjSID=None, AreaId=None, AreaIdIp=None, AuthAlgo=None, BFlag=None, DeadInterval=None, DemandCircuit=None, EnableAdjSID=None, EnableAuthentication=None, EnableBfdRegistration=None, EnableFastHello=None, EnableIgnoreDbDescMtu=None, ExternalCapability=None, GFlag=None, HelloInterval=None, HelloMultiplier=None, InstanceId=None, Key=None, LFlag=None, LinkMetric=None, NetworkType=None, NssaCapability=None, PFlag=None, Priority=None, Router=None, SaId=None, TypeAreaId=None, V6=None, VFlag=None, Weight=None):
"""Base class infrastructure that gets a list of ospfv3 device ids encapsulated by this object.
Use the optional regex parameters in the method to refine the list of device ids encapsulated by this object.
Args
----
- PortNames (str): optional regex of port names
- Active (str): optional regex of active
- AdjSID (str): optional regex of adjSID
- AreaId (str): optional regex of areaId
- AreaIdIp (str): optional regex of areaIdIp
- AuthAlgo (str): optional regex of authAlgo
- BFlag (str): optional regex of bFlag
- DeadInterval (str): optional regex of deadInterval
- DemandCircuit (str): optional regex of demandCircuit
- EnableAdjSID (str): optional regex of enableAdjSID
- EnableAuthentication (str): optional regex of enableAuthentication
- EnableBfdRegistration (str): optional regex of enableBfdRegistration
- EnableFastHello (str): optional regex of enableFastHello
- EnableIgnoreDbDescMtu (str): optional regex of enableIgnoreDbDescMtu
- ExternalCapability (str): optional regex of externalCapability
- GFlag (str): optional regex of gFlag
- HelloInterval (str): optional regex of helloInterval
- HelloMultiplier (str): optional regex of helloMultiplier
- InstanceId (str): optional regex of instanceId
- Key (str): optional regex of key
- LFlag (str): optional regex of lFlag
- LinkMetric (str): optional regex of linkMetric
- NetworkType (str): optional regex of networkType
- NssaCapability (str): optional regex of nssaCapability
- PFlag (str): optional regex of pFlag
- Priority (str): optional regex of priority
- Router (str): optional regex of router
- SaId (str): optional regex of saId
- TypeAreaId (str): optional regex of typeAreaId
- V6 (str): optional regex of v6
- VFlag (str): optional regex of vFlag
- Weight (str): optional regex of weight
Returns
-------
- list(int): A list of device ids that meets the regex criteria provided in the method parameters
Raises
------
- ServerError: The server has encountered an uncategorized error condition
"""
return self._get_ngpf_device_ids(locals())
def Abort(self, *args, **kwargs):
"""Executes the abort operation on the server.
Abort CPF control plane (equals to demote to kUnconfigured state).
The IxNetwork model allows for multiple method Signatures with the same name while python does not.
abort(SessionIndices=list)
--------------------------
- SessionIndices (list(number)): This parameter requires an array of session numbers 1 2 3
abort(SessionIndices=string)
----------------------------
- SessionIndices (str): This parameter requires a string of session numbers 1-4;6;7-12
Raises
------
- NotFoundError: The requested resource does not exist on the server
- ServerError: The server has encountered an uncategorized error condition
"""
payload = { "Arg1": self }
for i in range(len(args)): payload['Arg%s' % (i + 2)] = args[i]
for item in kwargs.items(): payload[item[0]] = item[1]
return self._execute('abort', payload=payload, response_object=None)
def ClearAllLearnedInfo(self, *args, **kwargs):
"""Executes the clearAllLearnedInfo operation on the server.
Clear All Learned Info
The IxNetwork model allows for multiple method Signatures with the same name while python does not.
clearAllLearnedInfo(SessionIndices=list)
----------------------------------------
- SessionIndices (list(number)): This parameter requires an array of session numbers 1 2 3
clearAllLearnedInfo(SessionIndices=string)
------------------------------------------
- SessionIndices (str): This parameter requires a string of session numbers 1-4;6;7-12
Raises
------
- NotFoundError: The requested resource does not exist on the server
- ServerError: The server has encountered an uncategorized error condition
"""
payload = { "Arg1": self }
for i in range(len(args)): payload['Arg%s' % (i + 2)] = args[i]
for item in kwargs.items(): payload[item[0]] = item[1]
return self._execute('clearAllLearnedInfo', payload=payload, response_object=None)
def ClearAllLearnedInfoInClient(self, *args, **kwargs):
"""Executes the clearAllLearnedInfoInClient operation on the server.
Clears ALL routes from GUI grid for the selected OSPFv3 router.
clearAllLearnedInfoInClient(Arg2=list)list
------------------------------------------
- Arg2 (list(number)): List of indices into the protocol plugin. An empty list indicates all instances in the plugin.
- Returns list(str): ID to associate each async action invocation
Raises
------
- NotFoundError: The requested resource does not exist on the server
- ServerError: The server has encountered an uncategorized error condition
"""
payload = { "Arg1": self.href }
for i in range(len(args)): payload['Arg%s' % (i + 2)] = args[i]
for item in kwargs.items(): payload[item[0]] = item[1]
return self._execute('clearAllLearnedInfoInClient', payload=payload, response_object=None)
def GetBasicLearnedInfo(self, *args, **kwargs):
"""Executes the getBasicLearnedInfo operation on the server.
Get Basic Learned Info
The IxNetwork model allows for multiple method Signatures with the same name while python does not.
getBasicLearnedInfo(SessionIndices=list)
----------------------------------------
- SessionIndices (list(number)): This parameter requires an array of session numbers 1 2 3
getBasicLearnedInfo(SessionIndices=string)
------------------------------------------
- SessionIndices (str): This parameter requires a string of session numbers 1-4;6;7-12
getBasicLearnedInfo(Arg2=list)list
----------------------------------
- Arg2 (list(number)): List of indices into the protocol plugin. An empty list indicates all instances in the plugin.
- Returns list(str): ID to associate each async action invocation
Raises
------
- NotFoundError: The requested resource does not exist on the server
- ServerError: The server has encountered an uncategorized error condition
"""
payload = { "Arg1": self }
for i in range(len(args)): payload['Arg%s' % (i + 2)] = args[i]
for item in kwargs.items(): payload[item[0]] = item[1]
return self._execute('getBasicLearnedInfo', payload=payload, response_object=None)
def GetDetailedLearnedInfo(self, *args, **kwargs):
"""Executes the getDetailedLearnedInfo operation on the server.
Get Detailed Learned Info
The IxNetwork model allows for multiple method Signatures with the same name while python does not.
getDetailedLearnedInfo(SessionIndices=list)
-------------------------------------------
- SessionIndices (list(number)): This parameter requires an array of session numbers 1 2 3
getDetailedLearnedInfo(SessionIndices=string)
---------------------------------------------
- SessionIndices (str): This parameter requires a string of session numbers 1-4;6;7-12
getDetailedLearnedInfo(Arg2=list)list
-------------------------------------
- Arg2 (list(number)): List of indices into the protocol plugin. An empty list indicates all instances in the plugin.
- Returns list(str): ID to associate each async action invocation
Raises
------
- NotFoundError: The requested resource does not exist on the server
- ServerError: The server has encountered an uncategorized error condition
"""
payload = { "Arg1": self }
for i in range(len(args)): payload['Arg%s' % (i + 2)] = args[i]
for item in kwargs.items(): payload[item[0]] = item[1]
return self._execute('getDetailedLearnedInfo', payload=payload, response_object=None)
def RestartDown(self, *args, **kwargs):
"""Executes the restartDown operation on the server.
Stop and start interfaces and sessions that are in Down state.
The IxNetwork model allows for multiple method Signatures with the same name while python does not.
restartDown(SessionIndices=list)
--------------------------------
- SessionIndices (list(number)): This parameter requires an array of session numbers 1 2 3
restartDown(SessionIndices=string)
----------------------------------
- SessionIndices (str): This parameter requires a string of session numbers 1-4;6;7-12
Raises
------
- NotFoundError: The requested resource does not exist on the server
- ServerError: The server has encountered an uncategorized error condition
"""
payload = { "Arg1": self }
for i in range(len(args)): payload['Arg%s' % (i + 2)] = args[i]
for item in kwargs.items(): payload[item[0]] = item[1]
return self._execute('restartDown', payload=payload, response_object=None)
def ResumeHello(self, *args, **kwargs):
"""Executes the resumeHello operation on the server.
Resume sending OSPFv3 Hellos
The IxNetwork model allows for multiple method Signatures with the same name while python does not.
resumeHello(SessionIndices=list)
--------------------------------
- SessionIndices (list(number)): This parameter requires an array of session numbers 1 2 3
resumeHello(SessionIndices=string)
----------------------------------
- SessionIndices (str): This parameter requires a string of session numbers 1-4;6;7-12
Raises
------
- NotFoundError: The requested resource does not exist on the server
- ServerError: The server has encountered an uncategorized error condition
"""
payload = { "Arg1": self }
for i in range(len(args)): payload['Arg%s' % (i + 2)] = args[i]
for item in kwargs.items(): payload[item[0]] = item[1]
return self._execute('resumeHello', payload=payload, response_object=None)
def Resumehello(self, *args, **kwargs):
"""Executes the resumehello operation on the server.
Starts the protocol state machine for the given protocol session instances.
resumehello(Arg2=list)list
--------------------------
- Arg2 (list(number)): List of indices into the protocol plugin. An empty list indicates all instances in the plugin.
- Returns list(str): ID to associate each async action invocation
Raises
------
- NotFoundError: The requested resource does not exist on the server
- ServerError: The server has encountered an uncategorized error condition
"""
payload = { "Arg1": self.href }
for i in range(len(args)): payload['Arg%s' % (i + 2)] = args[i]
for item in kwargs.items(): payload[item[0]] = item[1]
return self._execute('resumehello', payload=payload, response_object=None)
def Start(self, *args, **kwargs):
"""Executes the start operation on the server.
Start CPF control plane (equals to promote to negotiated state).
The IxNetwork model allows for multiple method Signatures with the same name while python does not.
start(SessionIndices=list)
--------------------------
- SessionIndices (list(number)): This parameter requires an array of session numbers 1 2 3
start(SessionIndices=string)
----------------------------
- SessionIndices (str): This parameter requires a string of session numbers 1-4;6;7-12
Raises
------
- NotFoundError: The requested resource does not exist on the server
- ServerError: The server has encountered an uncategorized error condition
"""
payload = { "Arg1": self }
for i in range(len(args)): payload['Arg%s' % (i + 2)] = args[i]
for item in kwargs.items(): payload[item[0]] = item[1]
return self._execute('start', payload=payload, response_object=None)
def Stop(self, *args, **kwargs):
"""Executes the stop operation on the server.
Stop CPF control plane (equals to demote to PreValidated-DoDDone state).
The IxNetwork model allows for multiple method Signatures with the same name while python does not.
stop(SessionIndices=list)
-------------------------
- SessionIndices (list(number)): This parameter requires an array of session numbers 1 2 3
stop(SessionIndices=string)
---------------------------
- SessionIndices (str): This parameter requires a string of session numbers 1-4;6;7-12
Raises
------
- NotFoundError: The requested resource does not exist on the server
- ServerError: The server has encountered an uncategorized error condition
"""
payload = { "Arg1": self }
for i in range(len(args)): payload['Arg%s' % (i + 2)] = args[i]
for item in kwargs.items(): payload[item[0]] = item[1]
return self._execute('stop', payload=payload, response_object=None)
def StopHello(self, *args, **kwargs):
"""Executes the stopHello operation on the server.
Stop sending OSPFv3 Hellos
The IxNetwork model allows for multiple method Signatures with the same name while python does not.
stopHello(SessionIndices=list)
------------------------------
- SessionIndices (list(number)): This parameter requires an array of session numbers 1 2 3
stopHello(SessionIndices=string)
--------------------------------
- SessionIndices (str): This parameter requires a string of session numbers 1-4;6;7-12
Raises
------
- NotFoundError: The requested resource does not exist on the server
- ServerError: The server has encountered an uncategorized error condition
"""
payload = { "Arg1": self }
for i in range(len(args)): payload['Arg%s' % (i + 2)] = args[i]
for item in kwargs.items(): payload[item[0]] = item[1]
return self._execute('stopHello', payload=payload, response_object=None)
def Stophello(self, *args, **kwargs):
"""Executes the stophello operation on the server.
Stops the protocol state machine for the given protocol session instances.
stophello(Arg2=list)list
------------------------
- Arg2 (list(number)): List of indices into the protocol plugin. An empty list indicates all instances in the plugin.
- Returns list(str): ID to associate each async action invocation
Raises
------
- NotFoundError: The requested resource does not exist on the server
- ServerError: The server has encountered an uncategorized error condition
"""
payload = { "Arg1": self.href }
for i in range(len(args)): payload['Arg%s' % (i + 2)] = args[i]
for item in kwargs.items(): payload[item[0]] = item[1]
return self._execute('stophello', payload=payload, response_object=None)
| 41.951341 | 565 | 0.639981 | 4,679 | 42,245 | 5.696089 | 0.110707 | 0.022963 | 0.044199 | 0.025364 | 0.680399 | 0.653047 | 0.639764 | 0.606784 | 0.580144 | 0.54176 | 0 | 0.009511 | 0.245852 | 42,245 | 1,006 | 566 | 41.993042 | 0.82705 | 0.534075 | 0 | 0.365385 | 0 | 0 | 0.107858 | 0.011262 | 0 | 0 | 0 | 0 | 0 | 1 | 0.224359 | false | 0 | 0.112179 | 0 | 0.554487 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
62592611062846e8ddc9453d08b3f9cc749f88fa | 129 | py | Python | Python/Courses/Python-Tutorials.Telusko/02.Miscellaneous/20.03-File-handling.py | shihab4t/Books-Code | b637b6b2ad42e11faf87d29047311160fe3b2490 | [
"Unlicense"
] | null | null | null | Python/Courses/Python-Tutorials.Telusko/02.Miscellaneous/20.03-File-handling.py | shihab4t/Books-Code | b637b6b2ad42e11faf87d29047311160fe3b2490 | [
"Unlicense"
] | null | null | null | Python/Courses/Python-Tutorials.Telusko/02.Miscellaneous/20.03-File-handling.py | shihab4t/Books-Code | b637b6b2ad42e11faf87d29047311160fe3b2490 | [
"Unlicense"
] | null | null | null | file = open("text.txt", "r")
file2 = open("text2.txt", "w")
for data in file:
file2.write(data)
file.close()
file2.close()
| 14.333333 | 30 | 0.620155 | 21 | 129 | 3.809524 | 0.619048 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.037037 | 0.162791 | 129 | 8 | 31 | 16.125 | 0.703704 | 0 | 0 | 0 | 0 | 0 | 0.147287 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6261861dfa046a0934777f8f23b5ec284278ef51 | 1,115 | py | Python | py_connect/exceptions.py | iparaskev/py_connect | 43476cddfb25130d058fcf59928454f867af8feb | [
"BSD-3-Clause"
] | 5 | 2021-03-19T07:05:50.000Z | 2021-03-31T22:53:52.000Z | py_connect/exceptions.py | iparaskev/py_connect | 43476cddfb25130d058fcf59928454f867af8feb | [
"BSD-3-Clause"
] | null | null | null | py_connect/exceptions.py | iparaskev/py_connect | 43476cddfb25130d058fcf59928454f867af8feb | [
"BSD-3-Clause"
] | null | null | null | """Exceptions of the library"""
class PyConnectError(Exception):
"""Base class for all exceptions in py_connect."""
class InvalidPowerCombination(PyConnectError):
"""Connection of different power pins."""
class MaxConnectionsError(PyConnectError):
"""Interface has exceeded it's max connections limit."""
class InvalidGpioError(PyConnectError):
"""Invalid connection of two gpio pins."""
class AlreadyConnectedError(PyConnectError):
"""One or more pins of the interface are already connected."""
class TwoMasterError(PyConnectError):
"""Error when connecting two master interfaces."""
class TwoSlaveError(PyConnectError):
"""Error when connecting two slave interfaces."""
class ChipEnabledFullError(PyConnectError):
"""All chip enable pins are in use."""
class NotImplementedDriverError(PyConnectError):
"""This peripheral doesn't have an implemented driver."""
class UnicludedDeviceError(PyConnectError):
"""Device hasn't been included in connections specification."""
class EmptyListError(PyConnectError):
"""Empty list given for an attribute."""
| 24.23913 | 67 | 0.744395 | 115 | 1,115 | 7.208696 | 0.608696 | 0.012063 | 0.055489 | 0.079614 | 0.086852 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148879 | 1,115 | 45 | 68 | 24.777778 | 0.873551 | 0.464574 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
626b51aefb27ae8f4702b720697fa00e55d0360c | 1,309 | py | Python | robot_simulator/grid/positioning.py | darshikaf/toy-robot-simulator | 408d160033728d65e9bac376d3af7fc84c520f31 | [
"MIT"
] | null | null | null | robot_simulator/grid/positioning.py | darshikaf/toy-robot-simulator | 408d160033728d65e9bac376d3af7fc84c520f31 | [
"MIT"
] | null | null | null | robot_simulator/grid/positioning.py | darshikaf/toy-robot-simulator | 408d160033728d65e9bac376d3af7fc84c520f31 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
from __future__ import annotations
import math
class Point:
def __init__(self, x: int = 0, y: int = 0):
self.x = x
self.y = y
def __eq__(self, other: object) -> bool:
if isinstance(other, Point):
return (self.x == other.x) and (self.y == other.y)
return NotImplemented
def __ne__(self, other: object) -> bool:
if isinstance(other, Point):
return not (self == other)
return NotImplemented
def __add__(self, other: Point) -> Point:
x = self.x + other.x
y = self.y + other.y
return self.__class__(x, y)
class Vector(Point):
def __mul__(self, scale: int) -> Vector:
x = self.x * scale
y = self.y * scale
return self.__class__(x, y)
class Rect:
def __init__(self, point1: Point, point2: Point) -> None:
self.top = max(point1.y, point2.y)
self.right = max(point1.x, point2.x)
self.bottom = min(point1.y, point2.y)
self.left = min(point1.x, point2.x)
def contains(self, point: Point) -> bool:
contains_x = (self.left <= point.x) and (point.x <= self.right)
contains_y = (self.bottom <= point.y) and (point.y <= self.top)
return contains_x and contains_y
| 27.270833 | 71 | 0.578304 | 183 | 1,309 | 3.918033 | 0.256831 | 0.041841 | 0.030683 | 0.052999 | 0.281729 | 0.192469 | 0.131102 | 0.131102 | 0.131102 | 0 | 0 | 0.013934 | 0.287242 | 1,309 | 47 | 72 | 27.851064 | 0.754555 | 0.032086 | 0 | 0.181818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.212121 | false | 0 | 0.060606 | 0 | 0.575758 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
626ca157c2ac9db263365279311bac86dc999674 | 328 | py | Python | backmarker/api/viewsets/driver_viewset.py | jmp/backmarker | e12a094d92dec798ad10aa8890fabe84f946c303 | [
"MIT"
] | null | null | null | backmarker/api/viewsets/driver_viewset.py | jmp/backmarker | e12a094d92dec798ad10aa8890fabe84f946c303 | [
"MIT"
] | null | null | null | backmarker/api/viewsets/driver_viewset.py | jmp/backmarker | e12a094d92dec798ad10aa8890fabe84f946c303 | [
"MIT"
] | null | null | null | from rest_framework.viewsets import ReadOnlyModelViewSet
from backmarker.api.serializers.driver_serializer import DriverSerializer
from backmarker.models.driver import Driver
class DriverViewSet(ReadOnlyModelViewSet):
queryset = Driver.objects.all()
serializer_class = DriverSerializer
lookup_field = "reference"
| 29.818182 | 73 | 0.829268 | 33 | 328 | 8.121212 | 0.636364 | 0.104478 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115854 | 328 | 10 | 74 | 32.8 | 0.924138 | 0 | 0 | 0 | 0 | 0 | 0.027439 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.428571 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
62735fa3cb9b4a375ffe477b83e79ab29f0e085c | 537 | py | Python | plugs_newsletter/emails.py | solocompt/plugs-newsletter | 57b9aa2caf9ed5bd5adf25839dbf52b85c0afa53 | [
"MIT"
] | 1 | 2017-01-10T23:24:55.000Z | 2017-01-10T23:24:55.000Z | plugs_newsletter/emails.py | solocompt/plugs-newsletter | 57b9aa2caf9ed5bd5adf25839dbf52b85c0afa53 | [
"MIT"
] | 1 | 2017-01-08T00:01:21.000Z | 2017-01-08T00:01:21.000Z | plugs_newsletter/emails.py | solocompt/plugs-newsletter | 57b9aa2caf9ed5bd5adf25839dbf52b85c0afa53 | [
"MIT"
] | null | null | null | """
Plugs Newsletter Emails
"""
from plugs_mail.mail import PlugsMail
class NewsletterSubscribed(PlugsMail):
"""
Email sent to subscriber after newsletter subscription
"""
template = 'NEWSLETTER_SUBSCRIBED'
description = 'Email sent to subscriber after newsletter subscription'
class NewsletterUnsubscribed(PlugsMail):
"""
Email sent to subscriber after newsletter unsubscription
"""
template = 'NEWSLETTER_UNSUBSCRIBED'
description = 'Email sent to subscriber after newsletter unsubscription'
| 26.85 | 76 | 0.748603 | 51 | 537 | 7.823529 | 0.411765 | 0.090226 | 0.110276 | 0.210526 | 0.591479 | 0.591479 | 0.591479 | 0 | 0 | 0 | 0 | 0 | 0.182495 | 537 | 19 | 77 | 28.263158 | 0.908884 | 0.251397 | 0 | 0 | 0 | 0 | 0.423077 | 0.120879 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
62758b3a2a4619c1b6d03498fcd2b870db5024e4 | 495 | py | Python | gail/crowd_sim/configs/icra_benchmark/sarl.py | ben-milanko/PyTorch-RL | 4d7be8a7f26f21b490c93191dca1844046a092df | [
"MIT"
] | null | null | null | gail/crowd_sim/configs/icra_benchmark/sarl.py | ben-milanko/PyTorch-RL | 4d7be8a7f26f21b490c93191dca1844046a092df | [
"MIT"
] | null | null | null | gail/crowd_sim/configs/icra_benchmark/sarl.py | ben-milanko/PyTorch-RL | 4d7be8a7f26f21b490c93191dca1844046a092df | [
"MIT"
] | null | null | null | from configs.icra_benchmark.config import BaseEnvConfig, BasePolicyConfig, BaseTrainConfig, Config
class EnvConfig(BaseEnvConfig):
def __init__(self, debug=False):
super(EnvConfig, self).__init__(debug)
class PolicyConfig(BasePolicyConfig):
def __init__(self, debug=False):
super(PolicyConfig, self).__init__(debug)
self.name = 'sarl'
class TrainConfig(BaseTrainConfig):
def __init__(self, debug=False):
super(TrainConfig, self).__init__(debug)
| 27.5 | 98 | 0.733333 | 52 | 495 | 6.5 | 0.403846 | 0.06213 | 0.097633 | 0.142012 | 0.230769 | 0.230769 | 0 | 0 | 0 | 0 | 0 | 0 | 0.163636 | 495 | 17 | 99 | 29.117647 | 0.816425 | 0 | 0 | 0.272727 | 0 | 0 | 0.008081 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.272727 | false | 0 | 0.090909 | 0 | 0.636364 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
627d2e10dedbd895286404f157c63ff39dd0589c | 368 | py | Python | experiments/duet_dataloader/input_file_generator.py | 18praveenb/ss-vq-vae | 89e76d69d6127b27ae4cc066a1a1f9c4147fb020 | [
"Apache-2.0"
] | null | null | null | experiments/duet_dataloader/input_file_generator.py | 18praveenb/ss-vq-vae | 89e76d69d6127b27ae4cc066a1a1f9c4147fb020 | [
"Apache-2.0"
] | null | null | null | experiments/duet_dataloader/input_file_generator.py | 18praveenb/ss-vq-vae | 89e76d69d6127b27ae4cc066a1a1f9c4147fb020 | [
"Apache-2.0"
] | null | null | null | genres = ['blues', 'classical', 'country', 'disco', 'hiphop', 'jazz', 'metal', 'pop', 'reggae', 'rock']
num_files = 100
with open(f'INPUT_FULL', 'w') as f:
for genre in genres:
for i in range(num_files):
for j in range(6):
f.write(f'/datasets/duet/genres/{genre}.{i:05d}.{j}.wav /datasets/duet/genres/{genre}.{i:05d}.{j}.wav\n') | 52.571429 | 121 | 0.578804 | 57 | 368 | 3.684211 | 0.596491 | 0.07619 | 0.171429 | 0.219048 | 0.295238 | 0.295238 | 0.295238 | 0.295238 | 0 | 0 | 0 | 0.027119 | 0.19837 | 368 | 7 | 121 | 52.571429 | 0.684746 | 0 | 0 | 0 | 0 | 0.142857 | 0.428184 | 0.249322 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6289a72aad33159f7d054947864a1fb1eeedfac8 | 2,756 | py | Python | ROSITA/ARMCycleCounterImpl.py | 0xADE1A1DE/Rosita | 13a669c1877b2eb5a36be6b41a6f840d83ffd46a | [
"Apache-2.0"
] | 10 | 2020-11-22T08:17:08.000Z | 2021-12-17T04:06:01.000Z | ROSITA/ARMCycleCounterImpl.py | 0xADE1A1DE/Rosita | 13a669c1877b2eb5a36be6b41a6f840d83ffd46a | [
"Apache-2.0"
] | null | null | null | ROSITA/ARMCycleCounterImpl.py | 0xADE1A1DE/Rosita | 13a669c1877b2eb5a36be6b41a6f840d83ffd46a | [
"Apache-2.0"
] | null | null | null | # Copyright 2020 University of Adelaide
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
def check_inst(inst, mask, match):
return inst & mask == match
def cycles_push(inst):
if check_inst(inst, 0b1111111000000000, 0b1011010000000000):
return 1 + bin(inst & 0x00ff).count('1')
return -1
def cycles_pop(inst):
# Format 14: push/pop registers
if check_inst(inst, 0b1111111000000000, 0b1011110000000000):
return 1 + bin(inst & 0x00ff).count('1')
return -1
def cycles_pop_pc(inst):
# Format 14: push/pop registers
if check_inst(inst, 0b1111111100000000, 0b1011110100000000):
return 4 + bin(inst & 0x00ff).count('1')
return -1
def cycles_add(inst):
# Format 2: add/subtract
if check_inst(inst, 0b1111101000000000, 0b0001100000000000):
return 1
return -1
def cycles_add_pc(inst):
return -1
def cycles_rot(inst):
# Format 4
if check_inst(inst, 0b1111111111000000, 0b0100000111000000):
return 1
return -1
def cycles_ldr(inst):
# Format 7
# Format 9
if check_inst(inst, 0b1111101000000000, 0b0101100000000000) or \
check_inst(inst, 0b1110100000000000, 0b0110100000000000):
return 2
return -1
def cycles_str(inst):
# Format 7
# Format 9
if check_inst(inst, 0b1111101000000000, 0b0101000000000000) or \
check_inst(inst, 0b1110100000000000, 0b0110000000000000):
return 2
return -1
def cycles_mov(inst):
# Format 1: move shifted register
# Format 3: move/compare/add/subtract immediate
# Format 5: Hi register operations/branch exchange
if check_inst(inst, 0b1111111111000000, 0b0000000000000000) or \
check_inst(inst, 0b1111100000000000, 0b0010000000000000) or \
check_inst(inst, 0b1111111100000000, 0b0100011000000000):
return 1
return -1
def cycles_mov_pc(inst):
# Format 5: dest = pc
if check_inst(inst, 0b1111111101000111, 0b0100011001000111):
return 3
return -1
__cycle_counts = [
[ cycles_mov, cycles_mov_pc ],
[ cycles_add, cycles_add_pc ],
[ cycles_ldr ],
[ cycles_str ],
[ cycles_rot ],
[ cycles_pop, cycles_pop_pc ],
[ cycles_push ]
]
def get_cycle_counts():
return __cycle_counts | 29.010526 | 74 | 0.697388 | 354 | 2,756 | 5.293785 | 0.353107 | 0.05603 | 0.097118 | 0.072038 | 0.350053 | 0.23159 | 0.167022 | 0.167022 | 0.167022 | 0.148346 | 0 | 0.229698 | 0.21807 | 2,756 | 95 | 75 | 29.010526 | 0.639907 | 0.302975 | 0 | 0.309091 | 0 | 0 | 0.001582 | 0 | 0 | 0 | 0.009494 | 0 | 0 | 1 | 0.218182 | false | 0 | 0 | 0.054545 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
6561c740148f4e9dc7f6313a8907266fbf944537 | 233 | py | Python | backend/src/urls.py | Ornstein89/LeadersOfDigital2021_FoxhoundTeam | a376525b07b900bd69dbc274daef66ddad7f1800 | [
"MIT"
] | null | null | null | backend/src/urls.py | Ornstein89/LeadersOfDigital2021_FoxhoundTeam | a376525b07b900bd69dbc274daef66ddad7f1800 | [
"MIT"
] | null | null | null | backend/src/urls.py | Ornstein89/LeadersOfDigital2021_FoxhoundTeam | a376525b07b900bd69dbc274daef66ddad7f1800 | [
"MIT"
] | null | null | null | from django.contrib import admin
from django.urls import path, include
from src.base import urls as base_api
urlpatterns = [
path('admin/', admin.site.urls),
path('rest_api/', include(
base_api.urlpatterns
)),
] | 21.181818 | 37 | 0.690987 | 32 | 233 | 4.9375 | 0.46875 | 0.126582 | 0.227848 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.197425 | 233 | 11 | 38 | 21.181818 | 0.84492 | 0 | 0 | 0 | 0 | 0 | 0.064103 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
65795b0076fedc356eabb71c92883de2bfec241d | 261 | py | Python | tests/test_crud/conftest.py | amisadmin/fastapi_amis_admin | 07967a31c46cb6e8e0b4ca703d6d815c2091624a | [
"Apache-2.0"
] | 166 | 2022-02-05T15:52:51.000Z | 2022-03-31T10:57:35.000Z | tests/test_crud/conftest.py | amisadmin/fastapi_amis_admin | 07967a31c46cb6e8e0b4ca703d6d815c2091624a | [
"Apache-2.0"
] | 9 | 2022-02-17T07:32:58.000Z | 2022-03-31T13:46:24.000Z | tests/test_crud/conftest.py | amisadmin/fastapi_amis_admin | 07967a31c46cb6e8e0b4ca703d6d815c2091624a | [
"Apache-2.0"
] | 15 | 2022-02-10T07:24:17.000Z | 2022-03-24T04:08:10.000Z | import pytest
from tests.test_crud.main import app
@pytest.fixture(scope='session', autouse=True)
def startup():
import asyncio
# asyncio.run(app.router.startup())
loop = asyncio.get_event_loop()
loop.run_until_complete(app.router.startup())
| 21.75 | 49 | 0.731801 | 36 | 261 | 5.166667 | 0.638889 | 0.096774 | 0.172043 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.141762 | 261 | 11 | 50 | 23.727273 | 0.830357 | 0.126437 | 0 | 0 | 0 | 0 | 0.030973 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.428571 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
657f66cb6267b45323c6fdaa161920c2b665fce3 | 24,375 | py | Python | XOconv/pycgtypes/mat4.py | jsburg/xdsme | 3fc9ed185ab78e1a42306edf24e681981eacd221 | [
"BSD-3-Clause"
] | 16 | 2016-05-20T11:19:40.000Z | 2021-01-01T19:44:23.000Z | XOconv/pycgtypes/mat4.py | jsburg/xdsme | 3fc9ed185ab78e1a42306edf24e681981eacd221 | [
"BSD-3-Clause"
] | 11 | 2016-09-09T15:00:15.000Z | 2021-05-07T15:02:10.000Z | XOconv/pycgtypes/mat4.py | jsburg/xdsme | 3fc9ed185ab78e1a42306edf24e681981eacd221 | [
"BSD-3-Clause"
] | 9 | 2016-12-15T16:00:06.000Z | 2021-09-10T08:34:14.000Z | ######################################################################
# mat4 - Matrix class (4x4 matrix)
#
# Copyright (C) 2002, Matthias Baas (baas@ira.uka.de)
#
# You may distribute under the terms of the BSD license, as
# specified in the file license.txt.
####################################################################
import types, math, copy
from vec3 import vec3 as _vec3
from vec4 import vec4 as _vec4
from mat3 import mat3 as _mat3
# [ 0 1 2 3 ]
# [ 4 5 6 7 ]
# [ 8 9 10 11 ]
# [ 12 13 14 15 ]
# mat4
class mat4:
"""Matrix class (4x4).
This class represents a 4x4 matrix that can be used to store
affine transformations.
"""
def __init__(self, *args):
"Constructor"
# No arguments
if len(args)==0:
self.mlist = 16*[0.0]
# 1 argument (list, scalar or mat4)
elif len(args)==1:
T = type(args[0])
if T==types.FloatType or T==types.IntType or T==types.LongType:
self.mlist = [args[0],0.0,0.0,0.0,
0.0,args[0],0.0,0.0,
0.0,0.0,args[0],0.0,
0.0,0.0,0.0,args[0]]
# mat4
elif isinstance(args[0], mat4):
self.mlist = copy.copy(args[0].mlist)
# String
elif T==types.StringType:
s=args[0].replace(","," ").replace(" "," ").strip().split(" ")
self.mlist=map(lambda x: float(x), s)
else:
self.mlist = list(args[0])
# 4 arguments (sequences)
elif len(args)==4:
a,b,c,d=args
self.mlist = [a[0], b[0], c[0], d[0],
a[1], b[1], c[1], d[1],
a[2], b[2], c[2], d[2],
a[3], b[3], c[3], d[3]]
# 16 arguments
elif len(args)==16:
self.mlist = list(args)
else:
raise TypeError,"mat4() arg can't be converted to mat4"
# Check if there are really 16 elements in the list
if len(self.mlist)!=16:
raise TypeError, "mat4(): Wrong number of matrix elements ("+`len(self.mlist)`+" instead of 16)"
def __repr__(self):
return 'mat4('+`self.mlist`[1:-1]+')'
def __str__(self):
fmt="%9.4f"
m11,m12,m13,m14,m21,m22,m23,m24,m31,m32,m33,m34,m41,m42,m43,m44 = self.mlist
return ('['+fmt%m11+', '+fmt%m12+', '+fmt%m13+', '+fmt%m14+']\n'+
'['+fmt%m21+', '+fmt%m22+', '+fmt%m23+', '+fmt%m24+']\n'+
'['+fmt%m31+', '+fmt%m32+', '+fmt%m33+', '+fmt%m34+']\n'+
'['+fmt%m41+', '+fmt%m42+', '+fmt%m43+', '+fmt%m44+']')
def __eq__(self, other):
"""== operator"""
if isinstance(other, mat4):
return self.mlist==other.mlist
else:
return 0
def __ne__(self, other):
"""!= operator"""
if isinstance(other, mat4):
return self.mlist!=other.mlist
else:
return 1
def __add__(self, other):
"""Matrix addition.
>>> M=mat4(1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16)
>>> print M+M
[ 2.0000, 4.0000, 6.0000, 8.0000]
[ 10.0000, 12.0000, 14.0000, 16.0000]
[ 18.0000, 20.0000, 22.0000, 24.0000]
[ 26.0000, 28.0000, 30.0000, 32.0000]
"""
if isinstance(other, mat4):
return mat4(map(lambda x,y: x+y, self.mlist, other.mlist))
else:
raise TypeError, "unsupported operand type for +"
def __sub__(self, other):
"""Matrix subtraction.
>>> M=mat4(1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16)
>>> print M-M
[ 0.0000, 0.0000, 0.0000, 0.0000]
[ 0.0000, 0.0000, 0.0000, 0.0000]
[ 0.0000, 0.0000, 0.0000, 0.0000]
[ 0.0000, 0.0000, 0.0000, 0.0000]
"""
if isinstance(other, mat4):
return mat4(map(lambda x,y: x-y, self.mlist, other.mlist))
else:
raise TypeError, "unsupported operand type for -"
def __mul__(self, other):
"""Multiplication.
>>> M=mat4(1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16)
>>> print M*2.0
[ 2.0000, 4.0000, 6.0000, 8.0000]
[ 10.0000, 12.0000, 14.0000, 16.0000]
[ 18.0000, 20.0000, 22.0000, 24.0000]
[ 26.0000, 28.0000, 30.0000, 32.0000]
>>> print 2.0*M
[ 2.0000, 4.0000, 6.0000, 8.0000]
[ 10.0000, 12.0000, 14.0000, 16.0000]
[ 18.0000, 20.0000, 22.0000, 24.0000]
[ 26.0000, 28.0000, 30.0000, 32.0000]
>>> print M*M
[ 90.0000, 100.0000, 110.0000, 120.0000]
[ 202.0000, 228.0000, 254.0000, 280.0000]
[ 314.0000, 356.0000, 398.0000, 440.0000]
[ 426.0000, 484.0000, 542.0000, 600.0000]
>>> print M*_vec3(1,2,3)
(0.1765, 0.4510, 0.7255)
>>> print _vec3(1,2,3)*M
(0.7083, 0.8056, 0.9028)
"""
T = type(other)
# mat4*scalar
if T==types.FloatType or T==types.IntType or T==types.LongType:
return mat4(map(lambda x,other=other: x*other, self.mlist))
# mat4*vec3
if isinstance(other, _vec3):
m11,m12,m13,m14,m21,m22,m23,m24,m31,m32,m33,m34,m41,m42,m43,m44 = self.mlist
w = float(m41*other.x + m42*other.y + m43*other.z + m44)
return _vec3(m11*other.x + m12*other.y + m13*other.z + m14,
m21*other.x + m22*other.y + m23*other.z + m24,
m31*other.x + m32*other.y + m33*other.z + m34)/w
# mat4*vec4
if isinstance(other, _vec4):
m11,m12,m13,m14,m21,m22,m23,m24,m31,m32,m33,m34,m41,m42,m43,m44 = self.mlist
return _vec4(m11*other.x + m12*other.y + m13*other.z + m14*other.w,
m21*other.x + m22*other.y + m23*other.z + m24*other.w,
m31*other.x + m32*other.y + m33*other.z + m34*other.w,
m41*other.x + m42*other.y + m43*other.z + m44*other.w)
# mat4*mat4
if isinstance(other, mat4):
m11,m12,m13,m14,m21,m22,m23,m24,m31,m32,m33,m34,m41,m42,m43,m44 = self.mlist
n11,n12,n13,n14,n21,n22,n23,n24,n31,n32,n33,n34,n41,n42,n43,n44 = other.mlist
return mat4( m11*n11+m12*n21+m13*n31+m14*n41,
m11*n12+m12*n22+m13*n32+m14*n42,
m11*n13+m12*n23+m13*n33+m14*n43,
m11*n14+m12*n24+m13*n34+m14*n44,
m21*n11+m22*n21+m23*n31+m24*n41,
m21*n12+m22*n22+m23*n32+m24*n42,
m21*n13+m22*n23+m23*n33+m24*n43,
m21*n14+m22*n24+m23*n34+m24*n44,
m31*n11+m32*n21+m33*n31+m34*n41,
m31*n12+m32*n22+m33*n32+m34*n42,
m31*n13+m32*n23+m33*n33+m34*n43,
m31*n14+m32*n24+m33*n34+m34*n44,
m41*n11+m42*n21+m43*n31+m44*n41,
m41*n12+m42*n22+m43*n32+m44*n42,
m41*n13+m42*n23+m43*n33+m44*n43,
m41*n14+m42*n24+m43*n34+m44*n44)
# unsupported
else:
raise TypeError, "unsupported operand type for *"
def __rmul__(self, other):
T = type(other)
# scalar*mat4
if T==types.FloatType or T==types.IntType or T==types.LongType:
return mat4(map(lambda x,other=other: other*x, self.mlist))
# vec4*mat4
if isinstance(other, _vec4):
m11,m12,m13,m14,m21,m22,m23,m24,m31,m32,m33,m34,m41,m42,m43,m44 = self.mlist
return _vec4(other.x*m11 + other.y*m21 + other.z*m31 + other.w*m41,
other.x*m12 + other.y*m22 + other.z*m32 + other.w*m42,
other.x*m13 + other.y*m23 + other.z*m33 + other.w*m43,
other.x*m14 + other.y*m24 + other.z*m34 + other.w*m44)
# vec3*mat4
if isinstance(other, _vec3):
m11,m12,m13,m14,m21,m22,m23,m24,m31,m32,m33,m34,m41,m42,m43,m44 = self.mlist
w = float(other.x*m14 + other.y*m24 + other.z*m34 + m44)
return _vec3(other.x*m11 + other.y*m21 + other.z*m31 + m41,
other.x*m12 + other.y*m22 + other.z*m32 + m42,
other.x*m13 + other.y*m23 + other.z*m33 + m43)/w
# mat4*mat4
if isinstance(other, mat4):
return self.__mul__(other)
# unsupported
else:
raise TypeError, "unsupported operand type for *"
def __div__(self, other):
"""Division
>>> M=mat4(1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16)
>>> print M/2.0
[ 0.5000, 1.0000, 1.5000, 2.0000]
[ 2.5000, 3.0000, 3.5000, 4.0000]
[ 4.5000, 5.0000, 5.5000, 6.0000]
[ 6.5000, 7.0000, 7.5000, 8.0000]
"""
T = type(other)
# mat4/scalar
if T==types.FloatType or T==types.IntType or T==types.LongType:
return mat4(map(lambda x,other=other: x/other, self.mlist))
# unsupported
else:
raise TypeError, "unsupported operand type for /"
def __mod__(self, other):
"""Modulo.
>>> M=mat4(1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16)
>>> print M%5.0
[ 1.0000, 2.0000, 3.0000, 4.0000]
[ 0.0000, 1.0000, 2.0000, 3.0000]
[ 4.0000, 0.0000, 1.0000, 2.0000]
[ 3.0000, 4.0000, 0.0000, 1.0000]
"""
T = type(other)
# mat4%scalar
if T==types.FloatType or T==types.IntType or T==types.LongType:
return mat4(map(lambda x,other=other: x%other, self.mlist))
# unsupported
else:
raise TypeError, "unsupported operand type for %"
def __neg__(self):
"""Negation.
>>> M=mat4(1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16)
>>> print -M
[ -1.0000, -2.0000, -3.0000, -4.0000]
[ -5.0000, -6.0000, -7.0000, -8.0000]
[ -9.0000, -10.0000, -11.0000, -12.0000]
[ -13.0000, -14.0000, -15.0000, -16.0000]
"""
return mat4(map(lambda x: -x, self.mlist))
def __pos__(self):
"""
>>> M=mat4(1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16)
>>> print +M
[ 1.0000, 2.0000, 3.0000, 4.0000]
[ 5.0000, 6.0000, 7.0000, 8.0000]
[ 9.0000, 10.0000, 11.0000, 12.0000]
[ 13.0000, 14.0000, 15.0000, 16.0000]
"""
return mat4(map(lambda x: +x, self.mlist))
def __len__(self):
return 4
def __getitem__(self, key):
if type(key)==types.IntType:
if key<0 or key>3:
raise IndexError,"index out of range"
m=self.mlist
if key==0: return [m[0],m[4],m[8],m[12]]
elif key==1: return [m[1],m[5],m[9],m[13]]
elif key==2: return [m[2],m[6],m[10],m[14]]
elif key==3: return [m[3],m[7],m[11],m[15]]
elif type(key)==types.TupleType:
i,j=key
if i<0 or i>3 or j<0 or j>3:
raise IndexError, "index out of range"
return self.mlist[i*4+j]
else:
raise TypeError,"index must be integer or 2-tuple"
def __setitem__(self, key, value):
if type(key)==types.IntType:
if key<0 or key>3:
raise IndexError,"index out of range"
m=self.mlist
if key==0: m[0],m[4],m[8],m[12]=value
elif key==1: m[1],m[5],m[9],m[13]=value
elif key==2: m[2],m[6],m[10],m[14]=value
elif key==3: m[3],m[7],m[11],m[15]=value
elif type(key)==types.TupleType:
i,j=key
if i<0 or i>3 or j<0 or j>3:
raise IndexError, "index out of range"
self.mlist[i*4+j] = value
else:
raise TypeError,"index must be integer or 2-tuple"
def getRow(self, idx):
"""Return row (as vec4)."""
m=self.mlist
if idx==0: return _vec4(m[0], m[1], m[2], m[3])
elif idx==1: return _vec4(m[4], m[5], m[6], m[7])
elif idx==2: return _vec4(m[8], m[9], m[10], m[11])
elif idx==3: return _vec4(m[12], m[13], m[14], m[15])
else:
raise IndexError,"index out of range"
def setRow(self, idx, value):
"""Set row."""
m=self.mlist
if idx==0: m[0],m[1],m[2],m[3] = value
elif idx==1: m[4],m[5],m[6],m[7] = value
elif idx==2: m[8],m[9],m[10],m[11] = value
elif idx==3: m[12],m[13],m[14],m[15] = value
else:
raise IndexError,"index out of range"
def getColumn(self, idx):
"""Return column (as vec4)."""
m=self.mlist
if idx==0: return _vec4(m[0], m[4], m[8], m[12])
elif idx==1: return _vec4(m[1], m[5], m[9], m[13])
elif idx==2: return _vec4(m[2], m[6], m[10], m[14])
elif idx==3: return _vec4(m[3], m[7], m[11], m[15])
else:
raise IndexError,"index out of range"
def setColumn(self, idx, value):
"""Set column."""
m=self.mlist
if idx==0: m[0],m[4],m[8],m[12] = value
elif idx==1: m[1],m[5],m[9],m[13] = value
elif idx==2: m[2],m[6],m[10],m[14] = value
elif idx==3: m[3],m[7],m[11],m[15] = value
else:
raise IndexError,"index out of range"
def toList(self, rowmajor=0):
"""Return a list containing the matrix elements.
By default the list is in column-major order (which can directly be
used in OpenGL or RenderMan). If you set the optional argument
rowmajor to 1, you'll get the list in row-major order.
>>> M=mat4(1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16)
>>> print M.toList()
[1, 5, 9, 13, 2, 6, 10, 14, 3, 7, 11, 15, 4, 8, 12, 16]
>>> print M.toList(rowmajor=1)
[1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16]
"""
if rowmajor:
return copy.copy(self.mlist)
else:
return self.transpose().mlist
def identity(self):
"""Return identity matrix.
>>> print mat4().identity()
[ 1.0000, 0.0000, 0.0000, 0.0000]
[ 0.0000, 1.0000, 0.0000, 0.0000]
[ 0.0000, 0.0000, 1.0000, 0.0000]
[ 0.0000, 0.0000, 0.0000, 1.0000]
"""
return mat4(1.0, 0.0, 0.0, 0.0,
0.0, 1.0, 0.0, 0.0,
0.0, 0.0, 1.0, 0.0,
0.0, 0.0, 0.0, 1.0)
def transpose(self):
"""Transpose matrix.
>>> M=mat4(1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16)
>>> print M.transpose()
[ 1.0000, 5.0000, 9.0000, 13.0000]
[ 2.0000, 6.0000, 10.0000, 14.0000]
[ 3.0000, 7.0000, 11.0000, 15.0000]
[ 4.0000, 8.0000, 12.0000, 16.0000]
"""
m11,m12,m13,m14,m21,m22,m23,m24,m31,m32,m33,m34,m41,m42,m43,m44 = self.mlist
return mat4(m11,m21,m31,m41,
m12,m22,m32,m42,
m13,m23,m33,m43,
m14,m24,m34,m44)
def determinant(self):
"""Return determinant.
>>> M=mat4(2.0,0,0,0, 0,2.0,0,0, 0,0,2.0,0, 0,0,0,2.0)
>>> print M.determinant()
16.0
"""
m11,m12,m13,m14,m21,m22,m23,m24,m31,m32,m33,m34,m41,m42,m43,m44 = self.mlist
return m11*m22*m33*m44 \
-m11*m22*m34*m43 \
+m11*m23*m34*m42 \
-m11*m23*m32*m44 \
+m11*m24*m32*m43 \
-m11*m24*m33*m42 \
-m12*m23*m34*m41 \
+m12*m23*m31*m44 \
-m12*m24*m31*m43 \
+m12*m24*m33*m41 \
-m12*m21*m33*m44 \
+m12*m21*m34*m43 \
+m13*m24*m31*m42 \
-m13*m24*m32*m41 \
+m13*m21*m32*m44 \
-m13*m21*m34*m42 \
+m13*m22*m34*m41 \
-m13*m22*m31*m44 \
-m14*m21*m32*m43 \
+m14*m21*m33*m42 \
-m14*m22*m33*m41 \
+m14*m22*m31*m43 \
-m14*m23*m31*m42 \
+m14*m23*m32*m41
def _submat(self, i,j):
M=_mat3()
for k in range(3):
for l in range(3):
t=(k,l)
if k>=i:
t=(k+1,t[1])
if l>=j:
t=(t[0],l+1)
M[k,l] = self[t]
return M
def inverse(self):
"""Return inverse matrix.
>>> M=mat4(0,-2.0,0,0, 2.0,0,0,0, 0,0,2,0, 0,0,0,2)
>>> print M.inverse()
[ 0.0000, 0.5000, 0.0000, 0.0000]
[ -0.5000, 0.0000, 0.0000, 0.0000]
[ 0.0000, 0.0000, 0.5000, 0.0000]
[ 0.0000, 0.0000, 0.0000, 0.5000]
"""
Mi=mat4()
d=self.determinant()
for i in range(4):
for j in range(4):
sign=1-((i+j)%2)*2
m3=self._submat(i,j)
Mi[j,i]=sign*m3.determinant()/d
return Mi
def translation(self, t):
"""Return translation matrix."""
return mat4(1.0, 0.0, 0.0, t.x,
0.0, 1.0, 0.0, t.y,
0.0, 0.0, 1.0, t.z,
0.0, 0.0, 0.0, 1.0)
def scaling(self, s):
"""Return scaling matrix."""
return mat4(s.x, 0.0, 0.0, 0.0,
0.0, s.y, 0.0, 0.0,
0.0, 0.0, s.z, 0.0,
0.0, 0.0, 0.0, 1.0)
def rotation(self, angle, axis):
"""Return rotation matrix.
angle must be given in radians. axis should be of type vec3.
"""
sqr_a = axis.x*axis.x
sqr_b = axis.y*axis.y
sqr_c = axis.z*axis.z
len2 = sqr_a+sqr_b+sqr_c
k2 = math.cos(angle)
k1 = (1.0-k2)/len2
k3 = math.sin(angle)/math.sqrt(len2)
k1ab = k1*axis.x*axis.y
k1ac = k1*axis.x*axis.z
k1bc = k1*axis.y*axis.z
k3a = k3*axis.x
k3b = k3*axis.y
k3c = k3*axis.z
return mat4( k1*sqr_a+k2, k1ab-k3c, k1ac+k3b, 0.0,
k1ab+k3c, k1*sqr_b+k2, k1bc-k3a, 0.0,
k1ac-k3b, k1bc+k3a, k1*sqr_c+k2, 0.0,
0.0, 0.0, 0.0, 1.0)
def translate(self, t):
"""Concatenate a translation."""
m11,m12,m13,m14,m21,m22,m23,m24,m31,m32,m33,m34,m41,m42,m43,m44 = self.mlist
self.mlist[3] = m11*t.x + m12*t.y + m13*t.z + m14
self.mlist[7] = m21*t.x + m22*t.y + m23*t.z + m24
self.mlist[11] = m31*t.x + m32*t.y + m33*t.z + m34
self.mlist[15] = m41*t.x + m42*t.y + m43*t.z + m44
return self
def scale(self, s):
"""Concatenate a scaling."""
self.mlist[0] *= s.x
self.mlist[1] *= s.y
self.mlist[2] *= s.z
self.mlist[4] *= s.x
self.mlist[5] *= s.y
self.mlist[6] *= s.z
self.mlist[8] *= s.x
self.mlist[9] *= s.y
self.mlist[10] *= s.z
self.mlist[12] *= s.x
self.mlist[13] *= s.y
self.mlist[14] *= s.z
return self
def rotate(self, angle, axis):
"""Concatenate a rotation.
angle must be given in radians. axis should be of type vec3.
"""
R=self.rotation(angle, axis)
self.mlist = (self*R).mlist
return self
def frustum(self, left, right, bottom, top, near, far):
"""equivalent to the OpenGL command glFrustum()"""
return mat4( (2.0*near)/(right-left), 0.0, float(right+left)/(right-left), 0.0,
0.0, (2.0*near)/(top-bottom), float(top+bottom)/(top-bottom), 0.0,
0.0, 0.0, -float(far+near)/(far-near), -(2.0*far*near)/(far-near),
0.0, 0.0, -1.0, 0.0)
def perspective(self, fovy, aspect, near, far):
"""von Mesa ubernommen (glu.c)"""
top = near * math.tan(fovy * math.pi / 360.0)
bottom = -top
left = bottom * aspect
right = top * aspect
return self.frustum(left, right, bottom, top, near, far)
def lookAt(self, pos, target, up=_vec3(0,0,1)):
"""Look from pos to target.
The resulting transformation moves the origin to pos and
rotates so that The z-axis points to target. The y-axis is
as close as possible to the up vector.
"""
dir = (target - pos).normalize()
up = up.normalize()
up -= (up * dir) * dir
try:
up = up.normalize()
except:
# We're looking along the up direction, so choose
# an arbitrary direction that is perpendicular to dir
# as new up.
up = dir.ortho()
right = up.cross(dir).normalize()
self.mlist=[right.x, up.x, dir.x, pos.x,
right.y, up.y, dir.y, pos.y,
right.z, up.z, dir.z, pos.z,
0.0, 0.0, 0.0, 1.0]
return self
def ortho(self):
"""Return a matrix with orthogonal base vectors.
Makes the x-, y- and z-axis orthogonal.
The fourth column and row remain untouched.
"""
m11,m12,m13,m14,m21,m22,m23,m24,m31,m32,m33,m34,m41,m42,m43,m44 = self.mlist
x = _vec3(m11, m21, m31)
y = _vec3(m12, m22, m32)
z = _vec3(m13, m23, m33)
xl = x.length()
xl*=xl
y = y - ((x*y)/xl)*x
z = z - ((x*z)/xl)*x
yl = y.length()
yl*=yl
z = z - ((y*z)/yl)*y
return mat4( x.x, y.x, z.x, m14,
x.y, y.y, z.y, m24,
x.z, y.z, z.z, m34,
m41, m42, m43, m44)
def decompose(self):
"""Decomposes the matrix into a translation, rotation and scaling part.
Returns a tuple (translation, rotation, scaling). The
translation and scaling parts are given as vec3's, the rotation
is still given as a mat4.
"""
dummy = self.ortho()
dummy.setRow(3,_vec4(0.0, 0.0, 0.0, 1.0))
x = dummy.getColumn(0)
y = dummy.getColumn(1)
z = dummy.getColumn(2)
xl = x.length()
yl = y.length()
zl = z.length()
scale = _vec3(xl,yl,zl)
x/=xl
y/=yl
z/=zl
dummy.setColumn(0,x)
dummy.setColumn(1,y)
dummy.setColumn(2,z)
if dummy.determinant()<0.0:
dummy.setColumn(0,-x)
scale.x=-scale.x
return (_vec3(self.mlist[3], self.mlist[7], self.mlist[11]),
dummy,
scale)
def getMat3(self):
"""Convert to mat3 by discarding 4th row and column.
"""
m11,m12,m13,m14,m21,m22,m23,m24,m31,m32,m33,m34,m41,m42,m43,m44 = self.mlist
return _mat3(m11,m12,m13,
m21,m22,m23,
m31,m32,m33)
######################################################################
def _test():
import doctest, mat4
failed, total = doctest.testmod(mat4)
print "%d/%d failed" % (failed, total)
if __name__=="__main__":
_test()
| 36.057692 | 109 | 0.454974 | 3,579 | 24,375 | 3.065381 | 0.100866 | 0.026069 | 0.029806 | 0.030262 | 0.468508 | 0.458481 | 0.442075 | 0.436514 | 0.425303 | 0.390028 | 0 | 0.206001 | 0.379241 | 24,375 | 675 | 110 | 36.111111 | 0.519067 | 0.027528 | 0 | 0.227041 | 0 | 0 | 0.033511 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.012755 | null | null | 0.002551 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
65958e7861004b1f3934ff47c4a5e6dfe2a86170 | 239 | py | Python | iot/common_functions/all_imports.py | sankaet/IOT-DB | a554f49b9c25ae1a9a91b6a2564489b999da03bd | [
"MIT"
] | 1 | 2016-10-26T23:10:57.000Z | 2016-10-26T23:10:57.000Z | iot/common_functions/all_imports.py | sankaet/IOT-DB | a554f49b9c25ae1a9a91b6a2564489b999da03bd | [
"MIT"
] | null | null | null | iot/common_functions/all_imports.py | sankaet/IOT-DB | a554f49b9c25ae1a9a91b6a2564489b999da03bd | [
"MIT"
] | null | null | null | from pymongo import MongoClient
from bson import ObjectId
from bson.json_util import dumps
from json import loads
client = MongoClient('localhost', 27017)
IOT_DB = client.iot_db
IOT_SCHEMAS = IOT_DB.iot_schemas
IOT_DATA = IOT_DB.iot_data | 23.9 | 40 | 0.820084 | 39 | 239 | 4.794872 | 0.435897 | 0.106952 | 0.128342 | 0.160428 | 0.176471 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023923 | 0.125523 | 239 | 10 | 41 | 23.9 | 0.870813 | 0 | 0 | 0 | 0 | 0 | 0.0375 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
659ffa3c1d30e46aa593ca5d32d54d54bd7d5e35 | 218 | py | Python | plugins/pick/choices.py | rbracken/internbot | 58b802e0dd7597ace12acd9342bb938e2f33c25d | [
"BSD-2-Clause"
] | 1 | 2016-09-24T16:00:06.000Z | 2016-09-24T16:00:06.000Z | plugins/pick/choices.py | rbracken/internbot | 58b802e0dd7597ace12acd9342bb938e2f33c25d | [
"BSD-2-Clause"
] | null | null | null | plugins/pick/choices.py | rbracken/internbot | 58b802e0dd7597ace12acd9342bb938e2f33c25d | [
"BSD-2-Clause"
] | null | null | null | # Add your own choices here!
fruit = ["apples", "oranges", "pears", "grapes", "blueberries"]
lunch = ["pho", "timmies", "thai", "burgers", "buffet!", "indian", "montanas"]
situations = {"fruit":fruit, "lunch":lunch}
| 36.333333 | 79 | 0.62844 | 24 | 218 | 5.708333 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.12844 | 218 | 5 | 80 | 43.6 | 0.721053 | 0.119266 | 0 | 0 | 0 | 0 | 0.457895 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
65ac8cde7af97a0e6637820254f0d7a893315eae | 143 | py | Python | src/settings.py | MichaelJWelsh/bot-evolution | 6d8e3449fc5350f47e91a6aa7a3e8b719c0c2f16 | [
"MIT"
] | 151 | 2017-05-01T02:47:34.000Z | 2022-01-21T17:08:11.000Z | src/settings.py | MichaelJWelsh/bot-evolution | 6d8e3449fc5350f47e91a6aa7a3e8b719c0c2f16 | [
"MIT"
] | null | null | null | src/settings.py | MichaelJWelsh/bot-evolution | 6d8e3449fc5350f47e91a6aa7a3e8b719c0c2f16 | [
"MIT"
] | 26 | 2017-05-01T21:41:02.000Z | 2021-12-21T11:40:20.000Z | """
This module contains the general settings used across modules.
"""
FPS = 60
WINDOW_WIDTH = 1100
WINDOW_HEIGHT = 600
TIME_MULTIPLIER = 1.0
| 15.888889 | 62 | 0.748252 | 21 | 143 | 4.952381 | 0.952381 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.092437 | 0.167832 | 143 | 8 | 63 | 17.875 | 0.781513 | 0.433566 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
65ae5ae925181ff1d726f472dfbdd87ce820d687 | 9,535 | py | Python | aiida/orm/entities.py | PercivalN/aiida-core | b215ed5a7ce9342bb7f671b67e95c1f474cc5940 | [
"BSD-2-Clause"
] | 1 | 2019-07-31T04:08:13.000Z | 2019-07-31T04:08:13.000Z | aiida/orm/entities.py | PercivalN/aiida-core | b215ed5a7ce9342bb7f671b67e95c1f474cc5940 | [
"BSD-2-Clause"
] | null | null | null | aiida/orm/entities.py | PercivalN/aiida-core | b215ed5a7ce9342bb7f671b67e95c1f474cc5940 | [
"BSD-2-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
###########################################################################
# Copyright (c), The AiiDA team. All rights reserved. #
# This file is part of the AiiDA code. #
# #
# The code is hosted on GitHub at https://github.com/aiidateam/aiida-core #
# For further information on the license, see the LICENSE.txt file #
# For further information please visit http://www.aiida.net #
###########################################################################
"""Module for all common top level AiiDA entity classes and methods"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import typing
from plumpy.base.utils import super_check, call_with_super_check
from aiida.common import exceptions
from aiida.common import datastructures
from aiida.common.lang import classproperty, type_check
from aiida.manage.manager import get_manager
__all__ = ('Entity', 'Collection')
EntityType = typing.TypeVar('EntityType') # pylint: disable=invalid-name
class Collection(typing.Generic[EntityType]): # pylint: disable=unsubscriptable-object
"""Container class that represents the collection of objects of a particular type."""
# A store for any backend specific collections that already exist
_COLLECTIONS = datastructures.LazyStore()
@classmethod
def get_collection(cls, entity_type, backend):
"""
Get the collection for a given entity type and backend instance
:param entity_type: the entity type e.g. User, Computer, etc
:type entity_type: :class:`aiida.orm.Entity`
:param backend: the backend instance to get the collection for
:type backend: :class:`aiida.orm.implementation.Backend`
:return: a new collection with the new backend
:rtype: :class:`aiida.orm.Collection`
"""
# Lazily get the collection i.e. create only if we haven't done so yet
return cls._COLLECTIONS.get((entity_type, backend), lambda: entity_type.Collection(backend, entity_type))
def __init__(self, backend, entity_class):
""" Construct a new entity collection.
:param backend: the backend instance to get the collection for
:type backend: :class:`aiida.orm.implementation.Backend`
:param entity_class: the entity type e.g. User, Computer, etc
:type entity_class: :class:`aiida.orm.Entity`
"""
assert issubclass(entity_class, Entity), "Must provide an entity type"
self._backend = backend or get_manager().get_backend()
self._entity_type = entity_class
def __call__(self, backend):
""" Create a new objects collection using a new backend.
:param backend: the backend instance to get the collection for
:type backend: :class:`aiida.orm.implementation.Backend`
:return: a new collection with the new backend
:rtype: :class:`aiida.orm.Collection`
"""
if backend is self._backend:
# Special case if they actually want the same collection
return self
return self.get_collection(self.entity_type, backend)
@property
def backend(self):
"""Return the backend.
:return: the backend instance of this collection
:rtype: :class:`aiida.orm.implementation.Backend`
"""
return self._backend
@property
def entity_type(self):
"""The entity type.
:rtype: :class:`aiida.orm.Entity`
"""
return self._entity_type
def query(self):
"""
Get a query builder for the objects of this collection
:return: a new query builder instance
:rtype: :class:`aiida.orm.QueryBuilder`
"""
# pylint: disable=no-self-use
from . import querybuilder
query = querybuilder.QueryBuilder()
query.append(self._entity_type, project='*')
return query
def get(self, **filters):
"""
Get a single collection entry that matches the filter criteria
:param filters: the filters identifying the object to get
:type filters: dict
:return: the entry
"""
res = self.find(filters=filters)
if not res:
raise exceptions.NotExistent("No {} with filter '{}' found".format(self.entity_type.__name__, filters))
if len(res) > 1:
raise exceptions.MultipleObjectsError("Multiple {}s found with the same id '{}'".format(
self.entity_type.__name__, id))
return res[0]
def find(self, filters=None, order_by=None, limit=None):
"""
Find collection entries matching the filter criteria
:param filters: the keyword value pair filters to match
:type filters: dict
:param order_by: a list of (key, direction) pairs specifying the sort order
:type order_by: list
:param limit: the maximum number of results to return
:type limit: int
:return: a list of resulting matches
:rtype: list
"""
query = self.query()
filters = filters or {}
query.add_filter(self.entity_type, filters)
if order_by:
query.order_by({self.entity_type: order_by})
if limit:
query.limit(limit)
return [_[0] for _ in query.all()]
def all(self):
"""
Get all entities in this collection
:return: A collection of users matching the criteria
:rtype: list
"""
return [_[0] for _ in self.query().all()]
class Entity(object): # pylint: disable=useless-object-inheritance
"""An AiiDA entity"""
_objects = None
# Define our collection type
Collection = Collection
@classproperty
def objects(cls, backend=None): # pylint: disable=no-self-use, no-self-argument
"""
Get a collection for objects of this type.
:param backend: the optional backend to use (otherwise use default)
:type backend: :class:`aiida.orm.implementation.Backend`
:return: an object that can be used to access entities of this type
:rtype: :class:`aiida.orm.Collection`
"""
backend = backend or get_manager().get_backend()
return cls.Collection.get_collection(cls, backend)
@classmethod
def get(cls, **kwargs):
# pylint: disable=redefined-builtin, invalid-name
return cls.objects.get(**kwargs) # pylint: disable=no-member
@classmethod
def from_backend_entity(cls, backend_entity):
"""
Construct an entity from a backend entity instance
:param backend_entity: the backend entity
:return: an AiiDA entity instance
"""
from . import implementation
type_check(backend_entity, implementation.BackendEntity)
entity = cls.__new__(cls)
entity.init_from_backend(backend_entity)
call_with_super_check(entity.initialize)
return entity
def __init__(self, backend_entity):
"""
:param backend_entity: the backend model supporting this entity
:type backend_entity: :class:`aiida.orm.implementation.BackendEntity`
"""
self._backend_entity = backend_entity
call_with_super_check(self.initialize)
def init_from_backend(self, backend_entity):
"""
:param backend_entity: the backend model supporting this entity
:type backend_entity: :class:`aiida.orm.implementation.BackendEntity`
"""
self._backend_entity = backend_entity
@super_check
def initialize(self):
"""Initialize instance attributes.
This will be called after the constructor is called or an entity is created from an existing backend entity.
"""
@property
def id(self):
"""Return the id for this entity.
This identifier is guaranteed to be unique amongst entities of the same type for a single backend instance.
:return: the entity's id
"""
# pylint: disable=redefined-builtin, invalid-name
return self._backend_entity.id
@property
def pk(self):
"""Return the primary key for this entity.
This identifier is guaranteed to be unique amongst entities of the same type for a single backend instance.
:return: the entity's principal key
"""
return self.id
@property
def uuid(self):
"""Return the UUID for this entity.
This identifier is unique across all entities types and backend instances.
:return: the entity uuid
:rtype: :class:`uuid.UUID`
"""
return self._backend_entity.uuid
def store(self):
"""Store the entity."""
self._backend_entity.store()
return self
@property
def is_stored(self):
"""Return whether the entity is stored.
:return: boolean, True if stored, False otherwise
:rtype: bool
"""
return self._backend_entity.is_stored
@property
def backend(self):
"""
Get the backend for this entity
:return: the backend instance
"""
return self._backend_entity.backend
@property
def backend_entity(self):
"""
Get the implementing class for this object
:return: the class model
"""
return self._backend_entity
| 32.431973 | 116 | 0.627687 | 1,121 | 9,535 | 5.214095 | 0.211418 | 0.060051 | 0.031138 | 0.032335 | 0.282293 | 0.24722 | 0.216082 | 0.188024 | 0.179299 | 0.179299 | 0 | 0.000722 | 0.273309 | 9,535 | 293 | 117 | 32.542662 | 0.842834 | 0.476875 | 0 | 0.17 | 0 | 0 | 0.030538 | 0 | 0 | 0 | 0 | 0 | 0.01 | 1 | 0.22 | false | 0 | 0.11 | 0.01 | 0.57 | 0.01 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
65be1ffede01306450a5f34b42845bf53968f1d8 | 248 | py | Python | pre_definition/solve_caller.py | sr9000/stepik_code_task_baking | 60a5197f659db1734132eeb9d82624f1b7aaeb3f | [
"MIT"
] | null | null | null | pre_definition/solve_caller.py | sr9000/stepik_code_task_baking | 60a5197f659db1734132eeb9d82624f1b7aaeb3f | [
"MIT"
] | null | null | null | pre_definition/solve_caller.py | sr9000/stepik_code_task_baking | 60a5197f659db1734132eeb9d82624f1b7aaeb3f | [
"MIT"
] | null | null | null | from collections.abc import Iterable as ABCIterable
def call_with_args(func, args):
if isinstance(args, dict):
return func(**args)
elif isinstance(args, ABCIterable):
return func(*args)
else:
return func(args)
| 22.545455 | 51 | 0.665323 | 31 | 248 | 5.258065 | 0.580645 | 0.196319 | 0.257669 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.241935 | 248 | 10 | 52 | 24.8 | 0.867021 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.125 | 0 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
65c451b4c4af62ac430c54bacf4793ebfef0c2ef | 48,201 | py | Python | pysnmp-with-texts/DOCS-LOADBALANCING-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 8 | 2019-05-09T17:04:00.000Z | 2021-06-09T06:50:51.000Z | pysnmp-with-texts/DOCS-LOADBALANCING-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 4 | 2019-05-31T16:42:59.000Z | 2020-01-31T21:57:17.000Z | pysnmp-with-texts/DOCS-LOADBALANCING-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module DOCS-LOADBALANCING-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/DOCS-LOADBALANCING-MIB
# Produced by pysmi-0.3.4 at Wed May 1 12:53:17 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
Integer, ObjectIdentifier, OctetString = mibBuilder.importSymbols("ASN1", "Integer", "ObjectIdentifier", "OctetString")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ConstraintsIntersection, ValueSizeConstraint, ValueRangeConstraint, ConstraintsUnion, SingleValueConstraint = mibBuilder.importSymbols("ASN1-REFINEMENT", "ConstraintsIntersection", "ValueSizeConstraint", "ValueRangeConstraint", "ConstraintsUnion", "SingleValueConstraint")
clabProjDocsis, = mibBuilder.importSymbols("CLAB-DEF-MIB", "clabProjDocsis")
docsIfCmtsCmStatusIndex, docsIfCmtsCmStatusEntry = mibBuilder.importSymbols("DOCS-IF-MIB", "docsIfCmtsCmStatusIndex", "docsIfCmtsCmStatusEntry")
InterfaceIndex, = mibBuilder.importSymbols("IF-MIB", "InterfaceIndex")
ObjectGroup, NotificationGroup, ModuleCompliance = mibBuilder.importSymbols("SNMPv2-CONF", "ObjectGroup", "NotificationGroup", "ModuleCompliance")
ModuleIdentity, Gauge32, Counter32, IpAddress, MibIdentifier, MibScalar, MibTable, MibTableRow, MibTableColumn, NotificationType, Bits, TimeTicks, Counter64, Unsigned32, zeroDotZero, Integer32, iso, ObjectIdentity = mibBuilder.importSymbols("SNMPv2-SMI", "ModuleIdentity", "Gauge32", "Counter32", "IpAddress", "MibIdentifier", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "NotificationType", "Bits", "TimeTicks", "Counter64", "Unsigned32", "zeroDotZero", "Integer32", "iso", "ObjectIdentity")
TimeStamp, TruthValue, TextualConvention, DisplayString, RowStatus, RowPointer, MacAddress = mibBuilder.importSymbols("SNMPv2-TC", "TimeStamp", "TruthValue", "TextualConvention", "DisplayString", "RowStatus", "RowPointer", "MacAddress")
docsLoadBalanceMib = ModuleIdentity((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2))
docsLoadBalanceMib.setRevisions(('2004-03-10 17:00',))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
if mibBuilder.loadTexts: docsLoadBalanceMib.setRevisionsDescriptions(('Initial version of this mib module.',))
if mibBuilder.loadTexts: docsLoadBalanceMib.setLastUpdated('200403101700Z')
if mibBuilder.loadTexts: docsLoadBalanceMib.setOrganization('Cable Television Laboratories, Inc')
if mibBuilder.loadTexts: docsLoadBalanceMib.setContactInfo(' Postal: Cable Television Laboratories, Inc. 400 Centennial Parkway Louisville, Colorado 80027-1266 U.S.A. Phone: +1 303-661-9100 Fax: +1 303-661-9199 E-mail: mibs@cablelabs.com')
if mibBuilder.loadTexts: docsLoadBalanceMib.setDescription('This is the MIB Module for the load balancing. Load balancing is manageable on a per-CM basis. Each CM is assigned: a) to a set of channels (a Load Balancing Group) among which it can be moved by the CMTS b) a policy which governs if and when the CM can be moved c) a priority value which can be used by the CMTS in order to select CMs to move.')
class ChannelChgInitTechMap(TextualConvention, Bits):
description = "This textual convention enumerates the Initialization techniques for Dynamic Channel Change (DCC). The techniques are represented by the 5 most significant bits (MSB). Bits 0 through 4 map to initialization techniques 0 through 4. Each bit position represents the internal associated technique as described below: reinitializeMac(0) : Reinitialize the MAC broadcastInitRanging(1): Perform Broadcast initial ranging on new channel before normal operation unicastInitRanging(2) : Perform unicast ranging on new channel before normal operation initRanging(3) : Perform either broadcast or unicast ranging on new channel before normal operation direct(4) : Use the new channel(s) directly without re-initializing or ranging Multiple bits selection in 1's means the CMTS selects the best suitable technique among the selected in a proprietary manner. An empty value or a value with all bits in '0' means no channel changes allowed"
status = 'current'
namedValues = NamedValues(("reinitializeMac", 0), ("broadcastInitRanging", 1), ("unicastInitRanging", 2), ("initRanging", 3), ("direct", 4))
docsLoadBalNotifications = MibIdentifier((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 0))
docsLoadBalMibObjects = MibIdentifier((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1))
docsLoadBalSystem = MibIdentifier((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 1))
docsLoadBalChgOverObjects = MibIdentifier((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 2))
docsLoadBalGrpObjects = MibIdentifier((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3))
docsLoadBalPolicyObjects = MibIdentifier((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 4))
docsLoadBalChgOverGroup = MibIdentifier((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 2, 1))
docsLoadBalEnable = MibScalar((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 1, 1), TruthValue()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: docsLoadBalEnable.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalEnable.setDescription('Setting this object to true(1) enables internal autonomous load balancing operation on this CMTS. Setting it to false(2) disables the autonomous load balancing operations. However moving a cable modem via docsLoadBalChgOverTable is allowed even when this object is set to false(2).')
docsLoadBalChgOverMacAddress = MibScalar((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 2, 1, 1), MacAddress().clone(hexValue="000000000000")).setMaxAccess("readwrite")
if mibBuilder.loadTexts: docsLoadBalChgOverMacAddress.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChgOverMacAddress.setDescription('The mac address of the cable modem that the CMTS instructs to move to a new downstream frequency and/or upstream channel.')
docsLoadBalChgOverDownFrequency = MibScalar((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 2, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1000000000))).setUnits('hertz').setMaxAccess("readwrite")
if mibBuilder.loadTexts: docsLoadBalChgOverDownFrequency.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChgOverDownFrequency.setDescription('The new downstream frequency to which the cable modem is instructed to move. The value 0 indicates that the CMTS does not create a TLV for the downstream frequency in the DCC-REQ message. This object has no meaning when executing UCC operations.')
docsLoadBalChgOverUpChannelId = MibScalar((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 2, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-1, 255)).clone(-1)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: docsLoadBalChgOverUpChannelId.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChgOverUpChannelId.setDescription('The new upstream channel ID to which the cable modem is instructed to move. The value -1 indicates that the CMTS does not create a TLV for the upstream channel ID in the channel change request.')
docsLoadBalChgOverInitTech = MibScalar((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 2, 1, 4), ChannelChgInitTechMap()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: docsLoadBalChgOverInitTech.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChgOverInitTech.setDescription("The initialization technique that the cable modem is instructed to use when performing change over operation. By default this object is initialized with all the defined bits having a value of '1'.")
docsLoadBalChgOverCmd = MibScalar((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 2, 1, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("any", 1), ("dcc", 2), ("ucc", 3))).clone('any')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: docsLoadBalChgOverCmd.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChgOverCmd.setDescription('The change over command that the CMTS is instructed use when performing change over operation. The any(1) value indicates that the CMTS is to use its own algorithm to determine the appropriate command.')
docsLoadBalChgOverCommit = MibScalar((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 2, 1, 6), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: docsLoadBalChgOverCommit.setReference('Data-Over-Cable Service Interface Specifications: Radio Frequency Interface Specification SP-RFIv2.0-I04-030730, Sections C.4.1, 11.4.5.1.')
if mibBuilder.loadTexts: docsLoadBalChgOverCommit.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChgOverCommit.setDescription("The command to execute the DCC/UCC operation when set to true(1). The following are reasons for rejecting an SNMP SET to this object: - The MAC address in docsLoadBalChgOverMacAddr is not an existing MAC address in docsIfCmtsMacToCmEntry. - docsLoadBalChgOverCmd is ucc(3) and docsLoadBalChgOverUpChannelId is '-1', - docsLoadBalChgOverUpChannelId is '-1' and docsLoadBalChgOverDownFrequency is '0'. - DCC/UCC operation is currently being executed for the cable modem, on which the new command is committed, specifically if the value of docsLoadBalChgOverStatusValue is one of: messageSent(1), modemDeparting(4), waitToSendMessage(6). - An UCC operation is committed for a non-existing upstream channel ID or the corresponding ifOperStatus is down(2). - A DCC operation is committed for an invalid or non-existing downstream frequency, or the corresponding ifOperStatus is down(2). In those cases, the SET is rejected with an error code 'commitFailed'. After processing the SNMP SET the information in docsLoadBalChgOverGroup is updated in a corresponding entry in docsLoadBalChgOverStatusEntry. Reading this object always returns false(2).")
docsLoadBalChgOverLastCommit = MibScalar((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 2, 1, 7), TimeStamp()).setMaxAccess("readonly")
if mibBuilder.loadTexts: docsLoadBalChgOverLastCommit.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChgOverLastCommit.setDescription('The value of sysUpTime when docsLoadBalChgOverCommit was last set to true. Zero if never set.')
docsLoadBalChgOverStatusTable = MibTable((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 2, 2), )
if mibBuilder.loadTexts: docsLoadBalChgOverStatusTable.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChgOverStatusTable.setDescription('A table of CMTS operation entries to reports the status of cable modems instructed to move to a new downstream and/or upstream channel. Using the docsLoadBalChgOverGroup objects. An entry in this table is created or updated for the entry with docsIfCmtsCmStatusIndex that correspond to the cable modem MAC address of the Load Balancing operation. docsLoadBalChgOverCommit to true(1).')
docsLoadBalChgOverStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 2, 2, 1), ).setIndexNames((0, "DOCS-IF-MIB", "docsIfCmtsCmStatusIndex"))
if mibBuilder.loadTexts: docsLoadBalChgOverStatusEntry.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChgOverStatusEntry.setDescription('A CMTS operation entry to instruct a cable modem to move to a new downstream frequency and/or upstream channel. An operator can use this to initiate an operation in CMTS to instruct the selected cable modem to move to a new downstream frequency and/or upstream channel.')
docsLoadBalChgOverStatusMacAddr = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 2, 2, 1, 1), MacAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: docsLoadBalChgOverStatusMacAddr.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChgOverStatusMacAddr.setDescription('The mac address set in docsLoadBalChgOverMacAddress.')
docsLoadBalChgOverStatusDownFreq = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 2, 2, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1000000000))).setUnits('hertz').setMaxAccess("readonly")
if mibBuilder.loadTexts: docsLoadBalChgOverStatusDownFreq.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChgOverStatusDownFreq.setDescription('The Downstream frequency set in docsLoadBalChgOverDownFrequency.')
docsLoadBalChgOverStatusUpChnId = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 2, 2, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-1, 255)).clone(-1)).setMaxAccess("readonly")
if mibBuilder.loadTexts: docsLoadBalChgOverStatusUpChnId.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChgOverStatusUpChnId.setDescription('The upstream channel ID set in docsLoadBalChgOverUpChannelId.')
docsLoadBalChgOverStatusInitTech = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 2, 2, 1, 4), ChannelChgInitTechMap()).setMaxAccess("readonly")
if mibBuilder.loadTexts: docsLoadBalChgOverStatusInitTech.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChgOverStatusInitTech.setDescription('The initialization technique set in docsLoadBalChgOverInitTech.')
docsLoadBalChgOverStatusCmd = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 2, 2, 1, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("any", 1), ("dcc", 2), ("ucc", 3))).clone('any')).setMaxAccess("readonly")
if mibBuilder.loadTexts: docsLoadBalChgOverStatusCmd.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChgOverStatusCmd.setDescription('The load balancing command set in docsLoadBalChgOverCmd.')
docsLoadBalChgOverStatusValue = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 2, 2, 1, 6), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7, 8, 9, 10))).clone(namedValues=NamedValues(("messageSent", 1), ("noOpNeeded", 2), ("modemDeparting", 3), ("waitToSendMessage", 4), ("cmOperationRejected", 5), ("cmtsOperationRejected", 6), ("timeOutT13", 7), ("timeOutT15", 8), ("rejectinit", 9), ("success", 10))).clone('waitToSendMessage')).setMaxAccess("readonly")
if mibBuilder.loadTexts: docsLoadBalChgOverStatusValue.setReference('Data-Over-Cable Service Interface Specifications: Radio Frequency Interface Specification SP-RFIv2.0-I04-030730, Sections C.4.1, 11.4.5.1.')
if mibBuilder.loadTexts: docsLoadBalChgOverStatusValue.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChgOverStatusValue.setDescription("The status of the specified DCC/UCC operation. The enumerations are: messageSent(1): The CMTS has sent change over request message to the cable modem. noOpNeed(2): A operation was requested in which neither the DS Frequency nor the Upstream Channel ID was changed. An active value in this entry's row status indicates that no CMTS operation is required. modemDeparting(3): The cable modem has responded with a change over response of either a DCC-RSP with a confirmation code of depart(180) or a UCC-RSP. waitToSendMessage(4): The specified operation is active and CMTS is waiting to send the channel change message with channel info to the cable modem. cmOperationRejected(5): Channel Change (such as DCC or UCC) operation was rejected by the cable modem. cmtsOperationRejected(6) Channel Change (such as DCC or UCC) operation was rejected by the Cable modem Termination System. timeOutT13(7): Failure due to no DCC-RSP with confirmation code depart(180) received prior to expiration of the T13 timer. timeOutT15(8): T15 timer timed out prior to the arrival of a bandwidth request, RNG-REQ message, or DCC-RSP message with confirmation code of arrive(181) from the cable modem. rejectInit(9): DCC operation rejected due to unsupported initialization tech requested. success(10): CMTS received an indication that the CM successfully completed the change over operation. e.g., If an initialization technique of re-initialize the MAC is used, success in indicated by the receipt of a DCC-RSP message with a confirmation code of depart(180). In all other cases, success is indicated by: (1) the CMTS received a DCC-RSP message with confirmation code of arrive(181) or (2) the CMTS internally confirms the presence of the CM on the new channel.")
docsLoadBalChgOverStatusUpdate = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 2, 2, 1, 7), TimeStamp()).setMaxAccess("readonly")
if mibBuilder.loadTexts: docsLoadBalChgOverStatusUpdate.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChgOverStatusUpdate.setDescription('The value of sysUpTime when docsLoadBalChgOverStatusValue was last updated.')
docsLoadBalGrpTable = MibTable((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 1), )
if mibBuilder.loadTexts: docsLoadBalGrpTable.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalGrpTable.setDescription('This table contains the attributes of the load balancing groups present in this CMTS.')
docsLoadBalGrpEntry = MibTableRow((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 1, 1), ).setIndexNames((0, "DOCS-LOADBALANCING-MIB", "docsLoadBalGrpId"))
if mibBuilder.loadTexts: docsLoadBalGrpEntry.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalGrpEntry.setDescription('A set of attributes of load balancing group in the CMTS. It is index by a docsLoadBalGrpId which is unique within a CMTS. Entries in this table persist after CMTS initialization.')
docsLoadBalGrpId = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 1, 1, 1), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1, 4294967295)))
if mibBuilder.loadTexts: docsLoadBalGrpId.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalGrpId.setDescription('A unique index assigned to the load balancing group by the CMTS.')
docsLoadBalGrpIsRestricted = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 1, 1, 2), TruthValue().clone('false')).setMaxAccess("readcreate")
if mibBuilder.loadTexts: docsLoadBalGrpIsRestricted.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalGrpIsRestricted.setDescription('A value true(1)Indicates type of load balancing group. A Restricted Load Balancing Group is associated to a specific provisioned set of cable modems. Restricted Load Balancing Group is used to accommodate a topology specific or provisioning specific restriction. Example such as a group that are reserved for business customers). Setting this object to true(1) means it is a Restricted Load Balancing type and setting it to false(2) means it is a General Load Balancing group type. This object should not be changed while its group ID is referenced by an active entry in docsLoadBalRestrictCmEntry.')
docsLoadBalGrpInitTech = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 1, 1, 3), ChannelChgInitTechMap()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: docsLoadBalGrpInitTech.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalGrpInitTech.setDescription("The initialization techniques that the CMTS can use when load balancing cable modems in the load balancing group. By default this object is initialized with all the defined bits having a value of '1'.")
docsLoadBalGrpDefaultPolicy = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 1, 1, 4), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 4294967295))).setMaxAccess("readcreate")
if mibBuilder.loadTexts: docsLoadBalGrpDefaultPolicy.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalGrpDefaultPolicy.setDescription('Each Load Balancing Group has a default Load Balancing Policy. A policy is described by a set of conditions (rules) that govern the load balancing process for a cable modem. The CMTS assigns this Policy ID value to a cable modem associated with the group ID when the cable modem does not signal a Policy ID during registration. The Policy ID value is intended to be a numeric reference to a row entry in docsLoadBalPolicyEntry. However, It is not required to have an existing or active entry in docsLoadBalPolicyEntry when setting the value of docsLoadBalGrpDefaultPolicy, in which case it indicates no policy is associated with the load Balancing Group. The Policy ID of value 0 is reserved to indicate no policy is associated with the load balancing group.')
docsLoadBalGrpEnable = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 1, 1, 5), TruthValue().clone('true')).setMaxAccess("readcreate")
if mibBuilder.loadTexts: docsLoadBalGrpEnable.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalGrpEnable.setDescription('Setting this object to true(1) enables internal autonomous load balancing on this group. Setting it to false(2) disables the load balancing operation on this group.')
docsLoadBalGrpChgOverSuccess = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 1, 1, 6), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: docsLoadBalGrpChgOverSuccess.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalGrpChgOverSuccess.setDescription('The number of successful load balancing change over operations initiated within this load balancing group.')
docsLoadBalGrpChgOverFails = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 1, 1, 7), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: docsLoadBalGrpChgOverFails.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalGrpChgOverFails.setDescription('The number of failed load balancing change over operations initiated within this load balancing group.')
docsLoadBalGrpStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 1, 1, 8), RowStatus()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: docsLoadBalGrpStatus.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalGrpStatus.setDescription("Indicates the status of the row in this table. Setting this object to 'destroy' or 'notInService' for a group ID entry already referenced by docsLoadBalChannelEntry, docsLoadBalChnPairsEntry or docsLoadBalRestrictCmEntry returns an error code inconsistentValue.")
docsLoadBalChannelTable = MibTable((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 2), )
if mibBuilder.loadTexts: docsLoadBalChannelTable.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChannelTable.setDescription('Lists all upstream and downstream channels associated with load balancing groups.')
docsLoadBalChannelEntry = MibTableRow((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 2, 1), ).setIndexNames((0, "DOCS-LOADBALANCING-MIB", "docsLoadBalGrpId"), (0, "DOCS-LOADBALANCING-MIB", "docsLoadBalChannelIfIndex"))
if mibBuilder.loadTexts: docsLoadBalChannelEntry.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChannelEntry.setDescription('Lists a specific upstream or downstream, within a load Balancing group. An entry in this table exists for each ifEntry with an ifType of docsCableDownstream(128) and docsCableUpstream(129) associated with the Load Balancing Group. Entries in this table persist after CMTS initialization.')
docsLoadBalChannelIfIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 2, 1, 1), InterfaceIndex())
if mibBuilder.loadTexts: docsLoadBalChannelIfIndex.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChannelIfIndex.setDescription('The ifIndex of either the downstream or upstream.')
docsLoadBalChannelStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 2, 1, 2), RowStatus()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: docsLoadBalChannelStatus.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChannelStatus.setDescription("Indicates the status of the rows in this table. Creating entries in this table requires an existing value for docsLoadBalGrpId in docsLoadBalGrpEntry and an existing value of docsLoadBalChannelIfIndex in ifEntry, otherwise is rejected with error 'noCreation'. Setting this object to 'destroy' or 'notInService for a a row entry that is being referenced by docsLoadBalChnPairsEntry is rejected with error code inconsistentValue.")
docsLoadBalChnPairsTable = MibTable((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 3), )
if mibBuilder.loadTexts: docsLoadBalChnPairsTable.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChnPairsTable.setDescription('This table contains pairs of upstream channels within a Load Balancing Group. Entries in this table are used to override the initialization techniques defined for the associated Load Balancing Group.')
docsLoadBalChnPairsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 3, 1), ).setIndexNames((0, "DOCS-LOADBALANCING-MIB", "docsLoadBalGrpId"), (0, "DOCS-LOADBALANCING-MIB", "docsLoadBalChnPairsIfIndexDepart"), (0, "DOCS-LOADBALANCING-MIB", "docsLoadBalChnPairsIfIndexArrive"))
if mibBuilder.loadTexts: docsLoadBalChnPairsEntry.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChnPairsEntry.setDescription('An entry in this table describes a channel pair for which an initialization technique override is needed. On a CMTS which supports logical upstream channels (ifType is equal to docsCableUpstreamChannel(205)), the entries in this table correspond to pairs of ifType 205. On a CMTS which only supports physical upstream channels (ifType is equal to docsCableUpstream(129)), the entries in this table correspond to pairs of ifType 129. Entries in this table persist after CMTS initialization.')
docsLoadBalChnPairsIfIndexDepart = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 3, 1, 1), InterfaceIndex())
if mibBuilder.loadTexts: docsLoadBalChnPairsIfIndexDepart.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChnPairsIfIndexDepart.setDescription('This index indicates the ifIndex of the upstream channel from which a cable modem would depart in a load balancing channel change operation.')
docsLoadBalChnPairsIfIndexArrive = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 3, 1, 2), InterfaceIndex())
if mibBuilder.loadTexts: docsLoadBalChnPairsIfIndexArrive.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChnPairsIfIndexArrive.setDescription('This index indicates the ifIndex of the upstream channel on which a cable modem would arrive in a load balancing channel change operation.')
docsLoadBalChnPairsOperStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 3, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("operational", 1), ("notOperational", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: docsLoadBalChnPairsOperStatus.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChnPairsOperStatus.setDescription('Operational status of the channel pair. The value operational(1) indicates that ifOperStatus of both channels is up(1). The value notOperational(2) means that ifOperStatus of one or both is not up(1).')
docsLoadBalChnPairsInitTech = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 3, 1, 4), ChannelChgInitTechMap()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: docsLoadBalChnPairsInitTech.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChnPairsInitTech.setDescription("Specifies initialization technique for load balancing for the Depart/Arrive pair. By default this object's value is the initialization technique configured for the Load Balancing Group indicated by docsLoadBalGrpId.")
docsLoadBalChnPairsRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 3, 1, 5), RowStatus()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: docsLoadBalChnPairsRowStatus.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalChnPairsRowStatus.setDescription("The object for conceptual rows creation. An attempt to create a row with values for docsLoadBalChnPairsIfIndexDepart or docsLoadBalChnPairsIfIndexArrive which are not a part of the Load Balancing Group (or for a 2.0 CMTS are not logical channels (ifType 205)) are rejected with a 'noCreation' error status reported. There is no restriction on settings columns in this table when the value of docsLoadBalChnPairsRowStatus is active(1).")
docsLoadBalRestrictCmTable = MibTable((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 4), )
if mibBuilder.loadTexts: docsLoadBalRestrictCmTable.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalRestrictCmTable.setDescription('Lists all cable modems in each Restricted Load Balancing Groups.')
docsLoadBalRestrictCmEntry = MibTableRow((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 4, 1), ).setIndexNames((0, "DOCS-LOADBALANCING-MIB", "docsLoadBalGrpId"), (0, "DOCS-LOADBALANCING-MIB", "docsLoadBalRestrictCmIndex"))
if mibBuilder.loadTexts: docsLoadBalRestrictCmEntry.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalRestrictCmEntry.setDescription('An entry of modem within a restricted load balancing group type. An entry represents a cable modem that is associated with the Restricted Load Balancing Group ID of a Restricted Load Balancing Group. Entries in this table persist after CMTS initialization.')
docsLoadBalRestrictCmIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 4, 1, 1), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1, 4294967295)))
if mibBuilder.loadTexts: docsLoadBalRestrictCmIndex.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalRestrictCmIndex.setDescription('The index that uniquely identifies an entry which represents restricted cable modem(s) within each Restricted Load Balancing Group.')
docsLoadBalRestrictCmMACAddr = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 4, 1, 2), MacAddress()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: docsLoadBalRestrictCmMACAddr.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalRestrictCmMACAddr.setDescription('Mac Address of the cable modem within the restricted load balancing group.')
docsLoadBalRestrictCmMacAddrMask = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 4, 1, 3), OctetString().subtype(subtypeSpec=ConstraintsUnion(ValueSizeConstraint(0, 0), ValueSizeConstraint(6, 6), )).clone(hexValue="")).setMaxAccess("readcreate")
if mibBuilder.loadTexts: docsLoadBalRestrictCmMacAddrMask.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalRestrictCmMacAddrMask.setDescription('A bit mask acting as a wild card to associate a set of modem MAC addresses to the same Group ID. Cable modem look up is performed first with entries containing this value not null, if several entries match, the largest consecutive bit match from MSB to LSB is used. Empty value is equivalent to the bit mask all in ones.')
docsLoadBalRestrictCmStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 3, 4, 1, 4), RowStatus()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: docsLoadBalRestrictCmStatus.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalRestrictCmStatus.setDescription("Indicates the status of the rows in this table. The attempt to create an entry associated to a group ID with docsLoadBalGrpIsRestricted equal to false(2) returns an error 'noCreation'. There is no restriction on settings columns in this table any time.")
docsLoadBalPolicyTable = MibTable((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 4, 1), )
if mibBuilder.loadTexts: docsLoadBalPolicyTable.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalPolicyTable.setDescription('This table describes the set of Load Balancing policies. Rows in this table might be referenced by rows in docsLoadBalGrpEntry.')
docsLoadBalPolicyEntry = MibTableRow((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 4, 1, 1), ).setIndexNames((0, "DOCS-LOADBALANCING-MIB", "docsLoadBalPolicyId"), (0, "DOCS-LOADBALANCING-MIB", "docsLoadBalPolicyRuleId"))
if mibBuilder.loadTexts: docsLoadBalPolicyEntry.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalPolicyEntry.setDescription('Entries containing rules for policies. When a load balancing policy is defined by multiple rules, all the rules apply. Load balancing rules can be created to allow for specific vendor-defined load balancing actions. However there is a basic rule that the CMTS is required to support by configuring a pointer in docsLoadBalPolicyRulePtr to the table docsLoadBalBasicRuleTable. Vendor specific rules may be added by pointing the object docsLoadBalPolicyRulePtr to proprietary mib structures. Entries in this table persist after CMTS initialization.')
docsLoadBalPolicyId = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 4, 1, 1, 1), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1, 4294967295)))
if mibBuilder.loadTexts: docsLoadBalPolicyId.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalPolicyId.setDescription('An index identifying the Load Balancing Policy.')
docsLoadBalPolicyRuleId = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 4, 1, 1, 2), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1, 4294967295)))
if mibBuilder.loadTexts: docsLoadBalPolicyRuleId.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalPolicyRuleId.setDescription('An index for the rules entries associated within a policy.')
docsLoadBalPolicyRulePtr = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 4, 1, 1, 3), RowPointer().clone((0, 0))).setMaxAccess("readcreate")
if mibBuilder.loadTexts: docsLoadBalPolicyRulePtr.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalPolicyRulePtr.setDescription('A pointer to an entry in a rule table. E.g., docsLoadBalBasicRuleEnable in docsLoadBalBasicRuleEntry. A value pointing to zeroDotZero, an inactive Row or a non-existing entry is treated as no rule defined for this policy entry.')
docsLoadBalPolicyRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 4, 1, 1, 5), RowStatus()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: docsLoadBalPolicyRowStatus.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalPolicyRowStatus.setDescription("The status of this conceptual row. There is no restriction on settings columns in this table when the value of docsLoadBalPolicyRowStatus is active(1). Setting this object to 'destroy' or 'notInService' for a row entry that is being referenced by docsLoadBalGrpDefaultPolicy in docsLoadBalGrpEntry returns an error code inconsistentValue.")
docsLoadBalBasicRuleTable = MibTable((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 4, 2), )
if mibBuilder.loadTexts: docsLoadBalBasicRuleTable.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalBasicRuleTable.setDescription('DOCSIS defined basic ruleset for load Balancing Policy. This table enables of disable load balancing for the groups pointing to this ruleset in the policy group.')
docsLoadBalBasicRuleEntry = MibTableRow((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 4, 2, 1), ).setIndexNames((0, "DOCS-LOADBALANCING-MIB", "docsLoadBalBasicRuleId"))
if mibBuilder.loadTexts: docsLoadBalBasicRuleEntry.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalBasicRuleEntry.setDescription('An entry of DOCSIS defined basic ruleset. The object docsLoadBalBasicRuleEnable is used for instantiating an entry in this table via a RowPointer. Entries in this table persist after CMTS initialization.')
docsLoadBalBasicRuleId = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 4, 2, 1, 1), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1, 4294967295)))
if mibBuilder.loadTexts: docsLoadBalBasicRuleId.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalBasicRuleId.setDescription('The unique index for this row.')
docsLoadBalBasicRuleEnable = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 4, 2, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("enabled", 1), ("disabled", 2), ("disabledPeriod", 3)))).setMaxAccess("readcreate")
if mibBuilder.loadTexts: docsLoadBalBasicRuleEnable.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalBasicRuleEnable.setDescription('When using this ruleset, load balancing is enabled or disabled by the values enabled(1) and disabled(2) respectively. Additionally, a Load Balancing disabling period is defined in docsLoadBalBasicRuleDisStart and docsLoadBalBasicRuleDisPeriod if this object value is set to disabledPeriod(3).')
docsLoadBalBasicRuleDisStart = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 4, 2, 1, 3), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 86400))).setUnits('seconds').setMaxAccess("readcreate")
if mibBuilder.loadTexts: docsLoadBalBasicRuleDisStart.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalBasicRuleDisStart.setDescription('if object docsLoadBalBasicRuleEnable is disablePeriod(3) Load Balancing is disabled starting at this object value time (seconds from 12 AM). Otherwise, this object has no meaning.')
docsLoadBalBasicRuleDisPeriod = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 4, 2, 1, 4), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 86400))).setUnits('seconds').setMaxAccess("readcreate")
if mibBuilder.loadTexts: docsLoadBalBasicRuleDisPeriod.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalBasicRuleDisPeriod.setDescription('If object docsLoadBalBasicRuleEnable is disablePeriod(3) Load Balancing is disabled for the period of time defined between docsLoadBalBasicRuleDisStart and docsLoadBalBasicRuleDisStart plus the period of time of docsLoadBalBasicRuleDisPeriod. Otherwise, this object value has no meaning.')
docsLoadBalBasicRuleRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 4, 2, 1, 5), RowStatus()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: docsLoadBalBasicRuleRowStatus.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalBasicRuleRowStatus.setDescription("This object is to create or delete rows in this table. There is no restriction for changing this row status or object's values in this table at any time.")
docsLoadBalCmtsCmStatusTable = MibTable((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 1, 4), )
if mibBuilder.loadTexts: docsLoadBalCmtsCmStatusTable.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalCmtsCmStatusTable.setDescription('The list contains the load balancing attributes associated with the cable modem. ')
docsLoadBalCmtsCmStatusEntry = MibTableRow((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 1, 4, 1), )
docsIfCmtsCmStatusEntry.registerAugmentions(("DOCS-LOADBALANCING-MIB", "docsLoadBalCmtsCmStatusEntry"))
docsLoadBalCmtsCmStatusEntry.setIndexNames(*docsIfCmtsCmStatusEntry.getIndexNames())
if mibBuilder.loadTexts: docsLoadBalCmtsCmStatusEntry.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalCmtsCmStatusEntry.setDescription('Additional objects for docsIfCmtsCmStatusTable entry that relate to load balancing ')
docsLoadBalCmtsCmStatusGroupId = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 1, 4, 1, 1), Unsigned32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: docsLoadBalCmtsCmStatusGroupId.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalCmtsCmStatusGroupId.setDescription('The Group ID associated with this cable modem.')
docsLoadBalCmtsCmStatusPolicyId = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 1, 4, 1, 2), Unsigned32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: docsLoadBalCmtsCmStatusPolicyId.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalCmtsCmStatusPolicyId.setDescription('The Policy ID associated with this cable modem.')
docsLoadBalCmtsCmStatusPriority = MibTableColumn((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 1, 1, 4, 1, 3), Unsigned32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: docsLoadBalCmtsCmStatusPriority.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalCmtsCmStatusPriority.setDescription('The Priority associated with this cable modem.')
docsLoadBalConformance = MibIdentifier((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 2))
docsLoadBalCompliances = MibIdentifier((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 2, 1))
docsLoadBalGroups = MibIdentifier((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 2, 2))
docsLoadBalBasicCompliance = ModuleCompliance((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 2, 1, 1)).setObjects(("DOCS-LOADBALANCING-MIB", "docsLoadBalSystemGroup"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalParametersGroup"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalPoliciesGroup"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalBasicRuleGroup"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalCmtsCmStatusGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
docsLoadBalBasicCompliance = docsLoadBalBasicCompliance.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalBasicCompliance.setDescription('The compliance statement for DOCSIS load balancing systems.')
docsLoadBalSystemGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 2, 2, 1)).setObjects(("DOCS-LOADBALANCING-MIB", "docsLoadBalEnable"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalChgOverMacAddress"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalChgOverDownFrequency"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalChgOverUpChannelId"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalChgOverInitTech"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalChgOverCmd"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalChgOverCommit"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalChgOverLastCommit"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalChgOverStatusMacAddr"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalChgOverStatusDownFreq"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalChgOverStatusUpChnId"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalChgOverStatusInitTech"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalChgOverStatusCmd"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalChgOverStatusValue"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalChgOverStatusUpdate"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
docsLoadBalSystemGroup = docsLoadBalSystemGroup.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalSystemGroup.setDescription('A collection of objects providing system-wide parameters for load balancing.')
docsLoadBalParametersGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 2, 2, 2)).setObjects(("DOCS-LOADBALANCING-MIB", "docsLoadBalGrpIsRestricted"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalGrpInitTech"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalGrpDefaultPolicy"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalGrpEnable"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalGrpChgOverSuccess"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalGrpChgOverFails"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalGrpStatus"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalChannelStatus"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalChnPairsOperStatus"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalChnPairsInitTech"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalChnPairsRowStatus"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalRestrictCmMACAddr"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalRestrictCmMacAddrMask"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalRestrictCmStatus"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
docsLoadBalParametersGroup = docsLoadBalParametersGroup.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalParametersGroup.setDescription('A collection of objects containing the load balancing parameters.')
docsLoadBalPoliciesGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 2, 2, 3)).setObjects(("DOCS-LOADBALANCING-MIB", "docsLoadBalPolicyRulePtr"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalPolicyRowStatus"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
docsLoadBalPoliciesGroup = docsLoadBalPoliciesGroup.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalPoliciesGroup.setDescription('A collection of objects providing policies.')
docsLoadBalBasicRuleGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 2, 2, 4)).setObjects(("DOCS-LOADBALANCING-MIB", "docsLoadBalBasicRuleEnable"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalBasicRuleDisStart"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalBasicRuleDisPeriod"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalBasicRuleRowStatus"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
docsLoadBalBasicRuleGroup = docsLoadBalBasicRuleGroup.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalBasicRuleGroup.setDescription('DOCSIS defined basic Ruleset for load balancing policies.')
docsLoadBalCmtsCmStatusGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 4491, 2, 1, 2, 2, 2, 5)).setObjects(("DOCS-LOADBALANCING-MIB", "docsLoadBalCmtsCmStatusGroupId"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalCmtsCmStatusPolicyId"), ("DOCS-LOADBALANCING-MIB", "docsLoadBalCmtsCmStatusPriority"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
docsLoadBalCmtsCmStatusGroup = docsLoadBalCmtsCmStatusGroup.setStatus('current')
if mibBuilder.loadTexts: docsLoadBalCmtsCmStatusGroup.setDescription('Cable mode status extension objects.')
mibBuilder.exportSymbols("DOCS-LOADBALANCING-MIB", docsLoadBalChgOverStatusEntry=docsLoadBalChgOverStatusEntry, docsLoadBalCmtsCmStatusTable=docsLoadBalCmtsCmStatusTable, docsLoadBalCmtsCmStatusEntry=docsLoadBalCmtsCmStatusEntry, docsLoadBalBasicRuleDisStart=docsLoadBalBasicRuleDisStart, docsLoadBalBasicCompliance=docsLoadBalBasicCompliance, docsLoadBalChnPairsIfIndexDepart=docsLoadBalChnPairsIfIndexDepart, docsLoadBalChgOverStatusValue=docsLoadBalChgOverStatusValue, docsLoadBalMibObjects=docsLoadBalMibObjects, docsLoadBalEnable=docsLoadBalEnable, docsLoadBalGrpChgOverFails=docsLoadBalGrpChgOverFails, docsLoadBalCmtsCmStatusPriority=docsLoadBalCmtsCmStatusPriority, docsLoadBalBasicRuleDisPeriod=docsLoadBalBasicRuleDisPeriod, docsLoadBalChgOverStatusMacAddr=docsLoadBalChgOverStatusMacAddr, docsLoadBalGrpDefaultPolicy=docsLoadBalGrpDefaultPolicy, docsLoadBalGrpInitTech=docsLoadBalGrpInitTech, docsLoadBalRestrictCmStatus=docsLoadBalRestrictCmStatus, docsLoadBalChgOverGroup=docsLoadBalChgOverGroup, docsLoadBalChnPairsIfIndexArrive=docsLoadBalChnPairsIfIndexArrive, docsLoadBalChgOverLastCommit=docsLoadBalChgOverLastCommit, docsLoadBalPolicyEntry=docsLoadBalPolicyEntry, docsLoadBalChgOverStatusUpdate=docsLoadBalChgOverStatusUpdate, docsLoadBalChannelEntry=docsLoadBalChannelEntry, docsLoadBalChnPairsEntry=docsLoadBalChnPairsEntry, docsLoadBalGrpIsRestricted=docsLoadBalGrpIsRestricted, docsLoadBalSystem=docsLoadBalSystem, docsLoadBalChnPairsInitTech=docsLoadBalChnPairsInitTech, docsLoadBalBasicRuleGroup=docsLoadBalBasicRuleGroup, docsLoadBalChgOverStatusUpChnId=docsLoadBalChgOverStatusUpChnId, docsLoadBalParametersGroup=docsLoadBalParametersGroup, docsLoadBalBasicRuleEntry=docsLoadBalBasicRuleEntry, docsLoadBalRestrictCmMacAddrMask=docsLoadBalRestrictCmMacAddrMask, docsLoadBalPolicyRulePtr=docsLoadBalPolicyRulePtr, docsLoadBalGrpStatus=docsLoadBalGrpStatus, docsLoadBalSystemGroup=docsLoadBalSystemGroup, docsLoadBalGrpChgOverSuccess=docsLoadBalGrpChgOverSuccess, docsLoadBalPolicyObjects=docsLoadBalPolicyObjects, docsLoadBalGroups=docsLoadBalGroups, docsLoadBalanceMib=docsLoadBalanceMib, docsLoadBalChgOverInitTech=docsLoadBalChgOverInitTech, docsLoadBalChgOverStatusDownFreq=docsLoadBalChgOverStatusDownFreq, docsLoadBalGrpObjects=docsLoadBalGrpObjects, docsLoadBalChnPairsTable=docsLoadBalChnPairsTable, docsLoadBalCompliances=docsLoadBalCompliances, docsLoadBalCmtsCmStatusPolicyId=docsLoadBalCmtsCmStatusPolicyId, docsLoadBalGrpEnable=docsLoadBalGrpEnable, docsLoadBalBasicRuleRowStatus=docsLoadBalBasicRuleRowStatus, docsLoadBalChgOverStatusInitTech=docsLoadBalChgOverStatusInitTech, docsLoadBalGrpTable=docsLoadBalGrpTable, docsLoadBalChgOverCmd=docsLoadBalChgOverCmd, docsLoadBalGrpEntry=docsLoadBalGrpEntry, docsLoadBalRestrictCmIndex=docsLoadBalRestrictCmIndex, docsLoadBalChannelTable=docsLoadBalChannelTable, docsLoadBalChgOverObjects=docsLoadBalChgOverObjects, docsLoadBalPolicyTable=docsLoadBalPolicyTable, docsLoadBalBasicRuleTable=docsLoadBalBasicRuleTable, docsLoadBalGrpId=docsLoadBalGrpId, docsLoadBalChgOverDownFrequency=docsLoadBalChgOverDownFrequency, docsLoadBalChgOverUpChannelId=docsLoadBalChgOverUpChannelId, docsLoadBalChgOverCommit=docsLoadBalChgOverCommit, docsLoadBalPolicyRowStatus=docsLoadBalPolicyRowStatus, docsLoadBalRestrictCmMACAddr=docsLoadBalRestrictCmMACAddr, docsLoadBalPolicyId=docsLoadBalPolicyId, docsLoadBalRestrictCmTable=docsLoadBalRestrictCmTable, PYSNMP_MODULE_ID=docsLoadBalanceMib, docsLoadBalNotifications=docsLoadBalNotifications, docsLoadBalBasicRuleEnable=docsLoadBalBasicRuleEnable, docsLoadBalPolicyRuleId=docsLoadBalPolicyRuleId, docsLoadBalChnPairsOperStatus=docsLoadBalChnPairsOperStatus, docsLoadBalChgOverMacAddress=docsLoadBalChgOverMacAddress, docsLoadBalRestrictCmEntry=docsLoadBalRestrictCmEntry, docsLoadBalBasicRuleId=docsLoadBalBasicRuleId, docsLoadBalChannelIfIndex=docsLoadBalChannelIfIndex, docsLoadBalCmtsCmStatusGroup=docsLoadBalCmtsCmStatusGroup, docsLoadBalConformance=docsLoadBalConformance, docsLoadBalCmtsCmStatusGroupId=docsLoadBalCmtsCmStatusGroupId, docsLoadBalChannelStatus=docsLoadBalChannelStatus, docsLoadBalChnPairsRowStatus=docsLoadBalChnPairsRowStatus, docsLoadBalChgOverStatusTable=docsLoadBalChgOverStatusTable, ChannelChgInitTechMap=ChannelChgInitTechMap, docsLoadBalChgOverStatusCmd=docsLoadBalChgOverStatusCmd, docsLoadBalPoliciesGroup=docsLoadBalPoliciesGroup)
| 187.552529 | 4,381 | 0.801954 | 5,662 | 48,201 | 6.82674 | 0.119922 | 0.009055 | 0.074431 | 0.008175 | 0.350063 | 0.240033 | 0.210385 | 0.191318 | 0.173596 | 0.160867 | 0 | 0.043466 | 0.097903 | 48,201 | 256 | 4,382 | 188.285156 | 0.845476 | 0.007012 | 0 | 0.028455 | 0 | 0.178862 | 0.441865 | 0.088289 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.036585 | 0 | 0.052846 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
65cc242de89c19efa4090dc93f9caa33777e25e0 | 837 | py | Python | monitor/monitorlibs/sendemail.py | vaedit/- | 4e68910737ac794390df05ac34a6cf46339b0002 | [
"Apache-2.0"
] | 1 | 2021-04-09T05:47:42.000Z | 2021-04-09T05:47:42.000Z | monitor/monitorlibs/sendemail.py | vaedit/python-monitor-script | 4e68910737ac794390df05ac34a6cf46339b0002 | [
"Apache-2.0"
] | null | null | null | monitor/monitorlibs/sendemail.py | vaedit/python-monitor-script | 4e68910737ac794390df05ac34a6cf46339b0002 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
# -*- coding:utf-8 -*-
import smtplib
from email.mime.text import MIMEText
from email.header import Header
#发送邮件函数
def smail(sub,body):
tolist = ["xx@qq.com", "xx@qq.com"]
cc = ["xx@qq.com", "xx@163.com"]
sender = '管理员 <worktest2020@163.com>'
subject = sub
smtpserver = 'smtp.163.com'
username = 'xx@163.com'
password = 'xxx'
messages = body
msg = MIMEText(messages, 'plain', 'utf-8')
msg['Subject'] = Header(subject, 'utf-8')
msg['From'] = sender
msg['To'] = ','.join(tolist)
msg['Cc'] = ','.join(cc)
try:
s = smtplib.SMTP()
s.connect(smtpserver, '25')
s.login(username, password)
s.sendmail(sender, tolist+cc, msg.as_string())
s.quit()
print '邮件发送成功'
except Exception as e:
print '邮件发送失败:%s' %e
| 26.15625 | 54 | 0.577061 | 113 | 837 | 4.265487 | 0.486726 | 0.049793 | 0.043568 | 0.037344 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.033123 | 0.242533 | 837 | 31 | 55 | 27 | 0.727129 | 0.056153 | 0 | 0 | 0 | 0 | 0.174079 | 0.027954 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.076923 | 0.115385 | null | null | 0.076923 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
65da9cfd0758b74606005cccaa574f86bf734619 | 969 | py | Python | sharpy/linear/utils/sselements.py | ACea15/sharpy | c89ecb74be3cb9e37b23ac8a282c73b9b55dd792 | [
"BSD-3-Clause"
] | 80 | 2018-08-30T13:01:52.000Z | 2022-03-24T15:02:48.000Z | sharpy/linear/utils/sselements.py | ACea15/sharpy | c89ecb74be3cb9e37b23ac8a282c73b9b55dd792 | [
"BSD-3-Clause"
] | 88 | 2018-05-17T16:18:58.000Z | 2022-03-11T21:05:48.000Z | sharpy/linear/utils/sselements.py | ACea15/sharpy | c89ecb74be3cb9e37b23ac8a282c73b9b55dd792 | [
"BSD-3-Clause"
] | 44 | 2018-01-02T14:27:28.000Z | 2022-03-12T13:49:36.000Z | """
Linear State Space Element Class
"""
class Element(object):
"""
State space member
"""
def __init__(self):
self.sys_id = str() # A string with the name of the element
self.sys = None # The actual object
self.ss = None # The state space object
self.settings = dict()
def initialise(self, data, sys_id):
self.sys_id = sys_id
settings = data.linear.settings[sys_id] # Load settings, the settings should be stored in data.linear.settings
# data.linear.settings should be created in the class above containing the entire set up
# Get the actual class object (like lingebm) from a dictionary in the same way that it is done for the solvers
# in sharpy
# sys = sys_from_string(sys_id)
# To use the decorator idea we would first need to instantiate the class. Need to see how this is done with NL
# SHARPy
def assemble(self):
pass | 28.5 | 119 | 0.643963 | 141 | 969 | 4.340426 | 0.468085 | 0.04902 | 0.088235 | 0.084967 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.287926 | 969 | 34 | 120 | 28.5 | 0.886957 | 0.569659 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.272727 | false | 0.090909 | 0 | 0 | 0.363636 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
65de85c428d2e16780398c226cf7243329f834fa | 1,895 | py | Python | arrp/utils/sanitize.py | LucaCappelletti94/arrp_dataset | bcea455a504e8ff718458ce12623c63e0314badb | [
"MIT"
] | null | null | null | arrp/utils/sanitize.py | LucaCappelletti94/arrp_dataset | bcea455a504e8ff718458ce12623c63e0314badb | [
"MIT"
] | null | null | null | arrp/utils/sanitize.py | LucaCappelletti94/arrp_dataset | bcea455a504e8ff718458ce12623c63e0314badb | [
"MIT"
] | null | null | null | import numpy as np
import pandas as pd
from typing import Tuple, Dict
from .load_csv import load_raw_classes, load_raw_epigenomic_data, load_raw_nucleotides_sequences
from .store_csv import store_raw_classes, store_raw_epigenomic_data, store_raw_nucleotides_sequences
from auto_tqdm import tqdm
def drop_unknown_datapoints(epigenomic_data:pd.DataFrame, nucleotides_sequences:np.ndarray, nucleotides_sequences_index:np.ndarray, classes:pd.DataFrame)->Tuple[pd.DataFrame, np.ndarray, np.ndarray, pd.DataFrame]:
"""Remove datapoints labeled as unknown (UK)."""
unknown = classes["UK"] == 1
epigenomic_data = epigenomic_data.drop(index=epigenomic_data.index[unknown])
nucleotides_sequences = nucleotides_sequences[~unknown]
nucleotides_sequences_index = nucleotides_sequences_index[~unknown]
classes = classes.drop(index=classes.index[unknown])
classes = classes.drop(columns=["UK"])
return epigenomic_data, nucleotides_sequences, nucleotides_sequences_index, classes
def sanitize(target:str, settings:Dict):
for cell_line in tqdm(settings["cell_lines"], desc="Sanitizing data"):
classes = load_raw_classes(target, cell_line)
if "UK" not in classes.columns:
continue
epigenomic_data = load_raw_epigenomic_data(target, cell_line)
nucleotides_sequences, nucleotides_sequences_index, nucleotides_sequences_columns = load_raw_nucleotides_sequences(target, cell_line)
epigenomic_data, nucleotides_sequences, nucleotides_sequences_index, classes = drop_unknown_datapoints(epigenomic_data, nucleotides_sequences, nucleotides_sequences_index, classes)
store_raw_epigenomic_data(target, cell_line, epigenomic_data)
store_raw_nucleotides_sequences(target, cell_line, nucleotides_sequences, nucleotides_sequences_index, nucleotides_sequences_columns)
store_raw_classes(target, cell_line, classes) | 67.678571 | 213 | 0.803694 | 235 | 1,895 | 6.123404 | 0.204255 | 0.305768 | 0.138985 | 0.166782 | 0.533704 | 0.374566 | 0.257123 | 0.257123 | 0.119527 | 0.119527 | 0 | 0.000601 | 0.121372 | 1,895 | 28 | 214 | 67.678571 | 0.863664 | 0.022164 | 0 | 0 | 0 | 0 | 0.016775 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.08 | false | 0 | 0.24 | 0 | 0.36 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
65ded556650f5e35ee3489143d332a0dbd1e324c | 7,857 | py | Python | plugin.video.plexodus/resources/lib/indexers/fanarttv.py | MR-Unknown-Cm/addons | 8df1ebe58c95620bb02a05dbae7bf37954915cbd | [
"Apache-2.0"
] | 1 | 2020-03-03T10:01:21.000Z | 2020-03-03T10:01:21.000Z | plugin.video.plexodus/resources/lib/indexers/fanarttv.py | MR-Unknown-Cm/addons | 8df1ebe58c95620bb02a05dbae7bf37954915cbd | [
"Apache-2.0"
] | null | null | null | plugin.video.plexodus/resources/lib/indexers/fanarttv.py | MR-Unknown-Cm/addons | 8df1ebe58c95620bb02a05dbae7bf37954915cbd | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
'''
plexOdus Add-on
'''
import json
from resources.lib.modules import client
from resources.lib.modules import control
user = control.setting('fanart.tv.user')
if user == '' or user is None:
user = 'cf0ebcc2f7b824bd04cf3a318f15c17d'
headers = {'api-key': '3eb5ed2c401a206391ea8d1a0312c347'}
if not user == '':
headers.update({'client-key': user})
base_url = "http://webservice.fanart.tv/v3/%s/%s"
lang = control.apiLanguage()['trakt']
def get_tvshow_art(tvdb):
url = base_url % ('tv', '%s')
try:
art = client.request(url % tvdb, headers=headers, timeout='30', error=True)
art = json.loads(art)
except:
return None
try:
poster2 = art['tvposter']
poster2 = [(x['url'], x['likes']) for x in poster2 if x.get('lang') == lang] + [(x['url'], x['likes']) for x in poster2 if x.get('lang') == '']
poster2 = [(x[0], x[1]) for x in poster2]
poster2 = sorted(poster2, key=lambda x: int(x[1]), reverse=True)
poster2 = [x[0] for x in poster2][0]
poster2 = poster2.encode('utf-8')
except:
poster2 = '0'
try:
fanart2 = art['showbackground']
fanart2 = [(x['url'], x['likes']) for x in fanart2 if x.get('lang') == lang] + [(x['url'], x['likes']) for x in fanart2 if x.get('lang') == '']
fanart2 = [(x[0], x[1]) for x in fanart2]
fanart2 = sorted(fanart2, key=lambda x: int(x[1]), reverse=True)
fanart2 = [x[0] for x in fanart2][0]
fanart2 = fanart2.encode('utf-8')
except:
fanart2= '0'
try:
banner2 = art['tvbanner']
banner2 = [(x['url'], x['likes']) for x in banner2 if x.get('lang') == lang] + [(x['url'], x['likes']) for x in banner2 if x.get('lang') == '']
banner2 = [(x[0], x[1]) for x in banner2]
banner2 = sorted(banner2, key=lambda x: int(x[1]), reverse=True)
banner2 = [x[0] for x in banner2][0]
banner2 = banner2.encode('utf-8')
except:
banner2 = '0'
try:
if 'hdtvlogo' in art:
clearlogo = art['hdtvlogo']
else:
clearlogo = art['clearlogo']
clearlogo = [(x['url'], x['likes']) for x in clearlogo if x.get('lang') == lang] + [(x['url'], x['likes']) for x in clearlogo if x.get('lang') == '']
clearlogo = [(x[0], x[1]) for x in clearlogo]
clearlogo = sorted(clearlogo, key=lambda x: int(x[1]), reverse=True)
clearlogo = [x[0] for x in clearlogo][0]
clearlogo = clearlogo.encode('utf-8')
except:
clearlogo = '0'
try:
if 'hdclearart' in art:
clearart = art['hdclearart']
else:
clearart = art['clearart']
clearart = [(x['url'], x['likes']) for x in clearart if x.get('lang') == lang] + [(x['url'], x['likes']) for x in clearart if x.get('lang') == '']
clearart = [(x[0], x[1]) for x in clearart]
clearart = sorted(clearart, key=lambda x: int(x[1]), reverse=True)
clearart = [x[0] for x in clearart][0]
clearart = clearart.encode('utf-8')
except:
clearart = '0'
try:
if 'tvthumb' in art:
landscape = art['tvthumb']
else:
landscape = art['showbackground']
landscape = [(x['url'], x['likes']) for x in landscape if x.get('lang') == lang] + [(x['url'], x['likes']) for x in landscape if x.get('lang') == '']
landscape = [(x[0], x[1]) for x in landscape]
landscape = sorted(landscape, key=lambda x: int(x[1]), reverse=True)
landscape = [x[0] for x in landscape][0]
landscape = landscape.encode('utf-8')
except:
landscape = '0'
extended_art = {'extended': True, 'poster2': poster2, 'banner2': banner2, 'fanart2': fanart2, 'clearlogo': clearlogo, 'clearart': clearart, 'landscape': landscape}
return extended_art
def get_movie_art(imdb):
url = base_url % ('movies', '%s')
try:
art = client.request(url % imdb, headers=headers, timeout='30', error=True)
art = json.loads(art)
except:
return None
try:
poster2 = art['movieposter']
poster2 = [(x['url'], x['likes']) for x in poster2 if x.get('lang') == lang] + [(x['url'], x['likes']) for x in poster2 if x.get('lang') == '']
poster2 = [(x[0], x[1]) for x in poster2]
poster2 = sorted(poster2, key=lambda x: int(x[1]), reverse=True)
poster2 = [x[0] for x in poster2][0]
poster2 = poster2.encode('utf-8')
except:
poster2 = '0'
try:
if 'moviebackground' in art:
fanart2 = art['moviebackground']
else:
fanart2 = art['moviethumb']
fanart2 = [(x['url'], x['likes']) for x in fanart2 if x.get('lang') == lang] + [(x['url'], x['likes']) for x in fanart2 if x.get('lang') == '']
fanart2 = [(x[0], x[1]) for x in fanart2]
fanart2 = sorted(fanart2, key=lambda x: int(x[1]), reverse=True)
fanart2 = [x[0] for x in fanart2][0]
fanart2 = fanart2.encode('utf-8')
except:
fanart2 = '0'
try:
banner2 = art['moviebanner']
banner2 = [(x['url'], x['likes']) for x in banner2 if x.get('lang') == lang] + [(x['url'], x['likes']) for x in banner2 if x.get('lang') == '']
banner2 = [(x[0], x[1]) for x in banner2]
banner2 = sorted(banner2, key=lambda x: int(x[1]), reverse=True)
banner2 = [x[0] for x in banner2][0]
banner2 = banner2.encode('utf-8')
except:
banner2 = '0'
try:
if 'hdmovielogo' in art:
clearlogo = art['hdmovielogo']
else:
clearlogo = art['movielogo']
clearlogo = [(x['url'], x['likes']) for x in clearlogo if x.get('lang') == lang] + [(x['url'], x['likes']) for x in clearlogo if x.get('lang') == '']
clearlogo = [(x[0], x[1]) for x in clearlogo]
clearlogo = sorted(clearlogo, key=lambda x: int(x[1]), reverse=True)
clearlogo = [x[0] for x in clearlogo][0]
clearlogo = clearlogo.encode('utf-8')
except:
clearlogo = '0'
try:
if 'hdmovieclearart' in art:
clearart = art['hdmovieclearart']
else:
clearart = art['movieart']
clearart = [(x['url'], x['likes']) for x in clearart if x.get('lang') == lang] + [(x['url'], x['likes']) for x in clearart if x.get('lang') == '']
clearart = [(x[0], x[1]) for x in clearart]
clearart = sorted(clearart, key=lambda x: int(x[1]), reverse=True)
clearart = [x[0] for x in clearart][0]
clearart = clearart.encode('utf-8')
except:
clearart = '0'
try:
discart = art['moviedisc']
discart = [(x['url'], x['likes']) for x in discart if x.get('lang') == lang] + [(x['url'], x['likes']) for x in discart if x.get('lang') == '']
discart = [(x[0], x[1]) for x in discart]
discart = sorted(discart, key=lambda x: int(x[1]), reverse=True)
discart = [x[0] for x in discart][0]
discart = discart.encode('utf-8')
except:
discart = '0'
try:
if 'moviethumb' in art:
landscape = art['moviethumb']
else:
landscape = art['moviebackground']
landscape = [(x['url'], x['likes']) for x in landscape if x.get('lang') == lang] + [(x['url'], x['likes']) for x in landscape if x.get('lang') == '']
landscape = [(x[0], x[1]) for x in landscape]
landscape = sorted(landscape, key=lambda x: int(x[1]), reverse=True)
landscape = [x[0] for x in landscape][0]
landscape = landscape.encode('utf-8')
except:
landscape = '0'
extended_art = {'extended': True, 'poster2': poster2, 'fanart2': fanart2, 'banner2': banner2, 'clearlogo': clearlogo, 'clearart': clearart, 'discart': discart, 'landscape': landscape}
return extended_art | 38.704433 | 187 | 0.54881 | 1,071 | 7,857 | 4.015873 | 0.084034 | 0.048361 | 0.072541 | 0.060451 | 0.741223 | 0.709602 | 0.698907 | 0.696582 | 0.690537 | 0.690537 | 0 | 0.03911 | 0.267787 | 7,857 | 203 | 188 | 38.704434 | 0.7085 | 0.004836 | 0 | 0.688623 | 0 | 0 | 0.12103 | 0.008197 | 0 | 0 | 0 | 0 | 0 | 1 | 0.011976 | false | 0 | 0.017964 | 0 | 0.053892 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
028e7466100505ca2d031073edf99db35fd3966b | 2,773 | py | Python | registration_eval/different_days/dd_compute_dense_transformation_error.py | mirestrepo/voxels-at-lems | df47d031653d2ad877a97b3c1ea574b924b7d4c2 | [
"BSD-2-Clause"
] | 2 | 2015-09-18T00:17:16.000Z | 2019-02-06T04:41:29.000Z | registration_eval/different_days/dd_compute_dense_transformation_error.py | mirestrepo/voxels-at-lems | df47d031653d2ad877a97b3c1ea574b924b7d4c2 | [
"BSD-2-Clause"
] | null | null | null | registration_eval/different_days/dd_compute_dense_transformation_error.py | mirestrepo/voxels-at-lems | df47d031653d2ad877a97b3c1ea574b924b7d4c2 | [
"BSD-2-Clause"
] | null | null | null | #!/usr/bin/env python
# encoding: utf-8
"""
compute_transformation_error.py
Created by Maria Isabel Restrepo on 2012-09-24.
Copyright (c) 2012 . All rights reserved.
This script computes the distances betweeen an estimated similarity transformation and its ground trutrransformation is used to transform a "source" coordinate system into a "target coordinate system"
To compute the error between the translations, the L2 norm diference translation vectors in the
"source coordinate system" is computed. Since distances are preserved under R and T, only scale is applied.
The rotation error is computed as the half angle between the normalized queternions i.e acos(|<q1,q2>|) in [0, pi/2]
This script was intended to use with Vishal's results
"""
import os
import sys
import logging
import argparse
from vpcl_adaptor import *
import numpy as np
from numpy import linalg as LA
import transformations as tf
import math
import matplotlib.pyplot as plt
sys.path.append(os.pardir)
import reg3d
if __name__ == '__main__':
# fname = "/Users/isa/Dropbox/data/registration_for_vj/capitol_2011/original/2011-2006_Hs_matrix_vj_dense.txt"
# gt_fname = "/Users/isa/Dropbox/data/registration_for_vj/capitol_2011/original/2011-2006_Hs.txt"
# geo_fname ="/Users/isa/Dropbox/data/registration_for_vj/capitol_2006/original/Hs_geo.txt"
# error = reg3d.transformation_error_general(fname = fname,
# gt_fname = gt_fname,
# geo_fname = geo_fname )
# # Error (S,R,T) 1.39523511977e-06 0.802221070301 2.98789826592
# fname = "/Users/isa/Dropbox/data/registration_for_vj/downtown_2006/original/2006-2011_Hs_matrix_vj_dense.txt"
# gt_fname = "/Users/isa/Dropbox/data/registration_for_vj/downtown_2006/original/2006-2011_Hs.txt"
# geo_fname ="/Users/isa/Dropbox/data/registration_for_vj/capitol_2011/original/Hs_geo.txt"
# error = reg3d.transformation_error_general(fname = fname,
# gt_fname = gt_fname,
# geo_fname = geo_fname )
# # Error (S,R,T) 5.31970689721e-08 0.808909241082 4.83449482984
# fname = "/Users/isa/Dropbox/data/registration_for_vj/BH_VSI/original/f4-2006_Hs_matrix_vj_dense.txt"
# gt_fname = "/Users/isa/Dropbox/data/registration_for_vj/BH_VSI/original/f4-2006_Hs.txt"
# geo_fname ="/Users/isa/Dropbox/data/registration_for_vj/BH_2006/original/Hs_geo.txt"
# error = reg3d.transformation_error_general(fname = fname,
# gt_fname = gt_fname,
# geo_fname = geo_fname )
# # Error (S,R,T) 2.57980939389e-07 0.763324882652 4.79257669203 | 48.649123 | 200 | 0.695636 | 385 | 2,773 | 4.802597 | 0.38961 | 0.048675 | 0.063277 | 0.09735 | 0.502434 | 0.502434 | 0.502434 | 0.502434 | 0.502434 | 0.501352 | 0 | 0.096092 | 0.215651 | 2,773 | 57 | 201 | 48.649123 | 0.754023 | 0.596827 | 0 | 0 | 0 | 0 | 0.020672 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.846154 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
028f14718283c8b1eabad98e17db6f0ca1dee6eb | 16,301 | py | Python | migrations/versions/be21086640ad_country_added.py | anjinkristou/assistor | 02d9b826b9d8844d475c11c33db48cf278282183 | [
"MIT"
] | 1 | 2022-01-29T14:00:32.000Z | 2022-01-29T14:00:32.000Z | migrations/versions/be21086640ad_country_added.py | anjinkristou/assistor | 02d9b826b9d8844d475c11c33db48cf278282183 | [
"MIT"
] | null | null | null | migrations/versions/be21086640ad_country_added.py | anjinkristou/assistor | 02d9b826b9d8844d475c11c33db48cf278282183 | [
"MIT"
] | null | null | null | """Country added
Revision ID: be21086640ad
Revises: 153f720f966f
Create Date: 2021-11-09 15:34:04.306218
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = 'be21086640ad'
down_revision = '153f720f966f'
branch_labels = None
depends_on = None
naming_convention = {
"ix": 'ix_%(column_0_label)s',
"uq": "uq_%(table_name)s_%(column_0_name)s",
"ck": "ck_%(table_name)s_%(column_0_name)s",
"fk": "fk_%(table_name)s_%(column_0_name)s_%(referred_table_name)s",
"pk": "pk_%(table_name)s"
}
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('countries',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('iso', sa.String(length=2), nullable=True),
sa.Column('name', sa.String(length=80), nullable=True),
sa.Column('nicename', sa.String(length=80), nullable=True),
sa.Column('iso3', sa.String(length=3), nullable=True),
sa.Column('numcode', sa.Integer(), nullable=True),
sa.Column('phonecode', sa.Integer(), nullable=True),
sa.PrimaryKeyConstraint('id')
)
with op.batch_alter_table('companies', schema=None, naming_convention=naming_convention) as batch_op:
batch_op.add_column(sa.Column('country_id', sa.Integer(), nullable=True))
batch_op.create_foreign_key(batch_op.f('fk_company_country_id_country'), 'countries', ['country_id'], ['id'])
# ### end Alembic commands ###
op.execute("""
INSERT INTO `countries` (`id`, `iso`, `name`, `nicename`, `iso3`, `numcode`, `phonecode`) VALUES
(1, 'AF', 'AFGHANISTAN', 'Afghanistan', 'AFG', 4, 93),
(2, 'AL', 'ALBANIA', 'Albania', 'ALB', 8, 355),
(3, 'DZ', 'ALGERIA', 'Algeria', 'DZA', 12, 213),
(4, 'AS', 'AMERICAN SAMOA', 'American Samoa', 'ASM', 16, 1684),
(5, 'AD', 'ANDORRA', 'Andorra', 'AND', 20, 376),
(6, 'AO', 'ANGOLA', 'Angola', 'AGO', 24, 244),
(7, 'AI', 'ANGUILLA', 'Anguilla', 'AIA', 660, 1264),
(8, 'AQ', 'ANTARCTICA', 'Antarctica', NULL, NULL, 0),
(9, 'AG', 'ANTIGUA AND BARBUDA', 'Antigua and Barbuda', 'ATG', 28, 1268),
(10, 'AR', 'ARGENTINA', 'Argentina', 'ARG', 32, 54),
(11, 'AM', 'ARMENIA', 'Armenia', 'ARM', 51, 374),
(12, 'AW', 'ARUBA', 'Aruba', 'ABW', 533, 297),
(13, 'AU', 'AUSTRALIA', 'Australia', 'AUS', 36, 61),
(14, 'AT', 'AUSTRIA', 'Austria', 'AUT', 40, 43),
(15, 'AZ', 'AZERBAIJAN', 'Azerbaijan', 'AZE', 31, 994),
(16, 'BS', 'BAHAMAS', 'Bahamas', 'BHS', 44, 1242),
(17, 'BH', 'BAHRAIN', 'Bahrain', 'BHR', 48, 973),
(18, 'BD', 'BANGLADESH', 'Bangladesh', 'BGD', 50, 880),
(19, 'BB', 'BARBADOS', 'Barbados', 'BRB', 52, 1246),
(20, 'BY', 'BELARUS', 'Belarus', 'BLR', 112, 375),
(21, 'BE', 'BELGIUM', 'Belgium', 'BEL', 56, 32),
(22, 'BZ', 'BELIZE', 'Belize', 'BLZ', 84, 501),
(23, 'BJ', 'BENIN', 'Benin', 'BEN', 204, 229),
(24, 'BM', 'BERMUDA', 'Bermuda', 'BMU', 60, 1441),
(25, 'BT', 'BHUTAN', 'Bhutan', 'BTN', 64, 975),
(26, 'BO', 'BOLIVIA', 'Bolivia', 'BOL', 68, 591),
(27, 'BA', 'BOSNIA AND HERZEGOVINA', 'Bosnia and Herzegovina', 'BIH', 70, 387),
(28, 'BW', 'BOTSWANA', 'Botswana', 'BWA', 72, 267),
(29, 'BV', 'BOUVET ISLAND', 'Bouvet Island', NULL, NULL, 0),
(30, 'BR', 'BRAZIL', 'Brazil', 'BRA', 76, 55),
(31, 'IO', 'BRITISH INDIAN OCEAN TERRITORY', 'British Indian Ocean Territory', NULL, NULL, 246),
(32, 'BN', 'BRUNEI DARUSSALAM', 'Brunei Darussalam', 'BRN', 96, 673),
(33, 'BG', 'BULGARIA', 'Bulgaria', 'BGR', 100, 359),
(34, 'BF', 'BURKINA FASO', 'Burkina Faso', 'BFA', 854, 226),
(35, 'BI', 'BURUNDI', 'Burundi', 'BDI', 108, 257),
(36, 'KH', 'CAMBODIA', 'Cambodia', 'KHM', 116, 855),
(37, 'CM', 'CAMEROON', 'Cameroon', 'CMR', 120, 237),
(38, 'CA', 'CANADA', 'Canada', 'CAN', 124, 1),
(39, 'CV', 'CAPE VERDE', 'Cape Verde', 'CPV', 132, 238),
(40, 'KY', 'CAYMAN ISLANDS', 'Cayman Islands', 'CYM', 136, 1345),
(41, 'CF', 'CENTRAL AFRICAN REPUBLIC', 'Central African Republic', 'CAF', 140, 236),
(42, 'TD', 'CHAD', 'Chad', 'TCD', 148, 235),
(43, 'CL', 'CHILE', 'Chile', 'CHL', 152, 56),
(44, 'CN', 'CHINA', 'China', 'CHN', 156, 86),
(45, 'CX', 'CHRISTMAS ISLAND', 'Christmas Island', NULL, NULL, 61),
(46, 'CC', 'COCOS (KEELING) ISLANDS', 'Cocos (Keeling) Islands', NULL, NULL, 672),
(47, 'CO', 'COLOMBIA', 'Colombia', 'COL', 170, 57),
(48, 'KM', 'COMOROS', 'Comoros', 'COM', 174, 269),
(49, 'CG', 'CONGO', 'Congo', 'COG', 178, 242),
(50, 'CD', 'CONGO, THE DEMOCRATIC REPUBLIC OF THE', 'Congo, the Democratic Republic of the', 'COD', 180, 242),
(51, 'CK', 'COOK ISLANDS', 'Cook Islands', 'COK', 184, 682),
(52, 'CR', 'COSTA RICA', 'Costa Rica', 'CRI', 188, 506),
(53, 'CI', 'COTE D''IVOIRE', 'Cote D''Ivoire', 'CIV', 384, 225),
(54, 'HR', 'CROATIA', 'Croatia', 'HRV', 191, 385),
(55, 'CU', 'CUBA', 'Cuba', 'CUB', 192, 53),
(56, 'CY', 'CYPRUS', 'Cyprus', 'CYP', 196, 357),
(57, 'CZ', 'CZECH REPUBLIC', 'Czech Republic', 'CZE', 203, 420),
(58, 'DK', 'DENMARK', 'Denmark', 'DNK', 208, 45),
(59, 'DJ', 'DJIBOUTI', 'Djibouti', 'DJI', 262, 253),
(60, 'DM', 'DOMINICA', 'Dominica', 'DMA', 212, 1767),
(61, 'DO', 'DOMINICAN REPUBLIC', 'Dominican Republic', 'DOM', 214, 1809),
(62, 'EC', 'ECUADOR', 'Ecuador', 'ECU', 218, 593),
(63, 'EG', 'EGYPT', 'Egypt', 'EGY', 818, 20),
(64, 'SV', 'EL SALVADOR', 'El Salvador', 'SLV', 222, 503),
(65, 'GQ', 'EQUATORIAL GUINEA', 'Equatorial Guinea', 'GNQ', 226, 240),
(66, 'ER', 'ERITREA', 'Eritrea', 'ERI', 232, 291),
(67, 'EE', 'ESTONIA', 'Estonia', 'EST', 233, 372),
(68, 'ET', 'ETHIOPIA', 'Ethiopia', 'ETH', 231, 251),
(69, 'FK', 'FALKLAND ISLANDS (MALVINAS)', 'Falkland Islands (Malvinas)', 'FLK', 238, 500),
(70, 'FO', 'FAROE ISLANDS', 'Faroe Islands', 'FRO', 234, 298),
(71, 'FJ', 'FIJI', 'Fiji', 'FJI', 242, 679),
(72, 'FI', 'FINLAND', 'Finland', 'FIN', 246, 358),
(73, 'FR', 'FRANCE', 'France', 'FRA', 250, 33),
(74, 'GF', 'FRENCH GUIANA', 'French Guiana', 'GUF', 254, 594),
(75, 'PF', 'FRENCH POLYNESIA', 'French Polynesia', 'PYF', 258, 689),
(76, 'TF', 'FRENCH SOUTHERN TERRITORIES', 'French Southern Territories', NULL, NULL, 0),
(77, 'GA', 'GABON', 'Gabon', 'GAB', 266, 241),
(78, 'GM', 'GAMBIA', 'Gambia', 'GMB', 270, 220),
(79, 'GE', 'GEORGIA', 'Georgia', 'GEO', 268, 995),
(80, 'DE', 'GERMANY', 'Germany', 'DEU', 276, 49),
(81, 'GH', 'GHANA', 'Ghana', 'GHA', 288, 233),
(82, 'GI', 'GIBRALTAR', 'Gibraltar', 'GIB', 292, 350),
(83, 'GR', 'GREECE', 'Greece', 'GRC', 300, 30),
(84, 'GL', 'GREENLAND', 'Greenland', 'GRL', 304, 299),
(85, 'GD', 'GRENADA', 'Grenada', 'GRD', 308, 1473),
(86, 'GP', 'GUADELOUPE', 'Guadeloupe', 'GLP', 312, 590),
(87, 'GU', 'GUAM', 'Guam', 'GUM', 316, 1671),
(88, 'GT', 'GUATEMALA', 'Guatemala', 'GTM', 320, 502),
(89, 'GN', 'GUINEA', 'Guinea', 'GIN', 324, 224),
(90, 'GW', 'GUINEA-BISSAU', 'Guinea-Bissau', 'GNB', 624, 245),
(91, 'GY', 'GUYANA', 'Guyana', 'GUY', 328, 592),
(92, 'HT', 'HAITI', 'Haiti', 'HTI', 332, 509),
(93, 'HM', 'HEARD ISLAND AND MCDONALD ISLANDS', 'Heard Island and Mcdonald Islands', NULL, NULL, 0),
(94, 'VA', 'HOLY SEE (VATICAN CITY STATE)', 'Holy See (Vatican City State)', 'VAT', 336, 39),
(95, 'HN', 'HONDURAS', 'Honduras', 'HND', 340, 504),
(96, 'HK', 'HONG KONG', 'Hong Kong', 'HKG', 344, 852),
(97, 'HU', 'HUNGARY', 'Hungary', 'HUN', 348, 36),
(98, 'IS', 'ICELAND', 'Iceland', 'ISL', 352, 354),
(99, 'IN', 'INDIA', 'India', 'IND', 356, 91),
(100, 'ID', 'INDONESIA', 'Indonesia', 'IDN', 360, 62),
(101, 'IR', 'IRAN, ISLAMIC REPUBLIC OF', 'Iran, Islamic Republic of', 'IRN', 364, 98),
(102, 'IQ', 'IRAQ', 'Iraq', 'IRQ', 368, 964),
(103, 'IE', 'IRELAND', 'Ireland', 'IRL', 372, 353),
(104, 'IL', 'ISRAEL', 'Israel', 'ISR', 376, 972),
(105, 'IT', 'ITALY', 'Italy', 'ITA', 380, 39),
(106, 'JM', 'JAMAICA', 'Jamaica', 'JAM', 388, 1876),
(107, 'JP', 'JAPAN', 'Japan', 'JPN', 392, 81),
(108, 'JO', 'JORDAN', 'Jordan', 'JOR', 400, 962),
(109, 'KZ', 'KAZAKHSTAN', 'Kazakhstan', 'KAZ', 398, 7),
(110, 'KE', 'KENYA', 'Kenya', 'KEN', 404, 254),
(111, 'KI', 'KIRIBATI', 'Kiribati', 'KIR', 296, 686),
(112, 'KP', 'KOREA, DEMOCRATIC PEOPLE''S REPUBLIC OF', 'Korea, Democratic People''s Republic of', 'PRK', 408, 850),
(113, 'KR', 'KOREA, REPUBLIC OF', 'Korea, Republic of', 'KOR', 410, 82),
(114, 'KW', 'KUWAIT', 'Kuwait', 'KWT', 414, 965),
(115, 'KG', 'KYRGYZSTAN', 'Kyrgyzstan', 'KGZ', 417, 996),
(116, 'LA', 'LAO PEOPLE''S DEMOCRATIC REPUBLIC', 'Lao People''s Democratic Republic', 'LAO', 418, 856),
(117, 'LV', 'LATVIA', 'Latvia', 'LVA', 428, 371),
(118, 'LB', 'LEBANON', 'Lebanon', 'LBN', 422, 961),
(119, 'LS', 'LESOTHO', 'Lesotho', 'LSO', 426, 266),
(120, 'LR', 'LIBERIA', 'Liberia', 'LBR', 430, 231),
(121, 'LY', 'LIBYAN ARAB JAMAHIRIYA', 'Libyan Arab Jamahiriya', 'LBY', 434, 218),
(122, 'LI', 'LIECHTENSTEIN', 'Liechtenstein', 'LIE', 438, 423),
(123, 'LT', 'LITHUANIA', 'Lithuania', 'LTU', 440, 370),
(124, 'LU', 'LUXEMBOURG', 'Luxembourg', 'LUX', 442, 352),
(125, 'MO', 'MACAO', 'Macao', 'MAC', 446, 853),
(126, 'MK', 'MACEDONIA, THE FORMER YUGOSLAV REPUBLIC OF', 'Macedonia, the Former Yugoslav Republic of', 'MKD', 807, 389),
(127, 'MG', 'MADAGASCAR', 'Madagascar', 'MDG', 450, 261),
(128, 'MW', 'MALAWI', 'Malawi', 'MWI', 454, 265),
(129, 'MY', 'MALAYSIA', 'Malaysia', 'MYS', 458, 60),
(130, 'MV', 'MALDIVES', 'Maldives', 'MDV', 462, 960),
(131, 'ML', 'MALI', 'Mali', 'MLI', 466, 223),
(132, 'MT', 'MALTA', 'Malta', 'MLT', 470, 356),
(133, 'MH', 'MARSHALL ISLANDS', 'Marshall Islands', 'MHL', 584, 692),
(134, 'MQ', 'MARTINIQUE', 'Martinique', 'MTQ', 474, 596),
(135, 'MR', 'MAURITANIA', 'Mauritania', 'MRT', 478, 222),
(136, 'MU', 'MAURITIUS', 'Mauritius', 'MUS', 480, 230),
(137, 'YT', 'MAYOTTE', 'Mayotte', NULL, NULL, 269),
(138, 'MX', 'MEXICO', 'Mexico', 'MEX', 484, 52),
(139, 'FM', 'MICRONESIA, FEDERATED STATES OF', 'Micronesia, Federated States of', 'FSM', 583, 691),
(140, 'MD', 'MOLDOVA, REPUBLIC OF', 'Moldova, Republic of', 'MDA', 498, 373),
(141, 'MC', 'MONACO', 'Monaco', 'MCO', 492, 377),
(142, 'MN', 'MONGOLIA', 'Mongolia', 'MNG', 496, 976),
(143, 'MS', 'MONTSERRAT', 'Montserrat', 'MSR', 500, 1664),
(144, 'MA', 'MOROCCO', 'Morocco', 'MAR', 504, 212),
(145, 'MZ', 'MOZAMBIQUE', 'Mozambique', 'MOZ', 508, 258),
(146, 'MM', 'MYANMAR', 'Myanmar', 'MMR', 104, 95),
(147, 'NA', 'NAMIBIA', 'Namibia', 'NAM', 516, 264),
(148, 'NR', 'NAURU', 'Nauru', 'NRU', 520, 674),
(149, 'NP', 'NEPAL', 'Nepal', 'NPL', 524, 977),
(150, 'NL', 'NETHERLANDS', 'Netherlands', 'NLD', 528, 31),
(151, 'AN', 'NETHERLANDS ANTILLES', 'Netherlands Antilles', 'ANT', 530, 599),
(152, 'NC', 'NEW CALEDONIA', 'New Caledonia', 'NCL', 540, 687),
(153, 'NZ', 'NEW ZEALAND', 'New Zealand', 'NZL', 554, 64),
(154, 'NI', 'NICARAGUA', 'Nicaragua', 'NIC', 558, 505),
(155, 'NE', 'NIGER', 'Niger', 'NER', 562, 227),
(156, 'NG', 'NIGERIA', 'Nigeria', 'NGA', 566, 234),
(157, 'NU', 'NIUE', 'Niue', 'NIU', 570, 683),
(158, 'NF', 'NORFOLK ISLAND', 'Norfolk Island', 'NFK', 574, 672),
(159, 'MP', 'NORTHERN MARIANA ISLANDS', 'Northern Mariana Islands', 'MNP', 580, 1670),
(160, 'NO', 'NORWAY', 'Norway', 'NOR', 578, 47),
(161, 'OM', 'OMAN', 'Oman', 'OMN', 512, 968),
(162, 'PK', 'PAKISTAN', 'Pakistan', 'PAK', 586, 92),
(163, 'PW', 'PALAU', 'Palau', 'PLW', 585, 680),
(164, 'PS', 'PALESTINIAN TERRITORY, OCCUPIED', 'Palestinian Territory, Occupied', NULL, NULL, 970),
(165, 'PA', 'PANAMA', 'Panama', 'PAN', 591, 507),
(166, 'PG', 'PAPUA NEW GUINEA', 'Papua New Guinea', 'PNG', 598, 675),
(167, 'PY', 'PARAGUAY', 'Paraguay', 'PRY', 600, 595),
(168, 'PE', 'PERU', 'Peru', 'PER', 604, 51),
(169, 'PH', 'PHILIPPINES', 'Philippines', 'PHL', 608, 63),
(170, 'PN', 'PITCAIRN', 'Pitcairn', 'PCN', 612, 0),
(171, 'PL', 'POLAND', 'Poland', 'POL', 616, 48),
(172, 'PT', 'PORTUGAL', 'Portugal', 'PRT', 620, 351),
(173, 'PR', 'PUERTO RICO', 'Puerto Rico', 'PRI', 630, 1787),
(174, 'QA', 'QATAR', 'Qatar', 'QAT', 634, 974),
(175, 'RE', 'REUNION', 'Reunion', 'REU', 638, 262),
(176, 'RO', 'ROMANIA', 'Romania', 'ROU', 642, 40),
(177, 'RU', 'RUSSIAN FEDERATION', 'Russian Federation', 'RUS', 643, 70),
(178, 'RW', 'RWANDA', 'Rwanda', 'RWA', 646, 250),
(179, 'SH', 'SAINT HELENA', 'Saint Helena', 'SHN', 654, 290),
(180, 'KN', 'SAINT KITTS AND NEVIS', 'Saint Kitts and Nevis', 'KNA', 659, 1869),
(181, 'LC', 'SAINT LUCIA', 'Saint Lucia', 'LCA', 662, 1758),
(182, 'PM', 'SAINT PIERRE AND MIQUELON', 'Saint Pierre and Miquelon', 'SPM', 666, 508),
(183, 'VC', 'SAINT VINCENT AND THE GRENADINES', 'Saint Vincent and the Grenadines', 'VCT', 670, 1784),
(184, 'WS', 'SAMOA', 'Samoa', 'WSM', 882, 684),
(185, 'SM', 'SAN MARINO', 'San Marino', 'SMR', 674, 378),
(186, 'ST', 'SAO TOME AND PRINCIPE', 'Sao Tome and Principe', 'STP', 678, 239),
(187, 'SA', 'SAUDI ARABIA', 'Saudi Arabia', 'SAU', 682, 966),
(188, 'SN', 'SENEGAL', 'Senegal', 'SEN', 686, 221),
(189, 'CS', 'SERBIA AND MONTENEGRO', 'Serbia and Montenegro', NULL, NULL, 381),
(190, 'SC', 'SEYCHELLES', 'Seychelles', 'SYC', 690, 248),
(191, 'SL', 'SIERRA LEONE', 'Sierra Leone', 'SLE', 694, 232),
(192, 'SG', 'SINGAPORE', 'Singapore', 'SGP', 702, 65),
(193, 'SK', 'SLOVAKIA', 'Slovakia', 'SVK', 703, 421),
(194, 'SI', 'SLOVENIA', 'Slovenia', 'SVN', 705, 386),
(195, 'SB', 'SOLOMON ISLANDS', 'Solomon Islands', 'SLB', 90, 677),
(196, 'SO', 'SOMALIA', 'Somalia', 'SOM', 706, 252),
(197, 'ZA', 'SOUTH AFRICA', 'South Africa', 'ZAF', 710, 27),
(198, 'GS', 'SOUTH GEORGIA AND THE SOUTH SANDWICH ISLANDS', 'South Georgia and the South Sandwich Islands', NULL, NULL, 0),
(199, 'ES', 'SPAIN', 'Spain', 'ESP', 724, 34),
(200, 'LK', 'SRI LANKA', 'Sri Lanka', 'LKA', 144, 94),
(201, 'SD', 'SUDAN', 'Sudan', 'SDN', 736, 249),
(202, 'SR', 'SURINAME', 'Suriname', 'SUR', 740, 597),
(203, 'SJ', 'SVALBARD AND JAN MAYEN', 'Svalbard and Jan Mayen', 'SJM', 744, 47),
(204, 'SZ', 'SWAZILAND', 'Swaziland', 'SWZ', 748, 268),
(205, 'SE', 'SWEDEN', 'Sweden', 'SWE', 752, 46),
(206, 'CH', 'SWITZERLAND', 'Switzerland', 'CHE', 756, 41),
(207, 'SY', 'SYRIAN ARAB REPUBLIC', 'Syrian Arab Republic', 'SYR', 760, 963),
(208, 'TW', 'TAIWAN, PROVINCE OF CHINA', 'Taiwan, Province of China', 'TWN', 158, 886),
(209, 'TJ', 'TAJIKISTAN', 'Tajikistan', 'TJK', 762, 992),
(210, 'TZ', 'TANZANIA, UNITED REPUBLIC OF', 'Tanzania, United Republic of', 'TZA', 834, 255),
(211, 'TH', 'THAILAND', 'Thailand', 'THA', 764, 66),
(212, 'TL', 'TIMOR-LESTE', 'Timor-Leste', NULL, NULL, 670),
(213, 'TG', 'TOGO', 'Togo', 'TGO', 768, 228),
(214, 'TK', 'TOKELAU', 'Tokelau', 'TKL', 772, 690),
(215, 'TO', 'TONGA', 'Tonga', 'TON', 776, 676),
(216, 'TT', 'TRINIDAD AND TOBAGO', 'Trinidad and Tobago', 'TTO', 780, 1868),
(217, 'TN', 'TUNISIA', 'Tunisia', 'TUN', 788, 216),
(218, 'TR', 'TURKEY', 'Turkey', 'TUR', 792, 90),
(219, 'TM', 'TURKMENISTAN', 'Turkmenistan', 'TKM', 795, 7370),
(220, 'TC', 'TURKS AND CAICOS ISLANDS', 'Turks and Caicos Islands', 'TCA', 796, 1649),
(221, 'TV', 'TUVALU', 'Tuvalu', 'TUV', 798, 688),
(222, 'UG', 'UGANDA', 'Uganda', 'UGA', 800, 256),
(223, 'UA', 'UKRAINE', 'Ukraine', 'UKR', 804, 380),
(224, 'AE', 'UNITED ARAB EMIRATES', 'United Arab Emirates', 'ARE', 784, 971),
(225, 'GB', 'UNITED KINGDOM', 'United Kingdom', 'GBR', 826, 44),
(226, 'US', 'UNITED STATES', 'United States', 'USA', 840, 1),
(227, 'UM', 'UNITED STATES MINOR OUTLYING ISLANDS', 'United States Minor Outlying Islands', NULL, NULL, 1),
(228, 'UY', 'URUGUAY', 'Uruguay', 'URY', 858, 598),
(229, 'UZ', 'UZBEKISTAN', 'Uzbekistan', 'UZB', 860, 998),
(230, 'VU', 'VANUATU', 'Vanuatu', 'VUT', 548, 678),
(231, 'VE', 'VENEZUELA', 'Venezuela', 'VEN', 862, 58),
(232, 'VN', 'VIET NAM', 'Viet Nam', 'VNM', 704, 84),
(233, 'VG', 'VIRGIN ISLANDS, BRITISH', 'Virgin Islands, British', 'VGB', 92, 1284),
(234, 'VI', 'VIRGIN ISLANDS, U.S.', 'Virgin Islands, U.s.', 'VIR', 850, 1340),
(235, 'WF', 'WALLIS AND FUTUNA', 'Wallis and Futuna', 'WLF', 876, 681),
(236, 'EH', 'WESTERN SAHARA', 'Western Sahara', 'ESH', 732, 212),
(237, 'YE', 'YEMEN', 'Yemen', 'YEM', 887, 967),
(238, 'ZM', 'ZAMBIA', 'Zambia', 'ZMB', 894, 260),
(239, 'ZW', 'ZIMBABWE', 'Zimbabwe', 'ZWE', 716, 263),
(240, 'RS', 'SERBIA', 'Serbia', 'SRB', NULL, 381),
(241, 'ME', 'MONTENEGRO', 'Montenegro', 'MNE', NULL, 382);
""")
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
with op.batch_alter_table('companies', schema=None, naming_convention=naming_convention) as batch_op:
batch_op.drop_constraint(batch_op.f('fk_company_country_id_country'), type_='foreignkey')
batch_op.drop_column('country_id')
op.drop_table('countries')
# ### end Alembic commands ###
| 55.070946 | 123 | 0.587081 | 2,255 | 16,301 | 4.216408 | 0.609756 | 0.014724 | 0.008835 | 0.010517 | 0.112221 | 0.083824 | 0.056794 | 0.033025 | 0.018511 | 0.018511 | 0 | 0.143142 | 0.139439 | 16,301 | 295 | 124 | 55.257627 | 0.534645 | 0.018097 | 0 | 0.007246 | 0 | 0.275362 | 0.927171 | 0.013025 | 0 | 0 | 0 | 0 | 0 | 1 | 0.007246 | false | 0 | 0.007246 | 0 | 0.014493 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
02967401719aa2d8549023548710f426054a51b3 | 371 | py | Python | tests/conftest.py | charles-cooper/crvfunder | 63b4041ff06ff6ea943a7d69ae233719c4411bbd | [
"MIT"
] | 6 | 2022-03-17T21:10:41.000Z | 2022-03-27T04:38:53.000Z | tests/conftest.py | charles-cooper/crvfunder | 63b4041ff06ff6ea943a7d69ae233719c4411bbd | [
"MIT"
] | null | null | null | tests/conftest.py | charles-cooper/crvfunder | 63b4041ff06ff6ea943a7d69ae233719c4411bbd | [
"MIT"
] | 2 | 2022-03-26T03:37:40.000Z | 2022-03-28T22:01:20.000Z | import pytest
pytest_plugins = ["fixtures.accounts", "fixtures.deployments"]
def pytest_sessionfinish(session, exitstatus):
if exitstatus == pytest.ExitCode.NO_TESTS_COLLECTED:
# we treat "no tests collected" as passing
session.exitstatus = pytest.ExitCode.OK
@pytest.fixture(autouse=True)
def isolation(module_isolation, fn_isolation):
pass
| 24.733333 | 62 | 0.749326 | 43 | 371 | 6.325581 | 0.627907 | 0.125 | 0.176471 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.156334 | 371 | 14 | 63 | 26.5 | 0.86901 | 0.107817 | 0 | 0 | 0 | 0 | 0.112462 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0.125 | 0.125 | 0 | 0.375 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
02b6186616262617c89a1e33d1bf620cc853ca2d | 334 | py | Python | 28. Implement strStr()/solution.py | alexwhyy/leetcode | 41664aa48137677d2f98817b9c512d76f13c525f | [
"MIT"
] | null | null | null | 28. Implement strStr()/solution.py | alexwhyy/leetcode | 41664aa48137677d2f98817b9c512d76f13c525f | [
"MIT"
] | null | null | null | 28. Implement strStr()/solution.py | alexwhyy/leetcode | 41664aa48137677d2f98817b9c512d76f13c525f | [
"MIT"
] | null | null | null | class Solution:
def strStr(self, haystack: str, needle: str) -> int:
if needle == "":
return 0
for i in range(0, len(haystack) - len(needle) + 1):
print(haystack[i : i + len(needle)], needle)
if haystack[i : i + len(needle)] == needle:
return i
return -1 | 37.111111 | 59 | 0.502994 | 42 | 334 | 4 | 0.452381 | 0.160714 | 0.119048 | 0.154762 | 0.297619 | 0.297619 | 0 | 0 | 0 | 0 | 0 | 0.018868 | 0.365269 | 334 | 9 | 60 | 37.111111 | 0.773585 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0 | 0 | 0.555556 | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
02c92d6241ebe854f6a64f06a949f8d5440cd141 | 13,749 | py | Python | matching/retrieval.py | Macielyoung/sentence_representation_matching | aa33147eb870a805f69dbc54c2177b11a94cf814 | [
"Apache-2.0"
] | 22 | 2022-01-24T10:08:39.000Z | 2022-03-31T10:47:05.000Z | matching/retrieval.py | Macielyoung/sentence_representation_matching | aa33147eb870a805f69dbc54c2177b11a94cf814 | [
"Apache-2.0"
] | 3 | 2022-03-06T11:52:25.000Z | 2022-03-15T06:32:17.000Z | matching/retrieval.py | Macielyoung/sentence_representation_matching | aa33147eb870a805f69dbc54c2177b11a94cf814 | [
"Apache-2.0"
] | 5 | 2022-02-28T09:13:04.000Z | 2022-03-22T12:50:09.000Z | from simcse import SimCSE
from esimcse import ESimCSE
from promptbert import PromptBERT
from sbert import SBERT
from cosent import CoSent
from config import Params
from log import logger
import torch
from transformers import AutoTokenizer
class SimCSERetrieval(object):
def __init__(self, pretrained_model_path, simcse_path, pool_type, dropout):
self.tokenizer = AutoTokenizer.from_pretrained(pretrained_model_path)
model = SimCSE(Params.pretrained_model, pool_type, dropout)
self.checkpoint = torch.load(simcse_path, map_location='cpu')
model.load_state_dict(self.checkpoint['model_state_dict'])
model.eval()
self.model = model
def print_checkpoint_info(self):
loss = self.checkpoint['loss']
epoch = self.checkpoint['epoch']
model_info = {'loss': loss, 'epoch': epoch}
return model_info
def calculate_sentence_embedding(self, sentence):
device = "cpu"
input_encodings = self.tokenizer(sentence,
padding=True,
truncation=True,
max_length=Params.max_length,
return_tensors='pt')
sentence_embedding = self.model(input_encodings['input_ids'].to(device),
input_encodings['attention_mask'].to(device),
input_encodings['token_type_ids'].to(device))
return sentence_embedding
def calculate_sentence_similarity(self, sentence1, sentence2):
sentence1 = sentence1.strip()
sentence2 = sentence2.strip()
sentence1_embedding = self.calculate_sentence_embedding(sentence1)
sentence2_embedding = self.calculate_sentence_embedding(sentence2)
similarity = torch.cosine_similarity(sentence1_embedding, sentence2_embedding, dim=-1)
similarity = float(similarity.item())
return similarity
class ESimCSERetrieval(object):
def __init__(self, pretrained_model_path, esimcse_path, dropout):
self.tokenizer = AutoTokenizer.from_pretrained(pretrained_model_path)
model = ESimCSE(Params.pretrained_model, dropout)
self.checkpoint = torch.load(esimcse_path, map_location='cpu')
model.load_state_dict(self.checkpoint['model_state_dict'])
model.eval()
self.model = model
def print_checkpoint_info(self):
loss = self.checkpoint['loss']
epoch = self.checkpoint['epoch']
model_info = {'loss': loss, 'epoch': epoch}
return model_info
def calculate_sentence_embedding(self, sentence):
device = "cpu"
input_encodings = self.tokenizer(sentence,
padding=True,
truncation=True,
max_length=Params.max_length,
return_tensors='pt')
sentence_embedding = self.model(input_encodings['input_ids'].to(device),
input_encodings['attention_mask'].to(device),
input_encodings['token_type_ids'].to(device))
return sentence_embedding
def calculate_sentence_similarity(self, sentence1, sentence2):
sentence1 = sentence1.strip()
sentence2 = sentence2.strip()
sentence1_embedding = self.calculate_sentence_embedding(sentence1)
sentence2_embedding = self.calculate_sentence_embedding(sentence2)
similarity = torch.cosine_similarity(sentence1_embedding, sentence2_embedding, dim=-1)
similarity = float(similarity.item())
return similarity
class PromptBertRetrieval(object):
def __init__(self, pretrained_model_path, promptbert_path, dropout):
super().__init__()
self.tokenizer = AutoTokenizer.from_pretrained(pretrained_model_path)
special_token_dict = {'additional_special_tokens': ['[X]']}
self.tokenizer.add_special_tokens(special_token_dict)
mask_id = self.tokenizer.convert_tokens_to_ids(Params.mask_token)
model = PromptBERT(pretrained_model_path, dropout, mask_id)
model.encoder.resize_token_embeddings(len(self.tokenizer))
checkpoint = torch.load(promptbert_path, map_location='cpu')
model.load_state_dict(checkpoint['model_state_dict'])
self.checkpoint = checkpoint
self.model = model
def print_checkpoint_info(self):
loss = self.checkpoint['loss']
epoch = self.checkpoint['epoch']
model_info = {'loss': loss, 'epoch': epoch}
return model_info
def calculate_sentence_mask_embedding(self, sentence):
device = "cpu"
prompt_sentence = Params.prompt_templates[0].replace("[X]", sentence)
prompt_encodings = self.tokenizer(prompt_sentence,
padding=True,
truncation=True,
max_length=Params.max_length,
return_tensors='pt')
sentence_mask_embedding = self.model.calculate_mask_embedding(prompt_encodings['input_ids'].to(device),
prompt_encodings['attention_mask'].to(device),
prompt_encodings['token_type_ids'].to(device))
return sentence_mask_embedding
def calculate_sentence_embedding(self, sentence):
device = "cpu"
prompt_sentence = Params.prompt_templates[0].replace("[X]", sentence)
sentence_num = len(self.tokenizer.tokenize(sentence))
template_sentence = Params.prompt_templates[0].replace("[X]", "[X]"*sentence_num)
prompt_encodings = self.tokenizer(prompt_sentence,
padding=True,
truncation=True,
max_length=Params.max_length,
return_tensors='pt')
template_encodings = self.tokenizer(template_sentence,
padding=True,
truncation=True,
max_length=Params.max_length,
return_tensors='pt')
sentence_embedding = self.model(prompt_input_ids=prompt_encodings['input_ids'].to(device),
prompt_attention_mask=prompt_encodings['attention_mask'].to(device),
prompt_token_type_ids=prompt_encodings['token_type_ids'].to(device),
template_input_ids=template_encodings['input_ids'].to(device),
template_attention_mask=template_encodings['attention_mask'].to(device),
template_token_type_ids=template_encodings['token_type_ids'].to(device))
return sentence_embedding
def calculate_sentence_similarity(self, sentence1, sentence2):
# sentence1_embedding = self.calculate_sentence_mask_embedding(sentence1)
# sentence2_embedding = self.calculate_sentence_mask_embedding(sentence2)
sentence1_embedding = self.calculate_sentence_embedding(sentence1)
sentence2_embedding = self.calculate_sentence_embedding(sentence2)
similarity = torch.cosine_similarity(sentence1_embedding, sentence2_embedding, dim=-1)
similarity = float(similarity.item())
return similarity
class SBERTRetrieval(object):
def __init__(self, pretrained_model_path, sbert_path, pool_type, dropout):
self.tokenizer = AutoTokenizer.from_pretrained(pretrained_model_path)
model = SBERT(Params.pretrained_model, pool_type, dropout)
self.checkpoint = torch.load(sbert_path, map_location='cpu')
model.load_state_dict(self.checkpoint['model_state_dict'])
model.eval()
self.model = model
def print_checkpoint_info(self):
loss = self.checkpoint['train_loss']
epoch = self.checkpoint['epoch']
model_info = {'loss': loss, 'epoch': epoch}
return model_info
def calculate_sentence_embedding(self, sentence):
device = "cpu"
input_encodings = self.tokenizer(sentence,
padding=True,
truncation=True,
max_length=Params.max_length,
return_tensors='pt')
sentence_embedding = self.model(input_encodings['input_ids'].to(device),
input_encodings['attention_mask'].to(device),
input_encodings['token_type_ids'].to(device))
return sentence_embedding
def calculate_sentence_similarity(self, sentence1, sentence2):
sentence1 = sentence1.strip()
sentence2 = sentence2.strip()
sentence1_embedding = self.calculate_sentence_embedding(sentence1)
sentence2_embedding = self.calculate_sentence_embedding(sentence2)
similarity = torch.cosine_similarity(sentence1_embedding, sentence2_embedding, dim=-1)
similarity = float(similarity.item())
return similarity
class CoSentRetrieval(object):
def __init__(self, pretrained_model_path, cosent_path):
self.tokenizer = AutoTokenizer.from_pretrained(pretrained_model_path)
model = CoSent(Params.pretrained_model, Params.cosent_pool_type, Params.cosent_dropout)
self.checkpoint = torch.load(cosent_path, map_location='cpu')
model.load_state_dict(self.checkpoint['model_state_dict'])
model.eval()
self.model = model
def print_checkpoint_info(self):
loss = self.checkpoint['loss']
epoch = self.checkpoint['epoch']
model_info = {'loss': loss, 'epoch': epoch}
return model_info
def calculate_sentence_embedding(self, sentence):
device = "cpu"
input_encodings = self.tokenizer(sentence,
padding=True,
truncation=True,
max_length=Params.max_length,
return_tensors='pt')
sentence_embedding = self.model(input_encodings['input_ids'].to(device),
input_encodings['attention_mask'].to(device),
input_encodings['token_type_ids'].to(device))
return sentence_embedding
def calculate_sentence_similarity(self, sentence1, sentence2):
sentence1 = sentence1.strip()
sentence2 = sentence2.strip()
sentence1_embedding = self.calculate_sentence_embedding(sentence1)
sentence2_embedding = self.calculate_sentence_embedding(sentence2)
similarity = torch.cosine_similarity(sentence1_embedding, sentence2_embedding, dim=-1)
similarity = float(similarity.item())
return similarity
simcse_retrieval = SimCSERetrieval(Params.pretrained_model, Params.simcse_model, Params.pool_type, Params.simcse_dropout)
logger.info("start simcse model succussfully!")
esimcse_repeat_retrieval = ESimCSERetrieval(Params.pretrained_model, Params.esimcse_repeat_model, Params.esimcse_repeat_dropout)
logger.info("start esimcse repeat model succussfully!")
esimcse_same_retrieval = ESimCSERetrieval(Params.pretrained_model, Params.esimcse_same_model, Params.esimcse_same_dropout)
logger.info("start esimcse same model succussfully!")
esimcse_multi_retrieval = ESimCSERetrieval(Params.pretrained_model, Params.esimcse_multi_model, Params.esimcse_multi_dropout)
logger.info("start esimcse multi model succussfully!")
promptbert_retrieval = PromptBertRetrieval(Params.pretrained_model, Params.promptbert_model, Params.promptbert_dropout)
logger.info("start promptbert model succussfully!")
sbert_retrieval = SBERTRetrieval(Params.pretrained_model, Params.sbert_model, Params.sbert_pool_type, Params.sbert_dropout)
logger.info("start sbert model succussfully!")
cosent_retrieval = CoSentRetrieval(Params.pretrained_model, Params.cosent_model)
logger.info("start cosent model succussfully!")
if __name__ == "__main__":
# model_path = "models/esimcse_0.32_0.15_160.pth"
# model_path = "models/esimcse_multi_0.15_64.pth"
# model_path = "models/esimcse_0.15_64.pth"
# simcse_retrieval = SimCSERetrieval(Params.pretrained_model, Params.simcse_model, Params.pool_type, Params.simcse_dropout)
# model_info = simcse_retrieval.print_checkpoint_info()
# print(model_info)
model_info = sbert_retrieval.print_checkpoint_info()
print(model_info)
while True:
print("input your sentence1:")
sentence1 = input()
print("input your sentence2:")
sentence2 = input()
sbert_sentence_similarity = sbert_retrieval.calculate_sentence_similarity(sentence1, sentence2)
# promptbert_sentence_similarity = prom.calculate_sentence_similarity(sentence1, sentence2)
# print("simcse sim: {}, promptbert sim: {}".format(simcse_sentence_similarity, promptbert_sentence_similarity))
print("sbert similarity: {}".format(sbert_sentence_similarity)) | 48.928826 | 128 | 0.630664 | 1,331 | 13,749 | 6.202104 | 0.083396 | 0.051484 | 0.047244 | 0.04361 | 0.761599 | 0.709267 | 0.702483 | 0.6149 | 0.59576 | 0.588007 | 0 | 0.009685 | 0.286566 | 13,749 | 281 | 129 | 48.928826 | 0.831889 | 0.049167 | 0 | 0.657534 | 0 | 0 | 0.064304 | 0.001914 | 0 | 0 | 0 | 0 | 0 | 1 | 0.09589 | false | 0 | 0.041096 | 0 | 0.232877 | 0.045662 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
02e3d3385e104cc569c1458b36ecf6ad0a158a11 | 613 | py | Python | lintcode/499.py | jianershi/algorithm | c3c38723b9c5f1cc745550d89e228f92fd4abfb2 | [
"MIT"
] | 1 | 2021-01-08T06:57:49.000Z | 2021-01-08T06:57:49.000Z | lintcode/499.py | jianershi/algorithm | c3c38723b9c5f1cc745550d89e228f92fd4abfb2 | [
"MIT"
] | null | null | null | lintcode/499.py | jianershi/algorithm | c3c38723b9c5f1cc745550d89e228f92fd4abfb2 | [
"MIT"
] | 1 | 2021-01-08T06:57:52.000Z | 2021-01-08T06:57:52.000Z | """
499. Word Count (Map Reduce)
https://www.lintcode.com/problem/word-count-map-reduce/description
"""
class WordCount:
# @param {str} line a text, for example "Bye Bye see you next"
def mapper(self, _, line):
# Write your code here
# Please use 'yield key, value'
word_lists = line.split(" ")
for word in word_lists:
yield word, 1
# @param key is from mapper
# @param values is a set of value with the same key
def reducer(self, key, values):
# Write your code here
# Please use 'yield key, value'
yield key, sum(values)
| 29.190476 | 66 | 0.611746 | 88 | 613 | 4.227273 | 0.568182 | 0.064516 | 0.064516 | 0.096774 | 0.209677 | 0.209677 | 0.209677 | 0.209677 | 0.209677 | 0 | 0 | 0.009132 | 0.285481 | 613 | 20 | 67 | 30.65 | 0.840183 | 0.546493 | 0 | 0 | 0 | 0 | 0.003788 | 0 | 0 | 0 | 0 | 0.05 | 0 | 1 | 0.285714 | false | 0 | 0 | 0 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
02ed59dd65e3f0007ed59a3660fc0e47a1a878ad | 461 | py | Python | config/dotenv.py | CharuchithRanjit/open-pos | ac749a0f2a6c59077d2c13f13e776963e130501f | [
"MIT"
] | null | null | null | config/dotenv.py | CharuchithRanjit/open-pos | ac749a0f2a6c59077d2c13f13e776963e130501f | [
"MIT"
] | null | null | null | config/dotenv.py | CharuchithRanjit/open-pos | ac749a0f2a6c59077d2c13f13e776963e130501f | [
"MIT"
] | null | null | null | """
Loads dotenv variables
Classes:
None
Functions:
None
Misc variables:
DATABASE_KEY (str) -- The key for the database
DATABASE_PASSWORD (str) -- The password for the database
DATABASE_URL (str) -- The url for the database
"""
import os
from dotenv import load_dotenv, find_dotenv
load_dotenv(find_dotenv())
DATABASE_KEY = os.environ.get("DATABASE_KEY")
DATABASE_PASSWORD = os.environ.get("DATABASE_PASSWORD")
DATABASE_URL = os.environ.get("SUPABASE_URL") | 20.954545 | 56 | 0.776573 | 67 | 461 | 5.149254 | 0.328358 | 0.095652 | 0.121739 | 0.127536 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121475 | 461 | 22 | 57 | 20.954545 | 0.851852 | 0.481562 | 0 | 0 | 0 | 0 | 0.176724 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.166667 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 2 |
02f1e521d0c60cd1bdde651eb786414631bc4c55 | 1,377 | py | Python | classifier.py | hemu243/focus-web-crawler | 8e882315d947f04b207ec76a64fa952f18105d73 | [
"MIT"
] | 2 | 2020-02-03T02:31:09.000Z | 2021-02-03T11:54:44.000Z | classifier.py | hemu243/focus-web-crawler | 8e882315d947f04b207ec76a64fa952f18105d73 | [
"MIT"
] | null | null | null | classifier.py | hemu243/focus-web-crawler | 8e882315d947f04b207ec76a64fa952f18105d73 | [
"MIT"
] | null | null | null | #
from abc import ABCMeta
import metapy
class WebClassifier(object):
"""
This module is abstract module which needs to be called by
inherit
"""
__metaclass__ = ABCMeta
def __init__(self, configFile):
"""
Initialized basic of classifier like fwd index, index
:param configFile:
"""
#metapy.log_to_stderr()
# Loading indexes
self.invertedIndex = metapy.index.make_inverted_index(configFile)
self.fwdIndex = metapy.index.make_forward_index(configFile)
# Define multi class data set
self.multiClassDataset = metapy.classify.MulticlassDataset(self.fwdIndex)
self.classifier = self.getClassifier(self.multiClassDataset, self.fwdIndex, self.invertedIndex)
def getClassifier(self, training, fwdIndex, invertedIndex):
"""
This function needs to implemented by inherit class
:param training: training set
:param fwdIndex: fwdIndex created by metapy
:param invertedIndex: inverted index created by metapy
:return: classifier instance
"""
raise NotImplemented("Must implement this function by inherit class")
def score(self, link_text, page_title, body_text):
"""
Should implemented by inherit class
:param link_text: url link text (anchor text)
:param page_title: page title
:param body_text: body text
:return: double score value between 0 - 1
"""
raise NotImplemented("Must implement this function by inherit class") | 30.6 | 97 | 0.760349 | 172 | 1,377 | 5.97093 | 0.418605 | 0.043817 | 0.054528 | 0.064265 | 0.171373 | 0.11295 | 0.11295 | 0.11295 | 0.11295 | 0 | 0 | 0.001729 | 0.159768 | 1,377 | 45 | 98 | 30.6 | 0.885912 | 0.43573 | 0 | 0.153846 | 0 | 0 | 0.126227 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.230769 | false | 0 | 0.153846 | 0 | 0.538462 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
02f2dc9948709df77cd05687fd7477b4be25fe0c | 609 | py | Python | backend/tests/access/test_access_event_publish.py | fjacob21/mididecweb | b65f28eb6fdeafa265796b6190a4264a5eac54ce | [
"MIT"
] | null | null | null | backend/tests/access/test_access_event_publish.py | fjacob21/mididecweb | b65f28eb6fdeafa265796b6190a4264a5eac54ce | [
"MIT"
] | 88 | 2016-11-12T14:54:38.000Z | 2018-08-02T00:25:07.000Z | backend/tests/access/test_access_event_publish.py | mididecouverte/mididecweb | b65f28eb6fdeafa265796b6190a4264a5eac54ce | [
"MIT"
] | null | null | null | from src.access import EventPublishAccess
from generate_access_data import generate_access_data
def test_publish_event_access():
sessions = generate_access_data()
event = sessions['user'].events.get('test')
useraccess = EventPublishAccess(sessions['user'], event)
manageraccess = EventPublishAccess(sessions['manager'], event)
superaccess = EventPublishAccess(sessions['super'], event)
noneaccess = EventPublishAccess(sessions['none'], event)
assert not useraccess.granted()
assert manageraccess.granted()
assert superaccess.granted()
assert not noneaccess.granted()
| 38.0625 | 66 | 0.760263 | 62 | 609 | 7.322581 | 0.403226 | 0.229075 | 0.118943 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137931 | 609 | 15 | 67 | 40.6 | 0.864762 | 0 | 0 | 0 | 1 | 0 | 0.045977 | 0 | 0 | 0 | 0 | 0 | 0.307692 | 1 | 0.076923 | false | 0 | 0.153846 | 0 | 0.230769 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
02f7dfdc4c7be780ca3def3290b1d78bbe909246 | 959 | py | Python | setup.py | jnsgruk/lightkube-models | 7fce1ed1d00ee599eaa4fad82868ec6b55c84c8d | [
"MIT"
] | 1 | 2021-10-14T08:49:10.000Z | 2021-10-14T08:49:10.000Z | setup.py | jnsgruk/lightkube-models | 7fce1ed1d00ee599eaa4fad82868ec6b55c84c8d | [
"MIT"
] | 2 | 2021-10-14T18:09:31.000Z | 2021-10-14T18:09:52.000Z | setup.py | jnsgruk/lightkube-models | 7fce1ed1d00ee599eaa4fad82868ec6b55c84c8d | [
"MIT"
] | 1 | 2021-10-13T15:08:58.000Z | 2021-10-13T15:08:58.000Z | from setuptools import setup
from pathlib import Path
from lightkube.models import __version__
setup(
name='lightkube-models',
version=__version__,
description='Models and Resources for lightkube module',
long_description=Path("README.md").read_text(),
long_description_content_type="text/markdown",
author='Giuseppe Tribulato',
author_email='gtsystem@gmail.com',
license='Apache Software License',
url='https://github.com/gtsystem/lightkube-models',
packages=['lightkube.models', 'lightkube.resources'],
classifiers=[
'Development Status :: 4 - Beta',
'Intended Audience :: Developers',
'Intended Audience :: System Administrators',
'License :: OSI Approved :: MIT License',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
]
)
| 33.068966 | 60 | 0.667362 | 100 | 959 | 6.26 | 0.58 | 0.095847 | 0.159744 | 0.166134 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011796 | 0.20438 | 959 | 28 | 61 | 34.25 | 0.80865 | 0 | 0 | 0 | 0 | 0 | 0.527633 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.12 | 0 | 0.12 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
02fc1e3721895fe496443e7ceaa950d900683542 | 3,002 | py | Python | examples/session2-fi/start2.py | futurice/PythonInBrowser | 066ab28ffad265efc7968b87f33dab2c68216d9d | [
"MIT"
] | 4 | 2015-12-08T19:34:49.000Z | 2019-09-08T22:11:05.000Z | examples/session2-fi/start2.py | futurice/PythonInBrowser | 066ab28ffad265efc7968b87f33dab2c68216d9d | [
"MIT"
] | 18 | 2016-10-14T13:48:39.000Z | 2019-10-11T12:14:21.000Z | examples/session2-fi/start2.py | futurice/PythonInBrowser | 066ab28ffad265efc7968b87f33dab2c68216d9d | [
"MIT"
] | 4 | 2015-11-18T15:18:43.000Z | 2018-03-02T09:36:23.000Z | # Käydään läpi mitä opimme viime viikolla (ja myös jotakin uutta)
# Jos jokin asia mietityttää, kysy vain rohkeasti apua!
##### INFO #####
# Viime viikon tärkeimmät asiat olivat:
# 1. print-komento
# 2. muuttujan käyttö
# 3. kilpikonnan käyttäminen piirtämiseen
# Ohelmointi vaatii usein tiedon etsimistä muista lähteistä
# ja uuden tiedon soveltamista omaan ohjelmaasi.
# Käytännössä tietoa ohjelmoinnista löytää hyvin Internetistä.
# Käytä viime viikon tehtäviä lähteenä tehdessäsi seuraavia tehtäviä
##### TEHTÄVÄT #####
##### TEHTÄVÄ 1 #####
# 1. kirjoita koodinpätkä joka printtaa kaksi riviä
# Ensimmäisellä rivillä tulee olla teksti:
# "Minun lempivärini on 'lempivärisi'"
# Toisella rivillä pitää olla yhtälö joka laskee
# kuukauden jäljellä olevat päivät
# VINKKI: tarkista tietokoneelta kuinka monesko päivä tänään on ja kuinka monta päivää tässä kuussa on.
# Printtauksen tulee sisältää vain yksi numero: yhtälön ratkaisu
# <------ kirjoita koodisi tähän (ja klikkaa 'Run' printataksesi)------->
##### TEHTÄVÄ 2 #####
# Yhtenä päivänä lempivärisi saattaa olla vihreä ja toisena oranssi.
# Luo muuttuja nimeltä lempivari ja anna sille arvoksi lempivärisi
# <------ kirjoita muuttuja tähän ------->
# Kirjoita sitten koodi joka printtaa tekstin "Lempivärini in 'lempivärisi'"
# Käytä tällä kertaa muuttujaa lempivari ilmaisemaan lempivärisi
# <------ kirjoita koodisi tähän (ja klikkaa 'Run' printataksesi)------->
# Tarkistuksena muuta lempiVari muuttujan arvoa ja klikkaa 'Run'
# tarkista että lempiväri on muuttunut printtauksessa
##### TEHTÄVÄ 3 #####
# Pystyäksemme piirtämään viereiselle piirtoalueelle, meidän täytyy käyttää kilpikonnaa
# Tätä varten meidän tulee tuoda (importtaa) kilpikonna ja asettaa se muuttujaan.
# <------ Tuo (import) kilpikonna tässä ------->
# näin: import turtle
# <------ aseta kilpikonna muuttujaan 'jane', muistatko? ------>
# Piirrä seuraava kuvia
#
# eteenpäin 50 pikseliä, käännä 135 astetta oikealle
# eteenpäin 100 pikseliä, käännä 135 astetta oikealle, eteenpäin 100 pikseliä,
# käännä 135 astetta oikealla ja siirrä 50 pikseliä eteenpäin.
#
# Pystytkö arvaamaan minkä kuvion kilpikonna piirtää?
# <------ kirjoita koodisi tähän ------->
# On mahdollista piirtää myös muilla väreillä. Musta on vain oletusväri.
# Kilpikonnan värin voi muuttaa lisäämällä seuraavan rivin ennen piirtämistä:
# jane.color("pink")
# Voit myös käyttää muuttujaa määrittääksesi piirroksen värin.
# Muuta muuttujan lempivari arvo englanniksi esim. "green" (vihreä), "blue" (sininen) tai "yellow" (keltainen)
# ja korvaa väriä vaihtava koodi seuraavalla rivillä
#
# jane.color(lempivari)
#
# Muista että käyttäessäsi muuttujia et tarvitse lainausmerkkejä
# Onnittelut! Olet käynyt läpi viime viikon tärkeimmät asiat
# ja oppinut piirtämään eri väreillä
##### LISÄTEHTÄVÄT #####
# Mikä olisi helpoin tapa piirtää kolmio loppuun?
# Muuta muuttujan lempivari arvoa ja kokeile että se toimii.
# Miten voisit piirtää toisen kolmion eri suuntaan ja eri värillä
| 37.525 | 110 | 0.758161 | 351 | 3,002 | 6.48433 | 0.646724 | 0.014499 | 0.026362 | 0.031634 | 0.088752 | 0.088752 | 0.088752 | 0.049209 | 0.049209 | 0.049209 | 0 | 0.010156 | 0.147235 | 3,002 | 79 | 111 | 38 | 0.878906 | 0.935376 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
02fe1589d692043102c05d5d014222183830f3c7 | 45,373 | py | Python | clients/python/core_pb2.py | cloudwheels/grpc-test-gateway | 5fe6564804cc1dfd2761138977d9282519b8ffc6 | [
"MIT"
] | 3 | 2020-05-01T15:27:18.000Z | 2020-05-28T15:11:34.000Z | clients/python/core_pb2.py | cloudwheels/grpc-test-gateway | 5fe6564804cc1dfd2761138977d9282519b8ffc6 | [
"MIT"
] | null | null | null | clients/python/core_pb2.py | cloudwheels/grpc-test-gateway | 5fe6564804cc1dfd2761138977d9282519b8ffc6 | [
"MIT"
] | 3 | 2020-09-15T17:24:52.000Z | 2021-07-07T10:01:25.000Z | # Generated by the protocol buffer compiler. DO NOT EDIT!
# source: core.proto
import sys
_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
from google.protobuf import descriptor_pb2
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor.FileDescriptor(
name='core.proto',
package='org.dash.platform.dapi.v0',
syntax='proto3',
serialized_pb=_b('\n\ncore.proto\x12\x19org.dash.platform.dapi.v0\"\x12\n\x10GetStatusRequest\"\xe5\x01\n\x11GetStatusResponse\x12\x14\n\x0c\x63ore_version\x18\x01 \x01(\r\x12\x18\n\x10protocol_version\x18\x02 \x01(\r\x12\x0e\n\x06\x62locks\x18\x03 \x01(\r\x12\x13\n\x0btime_offset\x18\x04 \x01(\r\x12\x13\n\x0b\x63onnections\x18\x05 \x01(\r\x12\r\n\x05proxy\x18\x06 \x01(\t\x12\x12\n\ndifficulty\x18\x07 \x01(\x01\x12\x0f\n\x07testnet\x18\x08 \x01(\x08\x12\x11\n\trelay_fee\x18\t \x01(\x01\x12\x0e\n\x06\x65rrors\x18\n \x01(\t\x12\x0f\n\x07network\x18\x0b \x01(\t\"<\n\x0fGetBlockRequest\x12\x10\n\x06height\x18\x01 \x01(\rH\x00\x12\x0e\n\x04hash\x18\x02 \x01(\tH\x00\x42\x07\n\x05\x62lock\"!\n\x10GetBlockResponse\x12\r\n\x05\x62lock\x18\x01 \x01(\x0c\"]\n\x16SendTransactionRequest\x12\x13\n\x0btransaction\x18\x01 \x01(\x0c\x12\x17\n\x0f\x61llow_high_fees\x18\x02 \x01(\x08\x12\x15\n\rbypass_limits\x18\x03 \x01(\x08\"1\n\x17SendTransactionResponse\x12\x16\n\x0etransaction_id\x18\x01 \x01(\t\"#\n\x15GetTransactionRequest\x12\n\n\x02id\x18\x01 \x01(\t\"-\n\x16GetTransactionResponse\x12\x13\n\x0btransaction\x18\x01 \x01(\x0c\"x\n!BlockHeadersWithChainLocksRequest\x12\x19\n\x0f\x66rom_block_hash\x18\x01 \x01(\x0cH\x00\x12\x1b\n\x11\x66rom_block_height\x18\x02 \x01(\rH\x00\x12\r\n\x05\x63ount\x18\x03 \x01(\rB\x0c\n\nfrom_block\"\xd3\x01\n\"BlockHeadersWithChainLocksResponse\x12@\n\rblock_headers\x18\x01 \x01(\x0b\x32\'.org.dash.platform.dapi.v0.BlockHeadersH\x00\x12^\n\x1d\x63hain_lock_signature_messages\x18\x02 \x01(\x0b\x32\x35.org.dash.platform.dapi.v0.ChainLockSignatureMessagesH\x00\x42\x0b\n\tresponses\"\x1f\n\x0c\x42lockHeaders\x12\x0f\n\x07headers\x18\x01 \x03(\x0c\".\n\x1a\x43hainLockSignatureMessages\x12\x10\n\x08messages\x18\x01 \x03(\x0c\"3\n!GetEstimatedTransactionFeeRequest\x12\x0e\n\x06\x62locks\x18\x01 \x01(\r\"1\n\"GetEstimatedTransactionFeeResponse\x12\x0b\n\x03\x66\x65\x65\x18\x01 \x01(\x01\x32\x89\x06\n\x04\x43ore\x12\x66\n\tgetStatus\x12+.org.dash.platform.dapi.v0.GetStatusRequest\x1a,.org.dash.platform.dapi.v0.GetStatusResponse\x12\x63\n\x08getBlock\x12*.org.dash.platform.dapi.v0.GetBlockRequest\x1a+.org.dash.platform.dapi.v0.GetBlockResponse\x12x\n\x0fsendTransaction\x12\x31.org.dash.platform.dapi.v0.SendTransactionRequest\x1a\x32.org.dash.platform.dapi.v0.SendTransactionResponse\x12u\n\x0egetTransaction\x12\x30.org.dash.platform.dapi.v0.GetTransactionRequest\x1a\x31.org.dash.platform.dapi.v0.GetTransactionResponse\x12\x99\x01\n\x1agetEstimatedTransactionFee\x12<.org.dash.platform.dapi.v0.GetEstimatedTransactionFeeRequest\x1a=.org.dash.platform.dapi.v0.GetEstimatedTransactionFeeResponse\x12\xa6\x01\n%subscribeToBlockHeadersWithChainLocks\x12<.org.dash.platform.dapi.v0.BlockHeadersWithChainLocksRequest\x1a=.org.dash.platform.dapi.v0.BlockHeadersWithChainLocksResponse0\x01\x62\x06proto3')
)
_GETSTATUSREQUEST = _descriptor.Descriptor(
name='GetStatusRequest',
full_name='org.dash.platform.dapi.v0.GetStatusRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=41,
serialized_end=59,
)
_GETSTATUSRESPONSE = _descriptor.Descriptor(
name='GetStatusResponse',
full_name='org.dash.platform.dapi.v0.GetStatusResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='core_version', full_name='org.dash.platform.dapi.v0.GetStatusResponse.core_version', index=0,
number=1, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='protocol_version', full_name='org.dash.platform.dapi.v0.GetStatusResponse.protocol_version', index=1,
number=2, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='blocks', full_name='org.dash.platform.dapi.v0.GetStatusResponse.blocks', index=2,
number=3, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='time_offset', full_name='org.dash.platform.dapi.v0.GetStatusResponse.time_offset', index=3,
number=4, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='connections', full_name='org.dash.platform.dapi.v0.GetStatusResponse.connections', index=4,
number=5, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='proxy', full_name='org.dash.platform.dapi.v0.GetStatusResponse.proxy', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='difficulty', full_name='org.dash.platform.dapi.v0.GetStatusResponse.difficulty', index=6,
number=7, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='testnet', full_name='org.dash.platform.dapi.v0.GetStatusResponse.testnet', index=7,
number=8, type=8, cpp_type=7, label=1,
has_default_value=False, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='relay_fee', full_name='org.dash.platform.dapi.v0.GetStatusResponse.relay_fee', index=8,
number=9, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='errors', full_name='org.dash.platform.dapi.v0.GetStatusResponse.errors', index=9,
number=10, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='network', full_name='org.dash.platform.dapi.v0.GetStatusResponse.network', index=10,
number=11, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=62,
serialized_end=291,
)
_GETBLOCKREQUEST = _descriptor.Descriptor(
name='GetBlockRequest',
full_name='org.dash.platform.dapi.v0.GetBlockRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='height', full_name='org.dash.platform.dapi.v0.GetBlockRequest.height', index=0,
number=1, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='hash', full_name='org.dash.platform.dapi.v0.GetBlockRequest.hash', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name='block', full_name='org.dash.platform.dapi.v0.GetBlockRequest.block',
index=0, containing_type=None, fields=[]),
],
serialized_start=293,
serialized_end=353,
)
_GETBLOCKRESPONSE = _descriptor.Descriptor(
name='GetBlockResponse',
full_name='org.dash.platform.dapi.v0.GetBlockResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='block', full_name='org.dash.platform.dapi.v0.GetBlockResponse.block', index=0,
number=1, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=_b(""),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=355,
serialized_end=388,
)
_SENDTRANSACTIONREQUEST = _descriptor.Descriptor(
name='SendTransactionRequest',
full_name='org.dash.platform.dapi.v0.SendTransactionRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='transaction', full_name='org.dash.platform.dapi.v0.SendTransactionRequest.transaction', index=0,
number=1, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=_b(""),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='allow_high_fees', full_name='org.dash.platform.dapi.v0.SendTransactionRequest.allow_high_fees', index=1,
number=2, type=8, cpp_type=7, label=1,
has_default_value=False, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='bypass_limits', full_name='org.dash.platform.dapi.v0.SendTransactionRequest.bypass_limits', index=2,
number=3, type=8, cpp_type=7, label=1,
has_default_value=False, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=390,
serialized_end=483,
)
_SENDTRANSACTIONRESPONSE = _descriptor.Descriptor(
name='SendTransactionResponse',
full_name='org.dash.platform.dapi.v0.SendTransactionResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='transaction_id', full_name='org.dash.platform.dapi.v0.SendTransactionResponse.transaction_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=485,
serialized_end=534,
)
_GETTRANSACTIONREQUEST = _descriptor.Descriptor(
name='GetTransactionRequest',
full_name='org.dash.platform.dapi.v0.GetTransactionRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='org.dash.platform.dapi.v0.GetTransactionRequest.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=536,
serialized_end=571,
)
_GETTRANSACTIONRESPONSE = _descriptor.Descriptor(
name='GetTransactionResponse',
full_name='org.dash.platform.dapi.v0.GetTransactionResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='transaction', full_name='org.dash.platform.dapi.v0.GetTransactionResponse.transaction', index=0,
number=1, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=_b(""),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=573,
serialized_end=618,
)
_BLOCKHEADERSWITHCHAINLOCKSREQUEST = _descriptor.Descriptor(
name='BlockHeadersWithChainLocksRequest',
full_name='org.dash.platform.dapi.v0.BlockHeadersWithChainLocksRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='from_block_hash', full_name='org.dash.platform.dapi.v0.BlockHeadersWithChainLocksRequest.from_block_hash', index=0,
number=1, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=_b(""),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='from_block_height', full_name='org.dash.platform.dapi.v0.BlockHeadersWithChainLocksRequest.from_block_height', index=1,
number=2, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='count', full_name='org.dash.platform.dapi.v0.BlockHeadersWithChainLocksRequest.count', index=2,
number=3, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name='from_block', full_name='org.dash.platform.dapi.v0.BlockHeadersWithChainLocksRequest.from_block',
index=0, containing_type=None, fields=[]),
],
serialized_start=620,
serialized_end=740,
)
_BLOCKHEADERSWITHCHAINLOCKSRESPONSE = _descriptor.Descriptor(
name='BlockHeadersWithChainLocksResponse',
full_name='org.dash.platform.dapi.v0.BlockHeadersWithChainLocksResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='block_headers', full_name='org.dash.platform.dapi.v0.BlockHeadersWithChainLocksResponse.block_headers', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='chain_lock_signature_messages', full_name='org.dash.platform.dapi.v0.BlockHeadersWithChainLocksResponse.chain_lock_signature_messages', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name='responses', full_name='org.dash.platform.dapi.v0.BlockHeadersWithChainLocksResponse.responses',
index=0, containing_type=None, fields=[]),
],
serialized_start=743,
serialized_end=954,
)
_BLOCKHEADERS = _descriptor.Descriptor(
name='BlockHeaders',
full_name='org.dash.platform.dapi.v0.BlockHeaders',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='headers', full_name='org.dash.platform.dapi.v0.BlockHeaders.headers', index=0,
number=1, type=12, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=956,
serialized_end=987,
)
_CHAINLOCKSIGNATUREMESSAGES = _descriptor.Descriptor(
name='ChainLockSignatureMessages',
full_name='org.dash.platform.dapi.v0.ChainLockSignatureMessages',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='messages', full_name='org.dash.platform.dapi.v0.ChainLockSignatureMessages.messages', index=0,
number=1, type=12, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=989,
serialized_end=1035,
)
_GETESTIMATEDTRANSACTIONFEEREQUEST = _descriptor.Descriptor(
name='GetEstimatedTransactionFeeRequest',
full_name='org.dash.platform.dapi.v0.GetEstimatedTransactionFeeRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='blocks', full_name='org.dash.platform.dapi.v0.GetEstimatedTransactionFeeRequest.blocks', index=0,
number=1, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1037,
serialized_end=1088,
)
_GETESTIMATEDTRANSACTIONFEERESPONSE = _descriptor.Descriptor(
name='GetEstimatedTransactionFeeResponse',
full_name='org.dash.platform.dapi.v0.GetEstimatedTransactionFeeResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='fee', full_name='org.dash.platform.dapi.v0.GetEstimatedTransactionFeeResponse.fee', index=0,
number=1, type=1, cpp_type=5, label=1,
has_default_value=False, default_value=float(0),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1090,
serialized_end=1139,
)
_GETBLOCKREQUEST.oneofs_by_name['block'].fields.append(
_GETBLOCKREQUEST.fields_by_name['height'])
_GETBLOCKREQUEST.fields_by_name['height'].containing_oneof = _GETBLOCKREQUEST.oneofs_by_name['block']
_GETBLOCKREQUEST.oneofs_by_name['block'].fields.append(
_GETBLOCKREQUEST.fields_by_name['hash'])
_GETBLOCKREQUEST.fields_by_name['hash'].containing_oneof = _GETBLOCKREQUEST.oneofs_by_name['block']
_BLOCKHEADERSWITHCHAINLOCKSREQUEST.oneofs_by_name['from_block'].fields.append(
_BLOCKHEADERSWITHCHAINLOCKSREQUEST.fields_by_name['from_block_hash'])
_BLOCKHEADERSWITHCHAINLOCKSREQUEST.fields_by_name['from_block_hash'].containing_oneof = _BLOCKHEADERSWITHCHAINLOCKSREQUEST.oneofs_by_name['from_block']
_BLOCKHEADERSWITHCHAINLOCKSREQUEST.oneofs_by_name['from_block'].fields.append(
_BLOCKHEADERSWITHCHAINLOCKSREQUEST.fields_by_name['from_block_height'])
_BLOCKHEADERSWITHCHAINLOCKSREQUEST.fields_by_name['from_block_height'].containing_oneof = _BLOCKHEADERSWITHCHAINLOCKSREQUEST.oneofs_by_name['from_block']
_BLOCKHEADERSWITHCHAINLOCKSRESPONSE.fields_by_name['block_headers'].message_type = _BLOCKHEADERS
_BLOCKHEADERSWITHCHAINLOCKSRESPONSE.fields_by_name['chain_lock_signature_messages'].message_type = _CHAINLOCKSIGNATUREMESSAGES
_BLOCKHEADERSWITHCHAINLOCKSRESPONSE.oneofs_by_name['responses'].fields.append(
_BLOCKHEADERSWITHCHAINLOCKSRESPONSE.fields_by_name['block_headers'])
_BLOCKHEADERSWITHCHAINLOCKSRESPONSE.fields_by_name['block_headers'].containing_oneof = _BLOCKHEADERSWITHCHAINLOCKSRESPONSE.oneofs_by_name['responses']
_BLOCKHEADERSWITHCHAINLOCKSRESPONSE.oneofs_by_name['responses'].fields.append(
_BLOCKHEADERSWITHCHAINLOCKSRESPONSE.fields_by_name['chain_lock_signature_messages'])
_BLOCKHEADERSWITHCHAINLOCKSRESPONSE.fields_by_name['chain_lock_signature_messages'].containing_oneof = _BLOCKHEADERSWITHCHAINLOCKSRESPONSE.oneofs_by_name['responses']
DESCRIPTOR.message_types_by_name['GetStatusRequest'] = _GETSTATUSREQUEST
DESCRIPTOR.message_types_by_name['GetStatusResponse'] = _GETSTATUSRESPONSE
DESCRIPTOR.message_types_by_name['GetBlockRequest'] = _GETBLOCKREQUEST
DESCRIPTOR.message_types_by_name['GetBlockResponse'] = _GETBLOCKRESPONSE
DESCRIPTOR.message_types_by_name['SendTransactionRequest'] = _SENDTRANSACTIONREQUEST
DESCRIPTOR.message_types_by_name['SendTransactionResponse'] = _SENDTRANSACTIONRESPONSE
DESCRIPTOR.message_types_by_name['GetTransactionRequest'] = _GETTRANSACTIONREQUEST
DESCRIPTOR.message_types_by_name['GetTransactionResponse'] = _GETTRANSACTIONRESPONSE
DESCRIPTOR.message_types_by_name['BlockHeadersWithChainLocksRequest'] = _BLOCKHEADERSWITHCHAINLOCKSREQUEST
DESCRIPTOR.message_types_by_name['BlockHeadersWithChainLocksResponse'] = _BLOCKHEADERSWITHCHAINLOCKSRESPONSE
DESCRIPTOR.message_types_by_name['BlockHeaders'] = _BLOCKHEADERS
DESCRIPTOR.message_types_by_name['ChainLockSignatureMessages'] = _CHAINLOCKSIGNATUREMESSAGES
DESCRIPTOR.message_types_by_name['GetEstimatedTransactionFeeRequest'] = _GETESTIMATEDTRANSACTIONFEEREQUEST
DESCRIPTOR.message_types_by_name['GetEstimatedTransactionFeeResponse'] = _GETESTIMATEDTRANSACTIONFEERESPONSE
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
GetStatusRequest = _reflection.GeneratedProtocolMessageType('GetStatusRequest', (_message.Message,), dict(
DESCRIPTOR = _GETSTATUSREQUEST,
__module__ = 'core_pb2'
# @@protoc_insertion_point(class_scope:org.dash.platform.dapi.v0.GetStatusRequest)
))
_sym_db.RegisterMessage(GetStatusRequest)
GetStatusResponse = _reflection.GeneratedProtocolMessageType('GetStatusResponse', (_message.Message,), dict(
DESCRIPTOR = _GETSTATUSRESPONSE,
__module__ = 'core_pb2'
# @@protoc_insertion_point(class_scope:org.dash.platform.dapi.v0.GetStatusResponse)
))
_sym_db.RegisterMessage(GetStatusResponse)
GetBlockRequest = _reflection.GeneratedProtocolMessageType('GetBlockRequest', (_message.Message,), dict(
DESCRIPTOR = _GETBLOCKREQUEST,
__module__ = 'core_pb2'
# @@protoc_insertion_point(class_scope:org.dash.platform.dapi.v0.GetBlockRequest)
))
_sym_db.RegisterMessage(GetBlockRequest)
GetBlockResponse = _reflection.GeneratedProtocolMessageType('GetBlockResponse', (_message.Message,), dict(
DESCRIPTOR = _GETBLOCKRESPONSE,
__module__ = 'core_pb2'
# @@protoc_insertion_point(class_scope:org.dash.platform.dapi.v0.GetBlockResponse)
))
_sym_db.RegisterMessage(GetBlockResponse)
SendTransactionRequest = _reflection.GeneratedProtocolMessageType('SendTransactionRequest', (_message.Message,), dict(
DESCRIPTOR = _SENDTRANSACTIONREQUEST,
__module__ = 'core_pb2'
# @@protoc_insertion_point(class_scope:org.dash.platform.dapi.v0.SendTransactionRequest)
))
_sym_db.RegisterMessage(SendTransactionRequest)
SendTransactionResponse = _reflection.GeneratedProtocolMessageType('SendTransactionResponse', (_message.Message,), dict(
DESCRIPTOR = _SENDTRANSACTIONRESPONSE,
__module__ = 'core_pb2'
# @@protoc_insertion_point(class_scope:org.dash.platform.dapi.v0.SendTransactionResponse)
))
_sym_db.RegisterMessage(SendTransactionResponse)
GetTransactionRequest = _reflection.GeneratedProtocolMessageType('GetTransactionRequest', (_message.Message,), dict(
DESCRIPTOR = _GETTRANSACTIONREQUEST,
__module__ = 'core_pb2'
# @@protoc_insertion_point(class_scope:org.dash.platform.dapi.v0.GetTransactionRequest)
))
_sym_db.RegisterMessage(GetTransactionRequest)
GetTransactionResponse = _reflection.GeneratedProtocolMessageType('GetTransactionResponse', (_message.Message,), dict(
DESCRIPTOR = _GETTRANSACTIONRESPONSE,
__module__ = 'core_pb2'
# @@protoc_insertion_point(class_scope:org.dash.platform.dapi.v0.GetTransactionResponse)
))
_sym_db.RegisterMessage(GetTransactionResponse)
BlockHeadersWithChainLocksRequest = _reflection.GeneratedProtocolMessageType('BlockHeadersWithChainLocksRequest', (_message.Message,), dict(
DESCRIPTOR = _BLOCKHEADERSWITHCHAINLOCKSREQUEST,
__module__ = 'core_pb2'
# @@protoc_insertion_point(class_scope:org.dash.platform.dapi.v0.BlockHeadersWithChainLocksRequest)
))
_sym_db.RegisterMessage(BlockHeadersWithChainLocksRequest)
BlockHeadersWithChainLocksResponse = _reflection.GeneratedProtocolMessageType('BlockHeadersWithChainLocksResponse', (_message.Message,), dict(
DESCRIPTOR = _BLOCKHEADERSWITHCHAINLOCKSRESPONSE,
__module__ = 'core_pb2'
# @@protoc_insertion_point(class_scope:org.dash.platform.dapi.v0.BlockHeadersWithChainLocksResponse)
))
_sym_db.RegisterMessage(BlockHeadersWithChainLocksResponse)
BlockHeaders = _reflection.GeneratedProtocolMessageType('BlockHeaders', (_message.Message,), dict(
DESCRIPTOR = _BLOCKHEADERS,
__module__ = 'core_pb2'
# @@protoc_insertion_point(class_scope:org.dash.platform.dapi.v0.BlockHeaders)
))
_sym_db.RegisterMessage(BlockHeaders)
ChainLockSignatureMessages = _reflection.GeneratedProtocolMessageType('ChainLockSignatureMessages', (_message.Message,), dict(
DESCRIPTOR = _CHAINLOCKSIGNATUREMESSAGES,
__module__ = 'core_pb2'
# @@protoc_insertion_point(class_scope:org.dash.platform.dapi.v0.ChainLockSignatureMessages)
))
_sym_db.RegisterMessage(ChainLockSignatureMessages)
GetEstimatedTransactionFeeRequest = _reflection.GeneratedProtocolMessageType('GetEstimatedTransactionFeeRequest', (_message.Message,), dict(
DESCRIPTOR = _GETESTIMATEDTRANSACTIONFEEREQUEST,
__module__ = 'core_pb2'
# @@protoc_insertion_point(class_scope:org.dash.platform.dapi.v0.GetEstimatedTransactionFeeRequest)
))
_sym_db.RegisterMessage(GetEstimatedTransactionFeeRequest)
GetEstimatedTransactionFeeResponse = _reflection.GeneratedProtocolMessageType('GetEstimatedTransactionFeeResponse', (_message.Message,), dict(
DESCRIPTOR = _GETESTIMATEDTRANSACTIONFEERESPONSE,
__module__ = 'core_pb2'
# @@protoc_insertion_point(class_scope:org.dash.platform.dapi.v0.GetEstimatedTransactionFeeResponse)
))
_sym_db.RegisterMessage(GetEstimatedTransactionFeeResponse)
_CORE = _descriptor.ServiceDescriptor(
name='Core',
full_name='org.dash.platform.dapi.v0.Core',
file=DESCRIPTOR,
index=0,
options=None,
serialized_start=1142,
serialized_end=1919,
methods=[
_descriptor.MethodDescriptor(
name='getStatus',
full_name='org.dash.platform.dapi.v0.Core.getStatus',
index=0,
containing_service=None,
input_type=_GETSTATUSREQUEST,
output_type=_GETSTATUSRESPONSE,
options=None,
),
_descriptor.MethodDescriptor(
name='getBlock',
full_name='org.dash.platform.dapi.v0.Core.getBlock',
index=1,
containing_service=None,
input_type=_GETBLOCKREQUEST,
output_type=_GETBLOCKRESPONSE,
options=None,
),
_descriptor.MethodDescriptor(
name='sendTransaction',
full_name='org.dash.platform.dapi.v0.Core.sendTransaction',
index=2,
containing_service=None,
input_type=_SENDTRANSACTIONREQUEST,
output_type=_SENDTRANSACTIONRESPONSE,
options=None,
),
_descriptor.MethodDescriptor(
name='getTransaction',
full_name='org.dash.platform.dapi.v0.Core.getTransaction',
index=3,
containing_service=None,
input_type=_GETTRANSACTIONREQUEST,
output_type=_GETTRANSACTIONRESPONSE,
options=None,
),
_descriptor.MethodDescriptor(
name='getEstimatedTransactionFee',
full_name='org.dash.platform.dapi.v0.Core.getEstimatedTransactionFee',
index=4,
containing_service=None,
input_type=_GETESTIMATEDTRANSACTIONFEEREQUEST,
output_type=_GETESTIMATEDTRANSACTIONFEERESPONSE,
options=None,
),
_descriptor.MethodDescriptor(
name='subscribeToBlockHeadersWithChainLocks',
full_name='org.dash.platform.dapi.v0.Core.subscribeToBlockHeadersWithChainLocks',
index=5,
containing_service=None,
input_type=_BLOCKHEADERSWITHCHAINLOCKSREQUEST,
output_type=_BLOCKHEADERSWITHCHAINLOCKSRESPONSE,
options=None,
),
])
_sym_db.RegisterServiceDescriptor(_CORE)
DESCRIPTOR.services_by_name['Core'] = _CORE
try:
# THESE ELEMENTS WILL BE DEPRECATED.
# Please use the generated *_pb2_grpc.py files instead.
import grpc
from grpc.beta import implementations as beta_implementations
from grpc.beta import interfaces as beta_interfaces
from grpc.framework.common import cardinality
from grpc.framework.interfaces.face import utilities as face_utilities
class CoreStub(object):
# missing associated documentation comment in .proto file
pass
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.getStatus = channel.unary_unary(
'/org.dash.platform.dapi.v0.Core/getStatus',
request_serializer=GetStatusRequest.SerializeToString,
response_deserializer=GetStatusResponse.FromString,
)
self.getBlock = channel.unary_unary(
'/org.dash.platform.dapi.v0.Core/getBlock',
request_serializer=GetBlockRequest.SerializeToString,
response_deserializer=GetBlockResponse.FromString,
)
self.sendTransaction = channel.unary_unary(
'/org.dash.platform.dapi.v0.Core/sendTransaction',
request_serializer=SendTransactionRequest.SerializeToString,
response_deserializer=SendTransactionResponse.FromString,
)
self.getTransaction = channel.unary_unary(
'/org.dash.platform.dapi.v0.Core/getTransaction',
request_serializer=GetTransactionRequest.SerializeToString,
response_deserializer=GetTransactionResponse.FromString,
)
self.getEstimatedTransactionFee = channel.unary_unary(
'/org.dash.platform.dapi.v0.Core/getEstimatedTransactionFee',
request_serializer=GetEstimatedTransactionFeeRequest.SerializeToString,
response_deserializer=GetEstimatedTransactionFeeResponse.FromString,
)
self.subscribeToBlockHeadersWithChainLocks = channel.unary_stream(
'/org.dash.platform.dapi.v0.Core/subscribeToBlockHeadersWithChainLocks',
request_serializer=BlockHeadersWithChainLocksRequest.SerializeToString,
response_deserializer=BlockHeadersWithChainLocksResponse.FromString,
)
class CoreServicer(object):
# missing associated documentation comment in .proto file
pass
def getStatus(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def getBlock(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def sendTransaction(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def getTransaction(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def getEstimatedTransactionFee(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def subscribeToBlockHeadersWithChainLocks(self, request, context):
# missing associated documentation comment in .proto file
pass
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_CoreServicer_to_server(servicer, server):
rpc_method_handlers = {
'getStatus': grpc.unary_unary_rpc_method_handler(
servicer.getStatus,
request_deserializer=GetStatusRequest.FromString,
response_serializer=GetStatusResponse.SerializeToString,
),
'getBlock': grpc.unary_unary_rpc_method_handler(
servicer.getBlock,
request_deserializer=GetBlockRequest.FromString,
response_serializer=GetBlockResponse.SerializeToString,
),
'sendTransaction': grpc.unary_unary_rpc_method_handler(
servicer.sendTransaction,
request_deserializer=SendTransactionRequest.FromString,
response_serializer=SendTransactionResponse.SerializeToString,
),
'getTransaction': grpc.unary_unary_rpc_method_handler(
servicer.getTransaction,
request_deserializer=GetTransactionRequest.FromString,
response_serializer=GetTransactionResponse.SerializeToString,
),
'getEstimatedTransactionFee': grpc.unary_unary_rpc_method_handler(
servicer.getEstimatedTransactionFee,
request_deserializer=GetEstimatedTransactionFeeRequest.FromString,
response_serializer=GetEstimatedTransactionFeeResponse.SerializeToString,
),
'subscribeToBlockHeadersWithChainLocks': grpc.unary_stream_rpc_method_handler(
servicer.subscribeToBlockHeadersWithChainLocks,
request_deserializer=BlockHeadersWithChainLocksRequest.FromString,
response_serializer=BlockHeadersWithChainLocksResponse.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'org.dash.platform.dapi.v0.Core', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
class BetaCoreServicer(object):
"""The Beta API is deprecated for 0.15.0 and later.
It is recommended to use the GA API (classes and functions in this
file not marked beta) for all further purposes. This class was generated
only to ease transition from grpcio<0.15.0 to grpcio>=0.15.0."""
# missing associated documentation comment in .proto file
pass
def getStatus(self, request, context):
# missing associated documentation comment in .proto file
pass
context.code(beta_interfaces.StatusCode.UNIMPLEMENTED)
def getBlock(self, request, context):
# missing associated documentation comment in .proto file
pass
context.code(beta_interfaces.StatusCode.UNIMPLEMENTED)
def sendTransaction(self, request, context):
# missing associated documentation comment in .proto file
pass
context.code(beta_interfaces.StatusCode.UNIMPLEMENTED)
def getTransaction(self, request, context):
# missing associated documentation comment in .proto file
pass
context.code(beta_interfaces.StatusCode.UNIMPLEMENTED)
def getEstimatedTransactionFee(self, request, context):
# missing associated documentation comment in .proto file
pass
context.code(beta_interfaces.StatusCode.UNIMPLEMENTED)
def subscribeToBlockHeadersWithChainLocks(self, request, context):
# missing associated documentation comment in .proto file
pass
context.code(beta_interfaces.StatusCode.UNIMPLEMENTED)
class BetaCoreStub(object):
"""The Beta API is deprecated for 0.15.0 and later.
It is recommended to use the GA API (classes and functions in this
file not marked beta) for all further purposes. This class was generated
only to ease transition from grpcio<0.15.0 to grpcio>=0.15.0."""
# missing associated documentation comment in .proto file
pass
def getStatus(self, request, timeout, metadata=None, with_call=False, protocol_options=None):
# missing associated documentation comment in .proto file
pass
raise NotImplementedError()
getStatus.future = None
def getBlock(self, request, timeout, metadata=None, with_call=False, protocol_options=None):
# missing associated documentation comment in .proto file
pass
raise NotImplementedError()
getBlock.future = None
def sendTransaction(self, request, timeout, metadata=None, with_call=False, protocol_options=None):
# missing associated documentation comment in .proto file
pass
raise NotImplementedError()
sendTransaction.future = None
def getTransaction(self, request, timeout, metadata=None, with_call=False, protocol_options=None):
# missing associated documentation comment in .proto file
pass
raise NotImplementedError()
getTransaction.future = None
def getEstimatedTransactionFee(self, request, timeout, metadata=None, with_call=False, protocol_options=None):
# missing associated documentation comment in .proto file
pass
raise NotImplementedError()
getEstimatedTransactionFee.future = None
def subscribeToBlockHeadersWithChainLocks(self, request, timeout, metadata=None, with_call=False, protocol_options=None):
# missing associated documentation comment in .proto file
pass
raise NotImplementedError()
def beta_create_Core_server(servicer, pool=None, pool_size=None, default_timeout=None, maximum_timeout=None):
"""The Beta API is deprecated for 0.15.0 and later.
It is recommended to use the GA API (classes and functions in this
file not marked beta) for all further purposes. This function was
generated only to ease transition from grpcio<0.15.0 to grpcio>=0.15.0"""
request_deserializers = {
('org.dash.platform.dapi.v0.Core', 'getBlock'): GetBlockRequest.FromString,
('org.dash.platform.dapi.v0.Core', 'getEstimatedTransactionFee'): GetEstimatedTransactionFeeRequest.FromString,
('org.dash.platform.dapi.v0.Core', 'getStatus'): GetStatusRequest.FromString,
('org.dash.platform.dapi.v0.Core', 'getTransaction'): GetTransactionRequest.FromString,
('org.dash.platform.dapi.v0.Core', 'sendTransaction'): SendTransactionRequest.FromString,
('org.dash.platform.dapi.v0.Core', 'subscribeToBlockHeadersWithChainLocks'): BlockHeadersWithChainLocksRequest.FromString,
}
response_serializers = {
('org.dash.platform.dapi.v0.Core', 'getBlock'): GetBlockResponse.SerializeToString,
('org.dash.platform.dapi.v0.Core', 'getEstimatedTransactionFee'): GetEstimatedTransactionFeeResponse.SerializeToString,
('org.dash.platform.dapi.v0.Core', 'getStatus'): GetStatusResponse.SerializeToString,
('org.dash.platform.dapi.v0.Core', 'getTransaction'): GetTransactionResponse.SerializeToString,
('org.dash.platform.dapi.v0.Core', 'sendTransaction'): SendTransactionResponse.SerializeToString,
('org.dash.platform.dapi.v0.Core', 'subscribeToBlockHeadersWithChainLocks'): BlockHeadersWithChainLocksResponse.SerializeToString,
}
method_implementations = {
('org.dash.platform.dapi.v0.Core', 'getBlock'): face_utilities.unary_unary_inline(servicer.getBlock),
('org.dash.platform.dapi.v0.Core', 'getEstimatedTransactionFee'): face_utilities.unary_unary_inline(servicer.getEstimatedTransactionFee),
('org.dash.platform.dapi.v0.Core', 'getStatus'): face_utilities.unary_unary_inline(servicer.getStatus),
('org.dash.platform.dapi.v0.Core', 'getTransaction'): face_utilities.unary_unary_inline(servicer.getTransaction),
('org.dash.platform.dapi.v0.Core', 'sendTransaction'): face_utilities.unary_unary_inline(servicer.sendTransaction),
('org.dash.platform.dapi.v0.Core', 'subscribeToBlockHeadersWithChainLocks'): face_utilities.unary_stream_inline(servicer.subscribeToBlockHeadersWithChainLocks),
}
server_options = beta_implementations.server_options(request_deserializers=request_deserializers, response_serializers=response_serializers, thread_pool=pool, thread_pool_size=pool_size, default_timeout=default_timeout, maximum_timeout=maximum_timeout)
return beta_implementations.server(method_implementations, options=server_options)
def beta_create_Core_stub(channel, host=None, metadata_transformer=None, pool=None, pool_size=None):
"""The Beta API is deprecated for 0.15.0 and later.
It is recommended to use the GA API (classes and functions in this
file not marked beta) for all further purposes. This function was
generated only to ease transition from grpcio<0.15.0 to grpcio>=0.15.0"""
request_serializers = {
('org.dash.platform.dapi.v0.Core', 'getBlock'): GetBlockRequest.SerializeToString,
('org.dash.platform.dapi.v0.Core', 'getEstimatedTransactionFee'): GetEstimatedTransactionFeeRequest.SerializeToString,
('org.dash.platform.dapi.v0.Core', 'getStatus'): GetStatusRequest.SerializeToString,
('org.dash.platform.dapi.v0.Core', 'getTransaction'): GetTransactionRequest.SerializeToString,
('org.dash.platform.dapi.v0.Core', 'sendTransaction'): SendTransactionRequest.SerializeToString,
('org.dash.platform.dapi.v0.Core', 'subscribeToBlockHeadersWithChainLocks'): BlockHeadersWithChainLocksRequest.SerializeToString,
}
response_deserializers = {
('org.dash.platform.dapi.v0.Core', 'getBlock'): GetBlockResponse.FromString,
('org.dash.platform.dapi.v0.Core', 'getEstimatedTransactionFee'): GetEstimatedTransactionFeeResponse.FromString,
('org.dash.platform.dapi.v0.Core', 'getStatus'): GetStatusResponse.FromString,
('org.dash.platform.dapi.v0.Core', 'getTransaction'): GetTransactionResponse.FromString,
('org.dash.platform.dapi.v0.Core', 'sendTransaction'): SendTransactionResponse.FromString,
('org.dash.platform.dapi.v0.Core', 'subscribeToBlockHeadersWithChainLocks'): BlockHeadersWithChainLocksResponse.FromString,
}
cardinalities = {
'getBlock': cardinality.Cardinality.UNARY_UNARY,
'getEstimatedTransactionFee': cardinality.Cardinality.UNARY_UNARY,
'getStatus': cardinality.Cardinality.UNARY_UNARY,
'getTransaction': cardinality.Cardinality.UNARY_UNARY,
'sendTransaction': cardinality.Cardinality.UNARY_UNARY,
'subscribeToBlockHeadersWithChainLocks': cardinality.Cardinality.UNARY_STREAM,
}
stub_options = beta_implementations.stub_options(host=host, metadata_transformer=metadata_transformer, request_serializers=request_serializers, response_deserializers=response_deserializers, thread_pool=pool, thread_pool_size=pool_size)
return beta_implementations.dynamic_stub(channel, 'org.dash.platform.dapi.v0.Core', cardinalities, options=stub_options)
except ImportError:
pass
# @@protoc_insertion_point(module_scope)
| 42.885633 | 2,847 | 0.760518 | 4,972 | 45,373 | 6.711585 | 0.079646 | 0.043512 | 0.058016 | 0.065268 | 0.660144 | 0.625921 | 0.585975 | 0.553671 | 0.414684 | 0.400479 | 0 | 0.024895 | 0.128865 | 45,373 | 1,057 | 2,848 | 42.926206 | 0.819359 | 0.082935 | 0 | 0.573932 | 1 | 0.027382 | 0.210507 | 0.169905 | 0 | 0 | 0 | 0 | 0 | 1 | 0.024096 | false | 0.027382 | 0.013143 | 0 | 0.043812 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f301917c422d9318495feced737c153caa8bd9a9 | 290 | py | Python | baekjoon/not-classified/10844/10844.py | honux77/algorithm | 2ed8cef1fbee7ad96d8f2ae583666d52bd8892ee | [
"MIT"
] | 2 | 2019-02-08T01:23:07.000Z | 2020-11-19T12:23:52.000Z | baekjoon/not-classified/10844/10844.py | honux77/algorithm | 2ed8cef1fbee7ad96d8f2ae583666d52bd8892ee | [
"MIT"
] | null | null | null | baekjoon/not-classified/10844/10844.py | honux77/algorithm | 2ed8cef1fbee7ad96d8f2ae583666d52bd8892ee | [
"MIT"
] | null | null | null | n = int(input())
s = [[0] * 10 for _ in range(n + 1)]
s[1] = [0] + [1] * 9
mod = 1000 ** 3
for i in range(2, n + 1):
for j in range(0, 9 + 1):
if j >= 1:
s[i][j] += s[i - 1][j - 1]
if j <= 8:
s[i][j] += s[i - 1][j + 1]
print(sum(s[n]) % mod) | 19.333333 | 38 | 0.358621 | 60 | 290 | 1.716667 | 0.35 | 0.07767 | 0.07767 | 0.07767 | 0.15534 | 0.15534 | 0.15534 | 0.15534 | 0 | 0 | 0 | 0.135593 | 0.389655 | 290 | 15 | 39 | 19.333333 | 0.446328 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.090909 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f3052e2208b42e9e168f9e6bcc11e27d4f1b41d3 | 9,922 | py | Python | mc/opcodes.py | iximeow/binja-m16c | debf368e5df90a96d6c8b0bc128626a9d6834bb4 | [
"0BSD"
] | 12 | 2020-01-15T00:51:06.000Z | 2021-10-02T12:45:50.000Z | mc/opcodes.py | iximeow/binja-m16c | debf368e5df90a96d6c8b0bc128626a9d6834bb4 | [
"0BSD"
] | 2 | 2020-02-03T08:26:26.000Z | 2020-07-01T19:51:44.000Z | mc/opcodes.py | iximeow/binja-m16c | debf368e5df90a96d6c8b0bc128626a9d6834bb4 | [
"0BSD"
] | 4 | 2020-02-03T07:51:12.000Z | 2021-02-14T19:13:07.000Z | import re
from . import tables
from .instr import Instruction
from .instr.nop import *
from .instr.alu import *
from .instr.bcd import *
from .instr.bit import *
from .instr.flag import *
from .instr.mov import *
from .instr.smov import *
from .instr.ld_st import *
from .instr.stack import *
from .instr.jmp import *
from .instr.call import *
from .instr.ctx import *
from .instr.trap import *
enumerations = {
'R': tables.rx_ax,
'I': tables.dsp8_dsp16_abs16,
'6': tables.dsp8_abs16,
'7': tables.r0x_r0y_dsp8_abs16,
'8': tables.r0x_dsp8_abs16,
'A': tables.reg16_dsp8_dsp16_dsp20_abs16,
'E': tables.reg8l_dsp8_dsp16_abs16,
'N': tables.reg8_dsp8_dsp16_abs16,
'C': tables.creg,
'J': tables.cnd_j3,
'K': tables.cnd_j4,
'M': tables.cnd_bm4,
}
encodings = {
'0111_011z_1111_dddd': AbsReg,
'0111_011z_0110_dddd': AdcImm,
'1011_000z_ssss_dddd': AdcReg,
'0111_011z_1110_dddd': Adcf,
'0111_011z_0100_dddd': AddImm,
'1100_100z_iiii_dddd': AddImm4,
'1000_0DDD;8': AddImm8,
'1010_000z_ssss_dddd': AddReg,
'0010_0DSS;7': AddReg8,
'0111_110z_1110_1011': AddImmSP,
'0111_1101_1011_iiii': AddImm4SP,
'1111_100z_iiii_dddd': Adjnz,
'0111_011z_0010_dddd': AndImm,
'1001_0DDD;8': AndImm8,
'1001_000z_ssss_dddd': AndReg,
'0001_0DSS;7': AndReg8,
'0111_1110_0100_ssss': Band,
'0111_1110_1000_dddd': Bclr,
'0100_0bbb': BclrSB,
'0111_1110_0010_dddd': Bmcnd,
'0111_1101_1101_CCCC;M': BmcndC,
'0111_1110_0101_ssss': Bnand,
'0111_1110_0111_ssss': Bnor,
'0111_1110_1010_dddd': Bnot,
'0101_0bbb': BnotSB,
'0111_1110_0011_ssss': Bntst,
'0111_1110_1101_ssss': Bnxor,
'0111_1110_0110_ssss': Bor,
'0111_1110_1001_dddd': Bset,
'0100_1bbb': BsetSB,
'0111_1110_1011_ssss': Btst,
'0101_1bbb': BtstSB,
'0111_1110_0000_dddd': Btstc,
'0111_1110_0001_dddd': Btsts,
'0111_1110_1100_ssss': Bxor,
'0000_0000': Brk,
'0111_011z_1000_dddd': CmpImm,
'1101_000z_iiii_dddd': CmpImm4,
'1110_0DDD;8': CmpImm8,
'1100_000z_ssss_dddd': CmpReg,
'0011_1DSS;7': CmpReg8,
'0111_1100_1110_1110': DadcImm8,
'0111_1101_1110_1110': DadcImm16,
'0111_1100_1110_0110': DadcReg8,
'0111_1101_1110_0110': DadcReg16,
'0111_1100_1110_1100': DaddImm8,
'0111_1101_1110_1100': DaddImm16,
'0111_1100_1110_0100': DaddReg8,
'0111_1101_1110_0100': DaddReg16,
'1010_1DDD;8': Dec,
'1111_d010': DecAdr,
'0111_110z_1110_0001': DivImm,
'0111_011z_1101_ssss': DivReg,
'0111_110z_1110_0000': DivuImm,
'0111_011z_1100_ssss': DivuReg,
'0111_110z_1110_0011': DivxImm,
'0111_011z_1001_ssss': DivxReg,
'0111_1100_1110_1111': DsbbImm8,
'0111_1101_1110_1111': DsbbImm16,
'0111_1100_1110_0111': DsbbReg8,
'0111_1101_1110_0111': DsbbReg16,
'0111_1100_1110_1101': DsubImm8,
'0111_1101_1110_1101': DsubImm16,
'0111_1100_1110_0101': DsubReg8,
'0111_1101_1110_0101': DsubReg16,
'0111_1100_1111_0010': Enter,
'0111_1101_1111_0010': Exitd,
'0111_1100_0110_DDDD;E': Exts,
'0111_1100_1111_0011': ExtsR0,
'1110_1011_0fff_0101': Fclr,
'1110_1011_0fff_0100': Fset,
'1010_0DDD;8': Inc,
'1011_d010': IncAdr,
'1110_1011_11ii_iiii': Int,
'1111_0110': Into,
'0110_1CCC;J': Jcnd1,
'0111_1101_1100_CCCC;K': Jcnd2,
'0110_0iii': Jmp3,
'1111_1110': Jmp8,
'1111_0100': Jmp16,
'1111_1100': JmpAbs,
'0111_1101_0010_ssss': Jmpi,
'0111_1101_0000_SSSS;A': JmpiAbs,
'1110_1110': Jmps,
'1111_0101': Jsr16,
'1111_1101': JsrAbs,
'0111_1101_0011_ssss': Jsri,
'0111_1101_0001_SSSS;A': JsriAbs,
'1110_1111': Jsrs,
'1110_1011_0DDD;C_0000': LdcImm,
'0111_1010_1DDD;C_ssss': LdcReg,
'0111_1100_1111_0000': Ldctx,
'0111_010z_1000_dddd': Lde,
'0111_010z_1001_dddd': LdeA0,
'0111_010z_1010_dddd': LdeA1A0,
'0111_1101_1010_0iii': Ldipl,
'0111_010z_1100_dddd': MovImmReg,
'1101_100z_iiii_dddd': MovImm4Reg,
'1100_0DDD;8': MovImm8Reg,
'1110_d010': MovImm8Adr,
'1010_d010': MovImm16Adr,
'1011_0DDD;8': MovZero8Reg,
'0111_001z_ssss_dddd': MovRegReg,
'0011_0dss': MovRegAdr,
'0000_0sDD;6': MovReg8Reg,
'0000_1DSS;7': MovRegReg8,
'0111_010z_1011_dddd': MovIndSPReg,
'0111_010z_0011_ssss': MovRegIndSP,
'1110_1011_0DDD;R_SSSS;I': Mova,
'0111_1100_10rr_DDDD;N': MovdirR0LReg,
'0111_1100_00rr_SSSS;N': MovdirRegR0L,
'0111_110z_0101_dddd': MulImm,
'0111_100z_ssss_dddd': MulReg,
'0111_110z_0100_dddd': MuluImm,
'0111_000z_ssss_dddd': MuluReg,
'0111_010z_0101_dddd': NegReg,
'0000_0100': Nop,
'0111_010z_0111_dddd': NotReg,
'1011_1DDD;8': NotReg8,
'0111_011z_0011_dddd': OrImm,
'1001_1DDD;8': OrImm8,
'1001_100z_ssss_dddd': OrReg,
'0001_1DSS;7': OrReg8,
'0111_010z_1101_dddd': Pop,
'1001_d010': PopReg8,
'1101_d010': PopAdr,
'1110_1011_0DDD;C_0011': Popc,
'1110_1101': Popm,
'0111_110z_1110_0010': PushImm,
'0111_010z_0100_ssss': Push,
'1000_s010': PushReg8,
'1100_s010': PushAdr,
'0111_1101_1001_SSSS;I': Pusha,
'1110_1011_0SSS;C_0010': Pushc,
'1110_1100': Pushm,
'1111_1011': Reit,
'0111_110z_1111_0001': Rmpa,
'1110_000z_iiii_dddd': RotImm4,
'0111_010z_0110_dddd': RotR1H,
'0111_011z_1010_dddd': Rolc,
'0111_011z_1011_dddd': Rorc,
'1111_0011': Rts,
'0111_011z_0111_dddd': SbbImm,
'1011_100z_ssss_dddd': SbbReg,
'1111_000z_iiii_dddd': ShaImm4,
'0111_010z_1111_dddd': ShaR1H,
'1110_1011_101d_iiii': Sha32Imm4,
'1110_1011_001d_0001': Sha32R1H,
'1110_100z_iiii_dddd': ShlImm4,
'0111_010z_1110_dddd': ShlR1H,
'1110_1011_100d_iiii': Shl32Imm4,
'1110_1011_000d_0001': Shl32R1H,
'0111_110z_1110_1001': Smovb,
'0111_110z_1110_1000': Smovf,
'0111_110z_1110_1010': Sstr,
'0111_1011_1SSS;C_dddd': StcReg,
'0111_1100_1100_DDDD;A': StcPc,
'0111_1101_1111_0000': Stctx,
'0111_010z_0000_ssss': Ste,
'0111_010z_0001_ssss': SteA0,
'0111_010z_0010_ssss': SteA1A0,
'1101_0DDD;8': Stnz,
'1100_1DDD;8': Stz,
'1101_1DDD;8': Stzx,
'0111_011z_0101_dddd': SubImm,
'1000_1DDD;8': SubImm8,
'1010_100z_ssss_dddd': SubReg,
'0010_1DSS;7': SubReg8,
'0111_011z_0000_dddd': TstImm,
'1000_000z_ssss_dddd': TstReg,
'1111_1111': Und,
'0111_1101_1111_0011': Wait,
'0111_101z_00ss_dddd': Xchg,
'0111_011z_0001_dddd': XorImm,
'1000_100z_ssss_dddd': XorReg,
}
def generate_tables():
for encoding, instr in encodings.items():
def expand_encoding(table, parts):
part, *parts = parts
if ';' in part:
part, enum = part.split(';', 2)
else:
enum = ''
assert len(part) == 4 and len(enum) <= 1
chunks = []
try:
chunks.append(int(part, 2))
except ValueError:
wildcard_part = re.sub(r'[A-Z]', '0', part)
instr_code = int(re.sub(r'[^01]', '0', wildcard_part), 2)
instr_mask = int(re.sub(r'[^01]', '0', wildcard_part.replace('0', '1')), 2)
operand_mask = int(re.sub(r'[^01]', '1', wildcard_part.replace('1', '0')), 2)
operand_code = 0
while True:
chunks.append(instr_code | operand_code)
if operand_code == operand_mask:
break
# The following line cleverly uses carries to make a counter only from the bits
# that are set in `operand_mask`. To understand it, consider that `instr_mask`
# is the inverse of `operand_mask`, and adding 1 to a 011...1 chunk changes it
# into a 100...0 chunk.
operand_code = ((operand_code | instr_mask) + 1) & operand_mask
if enum:
shift = 4 - re.search(r'[A-Z]+', part).end()
chunks, chunk_templates = [], chunks
for template in chunk_templates:
for legal_bits in enumerations[enum]:
chunks.append(template | (legal_bits << shift))
for chunk in chunks:
if parts:
try:
subtable = table[chunk]
except KeyError:
subtable = table[chunk] = dict()
assert isinstance(subtable, dict)
expand_encoding(subtable, parts)
else:
assert chunk not in table, "{} conflicts with {}".format(instr, table[chunk])
table[chunk] = instr
parts = encoding.split('_')
while re.match(r"^[a-z]+$", parts[-1]):
parts.pop()
expand_encoding(Instruction.opcodes, parts)
def print_assigned():
def contract_encoding(table, parts):
for part, entry in table.items():
if isinstance(entry, dict):
contract_encoding(entry, (*parts, part))
else:
encoding = '_'.join('{:04b}'.format(part) for part in (*parts, part))
mnemonic = entry().name()
print('{:20s} {}'.format(encoding, mnemonic))
contract_encoding(Instruction.opcodes, ())
def print_unassigned():
def contract_encoding(table, parts):
unassigned = set(range(16))
for part, entry in table.items():
unassigned.remove(part)
if isinstance(entry, dict):
contract_encoding(entry, (*parts, part))
for part in unassigned:
print('_'.join('{:04b}'.format(part) for part in (*parts, part)))
contract_encoding(Instruction.opcodes, ())
generate_tables()
# print_assigned()
# print_unassigned()
| 28.429799 | 99 | 0.621951 | 1,289 | 9,922 | 4.408068 | 0.282389 | 0.028159 | 0.031679 | 0.004752 | 0.060718 | 0.05051 | 0.038719 | 0.038719 | 0.030271 | 0 | 0 | 0.266937 | 0.253175 | 9,922 | 348 | 100 | 28.511494 | 0.499865 | 0.029127 | 0 | 0.055147 | 0 | 0 | 0.311552 | 0.032932 | 0 | 0 | 0 | 0 | 0.011029 | 1 | 0.022059 | false | 0 | 0.058824 | 0 | 0.080882 | 0.014706 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f3126093965615fe8a8564523762df648831f740 | 171 | py | Python | functional_tests.py | idanmel/soccer_friends | db370c384e99308c5f6a39a18eac1556b83cc786 | [
"MIT"
] | null | null | null | functional_tests.py | idanmel/soccer_friends | db370c384e99308c5f6a39a18eac1556b83cc786 | [
"MIT"
] | null | null | null | functional_tests.py | idanmel/soccer_friends | db370c384e99308c5f6a39a18eac1556b83cc786 | [
"MIT"
] | null | null | null | from selenium import webdriver
browser = webdriver.Firefox()
browser.get('http://localhost:8000')
try:
assert 'Django' in browser.title
finally:
browser.close()
| 17.1 | 36 | 0.730994 | 21 | 171 | 5.952381 | 0.809524 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.027397 | 0.146199 | 171 | 9 | 37 | 19 | 0.828767 | 0 | 0 | 0 | 0 | 0 | 0.157895 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.142857 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f3128d3872baa827767bd09bf278c2956175ee90 | 963 | py | Python | lorenzsj/blog/views.py | lorenzsj/lorenzsj | 631c6632f8fe70a021836c52aafd8746e13fc8a8 | [
"MIT"
] | null | null | null | lorenzsj/blog/views.py | lorenzsj/lorenzsj | 631c6632f8fe70a021836c52aafd8746e13fc8a8 | [
"MIT"
] | null | null | null | lorenzsj/blog/views.py | lorenzsj/lorenzsj | 631c6632f8fe70a021836c52aafd8746e13fc8a8 | [
"MIT"
] | null | null | null | from django.contrib.auth.models import User
from rest_framework import viewsets
from rest_framework import permissions
from rest_framework.response import Response
from blog.models import Post
from blog.serializers import PostSerializer
from blog.serializers import UserSerializer
from blog.permissions import IsAuthorOrReadOnly
class UserViewSet(viewsets.ReadOnlyModelViewSet):
"""This viewset automatically provides `list` and `detail` actions."""
queryset = User.objects.all()
serializer_class = UserSerializer
class PostViewSet(viewsets.ModelViewSet):
"""This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions.
"""
queryset = Post.objects.all()
serializer_class = PostSerializer
permission_classes = [
permissions.IsAuthenticatedOrReadOnly,
IsAuthorOrReadOnly,
]
def perform_create(self, serializer):
serializer.save(author=self.request.user) | 31.064516 | 74 | 0.764278 | 100 | 963 | 7.29 | 0.48 | 0.043896 | 0.069959 | 0.0631 | 0.098765 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15784 | 963 | 31 | 75 | 31.064516 | 0.89889 | 0.168224 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.05 | false | 0 | 0.4 | 0 | 0.8 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
f3163b561595dcd3e021c0a5f070a6337bbb8499 | 1,745 | py | Python | model/k1_clustering_pre-processing.py | not-a-hot-dog/spotify_project | b928fecb136cffdd62c650b054ca543047800f11 | [
"MIT"
] | null | null | null | model/k1_clustering_pre-processing.py | not-a-hot-dog/spotify_project | b928fecb136cffdd62c650b054ca543047800f11 | [
"MIT"
] | 1 | 2019-12-08T17:23:49.000Z | 2019-12-08T17:23:49.000Z | model/k1_clustering_pre-processing.py | not-a-hot-dog/spotify_project | b928fecb136cffdd62c650b054ca543047800f11 | [
"MIT"
] | null | null | null | import pandas as pd
import numpy as np
from model.helper_functions import build_playlist_features
print('Reading data into memory')
pid_list = np.genfromtxt('../data/train_pids.csv', skip_header=1, dtype=int)
playlistfile = '../data/playlists.csv'
playlist_df = pd.read_csv(playlistfile)
trackfile = '../data/songs_100000_feat_cleaned.csv'
track_df = pd.read_csv(trackfile, index_col='track_uri')
print('Finding playlist features')
playlist_features = build_playlist_features(pid_list, playlist_df, track_df)
playlist_features.to_csv('../data/playlist_features_train.csv')
print('Finding top artists')
# Find the top artists who dominate playlists
top_playlist_defining_artists = playlist_features.artist_uri_top.value_counts(normalize=False)
top_playlist_defining_artists.to_csv('../data/top_playlist_defining_artists_train_all.csv', header=True)
top_playlist_defining_artists = playlist_features.artist_uri_top.value_counts().index.values[:50]
np.savetxt('../data/top_playlist_defining_artists_train.csv', top_playlist_defining_artists, delimiter=',', fmt="%s")
# Keep only those artists who dominate playlists and one hot encode
artists_to_keep = playlist_features.artist_uri_top.isin(top_playlist_defining_artists)
playlist_features.artist_uri_top = playlist_features.artist_uri_top[artists_to_keep]
playlist_features.artist_uri_freq = playlist_features.artist_uri_freq[artists_to_keep]
playlist_features.artist_uri_freq.fillna(0, inplace=True)
top_artist_dummies = pd.get_dummies(playlist_features.artist_uri_top)
playlist_features = pd.concat([playlist_features, top_artist_dummies], axis=1)
playlist_features.drop(['artist_uri_top'], axis=1, inplace=True)
playlist_features.to_csv('../data/playlist_features_with_artists_train.csv')
| 52.878788 | 117 | 0.837249 | 257 | 1,745 | 5.291829 | 0.319066 | 0.235294 | 0.145588 | 0.165441 | 0.413235 | 0.380882 | 0.329412 | 0.197059 | 0.135294 | 0.095588 | 0 | 0.007322 | 0.060745 | 1,745 | 32 | 118 | 54.53125 | 0.822453 | 0.062464 | 0 | 0 | 0 | 0 | 0.217391 | 0.159829 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.12 | 0 | 0.12 | 0.12 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f316cbca5e61cde2ebe07f8eb9690a7626e13407 | 497 | py | Python | agenda/tests/test_models.py | migueleichler/django-tdd | 5b8bd6088b5e2de4d70026b761391bce3aa52f32 | [
"MIT"
] | null | null | null | agenda/tests/test_models.py | migueleichler/django-tdd | 5b8bd6088b5e2de4d70026b761391bce3aa52f32 | [
"MIT"
] | null | null | null | agenda/tests/test_models.py | migueleichler/django-tdd | 5b8bd6088b5e2de4d70026b761391bce3aa52f32 | [
"MIT"
] | null | null | null | from django.test import TestCase
from agenda.models import Compromisso
from model_mommy import mommy
class CompromissoModelTest(TestCase):
def setUp(self):
self.instance = mommy.make('Compromisso')
def test_string_representation(self):
self.assertEqual(str(self.instance), self.instance.titulo)
def test_obrigatory_fields(self):
created = Compromisso.objects.create(horario=self.instance.horario)
self.assertTrue(isinstance(created, Compromisso))
| 27.611111 | 75 | 0.750503 | 57 | 497 | 6.45614 | 0.526316 | 0.130435 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.160966 | 497 | 17 | 76 | 29.235294 | 0.882494 | 0 | 0 | 0 | 0 | 0 | 0.022133 | 0 | 0 | 0 | 0 | 0 | 0.181818 | 1 | 0.272727 | false | 0 | 0.272727 | 0 | 0.636364 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
f3200d5d53315321e6ef6c3cef5d42425590c96b | 743 | py | Python | strings/reverse_string.py | ahcode0919/python-ds-algorithms | 0d617b78c50b6c18da40d9fa101438749bfc82e1 | [
"MIT"
] | null | null | null | strings/reverse_string.py | ahcode0919/python-ds-algorithms | 0d617b78c50b6c18da40d9fa101438749bfc82e1 | [
"MIT"
] | null | null | null | strings/reverse_string.py | ahcode0919/python-ds-algorithms | 0d617b78c50b6c18da40d9fa101438749bfc82e1 | [
"MIT"
] | 3 | 2020-10-07T20:24:45.000Z | 2020-12-16T04:53:19.000Z | from typing import List, Optional
def reverse_string(string: str) -> str:
return string[::-1]
def reverse_string_in_place(string: [str]):
index = 0
length = len(string)
middle = length / 2
while index < middle:
string[index], string[length - 1 - index] = string[length - 1 - index], string[index]
index += 1
def reverse_string_with_list_comprehension(string: str) -> str:
return ''.join([string[i] for i in range(len(string) - 1, -1, -1)])
def reverse_string_with_loop(string: str) -> str:
reversed_str: List[Optional[str]] = [None] * len(string)
for index in range(len(string) - 1, -1, -1):
reversed_str[len(string) - 1 - index] = string[index]
return ''.join(reversed_str)
| 28.576923 | 93 | 0.643338 | 105 | 743 | 4.419048 | 0.27619 | 0.096983 | 0.137931 | 0.109914 | 0.271552 | 0.18319 | 0.081897 | 0 | 0 | 0 | 0 | 0.022184 | 0.211306 | 743 | 25 | 94 | 29.72 | 0.769625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.235294 | false | 0 | 0.058824 | 0.117647 | 0.470588 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
b825f9f00f6901c5d7cf23cfa47cb3197933eecd | 1,855 | py | Python | loadbalanceRL/utils/exceptions.py | fqzhou/LoadBalanceControl-RL | 689eec3b3b27e121aa45d2793e411f1863f6fc0b | [
"MIT"
] | 11 | 2018-10-29T06:50:43.000Z | 2022-03-28T14:26:09.000Z | loadbalanceRL/utils/exceptions.py | fqzhou/LoadBalanceControl-RL | 689eec3b3b27e121aa45d2793e411f1863f6fc0b | [
"MIT"
] | 1 | 2022-03-01T13:46:25.000Z | 2022-03-01T13:46:25.000Z | loadbalanceRL/utils/exceptions.py | fqzhou/LoadBalanceControl-RL | 689eec3b3b27e121aa45d2793e411f1863f6fc0b | [
"MIT"
] | 6 | 2019-02-05T20:01:53.000Z | 2020-09-04T12:30:00.000Z | #! /usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Definition of all Rainman2 exceptions
"""
__author__ = 'Ari Saha (arisaha@icloud.com), Mingyang Liu(liux3941@umn.edu)'
__date__ = 'Wednesday, February 14th 2018, 11:38:08 am'
class FileOpenError(IOError):
"""
Exception raised when a file couldn't be opened.
"""
pass
class AgentNotSupported(Exception):
"""
Exception raised when agent is not valid for requested algorithm
"""
pass
class AgentMethodNotImplemented(NotImplementedError):
"""
Exception raised when trying to access a private method of an agent
that is not implemented yet.
"""
pass
class AlgorithmNotImplemented(NotImplementedError):
"""
Exception raised when trying to access algorithm that is not
implemented yet.
"""
pass
class AlgorithmMethodNotImplemented(NotImplementedError):
"""
Exception raised when trying to access a private method of an algorithm
that is not implemented yet.
"""
pass
class ClientNotImplemented(NotImplementedError):
"""
Exception raised when trying to access client that is not
implemented yet.
"""
pass
class ClientMethodNotImplemented(NotImplementedError):
"""
Exception raised when trying to access a private method of a client
that is not implemented yet.
"""
pass
class EnvironmentNotImplemented(NotImplementedError):
"""
Exception raised when trying to access Environment that is not
implemented yet.
"""
pass
class EnvironmentMethodNotImplemented(NotImplementedError):
"""
Exception raised when trying to access a private method of an environment
that is not implemented yet.
"""
pass
class ExternalServerError(Exception):
"""
Exception raised when external server is not accessible
"""
pass
| 21.079545 | 77 | 0.698652 | 203 | 1,855 | 6.344828 | 0.374384 | 0.11646 | 0.147516 | 0.206522 | 0.551242 | 0.551242 | 0.551242 | 0.40528 | 0.215839 | 0.215839 | 0 | 0.01325 | 0.226954 | 1,855 | 87 | 78 | 21.321839 | 0.884937 | 0.476011 | 0 | 0.454545 | 0 | 0 | 0.13121 | 0.053503 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.454545 | 0 | 0 | 0.454545 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
b83a4b8131231e8ffeccb27881d8404fa73c602e | 649 | py | Python | dynamic programming/python/leetcode303_Range_Sum_Query_Immutable.py | wenxinjie/leetcode | c459a01040c8fe0783e15a16b8d7cca4baf4612a | [
"Apache-2.0"
] | null | null | null | dynamic programming/python/leetcode303_Range_Sum_Query_Immutable.py | wenxinjie/leetcode | c459a01040c8fe0783e15a16b8d7cca4baf4612a | [
"Apache-2.0"
] | null | null | null | dynamic programming/python/leetcode303_Range_Sum_Query_Immutable.py | wenxinjie/leetcode | c459a01040c8fe0783e15a16b8d7cca4baf4612a | [
"Apache-2.0"
] | null | null | null | # Given an integer array nums, find the sum of the elements between indices i and j (i ≤ j), inclusive.
# Example:
# Given nums = [-2, 0, 3, -5, 2, -1]
# sumRange(0, 2) -> 1
# sumRange(2, 5) -> -1
# sumRange(0, 5) -> -3
class NumArray:
def __init__(self, nums):
"""
:type nums: List[int]
"""
self.array = [0]
for num in nums:
self.array.append(self.array[-1] + num)
def sumRange(self, i, j):
"""
:type i: int
:type j: int
:rtype: int
"""
return self.array[j+1] - self.array[i]
# Time: O(n)
# Space: O(n)
# Difficulty: easy | 20.935484 | 103 | 0.497689 | 94 | 649 | 3.404255 | 0.468085 | 0.140625 | 0.0625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.042056 | 0.340524 | 649 | 31 | 104 | 20.935484 | 0.703271 | 0.474576 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
b84e7cc9d16e3f0b3e8a9ecacf33341e96af47cb | 102 | py | Python | Desafio 46.py | MisaelGuilherme/100_Exercicios_Em_Python | 8c4cdad7e60201abcdd2c4a5646f52aed4e7041e | [
"MIT"
] | null | null | null | Desafio 46.py | MisaelGuilherme/100_Exercicios_Em_Python | 8c4cdad7e60201abcdd2c4a5646f52aed4e7041e | [
"MIT"
] | null | null | null | Desafio 46.py | MisaelGuilherme/100_Exercicios_Em_Python | 8c4cdad7e60201abcdd2c4a5646f52aed4e7041e | [
"MIT"
] | null | null | null | print('====== DESAFIO 46 ======')
import time
for c in range(10,-1,-1):
time.sleep(1)
print(c) | 20.4 | 33 | 0.539216 | 17 | 102 | 3.235294 | 0.705882 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.084337 | 0.186275 | 102 | 5 | 34 | 20.4 | 0.578313 | 0 | 0 | 0 | 0 | 0 | 0.23301 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0.4 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b876fc588cb708294748bda2b97c2a9bb2b7cc83 | 539 | py | Python | files/OOP/Encapsulation/Encapsulation 3.py | grzegorzpikus/grzegorzpikus.github.io | 652233e0b98f48a3396583bab2559f5981bac8ad | [
"CC-BY-3.0"
] | null | null | null | files/OOP/Encapsulation/Encapsulation 3.py | grzegorzpikus/grzegorzpikus.github.io | 652233e0b98f48a3396583bab2559f5981bac8ad | [
"CC-BY-3.0"
] | null | null | null | files/OOP/Encapsulation/Encapsulation 3.py | grzegorzpikus/grzegorzpikus.github.io | 652233e0b98f48a3396583bab2559f5981bac8ad | [
"CC-BY-3.0"
] | null | null | null | class BankAccount:
def __init__(self, checking = None, savings = None):
self._checking = checking
self._savings = savings
def get_checking(self):
return self._checking
def set_checking(self, new_checking):
self._checking = new_checking
def get_savings(self):
return self._savings
def set_savings(self, new_savings):
self._savings = new_savings
my_account = BankAccount()
my_account.set_checking(523.48)
print(my_account.get_checking())
my_account.set_savings(386.15)
print(my_account.get_savings()) | 22.458333 | 54 | 0.747681 | 74 | 539 | 5.081081 | 0.256757 | 0.119681 | 0.074468 | 0.090426 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02193 | 0.153989 | 539 | 24 | 55 | 22.458333 | 0.802632 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.294118 | false | 0 | 0 | 0.117647 | 0.470588 | 0.117647 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
b894e0eb0f3f3a5eab5eca43855c560fff5104ea | 2,040 | py | Python | meterbus/wtelegram_header.py | noda/pyMeterBus | a1bb6b6ef9b3db4583dfb2b154e4f65365dee9d9 | [
"BSD-3-Clause"
] | 44 | 2016-12-11T14:43:14.000Z | 2022-03-17T18:31:14.000Z | meterbus/wtelegram_header.py | noda/pyMeterBus | a1bb6b6ef9b3db4583dfb2b154e4f65365dee9d9 | [
"BSD-3-Clause"
] | 13 | 2017-11-29T14:36:34.000Z | 2020-12-20T18:33:35.000Z | meterbus/wtelegram_header.py | noda/pyMeterBus | a1bb6b6ef9b3db4583dfb2b154e4f65365dee9d9 | [
"BSD-3-Clause"
] | 32 | 2015-09-15T12:23:19.000Z | 2022-03-22T08:32:22.000Z | import simplejson as json
from .telegram_field import TelegramField
class WTelegramHeader(object):
def __init__(self):
# self._startField = TelegramField()
self._lField = TelegramField()
self._cField = TelegramField()
# self._crcField = TelegramField()
# self._stopField = TelegramField()
self._headerLength = 2
# self._headerLengthCRCStop = 8
@property
def headerLength(self):
return self._headerLength
# @property
# def headerLengthCRCStop(self):
# return self._headerLengthCRCStop
@property
def startField(self):
return self._startField
@startField.setter
def startField(self, value):
self._startField = TelegramField(value)
@property
def lField(self):
return self._lField
@lField.setter
def lField(self, value):
self._lField = TelegramField(value)
@property
def cField(self):
return self._cField
@cField.setter
def cField(self, value):
self._cField = TelegramField(value)
@property
def interpreted(self):
return {
'length': hex(self.lField.parts[0]),
'c': hex(self.cField.parts[0]),
}
# @property
# def crcField(self):
# return self._crcField
# @crcField.setter
# def crcField(self, value):
# self._crcField = TelegramField(value)
# @property
# def stopField(self):
# return self._stopField
# @stopField.setter
# def stopField(self, value):
# self._stopField = TelegramField(value)
def load(self, hat):
header = hat
if isinstance(hat, str):
header = list(map(ord, hat))
# self.startField = header[0]
self.lField = header[0]
self.cField = header[1]
# self.crcField = header[-2]
# self.stopField = header[-1]
def to_JSON(self):
return json.dumps(self.interpreted, sort_keys=False,
indent=4, use_decimal=True)
| 24 | 60 | 0.601471 | 204 | 2,040 | 5.882353 | 0.259804 | 0.075 | 0.081667 | 0.096667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00694 | 0.293627 | 2,040 | 84 | 61 | 24.285714 | 0.825815 | 0.284804 | 0 | 0.116279 | 0 | 0 | 0.004875 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.255814 | false | 0 | 0.046512 | 0.139535 | 0.465116 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
b898caf8e8371912904209cfea669349d7d43e84 | 453 | py | Python | SimplePyGA/FitnessCalc/__init__.py | UglySoftware/SimplePyGA | 2cc0ef5709800059b323de2be6ea8bf77fb94384 | [
"MIT"
] | 1 | 2019-09-03T17:52:12.000Z | 2019-09-03T17:52:12.000Z | SimplePyGA/FitnessCalc/__init__.py | UglySoftware/SimplePyGA | 2cc0ef5709800059b323de2be6ea8bf77fb94384 | [
"MIT"
] | null | null | null | SimplePyGA/FitnessCalc/__init__.py | UglySoftware/SimplePyGA | 2cc0ef5709800059b323de2be6ea8bf77fb94384 | [
"MIT"
] | 1 | 2019-09-03T17:52:13.000Z | 2019-09-03T17:52:13.000Z | #-----------------------------------------------------------------------
#
# __init__.py (FitnessCalc)
#
# FitnessCalc package init module
#
# Copyright and Distribution
#
# Part of SimplePyGA: Simple Genetic Algorithms in Python
# Copyright (c) 2016 Terry McKiernan (terry@mckiernan.com)
# Released under The MIT License
# See LICENSE file in top-level package folder
#
#----------------------------------------------------------------------- | 32.357143 | 72 | 0.503311 | 39 | 453 | 5.74359 | 0.794872 | 0.125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010127 | 0.128035 | 453 | 14 | 73 | 32.357143 | 0.556962 | 0.940397 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b8a9652011ddd210555829c017c928bd04cf38bf | 920 | py | Python | azure_utility_tool/config.py | alextricity25/azure_utility_tool | 2975b5f415e5c64335618e83ed0216b7923c4166 | [
"MIT"
] | 5 | 2020-01-02T03:12:14.000Z | 2020-08-19T02:31:19.000Z | azure_utility_tool/config.py | alextricity25/azure_utility_tool | 2975b5f415e5c64335618e83ed0216b7923c4166 | [
"MIT"
] | null | null | null | azure_utility_tool/config.py | alextricity25/azure_utility_tool | 2975b5f415e5c64335618e83ed0216b7923c4166 | [
"MIT"
] | 2 | 2020-03-16T00:19:06.000Z | 2020-08-20T19:31:10.000Z | """
Author: Miguel Alex Cantu
Email: miguel.can2@gmail.com
Date: 12/21/2019
Description:
Loads Azure Utility Tool configuration file. The configuration
file is a blend of what the Microsoft Authentication Library
requires and some extra directives that the Auzre Utility
Tool requires. It is a JSON file that is required to be
stored in ~/.aut/aut_config.json
"""
import json
import sys
import os
from azure_utility_tool.exceptions import ConfigFileNotFound
def get_config(config_file="~/.aut/aut_config.json"):
CONFIG_PATH = os.path.expanduser(config_file)
# Ensure the directory exists, if not, then throw an Exception.
if not os.path.exists(CONFIG_PATH):
raise ConfigFileNotFound("The configuration file for the Azure"
" Utility Tool was not found in " +
config_file)
return json.load(open(CONFIG_PATH))
| 34.074074 | 71 | 0.702174 | 127 | 920 | 5 | 0.551181 | 0.069291 | 0.075591 | 0.050394 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012748 | 0.232609 | 920 | 26 | 72 | 35.384615 | 0.886686 | 0.476087 | 0 | 0 | 0 | 0 | 0.187764 | 0.046414 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.363636 | 0 | 0.545455 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
b8ab8fab99f75b9332a0131cf9ea65ac9a6bcb59 | 1,848 | py | Python | python_app/supervised_learning/train_data/Data.py | 0xsuu/Project-Mahjong | e82edc67651ff93c8ec158b590cd728f28504be9 | [
"Apache-2.0"
] | 9 | 2018-06-08T00:09:08.000Z | 2021-11-17T11:05:11.000Z | python_app/supervised_learning/train_data/Data.py | 0xsuu/Project-Mahjong | e82edc67651ff93c8ec158b590cd728f28504be9 | [
"Apache-2.0"
] | 1 | 2020-04-25T12:43:26.000Z | 2020-04-25T12:43:26.000Z | python_app/supervised_learning/train_data/Data.py | 0xsuu/Project-Mahjong | e82edc67651ff93c8ec158b590cd728f28504be9 | [
"Apache-2.0"
] | 2 | 2019-05-30T07:18:45.000Z | 2019-11-05T09:15:13.000Z | #!/usr/bin/env python3
'''
The MIT License (MIT)
Copyright (c) 2014 Mark Haines
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
'''
def asdata(obj, asdata):
if isinstance(obj, Data):
return obj.asdata(asdata)
elif isinstance(obj, str):
return obj
elif hasattr(obj, '_asdict'):
return asdata(obj._asdict(), asdata)
elif isinstance(obj, dict):
return dict((k, asdata(v, asdata)) for (k, v) in obj.items())
else:
try:
return list(asdata(child, asdata) for child in obj)
except:
return obj
class Data:
def asdata(self, asdata = asdata):
return dict((k, asdata(v, asdata)) for (k, v) in self.__dict__.items())
def __repr__(self):
return self.asdata().__repr__()
| 36.96 | 82 | 0.682359 | 260 | 1,848 | 4.796154 | 0.476923 | 0.070569 | 0.02085 | 0.036889 | 0.049719 | 0.049719 | 0.049719 | 0.049719 | 0.049719 | 0.049719 | 0 | 0.003592 | 0.246753 | 1,848 | 49 | 83 | 37.714286 | 0.892241 | 0.594697 | 0 | 0.105263 | 0 | 0 | 0.010753 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.157895 | false | 0 | 0 | 0.105263 | 0.631579 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
b8ae0034ffcbb27bca0f7745d8873b03677fa88a | 1,569 | py | Python | autobiography.py | wcmckee/wcmckee | 19315a37b592b7bcebb5f2720c965aea58f928ce | [
"MIT"
] | null | null | null | autobiography.py | wcmckee/wcmckee | 19315a37b592b7bcebb5f2720c965aea58f928ce | [
"MIT"
] | null | null | null | autobiography.py | wcmckee/wcmckee | 19315a37b592b7bcebb5f2720c965aea58f928ce | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# <nbformat>3.0</nbformat>
# <markdowncell>
# My name is William Clifford Mckee and this is my autobiography. Written in November 2014.
#
# Structure:
#
# <markdowncell>
# Hello and goodNIĢ
# testing one two there.
# <markdowncell>
# Hello. Testing one two three.
# Screw you guys. I'm going to bed.
#
# History. Mum and Dad
#
#
# One of my early drawing memories was of my friend Wayne. Around age 10. His art was better than mine. I wanted to be better.
# I can't remember seriously drawing untill after high school.
#
# I had a friend at highschool whos artwork I admired.
# He got in trouble once for drawing nudes.
# I rembmer being ifn the art room at intermediate. I have better memories of cooking and wood work than art.
# Paint yourself said the reliever.
#
# We had art folder. Kids would cover these black folders in art. I was always embarrassed by the art on mine. I would hide it by carrying the folder so that art was facing the inside.
# Today I walk around with a visual diary and will let anyone look.
# I'm always very critical of my art though.
#
#
# I hated using artist models and copy their painting.
# My painting skills were low - needed to develop drawing and confidence.
# I am tired.
# More bad news tonight
#
# I had some excellent tutors that helped me develop my painting. Most notable was Gary Freemantle and Roger Key.
# Gary pushed my abstraction and color
#
# Key pushed obsovational painting and focusing on lights and darks.
#
# The classes I did at The Learning Connextion were
| 32.020408 | 185 | 0.733588 | 257 | 1,569 | 4.478599 | 0.634241 | 0.010426 | 0.022589 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007217 | 0.205226 | 1,569 | 48 | 186 | 32.6875 | 0.915798 | 0.942001 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b8b45d47d2d2b0c8935936a0ff5a2cb55518f1d6 | 2,558 | py | Python | experiments/examples/example_run_bench_s1_periodic_bench.py | cogsys-tuebingen/uninas | 06729b9cf517ec416fb798ae387c5bd9c3a278ac | [
"MIT"
] | 18 | 2020-11-22T16:03:08.000Z | 2022-03-15T12:11:46.000Z | experiments/examples/example_run_bench_s1_periodic_bench.py | cogsys-tuebingen/uninas | 06729b9cf517ec416fb798ae387c5bd9c3a278ac | [
"MIT"
] | 2 | 2022-01-04T08:10:17.000Z | 2022-01-05T08:13:14.000Z | experiments/examples/example_run_bench_s1_periodic_bench.py | cogsys-tuebingen/uninas | 06729b9cf517ec416fb798ae387c5bd9c3a278ac | [
"MIT"
] | 6 | 2021-03-08T07:08:52.000Z | 2022-02-24T12:00:43.000Z | """
training a super-network and periodically evaluating its performance on bench architectures
a work in this direction exists: https://arxiv.org/abs/2001.01431
"""
from uninas.main import Main
# default configurations, for the search process and the network design
# config_files = "{path_conf_bench_tasks}/s1_fairnas_cifar.run_config, {path_conf_net_search}/bench201.run_config"
config_files = "{path_conf_bench_tasks}/s1_random_cifar.run_config, {path_conf_net_search}/bench201.run_config"
# these changes are applied to the default configuration in the config files
changes = {
"{cls_task}.is_test_run": True,
"{cls_task}.save_dir": "{path_tmp}/run_bench_s1_per/",
"{cls_task}.save_del_old": True,
"{cls_trainer}.max_epochs": 4,
"{cls_data}.dir": "{path_data}/cifar_data/",
"{cls_data}.fake": False,
"{cls_data}.download": False,
"{cls_data}.batch_size_train": 96,
# example how to mask options
"{cls_method}.mask_indices": "0, 1, 4", # mask Zero, Skip, Pool
"{cls_network_body}.cell_order": "n, n, r, n, n, r, n, n", # 2 normal cells, one reduction cell, ...
"{cls_network_stem}.features": 16, # start with 16 channels
# some augmentations
"cls_augmentations": "DartsCifarAug", # default augmentations for cifar
"{cls_schedulers#0}.warmup_epochs": 0,
# specifying how to add weights, note that SplitWeightsMixedOp requires a SplitWeightsMixedOpCallback
"{cls_network_cells_primitives#0}.mixed_cls": "MixedOp", # MixedOp, BiasD1MixedOp, ...
"{cls_network_cells_primitives#1}.mixed_cls": "MixedOp", # MixedOp, BiasD1MixedOp, ...
"cls_callbacks": "CheckpointCallback, CreateBenchCallback",
"{cls_callbacks#1}.each_epochs": 1,
"{cls_callbacks#1}.reset_bn": True,
"{cls_callbacks#1}.benchmark_path": "{path_data}/bench/nats/nats_bench_1.1_subset_m_test.pt",
# what and how to evaluate each specific network
"cls_cb_objectives": "NetValueEstimator",
"{cls_cb_objectives#0}.key": "acc1/valid",
"{cls_cb_objectives#0}.is_constraint": False,
"{cls_cb_objectives#0}.is_objective": True,
"{cls_cb_objectives#0}.maximize": True,
"{cls_cb_objectives#0}.load": True,
"{cls_cb_objectives#0}.batches_forward": 20,
"{cls_cb_objectives#0}.batches_train": 0,
"{cls_cb_objectives#0}.batches_eval": -1,
"{cls_cb_objectives#0}.value": "val/accuracy/1",
}
if __name__ == "__main__":
task = Main.new_task(config_files, args_changes=changes)
task.run()
| 41.258065 | 114 | 0.69742 | 347 | 2,558 | 4.821326 | 0.443804 | 0.029886 | 0.089659 | 0.086073 | 0.233712 | 0.139869 | 0.094441 | 0.057382 | 0.057382 | 0.057382 | 0 | 0.02623 | 0.165364 | 2,558 | 61 | 115 | 41.934426 | 0.757377 | 0.306099 | 0 | 0 | 0 | 0 | 0.638177 | 0.490598 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.027778 | 0 | 0.027778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b8b7e91501f23e4c04cf067b13d9a9480a460c77 | 59 | py | Python | python/testData/debug/test4.py | jnthn/intellij-community | 8fa7c8a3ace62400c838e0d5926a7be106aa8557 | [
"Apache-2.0"
] | 2 | 2018-12-29T09:53:39.000Z | 2018-12-29T09:53:42.000Z | python/testData/debug/test4.py | jnthn/intellij-community | 8fa7c8a3ace62400c838e0d5926a7be106aa8557 | [
"Apache-2.0"
] | 173 | 2018-07-05T13:59:39.000Z | 2018-08-09T01:12:03.000Z | python/testData/debug/test4.py | jnthn/intellij-community | 8fa7c8a3ace62400c838e0d5926a7be106aa8557 | [
"Apache-2.0"
] | 1 | 2020-11-27T10:36:50.000Z | 2020-11-27T10:36:50.000Z | xval = 0
xvalue1 = 1
xvalue2 = 2
print(xvalue1 + xvalue2)
| 9.833333 | 24 | 0.677966 | 9 | 59 | 4.444444 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.152174 | 0.220339 | 59 | 5 | 25 | 11.8 | 0.717391 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.25 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b211aa66d913dc22688021d4550c75de1f8811d6 | 2,877 | py | Python | tests/sdict/test_sdict_substitutor.py | nikitanovosibirsk/district42-exp-types | e36e43da62f32d58d4b14c65afa16856dc8849e1 | [
"Apache-2.0"
] | null | null | null | tests/sdict/test_sdict_substitutor.py | nikitanovosibirsk/district42-exp-types | e36e43da62f32d58d4b14c65afa16856dc8849e1 | [
"Apache-2.0"
] | 2 | 2021-08-01T05:02:21.000Z | 2021-08-01T10:06:28.000Z | tests/sdict/test_sdict_substitutor.py | nikitanovosibirsk/district42-exp-types | e36e43da62f32d58d4b14c65afa16856dc8849e1 | [
"Apache-2.0"
] | null | null | null | from _pytest.python_api import raises
from baby_steps import given, then, when
from district42 import schema
from revolt import substitute
from revolt.errors import SubstitutionError
from district42_exp_types.sdict import schema_sdict
def test_sdict_substitution():
with given:
sch = schema_sdict
with when:
res = substitute(sch, {})
with then:
assert res == schema_sdict({})
assert res != sch
def test_sdict_nested_substitution():
with given:
sch = schema_sdict({
"result": schema_sdict({
"id": schema.int,
"name": schema.str,
"friend": schema_sdict({
"id": schema.int,
"name": schema.str
})
})
})
with when:
res = substitute(sch, {
"result.id": 1,
"result.name": "Bob",
"result.friend.id": 2,
"result.friend.name": "Alice",
})
with then:
assert res == schema_sdict({
"result": schema_sdict({
"id": schema.int(1),
"name": schema.str("Bob"),
"friend": schema_sdict({
"id": schema.int(2),
"name": schema.str("Alice")
})
})
})
assert res != sch
def test_sdict_relaxed_substitution():
with given:
sch = schema_sdict({
"result": schema_sdict({
"id": schema.int,
"name": schema.str,
...: ...
})
})
with when:
res = substitute(sch, {
"result.id": 1,
})
with then:
assert res == schema_sdict({
"result": schema_sdict({
"id": schema.int(1),
"name": schema.str,
...: ...
})
})
assert res != sch
def test_sdict_relaxed_extra_key_substitution_error():
with given:
sch = schema_sdict({
"result": schema_sdict({
"id": schema.int,
"name": schema.str,
...: ...
})
})
with when, raises(Exception) as exception:
substitute(sch, {
"result.id": 1,
"result.deleted_at": None
})
with then:
assert exception.type is SubstitutionError
def test_sdict_relaxed_ellipsis_substitution_error():
with given:
sch = schema_sdict({
"result": schema_sdict({
"id": schema.int,
"name": schema.str,
...: ...
})
})
with when, raises(Exception) as exception:
substitute(sch, {
"result.id": 1,
...: ...
})
with then:
assert exception.type is SubstitutionError
| 23.77686 | 54 | 0.468891 | 264 | 2,877 | 4.950758 | 0.189394 | 0.143076 | 0.079572 | 0.116297 | 0.748279 | 0.729916 | 0.628156 | 0.521041 | 0.510329 | 0.510329 | 0 | 0.007088 | 0.41154 | 2,877 | 120 | 55 | 23.975 | 0.764914 | 0 | 0 | 0.777778 | 0 | 0 | 0.072993 | 0 | 0 | 0 | 0 | 0 | 0.080808 | 1 | 0.050505 | false | 0 | 0.060606 | 0 | 0.111111 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b212f83168a6342d8bcbdaa233860a911b7cdadb | 1,117 | py | Python | drf_ujson/parsers.py | radzhome/drf-ujson-renderer | b65c01edc5311404178a9d245d40ccc10733c5d7 | [
"MIT"
] | null | null | null | drf_ujson/parsers.py | radzhome/drf-ujson-renderer | b65c01edc5311404178a9d245d40ccc10733c5d7 | [
"MIT"
] | null | null | null | drf_ujson/parsers.py | radzhome/drf-ujson-renderer | b65c01edc5311404178a9d245d40ccc10733c5d7 | [
"MIT"
] | 1 | 2019-04-04T13:25:22.000Z | 2019-04-04T13:25:22.000Z | from __future__ import unicode_literals
import codecs
from django.conf import settings
from rest_framework.compat import six
from rest_framework.parsers import BaseParser, ParseError
from rest_framework import renderers
from rest_framework.settings import api_settings
import ujson
class UJSONParser(BaseParser):
"""
Parses JSON-serialized data.
"""
media_type = 'application/json'
renderer_class = renderers.JSONRenderer
strict = api_settings.STRICT_JSON
def parse(self, stream, media_type=None, parser_context=None):
"""
Parses the incoming bytestream as JSON and returns the resulting data.
"""
parser_context = parser_context or {}
encoding = parser_context.get('encoding', settings.DEFAULT_CHARSET)
try:
decoded_stream = codecs.getreader(encoding)(stream)
parse_constant = ujson.strict_constant if self.strict else None
return ujson.load(decoded_stream, parse_constant=parse_constant)
except ValueError as exc:
raise ParseError('JSON parse error - %s' % six.text_type(exc))
| 32.852941 | 78 | 0.72068 | 133 | 1,117 | 5.849624 | 0.488722 | 0.041131 | 0.087404 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.21128 | 1,117 | 33 | 79 | 33.848485 | 0.883087 | 0.08863 | 0 | 0 | 0 | 0 | 0.045965 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.047619 | false | 0 | 0.380952 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
b24fa470c54ab2d92980faab3b5c114f1efa0392 | 151 | py | Python | Recursion/recursiopow.py | TheG0dfath3r/Python | 73f40e9828b953c3e614a21a8980eaa81b5c066e | [
"MIT"
] | null | null | null | Recursion/recursiopow.py | TheG0dfath3r/Python | 73f40e9828b953c3e614a21a8980eaa81b5c066e | [
"MIT"
] | null | null | null | Recursion/recursiopow.py | TheG0dfath3r/Python | 73f40e9828b953c3e614a21a8980eaa81b5c066e | [
"MIT"
] | 2 | 2019-09-30T21:17:57.000Z | 2019-10-01T16:23:33.000Z | x=int(input("no 1 "))
y=int(input("no 2 "))
def pow(x,y):
if y!=0:
return(x*pow(x,y-1))
else:
return 1
print(pow(x,y))
| 16.777778 | 29 | 0.463576 | 30 | 151 | 2.333333 | 0.466667 | 0.171429 | 0.214286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.048077 | 0.311258 | 151 | 8 | 30 | 18.875 | 0.625 | 0 | 0 | 0 | 0 | 0 | 0.06993 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0 | 0 | 0.25 | 0.125 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b267e740ceab58f8898f41e8edaa0a8a6747e59b | 6,299 | py | Python | 4_neural_networks.py | scientificprogrammer123/Udacity_Machine-Learning | e6f5a73724ac51c9dcc9c28ee1652991982598ca | [
"MIT"
] | null | null | null | 4_neural_networks.py | scientificprogrammer123/Udacity_Machine-Learning | e6f5a73724ac51c9dcc9c28ee1652991982598ca | [
"MIT"
] | null | null | null | 4_neural_networks.py | scientificprogrammer123/Udacity_Machine-Learning | e6f5a73724ac51c9dcc9c28ee1652991982598ca | [
"MIT"
] | 1 | 2021-04-14T22:04:52.000Z | 2021-04-14T22:04:52.000Z | # lesson 1: neural networks
# cell body, neuron, axon, synapse
# spike trains travel down the axon, and causes excitation to occur at other axons.
# a computation unit.
#
# x1 -> w1 ->
# x2 -> w2 -> theta -> y
# x3 -> w3 ->
#
# sum_{=1}^{k} xi*wi, activation
# >=theta, firing threshold
#
# For perceptron, yes: y=1
# no: y=0
#
# lesson 2, ANN
# x1 1 w1 0.5 theta=0, y=0
# x2 0 w2 0.6
# x3 -1.5 w3 1
# lesson 3, how powerful is a perceptron? and
# y = 0,1
# w1 = 1/2
# w2 = 1/2
# theta = 3/4
#
# if x1=0, x2*1/2=3/4, x2=3/2
# if x2=0, x1*1/2=3/4, x1=3/2
#
# r = return 0, g = return 1
#
# 1 g g g g g
# 0.75 rg g g g g
# 0.5 r rg g g g
# 0.25 r r rg g g
# 0 r r r rg g
# 0 0.25 0.5 0.75 1
# lesson 4, how powerful is a perceptron 4?
# if we focus on x1 E {0,1}, x2 E {0,1}
# what is y? y is and
# lesson 5, how powerful is a perceptron 5?
# w1 = 0.5
# w2 = 0.5
# theta = 1/4
#
# if we focus on x1 E {0,1}, x2 E {0,1}
# what is y? y is or
#
#
# 1 g g g g g
# 0.75 g g g g g
# 0.5 g g g g g
# 0.25 rg g g g g
# 0 r rg g g g
# 0 0.25 0.5 0.75 1
# lesson 6, how powerful is a perceptron? not
# x1=1, y=0
# x1=0, y=1
# w1=-0.5, theta=0
#
# G R
# -1 0 1 2
#
# and or not are all expressible as perceptron units
# lesson 7, xor function
# theta = 0.5
# x1-> -> 0.5 ->
# and -> -1 -> or -> y
# x2-> -> 0.5 ->
#
# x1 x2 and or xor=or-and
# 0 0 0 0 0
# 0 1 0 1 1
# 1 0 0 1 1
# 1 1 1 1 0
# lesson 8, perceptron training
# perceptron rule -> single unit
# wi = wi + delta wi
# delta wi = nu(yi- yi^hat)xi
# yi^hat = (sum_i wi yi >= 0)
#
# y: target
# y_hat: output
# nu: learning rate
# x: input
#
# repeat x,y
# bias x y (0/1)
# | xxxx y
# | xxxx y
# | xxxx y
# | xxxx y
# | xxxx y
# | xxxx y
# | xxxx y
# | xxxx y
# theta w
#
# y y_hat y-y_hat
# 0 0 0
# 0 1 -1
# 1 0 1
# 1 1 0
#
# 2D training set, learn a half plane
# if the half plane is linearly separable, then perceptron rule will find it in
# finite number of iterations.
#
# if the data is not linearly seperable, see if it ever stops,
# problem, this algorithm never stops,
# so run while there are some errors, if you solve the halting problem then you
# can solve the halting problem.
# lesson 9, gradient descent
# need something that can work for linearly non-separability.
#
# a = sum_i x_i w_i
# y^hat = {a>=0}
# E(w) = 1/2 sum_{(x,y) E D} (y-a)^2
# d E(w) / d w_i = d/dw_i 1/2 sum_{(x,y) E D} (y-a)^2
# = sum_{(x,y) E D} (y-a) d/dw_i -sum_i x_i w_i'
# = sum_{(x,y) E D} (y-a)(-x_i) <- looks a lot like the perceptron rule
# lesson 10, comparison of learning rules
# delta w_i = nu (y-y^hat) x_i, perceptron: guarantee of finite convergence, in the case of linearly separable
# delta w_i = nu (y-a) x_i, gradient descent: calculus, robust, converge to local optimum
# activation, vs activation and thresholding it
# lesson 11, comparison of learning rules
# quiz: why not do gradient descent on y^hat
# intractable, no
# non differentiable, yes
# grows too fast, no
# multiple answers, no
# lesson 12, sigmoid
# sigma(a) = 1 / (1+e^(-a))
# as a -> -infinity, sigma(a)->0
# as a -> +infinity, sigma(a)->1
# D sigma(a) = sigma(a) (1-sigma(a))
# lesson 13, neural network sketch
# input, hidden layers, hidden layers, output
#
# whole thing is differentiable,
#
# back propogation, computationally beneficial organization of the chain rule
# we are just computing the derivatives with respect to the different weights
# in the network, all in one convenient way, that has, this lovely interpretation
# of having information flowing from the inputs to the outputs. And then error
# information flowing back from the outputs towards the inputs, and that tells you
# how to compute all the derivatives. And then, therefore how to make all the weight
# updates to make the network produce something more like what you want it to
# produce. so this is where the learning is actually taking place.
#
# the error function can have many local minimums, or local optima, stuck
# lesson 14, optimizing weights
# -> gradient descent
# -> advanced optimization methods, optimization and learning are the same according to people
#
# momentum terms in the gradient, in gradient descent, continue in direction,
# higher order derivatives, combinations of weights hamiltonia, and what not
# randmized optimization
# penalty for complexity
# philosophy based optimization, has this been tried?
#
# add more nodes,
# add more layers,
# higher weights
# these parameters make the network more complex
# make it as simple as possible.
# lesson 15, restrition bias
# restriction bias tells you the representational power, i.e. what you are able to represent
# set of hypotheses we will consider
# perceptron units are linear
# half spaces
# sigmoids
# complex
# much more complex, not as much
# Boolean: network of threshold-like units
# continuous function: connected, no jumps, hidden
# arbitrary: stitched together
#
# dangers of overfitting: cross validation
# error - iterations
# cross validation error can increase again, so if it works, then just stop
# lesson 16, preference bias
# preference bias tells you, given two representations, why I would prefer one
# over the other.
# prefer correct tree, prefer shorter tree
# how do we start weights:
# small, random values, for weights, avoid local minima, variability,
# large weights leads to overfitting,
# small random values, simple explaination,
# neural networks implement simpler explaination, occam's razor
# don't make something more complex unnecessarily
# better generalization
# lesson 17, summary
# perceptron, linear threshold unit, can create boolean function
# perceptron rule - finite time for linearly separable
# general differentiable - backprop and gradient descent
# preference/restriction bias of neural networks | 29.712264 | 111 | 0.62883 | 1,046 | 6,299 | 3.764818 | 0.304971 | 0.013713 | 0.013713 | 0.010157 | 0.117065 | 0.073134 | 0.050279 | 0.047994 | 0.041138 | 0.041138 | 0 | 0.047891 | 0.277346 | 6,299 | 212 | 112 | 29.712264 | 0.817223 | 0.90173 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b26b2d344b00d14f7c80d63267fca336b474dfed | 287 | py | Python | FluentPython/ch02/cartesian.py | eroicaleo/LearningPython | 297d46eddce6e43ce0c160d2660dff5f5d616800 | [
"MIT"
] | 1 | 2020-10-12T13:33:29.000Z | 2020-10-12T13:33:29.000Z | FluentPython/ch02/cartesian.py | eroicaleo/LearningPython | 297d46eddce6e43ce0c160d2660dff5f5d616800 | [
"MIT"
] | null | null | null | FluentPython/ch02/cartesian.py | eroicaleo/LearningPython | 297d46eddce6e43ce0c160d2660dff5f5d616800 | [
"MIT"
] | 1 | 2016-11-09T07:28:45.000Z | 2016-11-09T07:28:45.000Z | #!/usr/bin/env python
colors = ['white', 'black']
sizes = ['S', 'M', 'L']
tshirts = [(color, size) for size in sizes
for color in colors ]
print(tshirts)
tshirts = [(color, size) for color in colors
for size in sizes ]
print(tshirts)
| 22.076923 | 46 | 0.54007 | 37 | 287 | 4.189189 | 0.459459 | 0.154839 | 0.206452 | 0.245161 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.320557 | 287 | 12 | 47 | 23.916667 | 0.794872 | 0.069686 | 0 | 0.25 | 0 | 0 | 0.048872 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.25 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b27869ddfe009d8e2d025f4f2f3d4a1de697cced | 1,401 | py | Python | EventManager/Home/models.py | 201901407/woc3.0-eventmanager-DarshilParikh | 8174cd5373e3f3e4723a9fd6381266a56dddc4e6 | [
"MIT"
] | 1 | 2021-01-03T13:57:38.000Z | 2021-01-03T13:57:38.000Z | EventManager/Home/models.py | 201901407/woc3.0-eventmanager-DarshilParikh | 8174cd5373e3f3e4723a9fd6381266a56dddc4e6 | [
"MIT"
] | null | null | null | EventManager/Home/models.py | 201901407/woc3.0-eventmanager-DarshilParikh | 8174cd5373e3f3e4723a9fd6381266a56dddc4e6 | [
"MIT"
] | null | null | null | from django.db import models
import uuid, datetime
from django.utils import timezone
# Create your models here.
class User(models.Model):
user_id = models.CharField(max_length=100,default=uuid.uuid4)
email = models.EmailField(max_length=100)
name = models.CharField(max_length=100)
password = models.CharField(max_length=250)
def getUserDetails(self):
return self.email
class Event(models.Model):
event_id = models.CharField(max_length=100,default=uuid.uuid4)
event_name = models.CharField(max_length = 120)
event_start = models.DateTimeField()
event_end = models.DateTimeField()
host_email = models.EmailField(max_length = 100)
host_name = models.CharField(max_length = 100)
event_description = models.CharField(max_length = 300)
registration_deadline = models.DateTimeField(default=timezone.now)
event_poster = models.URLField(max_length=150,default = '')
def getEventDetails(self):
return [self.event_name,self.event_start,self.event_end,self.host,self.event_description]
class Participant(models.Model):
pevent_id = models.CharField(max_length=100)
participant_email = models.EmailField(max_length = 100)
participant_name = models.CharField(max_length=100)
participant_contactno = models.IntegerField()
group_registration = models.BooleanField()
no_of_members = models.IntegerField()
| 35.025 | 97 | 0.751606 | 176 | 1,401 | 5.789773 | 0.3125 | 0.114818 | 0.158979 | 0.211973 | 0.354269 | 0.326791 | 0.088322 | 0.088322 | 0.088322 | 0 | 0 | 0.034483 | 0.15132 | 1,401 | 39 | 98 | 35.923077 | 0.82254 | 0.017131 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.068966 | false | 0.034483 | 0.103448 | 0.068966 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
b27af965481a6eface77ab77feda170f704b5500 | 543 | py | Python | photoseleven/db.py | photoseleven/photoseleven-backend | 2e511d5e48477b6b41a6d98f0630b1bcada8a298 | [
"MIT"
] | null | null | null | photoseleven/db.py | photoseleven/photoseleven-backend | 2e511d5e48477b6b41a6d98f0630b1bcada8a298 | [
"MIT"
] | null | null | null | photoseleven/db.py | photoseleven/photoseleven-backend | 2e511d5e48477b6b41a6d98f0630b1bcada8a298 | [
"MIT"
] | 1 | 2020-03-29T11:20:40.000Z | 2020-03-29T11:20:40.000Z | import click
from flask import current_app, g
from flask.cli import with_appcontext
from flask_pymongo import PyMongo
from werkzeug.security import check_password_hash, generate_password_hash
def get_db():
if 'db' not in g:
mongo = PyMongo(current_app)
g.db = mongo.db
g.db_client = mongo.cx
return g.db
def close_db(e=None):
g.pop('db', None)
db_client = g.pop('db_client', None)
if db_client is not None:
db_client.close()
def init_app(app):
app.teardown_appcontext(close_db)
| 19.392857 | 73 | 0.692449 | 87 | 543 | 4.114943 | 0.390805 | 0.111732 | 0.061453 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.220994 | 543 | 27 | 74 | 20.111111 | 0.846336 | 0 | 0 | 0 | 1 | 0 | 0.023985 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0.055556 | 0.277778 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
b27c6063523b17b12c8d4769a34a8058b47c5491 | 366 | py | Python | DataStructures/Stacks/Stack.py | hhimmmmii/Data_Structures_and_Algorithms | 21169d21172fd1242cbb998324a6953b0c32cd05 | [
"MIT"
] | null | null | null | DataStructures/Stacks/Stack.py | hhimmmmii/Data_Structures_and_Algorithms | 21169d21172fd1242cbb998324a6953b0c32cd05 | [
"MIT"
] | 2 | 2020-10-05T05:23:40.000Z | 2020-10-15T17:34:32.000Z | DataStructures/Stacks/Stack.py | hhimmmmii/Data_Structures_and_Algorithms | 21169d21172fd1242cbb998324a6953b0c32cd05 | [
"MIT"
] | 10 | 2020-10-03T06:31:41.000Z | 2020-12-28T18:54:40.000Z | class Stack:
def __init__(self):
self.stack = []
def add(self, dataval):
# Use list append method to add element
if dataval not in self.stack:
self.stack.append(dataval)
return True
else:
return False
# Use peek to look at the top of the stack
def peek(self):
return self.stack[-1] | 22.875 | 42 | 0.57377 | 50 | 366 | 4.12 | 0.54 | 0.174757 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004202 | 0.349727 | 366 | 16 | 43 | 22.875 | 0.861345 | 0.213115 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.272727 | false | 0 | 0 | 0.090909 | 0.636364 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
b27fdc318377fdd21756f01199453a4713d91df6 | 1,794 | py | Python | forecast_box/validate.py | kyleclo/forecast-box | 5b965f0c7f45c92e800c31df1c7a12a6d08527b1 | [
"Apache-2.0"
] | 1 | 2017-02-08T19:34:35.000Z | 2017-02-08T19:34:35.000Z | forecast_box/validate.py | kyleclo/forecast-box | 5b965f0c7f45c92e800c31df1c7a12a6d08527b1 | [
"Apache-2.0"
] | null | null | null | forecast_box/validate.py | kyleclo/forecast-box | 5b965f0c7f45c92e800c31df1c7a12a6d08527b1 | [
"Apache-2.0"
] | null | null | null | """
Validation
"""
import numpy as np
import pandas as pd
from model import Model
# TODO: different versions with resampling or subsampling
# TODO: return DataFrame of forecasted_values along with metric?
def validate_model(name, params, time_series, metric_fun):
"""Evaluates performance of Model forecast method on time series"""
min_size = max(params['forward_steps']) + params['ar_order']
max_size = time_series.size - max(params['forward_steps'])
metric = []
for n in range(min_size, max_size + 1):
print 'Simulating forecasts for ' + str(time_series.index[n - 1])
sub_time_series = time_series.head(n)
model = Model.create(name, params)
model.train(sub_time_series)
forecasted_values = model.forecast(sub_time_series)
actual_values = time_series[forecasted_values.index]
metric.append(metric_fun(actual_values, forecasted_values))
return pd.Series(data=metric,
index=time_series.index[(min_size - 1):max_size])
# def validate_forecaster(forecaster, time_series, performance_fun):
# """Applies a forecaster to a time series to evaluate performance"""
#
# performance = []
# min_size = forecaster.min_size
# max_size = time_series.size - max(forecaster.forward_steps)
# for n in range(min_size, max_size + 1):
# print 'Simulating forecaster for ' + str(time_series.index[n - 1])
# sub_time_series = time_series.head(n)
# forecasted_values = forecaster.forecast(sub_time_series)
# actual_values = time_series[forecasted_values.index]
# performance.append(performance_fun(actual_values, forecasted_values))
#
# return pd.Series(data=performance,
# index=time_series.index[min_size - 1:max_size])
| 36.612245 | 79 | 0.696767 | 233 | 1,794 | 5.124464 | 0.274678 | 0.159129 | 0.054439 | 0.035176 | 0.478224 | 0.442211 | 0.40201 | 0.40201 | 0.40201 | 0.261307 | 0 | 0.004196 | 0.202899 | 1,794 | 48 | 80 | 37.375 | 0.830769 | 0.476031 | 0 | 0 | 0 | 0 | 0.070743 | 0 | 0 | 0 | 0 | 0.020833 | 0 | 0 | null | null | 0 | 0.176471 | null | null | 0.058824 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b28976d7d07ee0d85891e3ce1f95a592baa06a72 | 717 | py | Python | highway_env/__init__.py | songanz/highway-env | ac21d1da25e224dbdbf8ba39509f4013bd029f52 | [
"MIT"
] | 1 | 2019-11-06T15:28:27.000Z | 2019-11-06T15:28:27.000Z | highway_env/__init__.py | songanz/highway-env | ac21d1da25e224dbdbf8ba39509f4013bd029f52 | [
"MIT"
] | null | null | null | highway_env/__init__.py | songanz/highway-env | ac21d1da25e224dbdbf8ba39509f4013bd029f52 | [
"MIT"
] | 1 | 2019-07-22T03:37:09.000Z | 2019-07-22T03:37:09.000Z | from gym.envs.registration import register
register(
id='highway-v0',
entry_point='highway_env.envs:HighwayEnv',
)
register(
id='highway-continuous-v0',
entry_point='highway_env.envs:HighwayEnvCon',
)
register(
id='highway-continuous-intrinsic-rew-v0',
entry_point='highway_env.envs:HighwayEnvCon_intrinsic_rew',
)
register(
id='merge-v0',
entry_point='highway_env.envs:MergeEnv',
)
register(
id='roundabout-v0',
entry_point='highway_env.envs:RoundaboutEnv',
)
register(
id='two-way-v0',
entry_point='highway_env.envs:TwoWayEnv',
max_episode_steps=15
)
register(
id='parking-v0',
entry_point='highway_env.envs:ParkingEnv',
max_episode_steps=20
)
| 18.384615 | 63 | 0.714086 | 91 | 717 | 5.406593 | 0.340659 | 0.142276 | 0.170732 | 0.270325 | 0.422764 | 0.422764 | 0.158537 | 0 | 0 | 0 | 0 | 0.017974 | 0.146444 | 717 | 38 | 64 | 18.868421 | 0.785948 | 0 | 0 | 0.225806 | 0 | 0 | 0.440725 | 0.369596 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.032258 | 0 | 0.032258 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b28b8885604c48606cb8d4e162a310c2bb979435 | 1,005 | py | Python | tests/test_topic_matching.py | InfraPixels/powerlibs-aws-sqs-dequeue_to_api | 67ae744c96c7658229acc6fd1b1c432d24f0817d | [
"MIT"
] | null | null | null | tests/test_topic_matching.py | InfraPixels/powerlibs-aws-sqs-dequeue_to_api | 67ae744c96c7658229acc6fd1b1c432d24f0817d | [
"MIT"
] | null | null | null | tests/test_topic_matching.py | InfraPixels/powerlibs-aws-sqs-dequeue_to_api | 67ae744c96c7658229acc6fd1b1c432d24f0817d | [
"MIT"
] | 1 | 2021-05-26T00:16:26.000Z | 2021-05-26T00:16:26.000Z | def test_topic_regexp_matching(dequeuer):
msg = {'company_name': 'test_company'}
actions_1 = tuple(dequeuer.get_actions_for_topic('object__created', msg))
actions_2 = tuple(dequeuer.get_actions_for_topic('object__deleted', msg))
actions_3 = tuple(dequeuer.get_actions_for_topic('otherthing__created', msg))
assert actions_1 == actions_2
assert actions_1 != actions_3
def test_topic_regexp_matching_with_groups(dequeuer):
msg = {'company_name': 'test_company'}
actions_1 = tuple(dequeuer.get_actions_for_topic('step__alfa__started', msg))
payload = actions_1[0][1]['run'].args[2][0]
assert 'name' in payload
assert payload['name'] == 'alfa'
assert 'status' in payload
assert payload['status'] == 'started', payload
actions_2 = tuple(dequeuer.get_actions_for_topic('step__beta__finished', msg))
actions_3 = tuple(dequeuer.get_actions_for_topic('otherthing__created', msg))
assert actions_1 == actions_2
assert actions_1 != actions_3
| 38.653846 | 82 | 0.736318 | 138 | 1,005 | 4.92029 | 0.246377 | 0.082474 | 0.141384 | 0.20324 | 0.745214 | 0.66863 | 0.66863 | 0.639175 | 0.5243 | 0.5243 | 0 | 0.022119 | 0.145274 | 1,005 | 25 | 83 | 40.2 | 0.768335 | 0 | 0 | 0.421053 | 0 | 0 | 0.18806 | 0 | 0 | 0 | 0 | 0 | 0.421053 | 1 | 0.105263 | false | 0 | 0 | 0 | 0.105263 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b28d0dae8fb9ed9ee50b81bbf1aae13554854cbe | 1,352 | py | Python | src/baskerville/models/model_interface.py | deflect-ca/baskerville | 9659f4b39ab66fcf5329a4eccff15e97245b04f0 | [
"CC-BY-4.0"
] | 2 | 2021-12-03T11:26:38.000Z | 2022-01-12T22:24:29.000Z | src/baskerville/models/model_interface.py | deflect-ca/baskerville | 9659f4b39ab66fcf5329a4eccff15e97245b04f0 | [
"CC-BY-4.0"
] | 3 | 2022-01-19T15:17:37.000Z | 2022-03-22T04:55:22.000Z | src/baskerville/models/model_interface.py | deflect-ca/baskerville | 9659f4b39ab66fcf5329a4eccff15e97245b04f0 | [
"CC-BY-4.0"
] | null | null | null | # Copyright (c) 2020, eQualit.ie inc.
# All rights reserved.
#
# This source code is licensed under the BSD-style license found in the
# LICENSE file in the root directory of this source tree.
import inspect
import logging
class ModelInterface(object):
def __init__(self):
super().__init__()
self.logger = logging.getLogger(self.__class__.__name__)
def get_param_names(self):
return list(inspect.signature(self.__init__).parameters.keys())
def set_params(self, **params):
param_names = self.get_param_names()
for key, value in params.items():
if key not in param_names:
raise RuntimeError(
f'Class {self.__class__.__name__} does not '
f'have {key} attribute')
setattr(self, key, value)
def get_params(self):
params = {}
for name in self.get_param_names():
params[name] = getattr(self, name)
return params
def _get_class_path(self):
return f'{self.__class__.__module__}.{self.__class__.__name__}'
def train(self, df):
pass
def predict(self, df):
pass
def save(self, path, spark_session=None):
pass
def load(self, path, spark_session=None):
pass
def set_logger(self, logger):
self.logger = logger
| 26 | 71 | 0.621302 | 170 | 1,352 | 4.588235 | 0.435294 | 0.064103 | 0.05 | 0.041026 | 0.079487 | 0.079487 | 0.079487 | 0 | 0 | 0 | 0 | 0.004115 | 0.281065 | 1,352 | 51 | 72 | 26.509804 | 0.798354 | 0.134615 | 0 | 0.121212 | 0 | 0 | 0.097938 | 0.06701 | 0 | 0 | 0 | 0 | 0 | 1 | 0.30303 | false | 0.121212 | 0.060606 | 0.060606 | 0.484848 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
b299f61f9bab8f0fdfd0cbba6dbcac61cd8b37ce | 239 | py | Python | dags/minimal_dag.py | MarcusJones/kaggle_petfinder_adoption | 2d745b48405f4d4211b523eae272b9169fcf9fa2 | [
"MIT"
] | 1 | 2019-01-24T04:22:39.000Z | 2019-01-24T04:22:39.000Z | dags/minimal_dag.py | MarcusJones/kaggle_petfinder_adoption | 2d745b48405f4d4211b523eae272b9169fcf9fa2 | [
"MIT"
] | null | null | null | dags/minimal_dag.py | MarcusJones/kaggle_petfinder_adoption | 2d745b48405f4d4211b523eae272b9169fcf9fa2 | [
"MIT"
] | null | null | null | import airflow as af
from airflow.operators.dummy_operator import DummyOperator
from datetime import datetime
with af.DAG('minimal_dag', start_date=datetime(2016, 1, 1)) as dag:
op = DummyOperator(task_id='op')
op.dag is dag # True
| 23.9 | 67 | 0.76569 | 38 | 239 | 4.710526 | 0.578947 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.029268 | 0.142259 | 239 | 9 | 68 | 26.555556 | 0.843902 | 0.016736 | 0 | 0 | 0 | 0 | 0.056034 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
b2a1766bc5fbc87d90f9559b3c26e49052f3b261 | 869 | py | Python | tests/test_tunnels_released.py | jhaapako/tcf | ecd75404459c6fec9d9fa1522b70a8deab896644 | [
"Apache-2.0"
] | 24 | 2018-08-21T18:04:48.000Z | 2022-02-07T22:50:06.000Z | tests/test_tunnels_released.py | jhaapako/tcf | ecd75404459c6fec9d9fa1522b70a8deab896644 | [
"Apache-2.0"
] | 16 | 2018-08-21T18:03:52.000Z | 2022-03-01T17:15:42.000Z | tests/test_tunnels_released.py | jhaapako/tcf | ecd75404459c6fec9d9fa1522b70a8deab896644 | [
"Apache-2.0"
] | 29 | 2018-08-22T19:40:59.000Z | 2021-12-21T11:13:23.000Z | #! /usr/bin/python3
#
# Copyright (c) 2017 Intel Corporation
#
# SPDX-License-Identifier: Apache-2.0
#
# pylint: disable = missing-docstring
import os
import socket
import commonl.testing
import tcfl
import tcfl.tc
srcdir = os.path.dirname(__file__)
ttbd = commonl.testing.test_ttbd(config_files = [
# strip to remove the compiled/optimized version -> get source
os.path.join(srcdir, "conf_%s" % os.path.basename(__file__.rstrip('cd')))
])
@tcfl.tc.target(ttbd.url_spec)
class release_hooks(tcfl.tc.tc_c):
"""
We allocate a target, create tunnels and then we release it; when
released, the tunnels are destroyed.
"""
def eval(self, target):
target.tunnel.add(22, "127.0.0.1", 'tcp')
self.report_pass("release hooks were called on target release")
def teardown_90_scb(self):
ttbd.check_log_for_issues(self)
| 24.138889 | 77 | 0.700806 | 127 | 869 | 4.637795 | 0.685039 | 0.03056 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023743 | 0.176064 | 869 | 35 | 78 | 24.828571 | 0.798883 | 0.334868 | 0 | 0 | 0 | 0 | 0.116152 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0.0625 | 0.3125 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 2 |
b2a64ad7dcb9aaa41898aea3c2d8af7ef4fc0f3f | 1,582 | py | Python | template.py | deepak7376/design_pattern | 855aa0879d478f7b2682c2ae5e92599b5c81a1c6 | [
"MIT"
] | null | null | null | template.py | deepak7376/design_pattern | 855aa0879d478f7b2682c2ae5e92599b5c81a1c6 | [
"MIT"
] | null | null | null | template.py | deepak7376/design_pattern | 855aa0879d478f7b2682c2ae5e92599b5c81a1c6 | [
"MIT"
] | null | null | null | from abc import ABC, abstractmethod
class AverageCalculator(ABC):
def average(self):
try:
num_items = 0
total_sum = 0
while self.has_next():
total_sum += self.next_item()
num_items += 1
if num_items == 0:
raise RuntimeError("Can't compute the average of zero items.")
return total_sum / num_items
finally:
self.dispose()
@abstractmethod
def has_next(self):
pass
@abstractmethod
def next_item(self):
pass
def dispose(self):
pass
class FileAverageCalculator(AverageCalculator):
def __init__(self, file):
self.file = file
self.last_line = self.file.readline()
def has_next(self):
return self.last_line != ''
def next_item(self):
result = float(self.last_line)
self.last_line = self.file.readline()
return result
def dispose(self):
self.file.close()
class MemoryAverageCalculator(AverageCalculator):
def __init__(self, lst):
self.lst = lst
self.index = 0
def has_next(self):
return self.index<len(self.lst)
def next_item(self):
result = float(self.lst[self.index])
self.index+=1
return result
def dispose(self):
pass
mac = MemoryAverageCalculator([3, 1, 4, 1, 5, 9, 2, 6, 5, 3])
print(mac.average()) # Call the template method
# fac = FileAverageCalculator(open('data.txt'))
# print(fac.average()) # Call the template method | 21.972222 | 78 | 0.583439 | 188 | 1,582 | 4.765957 | 0.324468 | 0.044643 | 0.053571 | 0.046875 | 0.303571 | 0.183036 | 0.066964 | 0 | 0 | 0 | 0 | 0.014774 | 0.315424 | 1,582 | 72 | 79 | 21.972222 | 0.812558 | 0.073957 | 0 | 0.395833 | 0 | 0 | 0.027379 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0.083333 | 0.020833 | 0.041667 | 0.4375 | 0.020833 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
a22cbabe9b6d8f3afdad45c7ee147591f90ad7e9 | 3,406 | py | Python | src/npu/comprehension.py | feagi/feagi | 598abbe294b5d9cd7ff34861fa6568ba899b2ab8 | [
"Apache-2.0"
] | 1 | 2022-03-17T08:27:11.000Z | 2022-03-17T08:27:11.000Z | src/npu/comprehension.py | feagi/feagi | 598abbe294b5d9cd7ff34861fa6568ba899b2ab8 | [
"Apache-2.0"
] | 1 | 2022-02-10T16:30:35.000Z | 2022-02-10T16:33:21.000Z | src/npu/comprehension.py | feagi/feagi | 598abbe294b5d9cd7ff34861fa6568ba899b2ab8 | [
"Apache-2.0"
] | 1 | 2022-02-07T22:15:54.000Z | 2022-02-07T22:15:54.000Z |
# Copyright 2016-2022 The FEAGI Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
def utf_detection_logic(detection_list):
# todo: Add a logic to account for cases were two top ranked items are too close
# Identifies the detected UTF character with highest activity
highest_ranked_item = '-'
second_highest_ranked_item = '-'
for item in detection_list:
if highest_ranked_item == '-':
highest_ranked_item = item
else:
if detection_list[item]['rank'] > detection_list[highest_ranked_item]['rank']:
second_highest_ranked_item = highest_ranked_item
highest_ranked_item = item
elif second_highest_ranked_item == '-':
second_highest_ranked_item = item
else:
if detection_list[item]['rank'] > detection_list[second_highest_ranked_item]['rank']:
second_highest_ranked_item = item
# todo: export detection factor to genome not parameters
detection_tolerance = 1.5
if highest_ranked_item != '-' and second_highest_ranked_item == '-':
print("Highest ranking number was chosen.")
print("1st and 2nd highest ranked numbers are: ", highest_ranked_item, second_highest_ranked_item)
return highest_ranked_item
elif highest_ranked_item != '-' and \
second_highest_ranked_item != '-' and \
detection_list[second_highest_ranked_item]['rank'] != 0:
if detection_list[highest_ranked_item]['rank'] / detection_list[second_highest_ranked_item]['rank'] > \
detection_tolerance:
print("Highest ranking number was chosen.")
print("1st and 2nd highest ranked numbers are: ", highest_ranked_item, second_highest_ranked_item)
return highest_ranked_item
else:
print(">>>> >>> >> >> >> >> > > Tolerance factor was not met!! !! !!")
print("Highest and 2nd highest ranked numbers are: ", highest_ranked_item, second_highest_ranked_item)
return '-'
else:
return '-'
# list_length = len(detection_list)
# if list_length == 1:
# for key in detection_list:
# return key
# elif list_length >= 2 or list_length == 0:
# return '-'
# else:
# temp = []
# counter = 0
# # print(">><<>><<>><<", detection_list)
# for key in detection_list:
# temp[counter] = (key, detection_list[key])
# if temp[0][1] > (3 * temp[1][1]):
# return temp[0][0]
# elif temp[1][1] > (3 * temp[0][1]):
# return temp[1][0]
# else:
# return '-'
# Load copy of all MNIST training images into mnist_data in form of an iterator. Each object has image label + image
| 44.233766 | 120 | 0.620376 | 416 | 3,406 | 4.862981 | 0.314904 | 0.192783 | 0.226891 | 0.1478 | 0.460702 | 0.433515 | 0.420662 | 0.342561 | 0.26693 | 0.229857 | 0 | 0.015091 | 0.260716 | 3,406 | 76 | 121 | 44.815789 | 0.788324 | 0.426307 | 0 | 0.470588 | 0 | 0 | 0.152038 | 0 | 0 | 0 | 0 | 0.013158 | 0 | 1 | 0.029412 | false | 0 | 0 | 0 | 0.147059 | 0.176471 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
a22d14123c5934e462a7334c1d55b574adf6c9be | 3,403 | py | Python | 10-19/14. normalize_sentences/test_normalize_sentences.py | dcragusa/PythonMorsels | 5f75b51a68769036e4004e9ccdada6b220124ab6 | [
"MIT"
] | 1 | 2021-11-30T05:03:24.000Z | 2021-11-30T05:03:24.000Z | 10-19/14. normalize_sentences/test_normalize_sentences.py | dcragusa/PythonMorsels | 5f75b51a68769036e4004e9ccdada6b220124ab6 | [
"MIT"
] | null | null | null | 10-19/14. normalize_sentences/test_normalize_sentences.py | dcragusa/PythonMorsels | 5f75b51a68769036e4004e9ccdada6b220124ab6 | [
"MIT"
] | 2 | 2021-04-18T05:26:43.000Z | 2021-11-28T18:46:43.000Z | import unittest
from textwrap import dedent
from normalize_sentences import normalize_sentences
class NormalizeSentencesTests(unittest.TestCase):
"""Tests for normalize_sentences."""
maxDiff = 1000
def test_no_sentences(self):
sentence = "This isn't a sentence"
self.assertEqual(normalize_sentences(sentence), sentence)
def test_one_sentence(self):
sentence = "This is a sentence."
self.assertEqual(normalize_sentences(sentence), sentence)
def test_two_sentences(self):
sentences = ["Sentence 1.", "Sentence 2."]
self.assertEqual(
normalize_sentences(" ".join(sentences)),
" ".join(sentences),
)
def test_multiple_punctuation_marks(self):
sentences = ["Sentence 1!", "Sentence 2?", "Sentence 3."]
self.assertEqual(
normalize_sentences(" ".join(sentences)),
" ".join(sentences),
)
def test_multiple_paragraphs(self):
sentences = dedent("""
This is a paragraph. With two sentences in it.
And this is one. With three. Three short sentences.
""").strip()
expected = dedent("""
This is a paragraph. With two sentences in it.
And this is one. With three. Three short sentences.
""").strip()
self.assertEqual(
normalize_sentences(sentences),
expected,
)
# To test the Bonus part of this exercise, comment out the following line
# @unittest.expectedFailure
def test_no_extra_spaces(self):
sentences = """
Sentence 1. And two spaces after. But one space after this.
"""
expected = """
Sentence 1. And two spaces after. But one space after this.
"""
self.assertEqual(
normalize_sentences(sentences),
expected,
)
# To test the Bonus part of this exercise, comment out the following line
# @unittest.expectedFailure
def test_with_abbreviations_and_numbers(self):
sentences = "P.S. I like fish (e.g. salmon). That is all."
expected = "P.S. I like fish (e.g. salmon). That is all."
self.assertEqual(
normalize_sentences(sentences),
expected,
)
sentences = "I ate 5.5 oranges. They cost $.50 each. They were good."
expected = "I ate 5.5 oranges. They cost $.50 each. They were good."
self.assertEqual(
normalize_sentences(sentences),
expected,
)
# To test the Bonus part of this exercise, comment out the following line
# @unittest.expectedFailure
def test_excluded_words_work(self):
sentences = (
"Do you know about the work of Dr. Rosalind Franklin? You can "
"find out what she did by using google.com. Google is used by "
"1.17 billion people (as of December 2012). That's a lot people!"
)
expected = (
"Do you know about the work of Dr. Rosalind Franklin? You can "
"find out what she did by using google.com. Google is used by "
"1.17 billion people (as of December 2012). That's a lot people!"
)
self.assertEqual(
normalize_sentences(sentences),
expected,
)
if __name__ == "__main__":
unittest.main(verbosity=2)
| 33.362745 | 78 | 0.601234 | 394 | 3,403 | 5.088832 | 0.284264 | 0.107731 | 0.107731 | 0.14813 | 0.766085 | 0.766085 | 0.685287 | 0.685287 | 0.685287 | 0.685287 | 0 | 0.014462 | 0.309139 | 3,403 | 101 | 79 | 33.693069 | 0.838367 | 0.095504 | 0 | 0.506494 | 0 | 0 | 0.3642 | 0 | 0 | 0 | 0 | 0 | 0.116883 | 1 | 0.103896 | false | 0 | 0.038961 | 0 | 0.168831 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
a23aa98e817822c0db3ba0e76ac9fe51cc297075 | 486 | py | Python | Exercism/triangle/triangle.py | adityaarakeri/Interview-solved | e924011d101621c7121f4f86d82bee089f4c1e25 | [
"MIT"
] | 46 | 2019-10-14T01:21:35.000Z | 2022-01-08T23:55:15.000Z | Exercism/triangle/triangle.py | Siddhant-K-code/Interview-solved | e924011d101621c7121f4f86d82bee089f4c1e25 | [
"MIT"
] | 53 | 2019-10-03T17:16:43.000Z | 2020-12-08T12:48:19.000Z | Exercism/triangle/triangle.py | Siddhant-K-code/Interview-solved | e924011d101621c7121f4f86d82bee089f4c1e25 | [
"MIT"
] | 96 | 2019-10-03T18:12:10.000Z | 2021-03-14T19:41:06.000Z | def is_triangle(func):
def wrapped(sides):
if any(i <= 0 for i in sides):
return False
sum_ = sum(sides)
if any(sides[i] > sum_ - sides[i] for i in range(3)):
return False
return func(sides)
return wrapped
@is_triangle
def is_equilateral(sides):
return len(set(sides)) == 1
@is_triangle
def is_isosceles(sides):
return len(set(sides)) != 3
@is_triangle
def is_scalene(sides):
return len(set(sides)) == 3
| 19.44 | 61 | 0.602881 | 72 | 486 | 3.944444 | 0.319444 | 0.193662 | 0.137324 | 0.158451 | 0.239437 | 0.161972 | 0 | 0 | 0 | 0 | 0 | 0.014286 | 0.279835 | 486 | 24 | 62 | 20.25 | 0.797143 | 0 | 0 | 0.277778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.277778 | false | 0 | 0 | 0.166667 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
a2431b76a7fd7273de98b3d8241bb7216ee7d296 | 2,182 | py | Python | python/src/main/python/pygw/query/aggregation_query_builder.py | radiant-maxar/geowave | 2d9f39d32e4621c8f5965a4dffff0623c1c03231 | [
"Apache-2.0"
] | 280 | 2017-06-14T01:26:19.000Z | 2022-03-28T15:45:23.000Z | python/src/main/python/pygw/query/aggregation_query_builder.py | radiant-maxar/geowave | 2d9f39d32e4621c8f5965a4dffff0623c1c03231 | [
"Apache-2.0"
] | 458 | 2017-06-12T20:00:59.000Z | 2022-03-31T04:41:59.000Z | python/src/main/python/pygw/query/aggregation_query_builder.py | radiant-maxar/geowave | 2d9f39d32e4621c8f5965a4dffff0623c1c03231 | [
"Apache-2.0"
] | 135 | 2017-06-12T20:39:34.000Z | 2022-03-15T13:42:30.000Z | #
# Copyright (c) 2013-2020 Contributors to the Eclipse Foundation
#
# See the NOTICE file distributed with this work for additional information regarding copyright
# ownership. All rights reserved. This program and the accompanying materials are made available
# under the terms of the Apache License, Version 2.0 which accompanies this distribution and is
# available at http://www.apache.org/licenses/LICENSE-2.0.txt
# ===============================================================================================
from .base_query_builder import BaseQueryBuilder
from .aggregation_query import AggregationQuery
from ..base.type_conversions import StringArrayType
class AggregationQueryBuilder(BaseQueryBuilder):
"""
A builder for creating aggregation queries. This class should not be used directly. Instead, use one of the derived
classes such as `pygw.query.vector.VectorAggregationQueryBuilder`.
"""
def __init__(self, java_ref):
super().__init__(java_ref)
def count(self, *type_names):
"""
This is a convenience method to set the count aggregation if no type names are given it is
assumed to count every type.
Args:
type_names (str): The type names to count results.
Returns:
This query builder.
"""
if type_names is None:
self._java_ref.count()
else:
self._java_ref.count(StringArrayType().to_java(type_names))
return self
def aggregate(self, type_name, j_aggregation):
"""
Provide the Java Aggregation function and the type name to apply the aggregation on.
Args:
type_name (str): The type name to aggregate.
j_aggregation (Aggregation): The Java aggregation function to
Returns:
This query builder.
"""
return self._java_ref.aggregate(type_name, j_aggregation)
def build(self):
"""
Builds the configured aggregation query.
Returns:
The final constructed `pygw.query.AggregationQuery`.
"""
return AggregationQuery(self._java_ref.build(), self._java_transformer)
| 35.193548 | 120 | 0.651696 | 255 | 2,182 | 5.439216 | 0.458824 | 0.034607 | 0.039654 | 0.033165 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007233 | 0.239688 | 2,182 | 61 | 121 | 35.770492 | 0.828813 | 0.572869 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.1875 | 0 | 0.6875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
a25a329785c9f77e159427cefe14e85a15f3128c | 157 | py | Python | ch02/number_eight.py | joy-joy/pcc | 6c7d166a1694a2c3f371307aea6c4bdf340c4c42 | [
"MIT"
] | null | null | null | ch02/number_eight.py | joy-joy/pcc | 6c7d166a1694a2c3f371307aea6c4bdf340c4c42 | [
"MIT"
] | null | null | null | ch02/number_eight.py | joy-joy/pcc | 6c7d166a1694a2c3f371307aea6c4bdf340c4c42 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Tue Jan 9 00:00:43 2018
@author: joy
"""
print(5 + 3)
print(9 - 1)
print(2 * 4)
print(16//2) | 13.083333 | 35 | 0.573248 | 30 | 157 | 3 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173228 | 0.191083 | 157 | 12 | 36 | 13.083333 | 0.535433 | 0.592357 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
a25c1f80b839438c40bc8b1ec20e3dcbcc9d3fa1 | 181 | py | Python | proxy_config.py | Nou4r/YandexMail-Account-Creator | b65f24630d23c59dfb8d196f3efe5a222aa3e11a | [
"MIT"
] | 1 | 2021-11-23T05:28:16.000Z | 2021-11-23T05:28:16.000Z | proxy_config.py | Nou4r/YandexMail-Account-Creator | b65f24630d23c59dfb8d196f3efe5a222aa3e11a | [
"MIT"
] | null | null | null | proxy_config.py | Nou4r/YandexMail-Account-Creator | b65f24630d23c59dfb8d196f3efe5a222aa3e11a | [
"MIT"
] | null | null | null | try:
with open('proxies.txt', 'r') as file:
proxy = [ line.rstrip() for line in file.readlines()]
except FileNotFoundError:
raise Exception('Proxies.txt not found.') | 36.2 | 61 | 0.662983 | 24 | 181 | 5 | 0.833333 | 0.166667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.19337 | 181 | 5 | 62 | 36.2 | 0.821918 | 0 | 0 | 0 | 0 | 0 | 0.186813 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
a25efb76b91de6c5a6535d8621723808a44381dd | 8,046 | py | Python | dilami_calendar/constants.py | Jangal/python-deylami-calendar | 65b4a36ea6d9cba71b7086b3c488fd6842ead687 | [
"MIT"
] | 12 | 2019-08-05T19:11:24.000Z | 2021-11-17T03:52:12.000Z | dilami_calendar/constants.py | Jangal/python-dilami-calendar | 65b4a36ea6d9cba71b7086b3c488fd6842ead687 | [
"MIT"
] | 2 | 2019-08-03T05:42:02.000Z | 2021-12-01T07:34:26.000Z | dilami_calendar/constants.py | Jangal/python-dilami-calendar | 65b4a36ea6d9cba71b7086b3c488fd6842ead687 | [
"MIT"
] | null | null | null | DILAMI_WEEKDAY_NAMES = {
0: "شمبه",
1: "یکشمبه",
2: "دۊشمبه",
3: "سۊشمبه",
4: "چارشمبه",
5: "پئنشمبه",
6: "جۊمه",
}
DILAMI_MONTH_NAMES = {
0: "پنجيک",
1: "نؤرۊز ما",
2: "کۊرچ ٚ ما",
3: "أرئه ما",
4: "تیر ما",
5: "مۊردال ما",
6: "شریرما",
7: "أمیر ما",
8: "آول ما",
9: "سیا ما",
10: "دیا ما",
11: "ورفن ٚ ما",
12: "اسفندار ما",
}
DILAMI_LEAP_YEARS = (
199,
203,
207,
211,
215,
220,
224,
228,
232,
236,
240,
244,
248,
253,
257,
261,
265,
269,
273,
277,
281,
286,
290,
294,
298,
302,
306,
310,
315,
319,
323,
327,
331,
335,
339,
343,
348,
352,
356,
360,
364,
368,
372,
376,
381,
385,
389,
393,
397,
401,
405,
409,
414,
418,
422,
426,
430,
434,
438,
443,
447,
451,
455,
459,
463,
467,
471,
476,
480,
484,
488,
492,
496,
500,
504,
509,
513,
517,
521,
525,
529,
533,
537,
542,
546,
550,
554,
558,
562,
566,
571,
575,
579,
583,
587,
591,
595,
599,
604,
608,
612,
616,
620,
624,
628,
632,
637,
641,
645,
649,
653,
657,
661,
665,
669,
674,
678,
682,
686,
690,
694,
698,
703,
707,
711,
715,
719,
723,
727,
731,
736,
740,
744,
748,
752,
756,
760,
764,
769,
773,
777,
781,
785,
789,
793,
797,
802,
806,
810,
814,
818,
822,
826,
831,
835,
839,
843,
847,
851,
855,
859,
864,
868,
872,
876,
880,
884,
888,
892,
897,
901,
905,
909,
913,
917,
921,
925,
930,
934,
938,
942,
946,
950,
954,
959,
963,
967,
971,
975,
979,
983,
987,
992,
996,
1000,
1004,
1008,
1012,
1016,
1020,
1025,
1029,
1033,
1037,
1041,
1045,
1049,
1053,
1058,
1062,
1066,
1070,
1074,
1078,
1082,
1087,
1091,
1095,
1099,
1103,
1107,
1111,
1115,
1120,
1124,
1128,
1132,
1136,
1140,
1144,
1148,
1153,
1157,
1161,
1165,
1169,
1173,
1177,
1181,
1186,
1190,
1194,
1198,
1202,
1206,
1210,
1215,
1219,
1223,
1227,
1231,
1235,
1239,
1243,
1248,
1252,
1256,
1260,
1264,
1268,
1272,
1276,
1281,
1285,
1289,
1293,
1297,
1301,
1305,
1309,
1314,
1318,
1322,
1326,
1330,
1334,
1338,
1343,
1347,
1351,
1355,
1359,
1363,
1367,
1371,
1376,
1380,
1384,
1388,
1392,
1396,
1400,
1404,
1409,
1413,
1417,
1421,
1425,
1429,
1433,
1437,
1442,
1446,
1450,
1454,
1458,
1462,
1466,
1471,
1475,
1479,
1483,
1487,
1491,
1495,
1499,
1504,
1508,
1512,
1516,
1520,
1524,
1528,
1532,
1537,
1541,
1545,
1549,
1553,
1557,
1561,
1565,
1570,
1574,
1578,
1582,
1586,
1590,
1594,
1599,
1603,
1607,
1611,
1615,
1619,
1623,
1627,
1632,
1636,
1640,
1644,
1648,
1652,
1656,
1660,
1665,
1669,
1673,
1677,
1681,
1685,
1689,
1693,
1698,
1702,
1706,
1710,
1714,
1718,
1722,
1727,
1731,
1735,
1739,
1743,
1747,
1751,
1755,
1760,
1764,
1768,
1772,
1776,
1780,
1784,
1788,
1793,
1797,
1801,
1805,
1809,
1813,
1817,
1821,
1826,
1830,
1834,
1838,
1842,
1846,
1850,
1855,
1859,
1863,
1867,
1871,
1875,
1879,
1883,
1888,
1892,
1896,
1900,
1904,
1908,
1912,
1916,
1921,
1925,
1929,
1933,
1937,
1941,
1945,
1949,
1954,
1958,
1962,
1966,
1970,
1974,
1978,
1983,
1987,
1991,
1995,
1999,
2003,
2007,
2011,
2016,
2020,
2024,
2028,
2032,
2036,
2040,
2044,
2049,
2053,
2057,
2061,
2065,
2069,
2073,
2077,
2082,
2086,
2090,
2094,
2098,
2102,
2106,
2111,
2115,
2119,
2123,
2127,
2131,
2135,
2139,
2144,
2148,
2152,
2156,
2160,
2164,
2168,
2172,
2177,
2181,
2185,
2189,
2193,
2197,
2201,
2205,
2210,
2214,
2218,
2222,
2226,
2230,
2234,
2239,
2243,
2247,
2251,
2255,
2259,
2263,
2267,
2272,
2276,
2280,
2284,
2288,
2292,
2296,
2300,
2305,
2309,
2313,
2317,
2321,
2325,
2329,
2333,
2338,
2342,
2346,
2350,
2354,
2358,
2362,
2367,
2371,
2375,
2379,
2383,
2387,
2391,
2395,
2400,
2404,
2408,
2412,
2416,
2420,
2424,
2428,
2433,
2437,
2441,
2445,
2449,
2453,
2457,
2461,
2466,
2470,
2474,
2478,
2482,
2486,
2490,
2495,
2499,
2503,
2507,
2511,
2515,
2519,
2523,
2528,
2532,
2536,
2540,
2544,
2548,
2552,
2556,
2561,
2565,
2569,
2573,
2577,
2581,
2585,
2589,
2594,
2598,
2602,
2606,
2610,
2614,
2618,
2623,
2627,
2631,
2635,
2639,
2643,
2647,
2651,
2656,
2660,
2664,
2668,
2672,
2676,
2680,
2684,
2689,
2693,
2697,
2701,
2705,
2709,
2713,
2717,
2722,
2726,
2730,
2734,
2738,
2742,
2746,
2751,
2755,
2759,
2763,
2767,
2771,
2775,
2779,
2784,
2788,
2792,
2796,
2800,
2804,
2808,
2812,
2817,
2821,
2825,
2829,
2833,
2837,
2841,
2845,
2850,
2854,
2858,
2862,
2866,
2870,
2874,
2879,
2883,
2887,
2891,
2895,
2899,
2903,
2907,
2912,
2916,
2920,
2924,
2928,
2932,
2936,
2940,
2945,
2949,
2953,
2957,
2961,
2965,
2969,
2973,
2978,
2982,
2986,
2990,
2994,
2998,
3002,
3007,
3011,
3015,
3019,
3023,
3027,
3031,
3035,
3040,
3044,
3048,
3052,
3056,
3060,
3064,
3068,
3073,
3077,
3081,
3085,
3089,
3093,
3097,
3101,
3106,
3110,
3114,
3118,
3122,
3126,
3130,
3135,
3139,
3143,
3147,
3151,
3155,
3159,
3163,
3168,
3172,
3176,
3180,
3184,
3188,
3192,
3196,
3201,
3205,
3209,
3213,
3217,
3221,
3225,
3229,
3234,
3238,
3242,
3246,
3250,
3254,
3258,
3263,
3267,
3271,
3275,
3279,
3283,
3287,
3291,
3296,
3300,
3304,
3308,
3312,
3316,
3320,
3324,
3329,
3333,
3337,
3341,
3345,
3349,
3353,
3357,
3362,
3366,
3370,
)
#: Minimum year supported by the library.
MINYEAR = 195
#: Maximum year supported by the library.
MAXYEAR = 3372
| 10.007463 | 41 | 0.393239 | 847 | 8,046 | 3.730815 | 0.969303 | 0.003797 | 0.009494 | 0.011392 | 0.015823 | 0 | 0 | 0 | 0 | 0 | 0 | 0.722222 | 0.498881 | 8,046 | 803 | 42 | 10.019925 | 0.061012 | 0.009943 | 0 | 0 | 0 | 0 | 0.016826 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
a25fceaa81b9a2397bbf59a5c9765ebd1d84a0d6 | 324 | py | Python | inputs/sineClock.py | hongaar/ringctl | 9e2adbdf16e85852019466e42be9d88a9e63cde5 | [
"MIT"
] | null | null | null | inputs/sineClock.py | hongaar/ringctl | 9e2adbdf16e85852019466e42be9d88a9e63cde5 | [
"MIT"
] | null | null | null | inputs/sineClock.py | hongaar/ringctl | 9e2adbdf16e85852019466e42be9d88a9e63cde5 | [
"MIT"
] | null | null | null | import math
from inputs.sine import Sine
from inputs.timeElapsed import TimeElapsed
from utils.number import Number
class SineClock(Number):
def __init__(self, sine: Sine):
self.__sine = sine
self.__elapsed = TimeElapsed()
def get(self):
return self.__sine.at_time(self.__elapsed.get())
| 20.25 | 56 | 0.70679 | 42 | 324 | 5.142857 | 0.428571 | 0.111111 | 0.111111 | 0.148148 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.20679 | 324 | 15 | 57 | 21.6 | 0.840467 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.4 | 0.1 | 0.8 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
a26b73d904e11aae41e76e1fb93f09e8f345dc84 | 534 | py | Python | projects/cassava-leaf-disease/code/src/config.py | dric2018/coding-room | ff538ed16d09ab4918d1b0d55aef09fe95b1078a | [
"MIT"
] | 1 | 2021-02-02T08:30:50.000Z | 2021-02-02T08:30:50.000Z | projects/cassava-leaf-disease/code/src/.ipynb_checkpoints/config-checkpoint.py | dric2018/coding-room | ff538ed16d09ab4918d1b0d55aef09fe95b1078a | [
"MIT"
] | null | null | null | projects/cassava-leaf-disease/code/src/.ipynb_checkpoints/config-checkpoint.py | dric2018/coding-room | ff538ed16d09ab4918d1b0d55aef09fe95b1078a | [
"MIT"
] | 1 | 2021-03-09T14:27:00.000Z | 2021-03-09T14:27:00.000Z | import os
class Config:
data_dir = os.path.abspath('../data/input/')
models_dir = os.path.abspath('../models')
logs_dir = os.path.abspath('../logs')
train_data_dir = os.path.abspath('../data/input/train_images')
test_data_dir = os.path.abspath('../data/input/test_images')
num_epochs = 15
lr = 2e-2
resize = 500
img_h = 400
img_w = 400
weight_decay = .01
eps = 1e-8
train_batch_size = 16
test_batch_size = 16
base_model = 'resnet34'
seed_val = 2021
num_workers = 2 | 25.428571 | 66 | 0.629213 | 80 | 534 | 3.95 | 0.525 | 0.079114 | 0.142405 | 0.253165 | 0.275316 | 0.275316 | 0.275316 | 0 | 0 | 0 | 0 | 0.068293 | 0.23221 | 534 | 21 | 67 | 25.428571 | 0.702439 | 0 | 0 | 0 | 0 | 0 | 0.166355 | 0.095327 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.052632 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
a26c405342f3cf01116c7589d07a48162ad6f4f5 | 1,265 | py | Python | midburn/migrations/0007_auto_20160116_0902.py | mtr574/projectMidbrunFirstReg | 2569c3f07e1af746bfc1f213632708c76d8fc829 | [
"Apache-2.0"
] | null | null | null | midburn/migrations/0007_auto_20160116_0902.py | mtr574/projectMidbrunFirstReg | 2569c3f07e1af746bfc1f213632708c76d8fc829 | [
"Apache-2.0"
] | 1 | 2016-01-22T09:32:04.000Z | 2016-01-22T12:14:12.000Z | midburn/migrations/0007_auto_20160116_0902.py | mtr574/projectMidbrunFirstReg | 2569c3f07e1af746bfc1f213632708c76d8fc829 | [
"Apache-2.0"
] | 3 | 2016-11-04T12:10:03.000Z | 2017-02-23T08:52:53.000Z | # -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('midburn', '0006_auto_20160116_0842'),
]
operations = [
migrations.AlterField(
model_name='camp',
name='contact_email',
field=models.CharField(max_length=254, blank=True, unique=True, default=''),
preserve_default=False,
),
migrations.AlterField(
model_name='camp',
name='contact_facebook',
field=models.CharField(max_length=254, blank=True, default=''),
preserve_default=False,
),
migrations.AlterField(
model_name='camp',
name='contact_name_en',
field=models.CharField(max_length=50, blank=True),
),
migrations.AlterField(
model_name='camp',
name='contact_name_he',
field=models.CharField(max_length=50, blank=True),
),
migrations.AlterField(
model_name='camp',
name='contact_phone',
field=models.CharField(max_length=50, blank=True, default=''),
preserve_default=False,
),
]
| 29.418605 | 88 | 0.575494 | 121 | 1,265 | 5.785124 | 0.355372 | 0.142857 | 0.178571 | 0.207143 | 0.735714 | 0.735714 | 0.697143 | 0.634286 | 0.454286 | 0.454286 | 0 | 0.033067 | 0.306719 | 1,265 | 42 | 89 | 30.119048 | 0.765108 | 0.016601 | 0 | 0.555556 | 0 | 0 | 0.098229 | 0.018519 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.055556 | 0 | 0.138889 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
a27af76ac557d5a5a06d9803200c94099e5080e2 | 301 | py | Python | scikit/Adaboost/example.py | JayMiao/MLAction | fec1c08fa33ed1f5d9b0befecc6dac551cc02302 | [
"MIT"
] | 1 | 2017-02-13T10:25:11.000Z | 2017-02-13T10:25:11.000Z | scikit/Adaboost/example.py | JayMiao/MLAction | fec1c08fa33ed1f5d9b0befecc6dac551cc02302 | [
"MIT"
] | null | null | null | scikit/Adaboost/example.py | JayMiao/MLAction | fec1c08fa33ed1f5d9b0befecc6dac551cc02302 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.ensemble import AdaBoostClassifier
iris = load_iris()
clf = AdaBoostClassifier(n_estimators=1000)
scores = cross_val_score(clf, iris.data, iris.target)
print scores.mean() | 30.1 | 53 | 0.800664 | 42 | 301 | 5.547619 | 0.595238 | 0.141631 | 0.111588 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018519 | 0.10299 | 301 | 10 | 54 | 30.1 | 0.844444 | 0.069767 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.428571 | null | null | 0.142857 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
a280eaab2887649d537621914d70995f7a90e0ab | 327 | py | Python | rotary/rotary/doctype/monthly_report/monthly_report.py | neilLasrado/rotary | 66659b41c6fbd04d22aa368573c786dabe1102e5 | [
"MIT"
] | null | null | null | rotary/rotary/doctype/monthly_report/monthly_report.py | neilLasrado/rotary | 66659b41c6fbd04d22aa368573c786dabe1102e5 | [
"MIT"
] | null | null | null | rotary/rotary/doctype/monthly_report/monthly_report.py | neilLasrado/rotary | 66659b41c6fbd04d22aa368573c786dabe1102e5 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# Copyright (c) 2015, Neil Lasrado and contributors
# For license information, please see license.txt
from __future__ import unicode_literals
import frappe
from frappe.model.document import Document
from frappe.utils import now
class MonthlyReport(Document):
def on_submit(self):
self.date = now()
| 25.153846 | 51 | 0.770642 | 45 | 327 | 5.466667 | 0.733333 | 0.081301 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017794 | 0.140673 | 327 | 12 | 52 | 27.25 | 0.857651 | 0.363914 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.571429 | 0 | 0.857143 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
a281068a96d517af66fbb0b7cc8c9a41a817af13 | 109 | py | Python | kili/mutations/project_version/fragments.py | ASonay/kili-playground | 9624073703b5e6151cf496f44f17f531576875b7 | [
"Apache-2.0"
] | 214 | 2019-08-05T14:55:01.000Z | 2022-03-28T21:02:22.000Z | kili/mutations/project_version/fragments.py | x213212/kili-playground | dfb94c2d54bedfd7fec452b91f811587a2156c13 | [
"Apache-2.0"
] | 10 | 2020-05-14T10:44:16.000Z | 2022-03-08T09:39:24.000Z | kili/mutations/project_version/fragments.py | x213212/kili-playground | dfb94c2d54bedfd7fec452b91f811587a2156c13 | [
"Apache-2.0"
] | 19 | 2019-11-26T22:41:09.000Z | 2022-01-16T19:17:38.000Z | """
Fragments of project version mutations
"""
PROJECT_VERSION_FRAGMENT = '''
content
id
name
projectId
'''
| 9.909091 | 38 | 0.733945 | 12 | 109 | 6.5 | 0.833333 | 0.358974 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.146789 | 109 | 10 | 39 | 10.9 | 0.83871 | 0.348624 | 0 | 0 | 0 | 0 | 0.428571 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
a281d5a3c0cadb9b0e4f53931b714575ab5662a4 | 169 | py | Python | test/test.py | ttkltll/fisher | 8889705c7bde10304cfde7972b805226ac59d735 | [
"MIT"
] | null | null | null | test/test.py | ttkltll/fisher | 8889705c7bde10304cfde7972b805226ac59d735 | [
"MIT"
] | 3 | 2020-09-15T23:37:18.000Z | 2020-09-16T00:36:55.000Z | test/test.py | ttkltll/fisher | 8889705c7bde10304cfde7972b805226ac59d735 | [
"MIT"
] | 1 | 2020-09-15T02:55:54.000Z | 2020-09-15T02:55:54.000Z | from flask import Flask, current_app, request, Request
app = Flask(__name__)
ctx = app.app_context()
ctx.push()
current_app.static_floder = 'static'
ctx.pop()
app.run
| 16.9 | 54 | 0.751479 | 26 | 169 | 4.576923 | 0.538462 | 0.168067 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.12426 | 169 | 9 | 55 | 18.777778 | 0.804054 | 0 | 0 | 0 | 0 | 0 | 0.035714 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.142857 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
a2a9aa208a483f748111656782f9fb6afead659b | 910 | py | Python | tests/test_client.py | nyush-se-spring2021-forum/OurTieba | e7c5d75686e9cfda35050b8e40166d8a1d6ae83d | [
"MIT"
] | null | null | null | tests/test_client.py | nyush-se-spring2021-forum/OurTieba | e7c5d75686e9cfda35050b8e40166d8a1d6ae83d | [
"MIT"
] | 20 | 2021-04-08T11:16:51.000Z | 2021-05-21T16:17:51.000Z | tests/test_client.py | nyush-se-spring2021-forum/OurTieba | e7c5d75686e9cfda35050b8e40166d8a1d6ae83d | [
"MIT"
] | null | null | null | class TestClient:
"""
Test before_request and after_request decorators in __init__.py.
"""
def test_1(self, client): # disallowed methods
res = client.put("/")
assert res.status_code == 405
assert b"Method Not Allowed" in res.content
res = client.options("/api/post/add")
assert res.status_code == 405
assert b"Method Not Allowed" in res.content
res = client.delete("/notifications")
assert res.status_code == 405
assert b"Method Not Allowed" in res.content
def test_2(self, client): # empty/fake user agent
res = client.get("/", headers={"User-Agent": ""})
assert res.status_code == 403
assert b"No Scrappers!" in res.content
res = client.get("/board/2", headers={"User-Agent": "python/3.8"})
assert res.status_code == 403
assert b"No Scrappers!" in res.content
| 33.703704 | 74 | 0.615385 | 121 | 910 | 4.520661 | 0.396694 | 0.082267 | 0.137112 | 0.173675 | 0.552102 | 0.535649 | 0.535649 | 0.535649 | 0.535649 | 0.535649 | 0 | 0.029762 | 0.261538 | 910 | 26 | 75 | 35 | 0.784226 | 0.116484 | 0 | 0.555556 | 0 | 0 | 0.186785 | 0 | 0 | 0 | 0 | 0 | 0.555556 | 1 | 0.111111 | false | 0 | 0 | 0 | 0.166667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
a2b509c564ad0b2601ed7a285ba7c94de901b242 | 754 | py | Python | 01_numbers.py | fernandobd42/Introduction_Python | 7a656df1341bda4e657baa146c28b98bef211fc6 | [
"OLDAP-2.5",
"Python-2.0",
"OLDAP-2.4",
"OLDAP-2.3"
] | 1 | 2016-10-02T00:51:43.000Z | 2016-10-02T00:51:43.000Z | 01_numbers.py | fernandobd42/Introduction_Python | 7a656df1341bda4e657baa146c28b98bef211fc6 | [
"OLDAP-2.5",
"Python-2.0",
"OLDAP-2.4",
"OLDAP-2.3"
] | null | null | null | 01_numbers.py | fernandobd42/Introduction_Python | 7a656df1341bda4e657baa146c28b98bef211fc6 | [
"OLDAP-2.5",
"Python-2.0",
"OLDAP-2.4",
"OLDAP-2.3"
] | null | null | null | x = 1 #x recebe 1
#print irá printar(mostrar) o valor desejado;
print(x) # resultado: 1
print(x + 4) # é possível somar uma variável com um número, desde que a variável tenha um valor definido - resultado: 5
print(2 * 2) #um asterisco é usado para multiplicar - resultado: 4
print(3 ** 3) #dois asterisco é usado para elevar a potência - resultado: 27
print(5 / 2) #divisão com uma barra é usado para retornar tipo flutuante - resultado = 2.5
print(5 // 2) #divisão com duas barras é usado para retorna tipo inteiro - resultado = 2;
print(5 % 2) #modulo é usado para retornar o resto da divisão - resultado = 1;
#OBS: o uso de '=' atribui um valor a uma variável, já o uso de '==' compara os valores;
y = 1 #1 é atribuído para y;
y == 1 #y é igual a 1?
| 53.857143 | 119 | 0.706897 | 137 | 754 | 3.890511 | 0.430657 | 0.056285 | 0.093809 | 0.071295 | 0.06379 | 0 | 0 | 0 | 0 | 0 | 0 | 0.043118 | 0.200265 | 754 | 13 | 120 | 58 | 0.840796 | 0.823607 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.7 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.