hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
40ceddedb6eaf8c2406ca195dc49b3ad000c971a | 417 | py | Python | src/blacksmith/service/_sync/base.py | mardiros/blacksmith | c86a870da04b0d916f243cb51f8861529284337d | [
"BSD-3-Clause"
] | 15 | 2022-01-16T15:23:23.000Z | 2022-01-20T21:42:53.000Z | src/blacksmith/service/_sync/base.py | mardiros/blacksmith | c86a870da04b0d916f243cb51f8861529284337d | [
"BSD-3-Clause"
] | 9 | 2022-01-11T19:42:42.000Z | 2022-01-26T20:24:23.000Z | src/blacksmith/service/_sync/base.py | mardiros/blacksmith | c86a870da04b0d916f243cb51f8861529284337d | [
"BSD-3-Clause"
] | null | null | null | from typing import Optional
from blacksmith.domain.typing import SyncMiddleware
from blacksmith.typing import Proxies
class SyncAbstractTransport(SyncMiddleware):
verify_certificate: bool
proxies: Optional[Proxies]
def __init__(
self, verify_certificate: bool = True, proxies: Optional[Proxies] = None
):
self.verify_certificate = verify_certificate
self.proxies = proxies
| 26.0625 | 80 | 0.748201 | 43 | 417 | 7.069767 | 0.418605 | 0.223684 | 0.138158 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.191847 | 417 | 15 | 81 | 27.8 | 0.902077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.272727 | 0 | 0.636364 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
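The row above stores a small abstract-transport base class from blacksmith. A minimal, self-contained sketch of the same pattern — the `Proxies` stand-in, `DummyTransport`, and `request` are illustrative names, not the real `blacksmith` API:

```python
from typing import Dict, Optional

# Stand-in for blacksmith's Proxies type: a mapping of scheme -> proxy URL.
Proxies = Dict[str, str]


class SyncAbstractTransport:
    """Base class holding TLS-verification and proxy settings."""
    verify_certificate: bool
    proxies: Optional[Proxies]

    def __init__(self, verify_certificate: bool = True,
                 proxies: Optional[Proxies] = None):
        self.verify_certificate = verify_certificate
        self.proxies = proxies


class DummyTransport(SyncAbstractTransport):
    """Concrete subclass that would perform the actual HTTP request."""
    def request(self, url: str) -> str:
        return f"GET {url} (verify={self.verify_certificate})"


transport = DummyTransport(proxies={"https": "http://proxy:3128"})
print(transport.request("https://example.com"))
```

Subclasses inherit the constructor, so transport settings are configured once in the base class and every concrete transport gets them for free.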
40d998b95d36ea3b60c75bb2d30aae4f2366834c | 5,056 | py | Python | gadfly/DLispShort.py | moibenko/enstore | 6f2ff5b67ff73872a9e68f2a68b0bdaa70cef9b9 | [
"Intel",
"Unlicense"
] | 4 | 2021-10-17T11:17:59.000Z | 2022-02-28T16:58:40.000Z | gadfly/DLispShort.py | moibenko/enstore | 6f2ff5b67ff73872a9e68f2a68b0bdaa70cef9b9 | [
"Intel",
"Unlicense"
] | 17 | 2021-10-05T21:44:06.000Z | 2022-03-31T16:58:40.000Z | gadfly/DLispShort.py | moibenko/enstore | 6f2ff5b67ff73872a9e68f2a68b0bdaa70cef9b9 | [
"Intel",
"Unlicense"
] | 8 | 2021-09-02T18:55:49.000Z | 2022-03-09T21:05:28.000Z | # Grammar generation
# for lisp lists with strings, ints, vars, print, and setq
# set this variable to regenerate the grammar on each load
REGENERATEONLOAD = 1
import string
GRAMMARSTRING ="""
Value :: ## indicates Value is the root nonterminal for the grammar
@R SetqRule :: Value >> ( setq var Value )
@R ListRule :: Value >> ( ListTail
@R TailFull :: ListTail >> Value ListTail
@R TailEmpty :: ListTail >> )
@R Varrule :: Value >> var
@R Intrule :: Value >> int
@R Strrule :: Value >> str
@R PrintRule :: Value >> ( print Value )
"""
COMPILEDFILENAME = "TESTLispG.py"
MARSHALLEDFILENAME = "TESTLispG.mar"
LISPCOMMENTREGEX = ";.*"
INTREGEX = "["+string.digits+"]+"
STRREGEX = '"[^\n"]*"'
VARREGEX = "["+string.letters+"]["+string.letters+string.digits+"]*"
### declare interpretation functions and regex's for terminals
def intInterp( str ):
return string.atoi(str)
def stripQuotes( str ):
return str[1:len(str)-1]
def echo(string):
return string
def DeclareTerminals(Grammar):
Grammar.Addterm("int", INTREGEX, intInterp)
Grammar.Addterm("str", STRREGEX, stripQuotes)
Grammar.Addterm("var", VARREGEX, echo)
### declare the rule reduction interpretation functions.
def EchoValue( list, Context ):
return list[0]
def VarValue( list, Context ):
varName = list[0]
if Context.has_key(varName):
return Context[varName]
else:
raise NameError, "no such lisp variable in context "+varName
def NilTail( list, Context ):
return []
def AddToList( list, Context ):
return [ list[0] ] + list[1]
def MakeList( list, Context ):
return list[1]
def DoSetq( list, Context):
Context[ list[2] ] = list[3]
return list[3]
def DoPrint( list, Context ):
print list[2]
return list[2]
def BindRules(Grammar):
Grammar.Bind( "Intrule", EchoValue )
Grammar.Bind( "Strrule", EchoValue )
Grammar.Bind( "Varrule", VarValue )
Grammar.Bind( "TailEmpty", NilTail )
Grammar.Bind( "TailFull", AddToList )
Grammar.Bind( "ListRule", MakeList )
Grammar.Bind( "SetqRule", DoSetq )
Grammar.Bind( "PrintRule", DoPrint )
# This function generates the grammar and dumps it to a file.
def GrammarBuild():
import kjParseBuild
LispG = kjParseBuild.NullCGrammar()
LispG.SetCaseSensitivity(0) # grammar is not case sensitive for keywords
DeclareTerminals(LispG)
LispG.Keywords("setq print")
LispG.punct("().")
LispG.Nonterms("Value ListTail")
LispG.comments([LISPCOMMENTREGEX])
LispG.Declarerules(GRAMMARSTRING)
LispG.Compile()
print "dumping as python to "+COMPILEDFILENAME
outfile = open(COMPILEDFILENAME, "w")
LispG.Reconstruct("LispG",outfile,"GRAMMAR")
outfile.close()
print "dumping as binary to "+MARSHALLEDFILENAME
outfile = open(MARSHALLEDFILENAME, "w")
LispG.MarshalDump(outfile)
outfile.close()
BindRules(LispG)
return LispG
# this function initializes the compiled grammar from the generated file.
def LoadLispG():
import TESTLispG
# reload to make sure we get the most recent version!
# (only needed when debugging the grammar).
reload(TESTLispG)
LispG = TESTLispG.GRAMMAR()
DeclareTerminals(LispG)
BindRules(LispG)
return LispG
def unMarshalLispG():
import kjParser
infile = open(MARSHALLEDFILENAME, "r")
LispG = kjParser.UnMarshalGram(infile)
infile.close()
DeclareTerminals(LispG)
BindRules(LispG)
return LispG
########## test the grammar generation
if REGENERATEONLOAD:
print "(re)generating the LispG grammar in file TESTLispG.py"
Dummy = GrammarBuild()
print "(re)generation done."
print "loading grammar as python"
LispG = LoadLispG()
### declare an initial context, and do some tests.
Context = { 'x':3 }
test1 = LispG.DoParse1( '()', Context)
test2 = LispG.DoParse1( '(123)', Context)
test3 = LispG.DoParse1( '(x)', Context)
test4 = LispG.DoParse1( '" a string "', Context)
test5 = LispG.DoParse1( '(setq y (1 2 3) )', Context )
test6 = LispG.DoParse1( '(SeTq x ("a string" "another" 0))', Context )
test7str = """
; this is a lisp comment
(setq abc (("a" x)
("b" (setq d 12))
("c" y) ) ; another lisp comment
)
"""
test7 = LispG.DoParse1( test7str, Context)
test8 = LispG.DoParse1( '(print (1 x d))', Context)
print "unmarshalling the grammar"
LispG2 = unMarshalLispG()
### declare an initial context, and do some tests.
Context = { 'x':3 }
test1 = LispG2.DoParse1( '()', Context)
test2 = LispG2.DoParse1( '(123)', Context)
test3 = LispG2.DoParse1( '(x)', Context)
test4 = LispG2.DoParse1( '" a string "', Context)
test5 = LispG2.DoParse1( '(setq y (1 2 3) )', Context )
test6 = LispG2.DoParse1( '(SeTq x ("a string" "another" 0))', Context )
test7str = """
; this is a lisp comment
(setq abc (("a" x)
("b" (setq d 12))
("c" y) ) ; another lisp comment
)
"""
test7 = LispG2.DoParse1( test7str, Context)
test8 = LispG2.DoParse1( '(print (1 x d))', Context)
| 31.403727 | 76 | 0.658228 | 598 | 5,056 | 5.563545 | 0.302676 | 0.02645 | 0.020439 | 0.018936 | 0.180343 | 0.150887 | 0.109408 | 0.109408 | 0.092576 | 0.092576 | 0 | 0.020215 | 0.207476 | 5,056 | 160 | 77 | 31.6 | 0.810082 | 0.126582 | 0 | 0.19403 | 1 | 0 | 0.276916 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.029851 | null | null | 0.08209 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
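The DLispShort grammar above binds each production to a reduction function (`EchoValue`, `AddToList`, `DoSetq`, ...). A rough Python 3 sketch of the same reductions, using a hand-written recursive-descent parser instead of `kjParseBuild` — note the real grammar is case-insensitive for keywords, which this sketch is not:

```python
import re

# token classes: parens, keywords, ints, strings, vars, lisp comments
TOKEN = re.compile(
    r'\s*(?:(\()|(\))|(setq\b|print\b)|(-?\d+)|("[^"\n]*")|([A-Za-z][A-Za-z0-9]*)|(;.*))')


def tokenize(text):
    tokens, pos = [], 0
    while pos < len(text):
        m = TOKEN.match(text, pos)
        if not m:
            break
        pos = m.end()
        if m.lastindex == 7:       # drop lisp comments, like LISPCOMMENTREGEX
            continue
        tokens.append(m.group(m.lastindex))
    return tokens


def parse(tokens, context):
    """Consume tokens, applying the same reductions the bound rules perform."""
    def value():
        tok = tokens.pop(0)
        if tok == '(':
            if tokens[0] == 'setq':            # SetqRule -> DoSetq
                tokens.pop(0)
                name, val = tokens.pop(0), value()
                tokens.pop(0)                  # closing ')'
                context[name] = val
                return val
            if tokens[0] == 'print':           # PrintRule -> DoPrint
                tokens.pop(0)
                val = value()
                tokens.pop(0)                  # closing ')'
                print(val)
                return val
            items = []                         # ListRule / TailFull / TailEmpty
            while tokens[0] != ')':
                items.append(value())
            tokens.pop(0)
            return items
        if tok.startswith('"'):
            return tok[1:-1]                   # stripQuotes
        if re.fullmatch(r'-?\d+', tok):
            return int(tok)                    # intInterp
        return context[tok]                    # VarValue

    return value()


ctx = {'x': 3}
parse(tokenize('(setq y (1 2 3))'), ctx)
print(ctx['y'])                                # [1, 2, 3]
```

The generated-parser version pays a one-time grammar-compilation cost but keeps the rule actions declarative; the hand-written version trades that for explicit control flow.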
40da08b92dbe7d3474b8baf7327355d57a38e211 | 1,058 | py | Python | azure-devops/azext_devops/dev/team/_help.py | vijayraavi/azure-devops-cli-extension | 88f1420c5815cb09bea15b050f4c553e0f326dad | [
"MIT"
] | null | null | null | azure-devops/azext_devops/dev/team/_help.py | vijayraavi/azure-devops-cli-extension | 88f1420c5815cb09bea15b050f4c553e0f326dad | [
"MIT"
] | 37 | 2020-04-27T07:45:19.000Z | 2021-04-05T07:27:15.000Z | azure-devops/azext_devops/dev/team/_help.py | vijayraavi/azure-devops-cli-extension | 88f1420c5815cb09bea15b050f4c553e0f326dad | [
"MIT"
] | null | null | null | # --------------------------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# --------------------------------------------------------------------------------------------
from knack.help_files import helps
def load_team_help():
helps['devops'] = """
type: group
short-summary: Manage Azure DevOps organization level operations.
long-summary: |
Related Groups
az pipelines: Manage Azure Pipelines
az boards: Manage Azure Boards
az repos: Manage Azure Repos
az artifacts: Manage Azure Artifacts
"""
helps['devops project'] = """
type: group
short-summary: Manage team projects.
"""
helps['devops service-endpoint'] = """
type: group
short-summary: Manage service endpoints/service connections
"""
helps['devops team'] = """
type: group
short-summary: Manage teams
"""
| 30.228571 | 94 | 0.539698 | 100 | 1,058 | 5.68 | 0.52 | 0.096831 | 0.098592 | 0.147887 | 0.190141 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.208885 | 1,058 | 34 | 95 | 31.117647 | 0.678614 | 0.31758 | 0 | 0.333333 | 0 | 0 | 0.778243 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.041667 | true | 0 | 0.041667 | 0 | 0.083333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
40df91a2a749f70d99be12938590c75c4ce5537e | 2,130 | py | Python | cortaswamp/authentication/migrations/0001_initial.py | parthakonda/cortaswamp-backend | 5e6875cbe994931cd747ac0d614250e3a6649500 | [
"MIT"
] | null | null | null | cortaswamp/authentication/migrations/0001_initial.py | parthakonda/cortaswamp-backend | 5e6875cbe994931cd747ac0d614250e3a6649500 | [
"MIT"
] | null | null | null | cortaswamp/authentication/migrations/0001_initial.py | parthakonda/cortaswamp-backend | 5e6875cbe994931cd747ac0d614250e3a6649500 | [
"MIT"
] | null | null | null | # Generated by Django 2.2.2 on 2019-06-09 08:16
import authentication.models
from django.db import migrations, models
import uuid
class Migration(migrations.Migration):
initial = True
dependencies = [
]
operations = [
migrations.CreateModel(
name='User',
fields=[
('password', models.CharField(max_length=128, verbose_name='password')),
('last_login', models.DateTimeField(blank=True, null=True, verbose_name='last login')),
('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('first_name', models.CharField(help_text='First Name of user', max_length=200, null=True)),
('last_name', models.CharField(help_text='Last Name of user', max_length=200, null=True)),
('username', models.CharField(help_text='Username for the user', max_length=200, unique=True)),
('email', models.EmailField(help_text='Email of the user', max_length=200, unique=True)),
('login_attempts', models.IntegerField(default=0, help_text='To track no of invalid login attempts')),
],
options={
'db_table': 'user',
},
managers=[
('objects', authentication.models.UserAccountManager()),
],
),
migrations.CreateModel(
name='ForgotPassword',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('email', models.EmailField(help_text='Email of the user', max_length=200)),
('valid_upto', models.DateTimeField(help_text='DateTime valid upto')),
('expired', models.BooleanField(default=False, help_text='If True - Link can not be used, False - Link can be used')),
('created_on', models.DateTimeField(auto_now_add=True, help_text='Reset link creation date')),
],
options={
'db_table': 'forgot_password',
},
),
]
| 43.469388 | 134 | 0.589671 | 229 | 2,130 | 5.353712 | 0.393013 | 0.058728 | 0.053018 | 0.065253 | 0.335237 | 0.291191 | 0.291191 | 0.25938 | 0.21044 | 0.21044 | 0 | 0.023483 | 0.280282 | 2,130 | 48 | 135 | 44.375 | 0.776256 | 0.021127 | 0 | 0.317073 | 1 | 0 | 0.193951 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.073171 | 0.073171 | 0 | 0.170732 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
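The migration above passes `uuid.uuid4` itself (no parentheses) as the column default, so Django calls it once per row. The same callable-default idea, framework-free, using `dataclasses.field` — the `User` shape here is illustrative, not the migration's model:

```python
import uuid
from dataclasses import dataclass, field


@dataclass
class User:
    # default_factory receives the callable, so a fresh UUID is drawn per
    # instance - exactly like Django's default=uuid.uuid4 (no parentheses)
    id: uuid.UUID = field(default_factory=uuid.uuid4)
    username: str = ""


a = User(username="alice")
b = User(username="bob")
print(a.id != b.id)
```

Writing `default=uuid.uuid4()` instead would evaluate once at class-definition time and give every row the same id — the classic mutable/shared-default pitfall.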
40effc952d41cb046df3a5dd039168c9870fb391 | 248 | py | Python | spotcrew_backend/venues/api/filters.py | dominpn/spotcrew | 1c48746189b78c1b370ea93d43968781155d4708 | [
"MIT"
] | 1 | 2020-03-17T17:11:18.000Z | 2020-03-17T17:11:18.000Z | spotcrew_backend/venues/api/filters.py | dominpn/spotcrew | 1c48746189b78c1b370ea93d43968781155d4708 | [
"MIT"
] | 3 | 2020-02-12T00:16:12.000Z | 2021-06-10T21:28:43.000Z | spotcrew_backend/venues/api/filters.py | dominpn/spotcrew | 1c48746189b78c1b370ea93d43968781155d4708 | [
"MIT"
] | 1 | 2020-03-17T21:30:26.000Z | 2020-03-17T21:30:26.000Z | from django_filters import rest_framework as filters
from venues.models import Venue
class VenueFilter(filters.FilterSet):
name = filters.CharFilter(lookup_expr='icontains')
class Meta:
model = Venue
fields = ['name', ]
| 20.666667 | 54 | 0.709677 | 29 | 248 | 5.965517 | 0.724138 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.209677 | 248 | 11 | 55 | 22.545455 | 0.882653 | 0 | 0 | 0 | 0 | 0 | 0.052419 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.285714 | 0 | 0.714286 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
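The `lookup_expr='icontains'` filter above performs a case-insensitive substring match. Outside Django the equivalent check is a one-liner (real `__icontains` lookups defer to the database's collation, so this is only an approximation):

```python
def icontains(haystack: str, needle: str) -> bool:
    # case-insensitive substring match, approximating Django's __icontains
    return needle.lower() in haystack.lower()


venues = ["Blue Note", "The Roxy", "Note-o-rama"]
print([v for v in venues if icontains(v, "note")])   # ['Blue Note', 'Note-o-rama']
```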
dc048eea5d1a0dd71e80d037a3ffccff463c2b51 | 334 | py | Python | blog/views.py | atikabdullah/News_blog | e71dea89411fe19a4d3d31f04d45ada395c65d94 | [
"MIT"
] | null | null | null | blog/views.py | atikabdullah/News_blog | e71dea89411fe19a4d3d31f04d45ada395c65d94 | [
"MIT"
] | null | null | null | blog/views.py | atikabdullah/News_blog | e71dea89411fe19a4d3d31f04d45ada395c65d94 | [
"MIT"
] | null | null | null | from django.views.generic import ListView, DetailView
from .models import Post
class BlogListView(ListView):
model = Post
context_object_name = 'posts'
template_name = 'blog/blog_list.html'
class BlogDetailView(DetailView):
model = Post
context_object_name = 'post'
template_name = 'blog/blog_detail.html'
| 22.266667 | 53 | 0.739521 | 41 | 334 | 5.829268 | 0.536585 | 0.075314 | 0.133891 | 0.1841 | 0.217573 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.176647 | 334 | 14 | 54 | 23.857143 | 0.869091 | 0 | 0 | 0.2 | 0 | 0 | 0.146707 | 0.062874 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
dc0c35a5522bf20c19f6f4b282eb753d756345e6 | 156 | py | Python | dbookbee/test/example.py | cloudylan/dbooklib | fe93421bf8937e141ec0aa127acddbe7600f5257 | [
"Apache-2.0"
] | null | null | null | dbookbee/test/example.py | cloudylan/dbooklib | fe93421bf8937e141ec0aa127acddbe7600f5257 | [
"Apache-2.0"
] | null | null | null | dbookbee/test/example.py | cloudylan/dbooklib | fe93421bf8937e141ec0aa127acddbe7600f5257 | [
"Apache-2.0"
] | null | null | null | from mobi import Mobi
import pprint
book = Mobi("test/CharlesDarwin.mobi")
book.parse()
for record in book:
print(record)
pprint.pprint(book.config)
| 14.181818 | 38 | 0.74359 | 23 | 156 | 5.043478 | 0.565217 | 0.172414 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.147436 | 156 | 10 | 39 | 15.6 | 0.87218 | 0 | 0 | 0 | 0 | 0 | 0.147436 | 0.147436 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.285714 | 0 | 0.285714 | 0.428571 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
904ec2940e1f65bb25aa3addb0151641590c11be | 244 | py | Python | examples/testing/test_add.py | abhiabhi94/learn-python | 448211b4b9fa828f076618624b1ff00144adfd9f | [
"MIT"
] | null | null | null | examples/testing/test_add.py | abhiabhi94/learn-python | 448211b4b9fa828f076618624b1ff00144adfd9f | [
"MIT"
] | 1 | 2021-06-20T23:14:59.000Z | 2021-06-23T23:23:15.000Z | examples/testing/test_add.py | abhiabhi94/learn-python | 448211b4b9fa828f076618624b1ff00144adfd9f | [
"MIT"
] | null | null | null | import unittest
import unittest.mock
from add import add
class TestAdd(unittest.TestCase):
def test_nums(self):
self.assertEqual(add(1, 3), 4)
def test_num_and_string(self):
self.assertRaises(TypeError, add, 1, 'a')
| 18.769231 | 49 | 0.692623 | 35 | 244 | 4.714286 | 0.628571 | 0.169697 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020513 | 0.20082 | 244 | 12 | 50 | 20.333333 | 0.825641 | 0 | 0 | 0 | 0 | 0 | 0.004098 | 0 | 0 | 0 | 0 | 0 | 0.25 | 1 | 0.25 | false | 0 | 0.375 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
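The test case above can also be run programmatically, and `assertRaises` has a context-manager form that avoids passing the callable and arguments separately. A self-contained variant:

```python
import unittest


def add(x, y):
    return x + y


class TestAdd(unittest.TestCase):
    def test_nums(self):
        self.assertEqual(add(1, 3), 4)

    def test_num_and_string(self):
        # context-manager form; equivalent to
        # self.assertRaises(TypeError, add, 1, 'a') but easier to read
        with self.assertRaises(TypeError):
            add(1, 'a')


suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestAdd)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())   # True
```

Running through `TextTestRunner` instead of `unittest.main()` keeps the result object available for inspection, which is handy in scripts and notebooks.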
905b5381e1739a27121047a3ef30a74b256f9593 | 235 | py | Python | airbox/__init__.py | lewisjared/airbox | 56bfdeb3e81bac47c80fbf249d9ead31c94a2139 | [
"MIT"
] | null | null | null | airbox/__init__.py | lewisjared/airbox | 56bfdeb3e81bac47c80fbf249d9ead31c94a2139 | [
"MIT"
] | null | null | null | airbox/__init__.py | lewisjared/airbox | 56bfdeb3e81bac47c80fbf249d9ead31c94a2139 | [
"MIT"
] | null | null | null | import matplotlib
# Only use core pdf fonts to save size
matplotlib.rcParams['pdf.use14corefonts'] = True
matplotlib.rcParams['legend.fontsize'] = 5
__version__ = '1.0.2'
from .configstore import ConfigStore
config = ConfigStore()
| 19.583333 | 48 | 0.765957 | 30 | 235 | 5.866667 | 0.766667 | 0.204545 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.029412 | 0.131915 | 235 | 11 | 49 | 21.363636 | 0.833333 | 0.153191 | 0 | 0 | 0 | 0 | 0.192893 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
906508b96e57731420e2c190ff9aab3b89b31cec | 3,594 | py | Python | vkbottle/bot/events/__init__.py | croogg/vkbottle | 7355c2ef89d302410c8e05be162ba71e5f040990 | [
"MIT"
] | null | null | null | vkbottle/bot/events/__init__.py | croogg/vkbottle | 7355c2ef89d302410c8e05be162ba71e5f040990 | [
"MIT"
] | null | null | null | vkbottle/bot/events/__init__.py | croogg/vkbottle | 7355c2ef89d302410c8e05be162ba71e5f040990 | [
"MIT"
] | 2 | 2020-05-10T11:48:25.000Z | 2021-12-02T09:22:54.000Z | """
MIT License
Copyright (c) 2019 Arseniy Timonik
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
"""
"""
VKBOTTLE EVENTS TYPES
"""
from ...utils import Logger, make_priority_path, dict_of_dicts_merge
from ...notifications import add_undefined
from .events import OnMessage, OnMessageChat, OnMessageBoth
class Events:
def __init__(self, logger: Logger = Logger(True), use_regex: bool = True):
"""
Make decorator processors (dictionaries with functions)
:param logger: Logging object
        :param use_regex: regex matching is more convenient, but can be
        disabled when speed is the main priority...
        fixme RU - didn't finish this yet..
"""
# Collections
self.use_regex = use_regex
# Processors
self.processor_message_regex = {}
self.processor_message_chat_regex = {}
self.undefined_message_func = (lambda *args: logger.warn(add_undefined))
self.events = {}
self.chat_action_types = {}
# Decorators
self.message = OnMessage(self)
self.message_chat = OnMessageChat(self)
self.message_both = OnMessageBoth(self)
def merge_processors(self):
"""
        Merge message decorators with message-both decorators, using deepcopy
        and MutableMapping to merge the dictionaries while keeping their own
        priorities
        todo RU - fix this unpleasant mess
"""
self.processor_message_regex = dict_of_dicts_merge(self.message.processor, self.message_both.processor_message)
self.processor_message_chat_regex = dict_of_dicts_merge(self.message_chat.processor, self.message_both.processor_chat)
def chat_action(self, type_: str):
"""
Special express processor of chat actions (https://vk.com/dev/objects/message - action object)
:param type_: action name
"""
def decorator(func):
self.chat_action_types[type_] = {'call': func}
return func
return decorator
def message_undefined(self):
"""
        If a private message is not matched by the message processor, this single function will be called
"""
def decorator(func):
self.undefined_message_func = func
return func
return decorator
def event(self, name: str):
"""
Event decorator. Needed for events processing.
For example:
@bot.on.event('on_group_join')
:param name: Event name, find this in VK API Docs
"""
def decorator(func):
self.chat_action_types[name] = {'call': func}
return func
return decorator
| 34.228571 | 126 | 0.686978 | 454 | 3,594 | 5.314978 | 0.407489 | 0.036469 | 0.033154 | 0.019892 | 0.146705 | 0.097389 | 0.055533 | 0 | 0 | 0 | 0 | 0.001473 | 0.244574 | 3,594 | 104 | 127 | 34.557692 | 0.887293 | 0.504174 | 0 | 0.28125 | 0 | 0 | 0.005198 | 0 | 0 | 0 | 0 | 0.019231 | 0 | 1 | 0.25 | false | 0 | 0.09375 | 0 | 0.5625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
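The `chat_action`, `message_undefined`, and `event` decorators above all follow one pattern: return a decorator that stores the handler in a dict keyed by name. A stripped-down sketch of that registry — the `dispatch` helper is added purely for illustration and is not part of vkbottle:

```python
class Events:
    """Registry of named handlers, mirroring chat_action_types above."""
    def __init__(self):
        self.chat_action_types = {}

    def event(self, name: str):
        def decorator(func):
            self.chat_action_types[name] = {'call': func}
            return func        # handler remains callable under its own name
        return decorator

    def dispatch(self, name: str, *args):
        # illustrative only: look up a registered handler and invoke it
        entry = self.chat_action_types.get(name)
        return entry['call'](*args) if entry else None


on = Events()


@on.event('on_group_join')
def greet(user):
    return f"welcome, {user}!"


print(on.dispatch('on_group_join', 'alice'))   # welcome, alice!
```

Returning `func` unchanged from the decorator means registration is a side effect; the decorated function can still be called and tested directly.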
906722fe5d4405c861e48bef2cfba2132f326c38 | 426 | py | Python | beeline/internal.py | mariokostelac/beeline-python | 32a30f22f35ea585ac167997a4732e5f20d72d7a | [
"Apache-2.0"
] | null | null | null | beeline/internal.py | mariokostelac/beeline-python | 32a30f22f35ea585ac167997a4732e5f20d72d7a | [
"Apache-2.0"
] | null | null | null | beeline/internal.py | mariokostelac/beeline-python | 32a30f22f35ea585ac167997a4732e5f20d72d7a | [
"Apache-2.0"
] | null | null | null | import beeline
# these are mostly convenience methods for safely calling beeline methods
# even if the beeline hasn't been initialized
def send_event():
bl = beeline.get_beeline()
if bl:
return bl.send_event()
def send_all():
bl = beeline.get_beeline()
if bl:
return bl.send_all()
def log(msg, *args, **kwargs):
bl = beeline.get_beeline()
if bl:
bl.log(msg, *args, **kwargs)
| 22.421053 | 73 | 0.652582 | 62 | 426 | 4.370968 | 0.451613 | 0.099631 | 0.132841 | 0.210332 | 0.343173 | 0.343173 | 0.258303 | 0.258303 | 0.258303 | 0 | 0 | 0 | 0.239437 | 426 | 18 | 74 | 23.666667 | 0.83642 | 0.269953 | 0 | 0.461538 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.230769 | false | 0 | 0.076923 | 0 | 0.461538 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
90719615cb678876e1e7001dbd98295f053cd2d2 | 297 | py | Python | Python_Aulas_3,6+/04_Dados_Primitivos.py | PASSINP/Python_3-Django | 0257ce9aada1d63d7912ed9a60be5515c5100028 | [
"MIT"
] | 1 | 2021-01-04T12:48:47.000Z | 2021-01-04T12:48:47.000Z | Python_Aulas_3,6+/04_Dados_Primitivos.py | PASSINP/Python_3-Django | 0257ce9aada1d63d7912ed9a60be5515c5100028 | [
"MIT"
] | null | null | null | Python_Aulas_3,6+/04_Dados_Primitivos.py | PASSINP/Python_3-Django | 0257ce9aada1d63d7912ed9a60be5515c5100028 | [
"MIT"
] | null | null | null | """
Data types
str - string - "Text" ,'Text'
int - integer - 10, 20 -45
float - number with a decimal point - 0.2 -3.14 8000.1
bool - boolean - True/False
"""
# Using the type() function it is possible to see the type (class) of a value
print(type('Luiz'))
print(type(10))
print(type(3.14))
print(type(True))
| 21.214286 | 68 | 0.670034 | 52 | 297 | 3.826923 | 0.75 | 0.180905 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.08502 | 0.16835 | 297 | 13 | 69 | 22.846154 | 0.720648 | 0.720539 | 0 | 0 | 0 | 0 | 0.053333 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
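A small gotcha worth knowing alongside the `type()` calls above: `bool` is a subclass of `int`, so `type()` and `isinstance()` disagree about `True`:

```python
print(type(True) is bool)       # True: type() reports the exact class
print(type(True) is int)       # False
print(isinstance(True, int))    # True: bool subclasses int
print(True + True)              # 2: booleans act as 0/1 in arithmetic
```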
90723e8c4410c192ca93198eab900fcc4790903f | 2,609 | py | Python | lib/aquilon/aqdb/model/model.py | ned21/aquilon | 6562ea0f224cda33b72a6f7664f48d65f96bd41a | [
"Apache-2.0"
] | 7 | 2015-07-31T05:57:30.000Z | 2021-09-07T15:18:56.000Z | lib/aquilon/aqdb/model/model.py | ned21/aquilon | 6562ea0f224cda33b72a6f7664f48d65f96bd41a | [
"Apache-2.0"
] | 115 | 2015-03-03T13:11:46.000Z | 2021-09-20T12:42:24.000Z | lib/aquilon/aqdb/model/model.py | ned21/aquilon | 6562ea0f224cda33b72a6f7664f48d65f96bd41a | [
"Apache-2.0"
] | 13 | 2015-03-03T11:17:59.000Z | 2021-09-09T09:16:41.000Z | # -*- cpy-indent-level: 4; indent-tabs-mode: nil -*-
# ex: set expandtab softtabstop=4 shiftwidth=4:
#
# Copyright (C) 2008,2009,2010,2011,2012,2013,2014 Contributor
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
""" basic construct of model = vendor name + product name """
from datetime import datetime
from sqlalchemy import (Integer, DateTime, Sequence, String, Column, ForeignKey,
UniqueConstraint)
from sqlalchemy.orm import relation, object_session, deferred
from aquilon.exceptions_ import AquilonError
from aquilon.aqdb.model import Base, Vendor
from aquilon.aqdb.column_types import AqStr, StringEnumColumn
from aquilon.aqdb.types import ModelType, NicType
class Model(Base):
""" Vendor and Model are representations of the various manufacturers and
the asset inventory of the kinds of machines we use in the plant """
__tablename__ = 'model'
id = Column(Integer, Sequence('model_id_seq'), primary_key=True)
name = Column(AqStr(64), nullable=False)
vendor_id = Column(ForeignKey(Vendor.id), nullable=False)
model_type = Column(StringEnumColumn(ModelType, 20, True), nullable=False)
creation_date = deferred(Column(DateTime, default=datetime.now,
nullable=False))
comments = Column(String(255))
vendor = relation(Vendor, innerjoin=True)
__table_args__ = (UniqueConstraint(vendor_id, name),
{'info': {'unique_fields': ['name', 'vendor'],
'extra_search_fields': ['model_type']}})
@property
def qualified_name(self):
return self.vendor.name + "/" + self.name
@classmethod
def default_nic_model(cls, session):
# TODO: make this configurable
return cls.get_unique(session, model_type=NicType.Nic, name='generic_nic',
vendor='generic', compel=AquilonError)
@property
def nic_model(self):
if self.machine_specs:
return self.machine_specs.nic_model
session = object_session(self)
return self.default_nic_model(session)
| 36.236111 | 82 | 0.692603 | 327 | 2,609 | 5.415902 | 0.504587 | 0.033879 | 0.025409 | 0.018069 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020508 | 0.215025 | 2,609 | 71 | 83 | 36.746479 | 0.844238 | 0.345343 | 0 | 0.058824 | 0 | 0 | 0.054925 | 0 | 0 | 0 | 0 | 0.014085 | 0 | 1 | 0.088235 | false | 0 | 0.205882 | 0.058824 | 0.705882 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
90742a24ba8ef6a7531ae1a57ccb056e534daf0c | 4,003 | py | Python | examples/compose_prj_from_csv/compose_prj.py | OliverPe/ogs6py | 41d5424e565f16d3b47be6f85258fd5b64728cd1 | [
"BSD-3-Clause"
] | 9 | 2019-07-18T22:47:37.000Z | 2022-01-20T11:57:53.000Z | examples/compose_prj_from_csv/compose_prj.py | OliverPe/ogs6py | 41d5424e565f16d3b47be6f85258fd5b64728cd1 | [
"BSD-3-Clause"
] | 35 | 2021-02-18T16:46:13.000Z | 2022-02-23T11:45:40.000Z | examples/compose_prj_from_csv/compose_prj.py | OliverPe/ogs6py | 41d5424e565f16d3b47be6f85258fd5b64728cd1 | [
"BSD-3-Clause"
] | 16 | 2019-04-09T16:08:26.000Z | 2022-01-20T11:57:10.000Z | # make sure path ogs6py is known via pip or
# export PYTHONPATH=$PYTHONPATH:<PATH>/ogs6py/ogs6py
# This script generates an OGS input-file (prj) for a variable number of
# materials which are included as separate xml-files inside <medium> sections.
#
# It reads three files
# - parameterTable.csv with the parameters of the layers (materials)
# - OGStemplate_main.prj with empty media block, i.e. only whitespaces between <media> tags.
# - OGStemplate_medium.xml with the medium parameters, to be set inside <medium>
#
# The output is written to (existing files get overwritten)
# - OGSmain.prj containing include-directives to
# - OGSmedium_0.xml
# - OGSmedium_1.xml
# - ...
#
# TODO:
# use include "filename" (quotes) in latest OGS version?
# make sure soil sequence in slice matches sequence in table (id_list?)
import ogs6py as ogs
import pandas as pd
### READ MEDIA (MATERIALS) FROM TABLE ###
parameter_table = pd.read_csv('parameterTable.csv')
print(parameter_table.info())
N_mat = len(parameter_table.index) # number of data rows (without header)
names_list = list(parameter_table['„stratigraphy_unit“']) # TODO add as comment
density_list = list(parameter_table['„solid_mass_density_[kg/m³]_max“'])
viscosity_list = list(parameter_table['„viscosity“'])
permeability_list = list(parameter_table['„hydraulic_permeability_[m²]_mean“'])
porosity_list = list(parameter_table['„Porosity_[-]_mean“'])
storage_list = list(parameter_table['„hydraulic_storage_[1/Pa]_mean“'])
### PROCESS MEDIA FILES ###
# enter parameters into template (ogs6py)
medium_template_filename = "OGStemplate_medium.xml"
medium_filenames=[]
for n in range(N_mat):
medium_filename="OGSmedium_"+str(n)+".xml"
medium_filenames.append(medium_filename)
medium_xml= ogs.OGS(INPUT_FILE=medium_template_filename, PROJECT_FILE=medium_filename)
medium_xml.replace_text(viscosity_list[n], xpath="./phases/phase/properties/property[name='viscosity']/value")
medium_xml.replace_text(density_list[n], xpath="./phases/phase/properties/property[name='density']/value")
medium_xml.replace_text(permeability_list[n], xpath="./properties/property[name='permeability']/value")
medium_xml.replace_text(porosity_list[n], xpath="./properties/property[name='porosity']/value")
    medium_xml.replace_text(storage_list[n], xpath="./properties/property[name='storage']/value")
medium_xml.write_input()
# The <medium> tags were needed for ogs6py but are removed now,
# they will be reinserted as numbered <medium id ...> in main.prj.
# This approach was chosen, as OGS allows only one file inclusion per XML-tag
for n in range(N_mat):
    with open(medium_filenames[n]) as medium_file:
        medium_filedata = medium_file.read()
    medium_filedata = medium_filedata.replace('<medium>', '<!-- stratigraphy unit: ' + names_list[n] + ' -->')
    medium_filedata = medium_filedata.replace('</medium>', '')
    with open(medium_filenames[n], 'w') as medium_file:
        medium_file.write(medium_filedata)
### PROCESS MAIN FILE ###
# add medium blocks (ogs6py)
main_template_filename='OGStemplate_main.prj'
main_filename="OGSmain.prj"
main_prj= ogs.OGS(INPUT_FILE=main_template_filename, PROJECT_FILE=main_filename)
for n in range(N_mat):
    include_file = '<include file="' + medium_filenames[n] + '"/>'  # quote the attribute so the generated XML is well-formed
main_prj.add_entry(parent_xpath="./media", tag="medium", text=include_file, attrib="id", attrib_value=str(n))
main_prj.write_input()
# ogs6py encodes "<" as "&lt;" and ">" as "&gt;"; although XML readers should accept this, we replace it for sake of beauty
with open(main_filename) as main_file:
    main_filedata = main_file.read()
main_filedata = main_filedata.replace('&lt;', '<')
main_filedata = main_filedata.replace('&gt;', '>')
with open(main_filename, 'w') as main_file:
    main_file.write(main_filedata)
# this is how to replace parameters other than media (ogs6py)
#main_prj = ogs.OGS(INPUT_FILE=infile, PROJECT_FILE=outfile)
#main_prj.replace_text(9.81, xpath="./processes/process/darcy_gravity/g")
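The entity replacement at the end of the script can be shown in isolation; a minimal sketch (the function name and sample string are illustrative, not part of ogs6py):

```python
def unescape_include_tags(text):
    """Turn XML-escaped angle brackets back into literal markup."""
    return text.replace('&lt;', '<').replace('&gt;', '>')

escaped = '<medium id="0">&lt;include file="OGSmedium_0.xml"/&gt;</medium>'
print(unescape_include_tags(escaped))
# -> <medium id="0"><include file="OGSmedium_0.xml"/></medium>
```

Only `&lt;`/`&gt;` are touched, so attribute values and comments elsewhere in the file stay unchanged.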
| 42.136842 | 122 | 0.741694 | 565 | 4,003 | 5.058407 | 0.329204 | 0.028342 | 0.035689 | 0.046186 | 0.236179 | 0.13471 | 0.044787 | 0.030091 | 0 | 0 | 0 | 0.004867 | 0.127404 | 4,003 | 94 | 123 | 42.585106 | 0.811623 | 0.380215 | 0 | 0.069767 | 1 | 0 | 0.23273 | 0.150905 | 0 | 0 | 0 | 0.010638 | 0 | 1 | 0 | false | 0 | 0.046512 | 0 | 0.046512 | 0.023256 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
9077ce8bf4e83dd0200a5a6f799ccd9fc39487a8 | 1,898 | py | Python | mlbriefcase/datasource.py | Bhaskers-Blu-Org2/Briefcase | f551079b05d3f8494cdff6a0b393969def5a2443 | [
"MIT"
] | 2 | 2020-05-04T12:59:05.000Z | 2020-05-05T09:31:43.000Z | mlbriefcase/datasource.py | Bhaskers-Blu-Org2/Briefcase | f551079b05d3f8494cdff6a0b393969def5a2443 | [
"MIT"
] | 4 | 2020-02-05T11:34:51.000Z | 2020-02-05T11:35:12.000Z | mlbriefcase/datasource.py | microsoft/Briefcase | f551079b05d3f8494cdff6a0b393969def5a2443 | [
"MIT"
] | 5 | 2020-06-30T16:02:57.000Z | 2021-09-15T06:39:08.000Z | from .base import *
class DataSource(Resource):
def download(self, target, overwrite=False):
raise NotImplementedError
class URLDataSource(DataSource):
def download(self, filename, **kwargs):
import urllib.request
return urllib.request.urlretrieve(self.get_url(), filename, **kwargs)
def get_url(self) -> str:
# TODO: check if getattr works too, check if the default is evaluated here
if hasattr(self, 'url'):
return self.url
else:
return self.dataset.get_url()
def to_dataflow(self) -> 'dprep.Dataflow':
import azureml.dataprep as dprep
return dprep.auto_read_file(self.get_url())
def to_pandas_dataframe(self):
return self.to_dataflow().to_pandas_dataframe()
def to_spark_dataframe(self) -> 'pyspark.sql.DataFrame':
return self.to_dataflow().to_spark_dataframe()
# TODO: can we auto-generate this type (or register for multiple yaml_tag?)
# class CSVDataSource(URLDataSource):
# yaml_tag = u'!csv'
# def to_dataflow(self) -> 'dprep.Dataflow':
# import azureml.dataprep as dprep
# # dprep.read_json
# # TODO: lookup method dprep.read_*
# # not sure if the **self.get_params() is such a great idea as it ain't portable?
# url = self.get_url()
# try:
# ds = url
# if url.startswith('http://') or url.startswith('https://'):
# ds = dprep.HttpDataSource(url)
# return dprep.read_csv(ds, **self.get_params('dataset', 'url'))
# except Exception as e:
# self.get_logger().warning("unable to fetch data using dprep. falling back to url download: {}".format(e))
# # since dprep doesn't forward the error message
# import urllib.request
# return urllib.request.urlopen(url).read()
| 37.96 | 119 | 0.631191 | 237 | 1,898 | 4.940928 | 0.438819 | 0.035867 | 0.025619 | 0.042699 | 0.201537 | 0.163962 | 0.099061 | 0.099061 | 0.099061 | 0.099061 | 0 | 0 | 0.255532 | 1,898 | 49 | 120 | 38.734694 | 0.828733 | 0.527397 | 0 | 0 | 0 | 0 | 0.043578 | 0.024083 | 0 | 0 | 0 | 0.020408 | 0 | 1 | 0.285714 | false | 0 | 0.190476 | 0.095238 | 0.857143 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
907cf2eb4f6b1913e641447bdc269c005d897923 | 952 | py | Python | mkt/webapps/tests/test_utils_.py | muffinresearch/zamboni | 045a6f07c775b99672af6d9857d295ed02fe5dd9 | [
"BSD-3-Clause"
] | null | null | null | mkt/webapps/tests/test_utils_.py | muffinresearch/zamboni | 045a6f07c775b99672af6d9857d295ed02fe5dd9 | [
"BSD-3-Clause"
] | null | null | null | mkt/webapps/tests/test_utils_.py | muffinresearch/zamboni | 045a6f07c775b99672af6d9857d295ed02fe5dd9 | [
"BSD-3-Clause"
] | null | null | null | from nose.tools import eq_
import amo
from mkt.webapps.utils import get_supported_locales
class TestSupportedLocales(amo.tests.TestCase):
def setUp(self):
self.manifest = {'default_locale': 'en'}
def check(self, expected):
eq_(get_supported_locales(self.manifest), expected)
def test_empty_locale(self):
self.check([])
def test_single_locale(self):
self.manifest.update({'locales': {'es': {'name': 'eso'}}})
self.check(['es'])
def test_multiple_locales(self):
self.manifest.update({'locales': {'es': {'name': 'si'},
'fr': {'name': 'oui'}}})
self.check(['es', 'fr'])
def test_short_locale(self):
self.manifest.update({'locales': {'pt': {'name': 'sim'}}})
self.check(['pt-PT'])
def test_unsupported_locale(self):
self.manifest.update({'locales': {'xx': {'name': 'xx'}}})
self.check([])
| 28 | 66 | 0.578782 | 110 | 952 | 4.854545 | 0.372727 | 0.089888 | 0.149813 | 0.164794 | 0.273408 | 0.273408 | 0.131086 | 0 | 0 | 0 | 0 | 0 | 0.237395 | 952 | 33 | 67 | 28.848485 | 0.735537 | 0 | 0 | 0.086957 | 0 | 0 | 0.102941 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.304348 | false | 0 | 0.130435 | 0 | 0.478261 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
90999f68c5e83ae8a3ca4d99fc295d9d2d1c2576 | 1,649 | py | Python | filter_plugins/neutron_user_password_decrypt.py | ArdanaCLM/neutron-ansible | 777f0b3f9e495fa84c9d47091bc816e99f892a63 | [
"Apache-2.0"
] | null | null | null | filter_plugins/neutron_user_password_decrypt.py | ArdanaCLM/neutron-ansible | 777f0b3f9e495fa84c9d47091bc816e99f892a63 | [
"Apache-2.0"
] | null | null | null | filter_plugins/neutron_user_password_decrypt.py | ArdanaCLM/neutron-ansible | 777f0b3f9e495fa84c9d47091bc816e99f892a63 | [
"Apache-2.0"
] | 2 | 2018-03-09T19:50:30.000Z | 2019-02-26T19:53:51.000Z | #
# (c) Copyright 2015-2017 Hewlett Packard Enterprise Development LP
# (c) Copyright 2017 SUSE LLC
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
# TODO: maintain single copy of this filter in a shared repo, like ardana-ansible
#
import imp
import os.path
path = os.path.dirname(os.path.realpath(__file__))
ardanaencrypt = imp.load_source('ardanaencrypt', path + '/../ardanaencrypt.py')
encryption_class = 'openssl'
ardanaencrypt_class = getattr(ardanaencrypt, encryption_class)
def neutron_user_password_decrypt(value, *args, **kw):
prefix = None
if value.startswith(ardanaencrypt_class.prefix):
prefix = ardanaencrypt_class.prefix
# For upgrade cases, need to support existing encrypted values which may
# have legacy prefix in-use.
elif value.startswith(ardanaencrypt_class.legacy_prefix):
prefix = ardanaencrypt_class.legacy_prefix
if prefix is None:
return value
else:
obj = ardanaencrypt_class()
return obj.decrypt(value[len(prefix):])
class FilterModule(object):
def filters(self):
return {'neutron_user_password_decrypt': neutron_user_password_decrypt}
| 32.333333 | 81 | 0.745907 | 223 | 1,649 | 5.408072 | 0.556054 | 0.049751 | 0.047264 | 0.064677 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.01173 | 0.172832 | 1,649 | 50 | 82 | 32.98 | 0.872434 | 0.478472 | 0 | 0 | 0 | 0 | 0.082241 | 0.034565 | 0 | 0 | 0 | 0.02 | 0 | 1 | 0.1 | false | 0.1 | 0.1 | 0.05 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
90a0f364c91d17fb928deb5d4abaf9b8ba2a1772 | 136 | py | Python | ZeroSeg/examples/text_example.py | TooFiveFive/Smart-Alarm-Clock-Plus | 4dd1d7f4637dd6ccf977f6cc5562cbfd341b3c3e | [
"MIT"
] | 12 | 2016-08-18T09:08:53.000Z | 2018-09-09T17:05:27.000Z | ZeroSeg/examples/text_example.py | TooFiveFive/Smart-Alarm-Clock-Plus | 4dd1d7f4637dd6ccf977f6cc5562cbfd341b3c3e | [
"MIT"
] | 7 | 2016-10-12T21:34:53.000Z | 2018-07-08T18:03:32.000Z | ZeroSeg/examples/text_example.py | TooFiveFive/Smart-Alarm-Clock-Plus | 4dd1d7f4637dd6ccf977f6cc5562cbfd341b3c3e | [
"MIT"
] | 5 | 2019-08-03T15:06:25.000Z | 2021-02-09T20:45:27.000Z | import ZeroSeg.led as led
import time
device = led.sevensegment(cascaded=2)
device.write_text(1,"YES")
time.sleep(3)
device.clear()
| 12.363636 | 37 | 0.75 | 22 | 136 | 4.590909 | 0.727273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025 | 0.117647 | 136 | 10 | 38 | 13.6 | 0.816667 | 0 | 0 | 0 | 0 | 0 | 0.022059 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
90a1721b47bb93d4c2e8a441cdbd1a8de5e45376 | 17,135 | py | Python | machine_learning/cost_sensitive_learning_functions.py | suleymanaslan/bilkent-coursework | 33b3e73d17a2b29800777e459809fc3e46f298a3 | [
"MIT"
] | null | null | null | machine_learning/cost_sensitive_learning_functions.py | suleymanaslan/bilkent-coursework | 33b3e73d17a2b29800777e459809fc3e46f298a3 | [
"MIT"
] | null | null | null | machine_learning/cost_sensitive_learning_functions.py | suleymanaslan/bilkent-coursework | 33b3e73d17a2b29800777e459809fc3e46f298a3 | [
"MIT"
] | null | null | null | import numpy as np
from sklearn.ensemble import RandomForestClassifier
def train_baseline(data_train_x, data_train_y, data_test_x, data_test_y):
clf = RandomForestClassifier(n_estimators=100, max_depth=9, random_state=550)
clf.fit(data_train_x, data_train_y)
train_acc = (clf.predict(data_train_x) == data_train_y).mean()
test_acc = (clf.predict(data_test_x) == data_test_y).mean()
train_acc_c1 = (clf.predict(data_train_x[data_train_y == 1]) == data_train_y[data_train_y == 1]).mean()
test_acc_c1 = (clf.predict(data_test_x[data_test_y == 1]) == data_test_y[data_test_y == 1]).mean()
train_acc_c2 = (clf.predict(data_train_x[data_train_y == 2]) == data_train_y[data_train_y == 2]).mean()
test_acc_c2 = (clf.predict(data_test_x[data_test_y == 2]) == data_test_y[data_test_y == 2]).mean()
train_acc_c3 = (clf.predict(data_train_x[data_train_y == 3]) == data_train_y[data_train_y == 3]).mean()
test_acc_c3 = (clf.predict(data_test_x[data_test_y == 3]) == data_test_y[data_test_y == 3]).mean()
print(f"Train Acc:{train_acc:.4f}")
print(f"Test Acc:{test_acc:.4f}")
print(f"Class 1 Train Acc:{train_acc_c1:.4f}")
print(f"Class 1 Test Acc:{test_acc_c1:.4f}")
print(f"Class 2 Train Acc:{train_acc_c2:.4f}")
print(f"Class 2 Test Acc:{test_acc_c2:.4f}")
print(f"Class 3 Train Acc:{train_acc_c3:.4f}")
print(f"Class 3 Test Acc:{test_acc_c3:.4f}")
train_misclassification = (clf.predict(data_train_x) != data_train_y).sum()
print(f"Train Misclassifications:{train_misclassification}")
def forward_selection(data_train_x, data_train_y, data_cost):
clf = RandomForestClassifier(n_estimators=100, max_depth=9, random_state=550)
cur_features = []
cur_joint_cost = np.inf
cur_feature_cost = 0
done = False
acc_0 = (np.ones_like(data_train_y) * 3 == data_train_y).mean()
all_f_cost = data_cost.sum()
while not done:
new_feature_index = -1
for i in range(data_train_x.shape[1]):
if i in cur_features:
continue
used_features = []
current_cost = cur_feature_cost
if i == 20:
used_features.append(i-2)
used_features.append(i-1)
current_cost += data_cost[i-2]
current_cost += data_cost[i-1]
else:
current_cost += data_cost[i]
used_features.append(i)
all_features = cur_features + used_features
used_features_x = None
for u_feature in all_features:
if used_features_x is None:
used_features_x = np.expand_dims(data_train_x[:,u_feature], axis=1)
else:
used_features_x = np.append(used_features_x, np.expand_dims(data_train_x[:,u_feature], axis=1), axis=1)
clf.fit(used_features_x, data_train_y)
used_features_acc = (clf.predict(used_features_x) == data_train_y).mean()
used_features_acc_c1 = (clf.predict(used_features_x[data_train_y == 1]) == data_train_y[data_train_y == 1]).mean()
used_features_acc_c2 = (clf.predict(used_features_x[data_train_y == 2]) == data_train_y[data_train_y == 2]).mean()
used_features_acc_c3 = (clf.predict(used_features_x[data_train_y == 3]) == data_train_y[data_train_y == 3]).mean()
joint_cost = (acc_0 + (1 - acc_0) * (current_cost / all_f_cost)) / used_features_acc
if joint_cost < cur_joint_cost:
cur_joint_cost = joint_cost
new_feature_index = i
best_feature_cost = current_cost
best_features = used_features
best_acc = used_features_acc
best_acc_c1 = used_features_acc_c1
best_acc_c2 = used_features_acc_c2
best_acc_c3 = used_features_acc_c3
print(f"Features:{used_features},\tAcc:{used_features_acc:.2f}, Class1_Acc:{used_features_acc_c1:.4f}, "
f"Class2_Acc:{used_features_acc_c2:.4f}, Class3_Acc:{used_features_acc_c3:.4f}, "
f"Joint Cost:{joint_cost:.4f}, Feature Cost:{current_cost}")
if new_feature_index > -1:
cur_features = cur_features + best_features
cur_feature_cost = best_feature_cost
else:
done = True
print("Done")
print(f"Selected Features:{cur_features}, Acc:{best_acc:.4f}, Class1_Acc:{best_acc_c1:.4f}, Class2_Acc:{best_acc_c2:.4f}, "
f"Class3_Acc:{best_acc_c3:.4f}, Joint Cost:{cur_joint_cost:.4f}, Feature Cost:{cur_feature_cost}")
print("")
return cur_features
def evaluate_forward_selection(cur_features, data_train_x, data_test_x, data_train_y, data_test_y):
clf = RandomForestClassifier(n_estimators=100, max_depth=9, random_state=550)
used_features_x = None
for u_feature in cur_features:
if used_features_x is None:
used_features_x = np.expand_dims(data_train_x[:,u_feature], axis=1)
else:
used_features_x = np.append(used_features_x, np.expand_dims(data_train_x[:,u_feature], axis=1), axis=1)
used_features_test_x = None
for u_feature in cur_features:
if used_features_test_x is None:
used_features_test_x = np.expand_dims(data_test_x[:,u_feature], axis=1)
else:
used_features_test_x = np.append(used_features_test_x, np.expand_dims(data_test_x[:,u_feature], axis=1), axis=1)
clf.fit(used_features_x, data_train_y)
selected_features_train_acc = (clf.predict(used_features_x) == data_train_y).mean()
selected_features_train_acc_c1 = (clf.predict(used_features_x[data_train_y == 1]) == data_train_y[data_train_y == 1]).mean()
selected_features_train_acc_c2 = (clf.predict(used_features_x[data_train_y == 2]) == data_train_y[data_train_y == 2]).mean()
selected_features_train_acc_c3 = (clf.predict(used_features_x[data_train_y == 3]) == data_train_y[data_train_y == 3]).mean()
selected_features_test_acc = (clf.predict(used_features_test_x) == data_test_y).mean()
selected_features_test_acc_c1 = (clf.predict(used_features_test_x[data_test_y == 1]) == data_test_y[data_test_y == 1]).mean()
selected_features_test_acc_c2 = (clf.predict(used_features_test_x[data_test_y == 2]) == data_test_y[data_test_y == 2]).mean()
selected_features_test_acc_c3 = (clf.predict(used_features_test_x[data_test_y == 3]) == data_test_y[data_test_y == 3]).mean()
print(f"Selected Features:{cur_features}, "
f"Train Acc:{selected_features_train_acc:.4f}, "
f"Train Class1_Acc:{selected_features_train_acc_c1:.4f}, "
f"Train Class2_Acc:{selected_features_train_acc_c2:.4f}, "
f"Train Class3_Acc:{selected_features_train_acc_c3:.4f}, ")
print(f"Selected Features:{cur_features}, "
f"Test Acc:{selected_features_test_acc:.4f}, "
f"Test Class1_Acc:{selected_features_test_acc_c1:.4f}, "
f"Test Class2_Acc:{selected_features_test_acc_c2:.4f}, "
f"Test Class3_Acc:{selected_features_test_acc_c3:.4f}, ")
def calculate_fitness(population, data_train_x, data_train_y, data_cost):
clf = RandomForestClassifier(n_estimators=100, max_depth=9, random_state=550)
acc_0 = (np.ones_like(data_train_y) * 3 == data_train_y).mean()
all_f_cost = data_cost.sum()
fitness_values = []
for individual in population:
cur_features = []
cur_feature_cost = 0
for u_feature, is_used in enumerate(individual):
if is_used:
cur_features.append(u_feature)
cur_feature_cost += data_cost[u_feature]
if 18 in cur_features and 19 in cur_features:
cur_features.append(20)
used_features_x = None
for u_feature in cur_features:
if used_features_x is None:
used_features_x = np.expand_dims(data_train_x[:,u_feature], axis=1)
else:
used_features_x = np.append(used_features_x, np.expand_dims(data_train_x[:,u_feature], axis=1), axis=1)
clf.fit(used_features_x, data_train_y)
used_features_acc = (clf.predict(used_features_x) == data_train_y).mean()
joint_cost = (acc_0 + (1 - acc_0) * (cur_feature_cost / all_f_cost)) / used_features_acc
fitness_values.append(joint_cost)
return fitness_values
def selection(population, fitness_values, to_be_selected):
new_population = []
temp_fitness = fitness_values.copy()
for i in range(to_be_selected):
best_fit = np.argmin(temp_fitness)
temp_fitness[best_fit] = np.inf
new_population.append(population[best_fit])
return new_population
def cross_over(new_population, to_be_crossed_over):
children = None
for _ in range(to_be_crossed_over):
temp = np.arange(len(new_population))
first_parent_id = np.random.choice(temp)
temp = np.delete(temp, first_parent_id)
second_parent_id = np.random.choice(temp)
first_parent = new_population[first_parent_id]
second_parent = new_population[second_parent_id]
mask = np.random.randint(2, size=(20))
first_genes = np.logical_and(first_parent, mask == 0).astype(np.int32)
second_genes = np.logical_and(second_parent, mask == 1).astype(np.int32)
child = np.logical_or(first_genes, second_genes).astype(np.int32)
children = child if children is None else np.vstack((children, child))
first_genes = np.logical_and(first_parent, mask == 1).astype(np.int32)
second_genes = np.logical_and(second_parent, mask == 0).astype(np.int32)
child = np.logical_or(first_genes, second_genes).astype(np.int32)
children = child if children is None else np.vstack((children, child))
return children
def mutate(individuals_to_mutate):
mutants = None
nb_of_ones = np.random.randint(2) + 1
nb_of_zeros = 20 - nb_of_ones
mutation_mask = np.array([0] * nb_of_zeros + [1] * nb_of_ones)
np.random.shuffle(mutation_mask)
for ix, individual in enumerate(individuals_to_mutate):
individual_inverse = 1-individual
normal_genes = np.logical_and(individual, mutation_mask == 0).astype(np.int32)
mutant_genes = np.logical_and(individual_inverse, mutation_mask == 1).astype(np.int32)
mutant = np.logical_or(normal_genes, mutant_genes).astype(np.int32)
mutants = mutant if mutants is None else np.vstack((mutants, mutant))
return mutants
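For 0/1 genes, the mask logic in cross_over and mutate above reduces to a plain element-wise selection between the two parents; a stdlib-only sketch for illustration (names chosen here, not taken from the module):

```python
def uniform_crossover(first_parent, second_parent, mask):
    """Child A takes a gene from parent 1 where mask == 0 and from parent 2
    where mask == 1; child B does the opposite. For binary genes this matches
    the logical_and/logical_or construction used above."""
    child_a = [f if m == 0 else s for f, s, m in zip(first_parent, second_parent, mask)]
    child_b = [s if m == 0 else f for f, s, m in zip(first_parent, second_parent, mask)]
    return child_a, child_b

print(uniform_crossover([1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 1]))
# -> ([1, 0, 0, 1], [0, 1, 1, 0])
```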
def next_generation(population, fitness_values, to_be_selected, to_be_crossed_over, to_be_mutated):
new_population = selection(population, fitness_values, to_be_selected)
children = cross_over(new_population, to_be_crossed_over)
mutants = mutate(children[:to_be_mutated])
children[:to_be_mutated] = mutants
return np.vstack((new_population, children))
def eval_individual(individual, data_train_x, data_train_y, data_cost, train=True):
clf = RandomForestClassifier(n_estimators=100, max_depth=9, random_state=550)
acc_0 = (np.ones_like(data_train_y) * 3 == data_train_y).mean()
all_f_cost = data_cost.sum()
cur_features = []
cur_feature_cost = 0
for u_feature, is_used in enumerate(individual):
if is_used:
cur_features.append(u_feature)
cur_feature_cost += data_cost[u_feature]
if 18 in cur_features and 19 in cur_features:
cur_features.append(20)
used_features_x = None
for u_feature in cur_features:
if used_features_x is None:
used_features_x = np.expand_dims(data_train_x[:,u_feature], axis=1)
else:
used_features_x = np.append(used_features_x, np.expand_dims(data_train_x[:,u_feature], axis=1), axis=1)
clf.fit(used_features_x, data_train_y)
used_features_acc = (clf.predict(used_features_x) == data_train_y).mean()
used_features_acc_c1 = (clf.predict(used_features_x[data_train_y == 1]) == data_train_y[data_train_y == 1]).mean()
used_features_acc_c2 = (clf.predict(used_features_x[data_train_y == 2]) == data_train_y[data_train_y == 2]).mean()
used_features_acc_c3 = (clf.predict(used_features_x[data_train_y == 3]) == data_train_y[data_train_y == 3]).mean()
joint_cost = (acc_0 + (1 - acc_0) * (cur_feature_cost / all_f_cost)) / used_features_acc
print(f"Features:{cur_features},\tAcc:{used_features_acc:.4f}, Class1_Acc:{used_features_acc_c1:.4f}, "
f"Class2_Acc:{used_features_acc_c2:.4f}, Class3_Acc:{used_features_acc_c3:.4f}, "
f"Joint Cost:{joint_cost:.4f}, Feature Cost:{cur_feature_cost}")
def genetic_algorithm(data_train_x, data_train_y, data_cost):
p = 80
r = 0.8
m = 0.6
to_be_selected = round((1 - r) * p)
to_be_crossed_over = round(r * p / 2)
to_be_mutated = round(m * p)
initial_features = 5
final_individual = None
final_joint_cost = np.inf
done = False
for g in range(10):
if g == 0:
population = None
for _ in range(p):
row_p = np.array([0] * (20 - initial_features) + [1] * initial_features)
np.random.shuffle(row_p)
population = row_p if population is None else np.vstack((population, row_p))
fitness_values = calculate_fitness(population, data_train_x, data_train_y, data_cost)
else:
population = next_generation(population, fitness_values, to_be_selected, to_be_crossed_over, to_be_mutated)
fitness_values = calculate_fitness(population, data_train_x, data_train_y, data_cost)
best_fit_ix = np.argsort(fitness_values)[0]
best_fit_individual = population[best_fit_ix]
best_fit_cost = fitness_values[best_fit_ix]
print(f"Generation:{g}, Individual:{best_fit_individual}, Joint Cost:{best_fit_cost:.8f}")
eval_individual(best_fit_individual, data_train_x, data_train_y, data_cost)
if best_fit_cost < final_joint_cost:
final_individual = best_fit_individual
final_joint_cost = best_fit_cost
else:
done = True
if done:
break
return best_fit_individual
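The objective computed inline in several functions above can be factored into one helper; a sketch of the same formula, with argument names chosen here for clarity (baseline_acc corresponds to acc_0, the majority-class accuracy):

```python
def joint_cost(baseline_acc, feature_cost, total_feature_cost, model_acc):
    """Cost-sensitive objective minimised above (lower is better): the relative
    acquisition cost of the selected features, anchored at the majority-class
    baseline accuracy, divided by the accuracy those features achieve."""
    return (baseline_acc + (1 - baseline_acc) * (feature_cost / total_feature_cost)) / model_acc

# A free, perfectly accurate feature set bottoms out at the baseline accuracy:
print(joint_cost(0.5, 0, 1, 1.0))  # -> 0.5
```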
def evaulate_genetic_algorithm(best_fit_individual, data_train_x, data_train_y, data_test_x, data_test_y, data_cost):
clf = RandomForestClassifier(n_estimators=100, max_depth=9, random_state=550)
cur_features = []
cur_feature_cost = 0
for u_feature, is_used in enumerate(best_fit_individual):
if is_used:
cur_features.append(u_feature)
cur_feature_cost += data_cost[u_feature]
if 18 in cur_features and 19 in cur_features:
cur_features.append(20)
used_features_x = None
for u_feature in cur_features:
if used_features_x is None:
used_features_x = np.expand_dims(data_train_x[:,u_feature], axis=1)
else:
used_features_x = np.append(used_features_x, np.expand_dims(data_train_x[:,u_feature], axis=1), axis=1)
used_features_test_x = None
for u_feature in cur_features:
if used_features_test_x is None:
used_features_test_x = np.expand_dims(data_test_x[:,u_feature], axis=1)
else:
used_features_test_x = np.append(used_features_test_x, np.expand_dims(data_test_x[:,u_feature], axis=1), axis=1)
clf.fit(used_features_x, data_train_y)
selected_features_train_acc = (clf.predict(used_features_x) == data_train_y).mean()
selected_features_train_acc_c1 = (clf.predict(used_features_x[data_train_y == 1]) == data_train_y[data_train_y == 1]).mean()
selected_features_train_acc_c2 = (clf.predict(used_features_x[data_train_y == 2]) == data_train_y[data_train_y == 2]).mean()
selected_features_train_acc_c3 = (clf.predict(used_features_x[data_train_y == 3]) == data_train_y[data_train_y == 3]).mean()
selected_features_test_acc = (clf.predict(used_features_test_x) == data_test_y).mean()
selected_features_test_acc_c1 = (clf.predict(used_features_test_x[data_test_y == 1]) == data_test_y[data_test_y == 1]).mean()
selected_features_test_acc_c2 = (clf.predict(used_features_test_x[data_test_y == 2]) == data_test_y[data_test_y == 2]).mean()
selected_features_test_acc_c3 = (clf.predict(used_features_test_x[data_test_y == 3]) == data_test_y[data_test_y == 3]).mean()
print(f"Selected Features:{cur_features}, "
f"Train Acc:{selected_features_train_acc:.4f}, "
f"Train Class1_Acc:{selected_features_train_acc_c1:.4f}, "
f"Train Class2_Acc:{selected_features_train_acc_c2:.4f}, "
f"Train Class3_Acc:{selected_features_train_acc_c3:.4f}, ")
print(f"Selected Features:{cur_features}, "
f"Test Acc:{selected_features_test_acc:.4f}, "
f"Test Class1_Acc:{selected_features_test_acc_c1:.4f}, "
f"Test Class2_Acc:{selected_features_test_acc_c2:.4f}, "
f"Test Class3_Acc:{selected_features_test_acc_c3:.4f}, ")
| 47.597222 | 131 | 0.679428 | 2,575 | 17,135 | 4.099806 | 0.062136 | 0.086104 | 0.070096 | 0.039595 | 0.757602 | 0.708629 | 0.688358 | 0.677465 | 0.635503 | 0.631524 | 0 | 0.024567 | 0.211322 | 17,135 | 359 | 132 | 47.729805 | 0.756623 | 0 | 0 | 0.50519 | 0 | 0.00346 | 0.117946 | 0.09396 | 0 | 0 | 0 | 0 | 0 | 1 | 0.038062 | false | 0 | 0.00692 | 0 | 0.069204 | 0.069204 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
90af80382960fcf99a987461ba8a08469e02e26d | 1,182 | py | Python | flexget/plugins/input/generate.py | tvcsantos/Flexget | e08ce2957dd4f0668911d1e56347369939e4d0a5 | [
"MIT"
] | null | null | null | flexget/plugins/input/generate.py | tvcsantos/Flexget | e08ce2957dd4f0668911d1e56347369939e4d0a5 | [
"MIT"
] | 1 | 2018-06-09T18:03:35.000Z | 2018-06-09T18:03:35.000Z | flexget/plugins/input/generate.py | tvcsantos/Flexget | e08ce2957dd4f0668911d1e56347369939e4d0a5 | [
"MIT"
] | null | null | null | from __future__ import unicode_literals, division, absolute_import
import logging
from flexget import plugin
from flexget.event import event
from flexget.entry import Entry
from flexget import validator
log = logging.getLogger(__name__.rsplit('.')[-1])
class Generate(object):
"""Generates n number of random entries. Used for debugging purposes."""
schema = {'type': 'integer'}
def on_task_input(self, task, config):
amount = config or 0 # hackily makes sure it's an int value
entries = []
        import random
        import string
        for i in range(amount):
            entry = Entry()
entry['url'] = 'http://localhost/generate/%s/%s' % (i, ''.join([random.choice(string.letters + string.digits) for x in range(1, 30)]))
entry['title'] = ''.join([random.choice(string.letters + string.digits) for x in range(1, 30)])
entry['description'] = ''.join([random.choice(string.letters + string.digits) for x in range(1, 1000)])
entries.append(entry)
return entries
@event('plugin.register')
def register_plugin():
plugin.register(Generate, 'generate', api_ver=2, debug=True)
| 34.764706 | 146 | 0.65313 | 152 | 1,182 | 4.986842 | 0.493421 | 0.058047 | 0.063325 | 0.087071 | 0.228232 | 0.228232 | 0.228232 | 0.228232 | 0.228232 | 0.228232 | 0 | 0.015152 | 0.218274 | 1,182 | 33 | 147 | 35.818182 | 0.805195 | 0.087986 | 0 | 0 | 0 | 0 | 0.079291 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0 | 0.333333 | 0 | 0.541667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
90d1072260263bedc9d86a9800070ef0ba79bdce | 8,002 | py | Python | src/old_2/theatre_stats.py | bryanabsmith/theatre | 4daf56d545b256aa9d609524dedaa591a2fb9b4b | [
"MIT"
] | 1 | 2015-12-19T16:48:32.000Z | 2015-12-19T16:48:32.000Z | src/old_2/theatre_stats.py | bryanabsmith/ssp | 4daf56d545b256aa9d609524dedaa591a2fb9b4b | [
"MIT"
] | null | null | null | src/old_2/theatre_stats.py | bryanabsmith/ssp | 4daf56d545b256aa9d609524dedaa591a2fb9b4b | [
"MIT"
] | null | null | null | #!/usr/bin/python
"""
theatre statistics reporter.
"""
import anydbm
import ConfigParser
import sys
import time
"""
This is the stats module which helps to generate
and report back stats about the server.
"""
class THEATREStats(object):
"""
THEATREStats module - the "workhorse" module.
"""
config = ConfigParser.RawConfigParser(allow_no_value=True)
"""
Statistics class for THEATRE - init.
"""
def __init__(self):
self.config.read("theatre.config")
#stats_db_location = self.config.get("stats", "location")
try:
stats_db = anydbm.open("%s/theatre_stats.db" %
self.config.get("stats", "location"), "c")
except IOError:
            print(" :: It would appear as though the [stats] -> "
                  "location configuration option is set to a location "
                  "that isn't a valid directory. Please set it to a "
                  "valid directory and re-execute theatre_stats.")
sys.exit(0)
#show_daily = self.config.get("stats", "show_daily_requests")
#visualize_bars = ""
date_time = {
"notformatted": time.strftime("%H-%M-%S_%d-%m-%Y"),
"formatted": time.strftime("%d/%m/%Y, %H:%M:%S")
}
try:
#option = sys.argv[1]
if sys.argv[1] == "export_csv":
output = "key, value\n"
for keys in sorted(stats_db.keys()):
output += "%s, %s\n" % (keys, stats_db[keys])
                output_csv = open("%s/theatre_csv_%s.csv" %
                                  (self.config.get("stats", "output_csv"),
                                   date_time["notformatted"]), "w")
                output_csv.write(output)
                output_csv.close()
print(" :: Statistics exported to %s/theatre_csv_%s.csv" %
(self.config.get("stats", "output_csv"), date_time["notformatted"]))
elif sys.argv[1] == "export_html":
totals = {
"browsers": 0,
"oses": 0,
"requests": 0
}
counts = {
"browsers": [],
"browsers_value": [],
"oses": [],
"oses_value": [],
"requests": [],
"requests_value": []
}
for keys in stats_db.keys():
if keys[:7] == "browser":
totals["browsers"] += int(stats_db[keys])
elif keys[:2] == "os":
totals["oses"] += int(stats_db[keys])
elif keys == "requests":
totals["requests"] = stats_db["requests"]
for keys in sorted(stats_db.keys()):
if keys[:7] == "browser":
counts["browsers"].append(keys[8:].replace("_", " ") +
" (%s, %s%%)" %
(stats_db[keys],
str(round((float(
stats_db[keys])/totals["browsers"])
*100, 2))))
counts["browsers_value"].append(int(stats_db[keys]))
elif keys[:2] == "os":
counts["oses"].append(keys[3:].replace("_", " ") +
" (%s, %s%%)" %
(stats_db[keys],
str(round((float(stats_db[keys])/totals["oses"])
*100, 2))))
counts["oses_value"].append(int(stats_db[keys]))
elif keys[:9] == "requests_":
counts["requests"].append(keys[9:].replace("_", "/"))
counts["requests_value"].append(int(stats_db[keys]))
#requests_max = max(requests_value) + 2
stats_html = """
<html>
<head>
<link rel='stylesheet' href='http://cdn.jsdelivr.net/chartist.js/latest/chartist.min.css'>
<link href='https://fonts.googleapis.com/css?family=Raleway' rel='stylesheet' type='text/css'>
<script src='http://cdn.jsdelivr.net/chartist.js/latest/chartist.min.js'></script>
<style>body {background-color: #F2F2F0; font-family: 'Raleway', sans-serif; margin: 5%%;}</style>
</head>
<body>
<h1>theatre Statistics (%s)</h1>
<h3>Browser</h3><div id='chartBrowser' class='ct-chart ct-perfect-fourth'></div><p></p>
<h3>Operating Systems</h3><div id='chartOS' class='ct-chart ct-perfect-fourth'></div><p></p>
<h3>Requests (%s total)</h3><div id='chartRequests' class='ct-chart ct-perfect-fourth'></div>
<script>
new Chartist.Pie('#chartBrowser', {labels: %s, series: %s}, {donut: true, donutWidth: 50, startAngle: 0, total: 0, showLabel: true, chartPadding: 100, labelOffset: 50, labelDirection: 'explode'});
new Chartist.Pie('#chartOS', {labels: %s, series: %s}, {donut: true, donutWidth: 50, startAngle: 0, total: 0, showLabel: true, chartPadding: 100, labelOffset: 50, labelDirection: 'explode'});
new Chartist.Line('#chartRequests', {labels: %s, series: [%s]}, {high: %i, low: 0, showArea: true});
</script>
</body>
</html>
""" % (date_time["formatted"],
totals["requests"],
counts["browsers"],
counts["browsers_value"],
counts["oses"],
counts["oses_value"],
counts["requests"],
counts["requests_value"],
int(max(counts["requests_value"]) + 2))
f_output = open("%s/theatre_html_%s.html" %
(self.config.get("stats", "output_html"),
date_time["notformatted"]), "w")
f_output.writelines(stats_html)
f_output.close()
print(" :: Statistics exported to %s/theatre_html_%s.html" %
(self.config.get("stats", "output_html"), date_time["notformatted"]))
else:
print(" :: Invalid option. Possible options:\n "
":: export_csv - Export the keys and values to a csv file.\n "
":: export_html - Export the keys and values to a html file.")
except IndexError:
try:
for keys in sorted(stats_db.keys()):
if self.config.get("stats", "show_daily_requests") == "False":
if keys[:8] == "requests":
pass
else:
                            print(" :: %s=%s" % (keys, stats_db[keys]))
                    else:
                        print(" :: %s=%s" % (keys, stats_db[keys]))
            except KeyError:
                print(" :: No data.")
@staticmethod
def get_server_version():
"""
Return server version.
"""
import theatre
return theatre.__theatre_version__
@staticmethod
def get_server_platform():
"""
Return server platform.
"""
import theatre
return theatre.__plat__
if __name__ == "__main__":
THEATRE_STATS = THEATREStats()
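The `export_csv` branch builds a `key, value` header plus one row per sorted key. A minimal sketch of that row-building step, using a plain dict in place of the anydbm database (the helper name and sample keys are illustrative):

```python
def stats_to_csv(stats_db):
    """Render a key/value mapping as CSV: header first, then sorted keys."""
    output = "key, value\n"
    for key in sorted(stats_db.keys()):
        output += "%s, %s\n" % (key, stats_db[key])
    return output

csv_text = stats_to_csv({"requests": 85, "browser_Firefox": 12})
```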
| 44.955056 | 222 | 0.44164 | 735 | 8,002 | 4.668027 | 0.278912 | 0.040804 | 0.051297 | 0.04197 | 0.389391 | 0.374235 | 0.360536 | 0.280968 | 0.222093 | 0.195861 | 0 | 0.012311 | 0.421395 | 8,002 | 177 | 223 | 45.20904 | 0.728726 | 0.029118 | 0 | 0.186567 | 1 | 0.074627 | 0.385208 | 0.035289 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.007463 | 0.044776 | null | null | 0.052239 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
90e49744ac5303efcf5fac0b9dd331d4295c50a6 | 312 | py | Python | private_transformers/privacy_utils/misc.py | JunyiZhu-AI/private-transformers | 9d70a570a56e6e19d8a8cb1f8e3b3ffd809a04ef | [
"Apache-2.0"
] | 29 | 2021-10-24T00:43:29.000Z | 2022-03-25T02:31:21.000Z | private_transformers/privacy_utils/misc.py | JunyiZhu-AI/private-transformers | 9d70a570a56e6e19d8a8cb1f8e3b3ffd809a04ef | [
"Apache-2.0"
] | 8 | 2021-10-30T05:57:31.000Z | 2022-03-30T16:22:46.000Z | private_transformers/privacy_utils/misc.py | JunyiZhu-AI/private-transformers | 9d70a570a56e6e19d8a8cb1f8e3b3ffd809a04ef | [
"Apache-2.0"
] | 4 | 2021-11-03T04:40:37.000Z | 2022-03-04T00:26:15.000Z | """Miscellaneous helpers."""
import warnings
def handle_unused_kwargs(unused_kwargs, msg=None):
if len(unused_kwargs) > 0:
if msg is not None:
warnings.warn(f"{msg}: Unexpected arguments {unused_kwargs}")
else:
warnings.warn(f"Unexpected arguments {unused_kwargs}")
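A quick usage sketch of `handle_unused_kwargs`, repeated here so the snippet is self-contained, with the warning captured for inspection (the `my_optimizer` label and `foo` argument are made up for illustration):

```python
import warnings

def handle_unused_kwargs(unused_kwargs, msg=None):
    # Emit a warning when a caller passed keyword arguments nobody consumed.
    if len(unused_kwargs) > 0:
        if msg is not None:
            warnings.warn(f"{msg}: Unexpected arguments {unused_kwargs}")
        else:
            warnings.warn(f"Unexpected arguments {unused_kwargs}")

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    handle_unused_kwargs({'foo': 1}, msg="my_optimizer")
```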
| 28.363636 | 73 | 0.663462 | 38 | 312 | 5.289474 | 0.526316 | 0.298507 | 0.129353 | 0.308458 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004132 | 0.224359 | 312 | 10 | 74 | 31.2 | 0.826446 | 0.070513 | 0 | 0 | 0 | 0 | 0.278169 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.142857 | 0 | 0.285714 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
90f0ae5392d1472cf0101ef4c2178959aa451182 | 300 | py | Python | algoexpert.io/python/Find_Loop.py | XSoyOscar/Algorithms | 6e1626d4b0f7804494f0a651698966ad6fd0fe18 | [
"MIT"
] | 80 | 2020-07-02T20:47:21.000Z | 2022-03-22T06:52:59.000Z | algoexpert.io/python/Find_Loop.py | XSoyOscar/Algorithms | 6e1626d4b0f7804494f0a651698966ad6fd0fe18 | [
"MIT"
] | 1 | 2020-10-05T19:22:10.000Z | 2020-10-05T19:22:10.000Z | algoexpert.io/python/Find_Loop.py | XSoyOscar/Algorithms | 6e1626d4b0f7804494f0a651698966ad6fd0fe18 | [
"MIT"
] | 73 | 2020-04-09T22:28:01.000Z | 2022-02-26T19:22:25.000Z |
# O(n) time | O(1) space
def findLoop(head):
first = head.next
second = head.next.next
while first != second:
first = first.next
second = second.next.next
first = head
while first != second:
first = first.next
second = second.next
return first | 23.076923 | 33 | 0.583333 | 39 | 300 | 4.487179 | 0.333333 | 0.171429 | 0.182857 | 0.24 | 0.525714 | 0.525714 | 0.525714 | 0.525714 | 0.525714 | 0 | 0 | 0.004878 | 0.316667 | 300 | 13 | 34 | 23.076923 | 0.84878 | 0.073333 | 0 | 0.363636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0 | 0 | 0.181818 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
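The two-phase pointer dance above is Floyd's cycle detection: once the 1x and 2x pointers meet inside the loop, resetting one to the head and advancing both one step at a time converges on the loop's entry node. A self-contained sketch (the `Node` class is an assumption, since the original file does not show it):

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

def find_loop(head):
    # Phase 1: advance pointers at 1x and 2x speed until they meet in the loop.
    first = head.next
    second = head.next.next
    while first is not second:
        first = first.next
        second = second.next.next
    # Phase 2: restart one pointer at the head; they meet at the loop's origin.
    first = head
    while first is not second:
        first = first.next
        second = second.next
    return first

# Build 0 -> 1 -> 2 -> 3 -> 4 -> 2 (the loop starts at node 2).
nodes = [Node(i) for i in range(5)]
for a, b in zip(nodes, nodes[1:]):
    a.next = b
nodes[4].next = nodes[2]
```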
90fadbb54dedcf4a21fdefea57abf73240f02de7 | 499 | py | Python | src/pycropml/model.py | sielenk-yara/PyCrop2ML | dfcab79061fa71d4343120573b50b6232812999e | [
"MIT"
] | null | null | null | src/pycropml/model.py | sielenk-yara/PyCrop2ML | dfcab79061fa71d4343120573b50b6232812999e | [
"MIT"
] | null | null | null | src/pycropml/model.py | sielenk-yara/PyCrop2ML | dfcab79061fa71d4343120573b50b6232812999e | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on Tue Jun 4 22:10:54 2019
@author: midingoy
"""
import os
path = os.path.dirname(os.path.realpath(__file__))
sbmlFilePath = os.path.join(path, 'MODEL1204190002.xml')
with open(sbmlFilePath,'r') as f:
sbmlString = f.read()
def module_exists(module_name):
try:
__import__(module_name)
except ImportError:
return False
else:
return True
if module_exists('pycrop2ml'):
import pycrop2ml
from pycropml import model
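The `module_exists` guard is a common optional-dependency pattern: attempt the import, swallow `ImportError`, and report a boolean. A standalone check of that behavior (the bogus module name is arbitrary):

```python
def module_exists(module_name):
    try:
        __import__(module_name)
    except ImportError:
        return False
    else:
        return True

# Standard-library modules import fine; a bogus name does not.
has_os = module_exists('os')
has_bogus = module_exists('surely_not_a_real_module_name')
```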
| 17.206897 | 56 | 0.671343 | 66 | 499 | 4.893939 | 0.712121 | 0.074303 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.060759 | 0.208417 | 499 | 28 | 57 | 17.821429 | 0.756962 | 0.154309 | 0 | 0 | 0 | 0 | 0.070048 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0.333333 | 0 | 0.533333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
290e609e00e6cfaee87e2b902f545bd06450fe8d | 399 | py | Python | machina/apps/forum_tracking/registry_config.py | OneRainbowDev/django-machina | 7354cc50f58dcbe49eecce7e1f019f6fff21d690 | [
"BSD-3-Clause"
] | 1 | 2021-10-08T03:31:24.000Z | 2021-10-08T03:31:24.000Z | machina/apps/forum_tracking/registry_config.py | OneRainbowDev/django-machina | 7354cc50f58dcbe49eecce7e1f019f6fff21d690 | [
"BSD-3-Clause"
] | 7 | 2020-02-12T01:11:13.000Z | 2022-03-11T23:26:32.000Z | machina/apps/forum_tracking/registry_config.py | OneRainbowDev/django-machina | 7354cc50f58dcbe49eecce7e1f019f6fff21d690 | [
"BSD-3-Clause"
] | 1 | 2019-04-20T05:26:27.000Z | 2019-04-20T05:26:27.000Z | # -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.apps import AppConfig
from django.utils.translation import ugettext_lazy as _
class TrackingRegistryConfig(AppConfig):
label = 'forum_tracking'
name = 'machina.apps.forum_tracking'
verbose_name = _('Machina: Forum tracking')
def ready(self): # pragma: no cover
from . import receivers # noqa
| 24.9375 | 55 | 0.721805 | 47 | 399 | 5.893617 | 0.680851 | 0.140794 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003086 | 0.18797 | 399 | 15 | 56 | 26.6 | 0.851852 | 0.107769 | 0 | 0 | 0 | 0 | 0.181818 | 0.076705 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.444444 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
29154b4e535f27d46e6504086dad1a8ac5b22afd | 7,811 | py | Python | ixnetwork_restpy/testplatform/sessions/ixnetwork/vport/protocols/triggeredtraceroutelearnedinfo_60610bdf03c18bf6050144c753fe5797.py | rfrye-github/ixnetwork_restpy | 23eeb24b21568a23d3f31bbd72814ff55eb1af44 | [
"MIT"
] | null | null | null | ixnetwork_restpy/testplatform/sessions/ixnetwork/vport/protocols/triggeredtraceroutelearnedinfo_60610bdf03c18bf6050144c753fe5797.py | rfrye-github/ixnetwork_restpy | 23eeb24b21568a23d3f31bbd72814ff55eb1af44 | [
"MIT"
] | null | null | null | ixnetwork_restpy/testplatform/sessions/ixnetwork/vport/protocols/triggeredtraceroutelearnedinfo_60610bdf03c18bf6050144c753fe5797.py | rfrye-github/ixnetwork_restpy | 23eeb24b21568a23d3f31bbd72814ff55eb1af44 | [
"MIT"
] | null | null | null | # MIT LICENSE
#
# Copyright 1997 - 2020 by IXIA Keysight
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"),
# to deal in the Software without restriction, including without limitation
# the rights to use, copy, modify, merge, publish, distribute, sublicense,
# and/or sell copies of the Software, and to permit persons to whom the
# Software is furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
from ixnetwork_restpy.base import Base
from ixnetwork_restpy.files import Files
class TriggeredTracerouteLearnedInfo(Base):
"""This object holds the attributes for triggered trace route learned information.
The TriggeredTracerouteLearnedInfo class encapsulates a list of triggeredTracerouteLearnedInfo resources that are managed by the system.
A list of resources can be retrieved from the server using the TriggeredTracerouteLearnedInfo.find() method.
"""
__slots__ = ()
_SDM_NAME = 'triggeredTracerouteLearnedInfo'
_SDM_ATT_MAP = {
'Fec': 'fec',
'Hops': 'hops',
'IncomingLabelStack': 'incomingLabelStack',
'NumberOfReplyingHops': 'numberOfReplyingHops',
'OutgoingLabelStack': 'outgoingLabelStack',
'Reachability': 'reachability',
'SenderHandle': 'senderHandle',
}
def __init__(self, parent):
super(TriggeredTracerouteLearnedInfo, self).__init__(parent)
@property
def Hops(self):
"""
Returns
-------
- obj(ixnetwork_restpy.testplatform.sessions.ixnetwork.vport.protocols.hops_afd65edf9d4aac0ccdf5a3a2bd672a47.Hops): An instance of the Hops class
Raises
------
- ServerError: The server has encountered an uncategorized error condition
"""
from ixnetwork_restpy.testplatform.sessions.ixnetwork.vport.protocols.hops_afd65edf9d4aac0ccdf5a3a2bd672a47 import Hops
return Hops(self)
@property
def Fec(self):
"""
Returns
-------
- str: This signifies the FEC component.
"""
return self._get_attribute(self._SDM_ATT_MAP['Fec'])
@property
def Hops(self):
"""
Returns
-------
- str: This signifies the number of LSP hops.
"""
return self._get_attribute(self._SDM_ATT_MAP['Hops'])
@property
def IncomingLabelStack(self):
"""
Returns
-------
- str: This signifies the information is sent to the MPLS-OAM module which is used for validation of FEC stack received in an echo request. This is the assigned labels stack by the Ixia router and bfd/ping messages are expected to be received from DUT with this stack values. The outer value corresponds to the PSN Tunnel Label and the inner value corresponds to the PW label.
"""
return self._get_attribute(self._SDM_ATT_MAP['IncomingLabelStack'])
@property
def NumberOfReplyingHops(self):
"""
Returns
-------
- number: This signifies the total number of replying LSP hops.
"""
return self._get_attribute(self._SDM_ATT_MAP['NumberOfReplyingHops'])
@property
def OutgoingLabelStack(self):
"""
Returns
-------
- str: This signifies the information is sent to the MPLS-OAM module which is used for validation of FEC outgoing Label stack that is received in an echo request.
"""
return self._get_attribute(self._SDM_ATT_MAP['OutgoingLabelStack'])
@property
def Reachability(self):
"""
Returns
-------
- str: This signifies whether the LSP is reachable with a proper return code or not. If the return code is not set to 3, in the received reply message or if there is no reply message that is received, then the field will show unreachable.
"""
return self._get_attribute(self._SDM_ATT_MAP['Reachability'])
@property
def SenderHandle(self):
"""
Returns
-------
- number: This signifies the sender handle details.
"""
return self._get_attribute(self._SDM_ATT_MAP['SenderHandle'])
def find(self, Fec=None, Hops=None, IncomingLabelStack=None, NumberOfReplyingHops=None, OutgoingLabelStack=None, Reachability=None, SenderHandle=None):
"""Finds and retrieves triggeredTracerouteLearnedInfo resources from the server.
All named parameters are evaluated on the server using regex. The named parameters can be used to selectively retrieve triggeredTracerouteLearnedInfo resources from the server.
To retrieve an exact match ensure the parameter value starts with ^ and ends with $
By default the find method takes no parameters and will retrieve all triggeredTracerouteLearnedInfo resources from the server.
Args
----
- Fec (str): This signifies the FEC component.
- Hops (str): This signifies the number of LSP hops.
- IncomingLabelStack (str): This signifies the information is sent to the MPLS-OAM module which is used for validation of FEC stack received in an echo request. This is the assigned labels stack by the Ixia router and bfd/ping messages are expected to be received from DUT with this stack values. The outer value corresponds to the PSN Tunnel Label and the inner value corresponds to the PW label.
- NumberOfReplyingHops (number): This signifies the total number of replying LSP hops.
- OutgoingLabelStack (str): This signifies the information is sent to the MPLS-OAM module which is used for validation of FEC outgoing Label stack that is received in an echo request.
- Reachability (str): This signifies whether the LSP is reachable with a proper return code or not. If the return code is not set to 3, in the received reply message or if there is no reply message that is received, then the field will show unreachable.
- SenderHandle (number): This signifies the sender handle details.
Returns
-------
- self: This instance with matching triggeredTracerouteLearnedInfo resources retrieved from the server available through an iterator or index
Raises
------
- ServerError: The server has encountered an uncategorized error condition
"""
return self._select(self._map_locals(self._SDM_ATT_MAP, locals()))
def read(self, href):
"""Retrieves a single instance of triggeredTracerouteLearnedInfo data from the server.
Args
----
- href (str): An href to the instance to be retrieved
Returns
-------
- self: This instance with the triggeredTracerouteLearnedInfo resources from the server available through an iterator or index
Raises
------
- NotFoundError: The requested resource does not exist on the server
- ServerError: The server has encountered an uncategorized error condition
"""
return self._read(href)
| 46.494048 | 406 | 0.681603 | 941 | 7,811 | 5.588735 | 0.264612 | 0.034607 | 0.036509 | 0.028903 | 0.507891 | 0.450656 | 0.4305 | 0.414908 | 0.370032 | 0.334284 | 0 | 0.006155 | 0.251184 | 7,811 | 167 | 407 | 46.772455 | 0.892973 | 0.641787 | 0 | 0.222222 | 0 | 0 | 0.145282 | 0.014978 | 0 | 0 | 0 | 0 | 0 | 1 | 0.244444 | false | 0 | 0.066667 | 0 | 0.622222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
29185856763e1bfd5209b8a437a2138073fa9716 | 1,779 | py | Python | hrflow/hrflow/profile/__init__.py | Riminder/python-hrflow-api | 5457c138c12689a1af08b243c15f3cbe898edf65 | [
"MIT"
] | 4 | 2020-04-01T15:16:04.000Z | 2021-01-18T03:52:39.000Z | hrflow/hrflow/profile/__init__.py | Riminder/python-hrflow-api | 5457c138c12689a1af08b243c15f3cbe898edf65 | [
"MIT"
] | null | null | null | hrflow/hrflow/profile/__init__.py | Riminder/python-hrflow-api | 5457c138c12689a1af08b243c15f3cbe898edf65 | [
"MIT"
] | null | null | null | """Profile related calls."""
from .attachment import ProfileAttachments
from .parsing import ProfileParsing
from .indexing import ProfileIndexing
from .revealing import ProfileRevealing
from .embedding import ProfileEmbedding
from .searching import ProfileSearching
from .scoring import ProfileScoring
from .reasoning import ProfileReasoning
class Profile(object):
"""
Class that interacts with hrflow API profiles endpoint.
Usage example:
>>> from hrflow.hrflow import hrflow
>>> from hrflow import Profile
>>> client = client(api_key="YOUR_API_KEY")
>>> profile = Profile(self.client)
>>> result = profile.get_profiles(source_ids=["5823bc959983f7a5925a5356020e60d605e8c9b5"])
>>> print(result)
{
"code": 200,
"message": "OK",
"data": {
"page": 1,
"maxPage": 3,
"count_profiles": 85,
"profiles": [
{
"profile_id": "215de6cb5099f4895149ec0a6ac91be94ffdd246",
"profile_reference": "49583",
...
"""
def __init__(self, client):
"""
Initialize Profile object with hrflow client.
Args:
client: hrflow client instance <hrflow object>
Returns
Profile instance object.
"""
self.client = client
self.attachment = ProfileAttachments(self.client)
self.parsing = ProfileParsing(self.client)
self.indexing = ProfileIndexing(self.client)
self.embedding = ProfileEmbedding(self.client)
self.revealing = ProfileRevealing(self.client)
self.scoring = ProfileScoring(self.client)
self.searching = ProfileSearching(self.client)
self.reasoning = ProfileReasoning(self.client)
| 30.672414 | 94 | 0.641372 | 160 | 1,779 | 7.05625 | 0.39375 | 0.097431 | 0.086802 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.050305 | 0.262507 | 1,779 | 57 | 95 | 31.210526 | 0.810213 | 0.437324 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.052632 | false | 0 | 0.421053 | 0 | 0.526316 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
291b2893cd65284844b44b6f8aedf27f7e0f86c8 | 2,543 | py | Python | day5/day5.py | martinpeck/advent-of-code-2019 | 12e80432b17a6ff695a38316245268f6c70d07d5 | [
"MIT"
] | null | null | null | day5/day5.py | martinpeck/advent-of-code-2019 | 12e80432b17a6ff695a38316245268f6c70d07d5 | [
"MIT"
] | null | null | null | day5/day5.py | martinpeck/advent-of-code-2019 | 12e80432b17a6ff695a38316245268f6c70d07d5 | [
"MIT"
] | null | null | null | from itertools import product
def load_instructions(filename):
with open(filename, "r") as file:
line = file.readline()
return parse_instructions(line)
def parse_instructions(instructions_as_string):
return list(map(int, instructions_as_string.split(",")))
def get_next_instruction(instructions, instruction_pointer):
instruction = instructions[instruction_pointer]
return instruction
def run_program(instructions, input_value):
    instruction_pointer = 0
    output_value = None
    while True:
instruction = get_next_instruction(instructions, instruction_pointer)
if instruction == 99:
break
elif instruction == 1:
instruction_pointer += 1
location = instructions[instruction_pointer]
first_number = instructions[location]
instruction_pointer += 1
location = instructions[instruction_pointer]
second_number = instructions[location]
instruction_pointer += 1
location = instructions[instruction_pointer]
instructions[location] = first_number + second_number
instruction_pointer += 1
elif instruction == 2:
instruction_pointer += 1
location = instructions[instruction_pointer]
first_number = instructions[location]
instruction_pointer += 1
location = instructions[instruction_pointer]
second_number = instructions[location]
instruction_pointer += 1
location = instructions[instruction_pointer]
instructions[location] = first_number * second_number
instruction_pointer += 1
        elif instruction == 3:
            instruction_pointer += 1
            location = instructions[instruction_pointer]
            instructions[location] = input_value
            instruction_pointer += 1
        elif instruction == 4:
            instruction_pointer += 1
            location = instructions[instruction_pointer]
            output_value = instructions[location]
            instruction_pointer += 1
else:
raise Exception(f"Unexpected OpCode [{instruction}]")
return output_value
def solve_part1():
input_value = 1
instructions = load_instructions(filename="input.txt")
result = run_program(instructions, input_value)
    return result
def solve_part2():
pass
def solve_all():
print(f"*** {__file__} ***")
print(f"Solution to part 1: {solve_part1()}")
print(f"Solution to part 2: {solve_part2()}")
if __name__ == "__main__":
solve_all()
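For a sanity check on the opcode-1/2 handling, here is a self-contained mini interpreter covering only opcodes 1 (add), 2 (multiply), and 99 (halt) in position mode, as the `run_program` function above assumes:

```python
def run_intcode(program):
    """Tiny position-mode Intcode interpreter: opcodes 1 (add), 2 (mul), 99 (halt)."""
    mem = list(program)
    ip = 0
    while mem[ip] != 99:
        op, a, b, dest = mem[ip:ip + 4]
        if op == 1:
            mem[dest] = mem[a] + mem[b]
        elif op == 2:
            mem[dest] = mem[a] * mem[b]
        else:
            raise ValueError("Unexpected opcode %d" % op)
        ip += 4
    return mem

result = run_intcode([1, 0, 0, 0, 99])  # mem[0] becomes mem[0] + mem[0] = 2
```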
| 26.489583 | 77 | 0.64884 | 242 | 2,543 | 6.53719 | 0.268595 | 0.250316 | 0.208597 | 0.136536 | 0.604298 | 0.538559 | 0.477876 | 0.441846 | 0.441846 | 0.393173 | 0 | 0.013514 | 0.272513 | 2,543 | 95 | 78 | 26.768421 | 0.841622 | 0 | 0 | 0.354839 | 0 | 0 | 0.055053 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.112903 | false | 0.016129 | 0.016129 | 0.016129 | 0.209677 | 0.048387 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
292a15ea81c8a34f758d90779be06afccbb5d66a | 615 | py | Python | smarthome/things.py | andvikt/mqtt_decorator | 36bf76d3d6fa548172db32cdf002d68c96c7a535 | [
"BSD-2-Clause"
] | null | null | null | smarthome/things.py | andvikt/mqtt_decorator | 36bf76d3d6fa548172db32cdf002d68c96c7a535 | [
"BSD-2-Clause"
] | null | null | null | smarthome/things.py | andvikt/mqtt_decorator | 36bf76d3d6fa548172db32cdf002d68c96c7a535 | [
"BSD-2-Clause"
] | null | null | null | from .core import state
from .utils.converters import str_to_bool
from .thing import Thing
from .state import State
import attr
class Switch(Thing):
root = 'switch'
is_on: State = state(False, converter=str_to_bool)
class Dimmer(Thing):
root = 'dimmer'
dim_level: State = state(0, int)
class Number(Thing):
root = 'number'
value: State = state(0, int)
class String(Thing):
root = 'string'
    value: State = state('', str)
class Button(Thing):
root = 'button'
value: State = state(0, int)
class Temperature(Thing):
root = 'temp'
value: State = state(0, float)
| 17.083333 | 54 | 0.658537 | 86 | 615 | 4.639535 | 0.348837 | 0.135338 | 0.137845 | 0.160401 | 0.16792 | 0.120301 | 0 | 0 | 0 | 0 | 0 | 0.010482 | 0.22439 | 615 | 35 | 55 | 17.571429 | 0.825996 | 0 | 0 | 0.086957 | 0 | 0 | 0.055285 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.217391 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
29307ed2646e5f315e990593182cc80a240a6151 | 254 | py | Python | configs/diseased/resnet50_diseased.py | jiangwenj02/mmclassification | 4c3657c16f370ace9013b160aa054c87fd27a055 | [
"Apache-2.0"
] | null | null | null | configs/diseased/resnet50_diseased.py | jiangwenj02/mmclassification | 4c3657c16f370ace9013b160aa054c87fd27a055 | [
"Apache-2.0"
] | null | null | null | configs/diseased/resnet50_diseased.py | jiangwenj02/mmclassification | 4c3657c16f370ace9013b160aa054c87fd27a055 | [
"Apache-2.0"
] | null | null | null | _base_ = [
'../_base_/models/resnest50.py', '../_base_/datasets/diseased_bs32_pil_resize.py',
'../_base_/schedules/imagenet_bs256_coslr.py', '../_base_/default_runtime.py'
]
model = dict(
head=dict(
num_classes=2,
topk=(1,))
) | 28.222222 | 86 | 0.641732 | 31 | 254 | 4.709677 | 0.741935 | 0.123288 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.042453 | 0.165354 | 254 | 9 | 87 | 28.222222 | 0.646226 | 0 | 0 | 0 | 0 | 0 | 0.572549 | 0.572549 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
293ce9ba28add674ea78b24b687d099771268258 | 510 | py | Python | tests/web/login.py | happz/settlers | 961a6d2121ab6e89106f17017f026c60c77f16f9 | [
"MIT"
] | 1 | 2018-11-16T09:41:31.000Z | 2018-11-16T09:41:31.000Z | tests/web/login.py | happz/settlers | 961a6d2121ab6e89106f17017f026c60c77f16f9 | [
"MIT"
] | 15 | 2015-01-07T14:17:36.000Z | 2019-04-29T13:26:43.000Z | tests/web/login.py | happz/settlers | 961a6d2121ab6e89106f17017f026c60c77f16f9 | [
"MIT"
] | null | null | null | from tests.web import *
class TestCase(WebTestCase):
@classmethod
def setup_class(cls):
super(TestCase, cls).setup_class()
cls._appserver = tests.appserver.AppServer(*tests.appserver.AppServer.fetch_config('default_appserver'))
cls._appserver.start()
@classmethod
def teardown_class(cls):
if cls._appserver:
cls._appserver.stop()
super(TestCase, cls).teardown_class()
def test_login(self):
self.login()
@requires_login
def test_logout(self):
self.logout()
| 21.25 | 108 | 0.713725 | 62 | 510 | 5.66129 | 0.403226 | 0.136752 | 0.074074 | 0.182336 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.164706 | 510 | 23 | 109 | 22.173913 | 0.823944 | 0 | 0 | 0.117647 | 0 | 0 | 0.033333 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.235294 | false | 0 | 0.058824 | 0 | 0.352941 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
294c524c48c9552c666fbc7ff01b04d6b3d2431c | 1,269 | py | Python | scripts/gen_atan_lut.py | appotry/openofdm | 0ab83ce22385c132f055ef85dcbcea8eeffd77f0 | [
"Apache-2.0"
] | 200 | 2017-05-03T19:17:11.000Z | 2022-03-22T00:19:50.000Z | scripts/gen_atan_lut.py | jools76/openofdm | 229da948ae4df55fb3b6d9a055ca3e75079e50b1 | [
"Apache-2.0"
] | 8 | 2018-05-08T12:01:56.000Z | 2021-12-09T13:54:44.000Z | scripts/gen_atan_lut.py | jools76/openofdm | 229da948ae4df55fb3b6d9a055ca3e75079e50b1 | [
"Apache-2.0"
] | 78 | 2017-05-12T09:36:17.000Z | 2022-03-28T14:38:07.000Z | #!/usr/bin/env python
"""
Generate atan Look Up Table (LUT)
Key   = i, where i/SIZE = math.tan(phase) for phase in [0, math.pi/4)
Value = int(round(math.atan(i/SIZE) * SCALE))
SIZE is LUT size. The value is scaled up by SIZE*2 so that adjacent LUT values
can be distinguished.
"""
SIZE = 2**8
SCALE = 512
import argparse
import math
import os
def main():
parser = argparse.ArgumentParser()
parser.add_argument('--out')
args = parser.parse_args()
if args.out is None:
args.out = os.path.join(os.getcwd(), 'atan_lut.mif')
coe_out = '%s.coe' % (os.path.splitext(args.out)[0])
data = []
with open(args.out, 'w') as f:
for i in range(SIZE):
key = float(i)/SIZE
val = int(round(math.atan(key)*SCALE))
data.append(val)
print '%f -> %d' % (key, val)
f.write('{0:09b}\n'.format(val))
print "LUT SIZE %d, SCALE %d" % (SIZE, SCALE)
print "MIL file saved as %s" % (args.out)
with open(coe_out, 'w') as f:
f.write('memory_initialization_radix=2;\n')
f.write('memory_initialization_vector=\n')
f.write(',\n'.join(['{0:09b}'.format(l) for l in data]))
f.write(';')
print "COE file saved as %s" % (coe_out)
if __name__ == '__main__':
main()
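The table maps index `i` (a tangent value `i/SIZE` in `[0, 1)`) to `round(atan(i/SIZE) * SCALE)`. A standalone sketch rebuilding the table to check its key properties: it starts at 0, every entry fits in the 9-bit field the `{0:09b}` format assumes, and the entries are monotonically non-decreasing:

```python
import math

SIZE = 2 ** 8
SCALE = 512

# Rebuild the LUT values exactly as main() computes them.
lut = [int(round(math.atan(float(i) / SIZE) * SCALE)) for i in range(SIZE)]
```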
| 24.882353 | 78 | 0.583924 | 198 | 1,269 | 3.651515 | 0.429293 | 0.048409 | 0.016598 | 0.019364 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017764 | 0.245863 | 1,269 | 50 | 79 | 25.38 | 0.737722 | 0.01576 | 0 | 0 | 1 | 0 | 0.182806 | 0.062253 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.1 | null | null | 0.133333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
2950f908a4bd8cde4273538f53eb7bb4f191bae3 | 704 | py | Python | tests/integration/modules/test_rabbitmq.py | byteskeptical/salt | Apache-2.0
# -*- coding: utf-8 -*-
# Import python libs
from __future__ import absolute_import, unicode_literals, print_function

# Import Salt Testing libs
from tests.support.case import ModuleCase
from tests.support.helpers import requires_salt_modules, skip_if_not_root


@skip_if_not_root
@requires_salt_modules('rabbitmq')
class RabbitModuleTest(ModuleCase):
    '''
    Validates the rabbitmqctl functions.

    To run these tests, you will need to be able to access the rabbitmqctl
    commands.
    '''
    def test_user_exists(self):
        '''
        Find out whether a user exists.
        '''
        ret = self.run_function('rabbitmq.user_exists', ['null_user'])
        self.assertEqual(ret, False)
295e7e6b3307507d1c9d23f65eca1dede83057e9 | 242 | py | Python | blog-api/apps/blog/api_views.py | djangocali/blog-api | BSD-3-Clause
# -*- coding: utf-8 -*-
from rest_framework import viewsets

from .models import Post
from .serializers import PostSerializer


class PostViewSet(viewsets.ModelViewSet):
    queryset = Post.objects.all()
    serializer_class = PostSerializer
296d26ee69cfdaceca54cb407ffd178302f78c7f | 451 | py | Python | server/api/elections.py | jeffreymanzione/website | MIT
import json
import logging

from flask import Response, request

from .api import Api

TABLE_NAME = 'ElectionPredictions'
LIST_QUERY = 'SELECT * FROM {table_name};'.format(table_name=TABLE_NAME)


class ElectionsApi(Api):
    def __init__(self, app, database):
        super().__init__(app, database)
        self.register_method(self._list_election_predictions, 'list')

    def _list_election_predictions(self):
        return self.select(LIST_QUERY)
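The handler relies on `register_method` from the `Api` base class, which is not shown here. A minimal, hypothetical sketch of that register-and-dispatch pattern (`Api`, `register_method`, and `dispatch` below are stand-ins, not the real server code):

```python
class Api:
    """Toy base class: maps action names to bound handler methods."""

    def __init__(self):
        self._methods = {}

    def register_method(self, func, name):
        self._methods[name] = func

    def dispatch(self, name):
        # Look up the registered handler by name and invoke it.
        return self._methods[name]()


class ElectionsApi(Api):
    def __init__(self):
        super().__init__()
        self.register_method(self._list_election_predictions, 'list')

    def _list_election_predictions(self):
        return 'SELECT * FROM ElectionPredictions;'


api = ElectionsApi()
print(api.dispatch('list'))
```

Registering bound methods under short action names keeps routing logic in the base class while each subclass only declares its handlers.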
297403ffc6b74b69fc1063192e2700430b5d60d3 | 869 | py | Python | Python/randomPasswordGenerator.py | saanyalall/Hacktoberfest2021-2 | MIT
# import the necessary modules, i.e. random and string
import random
import string

print("----------- !!! RANDOM PASSWORD GENERATOR !!! -----------")

# input length of password to generate
inputLength = int(input("Enter the length of password to generate: "))

# define the password characters
lowercase = string.ascii_lowercase
uppercase = string.ascii_uppercase
numbers = string.digits
special_characters = string.punctuation

# combine all the data
allCharacters = lowercase + uppercase + numbers + special_characters

# use random to randomly pick characters from allCharacters of inputLength
temp = random.sample(allCharacters, inputLength)

# create the password using join on temp
password = ''.join(temp)

# print the required password
print(f"Here is a password of length {inputLength}: \n{password}\n----------------------------------------------------------")
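Two caveats about the approach above: `random` is not a cryptographic generator, and `random.sample` draws without replacement, so no character can repeat and the requested length is capped by the pool size. For security-sensitive passwords the stdlib `secrets` module avoids both issues; a sketch:

```python
import secrets
import string


def generate_password(length: int) -> str:
    # secrets.choice draws from a CSPRNG, with replacement, so characters
    # may repeat and any length works.
    pool = string.ascii_letters + string.digits + string.punctuation
    return ''.join(secrets.choice(pool) for _ in range(length))


password = generate_password(16)
print(password)
```

Drawing with replacement also means the entropy per character stays constant regardless of password length.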
29795adac0a47cb3cf12f1baddda67163604f8fd | 1,700 | py | Python | StockAnalysisSystem/service/interface/analysis_result_page_generator.py | lifg2000/StockAnalysisSystem | Apache-2.0
import json
import traceback

from StockAnalysisSystem.core.config import Config
import StockAnalysisSystem.core.Utiltity.AnalyzerUtility as analyzer_util


def generate_result_page(result_path: str, name_dict_path: str, generate_sample: bool = False):
    analyzer_name_dict = {}
    try:
        print('Loading analyzer name table...')
        with open(name_dict_path, 'rt') as f:
            analyzer_name_dict = json.load(f)
        print('Load analyzer name table done.')
    except Exception as e:
        print('Load analyzer name table fail.')
        print(e)
        print(traceback.format_exc())
    finally:
        pass

    try:
        with open(result_path, 'rt') as f:
            print('Loading analysis result...')
            analysis_result = analyzer_util.analysis_results_from_json(f)

        print('Convert analysis result...')
        security_analyzer_table = analyzer_util.analysis_result_list_to_security_analyzer_table(analysis_result)

        print('Parsing analysis result...')
        for security, analyzer_result in security_analyzer_table.items():
            pass

        # Strip exchange suffixes from the keys and rename analyzer columns
        # using the name table loaded above.
        ANALYSIS_RESULT_DATAFRAME = {
            k.replace('.SZSE', '').replace('.SSE', ''):
                analyzer_util.analyzer_table_to_dataframe(v).rename(columns=analyzer_name_dict)
            for k, v in security_analyzer_table.items()}

        print('Converting to html...')
        ANALYSIS_RESULT_HTML = {k: v.to_html(index=True) for k, v in ANALYSIS_RESULT_DATAFRAME.items()}
        print('Load analysis result done.')
    except Exception as e:
        print('Load analysis result fail.')
        print(e)
        print(traceback.format_exc())
    finally:
        pass
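The key-normalisation step above strips exchange suffixes from security codes before they are used as page keys; in isolation the transformation looks like this (toy codes, not real analysis output):

```python
# Hypothetical security codes with Shenzhen / Shanghai exchange suffixes.
codes = ['000001.SZSE', '600000.SSE']
stripped = [c.replace('.SZSE', '').replace('.SSE', '') for c in codes]
print(stripped)
```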
4622017f050dc0ac5c1f1199ff79c08fa922c226 | 337 | py | Python | tests/examples/test_HybridTut.py | adamjhn/netpyne | MIT
import pytest
import os
import sys

if '-nogui' not in sys.argv:
    sys.argv.append('-nogui')

from .utils import pkg_setup


@pytest.mark.package_data(['examples/HybridTut/', '.'])
class TestHybridTut:

    def test_run(self, pkg_setup):
        import HybridTut_run

    def test_export(self, pkg_setup):
        import HybridTut_export
46309e26fe19ad43b78249e2f965eba106827b0c | 593 | py | Python | day06-2.py | RafaelOda/AdventOfCode2020 | MIT
# Day 06 - Part 2
print("Day 06 - Part 2")

with open("./day06-input.txt") as f:
    content = f.read().splitlines()

count_for_every_group = 0
group_answers = list()


def count_for_current_group():
    if not group_answers:
        return 0  # an empty group contributes nothing (avoids adding None)
    number_of_common_answers = len(group_answers[0].intersection(*group_answers[1:]))
    return number_of_common_answers


for entry in content:
    if not entry:
        count_for_every_group += count_for_current_group()
        group_answers = list()
    else:
        group_answers.append(set(list(entry)))

count_for_every_group += count_for_current_group()
print(count_for_every_group)
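count_for_current_group hinges on spreading `set.intersection` over every person's answer set in the group. On a toy group (not real puzzle input):

```python
group = ['abc', 'ac', 'aci']          # one line of answers per person
answers = [set(line) for line in group]
# Questions everyone in the group answered "yes" to.
common = answers[0].intersection(*answers[1:])
print(sorted(common), len(common))
```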
463d8803328c272dc4e264adb7dc3c357dcfc4c7 | 585 | py | Python | openpifpaf_posetrack/transforms/image.py | vita-epfl/openpifpaf_posetrack | MIT
import logging
import numpy as np
import PIL
import scipy
import torch

import openpifpaf

LOG = logging.getLogger(__name__)


class HorizontalBlur(openpifpaf.transforms.Preprocess):
    def __init__(self, sigma=5.0):
        self.sigma = sigma

    def __call__(self, image, anns, meta):
        im_np = np.asarray(image)
        sigma = self.sigma * (0.8 + 0.4 * float(torch.rand(1).item()))
        LOG.debug('horizontal blur with %f', sigma)
        im_np = scipy.ndimage.filters.gaussian_filter1d(im_np, sigma=sigma, axis=1)
        return PIL.Image.fromarray(im_np), anns, meta
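The expression `0.8 + 0.4 * rand` jitters the configured sigma by up to ±20% per call, so the augmentation is not always the same strength. The same sampling with the stdlib `random` module standing in for `torch.rand` (an assumption for illustration):

```python
import random

base_sigma = 5.0
# random.random() is in [0, 1), so sigma lands in [0.8, 1.2) * base_sigma.
sigma = base_sigma * (0.8 + 0.4 * random.random())
print(sigma)
```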
4656fde2df896890c28db41ef2d1691bae482346 | 966 | py | Python | rf_api_client/models/node_types_api_models.py | ashlanderr/rf_api_client | MIT
from enum import Enum, unique
from typing import List, Optional

from pydantic import BaseModel


@unique
class NodePropertyType(int, Enum):
    INTEGER = 1
    REAL = 2
    BOOLEAN = 3
    TEXT = 5
    HTML = 6
    DATE = 7
    TIME = 8
    DATETIME = 9
    FILE = 10
    USER = 11
    ENUM = 12


class NodeTypePropertyOwner(str, Enum):
    node_type = 'node_type'
    # todo find out all values


class NodeTypePropertyIcon(BaseModel):
    img: str
    text: str


class NodeTypePropertyDto(BaseModel):
    name: str
    owner_id: str
    owner_type: NodeTypePropertyOwner
    position: int
    type_id: NodePropertyType
    default_value: str
    icons: List[NodeTypePropertyIcon]
    multivalued: bool
    displayable: bool
    as_icon: bool


class NodeTypeDto(BaseModel):
    id: str
    map_id: str
    name: str
    icon: Optional[str]
    displayable: bool
    default_child_node_type_id: Optional[str]
    properties: List[NodeTypePropertyDto]
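Because `NodePropertyType` mixes `int` into `Enum`, members round-trip cleanly through the integer the API sends. A trimmed, self-contained copy of a few members to show the lookup (the full enum is defined above):

```python
from enum import Enum, unique


@unique
class NodePropertyType(int, Enum):
    INTEGER = 1
    REAL = 2
    TEXT = 5


print(NodePropertyType(5).name)       # lookup by wire value
print(NodePropertyType.REAL.value)    # back to the integer
```

pydantic uses exactly this constructor-style lookup when validating a `type_id` field.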
4669108c811ab78987cc45d1ffeffe39fbd7bca4 | 446 | py | Python | server/src/weaverbird/pipeline/steps/utils/base.py | JeremyJacquemont/weaverbird | BSD-3-Clause
from typing import Dict
from pydantic.main import BaseModel

from weaverbird.pipeline.types import PopulatedWithFieldnames


class BaseStep(BaseModel):
    name: str

    class Config(PopulatedWithFieldnames):
        extra = 'forbid'

    # None values are excluded, to avoid triggering validation errors in front-ends
    def dict(self, *, exclude_none: bool = True, **kwargs) -> Dict:
        return super().dict(exclude_none=True, **kwargs)
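`exclude_none=True` makes pydantic drop keys whose value is `None` before serialization. The effect, sketched with a plain dict so it runs without pydantic:

```python
def exclude_none(d: dict) -> dict:
    # Mirror pydantic's exclude_none: drop None-valued keys.
    return {k: v for k, v in d.items() if v is not None}


step = {'name': 'rename', 'new_name': None, 'column': 'age'}
print(exclude_none(step))
```

This keeps optional, unset step parameters out of the JSON a front-end receives.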
466e503276d8ad074434dfa8d415e3214116b9af | 10,459 | py | Python | dimredu/lib/minNDRSD.py | Marissa4/RPyCA | MIT
#! /usr/bin/env python
import numpy as np
from dimredu.lib.shrink import shrink
from numba import jit


def minNDRSD(A, Yt, Yb, E, mu, hasWeave=True, debug=False, guess=None):
    r"""Compute a fast minimization of shrinkage plus Frobenius norm.

    This computes the minimum of the following objective.

    .. math::

       \arg \min_{S_D} \frac{\mu}{2} \Big \| \frac{1}{\mu} Y_t +
       \mathcal{S}_{\boldsymbol \epsilon}( P_{\Omega} ( S_D ) ) \Big \|_F^2 + \\
       \frac{\mu}{2} \Big \| \frac{1}{\mu} Y_b + P_{\Omega}(M_D) -
       (P_{\Omega}(D(L_G)) + P_{\Omega}(S_D)) \Big \|_F^2

    Args:
        A: A numpy array.
        Yt, Yb: Numpy arrays of Lagrange multipliers.
        E: A numpy array of error bounds.
        mu: The value of :math:`\mu`.

    Returns:
        The value of :math:`S` that achieves the minimum.
    """
    assert len(A.shape) == 1, 'A can only be a vector'
    assert A.shape == E.shape, 'A and E have to have the same size'
    assert A.shape == Yt.shape, 'A and Yt have to have the same size'
    assert A.shape == Yb.shape, 'A and Yb have to have the same size'

    # Note, while the derivative is always zero when you use the
    # formula below, it is only a minimum if the second derivative is
    # positive.  The second derivative is positive if and only if
    # \mu is positive.
    assert mu >= 0., 'mu must be >= 0'
    mu = float(mu)

    ####################################
    # DEBUG ############################
    if debug and (guess is not None):
        before = objective(guess, A, Yt, Yb, E, mu)
    # DEBUG ############################
    ####################################

    S = np.zeros(A.shape)
    _worker(A, Yt, E, Yb, mu, S)

    ####################################
    # DEBUG ############################
    if debug and (guess is not None):
        after = objective(S, A, Yt, Yb, E, mu)
        assert before / after + 1e-7 >= 1., 'minNDRSD went up!'
    # DEBUG ############################
    ####################################
    return S
@jit(nopython=True, cache=True)
def _worker(A, Yt, E, Yb, mu, S):
    for i in range(len(A)):
        if (1. / (2. * mu)) * (Yt[i] - mu * E[i] + Yb[i] + mu * A[i]) < -E[i]:
            S[i] = (1. / (2. * mu)) * \
                (Yt[i] - mu * E[i] + Yb[i] + mu * A[i])
        elif ((-E[i] < (1. / mu) * (Yb[i] + mu * A[i])) and
              ((1 / mu) * (Yb[i] + mu * A[i]) < E[i])):
            S[i] = (1. / mu) * (Yb[i] + mu * A[i])
        elif E[i] < (1. / (2. * mu)) * (-Yt[i] + mu * E[i] + Yb[i] + mu * A[i]):
            S[i] = (1. / (2. * mu)) * (-Yt[i] +
                                       mu * E[i] + Yb[i] + mu * A[i])
        else:
            term1 = (1. / mu) * Yt[i]
            term2 = (1. / mu) * Yb[i] + A[i] - E[i]
            term3 = (1. / mu) * Yb[i] + A[i] + E[i]
            Sp = (mu / 2.) * (term1 * term1 + term2 * term2)
            Sm = (mu / 2.) * (term1 * term1 + term3 * term3)
            if Sp < Sm:
                S[i] = E[i]
            else:
                S[i] = -E[i]
def objective(S, A, Yt, Yb, E, mu):
    temp1 = (mu / 2.) * np.linalg.norm((1. / mu)
                                       * Yt + np.abs(shrink(E, S))) ** 2
    temp2 = (mu / 2.) * np.linalg.norm((1. / mu) * Yb + A - S) ** 2
    return temp1 + temp2
def data(which):
    # Test data sets, including ones that have given problems before.
    if which == 'problem1':
        SOrig = np.array([-0.00000e+00, -6.44347e-02, 1.46198e-01, 4.89326e+00,
                          4.86620e+00, -6.44347e-02, -5.20417e-18, -1.38359e-02,
                          3.11687e-01, 5.01579e+00, 1.46198e-01, -1.38359e-02,
                          -0.00000e+00, -7.29722e-02, 4.96465e+00, 4.89326e+00,
                          3.11687e-01, -7.29722e-02, -5.20417e-18, 4.20675e-02,
                          4.86620e+00, 5.01579e+00, 4.96465e+00, 4.20675e-02,
                          -0.00000e+00])
        A = np.array([1.38778e-17, -1.36229e-01, -1.51878e-01, 4.80873e+00,
                      4.81636e+00, -1.36229e-01, -2.77556e-17, -1.46025e-01,
                      1.02459e-01, 5.02334e+00, -1.51878e-01, -1.46025e-01,
                      -1.38778e-17, -2.02653e-01, 4.93558e+00, 4.80873e+00,
                      1.02459e-01, -2.02653e-01, -1.38778e-17, -6.61490e-02,
                      4.81636e+00, 5.02334e+00, 4.93558e+00, -6.61490e-02,
                      -6.93889e-18])
        Yt = np.array([0.00000e+00, 3.96973e-02, 1.39967e+00, 0.00000e+00,
                       0.00000e+00, 3.96973e-02, 4.99255e-18, 3.85058e-01,
                       1.52151e+00, 6.40735e-02, 1.39967e+00, 3.85058e-01,
                       2.49628e-18, 3.01137e-01, 8.15302e-03, 0.00000e+00,
                       1.52151e+00, 3.01137e-01, -1.49777e-17, 7.58762e-01,
                       0.00000e+00, 6.40735e-02, 8.15302e-03, 7.58762e-01,
                       0.00000e+00])
        Yb = np.array([0.00000e+00, 3.96973e-02, 1.39967e+00, 0.00000e+00,
                       0.00000e+00, 3.96973e-02, 4.99255e-18, 3.85058e-01,
                       1.20474e+00, 6.40735e-02, 1.39967e+00, 3.85058e-01,
                       2.49628e-18, 3.01137e-01, 0.00000e+00, 0.00000e+00,
                       1.20474e+00, 3.01137e-01, -1.49777e-17, 6.40717e-01,
                       0.00000e+00, 6.40735e-02, 0.00000e+00, 6.40717e-01,
                       0.00000e+00])
        E = np.array([0.00000e+00, 4.32591e-03, 9.95856e-02, 5.00000e+00,
                      5.00000e+00, 4.32591e-03, 0.00000e+00, 7.57776e-03,
                      3.11687e-01, 5.00000e+00, 9.95856e-02, 7.57776e-03,
                      0.00000e+00, 1.73463e-02, 5.00000e+00, 5.00000e+00,
                      3.11687e-01, 1.73463e-02, 0.00000e+00, 4.20675e-02,
                      5.00000e+00, 5.00000e+00, 5.00000e+00, 4.20675e-02,
                      0.00000e+00])
        mu = 2.8780102048
        return A, Yt, Yb, E, mu
    if which == 'problem2':
        A = np.array([0., 0.15854, 0.71644, 5., 5., 0.15854,
                      0., 0.20875, 1.19954, 5., 0.71644, 0.20875,
                      0., 0.31242, 5., 5., 1.19954, 0.31242,
                      0., 0.47801, 5., 5., 5., 0.47801, 0.])
        Yt = np.array([0., 0.00741, 0.02558, 0., 0., 0.00741,
                       0., 0.00957, 0.02849, 0., 0.02558, 0.00957,
                       0., 0.01373, 0., 0., 0.02849, 0.01373,
                       0., 0.01947, 0., 0., 0., 0.01947, 0.])
        Yb = np.array([0., 0.00784, 0.03542, 0., 0., 0.00784,
                       0., 0.01032, 0.05931, 0., 0.03542, 0.01032,
                       0., 0.01545, 0., 0., 0.05931, 0.01545,
                       0., 0.02363, 0., 0., 0., 0.02363, 0.])
        E = np.array([0.00000e+00, 4.32591e-03, 9.95856e-02, 5.00000e+00,
                      5.00000e+00, 4.32591e-03, 0.00000e+00, 7.57776e-03,
                      3.11687e-01, 5.00000e+00, 9.95856e-02, 7.57776e-03,
                      0.00000e+00, 1.73463e-02, 5.00000e+00, 5.00000e+00,
                      3.11687e-01, 1.73463e-02, 0.00000e+00, 4.20675e-02,
                      5.00000e+00, 5.00000e+00, 5.00000e+00, 4.20675e-02,
                      0.00000e+00])
        mu = 0.304172303889
        return A, Yt, Yb, E, mu
    if which == 'randomNormal':
        np.random.seed(1234)
        A = np.random.normal(size=[5])
        Yt = np.random.normal(size=A.shape)
        Yb = np.random.normal(size=A.shape)
        E = np.ones(A.shape) * 1e-1
        mu = 0.1
        return A, Yt, Yb, E, mu
    if which == 'simple':
        A = np.array([1])
        Yt = np.array([2])
        Yb = np.array([3])
        E = np.array([4])
        mu = np.array([5])
        return A, Yt, Yb, E, mu


data.sets = ['problem1', 'problem2', 'randomNormal', 'simple']
def plot_objective():
    A, Yt, Yb, E, mu = data('randomNormal')
    print()
    print('A, Yt, Yb, E, mu')
    print(A, Yt, Yb, E, mu)
    Smin = minNDRSD(A, Yt, Yb, E, mu, hasWeave=True)
    print('Smin')
    print(Smin)
    SminObj = objective(Smin, A, Yt, Yb, E, mu)
    print('Should be smallest', SminObj)
    print('random')
    for i in range(5):
        perturb = np.random.normal(size=A.shape) * 1e-2
        print(perturb)
        pObj = objective(Smin + perturb, A, Yt, Yb, E, mu)
        print(pObj)
        assert SminObj <= pObj
    X = []
    Y = []
    print('linear')
    perturb = np.random.normal(A.shape)
    for s in np.linspace(-1e+1, +1e+1, 100):
        pObj = objective(Smin + perturb * s, A, Yt, Yb, E, mu)
        X.append(s)
        Y.append(pObj)
    import matplotlib.pylab as py
    py.figure(1)
    py.plot(X, Y)
    py.show()
def test_minNDRSD():
    A, Yt, Yb, E, mu = data('randomNormal')
    print()
    print('A, Yt, Yb, E, mu')
    print(A, Yt, Yb, E, mu)
    Smin = minNDRSD(A, Yt, Yb, E, mu, hasWeave=True)
    print('Smin')
    print(Smin)
    SminObj = objective(Smin, A, Yt, Yb, E, mu)
    print('Should be smallest', SminObj)
    for i in range(10):
        # This should be smaller than E, otherwise the objective
        # is flat.
        perturb = np.random.normal(size=A.shape) * 1e-3
        pObj = objective(Smin + perturb, A, Yt, Yb, E, mu)
        print(pObj)
        assert SminObj <= pObj
def test_minNDRSD2():
    for hasWeave in [True, False]:
        for which in data.sets:
            A, Yt, Yb, E, mu = data(which)
            print()
            print('A, Yt, Yb, E, mu')
            print(A, Yt, Yb, E, mu)
            Smin = minNDRSD(A, Yt, Yb, E, mu, hasWeave=hasWeave,
                            debug=True, guess=np.random.normal(size=A.shape))
            print('Smin')
            print(Smin)
            SminObj = objective(Smin, A, Yt, Yb, E, mu)
            print('Should be smallest', SminObj)
            for i in range(10):
                # This should be smaller than E, otherwise the objective
                # is flat.
                perturb = np.random.normal(size=A.shape) * 1e-3
                pObj = objective(Smin + perturb, A, Yt, Yb, E, mu)
                print(pObj)
                assert SminObj <= pObj


if __name__ == '__main__':
    test_minNDRSD()
    test_minNDRSD2()
    # plot_objective()
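`objective` applies `shrink(E, S)` element-wise. Assuming `shrink` is the usual soft-thresholding operator (it is imported from `dimredu.lib.shrink` and not shown here, so this is a sketch under that assumption), its scalar form is:

```python
def shrink(eps: float, x: float) -> float:
    # Soft-thresholding: pull x toward zero by eps, clamping at zero.
    if x > eps:
        return x - eps
    if x < -eps:
        return x + eps
    return 0.0


print(shrink(0.5, 2.0), shrink(0.5, -2.0), shrink(0.5, 0.3))
```

This is the proximal operator of the absolute value, which is why it appears inside the Frobenius term of the objective.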
4676f49d596b3b47d1b447041adcbf9b2c941976 | 1,079 | py | Python | tests/test_issues/output/issue_tccm/minimalmodel.py | hsolbrig/biolinkml | CC0-1.0
# id: https://hotecosystem.org/tccm/prefixes
# description:
# license:

import dataclasses
import sys
from typing import Optional, List, Union, Dict, ClassVar, Any
from dataclasses import dataclass
from biolinkml.utils.slot import Slot
from biolinkml.utils.metamodelcore import empty_list, empty_dict, bnode
from biolinkml.utils.yamlutils import YAMLRoot, extended_str, extended_float, extended_int

if sys.version_info < (3, 7, 6):
    from biolinkml.utils.dataclass_extensions_375 import dataclasses_init_fn_with_kwargs
else:
    from biolinkml.utils.dataclass_extensions_376 import dataclasses_init_fn_with_kwargs

from biolinkml.utils.formatutils import camelcase, underscore, sfx
from rdflib import Namespace, URIRef
from biolinkml.utils.curienamespace import CurieNamespace

metamodel_version = "1.5.3"

# Overwrite dataclasses _init_fn to add **kwargs in __init__
dataclasses._init_fn = dataclasses_init_fn_with_kwargs

# Namespaces
DEFAULT_ = CurieNamespace('', 'https://hotecosystem.org/tccm/prefixes/')

# Types

# Class references

# Slots
class slots:
    pass
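`DEFAULT_` binds the empty prefix to the model's base URI, so identifiers expand to full IRIs under it. A hypothetical stdlib sketch of that expansion (the real `CurieNamespace` from biolinkml is richer than this stand-in):

```python
class CurieNamespace:
    def __init__(self, prefix: str, base: str):
        self.prefix = prefix
        self.base = base

    def expand(self, local_name: str) -> str:
        # CURIE 'prefix:local' expands to '<base><local>'.
        return self.base + local_name


DEFAULT_ = CurieNamespace('', 'https://hotecosystem.org/tccm/prefixes/')
print(DEFAULT_.expand('thing'))
```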
467d2164ba62d3a8b0633a2e68664b7026426de2 | 3,072 | py | Python | tests/phase0/block_processing/test_process_proposer_slashing.py | CarlBeek/eth2.0-specs | CC0-1.0
from copy import deepcopy
import pytest

import build.phase0.spec as spec
from build.phase0.spec import (
    get_balance,
    get_current_epoch,
    process_proposer_slashing,
)
from tests.phase0.helpers import (
    get_valid_proposer_slashing,
)

# mark entire file as 'header'
pytestmark = pytest.mark.proposer_slashings


def run_proposer_slashing_processing(state, proposer_slashing, valid=True):
    """
    Run ``process_proposer_slashing`` returning the pre and post state.
    If ``valid == False``, run expecting ``AssertionError``
    """
    post_state = deepcopy(state)

    if not valid:
        with pytest.raises(AssertionError):
            process_proposer_slashing(post_state, proposer_slashing)
        return state, None

    process_proposer_slashing(post_state, proposer_slashing)

    slashed_validator = post_state.validator_registry[proposer_slashing.proposer_index]
    assert not slashed_validator.initiated_exit
    assert slashed_validator.slashed
    assert slashed_validator.exit_epoch < spec.FAR_FUTURE_EPOCH
    assert slashed_validator.withdrawable_epoch < spec.FAR_FUTURE_EPOCH
    # lost whistleblower reward
    assert (
        get_balance(post_state, proposer_slashing.proposer_index) <
        get_balance(state, proposer_slashing.proposer_index)
    )

    return state, post_state


def test_success(state):
    proposer_slashing = get_valid_proposer_slashing(state)
    pre_state, post_state = run_proposer_slashing_processing(state, proposer_slashing)
    return pre_state, proposer_slashing, post_state


def test_epochs_are_different(state):
    proposer_slashing = get_valid_proposer_slashing(state)
    # set slots to be in different epochs
    proposer_slashing.header_2.slot += spec.SLOTS_PER_EPOCH
    pre_state, post_state = run_proposer_slashing_processing(state, proposer_slashing, False)
    return pre_state, proposer_slashing, post_state


def test_headers_are_same(state):
    proposer_slashing = get_valid_proposer_slashing(state)
    # set headers to be the same
    proposer_slashing.header_2 = proposer_slashing.header_1
    pre_state, post_state = run_proposer_slashing_processing(state, proposer_slashing, False)
    return pre_state, proposer_slashing, post_state


def test_proposer_is_slashed(state):
    proposer_slashing = get_valid_proposer_slashing(state)
    # set proposer to slashed
    state.validator_registry[proposer_slashing.proposer_index].slashed = True
    pre_state, post_state = run_proposer_slashing_processing(state, proposer_slashing, False)
    return pre_state, proposer_slashing, post_state


def test_proposer_is_withdrawn(state):
    proposer_slashing = get_valid_proposer_slashing(state)
    # set proposer withdrawable_epoch in past
    current_epoch = get_current_epoch(state)
    proposer_index = proposer_slashing.proposer_index
    state.validator_registry[proposer_index].withdrawable_epoch = current_epoch - 1
    pre_state, post_state = run_proposer_slashing_processing(state, proposer_slashing, False)
    return pre_state, proposer_slashing, post_state
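The helper doubles as a negative-test harness: with `valid=False` it requires the operation to raise before any post state is returned. The same pattern, reduced to stdlib code with toy state and operation objects standing in for the beacon-chain types:

```python
def process(state, operation):
    # Toy stand-in for process_proposer_slashing.
    assert operation['valid'], 'invalid operation'
    state['slashed'] = True


def run_processing(state, operation, valid=True):
    post = dict(state)  # stand-in for deepcopy(state)
    if not valid:
        try:
            process(post, operation)
        except AssertionError:
            return state, None
        raise AssertionError('operation unexpectedly succeeded')
    process(post, operation)
    return state, post


pre, post = run_processing({'slashed': False}, {'valid': True})
print(post)
pre2, nothing = run_processing({'slashed': False}, {'valid': False}, valid=False)
print(nothing)
```

Working on a copy keeps the pre state intact so tests can compare both sides of the transition.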
468c835f1beedbb09d51b57efd4e0554f9265bf5 | 8,556 | py | Python | services/acservice/auth.py | Xavier-Cliquennois/ac-mediator | Apache-2.0
from accounts.models import ServiceCredentials
from ac_mediator.exceptions import *
from django.core.urlresolvers import reverse
from django.conf import settings
import requests
from services.acservice.constants import *


class ACServiceAuthMixin(object):
    """
    Mixin that stores service credentials and implements service linking steps.

    This mixin implements the standard linking strategy for services that
    support OAuth2 authentication. Services with specific requirements should
    override the methods from this mixin.
    """
    SUPPORTED_AUTH_METHODS = [APIKEY_AUTH_METHOD, ENDUSER_AUTH_METHOD]
    BASE_AUTHORIZE_URL = "http://example.com/api/authorize/?client_id={0}"
    ACCESS_TOKEN_URL = "http://example.com/api/oauth2/access_token/"
    REFRESH_TOKEN_URL = "http://example.com/api/oauth2/refresh_token/"
    service_client_id = None
    service_client_secret = None

    def conf_auth(self, config):
        if 'client_id' not in config:
            raise ImproperlyConfiguredACService('Missing item \'client_id\'')
        if 'client_secret' not in config:
            raise ImproperlyConfiguredACService('Missing item \'client_secret\'')
        self.set_credentials(config['client_id'], config['client_secret'])

    def set_credentials(self, client_id, client_secret):
        self.service_client_id = client_id
        self.service_client_secret = client_secret

    def get_authorize_url(self):
        return self.BASE_AUTHORIZE_URL.format(self.service_client_id)

    def get_redirect_uri(self):
        return settings.BASE_URL + reverse('link_service_callback', args=[self.id])

    def access_token_request_data(self, authorization_code=None, refresh_token=None):
        data = {'client_id': self.service_client_id, 'client_secret': self.service_client_secret}
        if refresh_token is not None:
            # If a refresh token is passed, renew using the refresh token
            data.update({'grant_type': 'refresh_token', 'refresh_token': refresh_token})
        else:
            # If no refresh token is passed, get an access token using the authorization code
            if authorization_code is None:
                raise ACException('Authorization code was not provided')
            data.update({'grant_type': 'authorization_code', 'code': authorization_code})
        return data

    def request_access_token(self, authorization_code):
        return requests.post(
            self.ACCESS_TOKEN_URL,
            data=self.access_token_request_data(authorization_code=authorization_code)
        )

    def renew_access_token(self, refresh_token):
        return requests.post(
            self.ACCESS_TOKEN_URL,
            data=self.access_token_request_data(refresh_token=refresh_token)
        )

    def renew_credentials(self, credentials):
        r = self.renew_access_token(self.get_refresh_token_from_credentials(credentials))
        return r.status_code == 200, r.json()

    def request_credentials(self, authorization_code):
        r = self.request_access_token(authorization_code)
        return r.status_code == 200, r.json()
@staticmethod
def get_authorize_popup_specs():
return 'height=400,width=500'
@staticmethod
def process_credentials(credentials_data):
return credentials_data
def supports_auth(self, auth_type):
return auth_type in self.SUPPORTED_AUTH_METHODS
def get_apikey(self):
"""
API key used for non-end user authenticated requests
TODO: this should include the way in which the api key is included (via header, request param, etc)
:return: string containing the api key
"""
if not self.supports_auth(APIKEY_AUTH_METHOD):
raise ACException('Auth method \'{0}\' not supported by service {1}'.format(APIKEY_AUTH_METHOD, self.name))
return self.service_client_secret
def get_enduser_token(self, account):
"""
Get token used to make requests to the service on behalf of 'account'
TODO: this should include the way in which the token is included (via header, request param, etc)
:param account: user account to act on behalf of
:return: string containing the token
"""
if not self.supports_auth(ENDUSER_AUTH_METHOD):
raise ACException('Auth method \'{0}\' not supported by service {1}'.format(ENDUSER_AUTH_METHOD, self.name))
try:
service_credentials = ServiceCredentials.objects.get(account=account, service_id=self.id)
if not self.check_credentials_are_valid(service_credentials):
# Try to renew the credentials
success, received_credentials = self.renew_credentials(service_credentials)
if success:
# Store credentials (replace existing ones if needed)
service_credentials, is_new = ServiceCredentials.objects.get_or_create(
account=account, service_id=self.id)
service_credentials.credentials = received_credentials
service_credentials.save()
else:
raise ACAPIInvalidCredentialsForService(
'Could not renew service credentials for {0}'.format(self.name))
return self.get_access_token_from_credentials(service_credentials)
except ServiceCredentials.DoesNotExist:
raise ACAPIInvalidCredentialsForService
def check_credentials_are_valid(self, credentials):
"""
Check if the provided credentials are valid for a given service.
This method should be overwritten by each individual service or it will always return True.
:param credentials: credentials object as stored in ServiceCredentials entry
"""
return True
def check_credentials_should_be_renewed_background(self, credentials):
"""
Check if the provided credentials for a given service should be renewed for new ones.
For OAuth2-based services, this should be True when the refresh token is about to expire,
but can still be set to False if only the access token has expired (as the access token
can be automatically renewed at request time while the refresh token is still valid).
This method should be overwritten by each individual service or it will return the opposite
value of `check_credentials_are_valid`.
:param credentials: credentials object as stored in ServiceCredentials entry
"""
return not self.check_credentials_are_valid(credentials)
def get_access_token_from_credentials(self, credentials):
"""
Return the access token from service credentials stored in ServiceCredentials object.
This method should be overwritten by each individual service that uses access tokens.
:param credentials: credentials object as stored in ServiceCredentials entry
:return: access token extracted from the stored credentials
"""
return None
def get_refresh_token_from_credentials(self, credentials):
"""
Return the refresh token from service credentials stored in ServiceCredentials object.
This method should be overwritten by each individual service that uses access tokens.
:param credentials: credentials object as stored in ServiceCredentials entry
:return: refresh token extracted from the stored credentials
"""
return None
def get_auth_info_for_request(self, auth_method, account=None):
"""
Return dictionary with information about how to authenticate a request.
The dictionary can contain the following fields:
- header: dictionary with header name and header contents (key, value) to be
added to a request (including credentials).
- params: dictionary with request parameter name and contents (key, value) to be
added to a request (including credentials).
An example for 'header':
{'headers': {'Authorization': 'Token API_KEY'}}
An example for 'params':
{'params': {'token': 'API_KEY'}}
If both fields are included, both will be added when sending the request.
:param auth_method: auth method for which information is wanted
:param account: user account (for enduser authentication only)
:return: dictionary with auth information
"""
raise NotImplementedError("Service must implement method ACServiceAuthMixin.get_auth_info_for_request")
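A minimal, hypothetical sketch of how a concrete service might satisfy this contract. `ExampleService`, the hard-coded key, and the stand-in constant are illustrative only, not part of the real codebase:

```python
# Hypothetical example, not from the repository: a service class returning
# auth info in the 'header' shape documented in get_auth_info_for_request.
APIKEY_AUTH_METHOD = 'apikey_auth'  # stand-in for the real constant

class ExampleService(object):
    SUPPORTED_AUTH_METHODS = [APIKEY_AUTH_METHOD]
    service_client_secret = 'API_KEY'

    def get_apikey(self):
        return self.service_client_secret

    def get_auth_info_for_request(self, auth_method, account=None):
        # Credentials travel in a request header, as in the docstring example
        if auth_method == APIKEY_AUTH_METHOD:
            return {'headers': {'Authorization': 'Token {0}'.format(self.get_apikey())}}
        raise NotImplementedError

info = ExampleService().get_auth_info_for_request(APIKEY_AUTH_METHOD)
```

A caller would merge the returned `headers` (and/or `params`) dict into the keyword arguments of the outgoing `requests` call.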
| 48.067416 | 120 | 0.69612 | 1,029 | 8,556 | 5.609329 | 0.208941 | 0.039501 | 0.017672 | 0.016632 | 0.364865 | 0.326403 | 0.305613 | 0.231116 | 0.209633 | 0.196812 | 0 | 0.003381 | 0.239598 | 8,556 | 177 | 121 | 48.338983 | 0.8838 | 0.359631 | 0 | 0.131868 | 0 | 0 | 0.113894 | 0.01292 | 0 | 0 | 0 | 0.011299 | 0 | 1 | 0.208791 | false | 0 | 0.065934 | 0.076923 | 0.527473 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
469c05d38421b43d8badf2ddd5f68d63a8af2222 | 1,218 | py | Python | test/cc_test.py | shunsukeaihara/colorcorrect | ca609401cfa1ccd2d34c69faa1ee63973247618f | [
"MIT"
] | 36 | 2017-11-23T18:41:54.000Z | 2022-03-15T10:59:59.000Z | test/cc_test.py | shunsukeaihara/colorcorrect | ca609401cfa1ccd2d34c69faa1ee63973247618f | [
"MIT"
] | 6 | 2017-10-23T08:07:53.000Z | 2020-10-29T12:59:49.000Z | test/cc_test.py | shunsukeaihara/colorcorrect | ca609401cfa1ccd2d34c69faa1ee63973247618f | [
"MIT"
] | 16 | 2017-10-20T19:27:27.000Z | 2022-02-08T12:47:03.000Z | from PIL import Image
from colorcorrect.algorithm import stretch, grey_world, retinex, retinex_with_adjust, max_white
from colorcorrect.algorithm import standard_deviation_weighted_grey_world
from colorcorrect.algorithm import standard_deviation_and_luminance_weighted_gray_world
from colorcorrect.algorithm import automatic_color_equalization
from colorcorrect.algorithm import luminance_weighted_gray_world
from colorcorrect.util import from_pil, to_pil
from unittest import TestCase
class CCTestCase(TestCase):
def setUp(self):
self.img = Image.open("test/test_image.jpg")
def tearDown(self):
pass
def test_all(self):
to_pil(stretch(from_pil(self.img)))
to_pil(grey_world(from_pil(self.img)))
to_pil(retinex(from_pil(self.img)))
to_pil(max_white(from_pil(self.img)))
to_pil(retinex_with_adjust(retinex(from_pil(self.img))))
to_pil(standard_deviation_weighted_grey_world(from_pil(self.img), 20, 20))
to_pil(standard_deviation_and_luminance_weighted_gray_world(from_pil(self.img), 20, 20))
to_pil(luminance_weighted_gray_world(from_pil(self.img), 20, 20))
to_pil(automatic_color_equalization(from_pil(self.img)))
| 43.5 | 96 | 0.777504 | 175 | 1,218 | 5.068571 | 0.228571 | 0.086809 | 0.111612 | 0.142052 | 0.580609 | 0.563698 | 0.330327 | 0.22097 | 0.142052 | 0.110485 | 0 | 0.011429 | 0.137931 | 1,218 | 27 | 97 | 45.111111 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0.015599 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.130435 | false | 0.043478 | 0.347826 | 0 | 0.521739 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
469e1ffce0a5d1a2ea613851f39a75dc3b61a83f | 3,745 | py | Python | constants/countries.py | sutekina/osu-gulag | c5eed521dfae282cd0cf95d02017d0f9654ceb11 | [
"MIT"
] | 187 | 2020-07-27T18:59:35.000Z | 2022-02-02T16:15:13.000Z | app/constants/countries.py | cmyui/gulag | ff3b39fb6304354694379c3f8cc74dfb73e670ce | [
"MIT"
] | 119 | 2020-08-15T16:32:50.000Z | 2022-02-02T05:19:55.000Z | constants/countries.py | AipNooBest/gulag | 3af3c13da47e4ec9e97bf4302f8021e00d468903 | [
"MIT"
] | 123 | 2020-07-23T21:47:52.000Z | 2022-02-05T13:59:32.000Z | __all__ = ("country_codes",)
country_codes = { # talk about ugly lol
"oc": 1,
"eu": 2,
"ad": 3,
"ae": 4,
"af": 5,
"ag": 6,
"ai": 7,
"al": 8,
"am": 9,
"an": 10,
"ao": 11,
"aq": 12,
"ar": 13,
"as": 14,
"at": 15,
"au": 16,
"aw": 17,
"az": 18,
"ba": 19,
"bb": 20,
"bd": 21,
"be": 22,
"bf": 23,
"bg": 24,
"bh": 25,
"bi": 26,
"bj": 27,
"bm": 28,
"bn": 29,
"bo": 30,
"br": 31,
"bs": 32,
"bt": 33,
"bv": 34,
"bw": 35,
"by": 36,
"bz": 37,
"ca": 38,
"cc": 39,
"cd": 40,
"cf": 41,
"cg": 42,
"ch": 43,
"ci": 44,
"ck": 45,
"cl": 46,
"cm": 47,
"cn": 48,
"co": 49,
"cr": 50,
"cu": 51,
"cv": 52,
"cx": 53,
"cy": 54,
"cz": 55,
"de": 56,
"dj": 57,
"dk": 58,
"dm": 59,
"do": 60,
"dz": 61,
"ec": 62,
"ee": 63,
"eg": 64,
"eh": 65,
"er": 66,
"es": 67,
"et": 68,
"fi": 69,
"fj": 70,
"fk": 71,
"fm": 72,
"fo": 73,
"fr": 74,
"fx": 75,
"ga": 76,
"gb": 77,
"gd": 78,
"ge": 79,
"gf": 80,
"gh": 81,
"gi": 82,
"gl": 83,
"gm": 84,
"gn": 85,
"gp": 86,
"gq": 87,
"gr": 88,
"gs": 89,
"gt": 90,
"gu": 91,
"gw": 92,
"gy": 93,
"hk": 94,
"hm": 95,
"hn": 96,
"hr": 97,
"ht": 98,
"hu": 99,
"id": 100,
"ie": 101,
"il": 102,
"in": 103,
"io": 104,
"iq": 105,
"ir": 106,
"is": 107,
"it": 108,
"jm": 109,
"jo": 110,
"jp": 111,
"ke": 112,
"kg": 113,
"kh": 114,
"ki": 115,
"km": 116,
"kn": 117,
"kp": 118,
"kr": 119,
"kw": 120,
"ky": 121,
"kz": 122,
"la": 123,
"lb": 124,
"lc": 125,
"li": 126,
"lk": 127,
"lr": 128,
"ls": 129,
"lt": 130,
"lu": 131,
"lv": 132,
"ly": 133,
"ma": 134,
"mc": 135,
"md": 136,
"mg": 137,
"mh": 138,
"mk": 139,
"ml": 140,
"mm": 141,
"mn": 142,
"mo": 143,
"mp": 144,
"mq": 145,
"mr": 146,
"ms": 147,
"mt": 148,
"mu": 149,
"mv": 150,
"mw": 151,
"mx": 152,
"my": 153,
"mz": 154,
"na": 155,
"nc": 156,
"ne": 157,
"nf": 158,
"ng": 159,
"ni": 160,
"nl": 161,
"no": 162,
"np": 163,
"nr": 164,
"nu": 165,
"nz": 166,
"om": 167,
"pa": 168,
"pe": 169,
"pf": 170,
"pg": 171,
"ph": 172,
"pk": 173,
"pl": 174,
"pm": 175,
"pn": 176,
"pr": 177,
"ps": 178,
"pt": 179,
"pw": 180,
"py": 181,
"qa": 182,
"re": 183,
"ro": 184,
"ru": 185,
"rw": 186,
"sa": 187,
"sb": 188,
"sc": 189,
"sd": 190,
"se": 191,
"sg": 192,
"sh": 193,
"si": 194,
"sj": 195,
"sk": 196,
"sl": 197,
"sm": 198,
"sn": 199,
"so": 200,
"sr": 201,
"st": 202,
"sv": 203,
"sy": 204,
"sz": 205,
"tc": 206,
"td": 207,
"tf": 208,
"tg": 209,
"th": 210,
"tj": 211,
"tk": 212,
"tm": 213,
"tn": 214,
"to": 215,
"tl": 216,
"tr": 217,
"tt": 218,
"tv": 219,
"tw": 220,
"tz": 221,
"ua": 222,
"ug": 223,
"um": 224,
"us": 225,
"uy": 226,
"uz": 227,
"va": 228,
"vc": 229,
"ve": 230,
"vg": 231,
"vi": 232,
"vn": 233,
"vu": 234,
"wf": 235,
"ws": 236,
"ye": 237,
"yt": 238,
"rs": 239,
"za": 240,
"zm": 241,
"me": 242,
"zw": 243,
"xx": 244,
"a2": 245,
"o1": 246,
"ax": 247,
"gg": 248,
"im": 249,
"je": 250,
"bl": 251,
"mf": 252,
}
| 14.571984 | 40 | 0.320694 | 513 | 3,745 | 2.329435 | 0.996101 | 0.020084 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.292925 | 0.407477 | 3,745 | 256 | 41 | 14.628906 | 0.245606 | 0.005073 | 0 | 0 | 0 | 0 | 0.138829 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
46a717166ab3d6b06fc669ecfc6058dad3e771ac | 770 | py | Python | continual_rl/policies/discrete_random/discrete_random_policy_config.py | AGI-Labs/continual_rl | bcf17d879e8a983340be233ff8f740c424d0f303 | [
"MIT"
] | 19 | 2021-07-27T05:20:09.000Z | 2022-02-27T07:12:05.000Z | continual_rl/policies/discrete_random/discrete_random_policy_config.py | AGI-Labs/continual_rl | bcf17d879e8a983340be233ff8f740c424d0f303 | [
"MIT"
] | 2 | 2021-11-05T07:36:50.000Z | 2022-03-11T00:21:50.000Z | continual_rl/policies/discrete_random/discrete_random_policy_config.py | AGI-Labs/continual_rl | bcf17d879e8a983340be233ff8f740c424d0f303 | [
"MIT"
] | 3 | 2021-10-20T06:04:35.000Z | 2022-03-06T22:59:36.000Z | from continual_rl.policies.config_base import ConfigBase
class DiscreteRandomPolicyConfig(ConfigBase):
def __init__(self):
super().__init__()
self.timesteps_per_collection = 128 # Per process, for batch
self.num_parallel_envs = None # If None we operate synchronously, otherwise we batch
def _load_from_dict_internal(self, config_dict):
self.timesteps_per_collection = config_dict.pop("timesteps_per_collection", self.timesteps_per_collection)
# Only necessary because the default is "None"
self.num_parallel_envs = config_dict.pop("num_parallel_envs", self.num_parallel_envs)
self.num_parallel_envs = int(self.num_parallel_envs) if self.num_parallel_envs is not None else None
return self
| 40.526316 | 114 | 0.750649 | 102 | 770 | 5.284314 | 0.431373 | 0.142857 | 0.194805 | 0.211503 | 0.09833 | 0.09833 | 0.09833 | 0 | 0 | 0 | 0 | 0.004762 | 0.181818 | 770 | 18 | 115 | 42.777778 | 0.850794 | 0.155844 | 0 | 0 | 0 | 0 | 0.063467 | 0.037152 | 0 | 0 | 0 | 0 | 0 | 1 | 0.181818 | false | 0 | 0.090909 | 0 | 0.454545 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
46ab4dea61540b127f5c96c2afcb437569b128a5 | 924 | py | Python | mooringlicensing/components/payments_ml/api.py | xzzy/mooringlicensing | 9f34815c6305e8f6a741dbc68889218ae7bfe953 | [
"Apache-2.0"
] | null | null | null | mooringlicensing/components/payments_ml/api.py | xzzy/mooringlicensing | 9f34815c6305e8f6a741dbc68889218ae7bfe953 | [
"Apache-2.0"
] | null | null | null | mooringlicensing/components/payments_ml/api.py | xzzy/mooringlicensing | 9f34815c6305e8f6a741dbc68889218ae7bfe953 | [
"Apache-2.0"
] | null | null | null | import logging
from rest_framework import views
from rest_framework.renderers import JSONRenderer
from rest_framework.response import Response
from mooringlicensing import settings
from mooringlicensing.components.main.models import ApplicationType
from mooringlicensing.components.payments_ml.models import FeeConstructor
logger = logging.getLogger('log')
class GetSeasonsForDcvDict(views.APIView):
renderer_classes = [JSONRenderer, ]
def get(self, request, format=None):
# Return current and future seasons for the DCV permit
application_type = ApplicationType.objects.get(code=settings.APPLICATION_TYPE_DCV_PERMIT['code'])
fee_constructors = FeeConstructor.get_current_and_future_fee_constructors_by_application_type_and_date(application_type,)
data = [{'id': item.fee_season.id, 'name': str(item.fee_season)} for item in fee_constructors]
return Response(data)
| 40.173913 | 129 | 0.798701 | 111 | 924 | 6.396396 | 0.504505 | 0.084507 | 0.071831 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.130952 | 924 | 22 | 130 | 42 | 0.884184 | 0.056277 | 0 | 0 | 0 | 0 | 0.014943 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0.466667 | 0 | 0.733333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
46ac689cd982c86a5928757d055e3440106225b6 | 1,335 | py | Python | desktop/core/ext-py/pyasn1-modules-0.2.6/tests/test_rfc8418.py | yetsun/hue | 2e48f0cc70e233ee0e1b40733d4b2a18d8836c66 | [
"Apache-2.0"
] | 5,079 | 2015-01-01T03:39:46.000Z | 2022-03-31T07:38:22.000Z | desktop/core/ext-py/pyasn1-modules-0.2.6/tests/test_rfc8418.py | yetsun/hue | 2e48f0cc70e233ee0e1b40733d4b2a18d8836c66 | [
"Apache-2.0"
] | 1,623 | 2015-01-01T08:06:24.000Z | 2022-03-30T19:48:52.000Z | desktop/core/ext-py/pyasn1-modules-0.2.6/tests/test_rfc8418.py | yetsun/hue | 2e48f0cc70e233ee0e1b40733d4b2a18d8836c66 | [
"Apache-2.0"
] | 2,033 | 2015-01-04T07:18:02.000Z | 2022-03-28T19:55:47.000Z | #
# This file is part of pyasn1-modules software.
#
# Created by Russ Housley
# Copyright (c) 2019, Vigil Security, LLC
# License: http://snmplabs.com/pyasn1/license.html
#
import sys
from pyasn1.codec.der import decoder as der_decoder
from pyasn1.codec.der import encoder as der_encoder
from pyasn1_modules import pem
from pyasn1_modules import rfc5280
from pyasn1_modules import rfc8418
try:
import unittest2 as unittest
except ImportError:
import unittest
class KeyAgreeAlgTestCase(unittest.TestCase):
key_agree_alg_id_pem_text = "MBoGCyqGSIb3DQEJEAMUMAsGCWCGSAFlAwQBLQ=="
def setUp(self):
self.asn1Spec = rfc5280.AlgorithmIdentifier()
def testDerCodec(self):
substrate = pem.readBase64fromText(self.key_agree_alg_id_pem_text)
asn1Object, rest = der_decoder.decode(substrate, asn1Spec=self.asn1Spec)
assert not rest
assert asn1Object.prettyPrint()
assert asn1Object['algorithm'] == rfc8418.dhSinglePass_stdDH_hkdf_sha384_scheme
assert asn1Object['parameters'].isValue
assert der_encoder.encode(asn1Object) == substrate
suite = unittest.TestLoader().loadTestsFromModule(sys.modules[__name__])
if __name__ == '__main__':
import sys
result = unittest.TextTestRunner(verbosity=2).run(suite)
sys.exit(not result.wasSuccessful())
| 28.404255 | 87 | 0.755056 | 158 | 1,335 | 6.170886 | 0.531646 | 0.051282 | 0.052308 | 0.070769 | 0.090256 | 0.041026 | 0 | 0 | 0 | 0 | 0 | 0.038496 | 0.163296 | 1,335 | 46 | 88 | 29.021739 | 0.834378 | 0.118352 | 0 | 0.074074 | 0 | 0 | 0.057314 | 0.034217 | 0 | 0 | 0 | 0 | 0.185185 | 1 | 0.074074 | false | 0.037037 | 0.37037 | 0 | 0.518519 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
46af7a553f485120a27537bec039f5248bf23a26 | 6,777 | py | Python | framework/modelhublib/processor.py | modelhub-ai/modelhub-engine | 81e893fb7669ee9912178346efbf828dd8c0410b | [
"MIT"
] | 6 | 2018-10-13T10:11:51.000Z | 2022-02-21T08:28:10.000Z | framework/modelhublib/processor.py | modelhub-ai/modelhub-docker | 81e893fb7669ee9912178346efbf828dd8c0410b | [
"MIT"
] | 34 | 2018-03-06T16:25:10.000Z | 2018-06-26T21:55:13.000Z | framework/modelhublib/processor.py | modelhub-ai/modelhub-engine | 81e893fb7669ee9912178346efbf828dd8c0410b | [
"MIT"
] | 3 | 2019-08-15T18:09:32.000Z | 2022-02-16T07:55:27.000Z | import numpy as np
from .imageloaders import PilImageLoader, SitkImageLoader, NumpyImageLoader
from .imageconverters import PilToNumpyConverter, SitkToNumpyConverter, NumpyToNumpyConverter
class ImageProcessorBase(object):
"""
Abstract base class for image pre- and postprocessing, thus handling all data
processing before and after the inference.
Several methods of this class have to be implemented in a contributed model.
Follow the "Contribute Your Model to Modelhub" guide for detailed instructions.
An image processor handles:
1. Loading of the input image(s).
2. Converting the loaded images to a numpy array
3. Preprocessing the image data (either on the image object or on the numpy array)
After this step the data should be prepared to be directly fed to the inference step.
4. Processing the inference result and convert it to the expected output format.
This class already provides loading and conversion of images using PIL and SimpleITK.
If you need to support image formats which are not covered by those two, you should
implement an additional :class:`~modelhublib.imageloaders.imageLoader.ImageLoader` and
:class:`~modelhublib.imageconverters.imageConverter.ImageConverter`. If you do so,
you will also need to overwrite the constructor (__init__) to instantiate your
loader and converter and include them in the chain of responsibility. Best practice
would be to call the original constructor from your derived class and then change
what you need to change.
Args:
config (dict): Model configuration (loaded from model's config.json)
"""
def __init__(self, config):
self._config = config
self._imageLoader = PilImageLoader(self._config)
self._imageLoader.setSuccessor(SitkImageLoader(self._config))
self._imageLoader._successor.setSuccessor(NumpyImageLoader(self._config))
self._imageToNumpyConverter = PilToNumpyConverter()
self._imageToNumpyConverter.setSuccessor(SitkToNumpyConverter())
self._imageToNumpyConverter._successor.setSuccessor(NumpyToNumpyConverter())
def loadAndPreprocess(self, input, id=None):
"""
Loads input, preprocesses it and returns a numpy array appropriate to feed
into the inference model (4 dimensions: [batchsize, z/color, height, width]).
There should be no need to overwrite this method in a derived class!
Rather overwrite the individual preprocessing steps used by this method!
Args:
input (str): Name of the input file to be loaded
id (str or None): ID of the input when handling multiple inputs
Returns:
numpy array appropriate to feed into the inference model
(4 dimensions: [batchsize, z/color, height, width])
"""
image = self._load(input, id=id)
image = self._preprocessBeforeConversionToNumpy(image)
npArr = self._convertToNumpy(image)
npArr = self._preprocessAfterConversionToNumpy(npArr)
return npArr
def computeOutput(self, inferenceResults):
"""
Abstract method. Overwrite this method to define how to postprocess
the inference results computed by the model into a proper output as
defined in the model configuration file.
Args:
inferenceResults: Results of the inference as computed by the model.
Returns:
Converted inference results into format as defined in the model configuration.
"""
raise NotImplementedError("This is a method of an abstract class.")
def _load(self, input, id=None):
"""
Performs the actual loading of the image.
There should be no need to overwrite this method in a derived class!
Rather implement an additional
:class:`~modelhublib.imageloaders.imageLoader.ImageLoader` to support
further image formats. See also documentation of :class:`~ImageProcessorBase`
above.
Args:
input (str): Name of the input file to be loaded
id (str or None): ID of the input when handling multiple inputs
Returns:
Image object which type will be the native image object type of
the library/handler used for loading (default implementation uses PIL or SimpleITK).
Hence it might not always be the same.
"""
image = self._imageLoader.load(input, id=id)
return image
def _preprocessBeforeConversionToNumpy(self, image):
"""
Perform preprocessing on the loaded image object (see :func:`~modelhublib.processor.ImageProcessorBase._load`).
Overwrite this to implement image preprocessing using the loaded image object.
If not overwritten, just returns the image object unchanged.
When overwriting this, make sure to handle the possible types appropriately
and throw an IOException if you cannot preprocess a certain type.
Args:
image (type = return of :func:`~modelhublib.processor.ImageProcessorBase._load`): Loaded image object
Returns:
Image object which must be of the same type as input image object.
"""
return image
def _convertToNumpy(self, image):
"""
Converts the image object into a corresponding numpy array
with 4 dimensions: [batchsize, z/color, height, width].
There should be no need to overwrite this method in a derived class!
Rather implement an additional
:class:`~modelhublib.imageconverters.imageConverter.ImageConverter` to support
further image format conversions. See also documentation of
:class:`~ImageProcessorBase` above.
Args:
image: (type = return of :func:`~modelhublib.processor.ImageProcessorBase._preprocessBeforeConversionToNumpy`): Loaded and preproceesed image object.
Returns:
Representation of the input image as numpy array with 4 dimensions [batchsize, z/color, height, width].
"""
npArr = self._imageToNumpyConverter.convert(image)
return npArr
def _preprocessAfterConversionToNumpy(self, npArr):
"""
Perform preprocessing on the numpy array (the result of _convertToNumpy()).
Overwrite this to implement preprocessing on the converted numpy array.
If not overwritten, just returns the input array unchanged.
Args:
npArr (numpy array): input data after conversion by :func:`~modelhublib.processor.ImageProcessorBase._convertToNumpy`
Returns:
Preprocessed numpy array with 4 dimensions [batchsize, z/color, height, width].
"""
return npArr
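An illustrative sketch of the kind of derived processor the docstrings describe. `_StubBase` stands in for `ImageProcessorBase` (which needs a model config to construct), and the scaling/output shapes are hypothetical, not a real contributed model:

```python
import numpy as np

class _StubBase(object):
    # Stand-in for ImageProcessorBase; the real base wires up loaders/converters.
    def _preprocessAfterConversionToNumpy(self, npArr):
        return npArr

class ExampleProcessor(_StubBase):
    def _preprocessAfterConversionToNumpy(self, npArr):
        # Scale 8-bit intensities into [0, 1] before inference
        return npArr.astype(np.float32) / 255.0

    def computeOutput(self, inferenceResults):
        # Convert a raw probability vector into a list-of-dicts shape
        # such as a model configuration might declare as its output
        return [{'label': i, 'probability': float(p)}
                for i, p in enumerate(inferenceResults)]

proc = ExampleProcessor()
arr = proc._preprocessAfterConversionToNumpy(np.array([[0, 255]]))
output = proc.computeOutput([0.1, 0.9])
```

A real model would subclass `ImageProcessorBase` directly and at minimum override `computeOutput`, which the base leaves abstract.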
| 42.35625 | 161 | 0.698539 | 797 | 6,777 | 5.895859 | 0.282309 | 0.02575 | 0.012769 | 0.022345 | 0.293467 | 0.261119 | 0.234731 | 0.234731 | 0.191104 | 0.16429 | 0 | 0.00176 | 0.245241 | 6,777 | 159 | 162 | 42.622642 | 0.916911 | 0.673749 | 0 | 0.166667 | 0 | 0 | 0.024158 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.233333 | false | 0 | 0.1 | 0 | 0.533333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
d3c3b41b3a1f9af12d5e5fd2bb468f6513b54550 | 792 | py | Python | LeetCode/Python/gas_station.py | wh-acmer/minixalpha-acm | cb684ad70eaa61d42a445364cb3ee195b9e9302e | [
"MIT"
] | null | null | null | LeetCode/Python/gas_station.py | wh-acmer/minixalpha-acm | cb684ad70eaa61d42a445364cb3ee195b9e9302e | [
"MIT"
] | null | null | null | LeetCode/Python/gas_station.py | wh-acmer/minixalpha-acm | cb684ad70eaa61d42a445364cb3ee195b9e9302e | [
"MIT"
] | null | null | null | #!/usr/bin/env python
#coding: utf-8
class Solution:
# @param gas, a list of integers
# @param cost, a list of integers
# @return an integer
def canCompleteCircuit(self, gas, cost):
total, tank, start = 0, 0, 0
lg = len(gas)
for i in range(lg):
tank = tank + gas[i] - cost[i]
if tank < 0:
start = i + 1
total += tank
tank = 0
return start if total + tank >= 0 else -1
if __name__ == '__main__':
s = Solution()
assert -1 == s.canCompleteCircuit([0], [1])
assert 0 == s.canCompleteCircuit([1], [0])
assert 0 == s.canCompleteCircuit([2, 1], [1, 2])
assert -1 == s.canCompleteCircuit([2, 1], [2, 2])
assert 1 == s.canCompleteCircuit([2, 2], [3, 1])
| 29.333333 | 53 | 0.525253 | 108 | 792 | 3.777778 | 0.37963 | 0.232843 | 0.058824 | 0.191176 | 0.137255 | 0.137255 | 0 | 0 | 0 | 0 | 0 | 0.056285 | 0.32702 | 792 | 26 | 54 | 30.461538 | 0.709193 | 0.145202 | 0 | 0 | 0 | 0 | 0.011905 | 0 | 0 | 0 | 0 | 0 | 0.277778 | 1 | 0.055556 | false | 0 | 0 | 0 | 0.166667 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
d3d52982b72209958bac441c4d9f2bd1e92d764c | 3,952 | py | Python | exercise_2.py | srinivas-narayan/SimpleITKProject | 210385f028a9f55242f5a7102fdbf9971d832177 | [
"Apache-2.0"
] | null | null | null | exercise_2.py | srinivas-narayan/SimpleITKProject | 210385f028a9f55242f5a7102fdbf9971d832177 | [
"Apache-2.0"
] | null | null | null | exercise_2.py | srinivas-narayan/SimpleITKProject | 210385f028a9f55242f5a7102fdbf9971d832177 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
import SimpleITK as sitk
import argparse
def convert_to_binary_mask( image ):
#=================== YOUR CODE HERE ==============================
# Assign 1 to every voxel in the image that is different from 0.
# Use a simpleITK filter
#=================================================================
return new_img
def dilate_mask( image ):
filter = sitk.BinaryDilateImageFilter()
filter.SetKernelRadius ( 1 )
filter.SetForegroundValue ( 1 )
dilated = filter.Execute ( image )
return dilated
def erode_mask( image ):
#=================== YOUR CODE HERE ==============================
# Based on the code for dilate_mask, implement this function so
# that the input image is eroded. Use the same kernel radius.
#=================================================================
return eroded
def obtain_border( image ):
#=================== YOUR CODE HERE ==============================
# Obtain a rough estimate of the border of the masked object
# by subtracting the eroded mask from the dilated mask
#=================================================================
return border_img
print('SimpleITK: Basic filters')
# This lines of code allow to read arguments from the command line
parser = argparse.ArgumentParser()
parser.add_argument("-i", "--img", required=True, help="Input image")
parser.add_argument("-m", "--mask", required=True, help="Mask image")
parser.add_argument("-o", "--out", required=True, help="Output image")
parser.add_argument("-b", "--border", required=True, help="Border image")
args = parser.parse_args()
#1- Reading the original image and displaying it
image = sitk.ReadImage ( args.img )
sitk.Show ( image, "Original image" )
# Smooth filter: A Gaussian filter is applied to the input
# image and it is displayed. Sigma = 2.0
smooth = sitk.SmoothingRecursiveGaussian ( image, 2.0 )
sitk.Show ( smooth, "Gaussian smoothing, sigma 2.0" )
# An alternative way to use the SmoothingRecursiveGaussian
# The image is displayed. New sigma is 4.0
# Can you see any differences in the image?
Gaussian = sitk.SmoothingRecursiveGaussian
smooth = Gaussian ( image, 4. )
sitk.Show ( smooth, "Gaussian smoothing, sigma 4.0" )
#=================== YOUR CODE HERE ==============================
# We want to display the difference between the original
# image and the subtracted one. The following piece of
# code tries to do so. However, as you will see when you
# run it, there is an error.
# Your task is to write the necessary code to fix this
# problem.
# Hint: Run the program and check the displayed error.
# Hint 2: Investigate the usage of the sitk.Cast function
#=================================================================
sitk.Show ( sitk.Subtract ( image, smooth ), "DiffWithGaussian" )
#=================== YOUR CODE HERE ==============================
# In exercise_1, you coded the function convert_to_binary_mask
# through the use of numpy. SimpleITK has a BinaryThreshold
# filter that can do the same.
# Implement convert_to_binary_mask. Read the mask image passed as
# argument and apply convert_to_binary_mask function. Save the
# result to disk into args.out. The results should be the same
# you obtained in exercise_1.
#=================================================================
# =================== YOUR CODE HERE ==============================
# Implement obtain_border to get a rough estimation of the borders
# of the object contained in the previously computed mask.
# Rough borders of an object are usually obtained by subtracting the
# dilated and eroded versions of the original mask. See dilate_mask
# for an example on how to dilate/erode using simpleITK.
# Display the result border image. WARNING: You might need to
# adjust the window/level settings to see the image properly.
#=================================================================
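A hedged sketch of obtain_border under the description above. The dilate_mask helper it refers to is not shown in this excerpt, so the filter calls are written out directly, and the kernel radius of 1 is an assumption rather than part of the exercise text.

```python
# Sketch only: rough border = dilated mask minus eroded mask.
# The kernel radius of 1 is an assumption, not from the exercise.
import SimpleITK as sitk

def obtain_border(mask):
    dilate = sitk.BinaryDilateImageFilter()
    dilate.SetKernelRadius(1)
    erode = sitk.BinaryErodeImageFilter()
    erode.SetKernelRadius(1)
    return sitk.Subtract(dilate.Execute(mask), erode.Execute(mask))
```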
# File: tests/test_reset_password.py (repo: zevaverbach/epcon, license: BSD-2-Clause)
from pytest import mark
from django.core import mail
from django.core.urlresolvers import reverse
from django_factory_boy import auth as auth_factories
from tests.common_tools import template_used
@mark.django_db
def test_reset_password(client):
"""
Testing full reset password flow, from getting to the reset password page,
through sending email with unique token, to using that url to change the
password.
"""
url = reverse("accounts:password_reset")
assert url == "/accounts/password-reset/"
response = client.get(url)
assert template_used(response, "ep19/bs/accounts/password_reset.html")
# make sure that we're not using default template from django admin
assert "Django Administration" not in response.content.decode("utf-8")
assert 'input type="email"' in response.content.decode("utf-8")
# --------
response = client.post(url, {"email": "joedoe@example.com"})
# successful redirect, but no email sent because user doesn't exist
assert response.status_code == 302
assert response.url.endswith("/accounts/password-reset/done/")
assert len(mail.outbox) == 0
# --------
auth_factories.UserFactory(email="joedoe@example.com")
response = client.post(url, {"email": "joedoe@example.com"})
# successful redirect, and one email sent because user exists
assert response.status_code == 302
assert response.url.endswith("/accounts/password-reset/done/")
assert len(mail.outbox) == 1
response = client.get(reverse("accounts:password_reset_done"))
assert template_used(response, "ep19/bs/accounts/password_reset_done.html")
# --------
email = mail.outbox[0]
assert email.to == ["joedoe@example.com"]
assert email.subject == "EuroPython2019: Reset password link"
# get a relative url from the middle of the email.
url_from_email = email.body.splitlines()[6].split("example.com")[1]
response = client.get(url_from_email)
# This should be a template with two password inputs
assert template_used(
response, "ep19/bs/accounts/password_reset_confirm.html"
)
assert "Django Administration" not in response.content.decode("utf-8")
assert 'name="new_password1"' in response.content.decode("utf-8")
assert 'name="new_password2"' in response.content.decode("utf-8")
print(email.body)
# --------
response = client.post(
url_from_email,
{"new_password1": "asdf", "new_password2": "asdf"},
follow=True,
)
assert template_used(
response, "ep19/bs/accounts/password_reset_complete.html"
)
assert "Django Administration" not in response.content.decode("utf-8")
# File: Recursion/Generate Div Tags/GenerateDivTags.py (repo: joydeepnandi/Algo, license: MIT)
# O((2n)! / (n! * (n + 1)!)) time | O((2n)! / (n! * (n + 1)!)) space -
# where n is the input number
def generateDivTags(numberOfTags):
matchedDivTags = []
generateDivTagsFromPrefix(numberOfTags, numberOfTags, "", matchedDivTags)
return matchedDivTags
def generateDivTagsFromPrefix(openingTagsNeeded, closingTagsNeeded, prefix, result):
if openingTagsNeeded > 0:
newPrefix = prefix + "<div>"
generateDivTagsFromPrefix(openingTagsNeeded - 1, closingTagsNeeded, newPrefix, result)
if openingTagsNeeded < closingTagsNeeded:
newPrefix = prefix + "</div>"
generateDivTagsFromPrefix(openingTagsNeeded, closingTagsNeeded - 1, newPrefix, result)
if closingTagsNeeded == 0:
    result.append(prefix)
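The backtracking above can be restated standalone (snake_case names are this sketch's choice, not the original's) with a quick check of its output; the counts 2 and 5 are the Catalan numbers C(2) and C(3).

```python
def generate_div_tags(number_of_tags):
    # Same backtracking as generateDivTags above, restated standalone.
    result = []

    def helper(opening_needed, closing_needed, prefix):
        if opening_needed > 0:
            helper(opening_needed - 1, closing_needed, prefix + "<div>")
        if opening_needed < closing_needed:
            helper(opening_needed, closing_needed - 1, prefix + "</div>")
        if closing_needed == 0:
            result.append(prefix)

    helper(number_of_tags, number_of_tags, "")
    return result

assert generate_div_tags(2) == ["<div><div></div></div>", "<div></div><div></div>"]
assert len(generate_div_tags(3)) == 5  # Catalan number C(3)
```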
# File: reposter.py (repo: plus20charisma/reddit-user-reposter, license: MIT)
import praw
import sqlite3
import time
#config
USERAGENT = "/u/plus20charisma's post & comment reposter by user & karma threshold."
USERNAME = ""#bot's username
PASSWORD = ""#bot's password
USERSCRAPE = "" #what user to scrape & repost from
SUBREDDIT = "" #what subreddit to post in. don't put /r/ or r/
MAXPOSTS = 100 #how far back to look. max is 100
KARMATHRESHOLD = 1000 #anything with karma greater than this number will be reposted
WAIT = 30 #wait time in seconds between runs
#makes a database if one doesn't exist.
#opens db if it already exists
print ("Opening database...")
sql = sqlite3.connect("sql.db")
cur = sql.cursor()
cur.execute('CREATE TABLE IF NOT EXISTS oldsubmissions(ID TEXT)')
cur.execute('CREATE TABLE IF NOT EXISTS oldcomments(ID TEXT)')
sql.commit()
print("Logging in to reddit...")
r = praw.Reddit(USERAGENT)
r.login(USERNAME, PASSWORD)
#comment scanner & reposter
def rep_bot_coms():
user = r.get_redditor(USERSCRAPE)
comments = user.get_comments(limit=MAXPOSTS)
print ("Checking comments...")
for comment in comments:
#checks if comment already exists in db
cur.execute('SELECT * FROM oldcomments WHERE ID=?', [comment.id])
if not cur.fetchone():
try:
if comment.score > KARMATHRESHOLD:
print ("Reposting comment above Karma Threshold...")
repost_comment_body = comment.body
#since comments don't have titles, the first 10 words become post title
repost_title_list = repost_comment_body.split()
repost_title = ' '.join(repost_title_list[:10]) + "..."
r.submit(SUBREDDIT, repost_title, repost_comment_body + "\n Permalink: " + comment.permalink +
"""\n***\n> I'm a bot that is reposting from /u/%s. If there's
any trouble, please message /u/plus20charisma!""" % USERSCRAPE)
print ("Comment posted to /r/"+SUBREDDIT+"!")
except AttributeError:
pass
#writes comment to db by id
cur.execute('INSERT INTO oldcomments VALUES(?)', [comment.id])
sql.commit()
print ("No more comments to submit!")
#same as rep_bot_coms() but for submissions
def rep_bot_submitted():
user = r.get_redditor(USERSCRAPE)
posts = user.get_submitted(limit=MAXPOSTS)
print ("Checking submissions...")
for post in posts:
cur.execute('SELECT * FROM oldsubmissions WHERE ID=?', [post.id])
if not cur.fetchone():
try:
if post.score > KARMATHRESHOLD:
print ("Reposting submission above Karma Threshold...")
repost_title = post.title
repost_url = post.url
r.submit(SUBREDDIT, repost_title, "Title:" + repost_title + repost_url +
"""\nOriginal Submission: %s \n***\n> I'm a bot that is reposting from /u/%s.
If there's any trouble, please message /u/plus20charisma!""" % (post.short_link, USERSCRAPE))
print "Submission posted to /r/"+SUBREDDIT+"!"
except AttributeError:
pass
cur.execute('INSERT INTO oldsubmissions VALUES(?)', [post.id])
sql.commit()
print ("No more submissions to submit!")
while True:
rep_bot_submitted()
rep_bot_coms()
print "Waiting " + str(WAIT) + " seconds"
time.sleep(WAIT)
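The comment-to-title rule in rep_bot_coms above (the first ten words of the comment body become the post title, suffixed with an ellipsis) can be checked standalone:

```python
# Standalone check of the title rule used in rep_bot_coms: since
# comments have no titles, the first 10 words become the post title.
repost_comment_body = "one two three four five six seven eight nine ten eleven"
repost_title_list = repost_comment_body.split()
repost_title = ' '.join(repost_title_list[:10]) + "..."
assert repost_title == "one two three four five six seven eight nine ten..."
```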
# File: leetcode/1523_count_odd_numbers_in_an_interval_range.py (repo: jacquerie/leetcode, license: MIT)
# -*- coding: utf-8 -*-
class Solution:
def countOdds(self, low: int, high: int) -> int:
return (high + 1) // 2 - low // 2
if __name__ == '__main__':
solution = Solution()
assert 3 == solution.countOdds(3, 7)
assert 1 == solution.countOdds(8, 10)
# File: packtml/__init__.py (repo: cicorias/supv-ml-py, license: MIT)
# -*- coding: utf-8 -*-
import os
# global namespace:
from packtml import clustering
from packtml import decision_tree
from packtml import metrics
from packtml import neural_net
from packtml import recommendation
from packtml import regression
from packtml import utils
# set the version
packtml_location = os.path.abspath(os.path.dirname(__file__))
with open(os.path.join(packtml_location, "VERSION")) as vsn:
__version__ = vsn.read().strip()
# remove from global namespace
del os
del packtml_location
del vsn
__all__ = [
'clustering',
'decision_tree',
'metrics',
'neural_net',
'recommendation',
'regression',
'utils'
]
# File: dyncache/functionhash.py (repo: b10011/dyncache, license: MIT)
from hashlib import sha256
import cloudpickle
from .version import PYTHON_VERSION
def get_function_hash_py35_to_py37(function):
code = function.__code__
return sha256(
cloudpickle.dumps(
(
code.co_argcount,
code.co_cellvars,
code.co_code,
code.co_consts,
code.co_flags,
code.co_freevars,
code.co_kwonlyargcount,
code.co_lnotab,
code.co_name,
code.co_names,
code.co_nlocals,
code.co_stacksize,
code.co_varnames,
)
)
).digest()
def get_function_hash_python38_to_latest(function):
code = function.__code__
return sha256(
cloudpickle.dumps(
(
code.co_argcount,
code.co_cellvars,
code.co_code,
code.co_consts,
code.co_flags,
code.co_freevars,
code.co_kwonlyargcount,
code.co_lnotab,
code.co_name,
code.co_names,
code.co_nlocals,
code.co_posonlyargcount,
code.co_stacksize,
code.co_varnames,
)
)
).digest()
if PYTHON_VERSION >= (3, 8):
get_function_hash = get_function_hash_python38_to_latest
elif PYTHON_VERSION >= (3, 5):
get_function_hash = get_function_hash_py35_to_py37
else:
raise Exception("Python version not supported")
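The same idea as get_function_hash can be exercised without cloudpickle. The sketch below is a stdlib-only analogue that hashes a repr of a few `__code__` fields instead of a cloudpickle dump, so it is a simplification of the library's hash, not the library's exact behavior:

```python
from hashlib import sha256

def code_fingerprint(function):
    # Stdlib-only analogue of get_function_hash: same idea, but hashes
    # a repr of a few __code__ fields instead of a cloudpickle dump.
    code = function.__code__
    fields = (code.co_argcount, code.co_code, code.co_consts,
              code.co_names, code.co_varnames)
    return sha256(repr(fields).encode()).hexdigest()

def f(x):
    return x + 1

def g(x):
    return x + 1  # same body as f -> same fingerprint

def h(x):
    return x + 2  # different constant -> different fingerprint

assert code_fingerprint(f) == code_fingerprint(g)
assert code_fingerprint(f) != code_fingerprint(h)
```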
# File: level2_1 - Bunny Prisoner Locating/solution.py (repo: berman82312/google-foobar, license: MIT)
def solution(x, y):
result = 0
if y == 1:
result = (1 + x) * x / 2
elif y == 2:
result = (1 + x) * x / 2 + x
else:
end = x + y - 2
result = (1 + end) * end / 2 + x
return str(result)
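The closed form above can be cross-checked with a brute-force walk over the anti-diagonal numbering; the sketch restates the formula inline so it runs on its own, assuming the Foobar grid where (1,1)=1, (1,2)=2, (2,1)=3, (1,3)=4, and so on.

```python
def diagonal_value(x, y):
    # Brute force: walk the anti-diagonals in fill order; diagonal d
    # holds cells (i, d + 1 - i) with x increasing within the diagonal.
    value, d = 0, 1
    while True:
        for i in range(1, d + 1):
            value += 1
            if (i, d + 1 - i) == (x, y):
                return value
        d += 1

def closed_form(x, y):
    # Restates solution(x, y) above: diagonal d = x + y - 1 starts
    # after d * (d - 1) / 2 cells, and (x, y) is the x-th cell on it.
    return (x + y - 1) * (x + y - 2) // 2 + x

for x in range(1, 8):
    for y in range(1, 8):
        assert closed_form(x, y) == diagonal_value(x, y)
```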
# File: pysnmp/MAU-MIB.py (repo: agustinhenze/mibs.snmplabs.com, license: Apache-2.0)
#
# PySNMP MIB module MAU-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/MAU-MIB
# Produced by pysmi-0.3.4 at Mon Apr 29 17:49:53 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
ObjectIdentifier, Integer, OctetString = mibBuilder.importSymbols("ASN1", "ObjectIdentifier", "Integer", "OctetString")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ConstraintsUnion, ValueSizeConstraint, ValueRangeConstraint, ConstraintsIntersection, SingleValueConstraint = mibBuilder.importSymbols("ASN1-REFINEMENT", "ConstraintsUnion", "ValueSizeConstraint", "ValueRangeConstraint", "ConstraintsIntersection", "SingleValueConstraint")
IANAifJackType, IANAifMauTypeListBits, IANAifMauMediaAvailable, IANAifMauAutoNegCapBits = mibBuilder.importSymbols("IANA-MAU-MIB", "IANAifJackType", "IANAifMauTypeListBits", "IANAifMauMediaAvailable", "IANAifMauAutoNegCapBits")
InterfaceIndex, = mibBuilder.importSymbols("IF-MIB", "InterfaceIndex")
ModuleCompliance, ObjectGroup, NotificationGroup = mibBuilder.importSymbols("SNMPv2-CONF", "ModuleCompliance", "ObjectGroup", "NotificationGroup")
MibIdentifier, iso, mib_2, Gauge32, Counter64, ModuleIdentity, Unsigned32, NotificationType, Counter32, Bits, TimeTicks, IpAddress, ObjectIdentity, Integer32, MibScalar, MibTable, MibTableRow, MibTableColumn = mibBuilder.importSymbols("SNMPv2-SMI", "MibIdentifier", "iso", "mib-2", "Gauge32", "Counter64", "ModuleIdentity", "Unsigned32", "NotificationType", "Counter32", "Bits", "TimeTicks", "IpAddress", "ObjectIdentity", "Integer32", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn")
TextualConvention, DisplayString, AutonomousType, TruthValue = mibBuilder.importSymbols("SNMPv2-TC", "TextualConvention", "DisplayString", "AutonomousType", "TruthValue")
mauMod = ModuleIdentity((1, 3, 6, 1, 2, 1, 26, 6))
mauMod.setRevisions(('2007-04-21 00:00', '2003-09-19 00:00', '1999-08-24 04:00', '1997-10-31 00:00', '1993-09-30 00:00',))
if mibBuilder.loadTexts: mauMod.setLastUpdated('200704210000Z')
if mibBuilder.loadTexts: mauMod.setOrganization('IETF Ethernet Interfaces and Hub MIB Working Group')
snmpDot3MauMgt = MibIdentifier((1, 3, 6, 1, 2, 1, 26))
class JackType(TextualConvention, Integer32):
status = 'deprecated'
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14))
namedValues = NamedValues(("other", 1), ("rj45", 2), ("rj45S", 3), ("db9", 4), ("bnc", 5), ("fAUI", 6), ("mAUI", 7), ("fiberSC", 8), ("fiberMIC", 9), ("fiberST", 10), ("telco", 11), ("mtrj", 12), ("hssdc", 13), ("fiberLC", 14))
dot3RpMauBasicGroup = MibIdentifier((1, 3, 6, 1, 2, 1, 26, 1))
dot3IfMauBasicGroup = MibIdentifier((1, 3, 6, 1, 2, 1, 26, 2))
dot3BroadMauBasicGroup = MibIdentifier((1, 3, 6, 1, 2, 1, 26, 3))
dot3IfMauAutoNegGroup = MibIdentifier((1, 3, 6, 1, 2, 1, 26, 5))
rpMauTable = MibTable((1, 3, 6, 1, 2, 1, 26, 1, 1), )
if mibBuilder.loadTexts: rpMauTable.setStatus('current')
rpMauEntry = MibTableRow((1, 3, 6, 1, 2, 1, 26, 1, 1, 1), ).setIndexNames((0, "MAU-MIB", "rpMauGroupIndex"), (0, "MAU-MIB", "rpMauPortIndex"), (0, "MAU-MIB", "rpMauIndex"))
if mibBuilder.loadTexts: rpMauEntry.setStatus('current')
rpMauGroupIndex = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 1, 1, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 2147483647))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rpMauGroupIndex.setStatus('current')
rpMauPortIndex = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 1, 1, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 2147483647))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rpMauPortIndex.setStatus('current')
rpMauIndex = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 1, 1, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 2147483647))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rpMauIndex.setStatus('current')
rpMauType = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 1, 1, 1, 4), AutonomousType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rpMauType.setStatus('current')
rpMauStatus = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 1, 1, 1, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6))).clone(namedValues=NamedValues(("other", 1), ("unknown", 2), ("operational", 3), ("standby", 4), ("shutdown", 5), ("reset", 6)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rpMauStatus.setStatus('current')
rpMauMediaAvailable = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 1, 1, 1, 6), IANAifMauMediaAvailable()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rpMauMediaAvailable.setStatus('current')
rpMauMediaAvailableStateExits = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 1, 1, 1, 7), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rpMauMediaAvailableStateExits.setStatus('current')
rpMauJabberState = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 1, 1, 1, 8), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4))).clone(namedValues=NamedValues(("other", 1), ("unknown", 2), ("noJabber", 3), ("jabbering", 4)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rpMauJabberState.setStatus('current')
rpMauJabberingStateEnters = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 1, 1, 1, 9), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rpMauJabberingStateEnters.setStatus('current')
rpMauFalseCarriers = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 1, 1, 1, 10), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rpMauFalseCarriers.setStatus('current')
rpJackTable = MibTable((1, 3, 6, 1, 2, 1, 26, 1, 2), )
if mibBuilder.loadTexts: rpJackTable.setStatus('current')
rpJackEntry = MibTableRow((1, 3, 6, 1, 2, 1, 26, 1, 2, 1), ).setIndexNames((0, "MAU-MIB", "rpMauGroupIndex"), (0, "MAU-MIB", "rpMauPortIndex"), (0, "MAU-MIB", "rpMauIndex"), (0, "MAU-MIB", "rpJackIndex"))
if mibBuilder.loadTexts: rpJackEntry.setStatus('current')
rpJackIndex = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 1, 2, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 2147483647)))
if mibBuilder.loadTexts: rpJackIndex.setStatus('current')
rpJackType = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 1, 2, 1, 2), IANAifJackType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rpJackType.setStatus('current')
ifMauTable = MibTable((1, 3, 6, 1, 2, 1, 26, 2, 1), )
if mibBuilder.loadTexts: ifMauTable.setStatus('current')
ifMauEntry = MibTableRow((1, 3, 6, 1, 2, 1, 26, 2, 1, 1), ).setIndexNames((0, "MAU-MIB", "ifMauIfIndex"), (0, "MAU-MIB", "ifMauIndex"))
if mibBuilder.loadTexts: ifMauEntry.setStatus('current')
ifMauIfIndex = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 2, 1, 1, 1), InterfaceIndex()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ifMauIfIndex.setStatus('current')
ifMauIndex = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 2, 1, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 2147483647))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ifMauIndex.setStatus('current')
ifMauType = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 2, 1, 1, 3), AutonomousType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ifMauType.setStatus('current')
ifMauStatus = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 2, 1, 1, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6))).clone(namedValues=NamedValues(("other", 1), ("unknown", 2), ("operational", 3), ("standby", 4), ("shutdown", 5), ("reset", 6)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: ifMauStatus.setStatus('current')
ifMauMediaAvailable = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 2, 1, 1, 5), IANAifMauMediaAvailable()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ifMauMediaAvailable.setStatus('current')
ifMauMediaAvailableStateExits = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 2, 1, 1, 6), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ifMauMediaAvailableStateExits.setStatus('current')
ifMauJabberState = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 2, 1, 1, 7), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4))).clone(namedValues=NamedValues(("other", 1), ("unknown", 2), ("noJabber", 3), ("jabbering", 4)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ifMauJabberState.setStatus('current')
ifMauJabberingStateEnters = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 2, 1, 1, 8), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ifMauJabberingStateEnters.setStatus('current')
ifMauFalseCarriers = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 2, 1, 1, 9), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ifMauFalseCarriers.setStatus('current')
ifMauTypeList = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 2, 1, 1, 10), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ifMauTypeList.setStatus('deprecated')
ifMauDefaultType = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 2, 1, 1, 11), AutonomousType()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: ifMauDefaultType.setStatus('current')
ifMauAutoNegSupported = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 2, 1, 1, 12), TruthValue()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ifMauAutoNegSupported.setStatus('current')
ifMauTypeListBits = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 2, 1, 1, 13), IANAifMauTypeListBits()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ifMauTypeListBits.setStatus('current')
ifMauHCFalseCarriers = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 2, 1, 1, 14), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ifMauHCFalseCarriers.setStatus('current')
ifJackTable = MibTable((1, 3, 6, 1, 2, 1, 26, 2, 2), )
if mibBuilder.loadTexts: ifJackTable.setStatus('current')
ifJackEntry = MibTableRow((1, 3, 6, 1, 2, 1, 26, 2, 2, 1), ).setIndexNames((0, "MAU-MIB", "ifMauIfIndex"), (0, "MAU-MIB", "ifMauIndex"), (0, "MAU-MIB", "ifJackIndex"))
if mibBuilder.loadTexts: ifJackEntry.setStatus('current')
ifJackIndex = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 2, 2, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 2147483647)))
if mibBuilder.loadTexts: ifJackIndex.setStatus('current')
ifJackType = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 2, 2, 1, 2), IANAifJackType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ifJackType.setStatus('current')
ifMauAutoNegTable = MibTable((1, 3, 6, 1, 2, 1, 26, 5, 1), )
if mibBuilder.loadTexts: ifMauAutoNegTable.setStatus('current')
ifMauAutoNegEntry = MibTableRow((1, 3, 6, 1, 2, 1, 26, 5, 1, 1), ).setIndexNames((0, "MAU-MIB", "ifMauIfIndex"), (0, "MAU-MIB", "ifMauIndex"))
if mibBuilder.loadTexts: ifMauAutoNegEntry.setStatus('current')
ifMauAutoNegAdminStatus = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 5, 1, 1, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("enabled", 1), ("disabled", 2)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: ifMauAutoNegAdminStatus.setStatus('current')
ifMauAutoNegRemoteSignaling = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 5, 1, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("detected", 1), ("notdetected", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ifMauAutoNegRemoteSignaling.setStatus('current')
ifMauAutoNegConfig = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 5, 1, 1, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5))).clone(namedValues=NamedValues(("other", 1), ("configuring", 2), ("complete", 3), ("disabled", 4), ("parallelDetectFail", 5)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ifMauAutoNegConfig.setStatus('current')
ifMauAutoNegCapability = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 5, 1, 1, 5), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ifMauAutoNegCapability.setStatus('deprecated')
ifMauAutoNegCapAdvertised = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 5, 1, 1, 6), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: ifMauAutoNegCapAdvertised.setStatus('deprecated')
ifMauAutoNegCapReceived = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 5, 1, 1, 7), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ifMauAutoNegCapReceived.setStatus('deprecated')
ifMauAutoNegRestart = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 5, 1, 1, 8), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("restart", 1), ("norestart", 2)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: ifMauAutoNegRestart.setStatus('current')
ifMauAutoNegCapabilityBits = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 5, 1, 1, 9), IANAifMauAutoNegCapBits()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ifMauAutoNegCapabilityBits.setStatus('current')
ifMauAutoNegCapAdvertisedBits = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 5, 1, 1, 10), IANAifMauAutoNegCapBits()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: ifMauAutoNegCapAdvertisedBits.setStatus('current')
ifMauAutoNegCapReceivedBits = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 5, 1, 1, 11), IANAifMauAutoNegCapBits()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ifMauAutoNegCapReceivedBits.setStatus('current')
ifMauAutoNegRemoteFaultAdvertised = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 5, 1, 1, 12), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4))).clone(namedValues=NamedValues(("noError", 1), ("offline", 2), ("linkFailure", 3), ("autoNegError", 4)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: ifMauAutoNegRemoteFaultAdvertised.setStatus('current')
ifMauAutoNegRemoteFaultReceived = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 5, 1, 1, 13), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4))).clone(namedValues=NamedValues(("noError", 1), ("offline", 2), ("linkFailure", 3), ("autoNegError", 4)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ifMauAutoNegRemoteFaultReceived.setStatus('current')
broadMauBasicTable = MibTable((1, 3, 6, 1, 2, 1, 26, 3, 1), )
if mibBuilder.loadTexts: broadMauBasicTable.setStatus('deprecated')
broadMauBasicEntry = MibTableRow((1, 3, 6, 1, 2, 1, 26, 3, 1, 1), ).setIndexNames((0, "MAU-MIB", "broadMauIfIndex"), (0, "MAU-MIB", "broadMauIndex"))
if mibBuilder.loadTexts: broadMauBasicEntry.setStatus('deprecated')
broadMauIfIndex = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 3, 1, 1, 1), InterfaceIndex()).setMaxAccess("readonly")
if mibBuilder.loadTexts: broadMauIfIndex.setStatus('deprecated')
broadMauIndex = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 3, 1, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 2147483647))).setMaxAccess("readonly")
if mibBuilder.loadTexts: broadMauIndex.setStatus('deprecated')
broadMauXmtRcvSplitType = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 3, 1, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("other", 1), ("single", 2), ("dual", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: broadMauXmtRcvSplitType.setStatus('deprecated')
broadMauXmtCarrierFreq = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 3, 1, 1, 4), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: broadMauXmtCarrierFreq.setStatus('deprecated')
broadMauTranslationFreq = MibTableColumn((1, 3, 6, 1, 2, 1, 26, 3, 1, 1, 5), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: broadMauTranslationFreq.setStatus('deprecated')
snmpDot3MauTraps = MibIdentifier((1, 3, 6, 1, 2, 1, 26, 0))
rpMauJabberTrap = NotificationType((1, 3, 6, 1, 2, 1, 26, 0, 1)).setObjects(("MAU-MIB", "rpMauJabberState"))
if mibBuilder.loadTexts: rpMauJabberTrap.setStatus('current')
ifMauJabberTrap = NotificationType((1, 3, 6, 1, 2, 1, 26, 0, 2)).setObjects(("MAU-MIB", "ifMauJabberState"))
if mibBuilder.loadTexts: ifMauJabberTrap.setStatus('current')
mauModConf = MibIdentifier((1, 3, 6, 1, 2, 1, 26, 6, 1))
mauModCompls = MibIdentifier((1, 3, 6, 1, 2, 1, 26, 6, 1, 1))
mauModObjGrps = MibIdentifier((1, 3, 6, 1, 2, 1, 26, 6, 1, 2))
mauModNotGrps = MibIdentifier((1, 3, 6, 1, 2, 1, 26, 6, 1, 3))
mauRpGrpBasic = ObjectGroup((1, 3, 6, 1, 2, 1, 26, 6, 1, 2, 1)).setObjects(("MAU-MIB", "rpMauGroupIndex"), ("MAU-MIB", "rpMauPortIndex"), ("MAU-MIB", "rpMauIndex"), ("MAU-MIB", "rpMauType"), ("MAU-MIB", "rpMauStatus"), ("MAU-MIB", "rpMauMediaAvailable"), ("MAU-MIB", "rpMauMediaAvailableStateExits"), ("MAU-MIB", "rpMauJabberState"), ("MAU-MIB", "rpMauJabberingStateEnters"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    mauRpGrpBasic = mauRpGrpBasic.setStatus('current')
mauRpGrp100Mbs = ObjectGroup((1, 3, 6, 1, 2, 1, 26, 6, 1, 2, 2)).setObjects(("MAU-MIB", "rpMauFalseCarriers"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    mauRpGrp100Mbs = mauRpGrp100Mbs.setStatus('current')
mauRpGrpJack = ObjectGroup((1, 3, 6, 1, 2, 1, 26, 6, 1, 2, 3)).setObjects(("MAU-MIB", "rpJackType"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    mauRpGrpJack = mauRpGrpJack.setStatus('current')
mauIfGrpBasic = ObjectGroup((1, 3, 6, 1, 2, 1, 26, 6, 1, 2, 4)).setObjects(("MAU-MIB", "ifMauIfIndex"), ("MAU-MIB", "ifMauIndex"), ("MAU-MIB", "ifMauType"), ("MAU-MIB", "ifMauStatus"), ("MAU-MIB", "ifMauMediaAvailable"), ("MAU-MIB", "ifMauMediaAvailableStateExits"), ("MAU-MIB", "ifMauJabberState"), ("MAU-MIB", "ifMauJabberingStateEnters"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    mauIfGrpBasic = mauIfGrpBasic.setStatus('current')
mauIfGrp100Mbs = ObjectGroup((1, 3, 6, 1, 2, 1, 26, 6, 1, 2, 5)).setObjects(("MAU-MIB", "ifMauFalseCarriers"), ("MAU-MIB", "ifMauTypeList"), ("MAU-MIB", "ifMauDefaultType"), ("MAU-MIB", "ifMauAutoNegSupported"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    mauIfGrp100Mbs = mauIfGrp100Mbs.setStatus('deprecated')
mauIfGrpJack = ObjectGroup((1, 3, 6, 1, 2, 1, 26, 6, 1, 2, 6)).setObjects(("MAU-MIB", "ifJackType"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    mauIfGrpJack = mauIfGrpJack.setStatus('current')
mauIfGrpAutoNeg = ObjectGroup((1, 3, 6, 1, 2, 1, 26, 6, 1, 2, 7)).setObjects(("MAU-MIB", "ifMauAutoNegAdminStatus"), ("MAU-MIB", "ifMauAutoNegRemoteSignaling"), ("MAU-MIB", "ifMauAutoNegConfig"), ("MAU-MIB", "ifMauAutoNegCapability"), ("MAU-MIB", "ifMauAutoNegCapAdvertised"), ("MAU-MIB", "ifMauAutoNegCapReceived"), ("MAU-MIB", "ifMauAutoNegRestart"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    mauIfGrpAutoNeg = mauIfGrpAutoNeg.setStatus('deprecated')
mauBroadBasic = ObjectGroup((1, 3, 6, 1, 2, 1, 26, 6, 1, 2, 8)).setObjects(("MAU-MIB", "broadMauIfIndex"), ("MAU-MIB", "broadMauIndex"), ("MAU-MIB", "broadMauXmtRcvSplitType"), ("MAU-MIB", "broadMauXmtCarrierFreq"), ("MAU-MIB", "broadMauTranslationFreq"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    mauBroadBasic = mauBroadBasic.setStatus('deprecated')
mauIfGrpHighCapacity = ObjectGroup((1, 3, 6, 1, 2, 1, 26, 6, 1, 2, 9)).setObjects(("MAU-MIB", "ifMauFalseCarriers"), ("MAU-MIB", "ifMauTypeListBits"), ("MAU-MIB", "ifMauDefaultType"), ("MAU-MIB", "ifMauAutoNegSupported"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    mauIfGrpHighCapacity = mauIfGrpHighCapacity.setStatus('current')
mauIfGrpAutoNeg2 = ObjectGroup((1, 3, 6, 1, 2, 1, 26, 6, 1, 2, 10)).setObjects(("MAU-MIB", "ifMauAutoNegAdminStatus"), ("MAU-MIB", "ifMauAutoNegRemoteSignaling"), ("MAU-MIB", "ifMauAutoNegConfig"), ("MAU-MIB", "ifMauAutoNegCapabilityBits"), ("MAU-MIB", "ifMauAutoNegCapAdvertisedBits"), ("MAU-MIB", "ifMauAutoNegCapReceivedBits"), ("MAU-MIB", "ifMauAutoNegRestart"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    mauIfGrpAutoNeg2 = mauIfGrpAutoNeg2.setStatus('current')
mauIfGrpAutoNeg1000Mbps = ObjectGroup((1, 3, 6, 1, 2, 1, 26, 6, 1, 2, 11)).setObjects(("MAU-MIB", "ifMauAutoNegRemoteFaultAdvertised"), ("MAU-MIB", "ifMauAutoNegRemoteFaultReceived"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    mauIfGrpAutoNeg1000Mbps = mauIfGrpAutoNeg1000Mbps.setStatus('current')
mauIfGrpHCStats = ObjectGroup((1, 3, 6, 1, 2, 1, 26, 6, 1, 2, 12)).setObjects(("MAU-MIB", "ifMauHCFalseCarriers"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    mauIfGrpHCStats = mauIfGrpHCStats.setStatus('current')
rpMauNotifications = NotificationGroup((1, 3, 6, 1, 2, 1, 26, 6, 1, 3, 1)).setObjects(("MAU-MIB", "rpMauJabberTrap"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    rpMauNotifications = rpMauNotifications.setStatus('current')
ifMauNotifications = NotificationGroup((1, 3, 6, 1, 2, 1, 26, 6, 1, 3, 2)).setObjects(("MAU-MIB", "ifMauJabberTrap"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    ifMauNotifications = ifMauNotifications.setStatus('current')
mauModRpCompl = ModuleCompliance((1, 3, 6, 1, 2, 1, 26, 6, 1, 1, 1)).setObjects(("MAU-MIB", "mauRpGrpBasic"), ("MAU-MIB", "mauRpGrp100Mbs"), ("MAU-MIB", "mauRpGrpJack"), ("MAU-MIB", "rpMauNotifications"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    mauModRpCompl = mauModRpCompl.setStatus('deprecated')
mauModIfCompl = ModuleCompliance((1, 3, 6, 1, 2, 1, 26, 6, 1, 1, 2)).setObjects(("MAU-MIB", "mauIfGrpBasic"), ("MAU-MIB", "mauIfGrp100Mbs"), ("MAU-MIB", "mauIfGrpJack"), ("MAU-MIB", "mauIfGrpAutoNeg"), ("MAU-MIB", "mauBroadBasic"), ("MAU-MIB", "ifMauNotifications"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    mauModIfCompl = mauModIfCompl.setStatus('deprecated')
mauModIfCompl2 = ModuleCompliance((1, 3, 6, 1, 2, 1, 26, 6, 1, 1, 3)).setObjects(("MAU-MIB", "mauIfGrpBasic"), ("MAU-MIB", "mauIfGrpHighCapacity"), ("MAU-MIB", "mauIfGrpJack"), ("MAU-MIB", "mauIfGrpAutoNeg2"), ("MAU-MIB", "mauIfGrpAutoNeg1000Mbps"), ("MAU-MIB", "ifMauNotifications"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    mauModIfCompl2 = mauModIfCompl2.setStatus('deprecated')
mauModRpCompl2 = ModuleCompliance((1, 3, 6, 1, 2, 1, 26, 6, 1, 1, 4)).setObjects(("MAU-MIB", "mauRpGrpBasic"), ("MAU-MIB", "mauRpGrp100Mbs"), ("MAU-MIB", "mauRpGrpJack"), ("MAU-MIB", "rpMauNotifications"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    mauModRpCompl2 = mauModRpCompl2.setStatus('current')
mauModIfCompl3 = ModuleCompliance((1, 3, 6, 1, 2, 1, 26, 6, 1, 1, 5)).setObjects(("MAU-MIB", "mauIfGrpBasic"), ("MAU-MIB", "mauIfGrpHighCapacity"), ("MAU-MIB", "mauIfGrpHCStats"), ("MAU-MIB", "mauIfGrpJack"), ("MAU-MIB", "mauIfGrpAutoNeg2"), ("MAU-MIB", "mauIfGrpAutoNeg1000Mbps"), ("MAU-MIB", "ifMauNotifications"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    mauModIfCompl3 = mauModIfCompl3.setStatus('current')
mibBuilder.exportSymbols("MAU-MIB", mauIfGrpJack=mauIfGrpJack, mauModNotGrps=mauModNotGrps, ifMauStatus=ifMauStatus, ifJackEntry=ifJackEntry, ifMauAutoNegCapReceived=ifMauAutoNegCapReceived, mauModConf=mauModConf, mauRpGrp100Mbs=mauRpGrp100Mbs, rpMauTable=rpMauTable, rpMauStatus=rpMauStatus, ifMauNotifications=ifMauNotifications, ifMauJabberingStateEnters=ifMauJabberingStateEnters, ifMauAutoNegEntry=ifMauAutoNegEntry, rpJackIndex=rpJackIndex, mauModIfCompl2=mauModIfCompl2, mauModIfCompl3=mauModIfCompl3, ifMauAutoNegSupported=ifMauAutoNegSupported, ifMauJabberState=ifMauJabberState, mauModRpCompl2=mauModRpCompl2, dot3BroadMauBasicGroup=dot3BroadMauBasicGroup, broadMauXmtCarrierFreq=broadMauXmtCarrierFreq, rpMauJabberState=rpMauJabberState, ifMauAutoNegRemoteSignaling=ifMauAutoNegRemoteSignaling, ifMauFalseCarriers=ifMauFalseCarriers, dot3RpMauBasicGroup=dot3RpMauBasicGroup, ifMauAutoNegCapability=ifMauAutoNegCapability, mauRpGrpBasic=mauRpGrpBasic, rpJackType=rpJackType, ifMauMediaAvailable=ifMauMediaAvailable, rpJackTable=rpJackTable, mauModObjGrps=mauModObjGrps, ifMauAutoNegCapabilityBits=ifMauAutoNegCapabilityBits, rpMauMediaAvailableStateExits=rpMauMediaAvailableStateExits, ifMauAutoNegConfig=ifMauAutoNegConfig, rpMauIndex=rpMauIndex, broadMauIfIndex=broadMauIfIndex, ifMauType=ifMauType, ifJackIndex=ifJackIndex, ifMauTable=ifMauTable, snmpDot3MauTraps=snmpDot3MauTraps, mauIfGrpBasic=mauIfGrpBasic, rpMauJabberingStateEnters=rpMauJabberingStateEnters, broadMauTranslationFreq=broadMauTranslationFreq, ifMauHCFalseCarriers=ifMauHCFalseCarriers, mauModRpCompl=mauModRpCompl, mauIfGrpHighCapacity=mauIfGrpHighCapacity, ifMauAutoNegCapReceivedBits=ifMauAutoNegCapReceivedBits, broadMauXmtRcvSplitType=broadMauXmtRcvSplitType, mauModCompls=mauModCompls, PYSNMP_MODULE_ID=mauMod, ifMauAutoNegRemoteFaultAdvertised=ifMauAutoNegRemoteFaultAdvertised, rpMauNotifications=rpMauNotifications, ifMauAutoNegAdminStatus=ifMauAutoNegAdminStatus, ifMauJabberTrap=ifMauJabberTrap, 
rpMauPortIndex=rpMauPortIndex, mauModIfCompl=mauModIfCompl, rpMauFalseCarriers=rpMauFalseCarriers, mauIfGrpAutoNeg1000Mbps=mauIfGrpAutoNeg1000Mbps, ifMauTypeListBits=ifMauTypeListBits, rpMauJabberTrap=rpMauJabberTrap, snmpDot3MauMgt=snmpDot3MauMgt, ifMauIndex=ifMauIndex, rpMauType=rpMauType, ifMauTypeList=ifMauTypeList, dot3IfMauBasicGroup=dot3IfMauBasicGroup, broadMauIndex=broadMauIndex, mauIfGrpAutoNeg=mauIfGrpAutoNeg, rpMauGroupIndex=rpMauGroupIndex, broadMauBasicEntry=broadMauBasicEntry, rpMauMediaAvailable=rpMauMediaAvailable, ifMauAutoNegCapAdvertisedBits=ifMauAutoNegCapAdvertisedBits, ifMauAutoNegRestart=ifMauAutoNegRestart, ifJackType=ifJackType, mauMod=mauMod, ifMauAutoNegTable=ifMauAutoNegTable, ifMauAutoNegCapAdvertised=ifMauAutoNegCapAdvertised, mauIfGrp100Mbs=mauIfGrp100Mbs, JackType=JackType, rpJackEntry=rpJackEntry, mauIfGrpAutoNeg2=mauIfGrpAutoNeg2, ifMauMediaAvailableStateExits=ifMauMediaAvailableStateExits, mauRpGrpJack=mauRpGrpJack, ifJackTable=ifJackTable, ifMauEntry=ifMauEntry, rpMauEntry=rpMauEntry, mauIfGrpHCStats=mauIfGrpHCStats, ifMauAutoNegRemoteFaultReceived=ifMauAutoNegRemoteFaultReceived, ifMauDefaultType=ifMauDefaultType, dot3IfMauAutoNegGroup=dot3IfMauAutoNegGroup, broadMauBasicTable=broadMauBasicTable, mauBroadBasic=mauBroadBasic, ifMauIfIndex=ifMauIfIndex)
# ==== File: dyex/functions/function.py (repo: jsistos/derivative-calculator, license: Apache-2.0) ====
class Function():
    def __init__(self, name):
        self.name = name
        self._args = []
        return None

    def __add__(self, other):
        import dyex.functions.operator as op
        return op.Sum(self, other)

    def __mul__(self, other):
        import dyex.functions.operator as op
        import dyex.functions.elementary as elem
        if isinstance(other, (int, float)):
            return op.Mul(self, elem.Const(other))
        return op.Mul(self, other)

    def __rmul__(self, other):
        import dyex.functions.operator as op
        import dyex.functions.elementary as elem
        if isinstance(other, (int, float)):
            return op.Mul(elem.Const(other), self)
        return op.Mul(other, self)

    def __truediv__(self, other):
        import dyex.functions.operator as op
        return op.Div(self, other)

    def __sub__(self, other):
        import dyex.functions.elementary as elem
        return self + elem.Const(-1) * other

    def __pow__(self, degree):
        import dyex.functions.elementary as elem
        if isinstance(degree, (int, float)):
            return elem.Poly(self, degree)
        return elem.Poly(self, degree.value)

    # def __init__(self, func_string = ""):
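The Function class above builds an expression tree via operator overloading instead of evaluating eagerly: each dunder method returns a new node object. A minimal self-contained sketch of the same pattern, using hypothetical Const/Sum stand-ins rather than the real dyex.functions classes:

```python
class Expr:
    # Base node: "+" builds a Sum node instead of computing a value.
    def __add__(self, other):
        return Sum(self, other)


class Const(Expr):
    def __init__(self, value):
        self.value = value


class Sum(Expr):
    def __init__(self, left, right):
        self.left = left
        self.right = right

    def evaluate(self):
        # Collapse the tree only when explicitly asked.
        return self.left.value + self.right.value


expr = Const(2) + Const(3)  # __add__ returns a Sum node
print(expr.evaluate())      # -> 5
```

Deferring evaluation like this is what lets the derivative calculator walk the tree later and apply differentiation rules node by node.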
# ==== File: examples/python/block/block_consensus.py (repo: rockacola/neo-smart-contract-examples, license: MIT) ====
"""
Date Created: 2018-02-28
Date Modified: 2018-02-28
Version: 1
Contract Hash: 6f873e1c9b6f306fb41f9a9a81ce32df726a1fb4
Available on NEO TestNet: False
Available on CoZ TestNet: False
Available on MainNet: False
Example: (return as bytearray)
Test Invoke: build /path/to/block-consensus.py test 02 05 False False 1
Expected Result: b'd<\t6\xaf\x9c\xdb\xb4\x00'
Operation Count: 53
GAS Consumption: 0.144
Example: (return as integer)
Test Invoke: build /path/to/block-consensus.py test 02 02 False False 1
Expected Result: 13032182223066446948
Operation Count: 53
GAS Consumption: 0.144
"""
from boa.blockchain.vm.Neo.Blockchain import GetHeader
from boa.blockchain.vm.Neo.Header import GetHash, GetConsensusData, GetNextConsensus # All these references are needed
def Main(height):
    """
    :param height: The input block height
    :type height: int
    :return: Block consensus data of the input block height
    :rtype: bytearray
    """
    header = GetHeader(height)
    consensus = header.ConsensusData
    return consensus
# ==== File: Mask_RCNN/utils/mask_model.py (repo: adesgautam/objdet, license: MIT) ====
import os
import cv2
import sys
import random
import math
import re
import numpy as np
import pandas as pd
import warnings
import tensorflow as tf
import matplotlib
matplotlib.use('Agg')
import matplotlib.pyplot as plt
import matplotlib.patches as patches
import skimage
import glob
import ntpath
import itertools
import urllib
import json
import base64
import string
import keras
from mrcnn import utils
from mrcnn import visualize
from mrcnn.visualize import display_images
import mrcnn.model as modellib
from mrcnn.model import log
from mrcnn.config import Config
from tensorflow.python.keras.backend import set_session
def fxn():
    warnings.warn("deprecated", DeprecationWarning)

with warnings.catch_warnings():
    warnings.simplefilter("ignore")
    fxn()
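The fxn()/catch_warnings block above silences DeprecationWarning during module setup. A self-contained sketch of the same suppression pattern; record=True is added here only so the effect is observable, and noisy() is a hypothetical stand-in for fxn():

```python
import warnings


def noisy():
    warnings.warn("deprecated", DeprecationWarning)


# Inside the context, the "ignore" filter stops the warning before it is
# shown (or recorded), so the capture list stays empty.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("ignore")
    noisy()

print(len(caught))  # -> 0
```

Outside the with-block the previous warning filters are restored, so the suppression does not leak into the rest of the program.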
class BalloonConfig(Config):
    """Configuration for training on the toy dataset.
    Derives from the base Config class and overrides some values.
    """
    NAME = "mydataset"

    # We use a GPU with 12GB memory, which can fit two images.
    # Adjust down if you use a smaller GPU.
    IMAGES_PER_GPU = 1

    NUM_CLASSES = 1 + 1  # Background + class1

    # Number of training steps per epoch
    STEPS_PER_EPOCH = 100

    # Skip detections with < 90% confidence
    DETECTION_MIN_CONFIDENCE = 0.9


def get_ax(rows=1, cols=1, size=16):
    """Return a Matplotlib Axes array to be used in
    all visualizations in the notebook. Provide a
    central point to control graph sizes.
    Adjust the size attribute to control how big to render images
    """
    _, ax = plt.subplots(rows, cols, figsize=(size*cols, size*rows))
    return ax
# Directory to save logs and trained model
MODEL_DIR = 'logs'
custom_WEIGHTS_PATH = "models/mask_rcnn_trained_weights.h5" # TODO: update this path
global graph
graph = tf.get_default_graph()
config = BalloonConfig()
dataset = {}
class InferenceConfig(config.__class__):
    # Run detection on one image at a time
    GPU_COUNT = 1
    IMAGES_PER_GPU = 1
config = InferenceConfig()
config.display()
DEVICE = "/cpu:0"
with tf.device(DEVICE):
    model = modellib.MaskRCNN(mode="inference", model_dir=MODEL_DIR,
                              config=config)
session = tf.Session()
init = tf.global_variables_initializer()
set_session(session)
model.load_weights(custom_WEIGHTS_PATH, by_name=True)
def model_predict(image_path):
    image = skimage.io.imread(image_path)
    dataset['class_names'] = ['BG', 'class1']
    results = ''
    with graph.as_default():
        results = model.detect([image], verbose=1)
    ax = get_ax(1)
    r = results[0]
    masked_image = visualize.display_instances(image, r['rois'], r['masks'], r['class_ids'],
                                               dataset['class_names'], r['scores'], ax=ax,
                                               title="Predictions")
    return masked_image
# ==== File: course_path/config.py (repo: jeff4elee/course_path, license: MIT) ====
WTF_CSRF_ENABLED = True
SECRET_KEY = 'w0w?goodhaxorskillz!'
# ==== File: day1/puzzle1/solve.py (repo: andrewthetechie/adventofcode21, license: MIT) ====
from itertools import islice
with open('input', 'r') as fh:
    data = fh.readlines()
# Parse to int before comparing; comparing the raw line strings
# lexicographically miscounts when the number of digits changes.
print(sum(int(n1) < int(n2) for n1, n2 in zip(data, islice(data, 1, None))))
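The zip/islice pairing above walks the sequence against itself shifted by one, so each comparison sees consecutive readings. A self-contained sketch on the Advent of Code 2021 day 1 sample input (whose published answer is 7 increases), with the values already parsed to integers:

```python
from itertools import islice

# Sample depths from the AoC 2021 day 1 puzzle statement.
depths = [199, 200, 208, 210, 200, 207, 240, 269, 260, 263]

# zip(depths, depths[1:]) pairs each element with its successor;
# islice avoids building the shifted copy as a new list.
increases = sum(n1 < n2 for n1, n2 in zip(depths, islice(depths, 1, None)))
print(increases)  # -> 7
```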
# ==== File: python-para-zumbis/while.py (repo: kidchenko/playground, license: MIT) ====
numero = int(input('Enter a number: '))
count = 1
while count <= numero:
    print(count)
    count = count + 1
# ==== File: examples/huawei.py (repo: joysboy/ncclient, license: Apache-2.0) ====
#! /usr/bin/env python2.6
#
# Connect to the NETCONF server passed on the command line and
# display their capabilities. This script and the following scripts
# all assume that the user calling the script is known by the server
# and that suitable SSH keys are in place. For brevity and clarity
# of the examples, we omit proper exception handling.
#
# $ ./nc01.py broccoli
import sys
from ncclient import manager
filter_vlan_snippet = """
<vlan xmlns="http://www.huawei.com/netconf/vrp" content-version="1.0" format-version="1.0">
</vlan>"""
def create_vlan(mgr, vlanid, vlanname):
    snippet = """
    <vlan xmlns="http://www.huawei.com/netconf/vrp" content-version="1.0" format-version="1.0">
      <vlans>
        <vlan>
          <vlanId>%s</vlanId>
          <vlanName/>
          <vlanDesc>%s</vlanDesc>
        </vlan>
      </vlans>
    </vlan>"""
    confstr = snippet % (vlanid, vlanname)
    mgr.edit_config(target='running', config=confstr)
def test_huawei_api(host, user, password):
    device = {"name": "huawei"}
    with manager.connect(host, port=830, user=user, password=password, device_params=device) as m:
        create_vlan(m, '20', 'customer')
        result = m.get_config(source="running", filter=("subtree", filter_vlan_snippet))
        print result
if __name__ == '__main__':
    test_huawei_api(sys.argv[1], sys.argv[2], sys.argv[3])
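create_vlan fills the two %s placeholders with plain printf-style interpolation before sending the config over NETCONF. A standalone sketch of just that formatting step, using the '20'/'customer' values the test harness above passes:

```python
# Same template shape as create_vlan's snippet; filled with % interpolation.
snippet = """
<vlan xmlns="http://www.huawei.com/netconf/vrp" content-version="1.0" format-version="1.0">
  <vlans>
    <vlan>
      <vlanId>%s</vlanId>
      <vlanName/>
      <vlanDesc>%s</vlanDesc>
    </vlan>
  </vlans>
</vlan>"""

confstr = snippet % ('20', 'customer')
print('<vlanId>20</vlanId>' in confstr)  # -> True
```

No XML escaping happens here, so in real use the substituted values must not contain characters like `<` or `&`.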
# ==== File: PropelRapp/migrations/0010_auto_20200507_0705.py (repo: Adi1222/PropelR, license: MIT) ====
# Generated by Django 2.2.1 on 2020-05-07 07:05
from django.db import migrations, models
class Migration(migrations.Migration):

    dependencies = [
        ('PropelRapp', '0009_auto_20200506_0627'),
    ]

    operations = [
        migrations.AddField(
            model_name='menu',
            name='is_deleted',
            field=models.CharField(choices=[('N', 'NO'), ('Y', 'YES')], default='N', max_length=1),
        ),
        migrations.AddField(
            model_name='role',
            name='is_deleted',
            field=models.CharField(choices=[('N', 'NO'), ('Y', 'YES')], default='N', max_length=1),
        ),
        migrations.AddField(
            model_name='submenu',
            name='is_deleted',
            field=models.CharField(choices=[('N', 'NO'), ('Y', 'YES')], default='N', max_length=1),
        ),
    ]
# ==== File: skunkbooth/utils/settings.py (repo: codephile1221/SkunkBooth, license: MIT) ====
from collections.abc import MutableMapping
from yaml import dump, safe_load
from skunkbooth.data.defaults import _settings
class _makesettings(MutableMapping):
    """Metaclass for settings"""

    def __init__(self, *args):
        try:
            with open(_settings["SETTINGS_FILE"]) as f:
                self.store = safe_load(f)
                if self.store is None:
                    self.store = {}
        except FileNotFoundError:
            self.store = {}
        # Load default values if unset
        for i in _settings:
            if i not in self.store:
                self.store[i] = _settings[i]
        with open(_settings["SETTINGS_FILE"], "w") as f:
            f.write(dump(self.store))

    def __getitem__(self, key: str):
        return self.store[self._keytransform(key)]

    def __setitem__(self, key: str, val: str):
        self.store[self._keytransform(key)] = val
        with open(self["SETTINGS_FILE"], "w") as f:
            f.write(dump(self.store))

    def __delitem__(self, key: str):
        del self.store[self._keytransform(key)]

    def __iter__(self):
        return iter(self.store)

    def __len__(self):
        return len(self.store)

    def _keytransform(self, key: str) -> str:
        return key


class settings(metaclass=_makesettings):
    """Produced by _makesettings"""
    pass
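_makesettings only has to implement the five abstract MutableMapping methods (getitem, setitem, delitem, iter, len); everything else (`get`, `update`, `in`, keys/values views) comes free from the mixin. A self-contained sketch of the same pattern without the YAML persistence; DictSettings is a hypothetical stand-in, not part of skunkbooth:

```python
from collections.abc import MutableMapping


class DictSettings(MutableMapping):
    """In-memory settings store built on the MutableMapping ABC."""

    def __init__(self, defaults):
        self.store = dict(defaults)

    def __getitem__(self, key):
        return self.store[self._keytransform(key)]

    def __setitem__(self, key, val):
        self.store[self._keytransform(key)] = val

    def __delitem__(self, key):
        del self.store[self._keytransform(key)]

    def __iter__(self):
        return iter(self.store)

    def __len__(self):
        return len(self.store)

    def _keytransform(self, key):
        # Hook for normalizing keys (e.g. upper-casing); identity here.
        return key


s = DictSettings({"THEME": "dark"})
s["FPS"] = 30                  # __setitem__
print(len(s), s.get("THEME"))  # -> 2 dark  (get() is a mixin method)
```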
# ==== File: sun_horizon.py (repo: sraaphorst/gem_adapt_queue, license: BSD-3-Clause) ====
# Bryan Miller 2004
# Converted to python - Matt Bonnyman 2018-07-12
import numpy as np
import astropy.units as u
def sun_horizon(site):
    """
    Calculate angle between sun and horizon at sunset for observer's location.

    Parameters
    ----------
    site : '~astroplan.Observer'

    Returns
    -------
    '~astropy.units.Quantity'
    """
    sun_horiz = -.83 * u.deg
    equat_radius = 6378137. * u.m
    return sun_horiz - np.sqrt(2. * site.location.height / equat_radius) * (180. / np.pi) * u.deg
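The return expression above combines a fixed -0.83 degree refraction/semi-diameter term with the horizon dip for the observer's elevation. A plain-float sketch of the same arithmetic without astropy units; sun_horizon_deg is a hypothetical helper, not part of the original module:

```python
import math

def sun_horizon_deg(height_m):
    """Sun-horizon angle in degrees for an observer height_m above sea level."""
    equat_radius = 6378137.0  # Earth's equatorial radius in metres
    # Fixed -0.83 deg term, minus the horizon dip sqrt(2h/R) in degrees.
    return -0.83 - math.sqrt(2.0 * height_m / equat_radius) * (180.0 / math.pi)

print(round(sun_horizon_deg(0.0), 2))  # -> -0.83 (no dip at sea level)
```

At a few kilometres of elevation the dip term dominates; for height_m around 2700 m the result is roughly -2.5 degrees.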
# ==== File: app/core/config.py (repo: Luivatra/ergopad-api, license: MIT) ====
from config import Config, Network  # api specific config
CFG = Config[Network]
PROJECT_NAME = "ERGOPAD"
SQLALCHEMY_DATABASE_URI = CFG.connectionString
API_V1_STR = "/api"
# ==== File: holobot/discord/sdk/context_menus/models/server_message_interaction_context.py (repo: rexor12/holobot, license: MIT) ====
from dataclasses import dataclass
from holobot.discord.sdk.models import InteractionContext
@dataclass
class ServerMessageInteractionContext(InteractionContext):
    """A context for server-specific message context menu interactions."""

    server_id: str = ""
    server_name: str = ""
    channel_id: str = ""
    target_message_id: str = ""
    target_author_id: str = ""
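The dataclass above gives every field an empty-string default, so callers can construct a context supplying only the fields they know. A self-contained sketch of that pattern with hypothetical stand-in fields (no holobot imports):

```python
from dataclasses import dataclass


@dataclass
class MessageContext:
    # Defaults let partial construction work; unfilled fields stay "".
    server_id: str = ""
    channel_id: str = ""


ctx = MessageContext(server_id="123")
print(ctx.server_id, repr(ctx.channel_id))  # -> 123 ''
```

The generated __init__, __repr__, and __eq__ come from the @dataclass decorator, which is why the original class body is nothing but annotated fields.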
# ==== File: drivers/migrations/0001_initial.py (repo: CaveBushman/carssoft, license: MIT) ====
# Generated by Django 3.2.7 on 2021-09-28 14:27
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
initial = True
dependencies = [
]
operations = [
migrations.CreateModel(
name='Address',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('street', models.CharField(max_length=100)),
('city', models.CharField(max_length=100)),
('zip', models.CharField(max_length=10)),
('state', models.CharField(max_length=50)),
],
),
migrations.CreateModel(
name='Driver',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('first_name', models.CharField(max_length=100)),
('last_name', models.CharField(max_length=100)),
('image', models.ImageField(null=True, upload_to='')),
('email', models.EmailField(max_length=100, null=True)),
('tel', models.CharField(max_length=10)),
('driving_licence_no', models.CharField(max_length=20)),
('driving_licence_issued', models.CharField(max_length=50)),
('driving_licence_classes_A', models.BooleanField(default=False)),
('driving_licence_classes_B', models.BooleanField(default=False)),
('driving_licence_classes_C', models.BooleanField(default=False)),
('driving_licence_classes_D', models.BooleanField(default=False)),
('driving_licence_classes_E', models.BooleanField(default=False)),
('driving_licence_classes_T', models.BooleanField(default=False)),
('driving_licence_valid_to', models.DateField(null=True)),
('created', models.DateField(auto_now_add=True)),
('updated', models.DateField(auto_now=True)),
('address', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, to='drivers.address')),
],
),
]
| 45.673469 | 126 | 0.597408 | 229 | 2,238 | 5.624454 | 0.353712 | 0.069876 | 0.125776 | 0.167702 | 0.53028 | 0.407609 | 0.325311 | 0.127329 | 0.127329 | 0.127329 | 0 | 0.024301 | 0.264522 | 2,238 | 48 | 127 | 46.625 | 0.758202 | 0.020107 | 0 | 0.243902 | 1 | 0 | 0.146508 | 0.089457 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.04878 | 0 | 0.146341 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
3197dd5bcad09aa8476c9d8d828abba80c74ab61 | 1,711 | py | Python | tools/ner_tool.py | ownthink/qimen | 278d2e4d97948c40e53dab5f764e19bd4169ba91 | [
"MIT"
] | 26 | 2019-04-19T13:15:20.000Z | 2022-02-07T08:47:14.000Z | tools/ner_tool.py | ownthink/qimen | 278d2e4d97948c40e53dab5f764e19bd4169ba91 | [
"MIT"
] | null | null | null | tools/ner_tool.py | ownthink/qimen | 278d2e4d97948c40e53dab5f764e19bd4169ba91 | [
"MIT"
] | 17 | 2019-04-19T13:13:28.000Z | 2022-01-18T07:38:09.000Z | #!/usr/bin/env python
# coding=UTF-8
'''
@Description: Base class for NER tools
@Author: ansvver(ansvver.cn@gmail.com)
@Date: 2019-04-20 14:06:51
@LastEditTime: 2019-04-20 16:59:05
'''
import sys
import json
from typing import List
# Global entity-tag dictionary; key -> uppercase abbreviation, value -> (short description, extra notes)
GLOBAL_TAG_DICT = {"SAMPLE": ("Sample", "Example entry; consider removing before the official release")}
class Entity(object):
"""实体类"""
def __init__(self):
self._tag = '' # 大写字母单词缩写, 如"EMAIL"、 "PHONE"等,请在`GLOBAL_TAG_DICT`中注册
self._word = '' # 句子中识别出来的名词
self._start = -1 # Optional, 在句子中的起始下标,注意下标从0开始
self._end = -1 # Optional, 在句子中的结束下标,注意下标从0开始
@property
def tag(self) -> str:
return self._tag
@tag.setter
def tag(self, value: str):
        assert value.upper() in GLOBAL_TAG_DICT, \
            'Make sure the tag is registered in `GLOBAL_TAG_DICT`!'
self._tag = value
@property
def word(self) -> str:
return self._word
@word.setter
def word(self, value: str):
self._word = value
@property
def start(self) -> int:
return self._start
@start.setter
def start(self, value: int):
        assert value >= 0, 'Index must be >= 0'
self._start = value
@property
def end(self) -> int:
return self._end
@end.setter
def end(self, value: int):
        assert value >= 0, 'Index must be >= 0'
self._end = value
def __str__(self):
return json.dumps(self.__dict__, ensure_ascii=False)
class NERTool(object):
"""实体识别工具父类"""
def __init__(self):
super(NERTool, self).__init__()
def ner(self, inputs: List[str]) -> List[List[Entity]]:
"""子类需重写此函数, 输入是字符列表,输出是实体列表组成的列表(每个句子对应一个实体列表)"""
raise NotImplementedError
| 22.513158 | 77 | 0.606663 | 218 | 1,711 | 4.582569 | 0.431193 | 0.036036 | 0.052052 | 0.032032 | 0.108108 | 0.068068 | 0.068068 | 0.068068 | 0.068068 | 0 | 0 | 0.028974 | 0.253653 | 1,711 | 75 | 78 | 22.813333 | 0.751762 | 0.227937 | 0 | 0.177778 | 0 | 0 | 0.053406 | 0.023994 | 0 | 0 | 0 | 0 | 0.066667 | 1 | 0.266667 | false | 0 | 0.066667 | 0.111111 | 0.488889 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
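A concrete `NERTool` subclass just has to implement `ner`. The sketch below shows the idea with a simple regex-based e-mail recognizer; `Entity` and `NERTool` here are minimal stand-ins for the classes above (the real `Entity` validates tags against `GLOBAL_TAG_DICT` via property setters):

```python
import re
from typing import List

class Entity:
    """Minimal stand-in for the Entity class above (no validation)."""
    def __init__(self, tag='', word='', start=-1, end=-1):
        self.tag, self.word, self.start, self.end = tag, word, start, end

class NERTool:
    """Minimal stand-in for the NERTool base class above."""
    def ner(self, inputs: List[str]) -> List[List[Entity]]:
        raise NotImplementedError

class RegexEmailNERTool(NERTool):
    """Recognizes e-mail addresses with a simple regex."""
    PATTERN = re.compile(r'[\w.+-]+@[\w-]+\.[\w.]+')

    def ner(self, inputs: List[str]) -> List[List[Entity]]:
        results = []
        for sentence in inputs:
            # One entity list per sentence, as the base-class contract requires.
            entities = [Entity('EMAIL', m.group(), m.start(), m.end())
                        for m in self.PATTERN.finditer(sentence)]
            results.append(entities)
        return results

tool = RegexEmailNERTool()
found = tool.ner(["contact ansvver.cn@gmail.com please"])
print(found[0][0].word)  # ansvver.cn@gmail.com
```

In the real module, an `EMAIL` tag would first need an entry in `GLOBAL_TAG_DICT` to pass the setter's assertion.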
319ca23d604244f271178225c6dba2ba8abe6a11 | 1,847 | py | Python | pysb/examples/paper_figures/fig2a.py | PhilippBoeing/synbioweaver | 23efdf79a325885a43e82ba13e6ccefb8eb3d733 | [
"MIT"
] | null | null | null | pysb/examples/paper_figures/fig2a.py | PhilippBoeing/synbioweaver | 23efdf79a325885a43e82ba13e6ccefb8eb3d733 | [
"MIT"
] | null | null | null | pysb/examples/paper_figures/fig2a.py | PhilippBoeing/synbioweaver | 23efdf79a325885a43e82ba13e6ccefb8eb3d733 | [
"MIT"
] | null | null | null | """Elements for Figure 2A from the PySB publication"""
from pysb import *
from pysb.bng import generate_network, generate_equations
import re
# This code (pygmentized) is shown in figure S1A as "Basic Implementation"
def catalyze(enz, e_site, sub, s_site, prod, klist):
kf, kr, kc = klist # Get the parameters from the list
# Create the rules
    rb = Rule('bind_%s_%s' % (enz().monomer.name, sub().monomer.name),
              enz({e_site:None}) + sub({s_site:None}) |
              enz({e_site:1}) % sub({s_site:1}),
              kf, kr)
rc = Rule('catalyze_%s%s_to_%s' %
(enz().monomer.name, sub().monomer.name, prod().monomer.name),
enz({e_site:1}) % sub({s_site:1}) >>
enz({e_site:None}) + prod({s_site:None}),
kc)
return [rb, rc]
Model()
Monomer('C8', ['bf'])
Monomer('Bid', ['bf', 'state'], {'state': ['U', 'T']})
klist = [Parameter('kf', 1), Parameter('kr', 1), Parameter('kc', 1)]
Initial(C8(bf=None), Parameter('C8_0', 100))
Initial(Bid(bf=None, state='U'), Parameter('Bid_0', 100))
# This is the code shown for "Example Macro Call" (not printed here)
catalyze(C8, 'bf', Bid(state='U'), 'bf', Bid(state='T'), klist)
bng_code = generate_network(model)
# Merge continued lines
bng_code = bng_code.replace('\\\n', '')
generate_equations(model)
num_rules = len(model.rules)
num_odes = len(model.odes)
print "BNGL Rules"
print "=========="
for line in bng_code.split("\n"):
for rule in model.rules:
match = re.match(r'^\s*%s:\s*(.*)' % rule.name, line)
if match:
print match.group(1)
print
print "ODEs"
print "===="
for species, ode in zip(model.species, model.odes):
print "%s: %s" % (species, ode)
def test_fig2a():
assert num_rules == 2, "number of rules not as expected"
assert num_odes == 4, "number of odes not as expected"
| 30.278689 | 74 | 0.615051 | 282 | 1,847 | 3.921986 | 0.336879 | 0.018083 | 0.036166 | 0.027125 | 0.10217 | 0.084991 | 0.084991 | 0.03255 | 0 | 0 | 0 | 0.01688 | 0.198159 | 1,847 | 60 | 75 | 30.783333 | 0.729912 | 0.114239 | 0 | 0 | 1 | 0 | 0.118655 | 0 | 0 | 0 | 0 | 0 | 0.047619 | 0 | null | null | 0 | 0.071429 | null | null | 0.166667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
31aa434bdef3fb92b35a07a7af0183757baa59f5 | 730 | py | Python | rolz_bot/embeders/tarot.py | Reriiru/rolz_org_to_discord | caf860585c551d85729b2d9661e8d801750a7aa5 | [
"MIT"
] | 1 | 2021-04-15T04:14:55.000Z | 2021-04-15T04:14:55.000Z | rolz_bot/embeders/tarot.py | Reriiru/rolz_org_to_discord | caf860585c551d85729b2d9661e8d801750a7aa5 | [
"MIT"
] | null | null | null | rolz_bot/embeders/tarot.py | Reriiru/rolz_org_to_discord | caf860585c551d85729b2d9661e8d801750a7aa5 | [
"MIT"
] | 2 | 2017-11-05T02:34:35.000Z | 2017-11-20T06:00:06.000Z | from discord import Embed, Colour
class TarotEmbeder(Embed):
def __init__(self, image, username):
super().__init__(
title=image['Name'],
type="rich"
)
self.set_image(url=image['Url'])
self.colour = Colour.teal()
self.description = image['Description']
self.add_field(name="Requester", value=username)
class TarotShortEmbeder(Embed):
def __init__(self, image, username):
super().__init__(
title=image['Name'],
type="rich"
)
self.set_image(url=image['Url'])
self.colour = Colour.teal()
self.add_field(name="Requester", value=username) | 28.076923 | 56 | 0.547945 | 73 | 730 | 5.205479 | 0.356164 | 0.084211 | 0.063158 | 0.084211 | 0.752632 | 0.752632 | 0.752632 | 0.563158 | 0.563158 | 0.563158 | 0 | 0 | 0.324658 | 730 | 26 | 57 | 28.076923 | 0.770791 | 0 | 0 | 0.7 | 0 | 0 | 0.069767 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0.05 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
31b73596e5273b273b941d4db0879073451d025d | 324 | py | Python | data/config.py | gemhalo1/libfacedetection.train | fbbf5f3c9da95904ae3b434a875ce98c023962b4 | [
"MIT"
] | 2 | 2020-09-26T03:57:07.000Z | 2020-09-27T13:21:39.000Z | data/config.py | gemhalo1/libfacedetection.train | fbbf5f3c9da95904ae3b434a875ce98c023962b4 | [
"MIT"
] | null | null | null | data/config.py | gemhalo1/libfacedetection.train | fbbf5f3c9da95904ae3b434a875ce98c023962b4 | [
"MIT"
] | null | null | null | # config.py
cfg = {
'name': 'YuFaceDetectNet',
#'min_sizes': [[32, 64, 128], [256], [512]],
#'steps': [32, 64, 128],
'min_sizes': [[10, 16, 24], [32, 48], [64, 96], [128, 192, 256]],
'steps': [8, 16, 32, 64],
'variance': [0.1, 0.2],
'clip': False,
'loc_weight': 1.0,
'gpu_train': True
}
| 23.142857 | 69 | 0.475309 | 47 | 324 | 3.191489 | 0.659574 | 0.08 | 0.093333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.229508 | 0.246914 | 324 | 13 | 70 | 24.923077 | 0.385246 | 0.231481 | 0 | 0 | 0 | 0 | 0.261224 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
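In SSD-style face detectors like this one, each entry of `min_sizes` presumably yields one prior box per cell of the feature map matching the corresponding stride in `steps`. A sketch of counting the resulting priors for a hypothetical 320x320 input (the actual prior-generation code lives elsewhere in the repo):

```python
import math

cfg = {'min_sizes': [[10, 16, 24], [32, 48], [64, 96], [128, 192, 256]],
       'steps': [8, 16, 32, 64]}

def count_priors(img_h, img_w, cfg):
    """Count prior boxes: one per min_size per feature-map cell."""
    total = 0
    for sizes, step in zip(cfg['min_sizes'], cfg['steps']):
        # Feature-map resolution at this stride.
        fh, fw = math.ceil(img_h / step), math.ceil(img_w / step)
        total += fh * fw * len(sizes)
    return total

print(count_priors(320, 320, cfg))  # 5875
```

Most priors come from the stride-8 map (40x40 cells x 3 sizes = 4800), which is why small-face detection dominates the anchor budget.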
31cad670e4bde75720728036701100192bb49d73 | 322 | py | Python | bauh/gems/arch/config.py | leoneii/bauh | ceef6c30851552ec37e21ef6335a4cbdd126622f | [
"Zlib"
] | 1 | 2020-06-16T17:08:32.000Z | 2020-06-16T17:08:32.000Z | bauh/gems/arch/config.py | leoneii/bauh | ceef6c30851552ec37e21ef6335a4cbdd126622f | [
"Zlib"
] | null | null | null | bauh/gems/arch/config.py | leoneii/bauh | ceef6c30851552ec37e21ef6335a4cbdd126622f | [
"Zlib"
] | null | null | null | from bauh.commons.config import read_config as read
from bauh.gems.arch import CONFIG_FILE
def read_config(update_file: bool = False) -> dict:
template = {'optimize': True, 'transitive_checking': True, "sync_databases": True, "simple_checking": False}
return read(CONFIG_FILE, template, update_file=update_file)
| 40.25 | 112 | 0.767081 | 45 | 322 | 5.266667 | 0.533333 | 0.126582 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.127329 | 322 | 7 | 113 | 46 | 0.843416 | 0 | 0 | 0 | 0 | 0 | 0.173913 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.4 | 0 | 0.8 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
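The helper passes a template of defaults that the config file's values override. A self-contained sketch of that merge, assuming the real `read` in `bauh.commons.config` behaves roughly this way (it also persists the merged file when `update_file=True`):

```python
import json

def read_with_template(raw_text: str, template: dict) -> dict:
    """Merge values parsed from raw_text over the template's defaults."""
    merged = dict(template)
    if raw_text.strip():
        merged.update(json.loads(raw_text))
    return merged

template = {'optimize': True, 'transitive_checking': True,
            'sync_databases': True, 'simple_checking': False}
# The file only overrides one key; the rest keep their template defaults.
cfg = read_with_template('{"optimize": false}', template)
print(cfg['optimize'], cfg['sync_databases'])  # False True
```

This keeps new config keys backward compatible: an old file missing a key silently picks up the template's default.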
31e1fd2520aba927ec90034cb311be3f0c7290af | 4,122 | py | Python | PyTangoArchiving/scripts/db_repair.py | sergirubio/PyTangoArchiving | ce66ac0d790984675449687a2901a8ba87274a86 | [
"CC-BY-3.0"
] | 6 | 2017-03-15T15:38:44.000Z | 2021-06-04T13:46:17.000Z | PyTangoArchiving/scripts/db_repair.py | sergirubio/PyTangoArchiving | ce66ac0d790984675449687a2901a8ba87274a86 | [
"CC-BY-3.0"
] | 19 | 2017-04-07T10:20:18.000Z | 2022-01-13T14:52:27.000Z | PyTangoArchiving/scripts/db_repair.py | sergirubio/PyTangoArchiving | ce66ac0d790984675449687a2901a8ba87274a86 | [
"CC-BY-3.0"
] | 9 | 2017-01-20T11:59:31.000Z | 2020-12-08T15:22:36.000Z | #!/usr/bin/python
import MySQLdb,sys,time,os
try:
    import fandango
except ImportError:
    sys.path.append('/homelocal/sicilia/lib/python/site-packages/')
    import fandango
# Author: osanchez, srubio
# -------------------------------
# mysqlcheck --repair --all-databases doesn't work that well, whereas the SQL command REPAIR TABLE...
# works properly.
# -------------------------------
__doc__="""db_repair.py """
#For a proper definition would be better to have long name before the short name
__options__ = [
('h','help','Show Usage'),
('h','host=','DB Host'),
('n','no-prompt','Dont ask for confirmation'),
('f','force','Check all tables'),
('d:','days=','Number of days to check; 0 for raw check'),
('','','user'),
('','','password')]
shorts,ops,longs,args,usage = [],[],[],[],''
for s,l,d in __options__:
if not s and not l:
args.append(d)
else:
usage+='\n\t%s%s%s : %s' %(s,l and '/' or '',l,d)
if s: (shorts if not s.endswith(':') else ops).append(s)
if l: longs.append(l)
__usage__ = 'Usage:\n'+'\tdb_repair.py '+'[%s]'%''.join(shorts)+' '+' '.join('[-%s X]'%o[:-1] for o in ops)+' ' +' '.join('[--%s%s]'%(l,'X' if l.endswith('=') else '') for l in longs)+' '+' '.join(args)+'\n'+usage+'\n'
def usage(): return __doc__+'\n\n'+__usage__
def do_repair(user,passwd,condition="engine is null",database="information_schema",force=False,days=0,db_host='localhost') :
sql = "select CONCAT(table_schema, '.', table_name) from tables where %s" % condition
db_con = MySQLdb.connect(db_host, port=3306, user=user,passwd=passwd,db=database)
cursor = db_con.cursor()
cursor.execute(sql)
rset = cursor.fetchall()
    print('%d tables match condition' % len(rset))
    now = time.time()
    days = days or 60
    tlimit = fandango.time2str(now-days*24*3600)
    now = fandango.time2str(now)
    for item in rset:
        try:
            if fandango.isSequence(item):
                item = item[0]
            if force:
                raise Exception('force=True, all tables to be checked')
            elif 'att_' in item:
                q = "select count(*) from %s where time between '%s' and '%s' order by time" % (item, tlimit, now)
                cursor.execute(q)
                count = cursor.fetchone()[0]
                q = "select * from %s where time between '%s' and '%s' order by time" % (item, tlimit, now)
                print(q)
                cursor.execute(q)
                l = len(cursor.fetchall())
                if abs(count - l) > 5:
                    raise Exception('%d!=%d' % (count, l))
            else:
                raise Exception('%s is a config table' % item)
        except Exception as e:
            print(e)
            print('Repairing %s ...' % item)
            cursor.execute('repair table %s' % item)
            print('[OK]\n')
            time.sleep(.001)
cursor.close()
db_con.close()
def main():
    import getopt
    t0,t1,t2 = sys.argv[1:],''.join(s for s,l,d in __options__ if s),[l for s,l,d in __options__ if l]
    print(t0,t1,t2)
    opts,args = getopt.getopt(t0,t1,t2)
    opts = dict((k.strip('-'),v) for k,v in opts+[('',args)])
    t0 = time.time()
    if any(o in opts for o in ('h','?','help')):
        print(usage())
        sys.exit()
    else:
        if not opts['']:
            user = input('Enter user:')
            passwd = input('Enter password:')
        else:
            user,passwd = opts[''][:2]
        db_host = opts.get('host','localhost')
        os.system('mysqladmin -h %s -u %s -p%s flush-hosts'%(db_host,user,passwd))
        condition = 'engine is null OR table_schema like "hdb%%" OR table_schema like "tdb%%"'
        if 'no-prompt' not in opts:
            new_condition = input('Query condition (%s):'%condition)
            condition = new_condition.strip() or condition
        do_repair(user,passwd,condition,force='force' in opts or 'f' in opts,days=int(opts.get('days',0)),db_host=db_host)
        print('database repair finished in %d seconds'%(time.time()-t0))
if (__name__ == '__main__') :
main()
| 40.019417 | 218 | 0.559195 | 575 | 4,122 | 3.897391 | 0.32 | 0.018742 | 0.006693 | 0.008032 | 0.110665 | 0.095047 | 0.067381 | 0.044623 | 0.044623 | 0.044623 | 0 | 0.012366 | 0.254488 | 4,122 | 102 | 219 | 40.411765 | 0.716889 | 0.077875 | 0 | 0.11236 | 0 | 0 | 0.233588 | 0.0116 | 0.011236 | 0 | 0 | 0 | 0 | 0 | null | null | 0.089888 | 0.044944 | null | null | 0.089888 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
31eae7ff18e644d5afcb0edcde68925c01752331 | 331 | py | Python | application/model/layout/serialize/SerializePopup.py | Kzulfazriawan/Stigma-game-demo | 971ee90a908784dfe1c9e87733b0394fa2212299 | [
"MIT"
] | 2 | 2016-08-09T05:33:21.000Z | 2016-10-05T06:34:04.000Z | application/model/layout/serialize/SerializePopup.py | Kzulfazriawan/stigma-game-demo | 971ee90a908784dfe1c9e87733b0394fa2212299 | [
"MIT"
] | null | null | null | application/model/layout/serialize/SerializePopup.py | Kzulfazriawan/stigma-game-demo | 971ee90a908784dfe1c9e87733b0394fa2212299 | [
"MIT"
] | null | null | null | from core import Files
from library.stigma.helper import kivyBuilder
from library.stigma.application import Popup
kivyBuilder(Files.apppath, 'model', 'builder', 'serialize', 'serializepopup.kv')
class SerializePopup(Popup):
def __init__(self):
super(SerializePopup, self).__init__()
self.title = "Save / Load" | 33.1 | 80 | 0.740181 | 38 | 331 | 6.236842 | 0.631579 | 0.092827 | 0.14346 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148036 | 331 | 10 | 81 | 33.1 | 0.840426 | 0 | 0 | 0 | 0 | 0 | 0.14759 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.375 | 0 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
9ec57cf99813f9b4ebd59977cceddc4ef9bbef97 | 48,874 | py | Python | abbotpythonpb/container_pb2.py | arhat-dev/abbot-proto | af3bc030f7090c89b3c54d02e1a92e0fadd990c2 | [
"Apache-2.0"
] | null | null | null | abbotpythonpb/container_pb2.py | arhat-dev/abbot-proto | af3bc030f7090c89b3c54d02e1a92e0fadd990c2 | [
"Apache-2.0"
] | 4 | 2020-10-11T02:19:40.000Z | 2020-12-18T21:02:45.000Z | abbotpythonpb/container_pb2.py | arhat-dev/abbot-proto | af3bc030f7090c89b3c54d02e1a92e0fadd990c2 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: container.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from github.com.gogo.protobuf.gogoproto import gogo_pb2 as github_dot_com_dot_gogo_dot_protobuf_dot_gogoproto_dot_gogo__pb2
import meta_pb2 as meta__pb2
DESCRIPTOR = _descriptor.FileDescriptor(
name='container.proto',
package='abbot',
syntax='proto3',
serialized_options=b'Z\037arhat.dev/abbot-proto/abbotgopb',
create_key=_descriptor._internal_create_key,
  serialized_pb=b'\n\x0f\x63ontainer.proto\x12\x05\x61\x62\x62ot\x1a-github.com/gogo/protobuf/gogoproto/gogo.proto\x1a\nmeta.proto\"\xc9\n\n\nCNICapArgs\x12\x31\n\x0cport_map_arg\x18\x01 \x01(\x0b\x32\x19.abbot.CNICapArgs.PortMapH\x00\x12\x34\n\rbandwidth_arg\x18\x02 \x01(\x0b\x32\x1b.abbot.CNICapArgs.BandwidthH\x00\x12\x31\n\x0cip_range_arg\x18\x03 \x01(\x0b\x32\x19.abbot.CNICapArgs.IPRangeH\x00\x12\x35\n\x0e\x64ns_config_arg\x18\x04 \x01(\x0b\x32\x1b.abbot.CNICapArgs.DNSConfigH\x00\x12\x39\n\x10ip_addresses_arg\x18\x05 \x01(\x0b\x32\x1d.abbot.CNICapArgs.IPAddressesH\x00\x12\x37\n\x0fmac_address_arg\x18\x06 \x01(\x0b\x32\x1c.abbot.CNICapArgs.MacAddressH\x00\x12?\n\x13infiniband_guid_arg\x18\x07 \x01(\x0b\x32 .abbot.CNICapArgs.InfinibandGUIDH\x00\x12\x33\n\rdevice_id_arg\x18\x08 \x01(\x0b\x32\x1a.abbot.CNICapArgs.DeviceIDH\x00\x1a\x92\x01\n\x07PortMap\x12)\n\x0e\x63ontainer_port\x18\x01 \x01(\x05\x42\x11\xea\xde\x1f\rcontainerPort\x12\x1f\n\thost_port\x18\x02 \x01(\x05\x42\x0c\xea\xde\x1f\x08hostPort\x12\x1e\n\x08protocol\x18\x03 \x01(\tB\x0c\xea\xde\x1f\x08protocol\x12\x1b\n\x07host_ip\x18\x04 \x01(\tB\n\xea\xde\x1f\x06hostIP\x1a\xcf\x01\n\tBandwidth\x12/\n\x0cingress_rate\x18\x01 \x01(\x05\x42\x19\xea\xde\x1f\x15ingressRate,omitempty\x12\x31\n\ringress_burst\x18\x02 \x01(\x05\x42\x1a\xea\xde\x1f\x16ingressBurst,omitempty\x12-\n\x0b\x65gress_rate\x18\x03 \x01(\x05\x42\x18\xea\xde\x1f\x14\x65gressRate,omitempty\x12/\n\x0c\x65gress_burst\x18\x04 \x01(\x05\x42\x19\xea\xde\x1f\x15\x65gressBurst,omitempty\x1a\xa7\x01\n\x07IPRange\x12\x1a\n\x06subnet\x18\x01 \x01(\tB\n\xea\xde\x1f\x06subnet\x12-\n\x0brange_start\x18\x02 \x01(\tB\x18\xea\xde\x1f\x14rangeStart,omitempty\x12)\n\trange_end\x18\x03 \x01(\tB\x16\xea\xde\x1f\x12rangeEnd,omitempty\x12&\n\x07gateway\x18\x04 \x01(\tB\x15\xea\xde\x1f\x11gateway,omitempty\x1a\x85\x01\n\tDNSConfig\x12&\n\x07servers\x18\x01 \x03(\tB\x15\xea\xde\x1f\x11servers,omitempty\x12(\n\x08searches\x18\x02 \x03(\tB\x16\xea\xde\x1f\x12searches,omitempty\x12&\n\x07options\x18\x03 \x03(\tB\x15\xea\xde\x1f\x11options,omitempty\x1a-\n\x0bIPAddresses\x12\x1e\n\x03ips\x18\x01 \x03(\tB\x11\xea\xde\x1f\rips,omitempty\x1a,\n\nMacAddress\x12\x1e\n\x03mac\x18\x01 \x01(\tB\x11\xea\xde\x1f\rmac,omitempty\x1aG\n\x0eInfinibandGUID\x12\x35\n\x0finfiniband_guid\x18\x01 \x01(\tB\x1c\xea\xde\x1f\x18infinibandGUID,omitempty\x1a\x35\n\x08\x44\x65viceID\x12)\n\tdevice_id\x18\x01 \x01(\tB\x16\xea\xde\x1f\x12\x64\x65viceID,omitemptyB\x08\n\x06option\"\xdc\x01\n\x1d\x43ontainerNetworkEnsureRequest\x12\x14\n\x0c\x63ontainer_id\x18\x01 \x01(\t\x12\x0b\n\x03pid\x18\x02 \x01(\r\x12#\n\x08\x63\x61p_args\x18\x03 \x03(\x0b\x32\x11.abbot.CNICapArgs\x12\x43\n\x08\x63ni_args\x18\x04 \x03(\x0b\x32\x31.abbot.ContainerNetworkEnsureRequest.CniArgsEntry\x1a.\n\x0c\x43niArgsEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\r\n\x05value\x18\x02 \x01(\t:\x02\x38\x01\"C\n\x1e\x43ontainerNetworkRestoreRequest\x12\x14\n\x0c\x63ontainer_id\x18\x01 \x01(\t\x12\x0b\n\x03pid\x18\x02 \x01(\r\"B\n\x1d\x43ontainerNetworkDeleteRequest\x12\x14\n\x0c\x63ontainer_id\x18\x01 \x01(\t\x12\x0b\n\x03pid\x18\x02 \x01(\r\"O\n#ContainerNetworkConfigEnsureRequest\x12\x13\n\x0bipv4_subnet\x18\x01 \x01(\t\x12\x13\n\x0bipv6_subnet\x18\x02 \x01(\t\"$\n\"ContainerNetworkConfigQueryRequest\"A\n\x1c\x43ontainerNetworkQueryRequest\x12\x14\n\x0c\x63ontainer_id\x18\x01 \x01(\t\x12\x0b\n\x03pid\x18\x02 \x01(\r\"J\n\x1e\x43ontainerNetworkConfigResponse\x12\x13\n\x0bipv4_subnet\x18\x01 \x01(\t\x12\x13\n\x0bipv6_subnet\x18\x02 \x01(\t\"Z\n\x1e\x43ontainerNetworkStatusResponse\x12\x0b\n\x03pid\x18\x01 \x01(\r\x12+\n\ninterfaces\x18\x02 \x03(\x0b\x32\x17.abbot.NetworkInterface\"\xe3\x01\n\"ContainerNetworkStatusListResponse\x12\\\n\x12\x63ontainer_networks\x18\x01 \x03(\x0b\x32@.abbot.ContainerNetworkStatusListResponse.ContainerNetworksEntry\x1a_\n\x16\x43ontainerNetworksEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\x34\n\x05value\x18\x02 \x01(\x0b\x32%.abbot.ContainerNetworkStatusResponse:\x02\x38\x01\x42!Z\x1f\x61rhat.dev/abbot-proto/abbotgopbb\x06proto3'
,
dependencies=[github_dot_com_dot_gogo_dot_protobuf_dot_gogoproto_dot_gogo__pb2.DESCRIPTOR,meta__pb2.DESCRIPTOR,])
_CNICAPARGS_PORTMAP = _descriptor.Descriptor(
name='PortMap',
full_name='abbot.CNICapArgs.PortMap',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='container_port', full_name='abbot.CNICapArgs.PortMap.container_port', index=0,
number=1, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=b'\352\336\037\rcontainerPort', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='host_port', full_name='abbot.CNICapArgs.PortMap.host_port', index=1,
number=2, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=b'\352\336\037\010hostPort', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='protocol', full_name='abbot.CNICapArgs.PortMap.protocol', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=b'\352\336\037\010protocol', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='host_ip', full_name='abbot.CNICapArgs.PortMap.host_ip', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=b'\352\336\037\006hostIP', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=546,
serialized_end=692,
)
_CNICAPARGS_BANDWIDTH = _descriptor.Descriptor(
name='Bandwidth',
full_name='abbot.CNICapArgs.Bandwidth',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='ingress_rate', full_name='abbot.CNICapArgs.Bandwidth.ingress_rate', index=0,
number=1, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=b'\352\336\037\025ingressRate,omitempty', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ingress_burst', full_name='abbot.CNICapArgs.Bandwidth.ingress_burst', index=1,
number=2, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=b'\352\336\037\026ingressBurst,omitempty', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='egress_rate', full_name='abbot.CNICapArgs.Bandwidth.egress_rate', index=2,
number=3, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=b'\352\336\037\024egressRate,omitempty', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='egress_burst', full_name='abbot.CNICapArgs.Bandwidth.egress_burst', index=3,
number=4, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=b'\352\336\037\025egressBurst,omitempty', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=695,
serialized_end=902,
)
_CNICAPARGS_IPRANGE = _descriptor.Descriptor(
name='IPRange',
full_name='abbot.CNICapArgs.IPRange',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='subnet', full_name='abbot.CNICapArgs.IPRange.subnet', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=b'\352\336\037\006subnet', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='range_start', full_name='abbot.CNICapArgs.IPRange.range_start', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=b'\352\336\037\024rangeStart,omitempty', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='range_end', full_name='abbot.CNICapArgs.IPRange.range_end', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=b'\352\336\037\022rangeEnd,omitempty', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='gateway', full_name='abbot.CNICapArgs.IPRange.gateway', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=b'\352\336\037\021gateway,omitempty', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=905,
serialized_end=1072,
)
_CNICAPARGS_DNSCONFIG = _descriptor.Descriptor(
name='DNSConfig',
full_name='abbot.CNICapArgs.DNSConfig',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='servers', full_name='abbot.CNICapArgs.DNSConfig.servers', index=0,
number=1, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=b'\352\336\037\021servers,omitempty', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='searches', full_name='abbot.CNICapArgs.DNSConfig.searches', index=1,
number=2, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=b'\352\336\037\022searches,omitempty', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='options', full_name='abbot.CNICapArgs.DNSConfig.options', index=2,
number=3, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=b'\352\336\037\021options,omitempty', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1075,
serialized_end=1208,
)
_CNICAPARGS_IPADDRESSES = _descriptor.Descriptor(
name='IPAddresses',
full_name='abbot.CNICapArgs.IPAddresses',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='ips', full_name='abbot.CNICapArgs.IPAddresses.ips', index=0,
number=1, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=b'\352\336\037\rips,omitempty', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1210,
serialized_end=1255,
)
_CNICAPARGS_MACADDRESS = _descriptor.Descriptor(
name='MacAddress',
full_name='abbot.CNICapArgs.MacAddress',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='mac', full_name='abbot.CNICapArgs.MacAddress.mac', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=b'\352\336\037\rmac,omitempty', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1257,
serialized_end=1301,
)
_CNICAPARGS_INFINIBANDGUID = _descriptor.Descriptor(
name='InfinibandGUID',
full_name='abbot.CNICapArgs.InfinibandGUID',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='infiniband_guid', full_name='abbot.CNICapArgs.InfinibandGUID.infiniband_guid', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=b'\352\336\037\030infinibandGUID,omitempty', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1303,
serialized_end=1374,
)
_CNICAPARGS_DEVICEID = _descriptor.Descriptor(
name='DeviceID',
full_name='abbot.CNICapArgs.DeviceID',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='device_id', full_name='abbot.CNICapArgs.DeviceID.device_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=b'\352\336\037\022deviceID,omitempty', file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1376,
serialized_end=1429,
)
_CNICAPARGS = _descriptor.Descriptor(
name='CNICapArgs',
full_name='abbot.CNICapArgs',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='port_map_arg', full_name='abbot.CNICapArgs.port_map_arg', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='bandwidth_arg', full_name='abbot.CNICapArgs.bandwidth_arg', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ip_range_arg', full_name='abbot.CNICapArgs.ip_range_arg', index=2,
number=3, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='dns_config_arg', full_name='abbot.CNICapArgs.dns_config_arg', index=3,
number=4, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ip_addresses_arg', full_name='abbot.CNICapArgs.ip_addresses_arg', index=4,
number=5, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='mac_address_arg', full_name='abbot.CNICapArgs.mac_address_arg', index=5,
number=6, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='infiniband_guid_arg', full_name='abbot.CNICapArgs.infiniband_guid_arg', index=6,
number=7, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='device_id_arg', full_name='abbot.CNICapArgs.device_id_arg', index=7,
number=8, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[_CNICAPARGS_PORTMAP, _CNICAPARGS_BANDWIDTH, _CNICAPARGS_IPRANGE, _CNICAPARGS_DNSCONFIG, _CNICAPARGS_IPADDRESSES, _CNICAPARGS_MACADDRESS, _CNICAPARGS_INFINIBANDGUID, _CNICAPARGS_DEVICEID, ],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name='option', full_name='abbot.CNICapArgs.option',
index=0, containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[]),
],
serialized_start=86,
serialized_end=1439,
)
_CONTAINERNETWORKENSUREREQUEST_CNIARGSENTRY = _descriptor.Descriptor(
name='CniArgsEntry',
full_name='abbot.ContainerNetworkEnsureRequest.CniArgsEntry',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='key', full_name='abbot.ContainerNetworkEnsureRequest.CniArgsEntry.key', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='value', full_name='abbot.ContainerNetworkEnsureRequest.CniArgsEntry.value', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=b'8\001',
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1616,
serialized_end=1662,
)
_CONTAINERNETWORKENSUREREQUEST = _descriptor.Descriptor(
name='ContainerNetworkEnsureRequest',
full_name='abbot.ContainerNetworkEnsureRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='container_id', full_name='abbot.ContainerNetworkEnsureRequest.container_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='pid', full_name='abbot.ContainerNetworkEnsureRequest.pid', index=1,
number=2, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='cap_args', full_name='abbot.ContainerNetworkEnsureRequest.cap_args', index=2,
number=3, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='cni_args', full_name='abbot.ContainerNetworkEnsureRequest.cni_args', index=3,
number=4, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[_CONTAINERNETWORKENSUREREQUEST_CNIARGSENTRY, ],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1442,
serialized_end=1662,
)
_CONTAINERNETWORKRESTOREREQUEST = _descriptor.Descriptor(
name='ContainerNetworkRestoreRequest',
full_name='abbot.ContainerNetworkRestoreRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='container_id', full_name='abbot.ContainerNetworkRestoreRequest.container_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='pid', full_name='abbot.ContainerNetworkRestoreRequest.pid', index=1,
number=2, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1664,
serialized_end=1731,
)
_CONTAINERNETWORKDELETEREQUEST = _descriptor.Descriptor(
name='ContainerNetworkDeleteRequest',
full_name='abbot.ContainerNetworkDeleteRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='container_id', full_name='abbot.ContainerNetworkDeleteRequest.container_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='pid', full_name='abbot.ContainerNetworkDeleteRequest.pid', index=1,
number=2, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1733,
serialized_end=1799,
)
_CONTAINERNETWORKCONFIGENSUREREQUEST = _descriptor.Descriptor(
name='ContainerNetworkConfigEnsureRequest',
full_name='abbot.ContainerNetworkConfigEnsureRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='ipv4_subnet', full_name='abbot.ContainerNetworkConfigEnsureRequest.ipv4_subnet', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ipv6_subnet', full_name='abbot.ContainerNetworkConfigEnsureRequest.ipv6_subnet', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1801,
serialized_end=1880,
)
_CONTAINERNETWORKCONFIGQUERYREQUEST = _descriptor.Descriptor(
name='ContainerNetworkConfigQueryRequest',
full_name='abbot.ContainerNetworkConfigQueryRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1882,
serialized_end=1918,
)
_CONTAINERNETWORKQUERYREQUEST = _descriptor.Descriptor(
name='ContainerNetworkQueryRequest',
full_name='abbot.ContainerNetworkQueryRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='container_id', full_name='abbot.ContainerNetworkQueryRequest.container_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='pid', full_name='abbot.ContainerNetworkQueryRequest.pid', index=1,
number=2, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1920,
serialized_end=1985,
)
_CONTAINERNETWORKCONFIGRESPONSE = _descriptor.Descriptor(
name='ContainerNetworkConfigResponse',
full_name='abbot.ContainerNetworkConfigResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='ipv4_subnet', full_name='abbot.ContainerNetworkConfigResponse.ipv4_subnet', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='ipv6_subnet', full_name='abbot.ContainerNetworkConfigResponse.ipv6_subnet', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1987,
serialized_end=2061,
)
_CONTAINERNETWORKSTATUSRESPONSE = _descriptor.Descriptor(
name='ContainerNetworkStatusResponse',
full_name='abbot.ContainerNetworkStatusResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='pid', full_name='abbot.ContainerNetworkStatusResponse.pid', index=0,
number=1, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='interfaces', full_name='abbot.ContainerNetworkStatusResponse.interfaces', index=1,
number=2, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=2063,
serialized_end=2153,
)
_CONTAINERNETWORKSTATUSLISTRESPONSE_CONTAINERNETWORKSENTRY = _descriptor.Descriptor(
name='ContainerNetworksEntry',
full_name='abbot.ContainerNetworkStatusListResponse.ContainerNetworksEntry',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='key', full_name='abbot.ContainerNetworkStatusListResponse.ContainerNetworksEntry.key', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='value', full_name='abbot.ContainerNetworkStatusListResponse.ContainerNetworksEntry.value', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=b'8\001',
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=2288,
serialized_end=2383,
)
_CONTAINERNETWORKSTATUSLISTRESPONSE = _descriptor.Descriptor(
name='ContainerNetworkStatusListResponse',
full_name='abbot.ContainerNetworkStatusListResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='container_networks', full_name='abbot.ContainerNetworkStatusListResponse.container_networks', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[_CONTAINERNETWORKSTATUSLISTRESPONSE_CONTAINERNETWORKSENTRY, ],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=2156,
serialized_end=2383,
)
_CNICAPARGS_PORTMAP.containing_type = _CNICAPARGS
_CNICAPARGS_BANDWIDTH.containing_type = _CNICAPARGS
_CNICAPARGS_IPRANGE.containing_type = _CNICAPARGS
_CNICAPARGS_DNSCONFIG.containing_type = _CNICAPARGS
_CNICAPARGS_IPADDRESSES.containing_type = _CNICAPARGS
_CNICAPARGS_MACADDRESS.containing_type = _CNICAPARGS
_CNICAPARGS_INFINIBANDGUID.containing_type = _CNICAPARGS
_CNICAPARGS_DEVICEID.containing_type = _CNICAPARGS
_CNICAPARGS.fields_by_name['port_map_arg'].message_type = _CNICAPARGS_PORTMAP
_CNICAPARGS.fields_by_name['bandwidth_arg'].message_type = _CNICAPARGS_BANDWIDTH
_CNICAPARGS.fields_by_name['ip_range_arg'].message_type = _CNICAPARGS_IPRANGE
_CNICAPARGS.fields_by_name['dns_config_arg'].message_type = _CNICAPARGS_DNSCONFIG
_CNICAPARGS.fields_by_name['ip_addresses_arg'].message_type = _CNICAPARGS_IPADDRESSES
_CNICAPARGS.fields_by_name['mac_address_arg'].message_type = _CNICAPARGS_MACADDRESS
_CNICAPARGS.fields_by_name['infiniband_guid_arg'].message_type = _CNICAPARGS_INFINIBANDGUID
_CNICAPARGS.fields_by_name['device_id_arg'].message_type = _CNICAPARGS_DEVICEID
_CNICAPARGS.oneofs_by_name['option'].fields.append(
_CNICAPARGS.fields_by_name['port_map_arg'])
_CNICAPARGS.fields_by_name['port_map_arg'].containing_oneof = _CNICAPARGS.oneofs_by_name['option']
_CNICAPARGS.oneofs_by_name['option'].fields.append(
_CNICAPARGS.fields_by_name['bandwidth_arg'])
_CNICAPARGS.fields_by_name['bandwidth_arg'].containing_oneof = _CNICAPARGS.oneofs_by_name['option']
_CNICAPARGS.oneofs_by_name['option'].fields.append(
_CNICAPARGS.fields_by_name['ip_range_arg'])
_CNICAPARGS.fields_by_name['ip_range_arg'].containing_oneof = _CNICAPARGS.oneofs_by_name['option']
_CNICAPARGS.oneofs_by_name['option'].fields.append(
_CNICAPARGS.fields_by_name['dns_config_arg'])
_CNICAPARGS.fields_by_name['dns_config_arg'].containing_oneof = _CNICAPARGS.oneofs_by_name['option']
_CNICAPARGS.oneofs_by_name['option'].fields.append(
_CNICAPARGS.fields_by_name['ip_addresses_arg'])
_CNICAPARGS.fields_by_name['ip_addresses_arg'].containing_oneof = _CNICAPARGS.oneofs_by_name['option']
_CNICAPARGS.oneofs_by_name['option'].fields.append(
_CNICAPARGS.fields_by_name['mac_address_arg'])
_CNICAPARGS.fields_by_name['mac_address_arg'].containing_oneof = _CNICAPARGS.oneofs_by_name['option']
_CNICAPARGS.oneofs_by_name['option'].fields.append(
_CNICAPARGS.fields_by_name['infiniband_guid_arg'])
_CNICAPARGS.fields_by_name['infiniband_guid_arg'].containing_oneof = _CNICAPARGS.oneofs_by_name['option']
_CNICAPARGS.oneofs_by_name['option'].fields.append(
_CNICAPARGS.fields_by_name['device_id_arg'])
_CNICAPARGS.fields_by_name['device_id_arg'].containing_oneof = _CNICAPARGS.oneofs_by_name['option']
_CONTAINERNETWORKENSUREREQUEST_CNIARGSENTRY.containing_type = _CONTAINERNETWORKENSUREREQUEST
_CONTAINERNETWORKENSUREREQUEST.fields_by_name['cap_args'].message_type = _CNICAPARGS
_CONTAINERNETWORKENSUREREQUEST.fields_by_name['cni_args'].message_type = _CONTAINERNETWORKENSUREREQUEST_CNIARGSENTRY
_CONTAINERNETWORKSTATUSRESPONSE.fields_by_name['interfaces'].message_type = meta__pb2._NETWORKINTERFACE
_CONTAINERNETWORKSTATUSLISTRESPONSE_CONTAINERNETWORKSENTRY.fields_by_name['value'].message_type = _CONTAINERNETWORKSTATUSRESPONSE
_CONTAINERNETWORKSTATUSLISTRESPONSE_CONTAINERNETWORKSENTRY.containing_type = _CONTAINERNETWORKSTATUSLISTRESPONSE
_CONTAINERNETWORKSTATUSLISTRESPONSE.fields_by_name['container_networks'].message_type = _CONTAINERNETWORKSTATUSLISTRESPONSE_CONTAINERNETWORKSENTRY
DESCRIPTOR.message_types_by_name['CNICapArgs'] = _CNICAPARGS
DESCRIPTOR.message_types_by_name['ContainerNetworkEnsureRequest'] = _CONTAINERNETWORKENSUREREQUEST
DESCRIPTOR.message_types_by_name['ContainerNetworkRestoreRequest'] = _CONTAINERNETWORKRESTOREREQUEST
DESCRIPTOR.message_types_by_name['ContainerNetworkDeleteRequest'] = _CONTAINERNETWORKDELETEREQUEST
DESCRIPTOR.message_types_by_name['ContainerNetworkConfigEnsureRequest'] = _CONTAINERNETWORKCONFIGENSUREREQUEST
DESCRIPTOR.message_types_by_name['ContainerNetworkConfigQueryRequest'] = _CONTAINERNETWORKCONFIGQUERYREQUEST
DESCRIPTOR.message_types_by_name['ContainerNetworkQueryRequest'] = _CONTAINERNETWORKQUERYREQUEST
DESCRIPTOR.message_types_by_name['ContainerNetworkConfigResponse'] = _CONTAINERNETWORKCONFIGRESPONSE
DESCRIPTOR.message_types_by_name['ContainerNetworkStatusResponse'] = _CONTAINERNETWORKSTATUSRESPONSE
DESCRIPTOR.message_types_by_name['ContainerNetworkStatusListResponse'] = _CONTAINERNETWORKSTATUSLISTRESPONSE
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
CNICapArgs = _reflection.GeneratedProtocolMessageType('CNICapArgs', (_message.Message,), {
'PortMap' : _reflection.GeneratedProtocolMessageType('PortMap', (_message.Message,), {
'DESCRIPTOR' : _CNICAPARGS_PORTMAP,
'__module__' : 'container_pb2'
# @@protoc_insertion_point(class_scope:abbot.CNICapArgs.PortMap)
})
,
'Bandwidth' : _reflection.GeneratedProtocolMessageType('Bandwidth', (_message.Message,), {
'DESCRIPTOR' : _CNICAPARGS_BANDWIDTH,
'__module__' : 'container_pb2'
# @@protoc_insertion_point(class_scope:abbot.CNICapArgs.Bandwidth)
})
,
'IPRange' : _reflection.GeneratedProtocolMessageType('IPRange', (_message.Message,), {
'DESCRIPTOR' : _CNICAPARGS_IPRANGE,
'__module__' : 'container_pb2'
# @@protoc_insertion_point(class_scope:abbot.CNICapArgs.IPRange)
})
,
'DNSConfig' : _reflection.GeneratedProtocolMessageType('DNSConfig', (_message.Message,), {
'DESCRIPTOR' : _CNICAPARGS_DNSCONFIG,
'__module__' : 'container_pb2'
# @@protoc_insertion_point(class_scope:abbot.CNICapArgs.DNSConfig)
})
,
'IPAddresses' : _reflection.GeneratedProtocolMessageType('IPAddresses', (_message.Message,), {
'DESCRIPTOR' : _CNICAPARGS_IPADDRESSES,
'__module__' : 'container_pb2'
# @@protoc_insertion_point(class_scope:abbot.CNICapArgs.IPAddresses)
})
,
'MacAddress' : _reflection.GeneratedProtocolMessageType('MacAddress', (_message.Message,), {
'DESCRIPTOR' : _CNICAPARGS_MACADDRESS,
'__module__' : 'container_pb2'
# @@protoc_insertion_point(class_scope:abbot.CNICapArgs.MacAddress)
})
,
'InfinibandGUID' : _reflection.GeneratedProtocolMessageType('InfinibandGUID', (_message.Message,), {
'DESCRIPTOR' : _CNICAPARGS_INFINIBANDGUID,
'__module__' : 'container_pb2'
# @@protoc_insertion_point(class_scope:abbot.CNICapArgs.InfinibandGUID)
})
,
'DeviceID' : _reflection.GeneratedProtocolMessageType('DeviceID', (_message.Message,), {
'DESCRIPTOR' : _CNICAPARGS_DEVICEID,
'__module__' : 'container_pb2'
# @@protoc_insertion_point(class_scope:abbot.CNICapArgs.DeviceID)
})
,
'DESCRIPTOR' : _CNICAPARGS,
'__module__' : 'container_pb2'
# @@protoc_insertion_point(class_scope:abbot.CNICapArgs)
})
_sym_db.RegisterMessage(CNICapArgs)
_sym_db.RegisterMessage(CNICapArgs.PortMap)
_sym_db.RegisterMessage(CNICapArgs.Bandwidth)
_sym_db.RegisterMessage(CNICapArgs.IPRange)
_sym_db.RegisterMessage(CNICapArgs.DNSConfig)
_sym_db.RegisterMessage(CNICapArgs.IPAddresses)
_sym_db.RegisterMessage(CNICapArgs.MacAddress)
_sym_db.RegisterMessage(CNICapArgs.InfinibandGUID)
_sym_db.RegisterMessage(CNICapArgs.DeviceID)
ContainerNetworkEnsureRequest = _reflection.GeneratedProtocolMessageType('ContainerNetworkEnsureRequest', (_message.Message,), {
'CniArgsEntry' : _reflection.GeneratedProtocolMessageType('CniArgsEntry', (_message.Message,), {
'DESCRIPTOR' : _CONTAINERNETWORKENSUREREQUEST_CNIARGSENTRY,
'__module__' : 'container_pb2'
# @@protoc_insertion_point(class_scope:abbot.ContainerNetworkEnsureRequest.CniArgsEntry)
})
,
'DESCRIPTOR' : _CONTAINERNETWORKENSUREREQUEST,
'__module__' : 'container_pb2'
# @@protoc_insertion_point(class_scope:abbot.ContainerNetworkEnsureRequest)
})
_sym_db.RegisterMessage(ContainerNetworkEnsureRequest)
_sym_db.RegisterMessage(ContainerNetworkEnsureRequest.CniArgsEntry)
ContainerNetworkRestoreRequest = _reflection.GeneratedProtocolMessageType('ContainerNetworkRestoreRequest', (_message.Message,), {
'DESCRIPTOR' : _CONTAINERNETWORKRESTOREREQUEST,
'__module__' : 'container_pb2'
# @@protoc_insertion_point(class_scope:abbot.ContainerNetworkRestoreRequest)
})
_sym_db.RegisterMessage(ContainerNetworkRestoreRequest)
ContainerNetworkDeleteRequest = _reflection.GeneratedProtocolMessageType('ContainerNetworkDeleteRequest', (_message.Message,), {
'DESCRIPTOR' : _CONTAINERNETWORKDELETEREQUEST,
'__module__' : 'container_pb2'
# @@protoc_insertion_point(class_scope:abbot.ContainerNetworkDeleteRequest)
})
_sym_db.RegisterMessage(ContainerNetworkDeleteRequest)
ContainerNetworkConfigEnsureRequest = _reflection.GeneratedProtocolMessageType('ContainerNetworkConfigEnsureRequest', (_message.Message,), {
'DESCRIPTOR' : _CONTAINERNETWORKCONFIGENSUREREQUEST,
'__module__' : 'container_pb2'
# @@protoc_insertion_point(class_scope:abbot.ContainerNetworkConfigEnsureRequest)
})
_sym_db.RegisterMessage(ContainerNetworkConfigEnsureRequest)
ContainerNetworkConfigQueryRequest = _reflection.GeneratedProtocolMessageType('ContainerNetworkConfigQueryRequest', (_message.Message,), {
'DESCRIPTOR' : _CONTAINERNETWORKCONFIGQUERYREQUEST,
'__module__' : 'container_pb2'
# @@protoc_insertion_point(class_scope:abbot.ContainerNetworkConfigQueryRequest)
})
_sym_db.RegisterMessage(ContainerNetworkConfigQueryRequest)
ContainerNetworkQueryRequest = _reflection.GeneratedProtocolMessageType('ContainerNetworkQueryRequest', (_message.Message,), {
'DESCRIPTOR' : _CONTAINERNETWORKQUERYREQUEST,
'__module__' : 'container_pb2'
# @@protoc_insertion_point(class_scope:abbot.ContainerNetworkQueryRequest)
})
_sym_db.RegisterMessage(ContainerNetworkQueryRequest)
ContainerNetworkConfigResponse = _reflection.GeneratedProtocolMessageType('ContainerNetworkConfigResponse', (_message.Message,), {
'DESCRIPTOR' : _CONTAINERNETWORKCONFIGRESPONSE,
'__module__' : 'container_pb2'
# @@protoc_insertion_point(class_scope:abbot.ContainerNetworkConfigResponse)
})
_sym_db.RegisterMessage(ContainerNetworkConfigResponse)
ContainerNetworkStatusResponse = _reflection.GeneratedProtocolMessageType('ContainerNetworkStatusResponse', (_message.Message,), {
'DESCRIPTOR' : _CONTAINERNETWORKSTATUSRESPONSE,
'__module__' : 'container_pb2'
# @@protoc_insertion_point(class_scope:abbot.ContainerNetworkStatusResponse)
})
_sym_db.RegisterMessage(ContainerNetworkStatusResponse)
ContainerNetworkStatusListResponse = _reflection.GeneratedProtocolMessageType('ContainerNetworkStatusListResponse', (_message.Message,), {
'ContainerNetworksEntry' : _reflection.GeneratedProtocolMessageType('ContainerNetworksEntry', (_message.Message,), {
'DESCRIPTOR' : _CONTAINERNETWORKSTATUSLISTRESPONSE_CONTAINERNETWORKSENTRY,
'__module__' : 'container_pb2'
# @@protoc_insertion_point(class_scope:abbot.ContainerNetworkStatusListResponse.ContainerNetworksEntry)
})
,
'DESCRIPTOR' : _CONTAINERNETWORKSTATUSLISTRESPONSE,
'__module__' : 'container_pb2'
# @@protoc_insertion_point(class_scope:abbot.ContainerNetworkStatusListResponse)
})
_sym_db.RegisterMessage(ContainerNetworkStatusListResponse)
_sym_db.RegisterMessage(ContainerNetworkStatusListResponse.ContainerNetworksEntry)
DESCRIPTOR._options = None
_CNICAPARGS_PORTMAP.fields_by_name['container_port']._options = None
_CNICAPARGS_PORTMAP.fields_by_name['host_port']._options = None
_CNICAPARGS_PORTMAP.fields_by_name['protocol']._options = None
_CNICAPARGS_PORTMAP.fields_by_name['host_ip']._options = None
_CNICAPARGS_BANDWIDTH.fields_by_name['ingress_rate']._options = None
_CNICAPARGS_BANDWIDTH.fields_by_name['ingress_burst']._options = None
_CNICAPARGS_BANDWIDTH.fields_by_name['egress_rate']._options = None
_CNICAPARGS_BANDWIDTH.fields_by_name['egress_burst']._options = None
_CNICAPARGS_IPRANGE.fields_by_name['subnet']._options = None
_CNICAPARGS_IPRANGE.fields_by_name['range_start']._options = None
_CNICAPARGS_IPRANGE.fields_by_name['range_end']._options = None
_CNICAPARGS_IPRANGE.fields_by_name['gateway']._options = None
_CNICAPARGS_DNSCONFIG.fields_by_name['servers']._options = None
_CNICAPARGS_DNSCONFIG.fields_by_name['searches']._options = None
_CNICAPARGS_DNSCONFIG.fields_by_name['options']._options = None
_CNICAPARGS_IPADDRESSES.fields_by_name['ips']._options = None
_CNICAPARGS_MACADDRESS.fields_by_name['mac']._options = None
_CNICAPARGS_INFINIBANDGUID.fields_by_name['infiniband_guid']._options = None
_CNICAPARGS_DEVICEID.fields_by_name['device_id']._options = None
_CONTAINERNETWORKENSUREREQUEST_CNIARGSENTRY._options = None
_CONTAINERNETWORKSTATUSLISTRESPONSE_CONTAINERNETWORKSENTRY._options = None
# @@protoc_insertion_point(module_scope)
"""dashboard_api.api.utils."""
import hashlib
import json
import re
import time
from enum import Enum
from typing import Any, Dict, Optional, Tuple
import numpy as np
# Temporary
import rasterio
from rasterio import features
from rasterio.warp import transform_bounds
from rasterstats.io import bounds_window
from rio_color.operations import parse_operations
from rio_color.utils import scale_dtype, to_math_type
from rio_tiler import constants
from rio_tiler.mercator import get_zooms
from rio_tiler.utils import _chunks, has_alpha_band, has_mask_band, linear_rescale
from shapely.geometry import box, shape
from dashboard_api.db.memcache import CacheLayer
from dashboard_api.models.timelapse import Feature
from starlette.requests import Request
def get_cache(request: Request) -> CacheLayer:
"""Get Memcached Layer."""
return request.state.cache
def get_hash(**kwargs: Any) -> str:
"""Create hash from kwargs."""
return hashlib.sha224(json.dumps(kwargs, sort_keys=True).encode()).hexdigest()
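A quick sketch of why `sort_keys=True` matters for the cache key: it makes the digest independent of keyword-argument order (the helper is re-declared inline so the snippet runs standalone).

```python
import hashlib
import json


def get_hash(**kwargs):
    # Same scheme as above: serialize kwargs deterministically, then hash.
    return hashlib.sha224(json.dumps(kwargs, sort_keys=True).encode()).hexdigest()


h1 = get_hash(url="/tiles", z=3, x=2, y=1)
h2 = get_hash(y=1, x=2, z=3, url="/tiles")
print(h1 == h2)  # True: key order does not change the cache key
```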
def postprocess(
tile: np.ndarray,
mask: np.ndarray,
rescale: Optional[str] = None,
color_formula: Optional[str] = None,
) -> np.ndarray:
"""Post-process tile data."""
if rescale:
rescale_arr = list(map(float, rescale.split(",")))
rescale_arr = list(_chunks(rescale_arr, 2))
if len(rescale_arr) != tile.shape[0]:
rescale_arr = ((rescale_arr[0]),) * tile.shape[0]
for bdx in range(tile.shape[0]):
tile[bdx] = np.where(
mask,
linear_rescale(
tile[bdx], in_range=rescale_arr[bdx], out_range=[0, 255]
),
0,
)
tile = tile.astype(np.uint8)
if color_formula:
# make sure one last time we don't have
# negative value before applying color formula
tile[tile < 0] = 0
for ops in parse_operations(color_formula):
tile = scale_dtype(ops(to_math_type(tile)), np.uint8)
return tile
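To illustrate how the `rescale` string becomes per-band `(min, max)` ranges: the comma-separated floats are split into pairs. The chunking is re-implemented inline here, since `rio_tiler.utils._chunks` is a private helper.

```python
def chunks(lst, n):
    # Minimal stand-in for rio_tiler.utils._chunks: yield n-sized slices.
    for i in range(0, len(lst), n):
        yield lst[i:i + n]


rescale = "0,1000,10,500"  # two bands, each with its own input range
rescale_arr = list(chunks(list(map(float, rescale.split(","))), 2))
print(rescale_arr)  # [[0.0, 1000.0], [10.0, 500.0]]
```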
# from rio-tiler 2.0a5
def info(address: str) -> Dict:
"""
Return simple metadata about the file.
Attributes
----------
address : str or PathLike object
A dataset path or URL. Will be opened in "r" mode.
Returns
-------
out : dict.
"""
with rasterio.open(address) as src_dst:
minzoom, maxzoom = get_zooms(src_dst)
bounds = transform_bounds(
src_dst.crs, constants.WGS84_CRS, *src_dst.bounds, densify_pts=21
)
center = [(bounds[0] + bounds[2]) / 2, (bounds[1] + bounds[3]) / 2, minzoom]
def _get_descr(ix):
"""Return band description."""
name = src_dst.descriptions[ix - 1]
if not name:
name = "band{}".format(ix)
return name
band_descriptions = [(ix, _get_descr(ix)) for ix in src_dst.indexes]
tags = [(ix, src_dst.tags(ix)) for ix in src_dst.indexes]
other_meta = dict()
if src_dst.scales[0] and src_dst.offsets[0]:
other_meta.update(dict(scale=src_dst.scales[0]))
other_meta.update(dict(offset=src_dst.offsets[0]))
if has_alpha_band(src_dst):
nodata_type = "Alpha"
elif has_mask_band(src_dst):
nodata_type = "Mask"
elif src_dst.nodata is not None:
nodata_type = "Nodata"
else:
nodata_type = "None"
try:
cmap = src_dst.colormap(1)
other_meta.update(dict(colormap=cmap))
except ValueError:
pass
return dict(
address=address,
bounds=bounds,
center=center,
minzoom=minzoom,
maxzoom=maxzoom,
band_metadata=tags,
band_descriptions=band_descriptions,
dtype=src_dst.meta["dtype"],
colorinterp=[src_dst.colorinterp[ix - 1].name for ix in src_dst.indexes],
nodata_type=nodata_type,
**other_meta,
)
# This code is copied from marblecutter
# https://github.com/mojodna/marblecutter/blob/master/marblecutter/stats.py
# License:
# Original work Copyright 2016 Stamen Design
# Modified work Copyright 2016-2017 Seth Fitzsimmons
# Modified work Copyright 2016 American Red Cross
# Modified work Copyright 2016-2017 Humanitarian OpenStreetMap Team
# Modified work Copyright 2017 Mapzen
class Timer(object):
"""Time a code block."""
def __enter__(self):
"""Starts timer."""
self.start = time.time()
return self
def __exit__(self, ty, val, tb):
"""Stops timer."""
self.end = time.time()
self.elapsed = self.end - self.start
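The `Timer` context manager is used like this (class re-declared inline so the example is self-contained):

```python
import time


class Timer(object):
    """Time a code block (same shape as the class above)."""

    def __enter__(self):
        self.start = time.time()
        return self

    def __exit__(self, ty, val, tb):
        self.end = time.time()
        self.elapsed = self.end - self.start


with Timer() as t:
    time.sleep(0.01)  # the code block being measured
print(t.elapsed >= 0.01)
```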
# from https://gist.github.com/perrygeo/721040f8545272832a42#file-pctcover-png
# author: @perrygeo
def _rasterize_geom(geom, shape, affinetrans, all_touched):
indata = [(geom, 1)]
rv_array = features.rasterize(
indata, out_shape=shape, transform=affinetrans, fill=0, all_touched=all_touched
)
return rv_array
def rasterize_pctcover(geom, atrans, shape):
"""Rasterize features."""
alltouched = _rasterize_geom(geom, shape, atrans, all_touched=True)
exterior = _rasterize_geom(geom.exterior, shape, atrans, all_touched=True)
# Create percent cover grid as the difference between them
# at this point all cells are known 100% coverage,
# we'll update this array for exterior points
pctcover = alltouched - exterior
    # loop through indices of all exterior cells
for r, c in zip(*np.where(exterior == 1)):
# Find cell bounds, from rasterio DatasetReader.window_bounds
window = ((r, r + 1), (c, c + 1))
((row_min, row_max), (col_min, col_max)) = window
x_min, y_min = (col_min, row_max) * atrans
x_max, y_max = (col_max, row_min) * atrans
bounds = (x_min, y_min, x_max, y_max)
# Construct shapely geometry of cell
cell = box(*bounds)
# Intersect with original shape
cell_overlap = cell.intersection(geom)
# update pctcover with percentage based on area proportion
pctcover[r, c] = cell_overlap.area / cell.area
return pctcover
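The per-cell update `pctcover[r, c] = cell_overlap.area / cell.area` is just a fractional-overlap computation. For axis-aligned rectangles it can be done without shapely, which makes the arithmetic easy to see:

```python
def overlap_fraction(cell, geom):
    # Fraction of axis-aligned rectangle `cell` covered by rectangle `geom`;
    # both given as (xmin, ymin, xmax, ymax) tuples.
    w = min(cell[2], geom[2]) - max(cell[0], geom[0])
    h = min(cell[3], geom[3]) - max(cell[1], geom[1])
    if w <= 0 or h <= 0:
        return 0.0  # no intersection
    cell_area = (cell[2] - cell[0]) * (cell[3] - cell[1])
    return (w * h) / cell_area


# A geometry edge crossing the middle of a unit cell yields 50% coverage:
print(overlap_fraction((0, 0, 1, 1), (0.5, 0, 2, 2)))  # 0.5
```

In the real function the cell may be clipped by an arbitrary polygon, which is why shapely's `intersection` is used instead.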
def get_zonal_stat(geojson: Feature, raster: str) -> Tuple[float, float]:
"""Return zonal statistics."""
geom = shape(geojson.geometry.dict())
with rasterio.open(raster) as src:
# read the raster data matching the geometry bounds
window = bounds_window(geom.bounds, src.transform)
# store our window information & read
window_affine = src.window_transform(window)
data = src.read(window=window)
# calculate the coverage of pixels for weighting
pctcover = rasterize_pctcover(geom, atrans=window_affine, shape=data.shape[1:])
return (
np.average(data[0], weights=pctcover),
np.nanmedian(data),
)
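The zonal mean returned above is a coverage-weighted average: pixels only partially inside the polygon contribute proportionally. With explicit numbers:

```python
import numpy as np

data = np.array([[10.0, 20.0], [30.0, 40.0]])       # pixel values
pctcover = np.array([[1.0, 0.5], [0.25, 0.0]])      # fractional coverage
mean = np.average(data, weights=pctcover)
print(mean)  # (10*1.0 + 20*0.5 + 30*0.25) / 1.75
```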
# from https://gitlab.com/zfasnacht/global_mapping/-/blob/master/global_mapping.py#L231
no2_cmap = {
0: [153, 197, 227, 255],
1: [154, 197, 225, 255],
2: [155, 198, 225, 255],
3: [158, 198, 223, 255],
4: [159, 199, 223, 255],
5: [161, 200, 222, 255],
6: [162, 200, 221, 255],
7: [163, 201, 221, 255],
8: [165, 201, 219, 255],
9: [166, 202, 219, 255],
10: [168, 203, 217, 255],
11: [169, 203, 217, 255],
12: [171, 203, 216, 255],
13: [173, 204, 215, 255],
14: [174, 205, 214, 255],
15: [176, 206, 213, 255],
16: [177, 206, 212, 255],
17: [179, 207, 212, 255],
18: [181, 207, 210, 255],
19: [182, 208, 210, 255],
20: [184, 209, 208, 255],
21: [185, 209, 208, 255],
22: [186, 209, 207, 255],
23: [188, 210, 206, 255],
24: [190, 211, 206, 255],
25: [192, 211, 204, 255],
26: [193, 212, 204, 255],
27: [194, 212, 203, 255],
28: [196, 213, 202, 255],
29: [198, 214, 201, 255],
30: [200, 214, 200, 255],
31: [201, 215, 199, 255],
32: [202, 215, 199, 255],
33: [205, 216, 197, 255],
34: [206, 217, 197, 255],
35: [208, 217, 195, 255],
36: [209, 218, 195, 255],
37: [210, 218, 194, 255],
38: [212, 219, 193, 255],
39: [213, 219, 192, 255],
40: [215, 220, 191, 255],
41: [217, 221, 191, 255],
42: [218, 221, 190, 255],
43: [220, 222, 189, 255],
44: [221, 222, 188, 255],
45: [223, 223, 187, 255],
46: [225, 224, 186, 255],
47: [226, 224, 185, 255],
48: [228, 225, 184, 255],
49: [229, 225, 184, 255],
50: [232, 226, 183, 255],
51: [232, 226, 182, 255],
52: [234, 227, 181, 255],
53: [236, 228, 180, 255],
54: [237, 228, 179, 255],
55: [239, 229, 178, 255],
56: [240, 229, 177, 255],
57: [243, 230, 176, 255],
58: [244, 231, 176, 255],
59: [245, 231, 175, 255],
60: [247, 232, 174, 255],
61: [248, 232, 173, 255],
62: [251, 233, 172, 255],
63: [252, 234, 171, 255],
64: [253, 232, 170, 255],
65: [253, 230, 167, 255],
66: [252, 226, 164, 255],
67: [252, 222, 161, 255],
68: [252, 220, 160, 255],
69: [252, 216, 157, 255],
70: [252, 213, 154, 255],
71: [251, 209, 151, 255],
72: [251, 206, 148, 255],
73: [251, 203, 146, 255],
74: [251, 200, 143, 255],
75: [251, 197, 140, 255],
76: [250, 193, 137, 255],
77: [250, 189, 134, 255],
78: [250, 186, 131, 255],
79: [250, 183, 129, 255],
80: [250, 181, 127, 255],
81: [249, 177, 124, 255],
82: [249, 174, 121, 255],
83: [249, 170, 119, 255],
84: [249, 168, 117, 255],
85: [249, 165, 116, 255],
86: [249, 162, 114, 255],
87: [249, 158, 112, 255],
88: [249, 155, 111, 255],
89: [249, 153, 110, 255],
90: [249, 150, 108, 255],
91: [249, 147, 107, 255],
92: [248, 144, 105, 255],
93: [248, 141, 104, 255],
94: [248, 139, 104, 255],
95: [248, 136, 102, 255],
96: [248, 133, 101, 255],
97: [248, 130, 99, 255],
98: [248, 126, 98, 255],
99: [248, 124, 97, 255],
100: [248, 121, 94, 255],
101: [248, 118, 93, 255],
102: [247, 115, 92, 255],
103: [246, 112, 92, 255],
104: [244, 110, 93, 255],
105: [243, 108, 94, 255],
106: [242, 106, 95, 255],
107: [240, 103, 96, 255],
108: [239, 101, 96, 255],
109: [237, 99, 97, 255],
110: [236, 97, 97, 255],
111: [235, 95, 98, 255],
112: [233, 92, 99, 255],
113: [231, 90, 100, 255],
114: [229, 87, 101, 255],
115: [228, 85, 101, 255],
116: [227, 83, 102, 255],
117: [225, 80, 103, 255],
118: [224, 78, 103, 255],
119: [222, 76, 104, 255],
120: [221, 74, 105, 255],
121: [219, 71, 105, 255],
122: [217, 70, 107, 255],
123: [216, 69, 107, 255],
124: [213, 68, 108, 255],
125: [211, 67, 109, 255],
126: [209, 67, 109, 255],
127: [207, 65, 111, 255],
128: [204, 65, 111, 255],
129: [202, 63, 113, 255],
130: [200, 63, 113, 255],
131: [198, 62, 114, 255],
132: [196, 61, 115, 255],
133: [194, 60, 116, 255],
134: [191, 58, 117, 255],
135: [189, 58, 118, 255],
136: [187, 57, 119, 255],
137: [185, 56, 120, 255],
138: [183, 55, 120, 255],
139: [181, 54, 122, 255],
140: [179, 53, 122, 255],
141: [177, 53, 123, 255],
142: [174, 51, 123, 255],
143: [171, 51, 124, 255],
144: [168, 49, 124, 255],
145: [165, 49, 124, 255],
146: [164, 48, 125, 255],
147: [161, 47, 125, 255],
148: [158, 46, 125, 255],
149: [155, 45, 126, 255],
150: [152, 44, 126, 255],
151: [150, 44, 126, 255],
152: [148, 43, 126, 255],
153: [146, 42, 127, 255],
154: [143, 41, 127, 255],
155: [140, 40, 127, 255],
156: [137, 39, 128, 255],
157: [135, 38, 128, 255],
158: [132, 38, 128, 255],
159: [130, 36, 129, 255],
160: [127, 36, 129, 255],
161: [125, 34, 129, 255],
162: [123, 34, 128, 255],
163: [121, 33, 128, 255],
164: [119, 32, 128, 255],
165: [117, 32, 128, 255],
166: [114, 31, 127, 255],
167: [113, 30, 127, 255],
168: [112, 30, 127, 255],
169: [109, 28, 127, 255],
170: [107, 28, 126, 255],
171: [105, 27, 126, 255],
172: [103, 26, 126, 255],
173: [101, 25, 126, 255],
174: [99, 24, 125, 255],
175: [97, 24, 125, 255],
176: [94, 23, 125, 255],
177: [93, 22, 125, 255],
178: [91, 21, 124, 255],
179: [89, 20, 124, 255],
180: [87, 20, 124, 255],
181: [85, 19, 124, 255],
182: [83, 18, 123, 255],
183: [82, 18, 123, 255],
184: [79, 17, 122, 255],
185: [77, 17, 120, 255],
186: [75, 17, 118, 255],
187: [73, 17, 116, 255],
188: [71, 17, 114, 255],
189: [69, 17, 112, 255],
190: [67, 17, 110, 255],
191: [64, 17, 108, 255],
192: [62, 17, 106, 255],
193: [61, 17, 104, 255],
194: [58, 17, 102, 255],
195: [57, 17, 101, 255],
196: [55, 16, 98, 255],
197: [53, 16, 96, 255],
198: [51, 16, 95, 255],
199: [49, 16, 93, 255],
200: [47, 16, 91, 255],
201: [45, 16, 88, 255],
202: [43, 16, 86, 255],
203: [41, 16, 85, 255],
204: [39, 16, 82, 255],
205: [37, 16, 81, 255],
206: [34, 16, 78, 255],
207: [32, 16, 76, 255],
208: [31, 16, 74, 255],
209: [30, 16, 73, 255],
210: [30, 15, 72, 255],
211: [29, 15, 70, 255],
212: [29, 15, 69, 255],
213: [28, 14, 67, 255],
214: [27, 14, 66, 255],
215: [27, 14, 65, 255],
216: [26, 14, 63, 255],
217: [26, 14, 62, 255],
218: [25, 13, 60, 255],
219: [25, 13, 59, 255],
220: [24, 13, 58, 255],
221: [23, 12, 56, 255],
222: [23, 12, 54, 255],
223: [22, 12, 52, 255],
224: [22, 12, 52, 255],
225: [22, 11, 51, 255],
226: [21, 11, 49, 255],
227: [20, 11, 48, 255],
228: [19, 10, 46, 255],
229: [19, 10, 45, 255],
230: [19, 10, 43, 255],
231: [18, 10, 41, 255],
232: [17, 9, 40, 255],
233: [17, 9, 38, 255],
234: [16, 9, 37, 255],
235: [16, 9, 36, 255],
236: [15, 8, 34, 255],
237: [15, 8, 33, 255],
238: [14, 8, 31, 255],
239: [14, 8, 30, 255],
240: [13, 7, 29, 255],
241: [12, 7, 27, 255],
242: [12, 7, 26, 255],
243: [11, 6, 24, 255],
244: [11, 6, 23, 255],
245: [10, 6, 22, 255],
246: [9, 6, 20, 255],
247: [9, 5, 19, 255],
248: [8, 5, 17, 255],
249: [8, 5, 16, 255],
250: [7, 5, 15, 255],
251: [7, 4, 13, 255],
252: [6, 4, 12, 255],
253: [6, 4, 10, 255],
254: [5, 4, 9, 255],
255: [5, 3, 8, 255],
}
crop_monitor_cmap = {
1: [120, 120, 120, 255],
2: [130, 65, 0, 255],
3: [66, 207, 56, 255],
4: [245, 239, 0, 255],
5: [241, 89, 32, 255],
6: [168, 0, 0, 255],
7: [0, 143, 201, 255],
}
COLOR_MAPS = {
"no2": no2_cmap.copy(),
"cropmonitor": crop_monitor_cmap.copy(),
}
def get_custom_cmap(cname: str) -> Dict:
    """Return custom colormap."""
    if not re.match(r"^custom_", cname):
        raise ValueError("Invalid colormap name")
    # split on the first underscore only, so map names may contain "_"
    _, name = cname.split("_", 1)
    return COLOR_MAPS[name]
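The `custom_` prefix is the convention that distinguishes these hand-built maps from the matplotlib-style names below: stripping the prefix leaves a key of `COLOR_MAPS`. A minimal sketch (with stand-in dicts instead of the full 256-entry maps):

```python
import re

# Stand-ins for the real no2 / crop-monitor colormaps defined above
COLOR_MAPS = {"no2": {0: [153, 197, 227, 255]},
              "cropmonitor": {1: [120, 120, 120, 255]}}


def lookup(cname):
    # Same convention as get_custom_cmap(): "custom_<name>" selects
    # COLOR_MAPS[<name>]; anything else is rejected.
    if not re.match(r"^custom_", cname):
        raise ValueError("Invalid colormap name")
    _, name = cname.split("_", 1)
    return COLOR_MAPS[name]


print(lookup("custom_no2"))  # {0: [153, 197, 227, 255]}
```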
COLOR_MAP_NAMES = [
"accent",
"accent_r",
"afmhot",
"afmhot_r",
"autumn",
"autumn_r",
"binary",
"binary_r",
"blues",
"blues_r",
"bone",
"bone_r",
"brbg",
"brbg_r",
"brg",
"brg_r",
"bugn",
"bugn_r",
"bupu",
"bupu_r",
"bwr",
"bwr_r",
"cfastie",
"cividis",
"cividis_r",
"cmrmap",
"cmrmap_r",
"cool",
"cool_r",
"coolwarm",
"coolwarm_r",
"copper",
"copper_r",
"cubehelix",
"cubehelix_r",
"dark2",
"dark2_r",
"flag",
"flag_r",
"gist_earth",
"gist_earth_r",
"gist_gray",
"gist_gray_r",
"gist_heat",
"gist_heat_r",
"gist_ncar",
"gist_ncar_r",
"gist_rainbow",
"gist_rainbow_r",
"gist_stern",
"gist_stern_r",
"gist_yarg",
"gist_yarg_r",
"gnbu",
"gnbu_r",
"gnuplot",
"gnuplot2",
"gnuplot2_r",
"gnuplot_r",
"gray",
"gray_r",
"greens",
"greens_r",
"greys",
"greys_r",
"hot",
"hot_r",
"hsv",
"hsv_r",
"inferno",
"inferno_r",
"jet",
"jet_r",
"magma",
"magma_r",
"nipy_spectral",
"nipy_spectral_r",
"ocean",
"ocean_r",
"oranges",
"oranges_r",
"orrd",
"orrd_r",
"paired",
"paired_r",
"pastel1",
"pastel1_r",
"pastel2",
"pastel2_r",
"pink",
"pink_r",
"piyg",
"piyg_r",
"plasma",
"plasma_r",
"prgn",
"prgn_r",
"prism",
"prism_r",
"pubu",
"pubu_r",
"pubugn",
"pubugn_r",
"puor",
"puor_r",
"purd",
"purd_r",
"purples",
"purples_r",
"rainbow",
"rainbow_r",
"rdbu",
"rdbu_r",
"rdgy",
"rdgy_r",
"rdpu",
"rdpu_r",
"rdylbu",
"rdylbu_r",
"rdylgn",
"rdylgn_r",
"reds",
"reds_r",
"rplumbo",
"schwarzwald",
"seismic",
"seismic_r",
"set1",
"set1_r",
"set2",
"set2_r",
"set3",
"set3_r",
"spectral",
"spectral_r",
"spring",
"spring_r",
"summer",
"summer_r",
"tab10",
"tab10_r",
"tab20",
"tab20_r",
"tab20b",
"tab20b_r",
"tab20c",
"tab20c_r",
"terrain",
"terrain_r",
"twilight",
"twilight_r",
"twilight_shifted",
"twilight_shifted_r",
"viridis",
"viridis_r",
"winter",
"winter_r",
"wistia",
"wistia_r",
"ylgn",
"ylgn_r",
"ylgnbu",
"ylgnbu_r",
"ylorbr",
"ylorbr_r",
"ylorrd",
"ylorrd_r",
] + [f"custom_{c}" for c in COLOR_MAPS.keys()]
ColorMapName = Enum("ColorMapNames", [(a, a) for a in COLOR_MAP_NAMES]) # type: ignore
# ---------------------------------------------------------------------------
# setup.py (repo: mvx24/django-mvx-utils, MIT license)
# ---------------------------------------------------------------------------
from setuptools import setup, find_packages
setup(
name='django-mvx-utils',
version='0.0.1',
description='Random utils for Django.',
url='https://github.com/mvx24/django-mvx-utils',
author='mvx24',
author_email='cram2400@gmail.com',
license='MIT',
classifiers=[
'Development Status :: 3 - Alpha',
'Environment :: Console',
'Framework :: Django',
'License :: OSI Approved :: MIT License',
'Operating System :: MacOS :: MacOS X',
'Operating System :: POSIX',
'Operating System :: Unix',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 2 :: Only',
'Topic :: Database',
'Topic :: Software Development :: Libraries :: Python Modules'
],
keywords='django utils views decorators',
packages=find_packages(exclude=['tests']),
install_requires=['django']
)
# ---------------------------------------------------------------------------
# competitive/AtCoder/ABC198/E.py (repo: pn11/benkyokai, MIT license)
# ---------------------------------------------------------------------------
# Draft
import sys

input = sys.stdin.readline

N = int(input())
C = [int(x) for x in input().split()]
adj = [[] for _ in range(N)]
for _ in range(N - 1):
    a, b = [int(x) for x in input().split()]
    adj[a - 1].append(b - 1)
    adj[b - 1].append(a - 1)

# count[c]: occurrences of color c on the path from vertex 1 down to the
# vertex currently being visited (the colors seen on the path from 1 to i)
count = {}
ans = []
# iterative DFS with explicit enter/leave states, to stay clear of the
# recursion limit for large trees
stack = [(0, -1, False)]
while stack:
    v, parent, leaving = stack.pop()
    if leaving:
        count[C[v]] -= 1
        continue
    if count.get(C[v], 0) == 0:
        ans.append(v + 1)  # no ancestor shares this color: vertex is good
    count[C[v]] = count.get(C[v], 0) + 1
    stack.append((v, parent, True))
    for u in adj[v]:
        if u != parent:
            stack.append((u, v, False))
print(*sorted(ans), sep='\n')
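A self-contained run of the intended algorithm (count colors along the root path during a DFS; a vertex is "good" when its color has not yet been seen) on the problem's first sample, with the tree hard-coded:

```python
# ABC198 E sample 1: 6 vertices, colors 2 7 1 8 2 8, expected answer 1 2 3 4 6
N = 6
C = [2, 7, 1, 8, 2, 8]
edges = [(1, 2), (3, 6), (3, 2), (4, 3), (2, 5)]
adj = [[] for _ in range(N)]
for a, b in edges:
    adj[a - 1].append(b - 1)
    adj[b - 1].append(a - 1)

count = {}
ans = []


def dfs(v, parent):
    if count.get(C[v], 0) == 0:
        ans.append(v + 1)        # color unseen on the path so far: good
    count[C[v]] = count.get(C[v], 0) + 1
    for u in adj[v]:
        if u != parent:
            dfs(u, v)
    count[C[v]] -= 1             # undo on the way back up


dfs(0, -1)
print(sorted(ans))  # [1, 2, 3, 4, 6]
```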
# ---------------------------------------------------------------------------
# translate/misc/wsgiserver/__init__.py
# (repo: DESHRAJ/fjord, BSD-3-Clause license)
# ---------------------------------------------------------------------------
__all__ = ['HTTPRequest', 'HTTPConnection', 'HTTPServer',
'SizeCheckWrapper', 'KnownLengthRFile', 'ChunkedRFile',
'MaxSizeExceeded', 'NoSSLError', 'FatalSSLAlert',
'WorkerThread', 'ThreadPool', 'SSLAdapter',
'CherryPyWSGIServer',
'Gateway', 'WSGIGateway', 'WSGIGateway_10', 'WSGIGateway_u0',
'WSGIPathInfoDispatcher', 'get_ssl_adapter_class']
from wsgiserver import *
# ---------------------------------------------------------------------------
# bnbapp/protocols/middleware.py (repo: Bionetbook/bionetbook, MIT license)
# ---------------------------------------------------------------------------
from django.core.urlresolvers import reverse
from django.http import HttpResponse, HttpResponseRedirect
# from django.contrib import messages
from protocols.models import Protocol
from django.http import Http404
# class ConfirmProfile(object):
# def process_response(self, request, response):
# my_profile = getattr(request, "my_profile", None)
# if not hasattr(request, "user"):
# return response
# if request.user.is_anonymous():
# return response
# if my_profile == None:
# current_path = request.get_full_path()
# # TODO write this cleaner
# paths = ('profile_update', 'logout', )
# valid_path = False
# for path in paths:
# if current_path.startswith(reverse(path)):
# valid_path = True
# if not valid_path:
# messages.add_message(request, messages.ERROR, "Please fill out your profile.")
# return HttpResponseRedirect(reverse('profile_update'))
# return response
class ProtocolAccess(object):
def process_view(self, request, view_func, view_args, view_kwargs):
if "protocol_slug" in view_kwargs:
user = getattr(request, "user", None)
if user:
try:
protocol = Protocol.objects.get(slug=view_kwargs["protocol_slug"])
except Protocol.DoesNotExist:
                    print "%s failed to access non-existent protocol" % (user)
raise Http404
if not protocol.user_has_access(user):
print "%s failed to access %s" % (user, protocol)
raise Http404 # How about returning a 404 response
return
# def process_view(self, request, view_func, view_args, view_kwargs):
# __traceback_hide__ = True
# toolbar = self.__class__.debug_toolbars.get(threading.currentThread().ident)
# if not toolbar:
# return
# result = None
# for panel in toolbar.panels:
# response = panel.process_view(request, view_func, view_args, view_kwargs)
# if response:
# result = response
# return result
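Old-style middleware of this shape (the `process_view` hook, matching the pre-1.10 `django.core.urlresolvers` import above) is activated by listing its dotted path in settings. A sketch, where the surrounding entries are illustrative:

```python
# settings.py (sketch): process_view() on each listed class runs before
# every view is called, in list order.
MIDDLEWARE_CLASSES = (
    'django.middleware.common.CommonMiddleware',
    'django.contrib.auth.middleware.AuthenticationMiddleware',
    'protocols.middleware.ProtocolAccess',
)
print('protocols.middleware.ProtocolAccess' in MIDDLEWARE_CLASSES)  # True
```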
# ---------------------------------------------------------------------------
# sacla/scripts/backups/old_scripts/sacla2_Chip_Collect1_v2.py
# (repo: beamline-i24/DiamondChips, Apache-2.0 license)
# ---------------------------------------------------------------------------
import pv
import os, re, sys
import time, math, string
from time import sleep
from ca import caput, caget
from sacla2_Chip_StartUp1 import get_xy, make_path_dict, scrape_dcparameters
###############################################
# OLD Chip_Collect from SACLA1 experiment #
# This version last edited 21Oct2016 by DAS #
###############################################
def get_chip_prog_values(xstart=0, ystart=0, xblocks=9, yblocks=9, coltype=41, block_id=11, num_of_shots=1):
chip_dict = \
{'X_NUM_STEPS': [11, 12],
'Y_NUM_STEPS': [12, 12],
'X_STEP_SIZE': [13, 0.125],
'Y_STEP_SIZE': [14, 0.125],
#'X_STEP_SIZE': [13, 0.001],
#'Y_STEP_SIZE': [14, 0.001],
'DWELL_TIME': [15, 16], #SACLA 15ms + 1ms
#'DWELL_TIME': [15, 55], #10Hz test
#'DWELL_TIME': [15, 105], #5Hz test
'X_START': [16, xstart],
'Y_START': [17, ystart],
'Z_START': [18, 0],
'X_NUM_BLOCKS': [20, xblocks],
'Y_NUM_BLOCKS': [21, yblocks],
'X_BLOCK_SIZE': [24, 2.2],
'Y_BLOCK_SIZE': [25, 2.5],
'COLTYPE': [26, coltype],
'N_EXPOSURES': [30, num_of_shots],
'BLOCK_ID': [31, block_id]}
return chip_dict
def load_motion_program_data(motion_program_dict, map_type):
print 'Loading prog vars for chip'
if map_type == '0':
prefix = 11
elif map_type == '1':
prefix = 11
elif map_type == '2':
prefix = 12
elif map_type == '3':
prefix = 13
else:
print 'Unknown map_type'
#for k, v in motion_program_dict.items():
for key in sorted(motion_program_dict.keys()):
v = motion_program_dict[key]
pvar_base = prefix * 100
pvar = pvar_base + v[0]
value = str(v[1])
s = 'P' + str(pvar) + '=' + str(value)
print key, '\t', s
caput(pv.me14e_pmac_str, s)
sleep(0.02)
print 'done'
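The P-variable numbers sent to the PMAC are composed from a per-map prefix (11, 12 or 13) times 100 plus the per-parameter offset stored as the first element of each chip-dict entry. Re-deriving one command inline:

```python
# One entry from the chip dict: name -> [pvar offset, value]
prefix = 11                          # map_type '0'/'1'
entry = ('X_STEP_SIZE', [13, 0.125])

pvar = prefix * 100 + entry[1][0]    # 11 * 100 + 13 = 1113
command = 'P' + str(pvar) + '=' + str(entry[1][1])
print(command)  # P1113=0.125
```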
def main():
print 'Starting'
starttime = time.ctime()
caput(pv.me14e_pmac_str, '!x0y0z0')
chipname, visit_id, proteinname, num_of_shots, chip_type, map_type, path_key = scrape_dcparameters()
if map_type == '0' or map_type == '1':
path_dict = make_path_dict()
print path_dict.keys()
print path_key + '_classic'
print path_dict[path_key + '_classic']
xstart, ystart, xblocks, yblocks, coltype, path = path_dict[path_key + '_classic']
# What is block id? Is it used by the motion program?
block_id = str(string.uppercase.index(path[0][0])+1) + path[0][1]
print '\n\nChip name is', chipname
print 'visit_id', visit_id
print 'proteinname', proteinname
print 'path_key:', path_key
print 'path:', path
print 'xstart', xstart
print 'ystart', ystart
print 'xblocks', xblocks
print 'yblocks', yblocks
print 'Block ID', block_id
print 'coltype', coltype
print 'num_of_shots', num_of_shots
chip_prog_dict = get_chip_prog_values(xstart, ystart, xblocks, yblocks, coltype, block_id, num_of_shots)
print 'Moving to Start'
caput(pv.me14e_pmac_str, '!x%sy%sz0' %(xstart, ystart))
sleep(1.5)
load_motion_program_data(chip_prog_dict, map_type)
print 'Killing Camera'
#caput('ME14E-DI-CAM-01:CAM:Acquire', 'Done')
#caput('ME14E-DI-CAM-03:CAM:Acquire', 'Done')
#sleep(0.2)
caput(pv.me14e_pmac_str, '&2b11r')
endtime = time.ctime()
print 3*'\n'
print 'Summary'
print 'Chip name:', chipname
print 'Protein name:', proteinname
print 'Start time:', starttime
print 'End time:', endtime
#caput(pv.me14e_pmac_str, '!x0y0z0')
print 3*'\n'
elif map_type == '2':
path_dict = make_path_dict()
print path_dict.keys()
print path_key + '_classic'
print path_dict[path_key + '_classic']
xstart, ystart, xblocks, yblocks, coltype, path = path_dict[path_key + '_classic']
# What is block id? Is it used by the motion program?
block_id = str(string.uppercase.index(path[0][0])+1) + path[0][1]
print '\n\nChip name is', chipname
print 'visit_id', visit_id
print 'proteinname', proteinname
print 'path_key:', path_key
print 'path:', path
print 'xstart', xstart
print 'ystart', ystart
print 'Block ID', block_id
print 'coltype', coltype
print 'num_of_shots', num_of_shots
chip_prog_dict = get_chip_prog_values(xstart, ystart, xblocks, yblocks, coltype, block_id, num_of_shots)
print 'Moving to Start'
caput(pv.me14e_pmac_str, '!x%sy%sz0' %(xstart, ystart))
sleep(1.5)
load_motion_program_data(chip_prog_dict, map_type)
print 'Killing Camera'
#caput('ME14E-DI-CAM-01:CAM:Acquire', 'Done')
#caput('ME14E-DI-CAM-03:CAM:Acquire', 'Done')
#sleep(0.2)
caput(pv.me14e_pmac_str, '&2b12r')
endtime = time.ctime()
print 3*'\n'
print 'Summary'
print 'Chip name:', chipname
print 'Protein name:', proteinname
print 'Start time:', starttime
print 'End time:', endtime
#caput(pv.me14e_pmac_str, '!x0y0z0')
print 3*'\n'
elif map_type == '3':
caput(pv.me14e_pmac_str, '&2b13r')
else:
print 'Unknown map_type'
if __name__ == "__main__":
main()
# ---------------------------------------------------------------------------
# inventory/admin.py (repo: mugagambi/retail-system, MIT license)
# ---------------------------------------------------------------------------
from django.contrib import admin
from inventory import models
class ItemAdmin(admin.ModelAdmin):
list_display = ('name', 'total_units', 'remaining_units', 'unit_price', 'description', 'account')
list_select_related = True
# Register your models here.
admin.site.register(models.Item, ItemAdmin)
# ---------------------------------------------------------------------------
# startcbv/templates/startcbv/urls.py
# (repo: audreyfeldroy/django-startcbv, MIT license)
# ---------------------------------------------------------------------------
from django.conf.urls.defaults import *
from django.views.generic import DetailView, ListView
from {{ app_name }}.models import {{ model_name }}
urlpatterns = patterns('{{ app_name }}.views',
url(regex=r'^$',
view=ListView.as_view(
queryset={{ model_name }}.objects.order_by('-pub_date'),
context_object_name='latest_{{ model_name.lower }}_list',
template_name='{{ app_name }}/{{ model_name.lower }}_list.html'),
name='{{ model_name.lower }}_list',
),
url(regex=r'^(?P<slug>[-\w]+)/$',
view=DetailView.as_view(
model={{ model_name }},
template_name='{{ app_name }}/{{ model_name.lower }}_detail.html'),
name='{{ model_name.lower }}_detail',
),
)
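The `{{ … }}` placeholders are filled in by the startcbv generator at template-render time. A toy string substitution shows what one line expands to for a hypothetical app "polls" with model "Poll" (names are illustrative):

```python
line = "name='{{ model_name.lower }}_list',"
context = {
    "{{ app_name }}": "polls",
    "{{ model_name }}": "Poll",
    "{{ model_name.lower }}": "poll",
}
# Naive placeholder substitution, standing in for Django's template engine
for placeholder, value in context.items():
    line = line.replace(placeholder, value)
print(line)  # name='poll_list',
```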
| 37.75 | 79 | 0.598675 | 89 | 755 | 4.786517 | 0.426966 | 0.169014 | 0.164319 | 0.169014 | 0.305164 | 0.15493 | 0.15493 | 0 | 0 | 0 | 0 | 0 | 0.215894 | 755 | 19 | 80 | 39.736842 | 0.719595 | 0 | 0 | 0.111111 | 0 | 0 | 0.312583 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.166667 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
731fa63ccfa67e69478fd1db5327150337fbbe9d | 976 | py | Python | core/db_connection.py | RelativeProgramming/postgres-retrofit | f6e67288d6a6f9e07f581c2cbc77dd69ab946817 | [
"MIT"
] | null | null | null | core/db_connection.py | RelativeProgramming/postgres-retrofit | f6e67288d6a6f9e07f581c2cbc77dd69ab946817 | [
"MIT"
] | null | null | null | core/db_connection.py | RelativeProgramming/postgres-retrofit | f6e67288d6a6f9e07f581c2cbc77dd69ab946817 | [
"MIT"
] | null | null | null | import json
import os

import psycopg2

DB_CONFIG_PATH = os.path.dirname(__file__) + '/../config/db_config.json'


def get_db_config(path=DB_CONFIG_PATH):
    f_db_config = open(path, 'r')
    db_config = json.loads(f_db_config.read())
    f_db_config.close()
    return db_config


def get_bytea(value):
    return psycopg2.Binary(value)


def create_connection(db_config):
    con = None
    cur = None
    # create db connection
    try:
        con = psycopg2.connect(
            "dbname='" + db_config['db_name'] + "' user='" +
            db_config['username'] + "' host='" + db_config['host'] +
            "' password='" + db_config['password'] + "'")
    except:
        print('ERROR: Can not connect to database')
        print("dbname='" + db_config['db_name'] + "' user='" +
              db_config['username'] + "' host='" + db_config['host'] +
              "' password='" + db_config['password'] + "'")
        return
    cur = con.cursor()
    return con, cur
| 27.111111 | 72 | 0.586066 | 120 | 976 | 4.5 | 0.35 | 0.266667 | 0.066667 | 0.059259 | 0.296296 | 0.296296 | 0.296296 | 0.296296 | 0.296296 | 0.296296 | 0 | 0.004121 | 0.254098 | 976 | 35 | 73 | 27.885714 | 0.737637 | 0.020492 | 0 | 0.148148 | 0 | 0 | 0.197065 | 0.026205 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0.074074 | 0.111111 | 0.037037 | 0.37037 | 0.074074 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
732140286a41f14b791db10a3c6ebc3beceaf85d | 391 | py | Python | napari/_qt/menus/window_menu.py | sandutsar/napari | 37d476bc0b00252177f17f25e7d1fd52ddc4bb69 | [
"BSD-3-Clause"
] | 2 | 2020-06-18T20:15:41.000Z | 2021-08-11T02:10:58.000Z | napari/_qt/menus/window_menu.py | sandutsar/napari | 37d476bc0b00252177f17f25e7d1fd52ddc4bb69 | [
"BSD-3-Clause"
] | 7 | 2020-04-11T03:37:54.000Z | 2021-01-31T22:41:35.000Z | napari/_qt/menus/window_menu.py | DragaDoncila/napari | 044beba342ef392f4cbed2e8e3a27f27d4799ccb | [
"BSD-3-Clause"
] | 3 | 2020-08-29T21:07:38.000Z | 2022-01-10T15:36:16.000Z | from typing import TYPE_CHECKING

from qtpy.QtWidgets import QMenu

from ...utils.translations import trans
from ._util import populate_menu

if TYPE_CHECKING:
    from ..qt_main_window import Window


class WindowMenu(QMenu):
    def __init__(self, window: 'Window'):
        super().__init__(trans._('&Window'), window._qt_window)
        ACTIONS = []
        populate_menu(self, ACTIONS)
| 23 | 63 | 0.71867 | 49 | 391 | 5.367347 | 0.510204 | 0.091255 | 0.121673 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.184143 | 391 | 16 | 64 | 24.4375 | 0.824451 | 0 | 0 | 0 | 0 | 0 | 0.033248 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.454545 | 0 | 0.636364 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
73225bc0a64c76b6032770b03dae6f3ea52cfb16 | 653 | py | Python | Taller_Estructuras_de_Control_Secuenciales/Ejercicio_07.py | LeonardoJimenezUbaque/Algoritmos_y_Programacion_C4_G2 | 7bb6fffa7d5d99ac2b5c0a3724f97a84e145bbb7 | [
"MIT"
] | null | null | null | Taller_Estructuras_de_Control_Secuenciales/Ejercicio_07.py | LeonardoJimenezUbaque/Algoritmos_y_Programacion_C4_G2 | 7bb6fffa7d5d99ac2b5c0a3724f97a84e145bbb7 | [
"MIT"
] | null | null | null | Taller_Estructuras_de_Control_Secuenciales/Ejercicio_07.py | LeonardoJimenezUbaque/Algoritmos_y_Programacion_C4_G2 | 7bb6fffa7d5d99ac2b5c0a3724f97a84e145bbb7 | [
"MIT"
] | null | null | null | """
Exercise 07

Given a quantity in meters, convert it to feet and inches, considering the following:
1 meter = 39.27 inches; 1 foot = 12 inches.

Inputs
    Meters --> Float --> M

Outputs
    Feet --> Float --> P_I
    Inches --> Float --> P_U
"""

# Instructions for the user
print("To find out how many feet and inches your meters are equivalent to, enter the following: ")

# Inputs
M = float(input("Enter the number of meters: "))

# Black box
P_I = M*3.281
P_U = M*39.27

# Outputs
print(f"Your quantity of meters in feet is equal to: {P_I} feet")
print(f"Your quantity of meters in inches is equal to: {P_U} inches")
| 28.391304 | 114 | 0.713629 | 114 | 653 | 4.035088 | 0.473684 | 0.086957 | 0.104348 | 0.069565 | 0.113043 | 0.113043 | 0.113043 | 0 | 0 | 0 | 0 | 0.033582 | 0.179173 | 653 | 22 | 115 | 29.681818 | 0.824627 | 0.471669 | 0 | 0 | 0 | 0.166667 | 0.742515 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
7328af9439c5736f563cdd611bc972da86333670 | 312 | py | Python | Misc/helpers.py | ManuelFre/OEC-BB | 8796e8313af82497b6aa2d988d2dbf65649acfa2 | [
"MIT"
] | null | null | null | Misc/helpers.py | ManuelFre/OEC-BB | 8796e8313af82497b6aa2d988d2dbf65649acfa2 | [
"MIT"
] | null | null | null | Misc/helpers.py | ManuelFre/OEC-BB | 8796e8313af82497b6aa2d988d2dbf65649acfa2 | [
"MIT"
] | null | null | null | from Config.settings import DEBUG_PRINT


def debug_print(msg):
    """ Prints messages only if DEBUG_PRINT is set to True in the settings.

    This function is used instead of print throughout the entire app.

    Args:
        msg (str/int/obj): message to print
    """
    if DEBUG_PRINT:
        print(msg)
| 24 | 75 | 0.679487 | 47 | 312 | 4.425532 | 0.659574 | 0.192308 | 0.115385 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.253205 | 312 | 12 | 76 | 26 | 0.892704 | 0.576923 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.5 | 0.5 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
7329f83ecd476b7be1c9c68dfae5ce905b2d3db9 | 950 | py | Python | honeybot/plugins/bitcoin.py | marceloyb/honeybot | b2b92af54d01228ec150185eaa08a4baf55f1c88 | [
"MIT"
] | null | null | null | honeybot/plugins/bitcoin.py | marceloyb/honeybot | b2b92af54d01228ec150185eaa08a4baf55f1c88 | [
"MIT"
] | null | null | null | honeybot/plugins/bitcoin.py | marceloyb/honeybot | b2b92af54d01228ec150185eaa08a4baf55f1c88 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
[bitcoin.py]
Bitcoin Price Checking Plugin

[Author]
Gabriele Ron

[Website]
https://Macr0Nerd.github.io

[About]
Checks the current price for Bitcoin through the Legacy Coin Market Cap API
TODO: Update API to new Coin Market Cap API

[Commands]
>>> .btc
    returns current value of bitcoin
"""
import requests


class Plugin:
    def __init__(self):
        self.api_url = "https://api.coinmarketcap.com/v1/ticker/bitcoin/"

    def run(self, incoming, methods, info):
        try:
            # msgs = info['args'][1:][0].split()
            if info['command'] == 'PRIVMSG' and info['args'][1] == '.btc':
                response = requests.get(self.api_url)
                response_json = response.json()
                methods['send'](info['address'], "$" + response_json[0]['price_usd'])
        except Exception as e:
            print('woops plugin', __file__, e)
| 24.358974 | 91 | 0.613684 | 119 | 950 | 4.798319 | 0.630252 | 0.063047 | 0.045534 | 0.056042 | 0.136602 | 0.136602 | 0.136602 | 0 | 0 | 0 | 0 | 0.01105 | 0.237895 | 950 | 38 | 92 | 25 | 0.777624 | 0.362105 | 0 | 0 | 0 | 0 | 0.252931 | 0 | 0 | 0 | 0 | 0.026316 | 0 | 1 | 0.153846 | false | 0.076923 | 0.076923 | 0 | 0.307692 | 0.076923 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
732c55d8dd6a041ee9789daaf23e7e21398f768f | 125 | py | Python | 2/functions.py | Gregorita12/training2019 | 50e8637b43c5d33ed41ebe8f5650dd49ff8c4171 | [
"MIT"
] | 4 | 2019-11-05T18:12:48.000Z | 2019-12-17T14:14:24.000Z | 2/functions.py | Gregorita12/training2019 | 50e8637b43c5d33ed41ebe8f5650dd49ff8c4171 | [
"MIT"
] | 1 | 2019-12-15T13:15:17.000Z | 2019-12-15T13:15:17.000Z | 2/functions.py | Gregorita12/training2019 | 50e8637b43c5d33ed41ebe8f5650dd49ff8c4171 | [
"MIT"
] | 3 | 2019-12-15T12:10:17.000Z | 2019-12-17T14:24:22.000Z | def is_prime(a):
    if a % 2 == 0:
        print("Brawo!!")
        return ":)"
    return ":("


x = is_prime(4)
print(x)
| 11.363636 | 24 | 0.44 | 18 | 125 | 2.944444 | 0.666667 | 0.264151 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.037037 | 0.352 | 125 | 10 | 25 | 12.5 | 0.617284 | 0 | 0 | 0 | 0 | 0 | 0.08871 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0 | 0 | 0.428571 | 0.285714 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
732d9507a98d31fa7042f28459ab1202cabd4be0 | 2,807 | py | Python | app/code/data_quality_check.py | tanviwagh/Movies_Analysis | 6ef9774988535031e8a63d890e47bf530a9bafec | [
"MIT"
] | null | null | null | app/code/data_quality_check.py | tanviwagh/Movies_Analysis | 6ef9774988535031e8a63d890e47bf530a9bafec | [
"MIT"
] | 34 | 2022-01-23T11:59:15.000Z | 2022-03-29T12:02:53.000Z | app/code/data_quality_check.py | tanviwagh/Movies_Analysis | 6ef9774988535031e8a63d890e47bf530a9bafec | [
"MIT"
] | 1 | 2022-03-18T15:52:23.000Z | 2022-03-18T15:52:23.000Z | from pyspark.sql.functions import col


def process(spark, config):
    db_name = config['athena']['db_name']
    movie_tbl_name = config['athena']['movie_tbl_name']
    genre_tbl_name = config['athena']['genre_tbl_name']
    artist_tbl_name = config['athena']['artist_tbl_name']
    music_tbl_name = config['athena']['music_tbl_name']
    director_tbl_name = config['athena']['director_tbl_name']
    producer_tbl_name = config['athena']['producer_tbl_name']
    writer_tbl_name = config['athena']['writer_tbl_name']

    table_exists_check(spark, db_name, movie_tbl_name)
    table_exists_check(spark, db_name, genre_tbl_name)
    table_exists_check(spark, db_name, artist_tbl_name)
    table_exists_check(spark, db_name, music_tbl_name)
    table_exists_check(spark, db_name, director_tbl_name)
    table_exists_check(spark, db_name, producer_tbl_name)
    table_exists_check(spark, db_name, writer_tbl_name)

    row_count_check(spark, db_name, movie_tbl_name)
    row_count_check(spark, db_name, genre_tbl_name)
    row_count_check(spark, db_name, artist_tbl_name)
    row_count_check(spark, db_name, music_tbl_name)
    row_count_check(spark, db_name, director_tbl_name)
    row_count_check(spark, db_name, producer_tbl_name)
    row_count_check(spark, db_name, writer_tbl_name)

    non_null_check(spark, db_name, movie_tbl_name)
    non_null_check(spark, db_name, genre_tbl_name)
    non_null_check(spark, db_name, artist_tbl_name)
    non_null_check(spark, db_name, music_tbl_name)
    non_null_check(spark, db_name, director_tbl_name)
    non_null_check(spark, db_name, producer_tbl_name)
    non_null_check(spark, db_name, writer_tbl_name)


def table_exists_check(spark, db_name, tbl_name):
    if spark._jsparkSession.catalog().tableExists(db_name, tbl_name):
        print("Table exists")
    else:
        raise Exception("Table {db_name}.{tbl_name} does not exist".format(db_name=db_name, tbl_name=tbl_name))


def row_count_check(spark, db_name, tbl_name):
    SQL = """ SELECT COUNT(*) FROM {db_name}.{tbl_name} """.format(db_name=db_name, tbl_name=tbl_name)
    count_df = spark.sql(SQL)
    if count_df.collect()[0][0] > 0:
        print("Row count check successful")
    else:
        raise Exception("Row count check has failed for {db_name}.{tbl_name}".format(db_name=db_name, tbl_name=tbl_name))


def non_null_check(spark, db_name, tbl_name):
    SQL = """ SELECT imdbID FROM {db_name}.{tbl_name} """.format(db_name=db_name, tbl_name=tbl_name)
    data_df = spark.sql(SQL)
    filter_cond = col('imdbID').isNull()
    filtered_df = data_df.filter(filter_cond)
    if filtered_df.count() == 0:
        print("Non-NULL check successful")
    else:
        raise Exception("Non-NULL check has failed for {db_name}.{tbl_name}".format(db_name=db_name, tbl_name=tbl_name))
| 36.454545 | 121 | 0.735305 | 442 | 2,807 | 4.244344 | 0.122172 | 0.201493 | 0.153518 | 0.204691 | 0.670576 | 0.6258 | 0.59968 | 0.527186 | 0.138593 | 0.138593 | 0 | 0.001676 | 0.149982 | 2,807 | 76 | 122 | 36.934211 | 0.784577 | 0 | 0 | 0.057692 | 0 | 0 | 0.162567 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0.019231 | 0 | 0.096154 | 0.057692 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
733258d8d95497e1dfed6c7c8959c5b04aca2b76 | 5,973 | py | Python | pylint_plugins/py2exe_checker.py | zkanda/scalyr-agent-2 | 81bde110337b5e7dc5045f26097b721e1ff4ebae | [
"Apache-2.0"
] | 67 | 2015-02-03T00:35:33.000Z | 2022-03-23T10:14:26.000Z | pylint_plugins/py2exe_checker.py | kdelph23/scalyr-agent-2 | 6b975db59367d271eeba6a614ac40c7cb4205c41 | [
"Apache-2.0"
] | 578 | 2015-04-09T08:58:56.000Z | 2022-03-30T12:13:21.000Z | pylint_plugins/py2exe_checker.py | kdelph23/scalyr-agent-2 | 6b975db59367d271eeba6a614ac40c7cb4205c41 | [
"Apache-2.0"
] | 58 | 2015-01-15T22:00:43.000Z | 2022-02-18T15:48:31.000Z | # Copyright 2020 Scalyr Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""
pylint checker plugin used to enforce invariants necessary for `py2exe` to work
correctly (`py2exe` is the tool used to generate the Windows binaries). These
invariants are difficult to enforce in other, more traditional ways. See below for
more specifics about the checker and the invariants it enforces.
"""

from __future__ import absolute_import

if False:  # NOSONAR
    from typing import Any
    from typing import Union

    from astroid import node_classes, scoped_nodes

from pylint.checkers import BaseChecker
from pylint.interfaces import IAstroidChecker

from setup import WINDOWS_MONITOR_MODULES_TO_INCLUDE, WINDOWS_PY2_SIX_MOVES_IMPORTS


class Py2ExeChecker(BaseChecker):
    """Enforces two key invariants for py2exe to work:

    1. Any module imported using `six.moves` is listed in the `WINDOWS_PY2_SIX_MOVES_IMPORTS` dict.
       This is important because `py2exe` cannot correctly infer dependencies from modules imported in this
       manner. This would result in the necessary modules not being included in the Windows binary and thus causing
       it to fail when run. Instead, we use the `WINDOWS_PY2_SIX_MOVES_IMPORTS` dict to give `py2exe` a manual list
       of modules to include.

    2. Any monitor that should be included in the Windows binary is listed in the `WINDOWS_MONITOR_MODULES_TO_INCLUDE`
       set. This is important because monitors are loaded dynamically at runtime and are, therefore, not included
       by default by `py2exe` through its static dependency analysis. This would result in the monitors not being
       available in the Windows binary. Instead, we use this list to give `py2exe` a manual list of modules to include.
    """

    __implements__ = IAstroidChecker

    name = "scalyr-py2exe-checker"

    SIX_MOVES_IMPORT_NOT_INCLUDED_FOR_WIN32 = "six-moves-import-not-included-for-win32"
    MONITOR_NOT_INCLUDED_FOR_WIN32 = "monitor-not-included-for-win32"

    msgs = {
        "E8101": (
            'Found six.moves import "%s" on line %d, but module is not included for py2exe. Windows binary '
            "will not properly include all dependencies for the module. Please edit WINDOWS_PY2_SIX_MOVES_IMPORTS to "
            "add the module and dependencies.",
            SIX_MOVES_IMPORT_NOT_INCLUDED_FOR_WIN32,
            "",
        ),
        "E8102": (
            'Found definition for monitor "%s" in module "%s", but that module is not listed in '
            "WINDOWS_MONITOR_MODULES_TO_INCLUDE. This monitor will not be included in the Windows binary. Add the "
            "monitor's module to the list in platform_windows.py to fix.",
            MONITOR_NOT_INCLUDED_FOR_WIN32,
            "",
        ),
    }
    options = ()

    priority = -1

    def __init__(self, linter):
        # type: (Any) -> None
        super(Py2ExeChecker, self).__init__(linter)

        # Keeps track of the current module being visited.
        self.__current_module = None

    def visit_module(self, node):
        # type: (scoped_nodes.Module) -> None
        self.__current_module = node.name

    def leave_module(self, _node):
        # type: (scoped_nodes.Module) -> None
        self.__current_module = None

    def visit_import(self, node):
        # type: (node_classes.Import) -> None
        # Invoked when visiting a line like `import x,y,z`. The modules are in a list in `node.names`.
        if self.__is_scalyr_agent_module:
            for import_fragment in node.names:
                self.__verify_six_import_in_whitelist(import_fragment[0], node)

    def visit_importfrom(self, node):
        # type: (node_classes.ImportFrom) -> None
        # Invoked when visiting a line like `from module import x,y,z`. The module name is in
        # node.modname and the actual imports are in a list in node.names.
        if self.__is_scalyr_agent_module:
            module_name = node.modname
            for import_fragment in node.names:
                self.__verify_six_import_in_whitelist(
                    "%s.%s" % (module_name, import_fragment[0]), node
                )

    def visit_classdef(self, node):
        # type: (node_classes.ClassDef) -> None
        class_name = node.name

        if (
            "ScalyrMonitor" in node.basenames
            and self.__current_module not in WINDOWS_MONITOR_MODULES_TO_INCLUDE
        ):
            args = (class_name, self.__current_module)
            self.add_message(self.MONITOR_NOT_INCLUDED_FOR_WIN32, node=node, args=args)

    @property
    def __is_scalyr_agent_module(self):
        # type: () -> bool
        return self.__current_module is not None and self.__current_module.startswith(
            "scalyr_agent"
        )

    def __verify_six_import_in_whitelist(self, import_name, import_node):
        # type: (str, Union[node_classes.Import,node_classes.ImportFrom]) -> None
        """
        Verify any imports from `six.moves` are in the known dict list.
        """
        if (
            import_name.startswith("six.moves")
            and import_name not in WINDOWS_PY2_SIX_MOVES_IMPORTS
        ):
            args = (import_name, import_node.lineno)
            self.add_message(
                self.SIX_MOVES_IMPORT_NOT_INCLUDED_FOR_WIN32,
                node=import_node,
                args=args,
            )


def register(linter):
    # type: (Any) -> None
    linter.register_checker(Py2ExeChecker(linter))
| 40.910959 | 119 | 0.681232 | 800 | 5,973 | 4.87125 | 0.2825 | 0.026687 | 0.032333 | 0.039004 | 0.335643 | 0.23223 | 0.177059 | 0.100334 | 0.100334 | 0.082114 | 0 | 0.012029 | 0.248451 | 5,973 | 145 | 120 | 41.193103 | 0.856093 | 0.415871 | 0 | 0.186667 | 0 | 0.013333 | 0.181926 | 0.045481 | 0 | 0 | 0 | 0 | 0 | 1 | 0.12 | false | 0 | 0.32 | 0.013333 | 0.56 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
733a2c802b0b4e1091569c0a77e77cbb8f89e0ec | 1,259 | py | Python | PipeCatalogue.py | DrTol/DynamicPipeModel_Python | 83974ecfc0de17da73903ce2003789c6607665c2 | [
"CC0-1.0"
] | 1 | 2020-11-20T13:48:43.000Z | 2020-11-20T13:48:43.000Z | PipeCatalogue.py | DrTol/DynamicPipeModel_Python | 83974ecfc0de17da73903ce2003789c6607665c2 | [
"CC0-1.0"
] | null | null | null | PipeCatalogue.py | DrTol/DynamicPipeModel_Python | 83974ecfc0de17da73903ce2003789c6607665c2 | [
"CC0-1.0"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Pipe Catalogue Data - Single Steel Pipe by LOGSTOR

Created on Mon Nov 2 20:14:25 2020

@author: Hakan İbrahim Tol, PhD

References:
    [1] LOGSTOR, Product Catalogue Version 2018.12.
        https://www.logstor.com/media/6115/product-catalogue-uk-201812.pdf
"""


def LayerDiameters(DN, IS):
    # DN: Nominal pipe diameter
    # IS: Insulation series
    DN_l = [20, 25, 32, 40, 50, 65, 80, 100, 125]

    if DN not in DN_l:
        raise TypeError("Nominal Pipe Diameter can be:", DN_l)

    d1_l = [21.7, 28.5, 37.2, 43.1, 54.5, 70.3, 82.5, 107.1, 132.5]
    d2_l = [26.9, 33.7, 42.4, 48.3, 60.3, 76.1, 88.9, 114.3, 139.7]

    if IS == 1:
        d3_l = [84, 84, 104, 104, 119, 134, 154, 193.6, 218.2]
        d4_l = [90, 90, 110, 110, 125, 140, 160, 200, 225]
    elif IS == 2:
        d3_l = [104, 104, 119, 119, 134, 154, 174, 218.2, 242.8]
        d4_l = [110, 110, 125, 125, 140, 160, 180, 225, 250]
    elif IS == 3:
        d3_l = [119, 119, 134, 134, 154, 174, 193.6, 242.8, 272.2]
        d4_l = [125, 125, 140, 140, 160, 180, 200, 250, 280]
    else:
        raise TypeError("Insulation Series (IS) can be one of (poor) 1, 2, or 3 (good)")

    ind = DN_l.index(DN)

    return d1_l[ind]*0.001, d2_l[ind]*0.001, d3_l[ind]*0.001, d4_l[ind]*0.001
| 29.27907 | 89 | 0.58618 | 247 | 1,259 | 2.927126 | 0.493927 | 0.016598 | 0.027663 | 0.04426 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.318418 | 0.236696 | 1,259 | 42 | 90 | 29.97619 | 0.432882 | 0.254964 | 0 | 0 | 0 | 0.052632 | 0.101695 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.052632 | false | 0 | 0 | 0 | 0.105263 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |