hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0afe66b01b034a4b81b4c66d871904305aefc746 | 7,376 | py | Python | src/bindings/python/src/compatibility/ngraph/__init__.py | artkuli/openvino | eb2fb5bf7df36ae55e3251816999b801ce053335 | [
"Apache-2.0"
] | 1,127 | 2018-10-15T14:36:58.000Z | 2020-04-20T09:29:44.000Z | src/bindings/python/src/compatibility/ngraph/__init__.py | artkuli/openvino | eb2fb5bf7df36ae55e3251816999b801ce053335 | [
"Apache-2.0"
] | 439 | 2018-10-20T04:40:35.000Z | 2020-04-19T05:56:25.000Z | src/bindings/python/src/compatibility/ngraph/__init__.py | tuxedcat/openvino | 5939cb1b363ebb56b73c2ad95d8899961a084677 | [
"Apache-2.0"
] | 414 | 2018-10-17T05:53:46.000Z | 2020-04-16T17:29:53.000Z | # Copyright (C) 2018-2022 Intel Corporation
# SPDX-License-Identifier: Apache-2.0
"""ngraph module namespace, exposing factory functions for all ops and other classes."""
# noqa: F401
try:
from ngraph.impl import util
__version__ = util.get_ngraph_version_string()
except ImportError:
__version__ = "0.0.0.dev0"
from ngraph.impl import Dimension
from ngraph.impl import Function
from ngraph.impl import Node
from ngraph.impl import PartialShape
from ngraph.helpers import function_from_cnn
from ngraph.helpers import function_to_cnn
from ngraph.opset9 import absolute
from ngraph.opset9 import absolute as abs
from ngraph.opset9 import acos
from ngraph.opset9 import acosh
from ngraph.opset9 import adaptive_avg_pool
from ngraph.opset9 import adaptive_max_pool
from ngraph.opset9 import add
from ngraph.opset9 import asin
from ngraph.opset9 import asinh
from ngraph.opset9 import assign
from ngraph.opset9 import atan
from ngraph.opset9 import atanh
from ngraph.opset9 import avg_pool
from ngraph.opset9 import batch_norm_inference
from ngraph.opset9 import batch_to_space
from ngraph.opset9 import binary_convolution
from ngraph.opset9 import broadcast
from ngraph.opset9 import bucketize
from ngraph.opset9 import ceiling
from ngraph.opset9 import ceiling as ceil
from ngraph.opset9 import clamp
from ngraph.opset9 import concat
from ngraph.opset9 import constant
from ngraph.opset9 import convert
from ngraph.opset9 import convert_like
from ngraph.opset9 import convolution
from ngraph.opset9 import convolution_backprop_data
from ngraph.opset9 import cos
from ngraph.opset9 import cosh
from ngraph.opset9 import ctc_greedy_decoder
from ngraph.opset9 import ctc_greedy_decoder_seq_len
from ngraph.opset9 import ctc_loss
from ngraph.opset9 import cum_sum
from ngraph.opset9 import cum_sum as cumsum
from ngraph.opset9 import deformable_convolution
from ngraph.opset9 import deformable_psroi_pooling
from ngraph.opset9 import depth_to_space
from ngraph.opset9 import detection_output
from ngraph.opset9 import dft
from ngraph.opset9 import divide
from ngraph.opset9 import einsum
from ngraph.opset9 import elu
from ngraph.opset9 import embedding_bag_offsets_sum
from ngraph.opset9 import embedding_bag_packed_sum
from ngraph.opset9 import embedding_segments_sum
from ngraph.opset9 import extract_image_patches
from ngraph.opset9 import equal
from ngraph.opset9 import erf
from ngraph.opset9 import exp
from ngraph.opset9 import eye
from ngraph.opset9 import fake_quantize
from ngraph.opset9 import floor
from ngraph.opset9 import floor_mod
from ngraph.opset9 import gather
from ngraph.opset9 import gather_elements
from ngraph.opset9 import gather_nd
from ngraph.opset9 import gather_tree
from ngraph.opset9 import gelu
from ngraph.opset9 import generate_proposals
from ngraph.opset9 import greater
from ngraph.opset9 import greater_equal
from ngraph.opset9 import grid_sample
from ngraph.opset9 import grn
from ngraph.opset9 import group_convolution
from ngraph.opset9 import group_convolution_backprop_data
from ngraph.opset9 import gru_cell
from ngraph.opset9 import gru_sequence
from ngraph.opset9 import hard_sigmoid
from ngraph.opset9 import hsigmoid
from ngraph.opset9 import hswish
from ngraph.opset9 import idft
from ngraph.opset9 import if_op
from ngraph.opset9 import interpolate
from ngraph.opset9 import irdft
from ngraph.opset9 import i420_to_bgr
from ngraph.opset9 import i420_to_rgb
from ngraph.opset9 import less
from ngraph.opset9 import less_equal
from ngraph.opset9 import log
from ngraph.opset9 import logical_and
from ngraph.opset9 import logical_not
from ngraph.opset9 import logical_or
from ngraph.opset9 import logical_xor
from ngraph.opset9 import log_softmax
from ngraph.opset9 import loop
from ngraph.opset9 import lrn
from ngraph.opset9 import lstm_cell
from ngraph.opset9 import lstm_sequence
from ngraph.opset9 import matmul
from ngraph.opset9 import matrix_nms
from ngraph.opset9 import max_pool
from ngraph.opset9 import maximum
from ngraph.opset9 import minimum
from ngraph.opset9 import mish
from ngraph.opset9 import mod
from ngraph.opset9 import multiclass_nms
from ngraph.opset9 import multiply
from ngraph.opset9 import mvn
from ngraph.opset9 import negative
from ngraph.opset9 import non_max_suppression
from ngraph.opset9 import non_zero
from ngraph.opset9 import normalize_l2
from ngraph.opset9 import not_equal
from ngraph.opset9 import nv12_to_bgr
from ngraph.opset9 import nv12_to_rgb
from ngraph.opset9 import one_hot
from ngraph.opset9 import pad
from ngraph.opset9 import parameter
from ngraph.opset9 import power
from ngraph.opset9 import prelu
from ngraph.opset9 import prior_box
from ngraph.opset9 import prior_box_clustered
from ngraph.opset9 import psroi_pooling
from ngraph.opset9 import proposal
from ngraph.opset9 import random_uniform
from ngraph.opset9 import range
from ngraph.opset9 import rdft
from ngraph.opset9 import read_value
from ngraph.opset9 import reduce_l1
from ngraph.opset9 import reduce_l2
from ngraph.opset9 import reduce_logical_and
from ngraph.opset9 import reduce_logical_or
from ngraph.opset9 import reduce_max
from ngraph.opset9 import reduce_mean
from ngraph.opset9 import reduce_min
from ngraph.opset9 import reduce_prod
from ngraph.opset9 import reduce_sum
from ngraph.opset9 import region_yolo
from ngraph.opset9 import reorg_yolo
from ngraph.opset9 import relu
from ngraph.opset9 import reshape
from ngraph.opset9 import result
from ngraph.opset9 import reverse_sequence
from ngraph.opset9 import rnn_cell
from ngraph.opset9 import rnn_sequence
from ngraph.opset9 import roi_align
from ngraph.opset9 import roi_pooling
from ngraph.opset9 import roll
from ngraph.opset9 import round
from ngraph.opset9 import scatter_elements_update
from ngraph.opset9 import scatter_update
from ngraph.opset9 import select
from ngraph.opset9 import selu
from ngraph.opset9 import shape_of
from ngraph.opset9 import shuffle_channels
from ngraph.opset9 import sigmoid
from ngraph.opset9 import sign
from ngraph.opset9 import sin
from ngraph.opset9 import sinh
from ngraph.opset9 import slice
from ngraph.opset9 import softmax
from ngraph.opset9 import softplus
from ngraph.opset9 import softsign
from ngraph.opset9 import space_to_batch
from ngraph.opset9 import space_to_depth
from ngraph.opset9 import split
from ngraph.opset9 import sqrt
from ngraph.opset9 import squared_difference
from ngraph.opset9 import squeeze
from ngraph.opset9 import strided_slice
from ngraph.opset9 import subtract
from ngraph.opset9 import swish
from ngraph.opset9 import tan
from ngraph.opset9 import tanh
from ngraph.opset9 import tensor_iterator
from ngraph.opset9 import tile
from ngraph.opset9 import topk
from ngraph.opset9 import transpose
from ngraph.opset9 import unsqueeze
from ngraph.opset9 import variadic_split
# Extend Node class to support binary operators
Node.__add__ = add
Node.__sub__ = subtract
Node.__mul__ = multiply
Node.__div__ = divide
Node.__truediv__ = divide
Node.__radd__ = lambda left, right: add(right, left)
Node.__rsub__ = lambda left, right: subtract(right, left)
Node.__rmul__ = lambda left, right: multiply(right, left)
Node.__rdiv__ = lambda left, right: divide(right, left)
Node.__rtruediv__ = lambda left, right: divide(right, left)
Node.__eq__ = equal
Node.__ne__ = not_equal
Node.__lt__ = less
Node.__le__ = less_equal
Node.__gt__ = greater
Node.__ge__ = greater_equal
| 35.291866 | 88 | 0.849783 | 1,134 | 7,376 | 5.352734 | 0.223104 | 0.291598 | 0.448105 | 0.616145 | 0.503954 | 0.189127 | 0.03855 | 0 | 0 | 0 | 0 | 0.030604 | 0.114018 | 7,376 | 208 | 89 | 35.461538 | 0.89824 | 0.029555 | 0 | 0 | 0 | 0 | 0.001399 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.903553 | 0 | 0.903553 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
7c17b6b64cf011675a1b133308668c03a581a7f5 | 66 | py | Python | seq2seq/evaluator/__init__.py | mtran14/pytorch-seq2seq | 738059377eee9be07863e33f21c7d255139c44d6 | [
"Apache-2.0"
] | 1,491 | 2017-06-30T16:15:40.000Z | 2022-03-22T02:05:16.000Z | seq2seq/evaluator/__init__.py | mtran14/pytorch-seq2seq | 738059377eee9be07863e33f21c7d255139c44d6 | [
"Apache-2.0"
] | 128 | 2017-07-07T21:41:03.000Z | 2021-06-30T13:18:23.000Z | seq2seq/evaluator/__init__.py | mtran14/pytorch-seq2seq | 738059377eee9be07863e33f21c7d255139c44d6 | [
"Apache-2.0"
] | 434 | 2017-07-08T12:35:15.000Z | 2022-03-25T06:28:13.000Z | from .evaluator import Evaluator
from .predictor import Predictor
| 22 | 32 | 0.848485 | 8 | 66 | 7 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121212 | 66 | 2 | 33 | 33 | 0.965517 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
7c447398a86170ba7a510ff5d2aed5025b43f691 | 163 | py | Python | python/ql/test/library-tests/exprs/strings/test.py | vadi2/codeql | a806a4f08696d241ab295a286999251b56a6860c | [
"MIT"
] | 4,036 | 2020-04-29T00:09:57.000Z | 2022-03-31T14:16:38.000Z | python/ql/test/library-tests/exprs/strings/test.py | vadi2/codeql | a806a4f08696d241ab295a286999251b56a6860c | [
"MIT"
] | 2,970 | 2020-04-28T17:24:18.000Z | 2022-03-31T22:40:46.000Z | python/ql/test/library-tests/exprs/strings/test.py | ScriptBox99/github-codeql | 2ecf0d3264db8fb4904b2056964da469372a235c | [
"MIT"
] | 794 | 2020-04-29T00:28:25.000Z | 2022-03-30T08:21:46.000Z |
#Half Surrogate pairs
u'[\uD800-\uDBFF][\uDC00-\uDFFF]'
#Outside BMP
u'[\U00010000-\U0010ffff]'
#Troublesome format strings
u'{}\r{}{:<{width}}'
u'{}\r{}{:<{}}'
| 16.3 | 33 | 0.619632 | 21 | 163 | 4.809524 | 0.809524 | 0.039604 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114094 | 0.08589 | 163 | 9 | 34 | 18.111111 | 0.563758 | 0.417178 | 0 | 0 | 0 | 0 | 0.803922 | 0.519608 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
7c735aabeeda74a86b13736b0280ca6df5652b2d | 6,008 | py | Python | test/dlc_tests/eks/mxnet/training/test_eks_mxnet_training.py | arjkesh/deep-learning-containers-1 | 9bf65197dee8a6f59b3d4ee240dcc11824240854 | [
"Apache-2.0"
] | 1 | 2021-12-17T15:50:48.000Z | 2021-12-17T15:50:48.000Z | test/dlc_tests/eks/mxnet/training/test_eks_mxnet_training.py | arjkesh/deep-learning-containers-1 | 9bf65197dee8a6f59b3d4ee240dcc11824240854 | [
"Apache-2.0"
] | null | null | null | test/dlc_tests/eks/mxnet/training/test_eks_mxnet_training.py | arjkesh/deep-learning-containers-1 | 9bf65197dee8a6f59b3d4ee240dcc11824240854 | [
"Apache-2.0"
] | null | null | null | import re
import os
import random
import test.test_utils.eks as eks_utils
from invoke import run
def test_eks_mxnet_single_node_training(mxnet_training):
"""
Function to create a pod using kubectl and given container image, and run MXNet training
Args:
:param mxnet_training: the ECR URI
"""
training_result = False
rand_int = random.randint(4001, 6000)
yaml_path = os.path.join(os.sep, "tmp", f"mxnet_single_node_training_{rand_int}.yaml")
pod_name = f"mxnet-single-node-training-{rand_int}"
args = (
"git clone https://github.com/apache/incubator-mxnet.git && python "
"/incubator-mxnet/example/image-classification/train_mnist.py"
)
processor_type = "gpu" if "gpu" in mxnet_training else "cpu"
args = args + " --gpus 0" if processor_type == "gpu" else args
# TODO: Change hardcoded value to read a mapping from the EKS cluster instance.
cpu_limit = 72
    cpu_limit = str(int(cpu_limit) // 2)
search_replace_dict = {
"<POD_NAME>": pod_name,
"<CONTAINER_NAME>": mxnet_training,
"<ARGS>": args,
"<CPU_LIMIT>": cpu_limit,
}
eks_utils.write_eks_yaml_file_from_template(
eks_utils.SINGLE_NODE_TRAINING_TEMPLATE_PATH, yaml_path, search_replace_dict
)
try:
run("kubectl create -f {}".format(yaml_path))
if eks_utils.is_eks_training_complete(pod_name):
mxnet_out = run("kubectl logs {}".format(pod_name)).stdout
if "Epoch[19] Validation-accuracy" in mxnet_out:
training_result = True
else:
eks_utils.LOGGER.info("**** training output ****")
eks_utils.LOGGER.debug(mxnet_out)
        assert training_result, "Training failed"
finally:
run("kubectl delete pods {}".format(pod_name))
def test_eks_mxnet_dgl_single_node_training(mxnet_training, py3_only):
"""
Function to create a pod using kubectl and given container image, and run
DGL training with MXNet backend
Args:
:param mxnet_training: the ECR URI
"""
training_result = False
rand_int = random.randint(4001, 6000)
yaml_path = os.path.join(os.sep, "tmp", f"mxnet_single_node_training_dgl_{rand_int}.yaml")
pod_name = f"mxnet-single-node-training-dgl-{rand_int}"
args = (
"git clone -b 0.4.x https://github.com/dmlc/dgl.git && "
"cd /dgl/examples/mxnet/gcn/ && DGLBACKEND=mxnet python train.py --dataset cora"
)
# TODO: Change hardcoded value to read a mapping from the EKS cluster instance.
cpu_limit = 72
    cpu_limit = str(int(cpu_limit) // 2)
if "gpu" in mxnet_training:
args = args + " --gpu 0"
else:
args = args + " --gpu -1"
search_replace_dict = {
"<POD_NAME>": pod_name,
"<CONTAINER_NAME>": mxnet_training,
"<ARGS>": args,
"<CPU_LIMIT>": cpu_limit,
}
eks_utils.write_eks_yaml_file_from_template(
eks_utils.SINGLE_NODE_TRAINING_TEMPLATE_PATH, yaml_path, search_replace_dict
)
try:
run("kubectl create -f {}".format(yaml_path))
if eks_utils.is_eks_training_complete(pod_name):
dgl_out = run("kubectl logs {}".format(pod_name)).stdout
if "Test accuracy" in dgl_out:
training_result = True
else:
eks_utils.LOGGER.info("**** training output ****")
eks_utils.LOGGER.debug(dgl_out)
        assert training_result, "Training failed"
finally:
run("kubectl delete pods {}".format(pod_name))
def test_eks_mxnet_gluonnlp_single_node_training(mxnet_training, py3_only):
"""
Function to create a pod using kubectl and given container image, and run
    GluonNLP training with MXNet backend
Args:
:param mxnet_training: the ECR URI
"""
training_result = False
rand_int = random.randint(4001, 6000)
yaml_path = os.path.join(os.sep, "tmp", f"mxnet_single_node_training_gluonnlp_{rand_int}.yaml")
pod_name = f"mxnet-single-node-training-gluonnlp-{rand_int}"
args = (
"git clone -b master https://github.com/dmlc/gluon-nlp.git && "
        "cd gluon-nlp && git checkout v0.9.0 && "
        "cd ./scripts/sentiment_analysis/ && "
"python sentiment_analysis_cnn.py --batch_size 50 --epochs 20 --dropout 0.5 "
"--model_mode multichannel --data_name TREC"
)
# TODO: Change hardcoded value to read a mapping from the EKS cluster instance.
cpu_limit = 72
    cpu_limit = str(int(cpu_limit) // 2)
if "gpu" in mxnet_training:
args = args + " --gpu 0"
search_replace_dict = {
"<POD_NAME>": pod_name,
"<CONTAINER_NAME>": mxnet_training,
"<ARGS>": args,
"<CPU_LIMIT>": cpu_limit,
}
eks_utils.write_eks_yaml_file_from_template(
eks_utils.SINGLE_NODE_TRAINING_TEMPLATE_PATH, yaml_path, search_replace_dict
)
try:
run("kubectl create -f {}".format(yaml_path))
if eks_utils.is_eks_training_complete(pod_name):
gluonnlp_out = run("kubectl logs {}".format(pod_name)).stdout
results = re.search(r"test acc ((?:\d*\.\d+)|\d+)", gluonnlp_out)
if results is not None:
accuracy = float(results.groups()[0])
if accuracy >= 0.75:
eks_utils.LOGGER.info(
"GluonNLP EKS test succeeded with accuracy {} >= 0.75".format(
accuracy
)
)
training_result = True
else:
eks_utils.LOGGER.info(
"GluonNLP EKS test FAILED with accuracy {} < 0.75".format(
accuracy
)
)
eks_utils.LOGGER.debug(gluonnlp_out)
        assert training_result, "Training failed"
finally:
run("kubectl delete pods {}".format(pod_name))
| 31.78836 | 99 | 0.614015 | 768 | 6,008 | 4.553385 | 0.203125 | 0.036031 | 0.061767 | 0.046039 | 0.781813 | 0.763798 | 0.739777 | 0.724049 | 0.692022 | 0.670289 | 0 | 0.014309 | 0.278795 | 6,008 | 188 | 100 | 31.957447 | 0.792753 | 0.111518 | 0 | 0.572581 | 0 | 0.016129 | 0.26569 | 0.076265 | 0 | 0 | 0 | 0.015957 | 0.024194 | 1 | 0.024194 | false | 0 | 0.040323 | 0 | 0.064516 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
7c8b84c092aa94842e685a57986f17b8d0e9054a | 89 | py | Python | django/goal/admin.py | reallinfo/lifelog | 0862555fc863003d6e347e38ffd44f48de73d31c | [
"Apache-2.0"
] | null | null | null | django/goal/admin.py | reallinfo/lifelog | 0862555fc863003d6e347e38ffd44f48de73d31c | [
"Apache-2.0"
] | 1 | 2018-10-11T17:06:01.000Z | 2018-10-14T00:39:07.000Z | django/goal/admin.py | reallinfo/lifelog | 0862555fc863003d6e347e38ffd44f48de73d31c | [
"Apache-2.0"
] | 1 | 2018-10-13T21:43:35.000Z | 2018-10-13T21:43:35.000Z | from django.contrib import admin
from goal.models import Goal
admin.site.register(Goal)
| 17.8 | 32 | 0.820225 | 14 | 89 | 5.214286 | 0.642857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.11236 | 89 | 4 | 33 | 22.25 | 0.924051 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
7cc2e121a2736951fc5039c3fb44133fd83c2763 | 246 | py | Python | src/oscar/apps/catalogue/signals.py | mcwladkoe/django-oscar | 13020bffa603a5b66198017ae2b76817f303e246 | [
"BSD-3-Clause"
] | null | null | null | src/oscar/apps/catalogue/signals.py | mcwladkoe/django-oscar | 13020bffa603a5b66198017ae2b76817f303e246 | [
"BSD-3-Clause"
] | null | null | null | src/oscar/apps/catalogue/signals.py | mcwladkoe/django-oscar | 13020bffa603a5b66198017ae2b76817f303e246 | [
"BSD-3-Clause"
] | null | null | null | import django.dispatch
product_viewed = django.dispatch.Signal(
providing_args=["product", "user", "request", "response"])
category_viewed = django.dispatch.Signal(
providing_args=["category", "context", "user", "request", "response"])
| 30.75 | 74 | 0.719512 | 26 | 246 | 6.653846 | 0.5 | 0.242775 | 0.231214 | 0.300578 | 0.450867 | 0.450867 | 0 | 0 | 0 | 0 | 0 | 0 | 0.109756 | 246 | 7 | 75 | 35.142857 | 0.789954 | 0 | 0 | 0 | 0 | 0 | 0.243902 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
7cd7b99dc8fc09c4a791c48f97b71d2e6dc37b1a | 127 | py | Python | slowapi/__init__.py | WardPearce/slowapi | 8f14398f229884b282d6cc4f6296cd75df0a82be | [
"MIT"
] | null | null | null | slowapi/__init__.py | WardPearce/slowapi | 8f14398f229884b282d6cc4f6296cd75df0a82be | [
"MIT"
] | null | null | null | slowapi/__init__.py | WardPearce/slowapi | 8f14398f229884b282d6cc4f6296cd75df0a82be | [
"MIT"
] | null | null | null | from .extension import Limiter, _rate_limit_exceeded_handler
__all__ = [
"Limiter",
"_rate_limit_exceeded_handler"
]
| 15.875 | 60 | 0.755906 | 14 | 127 | 6 | 0.642857 | 0.261905 | 0.380952 | 0.571429 | 0.738095 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.165354 | 127 | 7 | 61 | 18.142857 | 0.792453 | 0 | 0 | 0 | 0 | 0 | 0.275591 | 0.220472 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
6b053fb164d7dbe633ca6130b4ec3492506dadb1 | 9,308 | py | Python | alf/networks/containers_test.py | www2171668/alf | 6e3731fc559d3b4e6b5b9ed6251fff728a560d64 | [
"Apache-2.0"
] | null | null | null | alf/networks/containers_test.py | www2171668/alf | 6e3731fc559d3b4e6b5b9ed6251fff728a560d64 | [
"Apache-2.0"
] | null | null | null | alf/networks/containers_test.py | www2171668/alf | 6e3731fc559d3b4e6b5b9ed6251fff728a560d64 | [
"Apache-2.0"
] | null | null | null | # Copyright (c) 2021 Horizon Robotics and ALF Contributors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import torch
import alf
from alf.utils.spec_utils import is_same_spec
def _randn_from_spec(specs, batch_size):
return alf.nest.map_structure(
lambda spec: torch.randn(batch_size, *spec.shape), specs)
class ContainersTest(alf.test.TestCase):
def _verify_parameter_copy(self, src, copy):
"""net.copy() only copy the structure, not the values of parameters."""
for s, c in zip(src.parameters(), copy.parameters()):
if (s == 0).all():
self.assertTrue((c == 0).all())
else:
self.assertFalse((c == 0).all())
self.assertFalse((c == s).all())
def test_sequential1(self):
net = alf.nn.Sequential(
alf.layers.FC(4, 6), alf.nn.GRUCell(6, 8), alf.nn.GRUCell(8, 12))
self.assertEqual(net.input_tensor_spec, alf.TensorSpec((4, )))
self.assertTrue(
alf.utils.spec_utils.is_same_spec(
net.state_spec,
[(), alf.TensorSpec(
(8, )), alf.TensorSpec((12, ))]))
batch_size = 24
x = _randn_from_spec(net.input_tensor_spec, batch_size)
state = _randn_from_spec(net.state_spec, batch_size)
y, new_state = net(x, state)
x1 = net[0](x)
x2, s1 = net[1](x1, state[1])
x3, s2 = net[2](x2, state[2])
self.assertEqual(x3, y)
self.assertEqual((), new_state[0])
self.assertEqual(s1, new_state[1])
self.assertEqual(s2, new_state[2])
net_copy = net.copy()
self._verify_parameter_copy(net, net_copy)
def test_sequential_complex1(self):
net = alf.nn.Sequential(
alf.layers.FC(4, 6),
a=alf.nn.GRUCell(6, 8),
b=alf.nn.GRUCell(8, 12),
c=('a', alf.nn.GRUCell(8, 16)),
output=('b', 'c'))
self.assertEqual(net.input_tensor_spec, alf.TensorSpec((4, )))
self.assertTrue(
alf.utils.spec_utils.is_same_spec(net.state_spec,
[(),
alf.TensorSpec((8, )),
alf.TensorSpec((12, )),
alf.TensorSpec((16, ))]))
batch_size = 24
x = _randn_from_spec(net.input_tensor_spec, batch_size)
state = _randn_from_spec(net.state_spec, batch_size)
y, new_state = net(x, state)
x1 = net[0](x)
a, s1 = net[1](x1, state[1])
b, s2 = net[2](a, state[2])
c, s3 = net[3](a, state[3])
self.assertEqual(len(y), 2)
self.assertEqual(type(y), tuple)
self.assertEqual(b, y[0])
self.assertEqual(c, y[1])
self.assertEqual((), new_state[0])
self.assertEqual(s1, new_state[1])
self.assertEqual(s2, new_state[2])
self.assertEqual(s3, new_state[3])
def test_sequential2(self):
net = alf.nn.Sequential(
alf.layers.FC(4, 6), alf.layers.FC(6, 8), alf.layers.FC(8, 12))
self.assertEqual(net.input_tensor_spec, alf.TensorSpec((4, )))
self.assertEqual(net.state_spec, ())
batch_size = 24
x = _randn_from_spec(net.input_tensor_spec, batch_size)
y, new_state = net(x)
self.assertEqual(new_state, ())
x1 = net[0](x)
x2 = net[1](x1)
x3 = net[2](x2)
self.assertEqual(x3, y)
net_copy = net.copy()
self._verify_parameter_copy(net, net_copy)
def test_sequential_complex2(self):
net = alf.nn.Sequential(
alf.layers.FC(4, 6),
a=alf.layers.FC(6, 8),
b=alf.layers.FC(8, 12),
c=(('a', 'b'), alf.layers.NestConcat()))
self.assertEqual(net.input_tensor_spec, alf.TensorSpec((4, )))
self.assertEqual(net.state_spec, ())
batch_size = 24
x = _randn_from_spec(net.input_tensor_spec, batch_size)
y, new_state = net(x)
self.assertEqual(new_state, ())
x1 = net[0](x)
x2 = net[1](x1)
x3 = net[2](x2)
x4 = net[3]((x2, x3))
self.assertEqual(x4, y)
net_copy = net.copy()
self._verify_parameter_copy(net, net_copy)
def test_sequential_complex3(self):
net = alf.nn.Sequential(
alf.layers.FC(4, 6),
a=alf.layers.FC(6, 8),
b=alf.layers.FC(8, 8),
c=(('a', 'b'), lambda x: x[0] + x[1]))
self.assertEqual(net.input_tensor_spec, alf.TensorSpec((4, )))
self.assertEqual(net.state_spec, ())
batch_size = 24
x = _randn_from_spec(net.input_tensor_spec, batch_size)
y, new_state = net(x)
self.assertEqual(new_state, ())
x1 = net[0](x)
x2 = net[1](x1)
x3 = net[2](x2)
x4 = x2 + x3
        self.assertEqual(x4, y)
        net_copy = net.copy()
        self._verify_parameter_copy(net, net_copy)

    def test_parallel1(self):
        net = alf.nn.Parallel((alf.layers.FC(4, 6), alf.nn.GRUCell(6, 8),
                               alf.nn.GRUCell(8, 12)))
        self.assertTrue(
            is_same_spec(net.input_tensor_spec, (alf.TensorSpec(
                (4, )), alf.TensorSpec((6, )), alf.TensorSpec((8, )))))
        self.assertTrue(
            is_same_spec(net.state_spec, ((), alf.TensorSpec(
                (8, )), alf.TensorSpec((12, )))))
        batch_size = 24
        x = _randn_from_spec(net.input_tensor_spec, batch_size)
        state = _randn_from_spec(net.state_spec, batch_size)
        y, new_state = net(x, state)
        y0, state0 = net.networks[0](x[0])
        y1, state1 = net.networks[1](x[1], state[1])
        y2, state2 = net.networks[2](x[2], state[2])
        self.assertEqual(y[0], y0)
        self.assertEqual(y[1], y1)
        self.assertEqual(y[2], y2)
        self.assertEqual(new_state[0], state0)
        self.assertEqual(new_state[1], state1)
        self.assertEqual(new_state[2], state2)
        net_copy = net.copy()
        self._verify_parameter_copy(net, net_copy)

    def test_parallel2(self):
        net = alf.nn.Parallel((alf.layers.FC(4, 6), alf.layers.FC(6, 8),
                               alf.layers.FC(8, 12)))
        self.assertTrue(
            is_same_spec(net.input_tensor_spec, (alf.TensorSpec(
                (4, )), alf.TensorSpec((6, )), alf.TensorSpec((8, )))))
        self.assertEqual(net.state_spec, ())
        batch_size = 24
        x = _randn_from_spec(net.input_tensor_spec, batch_size)
        y, new_state = net(x)
        self.assertEqual(new_state, ())
        y0 = net.networks[0](x[0])[0]
        y1 = net.networks[1](x[1])[0]
        y2 = net.networks[2](x[2])[0]
        self.assertEqual(y[0], y0)
        self.assertEqual(y[1], y1)
        self.assertEqual(y[2], y2)
        net_copy = net.copy()
        self._verify_parameter_copy(net, net_copy)

    def test_branch1(self):
        net = alf.nn.Branch((alf.layers.FC(4, 6), alf.nn.GRUCell(4, 8),
                             alf.nn.GRUCell(4, 12)))
        self.assertEqual(net.input_tensor_spec, alf.TensorSpec((4, )))
        self.assertTrue(
            is_same_spec(net.state_spec, ((), alf.TensorSpec(
                (8, )), alf.TensorSpec((12, )))))
        batch_size = 24
        x = _randn_from_spec(net.input_tensor_spec, batch_size)
        state = _randn_from_spec(net.state_spec, batch_size)
        y, new_state = net(x, state)
        y0, state0 = net.networks[0](x)
        y1, state1 = net.networks[1](x, state[1])
        y2, state2 = net.networks[2](x, state[2])
        self.assertEqual(y[0], y0)
        self.assertEqual(y[1], y1)
        self.assertEqual(y[2], y2)
        self.assertEqual(new_state[0], state0)
        self.assertEqual(new_state[1], state1)
        self.assertEqual(new_state[2], state2)
        net_copy = net.copy()
        self._verify_parameter_copy(net, net_copy)

    def test_branch2(self):
        net = alf.nn.Branch((alf.layers.FC(4, 6), alf.layers.FC(4, 8),
                             alf.layers.FC(4, 12)))
        self.assertEqual(net.input_tensor_spec, alf.TensorSpec((4, )))
        self.assertEqual(net.state_spec, ())
        batch_size = 24
        x = _randn_from_spec(net.input_tensor_spec, batch_size)
        y, new_state = net(x)
        self.assertEqual(new_state, ())
        y0 = net.networks[0](x)[0]
        y1 = net.networks[1](x)[0]
        y2 = net.networks[2](x)[0]
        self.assertEqual(y[0], y0)
        self.assertEqual(y[1], y1)
        self.assertEqual(y[2], y2)
        net_copy = net.copy()
        self._verify_parameter_copy(net, net_copy)


if __name__ == '__main__':
    alf.test.main()
| 34.60223 | 80 | 0.564353 | 1,276 | 9,308 | 3.942006 | 0.123824 | 0.149105 | 0.041551 | 0.064414 | 0.747714 | 0.741352 | 0.714314 | 0.714314 | 0.700199 | 0.700199 | 0 | 0.043669 | 0.29147 | 9,308 | 268 | 81 | 34.731343 | 0.71903 | 0.071766 | 0 | 0.625616 | 0 | 0 | 0.00174 | 0 | 0 | 0 | 0 | 0 | 0.29064 | 1 | 0.054187 | false | 0 | 0.014778 | 0.004926 | 0.078818 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
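The tests above exercise two ALF combinators: `alf.nn.Parallel` routes element `i` of a tuple input into sub-network `i`, while `alf.nn.Branch` feeds the same input to every sub-network. A minimal dependency-free sketch of those semantics, using plain callables instead of ALF networks (the `parallel`/`branch` names here are illustrative, not the ALF API):

```python
# Sketch of the Parallel/Branch combinator semantics exercised by the
# tests above, with plain Python callables standing in for alf networks.

def parallel(fns):
    """Apply fns[i] to inputs[i]; return the tuple of results."""
    def run(inputs):
        return tuple(f(x) for f, x in zip(fns, inputs))
    return run

def branch(fns):
    """Apply every fn to the same input; return the tuple of results."""
    def run(x):
        return tuple(f(x) for f in fns)
    return run

par = parallel((lambda a: a + 1, lambda a: a * 2))
assert par((3, 4)) == (4, 8)      # element-wise routing

br = branch((lambda a: a + 1, lambda a: a * 2))
assert br(3) == (4, 6)            # same input fanned out
```

The real combinators additionally thread per-sub-network recurrent state, which is why the tests pass `state[1]`, `state[2]` alongside the inputs.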
6b16cd0f109a12c7f686ea924440d95489ff0d48 | 409 | py | Python | ch16_mongodb/application/tests/_all_tests.py | vromanuk/data-driven-web-app | ff9aac603477c3646f3c09f7d949dd1e7cf85f44 | [
"MIT"
] | null | null | null | ch16_mongodb/application/tests/_all_tests.py | vromanuk/data-driven-web-app | ff9aac603477c3646f3c09f7d949dd1e7cf85f44 | [
"MIT"
] | null | null | null | ch16_mongodb/application/tests/_all_tests.py | vromanuk/data-driven-web-app | ff9aac603477c3646f3c09f7d949dd1e7cf85f44 | [
"MIT"
] | null | null | null | import sys
import os
container_folder = os.path.abspath(os.path.join(
    os.path.dirname(__file__), '..'
))
sys.path.insert(0, container_folder)
# noinspection PyUnresolvedReferences
from account_tests import *
# noinspection PyUnresolvedReferences
from package_tests import *
# noinspection PyUnresolvedReferences
from sitemap_tests import *
# noinspection PyUnresolvedReferences
from home_tests import * | 24.058824 | 48 | 0.811736 | 46 | 409 | 7 | 0.434783 | 0.42236 | 0.47205 | 0.419255 | 0.456522 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002755 | 0.112469 | 409 | 17 | 49 | 24.058824 | 0.884298 | 0.349633 | 0 | 0 | 0 | 0 | 0.007634 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.6 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
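`_all_tests.py` above prepends the tests' parent directory to `sys.path` before star-importing the individual suites. The path computation can be checked in isolation; the `fake_file` path below is a stand-in for `__file__`:

```python
import os

# Reproduce the parent-directory computation from _all_tests.py above,
# with an explicit fake path standing in for __file__.
fake_file = os.path.join(os.sep, 'app', 'tests', '_all_tests.py')
container_folder = os.path.abspath(os.path.join(
    os.path.dirname(fake_file), '..'
))
# dirname gives .../tests; joining '..' and normalizing yields .../app
assert container_folder == os.path.join(os.sep, 'app')
```

Inserting that folder at `sys.path[0]` makes `account_tests`, `package_tests`, etc. importable as top-level modules regardless of the working directory the runner starts from.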
6b190921c2e6d59eae8ce655e97d4b4f37dcd635 | 174 | py | Python | four_color/src/variable.py | abkraynak/ai-algorithms | f152831d3225ea5b750aace3e9573aa935378a98 | [
"MIT"
] | null | null | null | four_color/src/variable.py | abkraynak/ai-algorithms | f152831d3225ea5b750aace3e9573aa935378a98 | [
"MIT"
] | null | null | null | four_color/src/variable.py | abkraynak/ai-algorithms | f152831d3225ea5b750aace3e9573aa935378a98 | [
"MIT"
] | null | null | null | class Variable(object):
    def __init__(self, name: str):
        self.name = name

    def __repr__(self):
        return self.name

Unassigned = Variable('Unassigned') | 21.75 | 35 | 0.632184 | 20 | 174 | 5.1 | 0.55 | 0.235294 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.258621 | 174 | 8 | 35 | 21.75 | 0.790698 | 0 | 0 | 0 | 0 | 0 | 0.057143 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0.166667 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 5 |
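`Unassigned = Variable('Unassigned')` above creates a module-level sentinel object. The usual way to consume such a sentinel is an identity check; the `describe` helper below is illustrative, not part of the repo:

```python
class Variable(object):
    def __init__(self, name: str):
        self.name = name

    def __repr__(self):
        return self.name

# Module-level sentinel, as in variable.py above.
Unassigned = Variable('Unassigned')

def describe(v):
    # Identity comparison is the idiomatic test for a sentinel value.
    return 'unassigned' if v is Unassigned else 'assigned to %r' % v.name

assert repr(Unassigned) == 'Unassigned'
assert describe(Unassigned) == 'unassigned'
assert describe(Variable('x')) == "assigned to 'x'"
```

Because there is exactly one `Unassigned` instance per process, `is` distinguishes it from any `Variable` the solver actually assigns, even one that happens to share the name.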
6b1fcfd2d9585d19ae3fd9705e128b19b1ec40e7 | 2,387 | py | Python | tensorflow/python/keras/initializers/__init__.py | M155K4R4/Tensorflow | e5e03ef3148303b3dfed89a1492dedf92b45be25 | [
"Apache-2.0"
] | 522 | 2016-06-08T02:15:50.000Z | 2022-03-02T05:30:36.000Z | tensorflow/python/keras/initializers/__init__.py | M155K4R4/Tensorflow | e5e03ef3148303b3dfed89a1492dedf92b45be25 | [
"Apache-2.0"
] | 48 | 2016-07-26T00:11:55.000Z | 2022-02-23T13:36:33.000Z | tensorflow/python/keras/initializers/__init__.py | M155K4R4/Tensorflow | e5e03ef3148303b3dfed89a1492dedf92b45be25 | [
"Apache-2.0"
] | 108 | 2016-06-16T15:34:05.000Z | 2022-03-12T13:23:11.000Z | # Copyright 2016 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Keras built-in initializers."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
# Initializer functions / callable classes.
from tensorflow.python.keras._impl.keras.initializers import Constant
from tensorflow.python.keras._impl.keras.initializers import Identity
from tensorflow.python.keras._impl.keras.initializers import Initializer
from tensorflow.python.keras._impl.keras.initializers import Ones
from tensorflow.python.keras._impl.keras.initializers import Orthogonal
from tensorflow.python.keras._impl.keras.initializers import RandomNormal
from tensorflow.python.keras._impl.keras.initializers import RandomUniform
from tensorflow.python.keras._impl.keras.initializers import TruncatedNormal
from tensorflow.python.keras._impl.keras.initializers import VarianceScaling
from tensorflow.python.keras._impl.keras.initializers import Zeros
# Functional interface.
# pylint: disable=g-bad-import-order
from tensorflow.python.keras._impl.keras.initializers import glorot_normal
from tensorflow.python.keras._impl.keras.initializers import glorot_uniform
from tensorflow.python.keras._impl.keras.initializers import he_normal
from tensorflow.python.keras._impl.keras.initializers import he_uniform
from tensorflow.python.keras._impl.keras.initializers import lecun_normal
from tensorflow.python.keras._impl.keras.initializers import lecun_uniform
# Auxiliary utils.
from tensorflow.python.keras._impl.keras.initializers import deserialize
from tensorflow.python.keras._impl.keras.initializers import serialize
from tensorflow.python.keras._impl.keras.initializers import get
del absolute_import
del division
del print_function
| 47.74 | 80 | 0.808965 | 311 | 2,387 | 6.07717 | 0.350482 | 0.140741 | 0.201058 | 0.251323 | 0.553439 | 0.553439 | 0.553439 | 0.553439 | 0.195767 | 0 | 0 | 0.003695 | 0.093004 | 2,387 | 49 | 81 | 48.714286 | 0.869284 | 0.337662 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.92 | 0 | 0.92 | 0.08 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
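Among the initializers re-exported above is `glorot_uniform`. As a rough sketch of the underlying rule (the limit formula is the standard Glorot/Xavier one; this pure-Python function is not the Keras implementation, which operates on tensors and shapes):

```python
import math
import random

def glorot_uniform_sketch(fan_in, fan_out, n, seed=0):
    """Sample n values from U(-limit, limit) with
    limit = sqrt(6 / (fan_in + fan_out)).

    Illustrative sketch of the Glorot/Xavier uniform rule only.
    """
    rng = random.Random(seed)
    limit = math.sqrt(6.0 / (fan_in + fan_out))
    return [rng.uniform(-limit, limit) for _ in range(n)]

samples = glorot_uniform_sketch(64, 32, 1000)
limit = math.sqrt(6.0 / 96)
# Every sample stays inside the Glorot bound.
assert all(-limit <= s <= limit for s in samples)
```

The `he_*` and `lecun_*` variants exported alongside it differ only in the variance-scaling rule (fan-in based, with different gain factors).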
6b2a425d2cf927c4ff5fcf44e2bb14f939b84e1a | 92 | py | Python | archetypes/__init__.py | aleixalcacer/archetypes | dd8abd3028ba5a4b34306208eef5dcb8ade1066f | [
"BSD-3-Clause"
] | 6 | 2021-06-01T05:43:14.000Z | 2022-03-03T08:21:29.000Z | archetypes/__init__.py | aleixalcacer/archetypes | dd8abd3028ba5a4b34306208eef5dcb8ade1066f | [
"BSD-3-Clause"
] | 2 | 2021-03-10T18:16:42.000Z | 2022-03-03T12:56:04.000Z | archetypes/__init__.py | aleixalcacer/archetypes | dd8abd3028ba5a4b34306208eef5dcb8ade1066f | [
"BSD-3-Clause"
] | null | null | null | from .archetypes import AA
from .biarchetypes import BiAA
from .version import __version__
| 18.4 | 32 | 0.826087 | 12 | 92 | 6 | 0.583333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.141304 | 92 | 4 | 33 | 23 | 0.911392 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
6b30bc2d9a3bbba5d6e5ea0fa9fd1aae9af39c20 | 67 | py | Python | notifications/tests/sample_notifications/templatetags/notifications_tags.py | cron-ooo/django-notifications | dcf84acb4ec6d451a60c0cd6773773238b23601b | [
"BSD-3-Clause"
] | 1,354 | 2015-01-03T17:22:58.000Z | 2022-03-29T11:49:12.000Z | notifications/tests/sample_notifications/templatetags/notifications_tags.py | cron-ooo/django-notifications | dcf84acb4ec6d451a60c0cd6773773238b23601b | [
"BSD-3-Clause"
] | 275 | 2015-01-19T21:32:51.000Z | 2022-03-30T10:07:14.000Z | notifications/tests/sample_notifications/templatetags/notifications_tags.py | cron-ooo/django-notifications | dcf84acb4ec6d451a60c0cd6773773238b23601b | [
"BSD-3-Clause"
] | 385 | 2015-01-08T19:51:12.000Z | 2022-03-29T10:19:16.000Z | from notifications.templatetags.notifications_tags import register
| 33.5 | 66 | 0.910448 | 7 | 67 | 8.571429 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.059701 | 67 | 1 | 67 | 67 | 0.952381 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
6b30d2d4e3ea3bf953a2c460019ebd23e67f9f5c | 316 | py | Python | python/anyascii/_data/_2b2.py | casept/anyascii | d4f426b91751254b68eaa84c6cd23099edd668e6 | [
"ISC"
] | null | null | null | python/anyascii/_data/_2b2.py | casept/anyascii | d4f426b91751254b68eaa84c6cd23099edd668e6 | [
"ISC"
] | null | null | null | python/anyascii/_data/_2b2.py | casept/anyascii | d4f426b91751254b68eaa84c6cd23099edd668e6 | [
"ISC"
] | null | null | null | b=' Jian Qing Chen Qia Jue Ai Shu Chang Lu Jian Jiao Xun Xiao Te Gong Ou Ze Rao Gui' | 316 | 316 | 0.196203 | 20 | 316 | 3.1 | 0.95 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.794304 | 316 | 1 | 316 | 316 | 0.953846 | 0 | 0 | 0 | 0 | 0 | 0.984227 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
86498f96383dedaabca885b3b27a719ff440eeb2 | 208 | py | Python | app/sdk.py | beda-software/adibox-sdc | 0c2da41ac5ddbe0e33657fe51c6ee7d5aeb382ee | [
"MIT"
] | 7 | 2020-05-14T21:00:42.000Z | 2021-10-17T09:10:18.000Z | app/sdk.py | beda-software/adibox-sdc | 0c2da41ac5ddbe0e33657fe51c6ee7d5aeb382ee | [
"MIT"
] | 27 | 2020-07-16T06:59:04.000Z | 2021-11-18T13:22:35.000Z | app/sdk.py | beda-software/adibox-sdc | 0c2da41ac5ddbe0e33657fe51c6ee7d5aeb382ee | [
"MIT"
] | 2 | 2020-11-06T09:30:09.000Z | 2021-03-03T09:48:27.000Z | from aidbox_python_sdk.sdk import SDK
from aidbox_python_sdk.settings import Settings
from app.manifest import meta_resources
sdk_settings = Settings(**{})
sdk = SDK(sdk_settings, resources=meta_resources)
| 26 | 49 | 0.826923 | 30 | 208 | 5.466667 | 0.333333 | 0.109756 | 0.195122 | 0.231707 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.100962 | 208 | 7 | 50 | 29.714286 | 0.877005 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.6 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
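`sdk_settings = Settings(**{})` above builds SDK settings purely from keyword overrides (none, in this case). A minimal stand-in for that pattern — the real `aidbox_python_sdk.settings.Settings` internals are not shown here, so this class and its attribute names are hypothetical:

```python
class Settings:
    """Hypothetical stand-in for a kwargs-driven settings object.

    Defaults live on the class; keyword arguments override them, which is
    why Settings(**{}) simply yields all defaults.
    """
    APP_URL = 'http://localhost:8081'
    APP_PORT = 8081

    def __init__(self, **overrides):
        for key, value in overrides.items():
            setattr(self, key, value)

default_settings = Settings(**{})
assert default_settings.APP_PORT == 8081

custom = Settings(APP_PORT=9000)
assert custom.APP_PORT == 9000
assert custom.APP_URL == 'http://localhost:8081'
```

Passing `**{}` instead of calling `Settings()` makes it easy to swap in a dict loaded from the environment or a config file later.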
8697af500926386488514a94cd80f0458b38f4ba | 76 | py | Python | scrapereads/__init__.py | arthurdjn/scrape-goodreads | be0d81ef0be09955599a3fae43648e9945923ff1 | [
"MIT"
] | 3 | 2021-02-10T10:32:48.000Z | 2021-08-06T23:53:22.000Z | scrapereads/__init__.py | arthurdjn/scrape-goodreads | be0d81ef0be09955599a3fae43648e9945923ff1 | [
"MIT"
] | 1 | 2022-03-14T09:33:41.000Z | 2022-03-14T09:33:41.000Z | scrapereads/__init__.py | arthurdjn/scrape-goodreads | be0d81ef0be09955599a3fae43648e9945923ff1 | [
"MIT"
] | null | null | null | from scrapereads.reads import Author, Book, Quote
from .api import GoodReads | 38 | 49 | 0.828947 | 11 | 76 | 5.727273 | 0.818182 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.118421 | 76 | 2 | 50 | 38 | 0.940299 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
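The next row stores its file not as plain source but as a `kind|'text'` token stream (`name|'import'`, `op|'.'`, `comment|'# ...'`, and so on). Python's stdlib `tokenize` module produces a very similar stream; the exact tool the nova-token repo used is unknown, so this is an approximation:

```python
import io
import token
import tokenize

def lex(source):
    """Render source as kind|'text' lines, similar to the stream below."""
    lines = []
    for tok in tokenize.generate_tokens(io.StringIO(source).readline):
        kind = token.tok_name[tok.type].lower()
        lines.append("%s|%r" % (kind, tok.string))
    return lines

stream = lex("import contextlib\n")
assert "name|'import'" in stream
assert "name|'contextlib'" in stream
```

Markers such as `begin_unit`, `DECL|variable|...`, and `indent`/`dedent` entries in the row below carry extra structural annotations on top of the raw lexeme stream.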
86a731e4e985f9f45dc69eb5cf4ab2c1b2482e61 | 112,328 | py | Python | nova/objects/instance.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | null | null | null | nova/objects/instance.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | null | null | null | nova/objects/instance.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | 2 | 2017-07-20T17:31:34.000Z | 2020-07-24T02:42:19.000Z | begin_unit
comment|'# Copyright 2013 IBM Corp.'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# Licensed under the Apache License, Version 2.0 (the "License"); you may'
nl|'\n'
comment|'# not use this file except in compliance with the License. You may obtain'
nl|'\n'
comment|'# a copy of the License at'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# http://www.apache.org/licenses/LICENSE-2.0'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# Unless required by applicable law or agreed to in writing, software'
nl|'\n'
comment|'# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT'
nl|'\n'
comment|'# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the'
nl|'\n'
comment|'# License for the specific language governing permissions and limitations'
nl|'\n'
comment|'# under the License.'
nl|'\n'
nl|'\n'
name|'import'
name|'contextlib'
newline|'\n'
nl|'\n'
name|'from'
name|'oslo_config'
name|'import'
name|'cfg'
newline|'\n'
name|'from'
name|'oslo_db'
name|'import'
name|'exception'
name|'as'
name|'db_exc'
newline|'\n'
name|'from'
name|'oslo_log'
name|'import'
name|'log'
name|'as'
name|'logging'
newline|'\n'
name|'from'
name|'oslo_serialization'
name|'import'
name|'jsonutils'
newline|'\n'
name|'from'
name|'oslo_utils'
name|'import'
name|'timeutils'
newline|'\n'
name|'from'
name|'oslo_utils'
name|'import'
name|'versionutils'
newline|'\n'
name|'from'
name|'sqlalchemy'
op|'.'
name|'orm'
name|'import'
name|'joinedload'
newline|'\n'
nl|'\n'
name|'from'
name|'nova'
op|'.'
name|'cells'
name|'import'
name|'opts'
name|'as'
name|'cells_opts'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'cells'
name|'import'
name|'rpcapi'
name|'as'
name|'cells_rpcapi'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'cells'
name|'import'
name|'utils'
name|'as'
name|'cells_utils'
newline|'\n'
name|'from'
name|'nova'
name|'import'
name|'db'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'db'
op|'.'
name|'sqlalchemy'
name|'import'
name|'api'
name|'as'
name|'db_api'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'db'
op|'.'
name|'sqlalchemy'
name|'import'
name|'models'
newline|'\n'
name|'from'
name|'nova'
name|'import'
name|'exception'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'i18n'
name|'import'
name|'_LE'
op|','
name|'_LW'
newline|'\n'
name|'from'
name|'nova'
name|'import'
name|'notifications'
newline|'\n'
name|'from'
name|'nova'
name|'import'
name|'objects'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'objects'
name|'import'
name|'base'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'objects'
name|'import'
name|'fields'
newline|'\n'
name|'from'
name|'nova'
name|'import'
name|'utils'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|variable|CONF
name|'CONF'
op|'='
name|'cfg'
op|'.'
name|'CONF'
newline|'\n'
DECL|variable|LOG
name|'LOG'
op|'='
name|'logging'
op|'.'
name|'getLogger'
op|'('
name|'__name__'
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'# List of fields that can be joined in DB layer.'
nl|'\n'
DECL|variable|_INSTANCE_OPTIONAL_JOINED_FIELDS
name|'_INSTANCE_OPTIONAL_JOINED_FIELDS'
op|'='
op|'['
string|"'metadata'"
op|','
string|"'system_metadata'"
op|','
nl|'\n'
string|"'info_cache'"
op|','
string|"'security_groups'"
op|','
nl|'\n'
string|"'pci_devices'"
op|','
string|"'tags'"
op|','
string|"'services'"
op|']'
newline|'\n'
comment|"# These are fields that are optional but don't translate to db columns"
nl|'\n'
DECL|variable|_INSTANCE_OPTIONAL_NON_COLUMN_FIELDS
name|'_INSTANCE_OPTIONAL_NON_COLUMN_FIELDS'
op|'='
op|'['
string|"'fault'"
op|','
string|"'flavor'"
op|','
string|"'old_flavor'"
op|','
nl|'\n'
string|"'new_flavor'"
op|','
string|"'ec2_ids'"
op|']'
newline|'\n'
comment|'# These are fields that are optional and in instance_extra'
nl|'\n'
DECL|variable|_INSTANCE_EXTRA_FIELDS
name|'_INSTANCE_EXTRA_FIELDS'
op|'='
op|'['
string|"'numa_topology'"
op|','
string|"'pci_requests'"
op|','
nl|'\n'
string|"'flavor'"
op|','
string|"'vcpu_model'"
op|','
string|"'migration_context'"
op|','
nl|'\n'
string|"'keypairs'"
op|']'
newline|'\n'
nl|'\n'
comment|'# These are fields that can be specified as expected_attrs'
nl|'\n'
DECL|variable|INSTANCE_OPTIONAL_ATTRS
name|'INSTANCE_OPTIONAL_ATTRS'
op|'='
op|'('
name|'_INSTANCE_OPTIONAL_JOINED_FIELDS'
op|'+'
nl|'\n'
name|'_INSTANCE_OPTIONAL_NON_COLUMN_FIELDS'
op|'+'
nl|'\n'
name|'_INSTANCE_EXTRA_FIELDS'
op|')'
newline|'\n'
comment|'# These are fields that most query calls load by default'
nl|'\n'
DECL|variable|INSTANCE_DEFAULT_FIELDS
name|'INSTANCE_DEFAULT_FIELDS'
op|'='
op|'['
string|"'metadata'"
op|','
string|"'system_metadata'"
op|','
nl|'\n'
string|"'info_cache'"
op|','
string|"'security_groups'"
op|']'
newline|'\n'
nl|'\n'
comment|'# Maximum count of tags to one instance'
nl|'\n'
DECL|variable|MAX_TAG_COUNT
name|'MAX_TAG_COUNT'
op|'='
number|'50'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_expected_cols
name|'def'
name|'_expected_cols'
op|'('
name|'expected_attrs'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Return expected_attrs that are columns needing joining.\n\n NB: This function may modify expected_attrs if one\n requested attribute requires another.\n """'
newline|'\n'
name|'if'
name|'not'
name|'expected_attrs'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'expected_attrs'
newline|'\n'
nl|'\n'
dedent|''
name|'simple_cols'
op|'='
op|'['
name|'attr'
name|'for'
name|'attr'
name|'in'
name|'expected_attrs'
nl|'\n'
name|'if'
name|'attr'
name|'in'
name|'_INSTANCE_OPTIONAL_JOINED_FIELDS'
op|']'
newline|'\n'
nl|'\n'
name|'complex_cols'
op|'='
op|'['
string|"'extra.%s'"
op|'%'
name|'field'
nl|'\n'
name|'for'
name|'field'
name|'in'
name|'_INSTANCE_EXTRA_FIELDS'
nl|'\n'
name|'if'
name|'field'
name|'in'
name|'expected_attrs'
op|']'
newline|'\n'
name|'if'
name|'complex_cols'
op|':'
newline|'\n'
indent|' '
name|'simple_cols'
op|'.'
name|'append'
op|'('
string|"'extra'"
op|')'
newline|'\n'
dedent|''
name|'simple_cols'
op|'='
op|'['
name|'x'
name|'for'
name|'x'
name|'in'
name|'simple_cols'
name|'if'
name|'x'
name|'not'
name|'in'
name|'_INSTANCE_EXTRA_FIELDS'
op|']'
newline|'\n'
name|'return'
name|'simple_cols'
op|'+'
name|'complex_cols'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|variable|_NO_DATA_SENTINEL
dedent|''
name|'_NO_DATA_SENTINEL'
op|'='
name|'object'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
comment|'# TODO(berrange): Remove NovaObjectDictCompat'
nl|'\n'
op|'@'
name|'base'
op|'.'
name|'NovaObjectRegistry'
op|'.'
name|'register'
newline|'\n'
name|'class'
name|'Instance'
op|'('
name|'base'
op|'.'
name|'NovaPersistentObject'
op|','
name|'base'
op|'.'
name|'NovaObject'
op|','
nl|'\n'
DECL|class|Instance
name|'base'
op|'.'
name|'NovaObjectDictCompat'
op|')'
op|':'
newline|'\n'
comment|'# Version 2.0: Initial version'
nl|'\n'
comment|'# Version 2.1: Added services'
nl|'\n'
comment|'# Version 2.2: Added keypairs'
nl|'\n'
DECL|variable|VERSION
indent|' '
name|'VERSION'
op|'='
string|"'2.2'"
newline|'\n'
nl|'\n'
DECL|variable|fields
name|'fields'
op|'='
op|'{'
nl|'\n'
string|"'id'"
op|':'
name|'fields'
op|'.'
name|'IntegerField'
op|'('
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'user_id'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'project_id'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'image_ref'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'kernel_id'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'ramdisk_id'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'hostname'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'launch_index'"
op|':'
name|'fields'
op|'.'
name|'IntegerField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'key_name'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'key_data'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'power_state'"
op|':'
name|'fields'
op|'.'
name|'IntegerField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'vm_state'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'task_state'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'services'"
op|':'
name|'fields'
op|'.'
name|'ObjectField'
op|'('
string|"'ServiceList'"
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'memory_mb'"
op|':'
name|'fields'
op|'.'
name|'IntegerField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'vcpus'"
op|':'
name|'fields'
op|'.'
name|'IntegerField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'root_gb'"
op|':'
name|'fields'
op|'.'
name|'IntegerField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'ephemeral_gb'"
op|':'
name|'fields'
op|'.'
name|'IntegerField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'ephemeral_key_uuid'"
op|':'
name|'fields'
op|'.'
name|'UUIDField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'host'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'node'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'instance_type_id'"
op|':'
name|'fields'
op|'.'
name|'IntegerField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'user_data'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'reservation_id'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'launched_at'"
op|':'
name|'fields'
op|'.'
name|'DateTimeField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'terminated_at'"
op|':'
name|'fields'
op|'.'
name|'DateTimeField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'availability_zone'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'display_name'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'display_description'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'launched_on'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
nl|'\n'
comment|'# NOTE(jdillaman): locked deprecated in favor of locked_by,'
nl|'\n'
comment|'# to be removed in Icehouse'
nl|'\n'
string|"'locked'"
op|':'
name|'fields'
op|'.'
name|'BooleanField'
op|'('
name|'default'
op|'='
name|'False'
op|')'
op|','
nl|'\n'
string|"'locked_by'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'os_type'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'architecture'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'vm_mode'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'uuid'"
op|':'
name|'fields'
op|'.'
name|'UUIDField'
op|'('
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'root_device_name'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'default_ephemeral_device'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'default_swap_device'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'config_drive'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'access_ip_v4'"
op|':'
name|'fields'
op|'.'
name|'IPV4AddressField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'access_ip_v6'"
op|':'
name|'fields'
op|'.'
name|'IPV6AddressField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'auto_disk_config'"
op|':'
name|'fields'
op|'.'
name|'BooleanField'
op|'('
name|'default'
op|'='
name|'False'
op|')'
op|','
nl|'\n'
string|"'progress'"
op|':'
name|'fields'
op|'.'
name|'IntegerField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'shutdown_terminate'"
op|':'
name|'fields'
op|'.'
name|'BooleanField'
op|'('
name|'default'
op|'='
name|'False'
op|')'
op|','
nl|'\n'
string|"'disable_terminate'"
op|':'
name|'fields'
op|'.'
name|'BooleanField'
op|'('
name|'default'
op|'='
name|'False'
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'cell_name'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'metadata'"
op|':'
name|'fields'
op|'.'
name|'DictOfStringsField'
op|'('
op|')'
op|','
nl|'\n'
string|"'system_metadata'"
op|':'
name|'fields'
op|'.'
name|'DictOfNullableStringsField'
op|'('
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'info_cache'"
op|':'
name|'fields'
op|'.'
name|'ObjectField'
op|'('
string|"'InstanceInfoCache'"
op|','
nl|'\n'
DECL|variable|nullable
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'security_groups'"
op|':'
name|'fields'
op|'.'
name|'ObjectField'
op|'('
string|"'SecurityGroupList'"
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'fault'"
op|':'
name|'fields'
op|'.'
name|'ObjectField'
op|'('
string|"'InstanceFault'"
op|','
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'cleaned'"
op|':'
name|'fields'
op|'.'
name|'BooleanField'
op|'('
name|'default'
op|'='
name|'False'
op|')'
op|','
nl|'\n'
nl|'\n'
string|"'pci_devices'"
op|':'
name|'fields'
op|'.'
name|'ObjectField'
op|'('
string|"'PciDeviceList'"
op|','
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'numa_topology'"
op|':'
name|'fields'
op|'.'
name|'ObjectField'
op|'('
string|"'InstanceNUMATopology'"
op|','
nl|'\n'
DECL|variable|nullable
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'pci_requests'"
op|':'
name|'fields'
op|'.'
name|'ObjectField'
op|'('
string|"'InstancePCIRequests'"
op|','
nl|'\n'
DECL|variable|nullable
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'tags'"
op|':'
name|'fields'
op|'.'
name|'ObjectField'
op|'('
string|"'TagList'"
op|')'
op|','
nl|'\n'
string|"'flavor'"
op|':'
name|'fields'
op|'.'
name|'ObjectField'
op|'('
string|"'Flavor'"
op|')'
op|','
nl|'\n'
string|"'old_flavor'"
op|':'
name|'fields'
op|'.'
name|'ObjectField'
op|'('
string|"'Flavor'"
op|','
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'new_flavor'"
op|':'
name|'fields'
op|'.'
name|'ObjectField'
op|'('
string|"'Flavor'"
op|','
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'vcpu_model'"
op|':'
name|'fields'
op|'.'
name|'ObjectField'
op|'('
string|"'VirtCPUModel'"
op|','
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'ec2_ids'"
op|':'
name|'fields'
op|'.'
name|'ObjectField'
op|'('
string|"'EC2Ids'"
op|')'
op|','
nl|'\n'
string|"'migration_context'"
op|':'
name|'fields'
op|'.'
name|'ObjectField'
op|'('
string|"'MigrationContext'"
op|','
nl|'\n'
DECL|variable|nullable
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'keypairs'"
op|':'
name|'fields'
op|'.'
name|'ObjectField'
op|'('
string|"'KeyPairList'"
op|')'
op|','
nl|'\n'
op|'}'
newline|'\n'
nl|'\n'
DECL|variable|obj_extra_fields
name|'obj_extra_fields'
op|'='
op|'['
string|"'name'"
op|']'
newline|'\n'
nl|'\n'
DECL|member|obj_make_compatible
name|'def'
name|'obj_make_compatible'
op|'('
name|'self'
op|','
name|'primitive'
op|','
name|'target_version'
op|')'
op|':'
newline|'\n'
indent|' '
name|'super'
op|'('
name|'Instance'
op|','
name|'self'
op|')'
op|'.'
name|'obj_make_compatible'
op|'('
name|'primitive'
op|','
name|'target_version'
op|')'
newline|'\n'
name|'target_version'
op|'='
name|'versionutils'
op|'.'
name|'convert_version_to_tuple'
op|'('
name|'target_version'
op|')'
newline|'\n'
name|'if'
name|'target_version'
op|'<'
op|'('
number|'2'
op|','
number|'2'
op|')'
name|'and'
string|"'keypairs'"
name|'in'
name|'primitive'
op|':'
newline|'\n'
indent|' '
name|'del'
name|'primitive'
op|'['
string|"'keypairs'"
op|']'
newline|'\n'
dedent|''
name|'if'
name|'target_version'
op|'<'
op|'('
number|'2'
op|','
number|'1'
op|')'
name|'and'
string|"'services'"
name|'in'
name|'primitive'
op|':'
newline|'\n'
indent|' '
name|'del'
name|'primitive'
op|'['
string|"'services'"
op|']'
newline|'\n'
nl|'\n'
DECL|member|__init__
dedent|''
dedent|''
name|'def'
name|'__init__'
op|'('
name|'self'
op|','
op|'*'
name|'args'
op|','
op|'**'
name|'kwargs'
op|')'
op|':'
newline|'\n'
indent|' '
name|'super'
op|'('
name|'Instance'
op|','
name|'self'
op|')'
op|'.'
name|'__init__'
op|'('
op|'*'
name|'args'
op|','
op|'**'
name|'kwargs'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'_reset_metadata_tracking'
op|'('
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'property'
newline|'\n'
DECL|member|image_meta
name|'def'
name|'image_meta'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'objects'
op|'.'
name|'ImageMeta'
op|'.'
name|'from_instance'
op|'('
name|'self'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_reset_metadata_tracking
dedent|''
name|'def'
name|'_reset_metadata_tracking'
op|'('
name|'self'
op|','
name|'fields'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'fields'
name|'is'
name|'None'
name|'or'
string|"'system_metadata'"
name|'in'
name|'fields'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_orig_system_metadata'
op|'='
op|'('
name|'dict'
op|'('
name|'self'
op|'.'
name|'system_metadata'
op|')'
name|'if'
nl|'\n'
string|"'system_metadata'"
name|'in'
name|'self'
name|'else'
op|'{'
op|'}'
op|')'
newline|'\n'
dedent|''
name|'if'
name|'fields'
name|'is'
name|'None'
name|'or'
string|"'metadata'"
name|'in'
name|'fields'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_orig_metadata'
op|'='
op|'('
name|'dict'
op|'('
name|'self'
op|'.'
name|'metadata'
op|')'
name|'if'
nl|'\n'
string|"'metadata'"
name|'in'
name|'self'
name|'else'
op|'{'
op|'}'
op|')'
newline|'\n'
nl|'\n'
DECL|member|obj_reset_changes
dedent|''
dedent|''
name|'def'
name|'obj_reset_changes'
op|'('
name|'self'
op|','
name|'fields'
op|'='
name|'None'
op|','
name|'recursive'
op|'='
name|'False'
op|')'
op|':'
newline|'\n'
indent|' '
name|'super'
op|'('
name|'Instance'
op|','
name|'self'
op|')'
op|'.'
name|'obj_reset_changes'
op|'('
name|'fields'
op|','
nl|'\n'
name|'recursive'
op|'='
name|'recursive'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'_reset_metadata_tracking'
op|'('
name|'fields'
op|'='
name|'fields'
op|')'
newline|'\n'
nl|'\n'
DECL|member|obj_what_changed
dedent|''
name|'def'
name|'obj_what_changed'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'changes'
op|'='
name|'super'
op|'('
name|'Instance'
op|','
name|'self'
op|')'
op|'.'
name|'obj_what_changed'
op|'('
op|')'
newline|'\n'
name|'if'
string|"'metadata'"
name|'in'
name|'self'
name|'and'
name|'self'
op|'.'
name|'metadata'
op|'!='
name|'self'
op|'.'
name|'_orig_metadata'
op|':'
newline|'\n'
indent|' '
name|'changes'
op|'.'
name|'add'
op|'('
string|"'metadata'"
op|')'
newline|'\n'
dedent|''
name|'if'
string|"'system_metadata'"
name|'in'
name|'self'
name|'and'
op|'('
name|'self'
op|'.'
name|'system_metadata'
op|'!='
nl|'\n'
name|'self'
op|'.'
name|'_orig_system_metadata'
op|')'
op|':'
newline|'\n'
indent|' '
name|'changes'
op|'.'
name|'add'
op|'('
string|"'system_metadata'"
op|')'
newline|'\n'
dedent|''
name|'return'
name|'changes'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'classmethod'
newline|'\n'
DECL|member|_obj_from_primitive
name|'def'
name|'_obj_from_primitive'
op|'('
name|'cls'
op|','
name|'context'
op|','
name|'objver'
op|','
name|'primitive'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'='
name|'super'
op|'('
name|'Instance'
op|','
name|'cls'
op|')'
op|'.'
name|'_obj_from_primitive'
op|'('
name|'context'
op|','
name|'objver'
op|','
nl|'\n'
name|'primitive'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'_reset_metadata_tracking'
op|'('
op|')'
newline|'\n'
name|'return'
name|'self'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'property'
newline|'\n'
DECL|member|name
name|'def'
name|'name'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'try'
op|':'
newline|'\n'
indent|' '
name|'base_name'
op|'='
name|'CONF'
op|'.'
name|'instance_name_template'
op|'%'
name|'self'
op|'.'
name|'id'
newline|'\n'
dedent|''
name|'except'
name|'TypeError'
op|':'
newline|'\n'
comment|'# Support templates like "uuid-%(uuid)s", etc.'
nl|'\n'
indent|' '
name|'info'
op|'='
op|'{'
op|'}'
newline|'\n'
comment|"# NOTE(russellb): Don't use self.iteritems() here, as it will"
nl|'\n'
comment|'# result in infinite recursion on the name property.'
nl|'\n'
name|'for'
name|'key'
name|'in'
name|'self'
op|'.'
name|'fields'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'key'
op|'=='
string|"'name'"
op|':'
newline|'\n'
comment|'# NOTE(danms): prevent recursion'
nl|'\n'
indent|' '
name|'continue'
newline|'\n'
dedent|''
name|'elif'
name|'not'
name|'self'
op|'.'
name|'obj_attr_is_set'
op|'('
name|'key'
op|')'
op|':'
newline|'\n'
comment|"# NOTE(danms): Don't trigger lazy-loads"
nl|'\n'
indent|' '
name|'continue'
newline|'\n'
dedent|''
name|'info'
op|'['
name|'key'
op|']'
op|'='
name|'self'
op|'['
name|'key'
op|']'
newline|'\n'
dedent|''
name|'try'
op|':'
newline|'\n'
indent|' '
name|'base_name'
op|'='
name|'CONF'
op|'.'
name|'instance_name_template'
op|'%'
name|'info'
newline|'\n'
dedent|''
name|'except'
name|'KeyError'
op|':'
newline|'\n'
indent|' '
name|'base_name'
op|'='
name|'self'
op|'.'
name|'uuid'
newline|'\n'
dedent|''
dedent|''
name|'return'
name|'base_name'
newline|'\n'
nl|'\n'
DECL|member|_flavor_from_db
dedent|''
name|'def'
name|'_flavor_from_db'
op|'('
name|'self'
op|','
name|'db_flavor'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Load instance flavor information from instance_extra."""'
newline|'\n'
nl|'\n'
name|'flavor_info'
op|'='
name|'jsonutils'
op|'.'
name|'loads'
op|'('
name|'db_flavor'
op|')'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'flavor'
op|'='
name|'objects'
op|'.'
name|'Flavor'
op|'.'
name|'obj_from_primitive'
op|'('
name|'flavor_info'
op|'['
string|"'cur'"
op|']'
op|')'
newline|'\n'
name|'if'
name|'flavor_info'
op|'['
string|"'old'"
op|']'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'old_flavor'
op|'='
name|'objects'
op|'.'
name|'Flavor'
op|'.'
name|'obj_from_primitive'
op|'('
nl|'\n'
name|'flavor_info'
op|'['
string|"'old'"
op|']'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'old_flavor'
op|'='
name|'None'
newline|'\n'
dedent|''
name|'if'
name|'flavor_info'
op|'['
string|"'new'"
op|']'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'new_flavor'
op|'='
name|'objects'
op|'.'
name|'Flavor'
op|'.'
name|'obj_from_primitive'
op|'('
nl|'\n'
name|'flavor_info'
op|'['
string|"'new'"
op|']'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'new_flavor'
op|'='
name|'None'
newline|'\n'
dedent|''
name|'self'
op|'.'
name|'obj_reset_changes'
op|'('
op|'['
string|"'flavor'"
op|','
string|"'old_flavor'"
op|','
string|"'new_flavor'"
op|']'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'staticmethod'
newline|'\n'
DECL|member|_from_db_object
name|'def'
name|'_from_db_object'
op|'('
name|'context'
op|','
name|'instance'
op|','
name|'db_inst'
op|','
name|'expected_attrs'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Method to help with migration to objects.\n\n Converts a database entity to a formal object.\n """'
newline|'\n'
name|'instance'
op|'.'
name|'_context'
op|'='
name|'context'
newline|'\n'
name|'if'
name|'expected_attrs'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'expected_attrs'
op|'='
op|'['
op|']'
newline|'\n'
comment|'# Most of the field names match right now, so be quick'
nl|'\n'
dedent|''
name|'for'
name|'field'
name|'in'
name|'instance'
op|'.'
name|'fields'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'field'
name|'in'
name|'INSTANCE_OPTIONAL_ATTRS'
op|':'
newline|'\n'
indent|' '
name|'continue'
newline|'\n'
dedent|''
name|'elif'
name|'field'
op|'=='
string|"'deleted'"
op|':'
newline|'\n'
indent|' '
name|'instance'
op|'.'
name|'deleted'
op|'='
name|'db_inst'
op|'['
string|"'deleted'"
op|']'
op|'=='
name|'db_inst'
op|'['
string|"'id'"
op|']'
newline|'\n'
dedent|''
name|'elif'
name|'field'
op|'=='
string|"'cleaned'"
op|':'
newline|'\n'
indent|' '
name|'instance'
op|'.'
name|'cleaned'
op|'='
name|'db_inst'
op|'['
string|"'cleaned'"
op|']'
op|'=='
number|'1'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'instance'
op|'['
name|'field'
op|']'
op|'='
name|'db_inst'
op|'['
name|'field'
op|']'
newline|'\n'
nl|'\n'
comment|'# NOTE(danms): We can be called with a dict instead of a'
nl|'\n'
comment|'# SQLAlchemy object, so we have to be careful here'
nl|'\n'
dedent|''
dedent|''
name|'if'
name|'hasattr'
op|'('
name|'db_inst'
op|','
string|"'__dict__'"
op|')'
op|':'
newline|'\n'
indent|' '
name|'have_extra'
op|'='
string|"'extra'"
name|'in'
name|'db_inst'
op|'.'
name|'__dict__'
name|'and'
name|'db_inst'
op|'['
string|"'extra'"
op|']'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'have_extra'
op|'='
string|"'extra'"
name|'in'
name|'db_inst'
name|'and'
name|'db_inst'
op|'['
string|"'extra'"
op|']'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
string|"'metadata'"
name|'in'
name|'expected_attrs'
op|':'
newline|'\n'
indent|' '
name|'instance'
op|'['
string|"'metadata'"
op|']'
op|'='
name|'utils'
op|'.'
name|'instance_meta'
op|'('
name|'db_inst'
op|')'
newline|'\n'
dedent|''
name|'if'
string|"'system_metadata'"
name|'in'
name|'expected_attrs'
op|':'
newline|'\n'
indent|' '
name|'instance'
op|'['
string|"'system_metadata'"
op|']'
op|'='
name|'utils'
op|'.'
name|'instance_sys_meta'
op|'('
name|'db_inst'
op|')'
newline|'\n'
dedent|''
name|'if'
string|"'fault'"
name|'in'
name|'expected_attrs'
op|':'
newline|'\n'
indent|' '
name|'instance'
op|'['
string|"'fault'"
op|']'
op|'='
op|'('
nl|'\n'
name|'objects'
op|'.'
name|'InstanceFault'
op|'.'
name|'get_latest_for_instance'
op|'('
nl|'\n'
name|'context'
op|','
name|'instance'
op|'.'
name|'uuid'
op|')'
op|')'
newline|'\n'
dedent|''
name|'if'
string|"'numa_topology'"
name|'in'
name|'expected_attrs'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'have_extra'
op|':'
newline|'\n'
indent|' '
name|'instance'
op|'.'
name|'_load_numa_topology'
op|'('
nl|'\n'
name|'db_inst'
op|'['
string|"'extra'"
op|']'
op|'.'
name|'get'
op|'('
string|"'numa_topology'"
op|')'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'instance'
op|'.'
name|'numa_topology'
op|'='
name|'None'
newline|'\n'
dedent|''
dedent|''
name|'if'
string|"'pci_requests'"
name|'in'
name|'expected_attrs'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'have_extra'
op|':'
newline|'\n'
indent|' '
name|'instance'
op|'.'
name|'_load_pci_requests'
op|'('
nl|'\n'
name|'db_inst'
op|'['
string|"'extra'"
op|']'
op|'.'
name|'get'
op|'('
string|"'pci_requests'"
op|')'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'instance'
op|'.'
name|'pci_requests'
op|'='
name|'None'
newline|'\n'
dedent|''
dedent|''
name|'if'
string|"'vcpu_model'"
name|'in'
name|'expected_attrs'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'have_extra'
op|':'
newline|'\n'
indent|' '
name|'instance'
op|'.'
name|'_load_vcpu_model'
op|'('
nl|'\n'
name|'db_inst'
op|'['
string|"'extra'"
op|']'
op|'.'
name|'get'
op|'('
string|"'vcpu_model'"
op|')'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'instance'
op|'.'
name|'vcpu_model'
op|'='
name|'None'
newline|'\n'
dedent|''
dedent|''
name|'if'
string|"'ec2_ids'"
name|'in'
name|'expected_attrs'
op|':'
newline|'\n'
indent|' '
name|'instance'
op|'.'
name|'_load_ec2_ids'
op|'('
op|')'
newline|'\n'
dedent|''
name|'if'
string|"'migration_context'"
name|'in'
name|'expected_attrs'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'have_extra'
op|':'
newline|'\n'
indent|' '
name|'instance'
op|'.'
name|'_load_migration_context'
op|'('
nl|'\n'
name|'db_inst'
op|'['
string|"'extra'"
op|']'
op|'.'
name|'get'
op|'('
string|"'migration_context'"
op|')'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'instance'
op|'.'
name|'migration_context'
op|'='
name|'None'
newline|'\n'
dedent|''
dedent|''
name|'if'
string|"'keypairs'"
name|'in'
name|'expected_attrs'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'have_extra'
op|':'
newline|'\n'
indent|' '
name|'instance'
op|'.'
name|'_load_keypairs'
op|'('
name|'db_inst'
op|'['
string|"'extra'"
op|']'
op|'.'
name|'get'
op|'('
string|"'keypairs'"
op|')'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'if'
string|"'info_cache'"
name|'in'
name|'expected_attrs'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'db_inst'
op|'.'
name|'get'
op|'('
string|"'info_cache'"
op|')'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'instance'
op|'.'
name|'info_cache'
op|'='
name|'None'
newline|'\n'
dedent|''
name|'elif'
name|'not'
name|'instance'
op|'.'
name|'obj_attr_is_set'
op|'('
string|"'info_cache'"
op|')'
op|':'
newline|'\n'
comment|'# TODO(danms): If this ever happens on a backlevel instance'
nl|'\n'
comment|'# passed to us by a backlevel service, things will break'
nl|'\n'
indent|' '
name|'instance'
op|'.'
name|'info_cache'
op|'='
name|'objects'
op|'.'
name|'InstanceInfoCache'
op|'('
name|'context'
op|')'
newline|'\n'
dedent|''
name|'if'
name|'instance'
op|'.'
name|'info_cache'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'instance'
op|'.'
name|'info_cache'
op|'.'
name|'_from_db_object'
op|'('
name|'context'
op|','
nl|'\n'
name|'instance'
op|'.'
name|'info_cache'
op|','
nl|'\n'
name|'db_inst'
op|'['
string|"'info_cache'"
op|']'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'if'
name|'any'
op|'('
op|'['
name|'x'
name|'in'
name|'expected_attrs'
name|'for'
name|'x'
name|'in'
op|'('
string|"'flavor'"
op|','
nl|'\n'
string|"'old_flavor'"
op|','
nl|'\n'
string|"'new_flavor'"
op|')'
op|']'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'have_extra'
name|'and'
name|'db_inst'
op|'['
string|"'extra'"
op|']'
op|'.'
name|'get'
op|'('
string|"'flavor'"
op|')'
op|':'
newline|'\n'
indent|' '
name|'instance'
op|'.'
name|'_flavor_from_db'
op|'('
name|'db_inst'
op|'['
string|"'extra'"
op|']'
op|'['
string|"'flavor'"
op|']'
op|')'
newline|'\n'
nl|'\n'
comment|'# TODO(danms): If we are updating these on a backlevel instance,'
nl|'\n'
comment|"# we'll end up sending back new versions of these objects (see"
nl|'\n'
comment|'# above note for new info_caches)'
nl|'\n'
dedent|''
dedent|''
name|'if'
string|"'pci_devices'"
name|'in'
name|'expected_attrs'
op|':'
newline|'\n'
indent|' '
name|'pci_devices'
op|'='
name|'base'
op|'.'
name|'obj_make_list'
op|'('
nl|'\n'
name|'context'
op|','
name|'objects'
op|'.'
name|'PciDeviceList'
op|'('
name|'context'
op|')'
op|','
nl|'\n'
name|'objects'
op|'.'
name|'PciDevice'
op|','
name|'db_inst'
op|'['
string|"'pci_devices'"
op|']'
op|')'
newline|'\n'
name|'instance'
op|'['
string|"'pci_devices'"
op|']'
op|'='
name|'pci_devices'
newline|'\n'
dedent|''
name|'if'
string|"'security_groups'"
name|'in'
name|'expected_attrs'
op|':'
newline|'\n'
indent|' '
name|'sec_groups'
op|'='
name|'base'
op|'.'
name|'obj_make_list'
op|'('
nl|'\n'
name|'context'
op|','
name|'objects'
op|'.'
name|'SecurityGroupList'
op|'('
name|'context'
op|')'
op|','
nl|'\n'
name|'objects'
op|'.'
name|'SecurityGroup'
op|','
name|'db_inst'
op|'.'
name|'get'
op|'('
string|"'security_groups'"
op|','
op|'['
op|']'
op|')'
op|')'
newline|'\n'
name|'instance'
op|'['
string|"'security_groups'"
op|']'
op|'='
name|'sec_groups'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
string|"'tags'"
name|'in'
name|'expected_attrs'
op|':'
newline|'\n'
indent|' '
name|'tags'
op|'='
name|'base'
op|'.'
name|'obj_make_list'
op|'('
nl|'\n'
name|'context'
op|','
name|'objects'
op|'.'
name|'TagList'
op|'('
name|'context'
op|')'
op|','
nl|'\n'
name|'objects'
op|'.'
name|'Tag'
op|','
name|'db_inst'
op|'['
string|"'tags'"
op|']'
op|')'
newline|'\n'
name|'instance'
op|'['
string|"'tags'"
op|']'
op|'='
name|'tags'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
string|"'services'"
name|'in'
name|'expected_attrs'
op|':'
newline|'\n'
indent|' '
name|'services'
op|'='
name|'base'
op|'.'
name|'obj_make_list'
op|'('
nl|'\n'
name|'context'
op|','
name|'objects'
op|'.'
name|'ServiceList'
op|'('
name|'context'
op|')'
op|','
nl|'\n'
name|'objects'
op|'.'
name|'Service'
op|','
name|'db_inst'
op|'['
string|"'services'"
op|']'
op|')'
newline|'\n'
name|'instance'
op|'['
string|"'services'"
op|']'
op|'='
name|'services'
newline|'\n'
nl|'\n'
dedent|''
name|'instance'
op|'.'
name|'obj_reset_changes'
op|'('
op|')'
newline|'\n'
name|'return'
name|'instance'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'staticmethod'
newline|'\n'
op|'@'
name|'db'
op|'.'
name|'select_db_reader_mode'
newline|'\n'
DECL|member|_db_instance_get_by_uuid
name|'def'
name|'_db_instance_get_by_uuid'
op|'('
name|'context'
op|','
name|'uuid'
op|','
name|'columns_to_join'
op|','
nl|'\n'
name|'use_slave'
op|'='
name|'False'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'db'
op|'.'
name|'instance_get_by_uuid'
op|'('
name|'context'
op|','
name|'uuid'
op|','
nl|'\n'
name|'columns_to_join'
op|'='
name|'columns_to_join'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'base'
op|'.'
name|'remotable_classmethod'
newline|'\n'
DECL|member|get_by_uuid
name|'def'
name|'get_by_uuid'
op|'('
name|'cls'
op|','
name|'context'
op|','
name|'uuid'
op|','
name|'expected_attrs'
op|'='
name|'None'
op|','
name|'use_slave'
op|'='
name|'False'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'expected_attrs'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'expected_attrs'
op|'='
op|'['
string|"'info_cache'"
op|','
string|"'security_groups'"
op|']'
newline|'\n'
dedent|''
name|'columns_to_join'
op|'='
name|'_expected_cols'
op|'('
name|'expected_attrs'
op|')'
newline|'\n'
name|'db_inst'
op|'='
name|'cls'
op|'.'
name|'_db_instance_get_by_uuid'
op|'('
name|'context'
op|','
name|'uuid'
op|','
name|'columns_to_join'
op|','
nl|'\n'
name|'use_slave'
op|'='
name|'use_slave'
op|')'
newline|'\n'
name|'return'
name|'cls'
op|'.'
name|'_from_db_object'
op|'('
name|'context'
op|','
name|'cls'
op|'('
op|')'
op|','
name|'db_inst'
op|','
nl|'\n'
name|'expected_attrs'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'base'
op|'.'
name|'remotable_classmethod'
newline|'\n'
DECL|member|get_by_id
name|'def'
name|'get_by_id'
op|'('
name|'cls'
op|','
name|'context'
op|','
name|'inst_id'
op|','
name|'expected_attrs'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'expected_attrs'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'expected_attrs'
op|'='
op|'['
string|"'info_cache'"
op|','
string|"'security_groups'"
op|']'
newline|'\n'
dedent|''
name|'columns_to_join'
op|'='
name|'_expected_cols'
op|'('
name|'expected_attrs'
op|')'
newline|'\n'
name|'db_inst'
op|'='
name|'db'
op|'.'
name|'instance_get'
op|'('
name|'context'
op|','
name|'inst_id'
op|','
nl|'\n'
name|'columns_to_join'
op|'='
name|'columns_to_join'
op|')'
newline|'\n'
name|'return'
name|'cls'
op|'.'
name|'_from_db_object'
op|'('
name|'context'
op|','
name|'cls'
op|'('
op|')'
op|','
name|'db_inst'
op|','
nl|'\n'
name|'expected_attrs'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'base'
op|'.'
name|'remotable'
newline|'\n'
DECL|member|create
name|'def'
name|'create'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'self'
op|'.'
name|'obj_attr_is_set'
op|'('
string|"'id'"
op|')'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ObjectActionError'
op|'('
name|'action'
op|'='
string|"'create'"
op|','
nl|'\n'
name|'reason'
op|'='
string|"'already created'"
op|')'
newline|'\n'
dedent|''
name|'updates'
op|'='
name|'self'
op|'.'
name|'obj_get_changes'
op|'('
op|')'
newline|'\n'
name|'expected_attrs'
op|'='
op|'['
name|'attr'
name|'for'
name|'attr'
name|'in'
name|'INSTANCE_DEFAULT_FIELDS'
nl|'\n'
name|'if'
name|'attr'
name|'in'
name|'updates'
op|']'
newline|'\n'
name|'if'
string|"'security_groups'"
name|'in'
name|'updates'
op|':'
newline|'\n'
indent|' '
name|'updates'
op|'['
string|"'security_groups'"
op|']'
op|'='
op|'['
name|'x'
op|'.'
name|'name'
name|'for'
name|'x'
name|'in'
nl|'\n'
name|'updates'
op|'['
string|"'security_groups'"
op|']'
op|']'
newline|'\n'
dedent|''
name|'if'
string|"'info_cache'"
name|'in'
name|'updates'
op|':'
newline|'\n'
indent|' '
name|'updates'
op|'['
string|"'info_cache'"
op|']'
op|'='
op|'{'
nl|'\n'
string|"'network_info'"
op|':'
name|'updates'
op|'['
string|"'info_cache'"
op|']'
op|'.'
name|'network_info'
op|'.'
name|'json'
op|'('
op|')'
nl|'\n'
op|'}'
newline|'\n'
dedent|''
name|'updates'
op|'['
string|"'extra'"
op|']'
op|'='
op|'{'
op|'}'
newline|'\n'
name|'numa_topology'
op|'='
name|'updates'
op|'.'
name|'pop'
op|'('
string|"'numa_topology'"
op|','
name|'None'
op|')'
newline|'\n'
name|'expected_attrs'
op|'.'
name|'append'
op|'('
string|"'numa_topology'"
op|')'
newline|'\n'
name|'if'
name|'numa_topology'
op|':'
newline|'\n'
indent|' '
name|'updates'
op|'['
string|"'extra'"
op|']'
op|'['
string|"'numa_topology'"
op|']'
op|'='
name|'numa_topology'
op|'.'
name|'_to_json'
op|'('
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'updates'
op|'['
string|"'extra'"
op|']'
op|'['
string|"'numa_topology'"
op|']'
op|'='
name|'None'
newline|'\n'
dedent|''
name|'pci_requests'
op|'='
name|'updates'
op|'.'
name|'pop'
op|'('
string|"'pci_requests'"
op|','
name|'None'
op|')'
newline|'\n'
name|'expected_attrs'
op|'.'
name|'append'
op|'('
string|"'pci_requests'"
op|')'
newline|'\n'
name|'if'
name|'pci_requests'
op|':'
newline|'\n'
indent|' '
name|'updates'
op|'['
string|"'extra'"
op|']'
op|'['
string|"'pci_requests'"
op|']'
op|'='
op|'('
nl|'\n'
name|'pci_requests'
op|'.'
name|'to_json'
op|'('
op|')'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'updates'
op|'['
string|"'extra'"
op|']'
op|'['
string|"'pci_requests'"
op|']'
op|'='
name|'None'
newline|'\n'
dedent|''
name|'flavor'
op|'='
name|'updates'
op|'.'
name|'pop'
op|'('
string|"'flavor'"
op|','
name|'None'
op|')'
newline|'\n'
name|'if'
name|'flavor'
op|':'
newline|'\n'
indent|' '
name|'expected_attrs'
op|'.'
name|'append'
op|'('
string|"'flavor'"
op|')'
newline|'\n'
name|'old'
op|'='
op|'('
op|'('
name|'self'
op|'.'
name|'obj_attr_is_set'
op|'('
string|"'old_flavor'"
op|')'
name|'and'
nl|'\n'
name|'self'
op|'.'
name|'old_flavor'
op|')'
name|'and'
nl|'\n'
name|'self'
op|'.'
name|'old_flavor'
op|'.'
name|'obj_to_primitive'
op|'('
op|')'
name|'or'
name|'None'
op|')'
newline|'\n'
name|'new'
op|'='
op|'('
op|'('
name|'self'
op|'.'
name|'obj_attr_is_set'
op|'('
string|"'new_flavor'"
op|')'
name|'and'
nl|'\n'
name|'self'
op|'.'
name|'new_flavor'
op|')'
name|'and'
nl|'\n'
name|'self'
op|'.'
name|'new_flavor'
op|'.'
name|'obj_to_primitive'
op|'('
op|')'
name|'or'
name|'None'
op|')'
newline|'\n'
name|'flavor_info'
op|'='
op|'{'
nl|'\n'
string|"'cur'"
op|':'
name|'self'
op|'.'
name|'flavor'
op|'.'
name|'obj_to_primitive'
op|'('
op|')'
op|','
nl|'\n'
string|"'old'"
op|':'
name|'old'
op|','
nl|'\n'
string|"'new'"
op|':'
name|'new'
op|','
nl|'\n'
op|'}'
newline|'\n'
name|'updates'
op|'['
string|"'extra'"
op|']'
op|'['
string|"'flavor'"
op|']'
op|'='
name|'jsonutils'
op|'.'
name|'dumps'
op|'('
name|'flavor_info'
op|')'
newline|'\n'
dedent|''
name|'keypairs'
op|'='
name|'updates'
op|'.'
name|'pop'
op|'('
string|"'keypairs'"
op|','
name|'None'
op|')'
newline|'\n'
name|'if'
name|'keypairs'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'expected_attrs'
op|'.'
name|'append'
op|'('
string|"'keypairs'"
op|')'
newline|'\n'
name|'updates'
op|'['
string|"'extra'"
op|']'
op|'['
string|"'keypairs'"
op|']'
op|'='
name|'jsonutils'
op|'.'
name|'dumps'
op|'('
nl|'\n'
name|'keypairs'
op|'.'
name|'obj_to_primitive'
op|'('
op|')'
op|')'
newline|'\n'
dedent|''
name|'vcpu_model'
op|'='
name|'updates'
op|'.'
name|'pop'
op|'('
string|"'vcpu_model'"
op|','
name|'None'
op|')'
newline|'\n'
name|'expected_attrs'
op|'.'
name|'append'
op|'('
string|"'vcpu_model'"
op|')'
newline|'\n'
name|'if'
name|'vcpu_model'
op|':'
newline|'\n'
indent|' '
name|'updates'
op|'['
string|"'extra'"
op|']'
op|'['
string|"'vcpu_model'"
op|']'
op|'='
op|'('
nl|'\n'
name|'jsonutils'
op|'.'
name|'dumps'
op|'('
name|'vcpu_model'
op|'.'
name|'obj_to_primitive'
op|'('
op|')'
op|')'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'updates'
op|'['
string|"'extra'"
op|']'
op|'['
string|"'vcpu_model'"
op|']'
op|'='
name|'None'
newline|'\n'
dedent|''
name|'db_inst'
op|'='
name|'db'
op|'.'
name|'instance_create'
op|'('
name|'self'
op|'.'
name|'_context'
op|','
name|'updates'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'_from_db_object'
op|'('
name|'self'
op|'.'
name|'_context'
op|','
name|'self'
op|','
name|'db_inst'
op|','
name|'expected_attrs'
op|')'
newline|'\n'
nl|'\n'
comment|'# NOTE(danms): The EC2 ids are created on their first load. In order'
nl|'\n'
comment|'# to avoid them being missing and having to be loaded later, we'
nl|'\n'
comment|'# load them once here on create now that the instance record is'
nl|'\n'
comment|'# created.'
nl|'\n'
name|'self'
op|'.'
name|'_load_ec2_ids'
op|'('
op|')'
newline|'\n'
name|'self'
op|'.'
name|'obj_reset_changes'
op|'('
op|'['
string|"'ec2_ids'"
op|']'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'base'
op|'.'
name|'remotable'
newline|'\n'
DECL|member|destroy
name|'def'
name|'destroy'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'not'
name|'self'
op|'.'
name|'obj_attr_is_set'
op|'('
string|"'id'"
op|')'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ObjectActionError'
op|'('
name|'action'
op|'='
string|"'destroy'"
op|','
nl|'\n'
name|'reason'
op|'='
string|"'already destroyed'"
op|')'
newline|'\n'
dedent|''
name|'if'
name|'not'
name|'self'
op|'.'
name|'obj_attr_is_set'
op|'('
string|"'uuid'"
op|')'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ObjectActionError'
op|'('
name|'action'
op|'='
string|"'destroy'"
op|','
nl|'\n'
name|'reason'
op|'='
string|"'no uuid'"
op|')'
newline|'\n'
dedent|''
name|'if'
name|'not'
name|'self'
op|'.'
name|'obj_attr_is_set'
op|'('
string|"'host'"
op|')'
name|'or'
name|'not'
name|'self'
op|'.'
name|'host'
op|':'
newline|'\n'
comment|'# NOTE(danms): If our host is not set, avoid a race'
nl|'\n'
indent|' '
name|'constraint'
op|'='
name|'db'
op|'.'
name|'constraint'
op|'('
name|'host'
op|'='
name|'db'
op|'.'
name|'equal_any'
op|'('
name|'None'
op|')'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'constraint'
op|'='
name|'None'
newline|'\n'
nl|'\n'
dedent|''
name|'cell_type'
op|'='
name|'cells_opts'
op|'.'
name|'get_cell_type'
op|'('
op|')'
newline|'\n'
name|'if'
name|'cell_type'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'stale_instance'
op|'='
name|'self'
op|'.'
name|'obj_clone'
op|'('
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'try'
op|':'
newline|'\n'
indent|' '
name|'db_inst'
op|'='
name|'db'
op|'.'
name|'instance_destroy'
op|'('
name|'self'
op|'.'
name|'_context'
op|','
name|'self'
op|'.'
name|'uuid'
op|','
nl|'\n'
name|'constraint'
op|'='
name|'constraint'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'_from_db_object'
op|'('
name|'self'
op|'.'
name|'_context'
op|','
name|'self'
op|','
name|'db_inst'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'exception'
op|'.'
name|'ConstraintNotMet'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ObjectActionError'
op|'('
name|'action'
op|'='
string|"'destroy'"
op|','
nl|'\n'
name|'reason'
op|'='
string|"'host changed'"
op|')'
newline|'\n'
dedent|''
name|'if'
name|'cell_type'
op|'=='
string|"'compute'"
op|':'
newline|'\n'
indent|' '
name|'cells_api'
op|'='
name|'cells_rpcapi'
op|'.'
name|'CellsAPI'
op|'('
op|')'
newline|'\n'
name|'cells_api'
op|'.'
name|'instance_destroy_at_top'
op|'('
name|'self'
op|'.'
name|'_context'
op|','
name|'stale_instance'
op|')'
newline|'\n'
dedent|''
name|'delattr'
op|'('
name|'self'
op|','
name|'base'
op|'.'
name|'get_attrname'
op|'('
string|"'id'"
op|')'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_save_info_cache
dedent|''
name|'def'
name|'_save_info_cache'
op|'('
name|'self'
op|','
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'self'
op|'.'
name|'info_cache'
op|':'
newline|'\n'
indent|' '
name|'with'
name|'self'
op|'.'
name|'info_cache'
op|'.'
name|'obj_alternate_context'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'info_cache'
op|'.'
name|'save'
op|'('
op|')'
newline|'\n'
nl|'\n'
DECL|member|_save_security_groups
dedent|''
dedent|''
dedent|''
name|'def'
name|'_save_security_groups'
op|'('
name|'self'
op|','
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'security_groups'
op|'='
name|'self'
op|'.'
name|'security_groups'
name|'or'
op|'['
op|']'
newline|'\n'
name|'for'
name|'secgroup'
name|'in'
name|'security_groups'
op|':'
newline|'\n'
indent|' '
name|'with'
name|'secgroup'
op|'.'
name|'obj_alternate_context'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'secgroup'
op|'.'
name|'save'
op|'('
op|')'
newline|'\n'
dedent|''
dedent|''
name|'self'
op|'.'
name|'security_groups'
op|'.'
name|'obj_reset_changes'
op|'('
op|')'
newline|'\n'
nl|'\n'
DECL|member|_save_fault
dedent|''
name|'def'
name|'_save_fault'
op|'('
name|'self'
op|','
name|'context'
op|')'
op|':'
newline|'\n'
comment|"# NOTE(danms): I don't think we need to worry about this, do we?"
nl|'\n'
indent|' '
name|'pass'
newline|'\n'
nl|'\n'
DECL|member|_save_numa_topology
dedent|''
name|'def'
name|'_save_numa_topology'
op|'('
name|'self'
op|','
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'self'
op|'.'
name|'numa_topology'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'numa_topology'
op|'.'
name|'instance_uuid'
op|'='
name|'self'
op|'.'
name|'uuid'
newline|'\n'
name|'with'
name|'self'
op|'.'
name|'numa_topology'
op|'.'
name|'obj_alternate_context'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'numa_topology'
op|'.'
name|'_save'
op|'('
op|')'
newline|'\n'
dedent|''
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'objects'
op|'.'
name|'InstanceNUMATopology'
op|'.'
name|'delete_by_instance_uuid'
op|'('
nl|'\n'
name|'context'
op|','
name|'self'
op|'.'
name|'uuid'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_save_pci_requests
dedent|''
dedent|''
name|'def'
name|'_save_pci_requests'
op|'('
name|'self'
op|','
name|'context'
op|')'
op|':'
newline|'\n'
comment|'# NOTE(danms): No need for this yet.'
nl|'\n'
indent|' '
name|'pass'
newline|'\n'
nl|'\n'
DECL|member|_save_pci_devices
dedent|''
name|'def'
name|'_save_pci_devices'
op|'('
name|'self'
op|','
name|'context'
op|')'
op|':'
newline|'\n'
comment|'# NOTE(yjiang5): All devices are held by the PCI tracker; only the'
nl|'\n'
comment|'# PCI tracker is permitted to update the DB. All changes to devices'
nl|'\n'
comment|'# made from here will be dropped.'
nl|'\n'
indent|' '
name|'pass'
newline|'\n'
nl|'\n'
DECL|member|_save_flavor
dedent|''
name|'def'
name|'_save_flavor'
op|'('
name|'self'
op|','
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'not'
name|'any'
op|'('
op|'['
name|'x'
name|'in'
name|'self'
op|'.'
name|'obj_what_changed'
op|'('
op|')'
name|'for'
name|'x'
name|'in'
nl|'\n'
op|'('
string|"'flavor'"
op|','
string|"'old_flavor'"
op|','
string|"'new_flavor'"
op|')'
op|']'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
newline|'\n'
comment|'# FIXME(danms): We can do this more smartly by updating this'
nl|'\n'
comment|'# with all the other extra things at the same time'
nl|'\n'
dedent|''
name|'flavor_info'
op|'='
op|'{'
nl|'\n'
string|"'cur'"
op|':'
name|'self'
op|'.'
name|'flavor'
op|'.'
name|'obj_to_primitive'
op|'('
op|')'
op|','
nl|'\n'
string|"'old'"
op|':'
op|'('
name|'self'
op|'.'
name|'old_flavor'
name|'and'
nl|'\n'
name|'self'
op|'.'
name|'old_flavor'
op|'.'
name|'obj_to_primitive'
op|'('
op|')'
name|'or'
name|'None'
op|')'
op|','
nl|'\n'
string|"'new'"
op|':'
op|'('
name|'self'
op|'.'
name|'new_flavor'
name|'and'
nl|'\n'
name|'self'
op|'.'
name|'new_flavor'
op|'.'
name|'obj_to_primitive'
op|'('
op|')'
name|'or'
name|'None'
op|')'
op|','
nl|'\n'
op|'}'
newline|'\n'
name|'db'
op|'.'
name|'instance_extra_update_by_uuid'
op|'('
nl|'\n'
name|'context'
op|','
name|'self'
op|'.'
name|'uuid'
op|','
nl|'\n'
op|'{'
string|"'flavor'"
op|':'
name|'jsonutils'
op|'.'
name|'dumps'
op|'('
name|'flavor_info'
op|')'
op|'}'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'obj_reset_changes'
op|'('
op|'['
string|"'flavor'"
op|','
string|"'old_flavor'"
op|','
string|"'new_flavor'"
op|']'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_save_old_flavor
dedent|''
name|'def'
name|'_save_old_flavor'
op|'('
name|'self'
op|','
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
string|"'old_flavor'"
name|'in'
name|'self'
op|'.'
name|'obj_what_changed'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_save_flavor'
op|'('
name|'context'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_save_new_flavor
dedent|''
dedent|''
name|'def'
name|'_save_new_flavor'
op|'('
name|'self'
op|','
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
string|"'new_flavor'"
name|'in'
name|'self'
op|'.'
name|'obj_what_changed'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_save_flavor'
op|'('
name|'context'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_save_vcpu_model
dedent|''
dedent|''
name|'def'
name|'_save_vcpu_model'
op|'('
name|'self'
op|','
name|'context'
op|')'
op|':'
newline|'\n'
comment|'# TODO(yjiang5): should merge the db accesses for all the extra'
nl|'\n'
comment|'# fields'
nl|'\n'
indent|' '
name|'if'
string|"'vcpu_model'"
name|'in'
name|'self'
op|'.'
name|'obj_what_changed'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'self'
op|'.'
name|'vcpu_model'
op|':'
newline|'\n'
indent|' '
name|'update'
op|'='
name|'jsonutils'
op|'.'
name|'dumps'
op|'('
name|'self'
op|'.'
name|'vcpu_model'
op|'.'
name|'obj_to_primitive'
op|'('
op|')'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'update'
op|'='
name|'None'
newline|'\n'
dedent|''
name|'db'
op|'.'
name|'instance_extra_update_by_uuid'
op|'('
nl|'\n'
name|'context'
op|','
name|'self'
op|'.'
name|'uuid'
op|','
nl|'\n'
op|'{'
string|"'vcpu_model'"
op|':'
name|'update'
op|'}'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_save_ec2_ids
dedent|''
dedent|''
name|'def'
name|'_save_ec2_ids'
op|'('
name|'self'
op|','
name|'context'
op|')'
op|':'
newline|'\n'
comment|'# NOTE(hanlind): Read-only so no need to save this.'
nl|'\n'
indent|' '
name|'pass'
newline|'\n'
nl|'\n'
DECL|member|_save_migration_context
dedent|''
name|'def'
name|'_save_migration_context'
op|'('
name|'self'
op|','
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'self'
op|'.'
name|'migration_context'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'migration_context'
op|'.'
name|'instance_uuid'
op|'='
name|'self'
op|'.'
name|'uuid'
newline|'\n'
name|'with'
name|'self'
op|'.'
name|'migration_context'
op|'.'
name|'obj_alternate_context'
op|'('
name|'context'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'migration_context'
op|'.'
name|'_save'
op|'('
op|')'
newline|'\n'
dedent|''
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'objects'
op|'.'
name|'MigrationContext'
op|'.'
name|'_destroy'
op|'('
name|'context'
op|','
name|'self'
op|'.'
name|'uuid'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_save_keypairs
dedent|''
dedent|''
name|'def'
name|'_save_keypairs'
op|'('
name|'self'
op|','
name|'context'
op|')'
op|':'
newline|'\n'
comment|'# NOTE(danms): Read-only so no need to save this.'
nl|'\n'
indent|' '
name|'pass'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'base'
op|'.'
name|'remotable'
newline|'\n'
DECL|member|save
name|'def'
name|'save'
op|'('
name|'self'
op|','
name|'expected_vm_state'
op|'='
name|'None'
op|','
nl|'\n'
name|'expected_task_state'
op|'='
name|'None'
op|','
name|'admin_state_reset'
op|'='
name|'False'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Save updates to this instance\n\n Column-wise updates will be made based on the result of\n self.obj_what_changed(). If expected_task_state is provided,\n it will be checked against the in-database copy of the\n instance before updates are made.\n\n :param context: Security context\n :param expected_task_state: Optional tuple of valid task states\n for the instance to be in\n :param expected_vm_state: Optional tuple of valid vm states\n for the instance to be in\n :param admin_state_reset: True if admin API is forcing setting\n of task_state/vm_state\n\n """'
newline|'\n'
comment|'# Store this on the class because _cell_name_blocks_sync is useless'
nl|'\n'
comment|'# after the db update call below.'
nl|'\n'
name|'self'
op|'.'
name|'_sync_cells'
op|'='
name|'not'
name|'self'
op|'.'
name|'_cell_name_blocks_sync'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'context'
op|'='
name|'self'
op|'.'
name|'_context'
newline|'\n'
name|'cell_type'
op|'='
name|'cells_opts'
op|'.'
name|'get_cell_type'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'cell_type'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
comment|'# NOTE(comstud): We need to stash a copy of ourselves'
nl|'\n'
comment|'# before any updates are applied. When we call the save'
nl|'\n'
comment|'# methods on nested objects, we will lose any changes to'
nl|'\n'
comment|'# them. But we need to make sure child cells can tell'
nl|'\n'
comment|'# what is changed.'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# We also need to nuke any updates to vm_state and task_state'
nl|'\n'
comment|'# unless admin_state_reset is True. compute cells are'
nl|'\n'
comment|'# authoritative for their view of vm_state and task_state.'
nl|'\n'
indent|' '
name|'stale_instance'
op|'='
name|'self'
op|'.'
name|'obj_clone'
op|'('
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'cells_update_from_api'
op|'='
op|'('
name|'cell_type'
op|'=='
string|"'api'"
name|'and'
name|'self'
op|'.'
name|'cell_name'
name|'and'
nl|'\n'
name|'self'
op|'.'
name|'_sync_cells'
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'cells_update_from_api'
op|':'
newline|'\n'
DECL|function|_handle_cell_update_from_api
indent|' '
name|'def'
name|'_handle_cell_update_from_api'
op|'('
op|')'
op|':'
newline|'\n'
indent|' '
name|'cells_api'
op|'='
name|'cells_rpcapi'
op|'.'
name|'CellsAPI'
op|'('
op|')'
newline|'\n'
name|'cells_api'
op|'.'
name|'instance_update_from_api'
op|'('
name|'context'
op|','
name|'stale_instance'
op|','
nl|'\n'
name|'expected_vm_state'
op|','
nl|'\n'
name|'expected_task_state'
op|','
nl|'\n'
name|'admin_state_reset'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'updates'
op|'='
op|'{'
op|'}'
newline|'\n'
name|'changes'
op|'='
name|'self'
op|'.'
name|'obj_what_changed'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'for'
name|'field'
name|'in'
name|'self'
op|'.'
name|'fields'
op|':'
newline|'\n'
comment|'# NOTE(danms): For object fields, we construct and call a'
nl|'\n'
comment|'# helper method like self._save_$attrname()'
nl|'\n'
indent|' '
name|'if'
op|'('
name|'self'
op|'.'
name|'obj_attr_is_set'
op|'('
name|'field'
op|')'
name|'and'
nl|'\n'
name|'isinstance'
op|'('
name|'self'
op|'.'
name|'fields'
op|'['
name|'field'
op|']'
op|','
name|'fields'
op|'.'
name|'ObjectField'
op|')'
op|')'
op|':'
newline|'\n'
indent|' '
name|'try'
op|':'
newline|'\n'
indent|' '
name|'getattr'
op|'('
name|'self'
op|','
string|"'_save_%s'"
op|'%'
name|'field'
op|')'
op|'('
name|'context'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'AttributeError'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'exception'
op|'('
name|'_LE'
op|'('
string|"'No save handler for %s'"
op|')'
op|','
name|'field'
op|','
nl|'\n'
name|'instance'
op|'='
name|'self'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'db_exc'
op|'.'
name|'DBReferenceError'
name|'as'
name|'exp'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'exp'
op|'.'
name|'key'
op|'!='
string|"'instance_uuid'"
op|':'
newline|'\n'
indent|' '
name|'raise'
newline|'\n'
comment|'# NOTE(melwitt): This will happen if we instance.save()'
nl|'\n'
comment|'# before an instance.create() and FK constraint fails.'
nl|'\n'
comment|'# In practice, this occurs in cells during a delete of'
nl|'\n'
comment|'# an unscheduled instance. Otherwise, it could happen'
nl|'\n'
comment|'# as a result of a bug.'
nl|'\n'
dedent|''
name|'raise'
name|'exception'
op|'.'
name|'InstanceNotFound'
op|'('
name|'instance_id'
op|'='
name|'self'
op|'.'
name|'uuid'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'elif'
name|'field'
name|'in'
name|'changes'
op|':'
newline|'\n'
indent|' '
name|'if'
op|'('
name|'field'
op|'=='
string|"'cell_name'"
name|'and'
name|'self'
op|'['
name|'field'
op|']'
name|'is'
name|'not'
name|'None'
name|'and'
nl|'\n'
name|'self'
op|'['
name|'field'
op|']'
op|'.'
name|'startswith'
op|'('
name|'cells_utils'
op|'.'
name|'BLOCK_SYNC_FLAG'
op|')'
op|')'
op|':'
newline|'\n'
indent|' '
name|'updates'
op|'['
name|'field'
op|']'
op|'='
name|'self'
op|'['
name|'field'
op|']'
op|'.'
name|'replace'
op|'('
nl|'\n'
name|'cells_utils'
op|'.'
name|'BLOCK_SYNC_FLAG'
op|','
string|"''"
op|','
number|'1'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'updates'
op|'['
name|'field'
op|']'
op|'='
name|'self'
op|'['
name|'field'
op|']'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
dedent|''
name|'if'
name|'not'
name|'updates'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'cells_update_from_api'
op|':'
newline|'\n'
indent|' '
name|'_handle_cell_update_from_api'
op|'('
op|')'
newline|'\n'
dedent|''
name|'return'
newline|'\n'
nl|'\n'
comment|'# Cleaned needs to be turned back into an int here'
nl|'\n'
dedent|''
name|'if'
string|"'cleaned'"
name|'in'
name|'updates'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'updates'
op|'['
string|"'cleaned'"
op|']'
op|':'
newline|'\n'
indent|' '
name|'updates'
op|'['
string|"'cleaned'"
op|']'
op|'='
number|'1'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'updates'
op|'['
string|"'cleaned'"
op|']'
op|'='
number|'0'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'if'
name|'expected_task_state'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'updates'
op|'['
string|"'expected_task_state'"
op|']'
op|'='
name|'expected_task_state'
newline|'\n'
dedent|''
name|'if'
name|'expected_vm_state'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'updates'
op|'['
string|"'expected_vm_state'"
op|']'
op|'='
name|'expected_vm_state'
newline|'\n'
nl|'\n'
dedent|''
name|'expected_attrs'
op|'='
op|'['
name|'attr'
name|'for'
name|'attr'
name|'in'
name|'_INSTANCE_OPTIONAL_JOINED_FIELDS'
nl|'\n'
name|'if'
name|'self'
op|'.'
name|'obj_attr_is_set'
op|'('
name|'attr'
op|')'
op|']'
newline|'\n'
name|'if'
string|"'pci_devices'"
name|'in'
name|'expected_attrs'
op|':'
newline|'\n'
comment|"# NOTE(danms): We don't refresh pci_devices on save right now"
nl|'\n'
indent|' '
name|'expected_attrs'
op|'.'
name|'remove'
op|'('
string|"'pci_devices'"
op|')'
newline|'\n'
nl|'\n'
comment|'# NOTE(alaski): We need to pull system_metadata for the'
nl|'\n'
comment|"# notification.send_update() below. If we don't there's a KeyError"
nl|'\n'
comment|'# when it tries to extract the flavor.'
nl|'\n'
comment|'# NOTE(danms): If we have sysmeta, we need flavor since the caller'
nl|'\n'
comment|'# might be expecting flavor information as a result.'
nl|'\n'
dedent|''
name|'if'
string|"'system_metadata'"
name|'not'
name|'in'
name|'expected_attrs'
op|':'
newline|'\n'
indent|' '
name|'expected_attrs'
op|'.'
name|'append'
op|'('
string|"'system_metadata'"
op|')'
newline|'\n'
name|'expected_attrs'
op|'.'
name|'append'
op|'('
string|"'flavor'"
op|')'
newline|'\n'
dedent|''
name|'old_ref'
op|','
name|'inst_ref'
op|'='
name|'db'
op|'.'
name|'instance_update_and_get_original'
op|'('
nl|'\n'
name|'context'
op|','
name|'self'
op|'.'
name|'uuid'
op|','
name|'updates'
op|','
nl|'\n'
name|'columns_to_join'
op|'='
name|'_expected_cols'
op|'('
name|'expected_attrs'
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'_from_db_object'
op|'('
name|'context'
op|','
name|'self'
op|','
name|'inst_ref'
op|','
nl|'\n'
name|'expected_attrs'
op|'='
name|'expected_attrs'
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'cells_update_from_api'
op|':'
newline|'\n'
indent|' '
name|'_handle_cell_update_from_api'
op|'('
op|')'
newline|'\n'
dedent|''
name|'elif'
name|'cell_type'
op|'=='
string|"'compute'"
op|':'
newline|'\n'
indent|' '
name|'if'
name|'self'
op|'.'
name|'_sync_cells'
op|':'
newline|'\n'
indent|' '
name|'cells_api'
op|'='
name|'cells_rpcapi'
op|'.'
name|'CellsAPI'
op|'('
op|')'
newline|'\n'
name|'cells_api'
op|'.'
name|'instance_update_at_top'
op|'('
name|'context'
op|','
name|'stale_instance'
op|')'
newline|'\n'
nl|'\n'
DECL|function|_notify
dedent|''
dedent|''
name|'def'
name|'_notify'
op|'('
op|')'
op|':'
newline|'\n'
comment|'# NOTE(danms): We have to be super careful here not to trigger'
nl|'\n'
comment|'# any lazy-loads that will unmigrate or unbackport something. So,'
nl|'\n'
comment|'# make a copy of the instance for notifications first.'
nl|'\n'
indent|' '
name|'new_ref'
op|'='
name|'self'
op|'.'
name|'obj_clone'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'notifications'
op|'.'
name|'send_update'
op|'('
name|'context'
op|','
name|'old_ref'
op|','
name|'new_ref'
op|')'
newline|'\n'
nl|'\n'
comment|'# NOTE(alaski): If cell synchronization is blocked it means we have'
nl|'\n'
comment|'# already run this block of code in either the parent or child of this'
nl|'\n'
comment|'# cell. Therefore this notification has already been sent.'
nl|'\n'
dedent|''
name|'if'
name|'not'
name|'self'
op|'.'
name|'_sync_cells'
op|':'
newline|'\n'
indent|' '
name|'_notify'
op|'='
name|'lambda'
op|':'
name|'None'
comment|'# noqa: F811'
newline|'\n'
nl|'\n'
dedent|''
name|'_notify'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'self'
op|'.'
name|'obj_reset_changes'
op|'('
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'base'
op|'.'
name|'remotable'
newline|'\n'
DECL|member|refresh
name|'def'
name|'refresh'
op|'('
name|'self'
op|','
name|'use_slave'
op|'='
name|'False'
op|')'
op|':'
newline|'\n'
indent|' '
name|'extra'
op|'='
op|'['
name|'field'
name|'for'
name|'field'
name|'in'
name|'INSTANCE_OPTIONAL_ATTRS'
nl|'\n'
name|'if'
name|'self'
op|'.'
name|'obj_attr_is_set'
op|'('
name|'field'
op|')'
op|']'
newline|'\n'
name|'current'
op|'='
name|'self'
op|'.'
name|'__class__'
op|'.'
name|'get_by_uuid'
op|'('
name|'self'
op|'.'
name|'_context'
op|','
name|'uuid'
op|'='
name|'self'
op|'.'
name|'uuid'
op|','
nl|'\n'
name|'expected_attrs'
op|'='
name|'extra'
op|','
nl|'\n'
name|'use_slave'
op|'='
name|'use_slave'
op|')'
newline|'\n'
comment|'# NOTE(danms): We orphan the instance copy so we do not unexpectedly'
nl|'\n'
comment|'# trigger a lazy-load (which would mean we failed to calculate the'
nl|'\n'
comment|'# expected_attrs properly)'
nl|'\n'
name|'current'
op|'.'
name|'_context'
op|'='
name|'None'
newline|'\n'
nl|'\n'
name|'for'
name|'field'
name|'in'
name|'self'
op|'.'
name|'fields'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'self'
op|'.'
name|'obj_attr_is_set'
op|'('
name|'field'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'field'
op|'=='
string|"'info_cache'"
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'info_cache'
op|'.'
name|'refresh'
op|'('
op|')'
newline|'\n'
dedent|''
name|'elif'
name|'self'
op|'['
name|'field'
op|']'
op|'!='
name|'current'
op|'['
name|'field'
op|']'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'['
name|'field'
op|']'
op|'='
name|'current'
op|'['
name|'field'
op|']'
newline|'\n'
dedent|''
dedent|''
dedent|''
name|'self'
op|'.'
name|'obj_reset_changes'
op|'('
op|')'
newline|'\n'
nl|'\n'
DECL|member|_load_generic
dedent|''
name|'def'
name|'_load_generic'
op|'('
name|'self'
op|','
name|'attrname'
op|')'
op|':'
newline|'\n'
indent|' '
name|'instance'
op|'='
name|'self'
op|'.'
name|'__class__'
op|'.'
name|'get_by_uuid'
op|'('
name|'self'
op|'.'
name|'_context'
op|','
nl|'\n'
name|'uuid'
op|'='
name|'self'
op|'.'
name|'uuid'
op|','
nl|'\n'
name|'expected_attrs'
op|'='
op|'['
name|'attrname'
op|']'
op|')'
newline|'\n'
nl|'\n'
comment|'# NOTE(danms): Never allow us to recursively-load'
nl|'\n'
name|'if'
name|'instance'
op|'.'
name|'obj_attr_is_set'
op|'('
name|'attrname'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'['
name|'attrname'
op|']'
op|'='
name|'instance'
op|'['
name|'attrname'
op|']'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ObjectActionError'
op|'('
nl|'\n'
name|'action'
op|'='
string|"'obj_load_attr'"
op|','
nl|'\n'
name|'reason'
op|'='
string|"'loading %s requires recursion'"
op|'%'
name|'attrname'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_load_fault
dedent|''
dedent|''
name|'def'
name|'_load_fault'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'fault'
op|'='
name|'objects'
op|'.'
name|'InstanceFault'
op|'.'
name|'get_latest_for_instance'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'_context'
op|','
name|'self'
op|'.'
name|'uuid'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_load_numa_topology
dedent|''
name|'def'
name|'_load_numa_topology'
op|'('
name|'self'
op|','
name|'db_topology'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'db_topology'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'numa_topology'
op|'='
name|'objects'
op|'.'
name|'InstanceNUMATopology'
op|'.'
name|'obj_from_db_obj'
op|'('
name|'self'
op|'.'
name|'uuid'
op|','
nl|'\n'
name|'db_topology'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'try'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'numa_topology'
op|'='
name|'objects'
op|'.'
name|'InstanceNUMATopology'
op|'.'
name|'get_by_instance_uuid'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'_context'
op|','
name|'self'
op|'.'
name|'uuid'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'exception'
op|'.'
name|'NumaTopologyNotFound'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'numa_topology'
op|'='
name|'None'
newline|'\n'
nl|'\n'
DECL|member|_load_pci_requests
dedent|''
dedent|''
dedent|''
name|'def'
name|'_load_pci_requests'
op|'('
name|'self'
op|','
name|'db_requests'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
comment|'# FIXME: also do this if none!'
nl|'\n'
indent|' '
name|'if'
name|'db_requests'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'pci_requests'
op|'='
name|'objects'
op|'.'
name|'InstancePCIRequests'
op|'.'
name|'obj_from_db'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'_context'
op|','
name|'self'
op|'.'
name|'uuid'
op|','
name|'db_requests'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'pci_requests'
op|'='
name|'objects'
op|'.'
name|'InstancePCIRequests'
op|'.'
name|'get_by_instance_uuid'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'_context'
op|','
name|'self'
op|'.'
name|'uuid'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_load_flavor
dedent|''
dedent|''
name|'def'
name|'_load_flavor'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'instance'
op|'='
name|'self'
op|'.'
name|'__class__'
op|'.'
name|'get_by_uuid'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'_context'
op|','
name|'uuid'
op|'='
name|'self'
op|'.'
name|'uuid'
op|','
nl|'\n'
name|'expected_attrs'
op|'='
op|'['
string|"'flavor'"
op|','
string|"'system_metadata'"
op|']'
op|')'
newline|'\n'
nl|'\n'
comment|"# NOTE(danms): Orphan the instance to make sure we don't lazy-load"
nl|'\n'
comment|'# anything below'
nl|'\n'
name|'instance'
op|'.'
name|'_context'
op|'='
name|'None'
newline|'\n'
name|'self'
op|'.'
name|'flavor'
op|'='
name|'instance'
op|'.'
name|'flavor'
newline|'\n'
name|'self'
op|'.'
name|'old_flavor'
op|'='
name|'instance'
op|'.'
name|'old_flavor'
newline|'\n'
name|'self'
op|'.'
name|'new_flavor'
op|'='
name|'instance'
op|'.'
name|'new_flavor'
newline|'\n'
nl|'\n'
comment|'# NOTE(danms): The query above may have migrated the flavor from'
nl|'\n'
comment|'# system_metadata. Since we have it anyway, go ahead and refresh'
nl|'\n'
comment|'# our system_metadata from it so that a save will be accurate.'
nl|'\n'
name|'instance'
op|'.'
name|'system_metadata'
op|'.'
name|'update'
op|'('
name|'self'
op|'.'
name|'get'
op|'('
string|"'system_metadata'"
op|','
op|'{'
op|'}'
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'system_metadata'
op|'='
name|'instance'
op|'.'
name|'system_metadata'
newline|'\n'
nl|'\n'
DECL|member|_load_vcpu_model
dedent|''
name|'def'
name|'_load_vcpu_model'
op|'('
name|'self'
op|','
name|'db_vcpu_model'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'db_vcpu_model'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'vcpu_model'
op|'='
name|'objects'
op|'.'
name|'VirtCPUModel'
op|'.'
name|'get_by_instance_uuid'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'_context'
op|','
name|'self'
op|'.'
name|'uuid'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'db_vcpu_model'
op|'='
name|'jsonutils'
op|'.'
name|'loads'
op|'('
name|'db_vcpu_model'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'vcpu_model'
op|'='
name|'objects'
op|'.'
name|'VirtCPUModel'
op|'.'
name|'obj_from_primitive'
op|'('
nl|'\n'
name|'db_vcpu_model'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_load_ec2_ids
dedent|''
dedent|''
name|'def'
name|'_load_ec2_ids'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'ec2_ids'
op|'='
name|'objects'
op|'.'
name|'EC2Ids'
op|'.'
name|'get_by_instance'
op|'('
name|'self'
op|'.'
name|'_context'
op|','
name|'self'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_load_security_groups
dedent|''
name|'def'
name|'_load_security_groups'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'security_groups'
op|'='
name|'objects'
op|'.'
name|'SecurityGroupList'
op|'.'
name|'get_by_instance'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'_context'
op|','
name|'self'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_load_pci_devices
dedent|''
name|'def'
name|'_load_pci_devices'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'pci_devices'
op|'='
name|'objects'
op|'.'
name|'PciDeviceList'
op|'.'
name|'get_by_instance_uuid'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'_context'
op|','
name|'self'
op|'.'
name|'uuid'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_load_migration_context
dedent|''
name|'def'
name|'_load_migration_context'
op|'('
name|'self'
op|','
name|'db_context'
op|'='
name|'_NO_DATA_SENTINEL'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'db_context'
name|'is'
name|'_NO_DATA_SENTINEL'
op|':'
newline|'\n'
indent|' '
name|'try'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'migration_context'
op|'='
op|'('
nl|'\n'
name|'objects'
op|'.'
name|'MigrationContext'
op|'.'
name|'get_by_instance_uuid'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'_context'
op|','
name|'self'
op|'.'
name|'uuid'
op|')'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'exception'
op|'.'
name|'MigrationContextNotFound'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'migration_context'
op|'='
name|'None'
newline|'\n'
dedent|''
dedent|''
name|'elif'
name|'db_context'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'migration_context'
op|'='
name|'None'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'migration_context'
op|'='
name|'objects'
op|'.'
name|'MigrationContext'
op|'.'
name|'obj_from_db_obj'
op|'('
nl|'\n'
name|'db_context'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_load_keypairs
dedent|''
dedent|''
name|'def'
name|'_load_keypairs'
op|'('
name|'self'
op|','
name|'db_keypairs'
op|'='
name|'_NO_DATA_SENTINEL'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'db_keypairs'
name|'is'
name|'_NO_DATA_SENTINEL'
op|':'
newline|'\n'
indent|' '
name|'inst'
op|'='
name|'objects'
op|'.'
name|'Instance'
op|'.'
name|'get_by_uuid'
op|'('
name|'self'
op|'.'
name|'_context'
op|','
name|'self'
op|'.'
name|'uuid'
op|','
nl|'\n'
name|'expected_attrs'
op|'='
op|'['
string|"'keypairs'"
op|']'
op|')'
newline|'\n'
name|'if'
string|"'keypairs'"
name|'in'
name|'inst'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'keypairs'
op|'='
name|'inst'
op|'.'
name|'keypairs'
newline|'\n'
name|'self'
op|'.'
name|'keypairs'
op|'.'
name|'obj_reset_changes'
op|'('
name|'recursive'
op|'='
name|'True'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'obj_reset_changes'
op|'('
op|'['
string|"'keypairs'"
op|']'
op|')'
newline|'\n'
name|'return'
newline|'\n'
nl|'\n'
comment|'# NOTE(danms): We need to load from the old location by name'
nl|'\n'
comment|"# if we don't have them in extra. Only do this from the main"
nl|'\n'
comment|'# database as instances were created with keypairs in extra'
nl|'\n'
comment|'# before keypairs were moved to the api database.'
nl|'\n'
dedent|''
name|'self'
op|'.'
name|'keypairs'
op|'='
name|'objects'
op|'.'
name|'KeyPairList'
op|'('
name|'objects'
op|'='
op|'['
op|']'
op|')'
newline|'\n'
name|'try'
op|':'
newline|'\n'
indent|' '
name|'key'
op|'='
name|'objects'
op|'.'
name|'KeyPair'
op|'.'
name|'get_by_name'
op|'('
name|'self'
op|'.'
name|'_context'
op|','
nl|'\n'
name|'self'
op|'.'
name|'user_id'
op|','
nl|'\n'
name|'self'
op|'.'
name|'key_name'
op|','
nl|'\n'
name|'localonly'
op|'='
name|'True'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'keypairs'
op|'.'
name|'objects'
op|'.'
name|'append'
op|'('
name|'key'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'exception'
op|'.'
name|'KeypairNotFound'
op|':'
newline|'\n'
indent|' '
name|'pass'
newline|'\n'
comment|'# NOTE(danms): If we loaded from legacy, we leave the keypairs'
nl|'\n'
comment|'# attribute dirty in hopes someone else will save it for us'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
name|'elif'
name|'db_keypairs'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'keypairs'
op|'='
name|'objects'
op|'.'
name|'KeyPairList'
op|'.'
name|'obj_from_primitive'
op|'('
nl|'\n'
name|'jsonutils'
op|'.'
name|'loads'
op|'('
name|'db_keypairs'
op|')'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'obj_reset_changes'
op|'('
op|'['
string|"'keypairs'"
op|']'
op|')'
newline|'\n'
nl|'\n'
DECL|member|apply_migration_context
dedent|''
dedent|''
name|'def'
name|'apply_migration_context'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'self'
op|'.'
name|'migration_context'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'numa_topology'
op|'='
name|'self'
op|'.'
name|'migration_context'
op|'.'
name|'new_numa_topology'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'debug'
op|'('
string|'"Trying to apply a migration context that does not "'
nl|'\n'
string|'"seem to be set for this instance"'
op|','
name|'instance'
op|'='
name|'self'
op|')'
newline|'\n'
nl|'\n'
DECL|member|revert_migration_context
dedent|''
dedent|''
name|'def'
name|'revert_migration_context'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'self'
op|'.'
name|'migration_context'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'numa_topology'
op|'='
name|'self'
op|'.'
name|'migration_context'
op|'.'
name|'old_numa_topology'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'debug'
op|'('
string|'"Trying to revert a migration context that does not "'
nl|'\n'
string|'"seem to be set for this instance"'
op|','
name|'instance'
op|'='
name|'self'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'contextlib'
op|'.'
name|'contextmanager'
newline|'\n'
DECL|member|mutated_migration_context
name|'def'
name|'mutated_migration_context'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Context manager to temporarily apply the migration context.\n\n Calling .save() from within the context manager means that the mutated\n context will be saved which can cause incorrect resource tracking, and\n should be avoided.\n """'
newline|'\n'
name|'current_numa_topo'
op|'='
name|'self'
op|'.'
name|'numa_topology'
newline|'\n'
name|'self'
op|'.'
name|'apply_migration_context'
op|'('
op|')'
newline|'\n'
name|'try'
op|':'
newline|'\n'
indent|' '
name|'yield'
newline|'\n'
dedent|''
name|'finally'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'numa_topology'
op|'='
name|'current_numa_topo'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'base'
op|'.'
name|'remotable'
newline|'\n'
DECL|member|drop_migration_context
name|'def'
name|'drop_migration_context'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'self'
op|'.'
name|'migration_context'
op|':'
newline|'\n'
indent|' '
name|'objects'
op|'.'
name|'MigrationContext'
op|'.'
name|'_destroy'
op|'('
name|'self'
op|'.'
name|'_context'
op|','
name|'self'
op|'.'
name|'uuid'
op|')'
newline|'\n'
name|'self'
op|'.'
name|'migration_context'
op|'='
name|'None'
newline|'\n'
nl|'\n'
DECL|member|clear_numa_topology
dedent|''
dedent|''
name|'def'
name|'clear_numa_topology'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'numa_topology'
op|'='
name|'self'
op|'.'
name|'numa_topology'
newline|'\n'
name|'if'
name|'numa_topology'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'numa_topology'
op|'='
name|'numa_topology'
op|'.'
name|'clear_host_pinning'
op|'('
op|')'
newline|'\n'
nl|'\n'
DECL|member|obj_load_attr
dedent|''
dedent|''
name|'def'
name|'obj_load_attr'
op|'('
name|'self'
op|','
name|'attrname'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'attrname'
name|'not'
name|'in'
name|'INSTANCE_OPTIONAL_ATTRS'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'ObjectActionError'
op|'('
nl|'\n'
name|'action'
op|'='
string|"'obj_load_attr'"
op|','
nl|'\n'
name|'reason'
op|'='
string|"'attribute %s not lazy-loadable'"
op|'%'
name|'attrname'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'if'
name|'not'
name|'self'
op|'.'
name|'_context'
op|':'
newline|'\n'
indent|' '
name|'raise'
name|'exception'
op|'.'
name|'OrphanedObjectError'
op|'('
name|'method'
op|'='
string|"'obj_load_attr'"
op|','
nl|'\n'
name|'objtype'
op|'='
name|'self'
op|'.'
name|'obj_name'
op|'('
op|')'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'LOG'
op|'.'
name|'debug'
op|'('
string|'"Lazy-loading \'%(attr)s\' on %(name)s uuid %(uuid)s"'
op|','
nl|'\n'
op|'{'
string|"'attr'"
op|':'
name|'attrname'
op|','
nl|'\n'
string|"'name'"
op|':'
name|'self'
op|'.'
name|'obj_name'
op|'('
op|')'
op|','
nl|'\n'
string|"'uuid'"
op|':'
name|'self'
op|'.'
name|'uuid'
op|','
nl|'\n'
op|'}'
op|')'
newline|'\n'
nl|'\n'
comment|'# NOTE(danms): We handle some fields differently here so that we'
nl|'\n'
comment|'# can be more efficient'
nl|'\n'
name|'if'
name|'attrname'
op|'=='
string|"'fault'"
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_load_fault'
op|'('
op|')'
newline|'\n'
dedent|''
name|'elif'
name|'attrname'
op|'=='
string|"'numa_topology'"
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_load_numa_topology'
op|'('
op|')'
newline|'\n'
dedent|''
name|'elif'
name|'attrname'
op|'=='
string|"'pci_requests'"
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_load_pci_requests'
op|'('
op|')'
newline|'\n'
dedent|''
name|'elif'
name|'attrname'
op|'=='
string|"'vcpu_model'"
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_load_vcpu_model'
op|'('
op|')'
newline|'\n'
dedent|''
name|'elif'
name|'attrname'
op|'=='
string|"'ec2_ids'"
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_load_ec2_ids'
op|'('
op|')'
newline|'\n'
dedent|''
name|'elif'
name|'attrname'
op|'=='
string|"'migration_context'"
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_load_migration_context'
op|'('
op|')'
newline|'\n'
dedent|''
name|'elif'
name|'attrname'
op|'=='
string|"'keypairs'"
op|':'
newline|'\n'
comment|'# NOTE(danms): Let keypairs control its own destiny for'
nl|'\n'
comment|'# resetting changes.'
nl|'\n'
indent|' '
name|'return'
name|'self'
op|'.'
name|'_load_keypairs'
op|'('
op|')'
newline|'\n'
dedent|''
name|'elif'
name|'attrname'
op|'=='
string|"'security_groups'"
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_load_security_groups'
op|'('
op|')'
newline|'\n'
dedent|''
name|'elif'
name|'attrname'
op|'=='
string|"'pci_devices'"
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_load_pci_devices'
op|'('
op|')'
newline|'\n'
dedent|''
name|'elif'
string|"'flavor'"
name|'in'
name|'attrname'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_load_flavor'
op|'('
op|')'
newline|'\n'
dedent|''
name|'elif'
name|'attrname'
op|'=='
string|"'services'"
name|'and'
name|'self'
op|'.'
name|'deleted'
op|':'
newline|'\n'
comment|'# NOTE(mriedem): The join in the data model for instances.services'
nl|'\n'
comment|'# filters on instances.deleted == 0, so if the instance is deleted'
nl|'\n'
comment|"# don't attempt to even load services since we'll fail."
nl|'\n'
indent|' '
name|'self'
op|'.'
name|'services'
op|'='
name|'objects'
op|'.'
name|'ServiceList'
op|'('
name|'self'
op|'.'
name|'_context'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
comment|'# FIXME(comstud): This should be optimized to only load the attr.'
nl|'\n'
indent|' '
name|'self'
op|'.'
name|'_load_generic'
op|'('
name|'attrname'
op|')'
newline|'\n'
dedent|''
name|'self'
op|'.'
name|'obj_reset_changes'
op|'('
op|'['
name|'attrname'
op|']'
op|')'
newline|'\n'
nl|'\n'
DECL|member|get_flavor
dedent|''
name|'def'
name|'get_flavor'
op|'('
name|'self'
op|','
name|'namespace'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'prefix'
op|'='
op|'('
string|"'%s_'"
op|'%'
name|'namespace'
op|')'
name|'if'
name|'namespace'
name|'is'
name|'not'
name|'None'
name|'else'
string|"''"
newline|'\n'
name|'attr'
op|'='
string|"'%sflavor'"
op|'%'
name|'prefix'
newline|'\n'
name|'try'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'getattr'
op|'('
name|'self'
op|','
name|'attr'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'exception'
op|'.'
name|'FlavorNotFound'
op|':'
newline|'\n'
comment|"# NOTE(danms): This only happens in the case where we don't"
nl|'\n'
comment|'# have flavor information in sysmeta or extra, and doing'
nl|'\n'
comment|'# this triggers a lookup based on our instance_type_id for'
nl|'\n'
comment|'# (very) legacy instances. That legacy code expects a None here,'
nl|'\n'
comment|'# so emulate it for this helper, even though the actual attribute'
nl|'\n'
comment|'# is not nullable.'
nl|'\n'
indent|' '
name|'return'
name|'None'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'base'
op|'.'
name|'remotable'
newline|'\n'
DECL|member|delete_metadata_key
name|'def'
name|'delete_metadata_key'
op|'('
name|'self'
op|','
name|'key'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Optimized metadata delete method.\n\n This provides a more efficient way to delete a single metadata\n key, instead of just calling instance.save(). This should be called\n with the key still present in self.metadata, which it will update\n after completion.\n """'
newline|'\n'
name|'db'
op|'.'
name|'instance_metadata_delete'
op|'('
name|'self'
op|'.'
name|'_context'
op|','
name|'self'
op|'.'
name|'uuid'
op|','
name|'key'
op|')'
newline|'\n'
name|'md_was_changed'
op|'='
string|"'metadata'"
name|'in'
name|'self'
op|'.'
name|'obj_what_changed'
op|'('
op|')'
newline|'\n'
name|'del'
name|'self'
op|'.'
name|'metadata'
op|'['
name|'key'
op|']'
newline|'\n'
name|'self'
op|'.'
name|'_orig_metadata'
op|'.'
name|'pop'
op|'('
name|'key'
op|','
name|'None'
op|')'
newline|'\n'
name|'notifications'
op|'.'
name|'send_update'
op|'('
name|'self'
op|'.'
name|'_context'
op|','
name|'self'
op|','
name|'self'
op|')'
newline|'\n'
name|'if'
name|'not'
name|'md_was_changed'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'obj_reset_changes'
op|'('
op|'['
string|"'metadata'"
op|']'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_cell_name_blocks_sync
dedent|''
dedent|''
name|'def'
name|'_cell_name_blocks_sync'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
op|'('
name|'self'
op|'.'
name|'obj_attr_is_set'
op|'('
string|"'cell_name'"
op|')'
name|'and'
nl|'\n'
name|'self'
op|'.'
name|'cell_name'
name|'is'
name|'not'
name|'None'
name|'and'
nl|'\n'
name|'self'
op|'.'
name|'cell_name'
op|'.'
name|'startswith'
op|'('
name|'cells_utils'
op|'.'
name|'BLOCK_SYNC_FLAG'
op|')'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'True'
newline|'\n'
dedent|''
name|'return'
name|'False'
newline|'\n'
nl|'\n'
DECL|member|_normalize_cell_name
dedent|''
name|'def'
name|'_normalize_cell_name'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Undo skip_cell_sync()\'s cell_name modification if applied"""'
newline|'\n'
nl|'\n'
name|'if'
name|'not'
name|'self'
op|'.'
name|'obj_attr_is_set'
op|'('
string|"'cell_name'"
op|')'
name|'or'
name|'self'
op|'.'
name|'cell_name'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'return'
newline|'\n'
dedent|''
name|'cn_changed'
op|'='
string|"'cell_name'"
name|'in'
name|'self'
op|'.'
name|'obj_what_changed'
op|'('
op|')'
newline|'\n'
name|'if'
name|'self'
op|'.'
name|'cell_name'
op|'.'
name|'startswith'
op|'('
name|'cells_utils'
op|'.'
name|'BLOCK_SYNC_FLAG'
op|')'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'cell_name'
op|'='
name|'self'
op|'.'
name|'cell_name'
op|'.'
name|'replace'
op|'('
nl|'\n'
name|'cells_utils'
op|'.'
name|'BLOCK_SYNC_FLAG'
op|','
string|"''"
op|','
number|'1'
op|')'
newline|'\n'
comment|'# cell_name is not normally an empty string, this means it was None'
nl|'\n'
comment|'# or unset before cells_utils.BLOCK_SYNC_FLAG was applied.'
nl|'\n'
name|'if'
name|'len'
op|'('
name|'self'
op|'.'
name|'cell_name'
op|')'
op|'=='
number|'0'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'cell_name'
op|'='
name|'None'
newline|'\n'
dedent|''
dedent|''
name|'if'
name|'not'
name|'cn_changed'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'obj_reset_changes'
op|'('
op|'['
string|"'cell_name'"
op|']'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'contextlib'
op|'.'
name|'contextmanager'
newline|'\n'
DECL|member|skip_cells_sync
name|'def'
name|'skip_cells_sync'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Context manager to save an instance without syncing cells.\n\n Temporarily disables the cells syncing logic, if enabled. This should\n only be used when saving an instance that has been passed down/up from\n another cell in order to avoid passing it back to the originator to be\n re-saved.\n """'
newline|'\n'
name|'cn_changed'
op|'='
string|"'cell_name'"
name|'in'
name|'self'
op|'.'
name|'obj_what_changed'
op|'('
op|')'
newline|'\n'
name|'if'
name|'not'
name|'self'
op|'.'
name|'obj_attr_is_set'
op|'('
string|"'cell_name'"
op|')'
name|'or'
name|'self'
op|'.'
name|'cell_name'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'cell_name'
op|'='
string|"''"
newline|'\n'
dedent|''
name|'self'
op|'.'
name|'cell_name'
op|'='
string|"'%s%s'"
op|'%'
op|'('
name|'cells_utils'
op|'.'
name|'BLOCK_SYNC_FLAG'
op|','
name|'self'
op|'.'
name|'cell_name'
op|')'
newline|'\n'
name|'if'
name|'not'
name|'cn_changed'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'obj_reset_changes'
op|'('
op|'['
string|"'cell_name'"
op|']'
op|')'
newline|'\n'
dedent|''
name|'try'
op|':'
newline|'\n'
indent|' '
name|'yield'
newline|'\n'
dedent|''
name|'finally'
op|':'
newline|'\n'
indent|' '
name|'self'
op|'.'
name|'_normalize_cell_name'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|_make_instance_list
dedent|''
dedent|''
dedent|''
name|'def'
name|'_make_instance_list'
op|'('
name|'context'
op|','
name|'inst_list'
op|','
name|'db_inst_list'
op|','
name|'expected_attrs'
op|')'
op|':'
newline|'\n'
indent|' '
name|'get_fault'
op|'='
name|'expected_attrs'
name|'and'
string|"'fault'"
name|'in'
name|'expected_attrs'
newline|'\n'
name|'inst_faults'
op|'='
op|'{'
op|'}'
newline|'\n'
name|'if'
name|'get_fault'
op|':'
newline|'\n'
comment|'# Build an instance_uuid:latest-fault mapping'
nl|'\n'
indent|' '
name|'expected_attrs'
op|'.'
name|'remove'
op|'('
string|"'fault'"
op|')'
newline|'\n'
name|'instance_uuids'
op|'='
op|'['
name|'inst'
op|'['
string|"'uuid'"
op|']'
name|'for'
name|'inst'
name|'in'
name|'db_inst_list'
op|']'
newline|'\n'
name|'faults'
op|'='
name|'objects'
op|'.'
name|'InstanceFaultList'
op|'.'
name|'get_by_instance_uuids'
op|'('
nl|'\n'
name|'context'
op|','
name|'instance_uuids'
op|')'
newline|'\n'
name|'for'
name|'fault'
name|'in'
name|'faults'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'fault'
op|'.'
name|'instance_uuid'
name|'not'
name|'in'
name|'inst_faults'
op|':'
newline|'\n'
indent|' '
name|'inst_faults'
op|'['
name|'fault'
op|'.'
name|'instance_uuid'
op|']'
op|'='
name|'fault'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
dedent|''
name|'inst_cls'
op|'='
name|'objects'
op|'.'
name|'Instance'
newline|'\n'
nl|'\n'
name|'inst_list'
op|'.'
name|'objects'
op|'='
op|'['
op|']'
newline|'\n'
name|'for'
name|'db_inst'
name|'in'
name|'db_inst_list'
op|':'
newline|'\n'
indent|' '
name|'inst_obj'
op|'='
name|'inst_cls'
op|'.'
name|'_from_db_object'
op|'('
nl|'\n'
name|'context'
op|','
name|'inst_cls'
op|'('
name|'context'
op|')'
op|','
name|'db_inst'
op|','
nl|'\n'
name|'expected_attrs'
op|'='
name|'expected_attrs'
op|')'
newline|'\n'
name|'if'
name|'get_fault'
op|':'
newline|'\n'
indent|' '
name|'inst_obj'
op|'.'
name|'fault'
op|'='
name|'inst_faults'
op|'.'
name|'get'
op|'('
name|'inst_obj'
op|'.'
name|'uuid'
op|','
name|'None'
op|')'
newline|'\n'
dedent|''
name|'inst_list'
op|'.'
name|'objects'
op|'.'
name|'append'
op|'('
name|'inst_obj'
op|')'
newline|'\n'
dedent|''
name|'inst_list'
op|'.'
name|'obj_reset_changes'
op|'('
op|')'
newline|'\n'
name|'return'
name|'inst_list'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
op|'@'
name|'base'
op|'.'
name|'NovaObjectRegistry'
op|'.'
name|'register'
newline|'\n'
DECL|class|InstanceList
name|'class'
name|'InstanceList'
op|'('
name|'base'
op|'.'
name|'ObjectListBase'
op|','
name|'base'
op|'.'
name|'NovaObject'
op|')'
op|':'
newline|'\n'
comment|'# Version 2.0: Initial Version'
nl|'\n'
DECL|variable|VERSION
indent|' '
name|'VERSION'
op|'='
string|"'2.0'"
newline|'\n'
nl|'\n'
DECL|variable|fields
name|'fields'
op|'='
op|'{'
nl|'\n'
string|"'objects'"
op|':'
name|'fields'
op|'.'
name|'ListOfObjectsField'
op|'('
string|"'Instance'"
op|')'
op|','
nl|'\n'
op|'}'
newline|'\n'
nl|'\n'
op|'@'
name|'classmethod'
newline|'\n'
op|'@'
name|'db'
op|'.'
name|'select_db_reader_mode'
newline|'\n'
DECL|member|_get_by_filters_impl
name|'def'
name|'_get_by_filters_impl'
op|'('
name|'cls'
op|','
name|'context'
op|','
name|'filters'
op|','
nl|'\n'
name|'sort_key'
op|'='
string|"'created_at'"
op|','
name|'sort_dir'
op|'='
string|"'desc'"
op|','
name|'limit'
op|'='
name|'None'
op|','
nl|'\n'
name|'marker'
op|'='
name|'None'
op|','
name|'expected_attrs'
op|'='
name|'None'
op|','
name|'use_slave'
op|'='
name|'False'
op|','
nl|'\n'
name|'sort_keys'
op|'='
name|'None'
op|','
name|'sort_dirs'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'sort_keys'
name|'or'
name|'sort_dirs'
op|':'
newline|'\n'
indent|' '
name|'db_inst_list'
op|'='
name|'db'
op|'.'
name|'instance_get_all_by_filters_sort'
op|'('
nl|'\n'
name|'context'
op|','
name|'filters'
op|','
name|'limit'
op|'='
name|'limit'
op|','
name|'marker'
op|'='
name|'marker'
op|','
nl|'\n'
name|'columns_to_join'
op|'='
name|'_expected_cols'
op|'('
name|'expected_attrs'
op|')'
op|','
nl|'\n'
name|'sort_keys'
op|'='
name|'sort_keys'
op|','
name|'sort_dirs'
op|'='
name|'sort_dirs'
op|')'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
indent|' '
name|'db_inst_list'
op|'='
name|'db'
op|'.'
name|'instance_get_all_by_filters'
op|'('
nl|'\n'
name|'context'
op|','
name|'filters'
op|','
name|'sort_key'
op|','
name|'sort_dir'
op|','
name|'limit'
op|'='
name|'limit'
op|','
nl|'\n'
name|'marker'
op|'='
name|'marker'
op|','
name|'columns_to_join'
op|'='
name|'_expected_cols'
op|'('
name|'expected_attrs'
op|')'
op|')'
newline|'\n'
dedent|''
name|'return'
name|'_make_instance_list'
op|'('
name|'context'
op|','
name|'cls'
op|'('
op|')'
op|','
name|'db_inst_list'
op|','
nl|'\n'
name|'expected_attrs'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'base'
op|'.'
name|'remotable_classmethod'
newline|'\n'
DECL|member|get_by_filters
name|'def'
name|'get_by_filters'
op|'('
name|'cls'
op|','
name|'context'
op|','
name|'filters'
op|','
nl|'\n'
name|'sort_key'
op|'='
string|"'created_at'"
op|','
name|'sort_dir'
op|'='
string|"'desc'"
op|','
name|'limit'
op|'='
name|'None'
op|','
nl|'\n'
name|'marker'
op|'='
name|'None'
op|','
name|'expected_attrs'
op|'='
name|'None'
op|','
name|'use_slave'
op|'='
name|'False'
op|','
nl|'\n'
name|'sort_keys'
op|'='
name|'None'
op|','
name|'sort_dirs'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'cls'
op|'.'
name|'_get_by_filters_impl'
op|'('
nl|'\n'
name|'context'
op|','
name|'filters'
op|','
name|'sort_key'
op|'='
name|'sort_key'
op|','
name|'sort_dir'
op|'='
name|'sort_dir'
op|','
nl|'\n'
name|'limit'
op|'='
name|'limit'
op|','
name|'marker'
op|'='
name|'marker'
op|','
name|'expected_attrs'
op|'='
name|'expected_attrs'
op|','
nl|'\n'
name|'use_slave'
op|'='
name|'use_slave'
op|','
name|'sort_keys'
op|'='
name|'sort_keys'
op|','
name|'sort_dirs'
op|'='
name|'sort_dirs'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'staticmethod'
newline|'\n'
op|'@'
name|'db'
op|'.'
name|'select_db_reader_mode'
newline|'\n'
DECL|member|_db_instance_get_all_by_host
name|'def'
name|'_db_instance_get_all_by_host'
op|'('
name|'context'
op|','
name|'host'
op|','
name|'columns_to_join'
op|','
nl|'\n'
name|'use_slave'
op|'='
name|'False'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'db'
op|'.'
name|'instance_get_all_by_host'
op|'('
name|'context'
op|','
name|'host'
op|','
nl|'\n'
name|'columns_to_join'
op|'='
name|'columns_to_join'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'base'
op|'.'
name|'remotable_classmethod'
newline|'\n'
DECL|member|get_by_host
name|'def'
name|'get_by_host'
op|'('
name|'cls'
op|','
name|'context'
op|','
name|'host'
op|','
name|'expected_attrs'
op|'='
name|'None'
op|','
name|'use_slave'
op|'='
name|'False'
op|')'
op|':'
newline|'\n'
indent|' '
name|'db_inst_list'
op|'='
name|'cls'
op|'.'
name|'_db_instance_get_all_by_host'
op|'('
nl|'\n'
name|'context'
op|','
name|'host'
op|','
name|'columns_to_join'
op|'='
name|'_expected_cols'
op|'('
name|'expected_attrs'
op|')'
op|','
nl|'\n'
name|'use_slave'
op|'='
name|'use_slave'
op|')'
newline|'\n'
name|'return'
name|'_make_instance_list'
op|'('
name|'context'
op|','
name|'cls'
op|'('
op|')'
op|','
name|'db_inst_list'
op|','
nl|'\n'
name|'expected_attrs'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'base'
op|'.'
name|'remotable_classmethod'
newline|'\n'
DECL|member|get_by_host_and_node
name|'def'
name|'get_by_host_and_node'
op|'('
name|'cls'
op|','
name|'context'
op|','
name|'host'
op|','
name|'node'
op|','
name|'expected_attrs'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'db_inst_list'
op|'='
name|'db'
op|'.'
name|'instance_get_all_by_host_and_node'
op|'('
nl|'\n'
name|'context'
op|','
name|'host'
op|','
name|'node'
op|','
nl|'\n'
name|'columns_to_join'
op|'='
name|'_expected_cols'
op|'('
name|'expected_attrs'
op|')'
op|')'
newline|'\n'
name|'return'
name|'_make_instance_list'
op|'('
name|'context'
op|','
name|'cls'
op|'('
op|')'
op|','
name|'db_inst_list'
op|','
nl|'\n'
name|'expected_attrs'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'base'
op|'.'
name|'remotable_classmethod'
newline|'\n'
DECL|member|get_by_host_and_not_type
name|'def'
name|'get_by_host_and_not_type'
op|'('
name|'cls'
op|','
name|'context'
op|','
name|'host'
op|','
name|'type_id'
op|'='
name|'None'
op|','
nl|'\n'
name|'expected_attrs'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'db_inst_list'
op|'='
name|'db'
op|'.'
name|'instance_get_all_by_host_and_not_type'
op|'('
nl|'\n'
name|'context'
op|','
name|'host'
op|','
name|'type_id'
op|'='
name|'type_id'
op|')'
newline|'\n'
name|'return'
name|'_make_instance_list'
op|'('
name|'context'
op|','
name|'cls'
op|'('
op|')'
op|','
name|'db_inst_list'
op|','
nl|'\n'
name|'expected_attrs'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'base'
op|'.'
name|'remotable_classmethod'
newline|'\n'
DECL|member|get_all
name|'def'
name|'get_all'
op|'('
name|'cls'
op|','
name|'context'
op|','
name|'expected_attrs'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Returns all instances on all nodes."""'
newline|'\n'
name|'db_instances'
op|'='
name|'db'
op|'.'
name|'instance_get_all'
op|'('
nl|'\n'
name|'context'
op|','
name|'columns_to_join'
op|'='
name|'_expected_cols'
op|'('
name|'expected_attrs'
op|')'
op|')'
newline|'\n'
name|'return'
name|'_make_instance_list'
op|'('
name|'context'
op|','
name|'cls'
op|'('
op|')'
op|','
name|'db_instances'
op|','
nl|'\n'
name|'expected_attrs'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'base'
op|'.'
name|'remotable_classmethod'
newline|'\n'
DECL|member|get_hung_in_rebooting
name|'def'
name|'get_hung_in_rebooting'
op|'('
name|'cls'
op|','
name|'context'
op|','
name|'reboot_window'
op|','
nl|'\n'
name|'expected_attrs'
op|'='
name|'None'
op|')'
op|':'
newline|'\n'
indent|' '
name|'db_inst_list'
op|'='
name|'db'
op|'.'
name|'instance_get_all_hung_in_rebooting'
op|'('
name|'context'
op|','
nl|'\n'
name|'reboot_window'
op|')'
newline|'\n'
name|'return'
name|'_make_instance_list'
op|'('
name|'context'
op|','
name|'cls'
op|'('
op|')'
op|','
name|'db_inst_list'
op|','
nl|'\n'
name|'expected_attrs'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'staticmethod'
newline|'\n'
op|'@'
name|'db'
op|'.'
name|'select_db_reader_mode'
newline|'\n'
DECL|member|_db_instance_get_active_by_window_joined
name|'def'
name|'_db_instance_get_active_by_window_joined'
op|'('
nl|'\n'
name|'context'
op|','
name|'begin'
op|','
name|'end'
op|','
name|'project_id'
op|','
name|'host'
op|','
name|'columns_to_join'
op|','
nl|'\n'
name|'use_slave'
op|'='
name|'False'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'db'
op|'.'
name|'instance_get_active_by_window_joined'
op|'('
nl|'\n'
name|'context'
op|','
name|'begin'
op|','
name|'end'
op|','
name|'project_id'
op|','
name|'host'
op|','
nl|'\n'
name|'columns_to_join'
op|'='
name|'columns_to_join'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'base'
op|'.'
name|'remotable_classmethod'
newline|'\n'
DECL|member|_get_active_by_window_joined
name|'def'
name|'_get_active_by_window_joined'
op|'('
name|'cls'
op|','
name|'context'
op|','
name|'begin'
op|','
name|'end'
op|'='
name|'None'
op|','
nl|'\n'
name|'project_id'
op|'='
name|'None'
op|','
name|'host'
op|'='
name|'None'
op|','
nl|'\n'
name|'expected_attrs'
op|'='
name|'None'
op|','
nl|'\n'
name|'use_slave'
op|'='
name|'False'
op|')'
op|':'
newline|'\n'
comment|'# NOTE(mriedem): We need to convert the begin/end timestamp strings'
nl|'\n'
comment|'# to timezone-aware datetime objects for the DB API call.'
nl|'\n'
indent|' '
name|'begin'
op|'='
name|'timeutils'
op|'.'
name|'parse_isotime'
op|'('
name|'begin'
op|')'
newline|'\n'
name|'end'
op|'='
name|'timeutils'
op|'.'
name|'parse_isotime'
op|'('
name|'end'
op|')'
name|'if'
name|'end'
name|'else'
name|'None'
newline|'\n'
name|'db_inst_list'
op|'='
name|'cls'
op|'.'
name|'_db_instance_get_active_by_window_joined'
op|'('
nl|'\n'
name|'context'
op|','
name|'begin'
op|','
name|'end'
op|','
name|'project_id'
op|','
name|'host'
op|','
nl|'\n'
name|'columns_to_join'
op|'='
name|'_expected_cols'
op|'('
name|'expected_attrs'
op|')'
op|','
nl|'\n'
name|'use_slave'
op|'='
name|'use_slave'
op|')'
newline|'\n'
name|'return'
name|'_make_instance_list'
op|'('
name|'context'
op|','
name|'cls'
op|'('
op|')'
op|','
name|'db_inst_list'
op|','
nl|'\n'
name|'expected_attrs'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'classmethod'
newline|'\n'
DECL|member|get_active_by_window_joined
name|'def'
name|'get_active_by_window_joined'
op|'('
name|'cls'
op|','
name|'context'
op|','
name|'begin'
op|','
name|'end'
op|'='
name|'None'
op|','
nl|'\n'
name|'project_id'
op|'='
name|'None'
op|','
name|'host'
op|'='
name|'None'
op|','
nl|'\n'
name|'expected_attrs'
op|'='
name|'None'
op|','
nl|'\n'
name|'use_slave'
op|'='
name|'False'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Get instances and joins active during a certain time window.\n\n :param:context: nova request context\n :param:begin: datetime for the start of the time window\n :param:end: datetime for the end of the time window\n :param:project_id: used to filter instances by project\n :param:host: used to filter instances on a given compute host\n :param:expected_attrs: list of related fields that can be joined\n in the database layer when querying for instances\n :param use_slave if True, ship this query off to a DB slave\n :returns: InstanceList\n\n """'
newline|'\n'
comment|'# NOTE(mriedem): We have to convert the datetime objects to string'
nl|'\n'
comment|'# primitives for the remote call.'
nl|'\n'
name|'begin'
op|'='
name|'utils'
op|'.'
name|'isotime'
op|'('
name|'begin'
op|')'
newline|'\n'
name|'end'
op|'='
name|'utils'
op|'.'
name|'isotime'
op|'('
name|'end'
op|')'
name|'if'
name|'end'
name|'else'
name|'None'
newline|'\n'
name|'return'
name|'cls'
op|'.'
name|'_get_active_by_window_joined'
op|'('
name|'context'
op|','
name|'begin'
op|','
name|'end'
op|','
nl|'\n'
name|'project_id'
op|','
name|'host'
op|','
nl|'\n'
name|'expected_attrs'
op|','
nl|'\n'
name|'use_slave'
op|'='
name|'use_slave'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'base'
op|'.'
name|'remotable_classmethod'
newline|'\n'
DECL|member|get_by_security_group_id
name|'def'
name|'get_by_security_group_id'
op|'('
name|'cls'
op|','
name|'context'
op|','
name|'security_group_id'
op|')'
op|':'
newline|'\n'
indent|' '
name|'db_secgroup'
op|'='
name|'db'
op|'.'
name|'security_group_get'
op|'('
nl|'\n'
name|'context'
op|','
name|'security_group_id'
op|','
nl|'\n'
name|'columns_to_join'
op|'='
op|'['
string|"'instances.info_cache'"
op|','
nl|'\n'
string|"'instances.system_metadata'"
op|']'
op|')'
newline|'\n'
name|'return'
name|'_make_instance_list'
op|'('
name|'context'
op|','
name|'cls'
op|'('
op|')'
op|','
name|'db_secgroup'
op|'['
string|"'instances'"
op|']'
op|','
nl|'\n'
op|'['
string|"'info_cache'"
op|','
string|"'system_metadata'"
op|']'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'classmethod'
newline|'\n'
DECL|member|get_by_security_group
name|'def'
name|'get_by_security_group'
op|'('
name|'cls'
op|','
name|'context'
op|','
name|'security_group'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'cls'
op|'.'
name|'get_by_security_group_id'
op|'('
name|'context'
op|','
name|'security_group'
op|'.'
name|'id'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'base'
op|'.'
name|'remotable_classmethod'
newline|'\n'
DECL|member|get_by_grantee_security_group_ids
name|'def'
name|'get_by_grantee_security_group_ids'
op|'('
name|'cls'
op|','
name|'context'
op|','
name|'security_group_ids'
op|')'
op|':'
newline|'\n'
indent|' '
name|'db_instances'
op|'='
name|'db'
op|'.'
name|'instance_get_all_by_grantee_security_groups'
op|'('
nl|'\n'
name|'context'
op|','
name|'security_group_ids'
op|')'
newline|'\n'
name|'return'
name|'_make_instance_list'
op|'('
name|'context'
op|','
name|'cls'
op|'('
op|')'
op|','
name|'db_instances'
op|','
op|'['
op|']'
op|')'
newline|'\n'
nl|'\n'
DECL|member|fill_faults
dedent|''
name|'def'
name|'fill_faults'
op|'('
name|'self'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Batch query the database for our instances\' faults.\n\n :returns: A list of instance uuids for which faults were found.\n """'
newline|'\n'
name|'uuids'
op|'='
op|'['
name|'inst'
op|'.'
name|'uuid'
name|'for'
name|'inst'
name|'in'
name|'self'
op|']'
newline|'\n'
name|'faults'
op|'='
name|'objects'
op|'.'
name|'InstanceFaultList'
op|'.'
name|'get_by_instance_uuids'
op|'('
nl|'\n'
name|'self'
op|'.'
name|'_context'
op|','
name|'uuids'
op|')'
newline|'\n'
name|'faults_by_uuid'
op|'='
op|'{'
op|'}'
newline|'\n'
name|'for'
name|'fault'
name|'in'
name|'faults'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'fault'
op|'.'
name|'instance_uuid'
name|'not'
name|'in'
name|'faults_by_uuid'
op|':'
newline|'\n'
indent|' '
name|'faults_by_uuid'
op|'['
name|'fault'
op|'.'
name|'instance_uuid'
op|']'
op|'='
name|'fault'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'for'
name|'instance'
name|'in'
name|'self'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'instance'
op|'.'
name|'uuid'
name|'in'
name|'faults_by_uuid'
op|':'
newline|'\n'
indent|' '
name|'instance'
op|'.'
name|'fault'
op|'='
name|'faults_by_uuid'
op|'['
name|'instance'
op|'.'
name|'uuid'
op|']'
newline|'\n'
dedent|''
name|'else'
op|':'
newline|'\n'
comment|'# NOTE(danms): Otherwise the caller will cause a lazy-load'
nl|'\n'
comment|'# when checking it, and we know there are none'
nl|'\n'
indent|' '
name|'instance'
op|'.'
name|'fault'
op|'='
name|'None'
newline|'\n'
dedent|''
name|'instance'
op|'.'
name|'obj_reset_changes'
op|'('
op|'['
string|"'fault'"
op|']'
op|')'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'faults_by_uuid'
op|'.'
name|'keys'
op|'('
op|')'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'db_api'
op|'.'
name|'main_context_manager'
op|'.'
name|'writer'
newline|'\n'
DECL|function|_migrate_instance_keypairs
name|'def'
name|'_migrate_instance_keypairs'
op|'('
name|'ctxt'
op|','
name|'count'
op|')'
op|':'
newline|'\n'
indent|' '
name|'db_extras'
op|'='
name|'ctxt'
op|'.'
name|'session'
op|'.'
name|'query'
op|'('
name|'models'
op|'.'
name|'InstanceExtra'
op|')'
op|'.'
name|'options'
op|'('
name|'joinedload'
op|'('
string|"'instance'"
op|')'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'keypairs'
op|'='
name|'None'
op|')'
op|'.'
name|'filter_by'
op|'('
name|'deleted'
op|'='
number|'0'
op|')'
op|'.'
name|'limit'
op|'('
name|'count'
op|')'
op|'.'
name|'all'
op|'('
op|')'
newline|'\n'
nl|'\n'
name|'count_all'
op|'='
name|'len'
op|'('
name|'db_extras'
op|')'
newline|'\n'
name|'count_hit'
op|'='
number|'0'
newline|'\n'
name|'for'
name|'db_extra'
name|'in'
name|'db_extras'
op|':'
newline|'\n'
indent|' '
name|'key_name'
op|'='
name|'db_extra'
op|'.'
name|'instance'
op|'.'
name|'key_name'
newline|'\n'
name|'keypairs'
op|'='
name|'objects'
op|'.'
name|'KeyPairList'
op|'('
name|'objects'
op|'='
op|'['
op|']'
op|')'
newline|'\n'
name|'if'
name|'key_name'
op|':'
newline|'\n'
indent|' '
name|'try'
op|':'
newline|'\n'
indent|' '
name|'key'
op|'='
name|'objects'
op|'.'
name|'KeyPair'
op|'.'
name|'get_by_name'
op|'('
name|'ctxt'
op|','
nl|'\n'
name|'db_extra'
op|'.'
name|'instance'
op|'.'
name|'user_id'
op|','
nl|'\n'
name|'key_name'
op|')'
newline|'\n'
name|'keypairs'
op|'.'
name|'objects'
op|'.'
name|'append'
op|'('
name|'key'
op|')'
newline|'\n'
dedent|''
name|'except'
name|'exception'
op|'.'
name|'KeypairNotFound'
op|':'
newline|'\n'
indent|' '
name|'LOG'
op|'.'
name|'warning'
op|'('
nl|'\n'
name|'_LW'
op|'('
string|"'Instance %(uuid)s keypair %(keyname)s not found'"
op|')'
op|','
nl|'\n'
op|'{'
string|"'uuid'"
op|':'
name|'db_extra'
op|'.'
name|'instance_uuid'
op|','
string|"'keyname'"
op|':'
name|'key_name'
op|'}'
op|')'
newline|'\n'
dedent|''
dedent|''
name|'db_extra'
op|'.'
name|'keypairs'
op|'='
name|'jsonutils'
op|'.'
name|'dumps'
op|'('
name|'keypairs'
op|'.'
name|'obj_to_primitive'
op|'('
op|')'
op|')'
newline|'\n'
name|'db_extra'
op|'.'
name|'save'
op|'('
name|'ctxt'
op|'.'
name|'session'
op|')'
newline|'\n'
name|'count_hit'
op|'+='
number|'1'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'count_all'
op|','
name|'count_hit'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|function|migrate_instance_keypairs
dedent|''
name|'def'
name|'migrate_instance_keypairs'
op|'('
name|'ctxt'
op|','
name|'count'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'_migrate_instance_keypairs'
op|'('
name|'ctxt'
op|','
name|'count'
op|')'
newline|'\n'
dedent|''
endmarker|''
86a8d530e26513211b8b5eef9b5986d6876896c1 | 49 | py | Python | testsuite/python-tests/hello.py | pyronia-sys/libpyronia | 02ccf1d832754f771d1f9a60e5e91819ec878757 | ["Apache-2.0"]

def hello():
    print("Hello, World!")
hello()
86e47c6a3c76f77fd39fced3d4e33ddd0f0a2359 | 213 | py | Python | visualization/urls.py | FSavoy/visuo-server | d9c93ec7ae9dd033f3f0290381ddbac413bb6f9a | ["BSD-3-Clause"]

from django.conf.urls import url
from visualization import views
urlpatterns = [
    url(r'^(?P<shift>-?\d+)$', views.show_image, name='show_image_shift'),
    url(r'^$', views.show_image, name='show_image'),
]
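The first route above captures a signed integer into the `shift` keyword argument via the named group `(?P<shift>-?\d+)`. A quick standalone check of that capture behaviour with plain `re` (independent of Django; the sample inputs are illustrative):

```python
import re

# The path regex from the first urlpattern above.
pattern = re.compile(r'^(?P<shift>-?\d+)$')

match = pattern.match('-3')
print(match.group('shift'))  # -3

# Non-numeric paths do not match and would fall through to the
# second, catch-all pattern.
print(pattern.match('abc'))  # None
```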
810b132064acedcaf00f6846002f747ce64f8250 | 20 | py | Python | QCompute/Define/TrimFlag.py | baidu/QCompute | 04e9fd62e10b1e8bb2be4ec6cff4ea4f3a7d4025 | ["Apache-2.0"]

"""
Trim flag
"""
d48d7871dedf85c02f328408fe93aa8c11534c7a | 1,559 | py | Python | src/coreclr/gc/vxsort/smallsort/codegen/bitonic_isa.py | pyracanda/runtime | 72bee25ab532a4d0636118ec2ed3eabf3fd55245 | ["MIT"]

##
## Licensed to the .NET Foundation under one or more agreements.
## The .NET Foundation licenses this file to you under the MIT license.
##
from abc import ABC, ABCMeta, abstractmethod
from utils import next_power_of_2
class BitonicISA(ABC, metaclass=ABCMeta):
@abstractmethod
def vector_size(self):
pass
@abstractmethod
def max_bitonic_sort_vectors(self):
pass
def largest_merge_variant_needed(self):
return next_power_of_2(self.max_bitonic_sort_vectors()) / 2;
@abstractmethod
def vector_size(self):
pass
@abstractmethod
def vector_type(self):
pass
@classmethod
@abstractmethod
def supported_types(cls):
pass
@abstractmethod
def generate_prologue(self, f):
pass
@abstractmethod
def generate_epilogue(self, f):
pass
@abstractmethod
def generate_1v_basic_sorters(self, f, ascending):
pass
@abstractmethod
def generate_1v_merge_sorters(self, f, ascending):
pass
def generate_1v_sorters(self, f, ascending):
self.generate_1v_basic_sorters(f, ascending)
self.generate_1v_merge_sorters(f, ascending)
@abstractmethod
def generate_compounded_sorter(self, f, width, ascending, inline):
pass
@abstractmethod
def generate_compounded_merger(self, f, width, ascending, inline):
pass
@abstractmethod
def generate_entry_points(self, f):
pass
@abstractmethod
def generate_master_entry_point(self, f):
pass
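The class above is an abstract code-generator interface: `largest_merge_variant_needed` is the only concrete method, and it derives its answer from the subclass-supplied `max_bitonic_sort_vectors`. A minimal self-contained sketch of that pattern follows; the `AVX2` subclass and its sizes are illustrative, and `next_power_of_2` here is a stand-in for the helper imported from `utils`:

```python
from abc import ABC, abstractmethod


def next_power_of_2(n):
    # Stand-in for utils.next_power_of_2: round n up to a power of two.
    p = 1
    while p < n:
        p *= 2
    return p


class BitonicISA(ABC):
    @abstractmethod
    def vector_size(self):
        pass

    @abstractmethod
    def max_bitonic_sort_vectors(self):
        pass

    def largest_merge_variant_needed(self):
        # Floor division keeps the result an int; the original `/ 2`
        # would produce a float under Python 3.
        return next_power_of_2(self.max_bitonic_sort_vectors()) // 2


class AVX2(BitonicISA):
    # Hypothetical ISA: 8 x 32-bit lanes, sorting up to 16 vectors.
    def vector_size(self):
        return 8

    def max_bitonic_sort_vectors(self):
        return 16


print(AVX2().largest_merge_variant_needed())  # 8
```

Subclasses only declare their sizes; the shared arithmetic lives once on the base class.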
| 21.356164 | 71 | 0.681206 | 184 | 1,559 | 5.532609 | 0.353261 | 0.217092 | 0.185658 | 0.199411 | 0.433202 | 0.308448 | 0.208251 | 0.208251 | 0.10609 | 0 | 0 | 0.00682 | 0.247595 | 1,559 | 72 | 72 | 21.652778 | 0.86104 | 0.083387 | 0 | 0.583333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.3125 | false | 0.270833 | 0.041667 | 0.020833 | 0.395833 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 5 |
d4a19eec8ad89c0ef055dfe56f5af37c9b6c3708 | 101 | wsgi | Python | akira.wsgi | djricafort/face_animation | 792da036d9785ef0d3cce8b055c9a7038c180c03 | [
"MIT"
] | null | null | null | akira.wsgi | djricafort/face_animation | 792da036d9785ef0d3cce8b055c9a7038c180c03 | [
"MIT"
] | null | null | null | akira.wsgi | djricafort/face_animation | 792da036d9785ef0d3cce8b055c9a7038c180c03 | [
"MIT"
] | 1 | 2021-04-11T06:19:17.000Z | 2021-04-11T06:19:17.000Z | #akira.wsgi
import sys
sys.path.insert(0, '/var/www/html/akira')
from app import app as application
| 16.833333 | 41 | 0.752475 | 18 | 101 | 4.222222 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011236 | 0.118812 | 101 | 5 | 42 | 20.2 | 0.842697 | 0.09901 | 0 | 0 | 0 | 0 | 0.211111 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
d4a65354032aa9e49a7280e9f9f177d5d8448250 | 2,058 | py | Python | odin-libraries/bash/test_bash.py | elijah/odin | d181cd86b9909904ba97d8090098a4913c93a894 | [
"MIT"
] | 447 | 2020-05-21T11:22:16.000Z | 2022-03-13T01:28:25.000Z | odin-libraries/bash/test_bash.py | elijah/odin | d181cd86b9909904ba97d8090098a4913c93a894 | [
"MIT"
] | 40 | 2020-05-21T13:17:57.000Z | 2022-03-02T08:44:45.000Z | odin-libraries/bash/test_bash.py | elijah/odin | d181cd86b9909904ba97d8090098a4913c93a894 | [
"MIT"
] | 25 | 2020-05-28T21:23:13.000Z | 2022-03-18T19:31:31.000Z | import unittest
from pymongo import MongoClient
from os import system, environ
import random
class odinSdkTest(unittest.TestCase):
def setUp(self):
client = MongoClient(environ.get('ODIN_MONGODB'))
mongodb = client['odin']
self.collection = mongodb['observability']
def testConditionNotOdinEnv(self):
r = random.randint(100000, 999999)
test_desc = 'test_desc' + str(r)
environ["ODIN_EXEC_ENV"] = "False"
system("/bin/odinbash -p relative -i job.yml")
system("/bin/odinbash -p relative -d " + test_desc + " -c True")
result = self.collection.find_one({"description" : test_desc})
self.assertEqual(None, result)
def testWatchNotOdinEnv(self):
r = random.randint(100000, 999999)
test_desc = 'test_desc' + str(r)
environ["ODIN_EXEC_ENV"] = "False"
system("/bin/odinbash -p relative -i job.yml")
system("/bin/odinbash -p relative -d " + test_desc + " -w True")
result = self.collection.find_one({"description" : test_desc})
self.assertEqual(None, result)
def testCondition(self):
r = random.randint(100000, 999999)
test_desc = 'test_desc' + str(r)
environ["ODIN_EXEC_ENV"] = "True"
system("/bin/odinbash -p relative -i job.yml -t")
system("/bin/odinbash -p relative -d " + test_desc + " -c True -t")
result = self.collection.find_one({"description" : test_desc})
self.assertEqual(test_desc, result['description'])
def testWatch(self):
r = random.randint(100000, 999999)
test_desc = 'test_desc' + str(r)
environ["ODIN_EXEC_ENV"] = "True"
system("/bin/odinbash -p relative -i job.yml -t")
        system("/bin/odinbash -p relative -d " + test_desc + " -w True -t")
result = self.collection.find_one({"description" : test_desc})
self.assertEqual(test_desc, result['description'])
if __name__ == "__main__":
unittest.main() # run all tests
| 32.15625 | 75 | 0.617104 | 247 | 2,058 | 4.983806 | 0.242915 | 0.116978 | 0.110479 | 0.116978 | 0.735175 | 0.735175 | 0.735175 | 0.735175 | 0.735175 | 0.735175 | 0 | 0.031048 | 0.248785 | 2,058 | 63 | 76 | 32.666667 | 0.765201 | 0.006317 | 0 | 0.590909 | 0 | 0 | 0.253059 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 1 | 0.113636 | false | 0 | 0.113636 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
d4a9b4ef76ecd14edb4b73cfd7c6e13ebffb2d9c | 39 | py | Python | src/__init__.py | cscashby/pi-showcontrol | 2cc9b2b34ec8eeebede7609535b3c1e937b700cb | [
"MIT"
] | 3 | 2017-05-07T18:13:09.000Z | 2017-08-25T09:35:26.000Z | src/__init__.py | cscashby/pi-showcontrol | 2cc9b2b34ec8eeebede7609535b3c1e937b700cb | [
"MIT"
] | 6 | 2017-05-07T11:36:45.000Z | 2017-07-31T15:30:20.000Z | src/__init__.py | cscashby/pi-showcontrol | 2cc9b2b34ec8eeebede7609535b3c1e937b700cb | [
"MIT"
] | null | null | null | #!/usr/bin/python3
# Primary namespace
| 13 | 19 | 0.74359 | 5 | 39 | 5.8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.028571 | 0.102564 | 39 | 2 | 20 | 19.5 | 0.8 | 0.897436 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
d4b52455a05212b0ff41032e25a596d0f0421686 | 289 | py | Python | src/domain/validators/post_notification_validator.py | sergicollado/json_redo | 89ed514e441f7038f18fe1833f2c49a4bf08ef19 | [
"CC0-1.0"
] | null | null | null | src/domain/validators/post_notification_validator.py | sergicollado/json_redo | 89ed514e441f7038f18fe1833f2c49a4bf08ef19 | [
"CC0-1.0"
] | null | null | null | src/domain/validators/post_notification_validator.py | sergicollado/json_redo | 89ed514e441f7038f18fe1833f2c49a4bf08ef19 | [
"CC0-1.0"
] | null | null | null | from domain.validators.exceptions import BadParametersError
from domain.validators.abstract_classes import NotificationValidator
class PostNotificationValidator(NotificationValidator):
@property
def required_params(self):
return ["url", "name"] | 36.125 | 69 | 0.730104 | 24 | 289 | 8.708333 | 0.791667 | 0.095694 | 0.191388 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.207612 | 289 | 8 | 70 | 36.125 | 0.912664 | 0 | 0 | 0 | 0 | 0 | 0.024138 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.333333 | 0.166667 | 0.833333 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 5 |
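The validator above supplies only a `required_params` property; the checking logic presumably lives on the imported `NotificationValidator` base class. A self-contained sketch of that split, with a hypothetical `validate` method standing in for whatever `domain.validators.abstract_classes` actually provides:

```python
from abc import ABC, abstractmethod


class NotificationValidator(ABC):
    # Minimal stand-in for the real abstract class; its actual interface
    # is defined in domain.validators.abstract_classes.
    @property
    @abstractmethod
    def required_params(self):
        ...

    def validate(self, payload):
        # Hypothetical check: every required key must be present in payload.
        return all(key in payload for key in self.required_params)


class PostNotificationValidator(NotificationValidator):
    @property
    def required_params(self):
        return ["url", "name"]


v = PostNotificationValidator()
print(v.validate({"url": "https://example.com", "name": "hook"}))  # True
print(v.validate({"url": "https://example.com"}))  # False
```

Each concrete validator then only declares which parameters it requires, and the shared check stays in one place.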
d4b9a3b8d7c59bd59f2c14d6fa92f878f219aa31 | 168 | py | Python | pava/implementation/natives/sun/awt/windows/WDropTargetContextPeer.py | laffra/pava | 54d10cf7f8def2f96e254c0356623d08f221536f | [
"MIT"
] | 4 | 2017-03-30T16:51:16.000Z | 2020-10-05T12:25:47.000Z | pava/implementation/natives/sun/awt/windows/WDropTargetContextPeer.py | laffra/pava | 54d10cf7f8def2f96e254c0356623d08f221536f | [
"MIT"
] | null | null | null | pava/implementation/natives/sun/awt/windows/WDropTargetContextPeer.py | laffra/pava | 54d10cf7f8def2f96e254c0356623d08f221536f | [
"MIT"
] | null | null | null | def add_native_methods(clazz):
def getData__long__long__(a0, a1, a2):
raise NotImplementedError()
clazz.getData__long__long__ = getData__long__long__
| 24 | 55 | 0.755952 | 21 | 168 | 5.095238 | 0.571429 | 0.308411 | 0.420561 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.021583 | 0.172619 | 168 | 6 | 56 | 28 | 0.748201 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
d4c089aa52e358912330bf515946655300659486 | 1,066 | py | Python | lightly/models/modules/__init__.py | CodeGuy-007/lightly | 64143fe8a477c04288009c65fa1265cef8aa48f8 | [
"MIT"
] | null | null | null | lightly/models/modules/__init__.py | CodeGuy-007/lightly | 64143fe8a477c04288009c65fa1265cef8aa48f8 | [
"MIT"
] | null | null | null | lightly/models/modules/__init__.py | CodeGuy-007/lightly | 64143fe8a477c04288009c65fa1265cef8aa48f8 | [
"MIT"
] | null | null | null | """The lightly.models.modules package provides reusable modules.
This package contains reusable modules such as the NNmemoryBankModule which
can be combined with any lightly model.
"""
# Copyright (c) 2021. Lightly AG and its affiliates.
# All Rights Reserved
from lightly.models.modules.heads import BarlowTwinsProjectionHead
from lightly.models.modules.heads import BYOLProjectionHead
from lightly.models.modules.heads import BYOLPredictionHead
from lightly.models.modules.heads import DINOProjectionHead
from lightly.models.modules.heads import MoCoProjectionHead
from lightly.models.modules.heads import NNCLRProjectionHead
from lightly.models.modules.heads import NNCLRPredictionHead
from lightly.models.modules.heads import SimCLRProjectionHead
from lightly.models.modules.heads import SimSiamProjectionHead
from lightly.models.modules.heads import SimSiamPredictionHead
from lightly.models.modules.heads import SwaVProjectionHead
from lightly.models.modules.heads import SwaVPrototypes
from lightly.models.modules.nn_memory_bank import NNMemoryBankModule
| 44.416667 | 75 | 0.861163 | 129 | 1,066 | 7.100775 | 0.364341 | 0.19869 | 0.305677 | 0.340611 | 0.458515 | 0.458515 | 0 | 0 | 0 | 0 | 0 | 0.004111 | 0.087242 | 1,066 | 23 | 76 | 46.347826 | 0.937307 | 0.234522 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
d4c4ad1b4d2fd21cce592c44ff2ca0f536ddf541 | 1,029 | py | Python | test/test_v1alpha1_host_info.py | RyanSiu1995/argocd-python-client | 2e8f097fe09f247a46ac70692241a93d1acd076a | [
"MIT"
] | 1 | 2021-11-20T13:37:43.000Z | 2021-11-20T13:37:43.000Z | test/test_v1alpha1_host_info.py | RyanSiu1995/argocd-python-client | 2e8f097fe09f247a46ac70692241a93d1acd076a | [
"MIT"
] | null | null | null | test/test_v1alpha1_host_info.py | RyanSiu1995/argocd-python-client | 2e8f097fe09f247a46ac70692241a93d1acd076a | [
"MIT"
] | null | null | null | """
Consolidate Services
Description of all APIs # noqa: E501
The version of the OpenAPI document: version not set
Generated by: https://openapi-generator.tech
"""
import sys
import unittest
import argocd_python_client
from argocd_python_client.model.v1_node_system_info import V1NodeSystemInfo
from argocd_python_client.model.v1alpha1_host_resource_info import V1alpha1HostResourceInfo
globals()['V1NodeSystemInfo'] = V1NodeSystemInfo
globals()['V1alpha1HostResourceInfo'] = V1alpha1HostResourceInfo
from argocd_python_client.model.v1alpha1_host_info import V1alpha1HostInfo
class TestV1alpha1HostInfo(unittest.TestCase):
"""V1alpha1HostInfo unit test stubs"""
def setUp(self):
pass
def tearDown(self):
pass
def testV1alpha1HostInfo(self):
"""Test V1alpha1HostInfo"""
# FIXME: construct object with mandatory attributes with example values
# model = V1alpha1HostInfo() # noqa: E501
pass
if __name__ == '__main__':
unittest.main()
| 25.725 | 91 | 0.747328 | 109 | 1,029 | 6.834862 | 0.550459 | 0.06443 | 0.096644 | 0.088591 | 0.14094 | 0.104698 | 0.104698 | 0 | 0 | 0 | 0 | 0.03787 | 0.178814 | 1,029 | 39 | 92 | 26.384615 | 0.843787 | 0.314869 | 0 | 0.176471 | 1 | 0 | 0.071749 | 0.035874 | 0 | 0 | 0 | 0.025641 | 0 | 1 | 0.176471 | false | 0.176471 | 0.352941 | 0 | 0.588235 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 5 |
d4d24f4aa32fd191be49f315df064562bd538839 | 202 | py | Python | openprocurement/tender/openuadefense/__init__.py | Leits/openprocurement.tender.openuadefense | e7c512ed21166ae1928950bce80a11106fa2e545 | [
"Apache-2.0"
] | null | null | null | openprocurement/tender/openuadefense/__init__.py | Leits/openprocurement.tender.openuadefense | e7c512ed21166ae1928950bce80a11106fa2e545 | [
"Apache-2.0"
] | 2 | 2021-03-26T00:34:56.000Z | 2022-03-21T22:20:41.000Z | openprocurement/tender/openuadefense/__init__.py | leits/openprocurement.tender.openuadefense | e7c512ed21166ae1928950bce80a11106fa2e545 | [
"Apache-2.0"
] | null | null | null | from openprocurement.tender.openuadefense.models import Tender
def includeme(config):
config.add_tender_procurementMethodType(Tender)
config.scan("openprocurement.tender.openuadefense.views")
| 28.857143 | 62 | 0.826733 | 21 | 202 | 7.857143 | 0.619048 | 0.254545 | 0.412121 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.089109 | 202 | 6 | 63 | 33.666667 | 0.896739 | 0 | 0 | 0 | 0 | 0 | 0.207921 | 0.207921 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
07b7ed78cb900fb128d439a18bcc36c6b15d8978 | 107 | py | Python | sigstickers/__main__.py | ironpinguin/SigStickers | 1fad21e779e32d6809b75833df56afb6da3bdb10 | [
"MIT"
] | null | null | null | sigstickers/__main__.py | ironpinguin/SigStickers | 1fad21e779e32d6809b75833df56afb6da3bdb10 | [
"MIT"
] | null | null | null | sigstickers/__main__.py | ironpinguin/SigStickers | 1fad21e779e32d6809b75833df56afb6da3bdb10 | [
"MIT"
] | null | null | null | """ entry point for python -m sigstickers """
from __future__ import annotations
from . import cli
cli()
| 15.285714 | 45 | 0.728972 | 14 | 107 | 5.285714 | 0.785714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.17757 | 107 | 6 | 46 | 17.833333 | 0.840909 | 0.345794 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
07e2d3b401d45b0e2bd9b36e50c6421c7a4b9d89 | 4,919 | py | Python | civictechprojects/migrations/0030_auto_20200501_1755.py | bhavanapamulaparthi/CivicTechExchange | c01b8ffccea19beda6e59290139d09f477ddad95 | [
"MIT"
] | null | null | null | civictechprojects/migrations/0030_auto_20200501_1755.py | bhavanapamulaparthi/CivicTechExchange | c01b8ffccea19beda6e59290139d09f477ddad95 | [
"MIT"
] | null | null | null | civictechprojects/migrations/0030_auto_20200501_1755.py | bhavanapamulaparthi/CivicTechExchange | c01b8ffccea19beda6e59290139d09f477ddad95 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by Django 1.11.28 on 2020-05-01 17:55
from __future__ import unicode_literals
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('democracylab', '0006_auto_20200427_2336'),
('civictechprojects', '0029_projectcommit'),
]
operations = [
migrations.CreateModel(
name='Event',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('deleted', models.BooleanField(default=False)),
('event_agenda', models.CharField(blank=True, max_length=4000)),
('event_date_created', models.DateTimeField(null=True)),
('event_date_end', models.DateTimeField()),
('event_date_modified', models.DateTimeField(auto_now_add=True, null=True)),
('event_date_start', models.DateTimeField()),
('event_description', models.CharField(blank=True, max_length=4000)),
('event_location', models.CharField(blank=True, max_length=200)),
('event_name', models.CharField(max_length=200)),
('event_rsvp_url', models.CharField(blank=True, max_length=2083)),
('event_short_description', models.CharField(blank=True, max_length=140)),
('is_searchable', models.BooleanField(default=False)),
('is_created', models.BooleanField(default=True)),
('event_creator', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='event_creator', to='democracylab.Contributor')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='Group',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('deleted', models.BooleanField(default=False)),
('group_date_created', models.DateTimeField(null=True)),
('group_date_modified', models.DateTimeField(auto_now_add=True, null=True)),
('group_description', models.CharField(blank=True, max_length=4000)),
('group_location', models.CharField(blank=True, max_length=200)),
('group_name', models.CharField(max_length=200)),
('group_short_description', models.CharField(blank=True, max_length=140)),
('is_searchable', models.BooleanField(default=False)),
('is_created', models.BooleanField(default=True)),
('group_creator', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='group_creator', to='democracylab.Contributor')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='ProjectRelationship',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('relationship_event', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='relationships', to='civictechprojects.Event')),
('relationship_group', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='relationships', to='civictechprojects.Group')),
('relationship_project', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='relationships', to='civictechprojects.Project')),
],
),
migrations.AddField(
model_name='projectfile',
name='file_event',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='files', to='civictechprojects.Event'),
),
migrations.AddField(
model_name='projectfile',
name='file_group',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='files', to='civictechprojects.Group'),
),
migrations.AddField(
model_name='projectlink',
name='link_event',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='links', to='civictechprojects.Event'),
),
migrations.AddField(
model_name='projectlink',
name='link_group',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='links', to='civictechprojects.Group'),
),
]
| 55.269663 | 191 | 0.614149 | 488 | 4,919 | 5.997951 | 0.202869 | 0.046122 | 0.047831 | 0.075162 | 0.810386 | 0.810386 | 0.751964 | 0.685685 | 0.606765 | 0.606765 | 0 | 0.019576 | 0.252287 | 4,919 | 88 | 192 | 55.897727 | 0.776237 | 0.014027 | 0 | 0.45679 | 1 | 0 | 0.190586 | 0.058836 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.037037 | 0 | 0.074074 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
5807437a3ec82c72a2016d8b0da6571fb6f2f02e | 34 | py | Python | Logger/__init__.py | ArthMx/Logger | 0fab4009abec00f33d9b93d0c6093e544e5168b8 | [
"MIT"
] | null | null | null | Logger/__init__.py | ArthMx/Logger | 0fab4009abec00f33d9b93d0c6093e544e5168b8 | [
"MIT"
] | null | null | null | Logger/__init__.py | ArthMx/Logger | 0fab4009abec00f33d9b93d0c6093e544e5168b8 | [
"MIT"
] | null | null | null | from .Logger import MetricsLogger
| 17 | 33 | 0.852941 | 4 | 34 | 7.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.117647 | 34 | 1 | 34 | 34 | 0.966667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
ed0da160d8845781b2dc255e5fe7d18798934959 | 71 | py | Python | server/lib/api/__init__.py | ryanshi42/ARG-Event | e7446a659483b539198dd923bf5a57c0f83d85e9 | [
"MIT"
] | 4 | 2020-04-28T02:42:35.000Z | 2020-05-20T05:42:41.000Z | server/lib/api/__init__.py | ryanshi42/ARG-Event | e7446a659483b539198dd923bf5a57c0f83d85e9 | [
"MIT"
] | 2 | 2019-11-05T08:39:34.000Z | 2020-08-18T07:37:19.000Z | server/lib/api/__init__.py | ryanshi42/ARG-Event | e7446a659483b539198dd923bf5a57c0f83d85e9 | [
"MIT"
] | 6 | 2019-06-10T05:22:09.000Z | 2020-07-24T14:54:35.000Z | from .APIHandler import APIHandler, JSON, routing
from . import routes
| 23.666667 | 49 | 0.802817 | 9 | 71 | 6.333333 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.140845 | 71 | 2 | 50 | 35.5 | 0.934426 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
ed28da52a45a8a6d796710e7652954fdb083d914 | 125 | py | Python | accelerator/examples/a_example_equivalenthashes.py | eBay/accelerator | 218d9a5e4451ac72b9e65df6c5b32e37d25136c8 | [
"Apache-2.0"
] | 143 | 2018-04-20T18:50:41.000Z | 2022-02-06T07:07:35.000Z | accelerator/examples/a_example_equivalenthashes.py | exaxorg/accelerator | d6132f215585b98d2ad14c5d74d2c937fbd940e2 | [
"Apache-2.0"
] | null | null | null | accelerator/examples/a_example_equivalenthashes.py | exaxorg/accelerator | d6132f215585b98d2ad14c5d74d2c937fbd940e2 | [
"Apache-2.0"
] | 29 | 2018-04-20T18:50:43.000Z | 2021-04-27T18:42:23.000Z | description = 'equivalent_hashes example method'
# equivalent_hashes = {'<verifier>': ('<hash>',)}
def synthesis():
pass
| 15.625 | 49 | 0.688 | 12 | 125 | 7 | 0.833333 | 0.380952 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.136 | 125 | 7 | 50 | 17.857143 | 0.777778 | 0.376 | 0 | 0 | 0 | 0 | 0.421053 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0.333333 | 0 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 5 |
ed41d90504689743343707ccb21aece9d1b942f6 | 38 | py | Python | eqdes/extensions/exceptions.py | eng-tools/eqdes | f77809d3e79815b261a3385e33b81b596d514101 | [
"MIT"
] | null | null | null | eqdes/extensions/exceptions.py | eng-tools/eqdes | f77809d3e79815b261a3385e33b81b596d514101 | [
"MIT"
] | null | null | null | eqdes/extensions/exceptions.py | eng-tools/eqdes | f77809d3e79815b261a3385e33b81b596d514101 | [
"MIT"
] | 1 | 2020-11-07T04:46:23.000Z | 2020-11-07T04:46:23.000Z | class DesignError(Exception):
pass | 19 | 29 | 0.763158 | 4 | 38 | 7.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.157895 | 38 | 2 | 30 | 19 | 0.90625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 5 |
ed560a945fb9cc5d0cd4d7fcd9098a7527b4ad4a | 51 | py | Python | enthought/pyface/tasks/i_editor.py | enthought/etsproxy | 4aafd628611ebf7fe8311c9d1a0abcf7f7bb5347 | [
"BSD-3-Clause"
] | 3 | 2016-12-09T06:05:18.000Z | 2018-03-01T13:00:29.000Z | enthought/pyface/tasks/i_editor.py | enthought/etsproxy | 4aafd628611ebf7fe8311c9d1a0abcf7f7bb5347 | [
"BSD-3-Clause"
] | 1 | 2020-12-02T00:51:32.000Z | 2020-12-02T08:48:55.000Z | enthought/pyface/tasks/i_editor.py | enthought/etsproxy | 4aafd628611ebf7fe8311c9d1a0abcf7f7bb5347 | [
"BSD-3-Clause"
] | null | null | null | # proxy module
from pyface.tasks.i_editor import *
| 17 | 35 | 0.784314 | 8 | 51 | 4.875 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137255 | 51 | 2 | 36 | 25.5 | 0.886364 | 0.235294 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
ed5d32ff01fb9883b171f6dd9b03cd92d3851744 | 319 | py | Python | backend/flask-api/src/servicos/processo_servico.py | lucasbibianot/inova-cnj-time16 | e621d7027bd462d348e233ffd6ed88648c53704b | [
"Apache-2.0"
] | null | null | null | backend/flask-api/src/servicos/processo_servico.py | lucasbibianot/inova-cnj-time16 | e621d7027bd462d348e233ffd6ed88648c53704b | [
"Apache-2.0"
] | null | null | null | backend/flask-api/src/servicos/processo_servico.py | lucasbibianot/inova-cnj-time16 | e621d7027bd462d348e233ffd6ed88648c53704b | [
"Apache-2.0"
] | 2 | 2020-10-19T22:03:31.000Z | 2020-11-29T21:22:33.000Z | import re
from ..entidades.modelo_fluxo import Processo
from ..persistencia.database import db
def inserir_processo(processo):
db.session.add(processo)
db.session.commit()
def retornar_processo(cd_processo):
return db.session.query(Processo).filter(
Processo.cd_processo == cd_processo).first()
| 22.785714 | 52 | 0.755486 | 41 | 319 | 5.731707 | 0.512195 | 0.114894 | 0.229787 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.141066 | 319 | 13 | 53 | 24.538462 | 0.857664 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.333333 | 0.111111 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 5 |
71f9a5fef03ed79d23fe992fc9ce68ce1b5b0e76 | 17,092 | py | Python | flexget/tests/test_subtitle_list.py | OmgOhnoes/Flexget | d7d5095dba91b95390c76f57502410e23175c535 | [
"MIT"
] | 1 | 2021-03-24T11:54:01.000Z | 2021-03-24T11:54:01.000Z | flexget/tests/test_subtitle_list.py | OmgOhnoes/Flexget | d7d5095dba91b95390c76f57502410e23175c535 | [
"MIT"
] | null | null | null | flexget/tests/test_subtitle_list.py | OmgOhnoes/Flexget | d7d5095dba91b95390c76f57502410e23175c535 | [
"MIT"
] | null | null | null | from __future__ import unicode_literals, division, absolute_import
from builtins import * # noqa pylint: disable=unused-import, redefined-builtin
import datetime
import os
import sys
import pytest
from flexget.manager import Session
from flexget.plugins.list.subtitle_list import SubtitleListFile, SubtitleListLanguage, normalize_path
try:
import subliminal
except ImportError:
subliminal = babelfish = None
class TestSubtitleList(object):
config = """
templates:
global:
mock:
- {title: 'Movie', location: 'movie.mkv'}
- {title: 'Series', location: 'series.mkv'}
accept_all: yes
seen: local
tasks:
subtitle_add:
list_add:
- subtitle_list:
list: test
subtitle_emit:
disable: builtins
template: no_global
subtitle_list:
list: test
languages: [en]
subtitle_remove:
#subtitle_list:
# list: test
#accept_all: yes
list_remove:
- subtitle_list:
list: test
subtitle_fail:
template: no_global
subtitle_list:
list: test
subliminal:
languages: [en, afr]
exact_match: no
providers:
- opensubtitles
list_match:
from:
- subtitle_list:
list: test
single_match: yes
rerun: 0
subtitle_simulate_success:
template: no_global
subtitle_list:
list: test
subliminal:
languages: [en, ja]
exact_match: no
providers:
- opensubtitles
list_match:
from:
- subtitle_list:
list: test
single_match: yes
rerun: 0
subtitle_add_with_languages:
list_add:
- subtitle_list:
list: test
languages:
- en
- eng
subtitle_add_local_file:
disable: seen
template: no_global
mock:
- {title: 'The Walking Dead S06E08', location:
'subtitle_list_test_dir/The.Walking.Dead.S06E08-FlexGet.mp4'}
accept_all: yes
list_add:
- subtitle_list:
list: test
languages: [en, ja]
subtitle_add_another_local_file:
disable: seen
template: no_global
mock:
- {title: "Marvel's Jessica Jones S01E02",
location: "subtitle_list_test_dir/Marvels.Jessica.Jones.S01E02-FlexGet.mkv"}
accept_all: yes
list_add:
- subtitle_list:
list: test
languages: [en, ja]
subtitle_add_a_third_local_file:
disable: seen
template: no_global
mock:
- {title: "The.Big.Bang.Theory.S09E09",
location: "subtitle_list_test_dir/The.Big.Bang.Theory.S09E09-FlexGet.mkv"}
accept_all: yes
list_add:
- subtitle_list:
list: test
subtitle_test_expiration_add:
disable: builtins
template: no_global
mock:
- {title: "The.Big.Bang.Theory.S09E09",
location: "subtitle_list_test_dir/The.Big.Bang.Theory.S09E09-FlexGet.mkv"}
accept_all: yes
list_add:
- subtitle_list:
list: test
remove_after: 7 days
subtitle_add_local_dir:
disable: builtins
template: no_global
mock:
- {title: "My Videos", location: "subtitle_list_test_dir"}
list_add:
- subtitle_list:
list: test
allow_dir: yes
languages: ['ja']
accept_all: yes
subtitle_emit_dir:
disable: builtins
template: no_global
subtitle_list:
list: test
languages: [en]
subtitle_simulate_success_no_check:
template: no_global
subtitle_list:
list: test
check_subtitles: no
subliminal:
languages: [ja]
exact_match: no
providers:
- opensubtitles
list_match:
from:
- subtitle_list:
list: test
single_match: yes
subtitle_add_force_file:
disable: builtins
template: no_global
mock:
- {title: "The.Walking.Dead.S06E09-FlexGet",
location: "subtitle_list_test_dir/The.Walking.Dead.S06E09-FlexGet.mp4"}
list_add:
- subtitle_list:
list: test
allow_dir: yes
languages: ['ja']
accept_all: yes
subtitle_add_force_file_no:
disable: builtins
template: no_global
mock:
- {title: "The.Walking.Dead.S06E09-FlexGet",
location: "subtitle_list_test_dir/The.Walking.Dead.S06E09-FlexGet.mp4"}
list_add:
- subtitle_list:
list: test
languages: ['ja']
force_file_existence: no
accept_all: yes
subtitle_emit_force_no:
disable: builtins
template: no_global
subtitle_list:
list: test
force_file_existence: no
subtitle_path:
disable: builtins
template: no_global
mock:
- {title: "The.Walking.Dead.S06E08-FlexGet",
output: 'subtitle_list_test_dir/The.Walking.Dead.S06E08-FlexGet.mp4'}
list_add:
- subtitle_list:
list: test
path: '{{ output }}'
accept_all: yes
subtitle_path_relative:
disable: builtins
template: no_global
mock:
- {title: "The.Walking.Dead.S06E08-FlexGet"}
list_add:
- subtitle_list:
list: test
                  path: 'subtitle_list_test_dir/The.Walking.Dead.S06E08-FlexGet.mp4'
            accept_all: yes
    """

    def test_subtitle_list_del(self, execute_task):
        task = execute_task('subtitle_add')
        task = execute_task('subtitle_emit')
        assert len(task.entries) == 2
        task = execute_task('subtitle_remove')
        task = execute_task('subtitle_emit')
        assert len(task.entries) == 0

    def test_subtitle_list_unique_lang(self, execute_task):
        execute_task('subtitle_add_with_languages')
        with Session() as session:
            s = session.query(SubtitleListLanguage).all()
            assert s[0].file.title != s[1].file.title, \
                'There should only be one row per entry as "en" and "eng" are equivalent'
            assert len(s) == 2, 'Languages "en" and "eng" are equivalent and only one should exist per entry'

    def test_subtitle_list_old(self, execute_task):
        task = execute_task('subtitle_test_expiration_add')
        with Session() as session:
            s = session.query(SubtitleListFile).first()
            s.added = datetime.datetime.now() + datetime.timedelta(-8)
        task = execute_task('subtitle_emit')
        assert len(task.entries) == 0, 'File should have expired.'

    # Skip if subliminal is not installed or if python version < 2.7
    @pytest.mark.online
    @pytest.mark.skipif(sys.version_info < (2, 7), reason='requires python2.7')
    @pytest.mark.skipif(not subliminal, reason='requires subliminal')
    def test_subtitle_list_subliminal_fail(self, execute_task):
        task = execute_task('subtitle_add_with_languages')
        assert len(task.entries) == 2, 'Task should have two entries.'
        task = execute_task('subtitle_fail')
        assert len(task.failed) == 2, 'Entries should fail since the files are not valid.'

    # Skip if subliminal is not installed or if python version < 2.7
    @pytest.mark.skip
    @pytest.mark.online
    @pytest.mark.skipif(sys.version_info < (2, 7), reason='requires python2.7')
    @pytest.mark.skipif(not subliminal, reason='requires subliminal')
    def test_subtitle_list_subliminal_semi_fail(self, execute_task):
        task = execute_task('subtitle_add_local_file')
        assert len(task.entries) == 1, 'Task should have accepted walking dead local file'
        task = execute_task('subtitle_fail')
        # cleanup
        try:
            os.remove('subtitle_list_test_dir/The.Walking.Dead.S06E08-FlexGet.en.srt')
        except OSError:
            pass
        assert len(task.failed) == 1, 'Only one language should have been downloaded, which results in failure'

    # Skip if subliminal is not installed or if python version < 2.7
    @pytest.mark.skip(reason="Test sporadically fails")
    # @pytest.mark.skipif(sys.version_info < (2, 7), reason='requires python2.7')
    # @pytest.mark.skipif(not subliminal, reason='requires subliminal')
    def test_subtitle_list_subliminal_success(self, execute_task):
        task = execute_task('subtitle_add_local_file')
        assert len(task.entries) == 1, 'Task should have accepted walking dead local file'
        task = execute_task('subtitle_add_another_local_file')
        assert len(task.entries) == 1, 'Task should have accepted jessica jones file'
        with open('subtitle_list_test_dir/The.Walking.Dead.S06E08-FlexGet.en.srt', 'a'):
            os.utime('subtitle_list_test_dir/The.Walking.Dead.S06E08-FlexGet.en.srt', None)
        with open('subtitle_list_test_dir/The.Walking.Dead.S06E08-FlexGet.ja.srt', 'a'):
            os.utime('subtitle_list_test_dir/The.Walking.Dead.S06E08-FlexGet.ja.srt', None)
        with open('subtitle_list_test_dir/Marvels.Jessica.Jones.S01E02-FlexGet.en.srt', 'a'):
            os.utime('subtitle_list_test_dir/Marvels.Jessica.Jones.S01E02-FlexGet.en.srt', None)
        task = execute_task('subtitle_simulate_success')
        assert len(task.failed) == 1, 'Should have found both languages for walking dead but not for jessica jones'
        task = execute_task('subtitle_emit')
        assert len(task.entries) == 1, 'Walking Dead should have been removed from the list'
        try:
            os.remove('subtitle_list_test_dir/The.Walking.Dead.S06E08-FlexGet.en.srt')
        except OSError:
            pass
        try:
            os.remove('subtitle_list_test_dir/The.Walking.Dead.S06E08-FlexGet.ja.srt')
        except OSError:
            pass
        try:
            os.remove('subtitle_list_test_dir/Marvels.Jessica.Jones.S01E02-FlexGet.en.srt')
        except OSError:
            pass

    # Skip if subliminal is not installed or if python version < 2.7
    @pytest.mark.skipif(sys.version_info < (2, 7), reason='requires python2.7')
    @pytest.mark.skipif(not subliminal, reason='requires subliminal')
    def test_subtitle_list_local_subtitles(self, execute_task):
        task = execute_task('subtitle_add_local_file')
        task = execute_task('subtitle_add_another_local_file')
        task = execute_task('subtitle_add_a_third_local_file')
        task = execute_task('subtitle_emit')
        assert len(task.entries) == 2, 'Big Bang Theory already has a local subtitle and should have been removed.'

    def test_subtitle_list_local_dir(self, execute_task):
        task = execute_task('subtitle_add_local_dir')
        with Session() as session:
            s = session.query(SubtitleListFile).first()
            assert s.title == 'My Videos', 'Should have added the dir with title "My Videos" to the list'
        task = execute_task('subtitle_emit_dir')
        assert len(task.entries) == 3, 'Should have found 3 video files; the containing dir should not be included.'

    # Skip if subliminal is not installed or if python version < 2.7
    @pytest.mark.skipif(sys.version_info < (2, 7), reason='requires python2.7')
    @pytest.mark.skipif(not subliminal, reason='requires subliminal')
    def test_subtitle_list_subliminal_dir_success(self, execute_task):
        task = execute_task('subtitle_add_local_dir')
        with open('subtitle_list_test_dir/The.Walking.Dead.S06E08-FlexGet.ja.srt', 'a'):
            os.utime('subtitle_list_test_dir/The.Walking.Dead.S06E08-FlexGet.ja.srt', None)
        with open('subtitle_list_test_dir/Marvels.Jessica.Jones.S01E02-FlexGet.ja.srt', 'a'):
            os.utime('subtitle_list_test_dir/Marvels.Jessica.Jones.S01E02-FlexGet.ja.srt', None)
        with open('subtitle_list_test_dir/The.Big.Bang.Theory.S09E09-FlexGet.ja.srt', 'a'):
            os.utime('subtitle_list_test_dir/The.Big.Bang.Theory.S09E09-FlexGet.ja.srt', None)
        task = execute_task('subtitle_simulate_success_no_check')
        assert len(task.all_entries) == 3, '"My Videos" should have been deleted'
        assert len(task.accepted) == 3, 'All files have all subtitles'
        with Session() as session:
            s = session.query(SubtitleListFile).first()
            assert s is None, '"My Videos" and contained files should have been deleted from the list'
        try:
            os.remove('subtitle_list_test_dir/The.Walking.Dead.S06E08-FlexGet.ja.srt')
        except OSError:
            pass
        try:
            os.remove('subtitle_list_test_dir/Marvels.Jessica.Jones.S01E02-FlexGet.ja.srt')
        except OSError:
            pass
        try:
            os.remove('subtitle_list_test_dir/The.Big.Bang.Theory.S09E09-FlexGet.ja.srt')
        except OSError:
            pass

    def test_subtitle_list_force_file_existence_no(self, execute_task):
        task = execute_task('subtitle_add_force_file_no')
        assert not os.path.exists(task.entries[0]['location']), 'File should not exist.'
        with Session() as session:
            s = session.query(SubtitleListFile).first()
            assert s, 'The file should have been added to the list even though it does not exist'
        task = execute_task('subtitle_emit_force_no')
        assert len(task.entries) == 0, 'List should not be empty, but since the file does not exist it isn\'t returned'
        with Session() as session:
            s = session.query(SubtitleListFile).first()
            assert s, 'The file should still be in the list'

    def test_subtitle_list_force_file_existence_yes(self, execute_task):
        task = execute_task('subtitle_add_force_file')
        assert not os.path.exists(task.entries[0]['location']), 'File should not exist.'
        with Session() as session:
            s = session.query(SubtitleListFile).first()
            assert s is None, 'The file should not have been added to the list as it does not exist'
        task = execute_task('subtitle_emit')
        assert len(task.entries) == 0, 'List should be empty'

    def test_subtitle_list_force_file_existence_yes_input(self, execute_task):
        task = execute_task('subtitle_add_force_file_no')
        assert not os.path.exists(task.entries[0]['location']), 'File should not exist.'
        with Session() as session:
            s = session.query(SubtitleListFile).first()
            assert s, 'The file should have been added to the list even though it does not exist'
        task = execute_task('subtitle_emit')
        assert len(task.entries) == 0, 'No input should be returned as the file does not exist'
        with Session() as session:
            s = session.query(SubtitleListFile).first()
            assert s is None, 'The file should have been removed from the list since it does not exist'

    def test_subtitle_list_path(self, execute_task):
        execute_task('subtitle_path')
        with Session() as session:
            s = session.query(SubtitleListFile).first()
            assert s, 'The file should have been added to the list'
            assert s.location == normalize_path('subtitle_list_test_dir/The.Walking.Dead.S06E08-FlexGet.mp4'), \
                'location should be what the output field was set to'

    def test_subtitle_list_relative_path(self, execute_task):
        execute_task('subtitle_path_relative')
        with Session() as session:
            s = session.query(SubtitleListFile).first()
            assert s, 'The file should have been added to the list'
            assert s.location == normalize_path('subtitle_list_test_dir/The.Walking.Dead.S06E08-FlexGet.mp4'), \
                'location should be what the output field was set to'
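The tests above repeat the same three-line `try: os.remove(...) / except OSError: pass` cleanup block many times. That pattern can be collapsed into a tiny helper; a sketch (the helper name `remove_if_exists` is ours, not part of FlexGet):

```python
import contextlib
import os


def remove_if_exists(path):
    """Delete a file, silently ignoring the OSError raised when it is already gone."""
    # contextlib.suppress (Python 3.4+) is the idiomatic form of
    # try: os.remove(path)
    # except OSError: pass
    with contextlib.suppress(OSError):
        os.remove(path)
```

Each inline cleanup block would then become a single `remove_if_exists('subtitle_list_test_dir/...srt')` call, or a loop over the subtitle paths.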
# File: task one/PrintMyName.py (repo: odelolajosh/my-website, license: MIT)
print("Odelola Joshua")
# File: sdk/python/pulumi_azure_native/network/v20180401/virtual_network.py
# (repo: sebtelko/pulumi-azure-native, license: Apache-2.0)
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi SDK Generator. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from ... import _utilities
from . import outputs
from ._enums import *
from ._inputs import *
__all__ = ['VirtualNetworkArgs', 'VirtualNetwork']
@pulumi.input_type
class VirtualNetworkArgs:
    def __init__(__self__, *,
                 resource_group_name: pulumi.Input[str],
                 address_space: Optional[pulumi.Input['AddressSpaceArgs']] = None,
                 ddos_protection_plan: Optional[pulumi.Input['SubResourceArgs']] = None,
                 dhcp_options: Optional[pulumi.Input['DhcpOptionsArgs']] = None,
                 enable_ddos_protection: Optional[pulumi.Input[bool]] = None,
                 enable_vm_protection: Optional[pulumi.Input[bool]] = None,
                 etag: Optional[pulumi.Input[str]] = None,
                 id: Optional[pulumi.Input[str]] = None,
                 location: Optional[pulumi.Input[str]] = None,
                 provisioning_state: Optional[pulumi.Input[str]] = None,
                 resource_guid: Optional[pulumi.Input[str]] = None,
                 subnets: Optional[pulumi.Input[Sequence[pulumi.Input['SubnetArgs']]]] = None,
                 tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
                 virtual_network_name: Optional[pulumi.Input[str]] = None,
                 virtual_network_peerings: Optional[pulumi.Input[Sequence[pulumi.Input['VirtualNetworkPeeringArgs']]]] = None):
        """
        The set of arguments for constructing a VirtualNetwork resource.
        :param pulumi.Input[str] resource_group_name: The name of the resource group.
        :param pulumi.Input['AddressSpaceArgs'] address_space: The AddressSpace that contains an array of IP address ranges that can be used by subnets.
        :param pulumi.Input['SubResourceArgs'] ddos_protection_plan: The DDoS protection plan associated with the virtual network.
        :param pulumi.Input['DhcpOptionsArgs'] dhcp_options: The dhcpOptions that contains an array of DNS servers available to VMs deployed in the virtual network.
        :param pulumi.Input[bool] enable_ddos_protection: Indicates if DDoS protection is enabled for all the protected resources in the virtual network. It requires a DDoS protection plan associated with the resource.
        :param pulumi.Input[bool] enable_vm_protection: Indicates if VM protection is enabled for all the subnets in the virtual network.
        :param pulumi.Input[str] etag: Gets a unique read-only string that changes whenever the resource is updated.
        :param pulumi.Input[str] id: Resource ID.
        :param pulumi.Input[str] location: Resource location.
        :param pulumi.Input[str] provisioning_state: The provisioning state of the PublicIP resource. Possible values are: 'Updating', 'Deleting', and 'Failed'.
        :param pulumi.Input[str] resource_guid: The resourceGuid property of the Virtual Network resource.
        :param pulumi.Input[Sequence[pulumi.Input['SubnetArgs']]] subnets: A list of subnets in a Virtual Network.
        :param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: Resource tags.
        :param pulumi.Input[str] virtual_network_name: The name of the virtual network.
        :param pulumi.Input[Sequence[pulumi.Input['VirtualNetworkPeeringArgs']]] virtual_network_peerings: A list of peerings in a Virtual Network.
        """
        pulumi.set(__self__, "resource_group_name", resource_group_name)
        if address_space is not None:
            pulumi.set(__self__, "address_space", address_space)
        if ddos_protection_plan is not None:
            pulumi.set(__self__, "ddos_protection_plan", ddos_protection_plan)
        if dhcp_options is not None:
            pulumi.set(__self__, "dhcp_options", dhcp_options)
        if enable_ddos_protection is None:
            enable_ddos_protection = False
        if enable_ddos_protection is not None:
            pulumi.set(__self__, "enable_ddos_protection", enable_ddos_protection)
        if enable_vm_protection is None:
            enable_vm_protection = False
        if enable_vm_protection is not None:
            pulumi.set(__self__, "enable_vm_protection", enable_vm_protection)
        if etag is not None:
            pulumi.set(__self__, "etag", etag)
        if id is not None:
            pulumi.set(__self__, "id", id)
        if location is not None:
            pulumi.set(__self__, "location", location)
        if provisioning_state is not None:
            pulumi.set(__self__, "provisioning_state", provisioning_state)
        if resource_guid is not None:
            pulumi.set(__self__, "resource_guid", resource_guid)
        if subnets is not None:
            pulumi.set(__self__, "subnets", subnets)
        if tags is not None:
            pulumi.set(__self__, "tags", tags)
        if virtual_network_name is not None:
            pulumi.set(__self__, "virtual_network_name", virtual_network_name)
        if virtual_network_peerings is not None:
            pulumi.set(__self__, "virtual_network_peerings", virtual_network_peerings)
    @property
    @pulumi.getter(name="resourceGroupName")
    def resource_group_name(self) -> pulumi.Input[str]:
        """
        The name of the resource group.
        """
        return pulumi.get(self, "resource_group_name")

    @resource_group_name.setter
    def resource_group_name(self, value: pulumi.Input[str]):
        pulumi.set(self, "resource_group_name", value)

    @property
    @pulumi.getter(name="addressSpace")
    def address_space(self) -> Optional[pulumi.Input['AddressSpaceArgs']]:
        """
        The AddressSpace that contains an array of IP address ranges that can be used by subnets.
        """
        return pulumi.get(self, "address_space")

    @address_space.setter
    def address_space(self, value: Optional[pulumi.Input['AddressSpaceArgs']]):
        pulumi.set(self, "address_space", value)

    @property
    @pulumi.getter(name="ddosProtectionPlan")
    def ddos_protection_plan(self) -> Optional[pulumi.Input['SubResourceArgs']]:
        """
        The DDoS protection plan associated with the virtual network.
        """
        return pulumi.get(self, "ddos_protection_plan")

    @ddos_protection_plan.setter
    def ddos_protection_plan(self, value: Optional[pulumi.Input['SubResourceArgs']]):
        pulumi.set(self, "ddos_protection_plan", value)

    @property
    @pulumi.getter(name="dhcpOptions")
    def dhcp_options(self) -> Optional[pulumi.Input['DhcpOptionsArgs']]:
        """
        The dhcpOptions that contains an array of DNS servers available to VMs deployed in the virtual network.
        """
        return pulumi.get(self, "dhcp_options")

    @dhcp_options.setter
    def dhcp_options(self, value: Optional[pulumi.Input['DhcpOptionsArgs']]):
        pulumi.set(self, "dhcp_options", value)

    @property
    @pulumi.getter(name="enableDdosProtection")
    def enable_ddos_protection(self) -> Optional[pulumi.Input[bool]]:
        """
        Indicates if DDoS protection is enabled for all the protected resources in the virtual network. It requires a DDoS protection plan associated with the resource.
        """
        return pulumi.get(self, "enable_ddos_protection")

    @enable_ddos_protection.setter
    def enable_ddos_protection(self, value: Optional[pulumi.Input[bool]]):
        pulumi.set(self, "enable_ddos_protection", value)

    @property
    @pulumi.getter(name="enableVmProtection")
    def enable_vm_protection(self) -> Optional[pulumi.Input[bool]]:
        """
        Indicates if VM protection is enabled for all the subnets in the virtual network.
        """
        return pulumi.get(self, "enable_vm_protection")

    @enable_vm_protection.setter
    def enable_vm_protection(self, value: Optional[pulumi.Input[bool]]):
        pulumi.set(self, "enable_vm_protection", value)

    @property
    @pulumi.getter
    def etag(self) -> Optional[pulumi.Input[str]]:
        """
        Gets a unique read-only string that changes whenever the resource is updated.
        """
        return pulumi.get(self, "etag")

    @etag.setter
    def etag(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "etag", value)

    @property
    @pulumi.getter
    def id(self) -> Optional[pulumi.Input[str]]:
        """
        Resource ID.
        """
        return pulumi.get(self, "id")

    @id.setter
    def id(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "id", value)

    @property
    @pulumi.getter
    def location(self) -> Optional[pulumi.Input[str]]:
        """
        Resource location.
        """
        return pulumi.get(self, "location")

    @location.setter
    def location(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "location", value)

    @property
    @pulumi.getter(name="provisioningState")
    def provisioning_state(self) -> Optional[pulumi.Input[str]]:
        """
        The provisioning state of the PublicIP resource. Possible values are: 'Updating', 'Deleting', and 'Failed'.
        """
        return pulumi.get(self, "provisioning_state")

    @provisioning_state.setter
    def provisioning_state(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "provisioning_state", value)

    @property
    @pulumi.getter(name="resourceGuid")
    def resource_guid(self) -> Optional[pulumi.Input[str]]:
        """
        The resourceGuid property of the Virtual Network resource.
        """
        return pulumi.get(self, "resource_guid")

    @resource_guid.setter
    def resource_guid(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "resource_guid", value)

    @property
    @pulumi.getter
    def subnets(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['SubnetArgs']]]]:
        """
        A list of subnets in a Virtual Network.
        """
        return pulumi.get(self, "subnets")

    @subnets.setter
    def subnets(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['SubnetArgs']]]]):
        pulumi.set(self, "subnets", value)

    @property
    @pulumi.getter
    def tags(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
        """
        Resource tags.
        """
        return pulumi.get(self, "tags")

    @tags.setter
    def tags(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
        pulumi.set(self, "tags", value)

    @property
    @pulumi.getter(name="virtualNetworkName")
    def virtual_network_name(self) -> Optional[pulumi.Input[str]]:
        """
        The name of the virtual network.
        """
        return pulumi.get(self, "virtual_network_name")

    @virtual_network_name.setter
    def virtual_network_name(self, value: Optional[pulumi.Input[str]]):
        pulumi.set(self, "virtual_network_name", value)

    @property
    @pulumi.getter(name="virtualNetworkPeerings")
    def virtual_network_peerings(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['VirtualNetworkPeeringArgs']]]]:
        """
        A list of peerings in a Virtual Network.
        """
        return pulumi.get(self, "virtual_network_peerings")

    @virtual_network_peerings.setter
    def virtual_network_peerings(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['VirtualNetworkPeeringArgs']]]]):
        pulumi.set(self, "virtual_network_peerings", value)
class VirtualNetwork(pulumi.CustomResource):
    @overload
    def __init__(__self__,
                 resource_name: str,
                 opts: Optional[pulumi.ResourceOptions] = None,
                 address_space: Optional[pulumi.Input[pulumi.InputType['AddressSpaceArgs']]] = None,
                 ddos_protection_plan: Optional[pulumi.Input[pulumi.InputType['SubResourceArgs']]] = None,
                 dhcp_options: Optional[pulumi.Input[pulumi.InputType['DhcpOptionsArgs']]] = None,
                 enable_ddos_protection: Optional[pulumi.Input[bool]] = None,
                 enable_vm_protection: Optional[pulumi.Input[bool]] = None,
                 etag: Optional[pulumi.Input[str]] = None,
                 id: Optional[pulumi.Input[str]] = None,
                 location: Optional[pulumi.Input[str]] = None,
                 provisioning_state: Optional[pulumi.Input[str]] = None,
                 resource_group_name: Optional[pulumi.Input[str]] = None,
                 resource_guid: Optional[pulumi.Input[str]] = None,
                 subnets: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['SubnetArgs']]]]] = None,
                 tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
                 virtual_network_name: Optional[pulumi.Input[str]] = None,
                 virtual_network_peerings: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['VirtualNetworkPeeringArgs']]]]] = None,
                 __props__=None):
        """
        Virtual Network resource.

        :param str resource_name: The name of the resource.
        :param pulumi.ResourceOptions opts: Options for the resource.
        :param pulumi.Input[pulumi.InputType['AddressSpaceArgs']] address_space: The AddressSpace that contains an array of IP address ranges that can be used by subnets.
        :param pulumi.Input[pulumi.InputType['SubResourceArgs']] ddos_protection_plan: The DDoS protection plan associated with the virtual network.
        :param pulumi.Input[pulumi.InputType['DhcpOptionsArgs']] dhcp_options: The dhcpOptions that contains an array of DNS servers available to VMs deployed in the virtual network.
        :param pulumi.Input[bool] enable_ddos_protection: Indicates if DDoS protection is enabled for all the protected resources in the virtual network. It requires a DDoS protection plan associated with the resource.
        :param pulumi.Input[bool] enable_vm_protection: Indicates if VM protection is enabled for all the subnets in the virtual network.
        :param pulumi.Input[str] etag: Gets a unique read-only string that changes whenever the resource is updated.
        :param pulumi.Input[str] id: Resource ID.
        :param pulumi.Input[str] location: Resource location.
        :param pulumi.Input[str] provisioning_state: The provisioning state of the PublicIP resource. Possible values are: 'Updating', 'Deleting', and 'Failed'.
        :param pulumi.Input[str] resource_group_name: The name of the resource group.
        :param pulumi.Input[str] resource_guid: The resourceGuid property of the Virtual Network resource.
        :param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['SubnetArgs']]]] subnets: A list of subnets in a Virtual Network.
        :param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: Resource tags.
        :param pulumi.Input[str] virtual_network_name: The name of the virtual network.
        :param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['VirtualNetworkPeeringArgs']]]] virtual_network_peerings: A list of peerings in a Virtual Network.
        """
        ...
    @overload
    def __init__(__self__,
                 resource_name: str,
                 args: VirtualNetworkArgs,
                 opts: Optional[pulumi.ResourceOptions] = None):
        """
        Virtual Network resource.

        :param str resource_name: The name of the resource.
        :param VirtualNetworkArgs args: The arguments to use to populate this resource's properties.
        :param pulumi.ResourceOptions opts: Options for the resource.
        """
        ...
    def __init__(__self__, resource_name: str, *args, **kwargs):
        resource_args, opts = _utilities.get_resource_args_opts(VirtualNetworkArgs, pulumi.ResourceOptions, *args, **kwargs)
        if resource_args is not None:
            __self__._internal_init(resource_name, opts, **resource_args.__dict__)
        else:
            __self__._internal_init(resource_name, *args, **kwargs)
    def _internal_init(__self__,
                       resource_name: str,
                       opts: Optional[pulumi.ResourceOptions] = None,
                       address_space: Optional[pulumi.Input[pulumi.InputType['AddressSpaceArgs']]] = None,
                       ddos_protection_plan: Optional[pulumi.Input[pulumi.InputType['SubResourceArgs']]] = None,
                       dhcp_options: Optional[pulumi.Input[pulumi.InputType['DhcpOptionsArgs']]] = None,
                       enable_ddos_protection: Optional[pulumi.Input[bool]] = None,
                       enable_vm_protection: Optional[pulumi.Input[bool]] = None,
                       etag: Optional[pulumi.Input[str]] = None,
                       id: Optional[pulumi.Input[str]] = None,
                       location: Optional[pulumi.Input[str]] = None,
                       provisioning_state: Optional[pulumi.Input[str]] = None,
                       resource_group_name: Optional[pulumi.Input[str]] = None,
                       resource_guid: Optional[pulumi.Input[str]] = None,
                       subnets: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['SubnetArgs']]]]] = None,
                       tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
                       virtual_network_name: Optional[pulumi.Input[str]] = None,
                       virtual_network_peerings: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['VirtualNetworkPeeringArgs']]]]] = None,
                       __props__=None):
        if opts is None:
            opts = pulumi.ResourceOptions()
        if not isinstance(opts, pulumi.ResourceOptions):
            raise TypeError('Expected resource options to be a ResourceOptions instance')
        if opts.version is None:
            opts.version = _utilities.get_version()
        if opts.id is None:
            if __props__ is not None:
                raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
            __props__ = VirtualNetworkArgs.__new__(VirtualNetworkArgs)

            __props__.__dict__["address_space"] = address_space
            __props__.__dict__["ddos_protection_plan"] = ddos_protection_plan
            __props__.__dict__["dhcp_options"] = dhcp_options
            if enable_ddos_protection is None:
                enable_ddos_protection = False
            __props__.__dict__["enable_ddos_protection"] = enable_ddos_protection
            if enable_vm_protection is None:
                enable_vm_protection = False
            __props__.__dict__["enable_vm_protection"] = enable_vm_protection
            __props__.__dict__["etag"] = etag
            __props__.__dict__["id"] = id
            __props__.__dict__["location"] = location
            __props__.__dict__["provisioning_state"] = provisioning_state
            if resource_group_name is None and not opts.urn:
                raise TypeError("Missing required property 'resource_group_name'")
            __props__.__dict__["resource_group_name"] = resource_group_name
            __props__.__dict__["resource_guid"] = resource_guid
            __props__.__dict__["subnets"] = subnets
            __props__.__dict__["tags"] = tags
            __props__.__dict__["virtual_network_name"] = virtual_network_name
            __props__.__dict__["virtual_network_peerings"] = virtual_network_peerings
            __props__.__dict__["name"] = None
            __props__.__dict__["type"] = None
        alias_opts = pulumi.ResourceOptions(aliases=[
            pulumi.Alias(type_="azure-nextgen:network/v20180401:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20150501preview:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20150501preview:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20150615:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20150615:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20160330:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20160330:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20160601:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20160601:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20160901:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20160901:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20161201:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20161201:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20170301:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20170301:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20170601:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20170601:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20170801:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20170801:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20170901:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20170901:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20171001:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20171001:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20171101:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20171101:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20180101:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20180101:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20180201:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20180201:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20180601:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20180601:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20180701:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20180701:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20180801:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20180801:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20181001:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20181001:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20181101:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20181101:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20181201:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20181201:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20190201:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20190201:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20190401:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20190401:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20190601:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20190601:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20190701:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20190701:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20190801:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20190801:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20190901:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20190901:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20191101:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20191101:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20191201:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20191201:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20200301:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20200301:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20200401:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20200401:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20200501:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20200501:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20200601:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20200601:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20200701:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20200701:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20200801:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20200801:VirtualNetwork"),
            pulumi.Alias(type_="azure-native:network/v20201101:VirtualNetwork"),
            pulumi.Alias(type_="azure-nextgen:network/v20201101:VirtualNetwork")])
        opts = pulumi.ResourceOptions.merge(opts, alias_opts)
        super(VirtualNetwork, __self__).__init__(
            'azure-native:network/v20180401:VirtualNetwork',
            resource_name,
            __props__,
            opts)
    @staticmethod
    def get(resource_name: str,
            id: pulumi.Input[str],
            opts: Optional[pulumi.ResourceOptions] = None) -> 'VirtualNetwork':
        """
        Get an existing VirtualNetwork resource's state with the given name, id, and optional extra
        properties used to qualify the lookup.

        :param str resource_name: The unique name of the resulting resource.
        :param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
        :param pulumi.ResourceOptions opts: Options for the resource.
        """
        opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))

        __props__ = VirtualNetworkArgs.__new__(VirtualNetworkArgs)

        __props__.__dict__["address_space"] = None
        __props__.__dict__["ddos_protection_plan"] = None
        __props__.__dict__["dhcp_options"] = None
        __props__.__dict__["enable_ddos_protection"] = None
        __props__.__dict__["enable_vm_protection"] = None
        __props__.__dict__["etag"] = None
        __props__.__dict__["location"] = None
        __props__.__dict__["name"] = None
        __props__.__dict__["provisioning_state"] = None
        __props__.__dict__["resource_guid"] = None
        __props__.__dict__["subnets"] = None
        __props__.__dict__["tags"] = None
        __props__.__dict__["type"] = None
        __props__.__dict__["virtual_network_peerings"] = None
        return VirtualNetwork(resource_name, opts=opts, __props__=__props__)
    @property
    @pulumi.getter(name="addressSpace")
    def address_space(self) -> pulumi.Output[Optional['outputs.AddressSpaceResponse']]:
        """
        The AddressSpace that contains an array of IP address ranges that can be used by subnets.
        """
        return pulumi.get(self, "address_space")

    @property
    @pulumi.getter(name="ddosProtectionPlan")
    def ddos_protection_plan(self) -> pulumi.Output[Optional['outputs.SubResourceResponse']]:
        """
        The DDoS protection plan associated with the virtual network.
        """
        return pulumi.get(self, "ddos_protection_plan")

    @property
    @pulumi.getter(name="dhcpOptions")
    def dhcp_options(self) -> pulumi.Output[Optional['outputs.DhcpOptionsResponse']]:
        """
        The dhcpOptions that contains an array of DNS servers available to VMs deployed in the virtual network.
        """
        return pulumi.get(self, "dhcp_options")

    @property
    @pulumi.getter(name="enableDdosProtection")
    def enable_ddos_protection(self) -> pulumi.Output[Optional[bool]]:
        """
        Indicates if DDoS protection is enabled for all the protected resources in the virtual network. It requires a DDoS protection plan associated with the resource.
        """
        return pulumi.get(self, "enable_ddos_protection")

    @property
    @pulumi.getter(name="enableVmProtection")
    def enable_vm_protection(self) -> pulumi.Output[Optional[bool]]:
        """
        Indicates if VM protection is enabled for all the subnets in the virtual network.
        """
        return pulumi.get(self, "enable_vm_protection")

    @property
    @pulumi.getter
    def etag(self) -> pulumi.Output[Optional[str]]:
        """
        Gets a unique read-only string that changes whenever the resource is updated.
        """
        return pulumi.get(self, "etag")

    @property
    @pulumi.getter
    def location(self) -> pulumi.Output[Optional[str]]:
        """
        Resource location.
        """
        return pulumi.get(self, "location")

    @property
    @pulumi.getter
    def name(self) -> pulumi.Output[str]:
        """
        Resource name.
        """
        return pulumi.get(self, "name")

    @property
    @pulumi.getter(name="provisioningState")
    def provisioning_state(self) -> pulumi.Output[Optional[str]]:
        """
        The provisioning state of the virtual network resource. Possible values are: 'Updating', 'Deleting', and 'Failed'.
        """
        return pulumi.get(self, "provisioning_state")

    @property
    @pulumi.getter(name="resourceGuid")
    def resource_guid(self) -> pulumi.Output[Optional[str]]:
        """
        The resourceGuid property of the Virtual Network resource.
        """
        return pulumi.get(self, "resource_guid")

    @property
    @pulumi.getter
    def subnets(self) -> pulumi.Output[Optional[Sequence['outputs.SubnetResponse']]]:
        """
        A list of subnets in a Virtual Network.
        """
        return pulumi.get(self, "subnets")

    @property
    @pulumi.getter
    def tags(self) -> pulumi.Output[Optional[Mapping[str, str]]]:
        """
        Resource tags.
        """
        return pulumi.get(self, "tags")

    @property
    @pulumi.getter
    def type(self) -> pulumi.Output[str]:
        """
        Resource type.
        """
        return pulumi.get(self, "type")

    @property
    @pulumi.getter(name="virtualNetworkPeerings")
    def virtual_network_peerings(self) -> pulumi.Output[Optional[Sequence['outputs.VirtualNetworkPeeringResponse']]]:
        """
        A list of peerings in a Virtual Network.
        """
        return pulumi.get(self, "virtual_network_peerings")
# File: testcases/indicator_tests/zerofindertests.py
# Repo: quantwizard-com/pythonbacktest (Apache-2.0)
import unittest
from pythonbacktest.indicator import ZeroFinder
import math


class ZeroFinderTests(unittest.TestCase):

    def test_growing_line_list_values(self):
        input_values = [-5, -4, -3, -2, -1, -1, 0, 1, 2, 3, 4, 5]
        expected_all_results = [None, -math.log(4.0 + 1, 2), -2.0, -math.log(2.0 + 1, 2),
                                -1.0, None, 0.0, None, None, None, None, None]
        expected_result = None

        zero_finder = ZeroFinder()
        zero_finder.on_new_upstream_value(input_values)

        actual_all_results = zero_finder.all_result
        actual_result = zero_finder.result

        self.assertEqual(expected_all_results, actual_all_results)
        self.assertEqual(expected_result, actual_result)

    def test_growing_and_falling_line_list_values(self):
        input_values = [-5, -4, -3, -2, -1, -1, 0, 1, 2, 3, 4, 5, 4, 3, 2, 1, 0]
        expected_all_results = [None, -math.log(4.0 + 1, 2), -2.0, -math.log(2.0 + 1, 2),
                                -1.0, None, 0.0, None, None, None, None, None,
                                math.log(4.0 + 1, 2), 2.0, math.log(2.0 + 1, 2), 1.0, 0.0]
        expected_result = 0

        zero_finder = ZeroFinder()
        zero_finder.on_new_upstream_value(input_values)

        actual_all_results = zero_finder.all_result
        actual_result = zero_finder.result

        self.assertEqual(expected_all_results, actual_all_results)
        self.assertEqual(expected_result, actual_result)
# File: coffer/admin.py
# Repo: DistrictDataLabs/trinket (Apache-2.0)
# coffer.admin
# Administrative utilities for Coffer models.
#
# Author:  Benjamin Bengfort <bbengfort@districtdatalabs.com>
# Created: Thu Oct 08 21:44:48 2015 -0400
#
# Copyright (C) 2015 District Data Labs
# For license information, see LICENSE.txt
#
# ID: admin.py [] benjamin@bengfort.com $

"""
Administrative utilities for Coffer models.
"""

##########################################################################
## Imports
##########################################################################

from django.contrib import admin
from coffer.models import Dataset

##########################################################################
## Register Admin
##########################################################################

admin.site.register(Dataset)
# File: universe/vncdriver/error.py
# Repo: BitJetKit/universe (MIT)
class Error(Exception):
    pass
# File: api/migrations/0064_auto_20200504_0917.py
# Repo: IFRCGo/ifrcgo-api (MIT)
# Generated by Django 2.2.10 on 2020-05-04 09:17
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('api', '0063_auto_20200501_1348'),
    ]

    operations = [
        migrations.AddField(
            model_name='fieldreport',
            name='epi_cases',
            field=models.IntegerField(blank=True, null=True),
        ),
        migrations.AddField(
            model_name='fieldreport',
            name='epi_confirmed_cases',
            field=models.IntegerField(blank=True, null=True),
        ),
        migrations.AddField(
            model_name='fieldreport',
            name='epi_num_dead',
            field=models.IntegerField(blank=True, null=True),
        ),
        migrations.AddField(
            model_name='fieldreport',
            name='epi_probable_cases',
            field=models.IntegerField(blank=True, null=True),
        ),
        migrations.AddField(
            model_name='fieldreport',
            name='epi_suspected_cases',
            field=models.IntegerField(blank=True, null=True),
        ),
    ]
# File: app_routes/threads/thread_id/applications/__init__.py
# Repo: kskarbinski/threads-api (MIT)
from .threads_thread_id_applications_route import ThreadsThreadIdApplicationsRoute
# File: je_editor/utils/exception/__init__.py
# Repo: JE-Chen/je_editor (MIT)
from je_editor.utils.exception import *
# File: recognition/src/models/modules/__init__.py
# Repo: AlexeyZhuravlev/OCR-experiments (MIT)
from .attention import AttentionCell
from .stn import SpatialTransformerNetwork
from .mobile_conv import MBInvertedResidual
# File: src/synamic/core/standalones/functions/__init__.py
# Repo: SabujXi/SynamicX (MIT)
"""
author: "Md. Sabuj Sarker"
copyright: "Copyright 2017-2018, The Synamic Project"
credits: ["Md. Sabuj Sarker"]
license: "MIT"
maintainer: "Md. Sabuj Sarker"
email: "md.sabuj.sarker@gmail.com"
status: "Development"
""" | 28.555556 | 58 | 0.614786 | 29 | 257 | 5.448276 | 0.655172 | 0.177215 | 0.329114 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.040816 | 0.237354 | 257 | 9 | 59 | 28.555556 | 0.765306 | 0.828794 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
# File: backend/78.py
# Repo: Uknown-creator/T1-Challenge (MIT)
from ai import Face
print(Face.emotion("123.jpg"))
# File: teach04/apis_v1_urls.py
# Repo: jianghaiming0707/whoami (Apache-2.0)
from django.conf.urls import url
from teach04 import apis_v1 as api

urlpatterns = [
    url(r'^add-language$', api.add_language_api)
]
# File: loader.py
# Repo: math2001/PluginSettingContext (MIT)
# -*- encoding: utf-8 -*-
from package_setting_context.all import PackageSettingContext
# File: python/built_ins/any_or_all.py
# Repo: scouvreur/hackerrank (MIT)
_, list_ints = int(input().strip()), list(map(int, input().strip().split(" ")))
print(
    all(list(map(lambda x: x > 0, list_ints)))
    and any(list(map(lambda x: str(x) == str(x)[::-1], list_ints)))
)
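The one-liner above reads an integer count and a list of integers from stdin, then prints `True` only when every integer is positive and at least one is a decimal palindrome. The same check can be factored into a testable helper (the function name is ours, not part of the original HackerRank solution):

```python
def all_positive_any_palindrome(ints):
    """True iff every value is positive and at least one reads the
    same forwards and backwards in decimal notation."""
    return all(x > 0 for x in ints) and any(str(x) == str(x)[::-1] for x in ints)
```

Using generator expressions instead of `list(map(...))` avoids building intermediate lists and lets `all`/`any` short-circuit.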
# File: jsonvl/core/_number/__init__.py
# Repo: gregorybchris/jsonvl (Apache-2.0)
"""Module for number validation logic."""
# File: examples/__init__.py
# Repo: Power8Chang/python-redfish (Apache-2.0)
# coding=utf-8
from __future__ import unicode_literals
from __future__ import print_function
from __future__ import division
from __future__ import absolute_import

from future import standard_library
standard_library.install_aliases()

__author__ = 'uggla'
# File: app/api/data/tests/test_user.py
# Repo: rummens1337/federated-social-network (MIT)
# from flask import Blueprint, request
# from app.api.utils import good_json_response, bad_json_response

# blueprint = Blueprint('data_user', __name__)


# blueprint.route('/')
def test_user():
    pass


# blueprint.route('/posts')
def test_posts():
    pass


# blueprint.route('/details')
def test_details():
    pass


# blueprint.route('/register', methods=['POST'])
def test_register():
    pass


# blueprint.route('/delete', methods=['POST'])
def delete():
    pass


# blueprint.route('/edit', methods=['POST'])
def test_edit():
    pass


# __all__ = ('blueprint')
# File: src/elasticsearch_kibana_cli/exceptions/ElasticsearchKibanaCLIException.py
# Repo: ndejong/elasticsearch_kibana_cli (BSD-2-Clause)
class ElasticsearchKibanaCLIException(Exception):
    pass
# File: src/exceptions.py
# Repo: PearCoding/SpriteRecourceCompiler (MIT)
class SRCException(Exception):
    pass


class XMLError(SRCException):
    def __init__(self, str):
        self.str = str

    def __str__(self):
        return 'XML Error: {0}'.format(self.str)


class ProcessorError(SRCException):
    def __init__(self, str):
        self.str = str

    def __str__(self):
        return 'Processing Error: {0}'.format(self.str)
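The hierarchy above lets callers catch `SRCException` to handle both error kinds at once. A short usage sketch (the base and `XMLError` classes are restated so the snippet runs standalone, and `parse` is a hypothetical caller, not part of the original file):

```python
class SRCException(Exception):
    pass


class XMLError(SRCException):
    def __init__(self, str):
        self.str = str

    def __str__(self):
        return 'XML Error: {0}'.format(self.str)


def parse(document):
    # Hypothetical parser that fails on empty input.
    if not document:
        raise XMLError('empty document')
    return document


try:
    parse('')
except SRCException as error:  # XMLError is caught via the base class
    message = str(error)
```

Catching the shared base class is the point of defining `SRCException`: tooling can report any compiler error uniformly while still distinguishing `XMLError` from `ProcessorError` when needed.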
# File: appengine/findit/common/waterfall/test/buildbucket_client_test.py
# Repo: allaparthi/monorail (BSD-3-Clause)
# Copyright 2015 The Chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.

import collections
from datetime import datetime
import json
import mock

from go.chromium.org.luci.buildbucket.proto.build_pb2 import Build
from go.chromium.org.luci.buildbucket.proto.build_pb2 import BuilderID
from go.chromium.org.luci.buildbucket.proto.common_pb2 import GitilesCommit
from go.chromium.org.luci.buildbucket.proto.rpc_pb2 import SearchBuildsResponse
from testing_utils import testing

from gae_libs.http import http_client_appengine
from common.findit_http_client import FinditHttpClient
from common.waterfall import buildbucket_client

_Result = collections.namedtuple('Result',
                                 ['content', 'status_code', 'headers'])


class BuildBucketClientTest(testing.AppengineTestCase):

  def setUp(self):
    super(BuildBucketClientTest, self).setUp()
    self.maxDiff = None
    with self.mock_urlfetch() as urlfetch:
      self.mocked_urlfetch = urlfetch

  def testGetBucketName(self):
    mapping = {
        'a': 'master.a',
        'master.b': 'master.b',
    }
    for master_name, expected_full_master_name in mapping.iteritems():
      self.assertEqual(expected_full_master_name,
                       buildbucket_client._GetBucketName(master_name))

  def testTryJobToBuildbucketRequestWithTests(self):
    properties = {'a': '1', 'tests': {'a_tests': ['Test.One', 'Test.Two']}}
    try_job = buildbucket_client.TryJob('m', 'b', properties, ['a'])

    expected_parameters = {
        'builder_name': 'b',
        'properties': properties,
    }

    request_json = try_job.ToBuildbucketRequest()
    self.assertEqual('master.m', request_json['bucket'])
    self.assertEqual(2, len(request_json['tags']))
    self.assertEqual('a', request_json['tags'][0])
    self.assertEqual('user_agent:findit', request_json['tags'][1])
    parameters = json.loads(request_json['parameters_json'])
    self.assertEqual(expected_parameters, parameters)

  def testTryJobToSwarmbucketRequest(self):
    properties = {'a': '1', 'tests': {'a_tests': ['Test.One', 'Test.Two'],}}
    try_job = buildbucket_client.TryJob('luci.c', 'b', properties, ['a'],
                                        'builder_abc123')

    expected_parameters = {
        'builder_name': 'b',
        'swarming': {
            'override_builder_cfg': {
                'caches': [{
                    'name': 'builder_abc123',
                    'path': 'builder'
                }],
            }
        },
        'properties': properties,
    }

    request_json = try_job.ToBuildbucketRequest()
    self.assertEqual('luci.c', request_json['bucket'])
    self.assertEqual(2, len(request_json['tags']))
    self.assertEqual('a', request_json['tags'][0])
    self.assertEqual('user_agent:findit', request_json['tags'][1])
    parameters = json.loads(request_json['parameters_json'])
    self.assertEqual(expected_parameters, parameters)

  def testTryJobToSwarmbucketRequestWithOverrides(self):
    properties = {
        'a': '1',
        'recipe': 'b',
        'tests': {
            'a_tests': ['Test.One', 'Test.Two'],
        }
    }
    try_job = buildbucket_client.TryJob(
        'luci.c',
        'b',
        properties, ['a'],
        'builder_abc123', ['os:Linux'],
        priority=1)

    expected_parameters = {
        'builder_name': 'b',
        'swarming': {
            'override_builder_cfg': {
                'caches': [{
                    'name': 'builder_abc123',
                    'path': 'builder'
                }],
                'dimensions': ['os:Linux'],
                'recipe': {
                    'name': 'b'
                },
                'priority': 1
            }
        },
        'properties': {
            'a': '1',
            'recipe': 'b',
            'tests': {
                'a_tests': ['Test.One', 'Test.Two']
            }
        },
    }

    request_json = try_job.ToBuildbucketRequest()
    self.assertEqual('luci.c', request_json['bucket'])
    self.assertEqual(2, len(request_json['tags']))
    self.assertEqual('a', request_json['tags'][0])
    self.assertEqual('user_agent:findit', request_json['tags'][1])
    parameters = json.loads(request_json['parameters_json'])
    self.assertEqual(expected_parameters, parameters)

  @mock.patch.object(http_client_appengine.urlfetch, 'fetch')
  def testTriggerTryJobsSuccess(self, mocked_fetch):
    response = {
        'build': {
            'id': '1',
            'url': 'url',
            'status': 'SCHEDULED',
        }
    }
    try_job = buildbucket_client.TryJob('m', 'b', {'a': 'b'}, [], {})
    mocked_fetch.return_value = _Result(
        status_code=200, content=json.dumps(response), headers={})
    results = buildbucket_client.TriggerTryJobs([try_job])
    self.assertEqual(1, len(results))
    error, build = results[0]
    self.assertIsNone(error)
    self.assertIsNotNone(build)
    self.assertEqual('1', build.id)
    self.assertEqual('url', build.url)
    self.assertEqual('SCHEDULED', build.status)

  @mock.patch.object(http_client_appengine.urlfetch, 'fetch')
  def testTriggerTryJobsFailure(self, mocked_fetch):
    response = {
        'error': {
            'reason': 'error',
            'message': 'message',
        }
    }
    try_job = buildbucket_client.TryJob('m', 'b', {}, [], {})
    mocked_fetch.return_value = _Result(
        status_code=200, content=json.dumps(response), headers={})
    results = buildbucket_client.TriggerTryJobs([try_job])
    self.assertEqual(1, len(results))
    error, build = results[0]
    self.assertIsNotNone(error)
    self.assertEqual('error', error.reason)
    self.assertEqual('message', error.message)
    self.assertIsNone(build)

  @mock.patch.object(http_client_appengine.urlfetch, 'fetch')
  def testTriggerTryJobsRequestFailure(self, mocked_fetch):
    response = 'Not Found'
    try_job = buildbucket_client.TryJob('m', 'b', {}, [], {})
    mocked_fetch.return_value = _Result(
        status_code=404, content=response, headers={})
    results = buildbucket_client.TriggerTryJobs([try_job])
    self.assertEqual(1, len(results))
    error, build = results[0]
    self.assertIsNotNone(error)
    self.assertEqual(404, error.reason)
    self.assertEqual('Not Found', error.message)
    self.assertIsNone(build)
    @mock.patch.object(http_client_appengine.urlfetch, 'fetch')
    def testGetTryJobsSuccess(self, mocked_fetch):
        response = {'build': {'id': '1', 'url': 'url', 'status': 'STARTED'}}
        mocked_fetch.return_value = _Result(
            status_code=200, content=json.dumps(response), headers={})
        results = buildbucket_client.GetTryJobs(['1'])
        self.assertEqual(1, len(results))
        error, build = results[0]
        self.assertIsNone(error)
        self.assertIsNotNone(build)
        self.assertEqual('1', build.id)
        self.assertEqual('url', build.url)
        self.assertEqual('STARTED', build.status)

    @mock.patch.object(http_client_appengine.urlfetch, 'fetch')
    def testGetTryJobsFailure(self, mocked_fetch):
        response = {
            'error': {
                'reason': 'BUILD_NOT_FOUND',
                'message': 'message',
            }
        }
        mocked_fetch.return_value = _Result(
            status_code=200, content=json.dumps(response), headers={})
        results = buildbucket_client.GetTryJobs(['2'])
        self.assertEqual(1, len(results))
        error, build = results[0]
        self.assertIsNotNone(error)
        self.assertEqual('BUILD_NOT_FOUND', error.reason)
        self.assertEqual('message', error.message)
        self.assertIsNone(build)

    @mock.patch.object(http_client_appengine.urlfetch, 'fetch')
    def testGetTryJobsRequestFailure(self, mocked_fetch):
        response = 'Not Found'
        mocked_fetch.return_value = _Result(
            status_code=404, content=response, headers={})
        results = buildbucket_client.GetTryJobs(['3'])
        self.assertEqual(1, len(results))
        error, build = results[0]
        self.assertIsNotNone(error)
        self.assertEqual(404, error.reason)
        self.assertEqual('Not Found', error.message)
        self.assertIsNone(build)
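The `mocked_fetch.return_value = _Result(...)` lines above rely on a small helper defined earlier in this test module, outside this excerpt. A minimal sketch of what such a helper could look like — the name and fields are inferred from usage here, so this is an assumption, not the module's actual definition:

```python
import collections

# Hypothetical stand-in for the _Result helper used by the tests above: a
# tiny immutable object mimicking an urlfetch response.
_Result = collections.namedtuple('_Result', ['status_code', 'content', 'headers'])

# Example mirroring the 404 case exercised in testGetTryJobsRequestFailure.
result = _Result(status_code=404, content='Not Found', headers={})
```

A namedtuple is a natural fit because the tests only ever read the three attributes and never mutate the fake response.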
    @mock.patch.object(FinditHttpClient, 'Post')
    def testGetV2Build(self, mock_post):
        build_id = '8945610992972640896'
        mock_build = Build()
        mock_build.id = int(build_id)
        mock_build.status = 12
        mock_build.output.properties['mastername'] = 'chromium.linux'
        mock_build.output.properties['buildername'] = 'Linux Builder'
        mock_build.output.properties.get_or_create_struct(
            'swarm_hashes_ref/heads/mockmaster(at){#123}'
        )['mock_target'] = 'mock_hash'
        gitiles_commit = mock_build.input.gitiles_commit
        gitiles_commit.host = 'gitiles.host'
        gitiles_commit.project = 'gitiles/project'
        gitiles_commit.ref = 'refs/heads/mockmaster'
        mock_build.builder.project = 'mock_luci_project'
        mock_build.builder.bucket = 'mock_bucket'
        mock_build.builder.builder = 'Linux Builder'
        mock_headers = {'X-Prpc-Grpc-Code': '0'}
        binary_data = mock_build.SerializeToString()
        mock_post.return_value = (200, binary_data, mock_headers)
        build = buildbucket_client.GetV2Build(build_id)
        self.assertIsNotNone(build)
        self.assertEqual(mock_build.id, build.id)

        mock_headers = {'X-Prpc-Grpc-Code': '4'}
        binary_data = mock_build.SerializeToString()
        mock_post.return_value = (404, binary_data, mock_headers)
        self.assertIsNone(buildbucket_client.GetV2Build(build_id))
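The two branches above hinge on the pRPC convention that the `X-Prpc-Grpc-Code` header equal to `'0'` means success. A sketch of that check in isolation — the function name and exact behaviour are assumptions for illustration, not `buildbucket_client`'s real implementation:

```python
def parse_prpc_response(status_code, content, headers):
    # Treat the response as usable only when the HTTP status is 200 and the
    # pRPC header reports gRPC code '0'; otherwise signal failure with None.
    if status_code != 200 or headers.get('X-Prpc-Grpc-Code') != '0':
        return None
    return content
```

Under this convention the `(404, binary_data, {'X-Prpc-Grpc-Code': '4'})` reply above yields `None`, which is what the test asserts.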
    @mock.patch.object(FinditHttpClient, 'Post')
    def testGetBuildNumberFromBuildId(self, mock_post):
        build_id = 10000
        expected_build_number = 12345
        mock_build = Build()
        mock_build.id = build_id
        mock_build.status = 12
        mock_build.output.properties['mastername'] = 'chromium.linux'
        mock_build.output.properties['buildername'] = 'Linux Builder'
        mock_build.output.properties['buildnumber'] = expected_build_number
        mock_build.output.properties.get_or_create_struct(
            'swarm_hashes_ref/heads/mockmaster(at){#123}'
        )['mock_target'] = 'mock_hash'
        gitiles_commit = mock_build.input.gitiles_commit
        gitiles_commit.host = 'gitiles.host'
        gitiles_commit.project = 'gitiles/project'
        gitiles_commit.ref = 'refs/heads/mockmaster'
        mock_build.builder.project = 'mock_luci_project'
        mock_build.builder.bucket = 'mock_bucket'
        mock_build.builder.builder = 'Linux Builder'
        mock_headers = {'X-Prpc-Grpc-Code': '0'}
        binary_data = mock_build.SerializeToString()
        mock_post.return_value = (200, binary_data, mock_headers)
        self.assertEqual(expected_build_number,
                         buildbucket_client.GetBuildNumberFromBuildId(build_id))
    @mock.patch.object(FinditHttpClient, 'Post')
    def testGetV2BuildByBuilderAndBuildNumber(self, mock_post):
        master_name = 'tryserver.chromium.win'
        mock_build = Build()
        mock_build.input.properties['mastername'] = master_name
        mock_headers = {'X-Prpc-Grpc-Code': '0'}
        binary_data = mock_build.SerializeToString()
        mock_post.return_value = (200, binary_data, mock_headers)
        build = buildbucket_client.GetV2BuildByBuilderAndBuildNumber(
            'chromium', 'try', 'win10_chromium_x64_rel_ng', 123)
        self.assertEqual(master_name, build.input.properties['mastername'])

    @mock.patch.object(FinditHttpClient, 'Post')
    def testGetV2BuildByBuilderAndBuildNumberRequestFail(self, mock_post):
        mock_headers = {'X-Prpc-Grpc-Code': '404'}
        mock_post.return_value = (404, None, mock_headers)
        build = buildbucket_client.GetV2BuildByBuilderAndBuildNumber(
            'chromium', 'try', 'win10_chromium_x64_rel_ng', 123)
        self.assertIsNone(build)

    @mock.patch.object(FinditHttpClient, 'Post')
    def testSearchV2BuildsOnBuilder(self, mock_post):
        builder = BuilderID(project='chromium', bucket='try', builder='linux-rel')
        mock_build = Build(number=100)
        mock_response = SearchBuildsResponse(
            next_page_token='next_page_token', builds=[mock_build])
        mock_headers = {'X-Prpc-Grpc-Code': '0'}
        binary_data = mock_response.SerializeToString()
        mock_post.return_value = (200, binary_data, mock_headers)
        res = buildbucket_client.SearchV2BuildsOnBuilder(builder)
        self.assertEqual('next_page_token', res.next_page_token)
        self.assertEqual(1, len(res.builds))
        self.assertEqual(100, res.builds[0].number)

    @mock.patch.object(FinditHttpClient, 'Post')
    def testSearchV2BuildsOnBuilderWithTimeRange(self, mock_post):
        builder = BuilderID(project='chromium', bucket='try', builder='linux-rel')
        mock_build = Build(number=100)
        mock_response = SearchBuildsResponse(
            next_page_token='next_page_token', builds=[mock_build])
        mock_headers = {'X-Prpc-Grpc-Code': '0'}
        binary_data = mock_response.SerializeToString()
        mock_post.return_value = (200, binary_data, mock_headers)
        res = buildbucket_client.SearchV2BuildsOnBuilder(
            builder, create_time_range=(None, datetime(2019, 4, 9, 3, 33)))
        self.assertEqual('next_page_token', res.next_page_token)
        self.assertEqual(1, len(res.builds))
        self.assertEqual(100, res.builds[0].number)

    @mock.patch.object(FinditHttpClient, 'Post')
    def testSearchV2BuildsOnBuilderWithBuildRange(self, mock_post):
        builder = BuilderID(project='chromium', bucket='try', builder='linux-rel')
        mock_build = Build(number=100)
        mock_response = SearchBuildsResponse(
            next_page_token='next_page_token', builds=[mock_build])
        mock_headers = {'X-Prpc-Grpc-Code': '0'}
        binary_data = mock_response.SerializeToString()
        mock_post.return_value = (200, binary_data, mock_headers)
        res = buildbucket_client.SearchV2BuildsOnBuilder(
            builder, build_range=(None, 800000000099), page_size=1)
        self.assertEqual('next_page_token', res.next_page_token)
        self.assertEqual(1, len(res.builds))
        self.assertEqual(100, res.builds[0].number)
    @mock.patch.object(FinditHttpClient, 'Post')
    def testTriggerV2Build(self, mock_post):
        mock_build = Build()
        mock_headers = {'X-Prpc-Grpc-Code': '0'}
        binary_data = mock_build.SerializeToString()
        mock_post.return_value = (200, binary_data, mock_headers)
        builder = BuilderID(project='chromium', bucket='try', builder='linux-rel')
        gitiles_commit = GitilesCommit(
            project='gitiles/project',
            host='gitiles.host.com',
            ref='refs/heads/master',
            id='git_hash')
        properties = {
            'property1': 'property1',
            'property2': ['property2'],
            'property3': {
                'property3-key': 'property3-value'
            }
        }
        build = buildbucket_client.TriggerV2Build(builder, gitiles_commit,
                                                  properties)
        self.assertIsNotNone(build)

    @mock.patch.object(FinditHttpClient, 'Post')
    def testTriggerV2BuildFailed(self, mock_post):
        mock_build = Build()
        mock_headers = {'X-Prpc-Grpc-Code': '0'}
        binary_data = mock_build.SerializeToString()
        mock_post.return_value = (403, binary_data, mock_headers)
        builder = BuilderID(project='chromium', bucket='try', builder='linux-rel')
        gitiles_commit = GitilesCommit(
            project='gitiles/project',
            host='gitiles.host.com',
            ref='refs/heads/master',
            id='git_hash')
        properties = {
            'property1': 'property1',
            'property2': ['property2'],
            'property3': {
                'property3-key': 'property3-value'
            }
        }
        tags = [{'key': 'tag-key', 'value': 'tag-value'}]
        dimensions = [{'key': 'gpu', 'value': 'NVidia'}]
        build = buildbucket_client.TriggerV2Build(
            builder, gitiles_commit, properties, tags=tags, dimensions=dimensions)
        self.assertIsNone(build)

# File: v0.90/dc/utils/test_commun.py (repo: iaiting/Flask-and-pywebview-followup-application-gui, MIT license)
import unittest
from dc.utils.commun import Commun
com_funcs = Commun()
class TestCommun(unittest.TestCase):
    pass

# File: tests/web_platform/css_flexbox_1/test_flex_flow.py (repo: fletchgraham/colosseum, BSD-3-Clause license)
from tests.utils import W3CTestCase
class TestFlexFlow(W3CTestCase):
    vars().update(W3CTestCase.find_tests(__file__, 'flex-flow-'))
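The one-liner above uses a niche but legal idiom: inside a class body, `vars()` is the class namespace itself, so updating it injects attributes (here, generated test methods) at class-definition time. A self-contained sketch of the mechanism, with illustrative names not taken from W3CTestCase:

```python
# Hypothetical test-method factory standing in for W3CTestCase.find_tests().
def find_tests():
    return {'test_generated': lambda self: None}


class Example:
    # vars() inside a class body is the class's namespace dict, so this
    # injects every returned callable as a method of Example.
    vars().update(find_tests())
```

Note this works in a class body precisely because class bodies execute against a real dict, unlike function bodies where writing to `locals()` has no effect.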

# File: score_map_method/__init__.py (repo: naver-ai/calm, MIT license)
from .activation_map import activation_map
from .backprop import backprop
from .input_grad import input_grad
from .smooth_grad import smooth_grad
from .smooth_grad_square import smooth_grad_square
from .integrated_grad import integrated_grad
from .var_grad import var_grad
| 30.444444 | 50 | 0.868613 | 42 | 274 | 5.333333 | 0.261905 | 0.178571 | 0.125 | 0.160714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.105839 | 274 | 8 | 51 | 34.25 | 0.914286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |

# File: venv/Lib/site-packages/palettable/cubehelix/__init__.py (repo: EkremBayar/bayar, MIT license)
from __future__ import absolute_import
from .cubehelix import __doc__, Cubehelix, print_maps, get_map, _get_all_maps
globals().update(_get_all_maps())
| 25.5 | 77 | 0.823529 | 22 | 153 | 4.954545 | 0.590909 | 0.110092 | 0.183486 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.098039 | 153 | 5 | 78 | 30.6 | 0.789855 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0.333333 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
6a2c65cbd04301084e5517a72af6517d26caad84 | 10,015 | py | Python | test/raw_explain/test_raw_explanation.py | eedeleon/interpret-community | b6840118dc4575fedf3bd478a90b02ef36bea050 | [
"MIT"
] | null | null | null | test/raw_explain/test_raw_explanation.py | eedeleon/interpret-community | b6840118dc4575fedf3bd478a90b02ef36bea050 | [
"MIT"
] | null | null | null | test/raw_explain/test_raw_explanation.py | eedeleon/interpret-community | b6840118dc4575fedf3bd478a90b02ef36bea050 | [
"MIT"
] | null | null | null | # ---------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# ---------------------------------------------------------
import pytest
import numpy as np
from common_utils import create_sklearn_svm_classifier, create_sklearn_random_forest_regressor, \
    create_sklearn_linear_regressor, create_multiclass_sparse_newsgroups_data, \
    create_sklearn_logistic_regressor
from constants import DatasetConstants, owner_email_tools_and_ux
from datasets import retrieve_dataset
from sklearn.model_selection import train_test_split
from interpret_community.mimic.models.linear_model import LinearExplainableModel
LINEAR_METHOD = 'mimic.linear'
@pytest.mark.owner(email=owner_email_tools_and_ux)
@pytest.mark.usefixtures('clean_dir')
class TestRawExplanations:

    def test_get_global_raw_explanations_classification(self, iris, tabular_explainer):
        model = create_sklearn_svm_classifier(iris[DatasetConstants.X_TRAIN], iris[DatasetConstants.Y_TRAIN])
        exp = tabular_explainer(model, iris[DatasetConstants.X_TRAIN], features=iris[DatasetConstants.FEATURES],
                                classes=iris[DatasetConstants.CLASSES])
        global_explanation = exp.explain_global(iris[DatasetConstants.X_TEST])
        assert not global_explanation.is_raw
        assert not global_explanation.is_engineered
        num_engineered_feats = len(iris[DatasetConstants.FEATURES])
        feature_map = np.eye(num_engineered_feats - 1, num_engineered_feats)
        feature_names = [str(i) for i in range(feature_map.shape[0])]
        global_raw_explanation = global_explanation.get_raw_explanation(
            [feature_map], raw_feature_names=feature_names[:feature_map.shape[0]])
        self.validate_global_raw_explanation_classification(global_explanation, global_raw_explanation, feature_map,
                                                            iris[DatasetConstants.CLASSES], feature_names)
    def test_get_global_raw_explanations_regression(self, boston, tabular_explainer):
        model = create_sklearn_random_forest_regressor(boston[DatasetConstants.X_TRAIN],
                                                       boston[DatasetConstants.Y_TRAIN])
        exp = tabular_explainer(model, boston[DatasetConstants.X_TRAIN], features=boston[DatasetConstants.FEATURES])
        global_explanation = exp.explain_global(boston[DatasetConstants.X_TEST])
        assert not global_explanation.is_raw
        assert not global_explanation.is_engineered
        num_engineered_feats = len(boston[DatasetConstants.FEATURES])
        feature_map = np.eye(num_engineered_feats - 1, num_engineered_feats)
        global_raw_explanation = global_explanation.get_raw_explanation([feature_map])
        self.validate_global_raw_explanation_regression(global_explanation, global_raw_explanation, feature_map)

    def test_get_local_raw_explanations_classification(self, iris, tabular_explainer):
        model = create_sklearn_svm_classifier(iris[DatasetConstants.X_TRAIN], iris[DatasetConstants.Y_TRAIN])
        exp = tabular_explainer(model, iris[DatasetConstants.X_TRAIN], features=iris[DatasetConstants.FEATURES],
                                classes=iris[DatasetConstants.CLASSES])
        local_explanation = exp.explain_local(iris[DatasetConstants.X_TEST][0])
        num_engineered_feats = len(iris[DatasetConstants.FEATURES])
        feature_map = np.eye(num_engineered_feats - 1, num_engineered_feats)
        local_raw_explanation = local_explanation.get_raw_explanation([feature_map])
        assert len(local_raw_explanation.local_importance_values) == len(iris[DatasetConstants.CLASSES])
        assert len(local_raw_explanation.local_importance_values[0]) == feature_map.shape[0]
        local_rank = local_raw_explanation.get_local_importance_rank()
        assert len(local_rank) == len(iris[DatasetConstants.CLASSES])
        assert len(local_rank[0]) == feature_map.shape[0]
        ranked_names = local_raw_explanation.get_ranked_local_names()
        assert len(ranked_names) == len(iris[DatasetConstants.CLASSES])
        assert len(ranked_names[0]) == feature_map.shape[0]
        ranked_values = local_raw_explanation.get_ranked_local_values()
        assert len(ranked_values) == len(iris[DatasetConstants.CLASSES])
        assert len(ranked_values[0]) == feature_map.shape[0]

    def test_get_local_raw_explanations_regression(self, boston, tabular_explainer):
        model = create_sklearn_random_forest_regressor(boston[DatasetConstants.X_TRAIN],
                                                       boston[DatasetConstants.Y_TRAIN])
        exp = tabular_explainer(model, boston[DatasetConstants.X_TRAIN], features=boston[DatasetConstants.FEATURES])
        num_engineered_feats = len(boston[DatasetConstants.FEATURES])
        feature_map = np.eye(num_engineered_feats - 1, num_engineered_feats)
        local_explanation = exp.explain_local(boston[DatasetConstants.X_TEST][0])
        local_raw_explanation = local_explanation.get_raw_explanation([feature_map])
        assert len(local_raw_explanation.local_importance_values) == feature_map.shape[0]
        local_rank = local_raw_explanation.get_local_importance_rank()
        assert len(local_rank) == feature_map.shape[0]
        ranked_names = local_raw_explanation.get_ranked_local_names()
        assert len(ranked_names) == feature_map.shape[0]
        ranked_values = local_raw_explanation.get_ranked_local_values()
        assert len(ranked_values) == feature_map.shape[0]
    def test_get_local_raw_explanations_sparse_regression(self, mimic_explainer):
        X, y = retrieve_dataset('a1a.svmlight')
        x_train, x_test, y_train, _ = train_test_split(X, y, test_size=0.2, random_state=7)
        # Fit a linear regression model
        model = create_sklearn_linear_regressor(x_train, y_train)
        explainer = mimic_explainer(model, x_train, LinearExplainableModel,
                                    explainable_model_args={'sparse_data': True})
        global_explanation = explainer.explain_global(x_test)
        assert global_explanation.method == LINEAR_METHOD
        num_engineered_feats = x_train.shape[1]
        feature_map = np.eye(5, num_engineered_feats)
        global_raw_explanation = global_explanation.get_raw_explanation([feature_map])
        self.validate_global_raw_explanation_regression(global_explanation, global_raw_explanation, feature_map)

    def test_get_local_raw_explanations_sparse_classification(self, mimic_explainer):
        x_train, x_test, y_train, _, classes, _ = create_multiclass_sparse_newsgroups_data()
        # Fit a logistic regression model
        model = create_sklearn_logistic_regressor(x_train, y_train)
        explainer = mimic_explainer(model, x_train, LinearExplainableModel,
                                    explainable_model_args={'sparse_data': True}, classes=classes)
        global_explanation = explainer.explain_global(x_test)
        assert global_explanation.method == LINEAR_METHOD
        num_engineered_feats = x_train.shape[1]
        feature_map = np.eye(5, num_engineered_feats)
        feature_names = [str(i) for i in range(feature_map.shape[0])]
        raw_names = feature_names[:feature_map.shape[0]]
        global_raw_explanation = global_explanation.get_raw_explanation([feature_map], raw_feature_names=raw_names)
        self.validate_global_raw_explanation_classification(global_explanation, global_raw_explanation, feature_map,
                                                            classes, feature_names, is_sparse=True)
    def validate_global_raw_explanation_regression(self, global_explanation, global_raw_explanation, feature_map):
        assert not global_explanation.is_raw
        assert global_explanation.is_engineered
        assert np.array(global_raw_explanation.local_importance_values).shape[-1] == feature_map.shape[0]
        assert global_raw_explanation.is_raw
        assert not global_raw_explanation.is_engineered
        assert np.array(global_raw_explanation.global_importance_values).shape[-1] == feature_map.shape[0]

    def validate_global_raw_explanation_classification(self, global_explanation, global_raw_explanation,
                                                       feature_map, classes, feature_names, is_sparse=False):
        assert not global_explanation.is_raw
        assert global_explanation.is_engineered
        assert global_raw_explanation.expected_values == global_explanation.expected_values
        assert global_raw_explanation.init_data == global_explanation.init_data
        if is_sparse:
            assert np.all(global_raw_explanation.eval_data.data == global_explanation.eval_data.data)
        else:
            assert np.all(global_raw_explanation.eval_data == global_explanation.eval_data)
        per_class_values = global_raw_explanation.get_ranked_per_class_values()
        assert len(per_class_values) == len(classes)
        assert len(per_class_values[0]) == feature_map.shape[0]
        assert len(global_raw_explanation.get_ranked_per_class_names()[0]) == feature_map.shape[0]
        feat_imps_global_local = np.array(global_raw_explanation.local_importance_values)
        assert feat_imps_global_local.shape[-1] == feature_map.shape[0]
        assert global_raw_explanation.is_raw
        assert not global_raw_explanation.is_engineered
        assert len(global_raw_explanation.get_ranked_global_values()) == feature_map.shape[0]
        assert len(global_raw_explanation.get_ranked_global_names()) == feature_map.shape[0]
        if isinstance(classes, list):
            assert global_raw_explanation.classes == classes
        else:
            assert (global_raw_explanation.classes == classes).all()
        assert global_raw_explanation.features == feature_names
        feat_imps_global = np.array(global_raw_explanation.global_importance_values)
        assert feat_imps_global.shape[-1] == feature_map.shape[0]
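The `np.eye(num_engineered_feats - 1, num_engineered_feats)` feature maps used throughout these tests encode the raw-to-engineered feature relationship: row i gives engineered feature contributions to raw feature i, so multiplying the map by engineered importances yields raw importances. A standalone sketch of that arithmetic (values are made up for illustration):

```python
import numpy as np

# A rectangular identity models a one-to-one raw->engineered mapping that
# simply drops the last engineered feature.
num_engineered_feats = 4
feature_map = np.eye(num_engineered_feats - 1, num_engineered_feats)

# Importances computed in the engineered feature space...
engineered_importances = np.array([0.1, 0.2, 0.3, 0.4])

# ...project back to the raw space: one raw importance per feature_map row.
raw_importances = feature_map.dot(engineered_importances)
```

This is why the tests assert that the raw explanation's trailing dimension equals `feature_map.shape[0]`: the number of raw features is the row count of the map.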

# File: src/algorithms/__init__.py (repo: 8sukanya8/SCD_CLEF_2019, MIT license)
#from src.algorithms.rolling_delta import delta

# File: python/testData/regexp/danglingMetacharacters.py (repo: truthiswill/intellij-community, Apache-2.0 license)
import re
re.compile('{\w+}')

# File: tests/__init__.py (repo: Datawheel/olap-client-py, MIT license)
import sys
from pathlib import Path
sys.path.insert(1, str(Path(__file__).joinpath("../..").resolve()))
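The `sys.path.insert` line above climbs two directory levels from the test package's `__init__.py` to reach the repository root. An illustration of the path arithmetic, written with `posixpath` so the result is OS-independent (the `/repo` prefix is a made-up example path):

```python
import posixpath

# Joining '../..' onto tests/__init__.py and normalizing collapses the two
# parent references, landing on the repository root.
repo_root = posixpath.normpath(posixpath.join('/repo/tests/__init__.py', '../..'))
```

The original uses `pathlib.Path(...).resolve()` instead, which additionally makes the result absolute relative to the real file location.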

# File: vega/algorithms/nas/modnas/compat/__init__.py (repo: jie311/vega, MIT license)
from . import importer
from .search_alg import ModNasAlgorithm
from .trainer_callback import ModNasTrainerCallback

# File: exercises/all-your-base/all_your_base.py (repo: haithamk/python-exercism, MIT license)
def rebase(from_base, digits, to_base):
    pass
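The `rebase` stub above is an Exercism exercise left unimplemented. One possible solution sketch (this is an illustrative implementation, not the exercise's canonical answer, and it omits the input validation the full exercise requires): fold the digits into an integer in `from_base`, then repeatedly divide by `to_base`.

```python
def rebase_sketch(from_base, digits, to_base):
    # Horner's rule: accumulate the digit sequence as a base-from_base number.
    value = 0
    for d in digits:
        value = value * from_base + d
    # Peel off digits in the target base, least significant first.
    out = []
    while value > 0:
        out.append(value % to_base)
        value //= to_base
    # Reverse to most-significant-first; an all-zero input maps to [0].
    return out[::-1] or [0]
```

For example, `rebase_sketch(2, [1, 0, 1], 10)` converts binary 101 to decimal, giving `[5]`.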

# File: tests/test_achrom.py (repo: donnyyy777/pyteomics, Apache-2.0 license)
from os import path
import pyteomics
pyteomics.__path__ = [path.abspath(path.join(path.dirname(__file__), path.pardir, 'pyteomics'))]
import doctest
from pyteomics import achrom
doctest.testmod(achrom)

# File: Belajar Python BITSCHOOL/deklarasi.py (repo: Ziyad-gif/PSJ__Ziyad, BSL-1.0 license)
i = 20
print(i)
i = 20 * 12
print(i)
i = "dua"
print(i)

# File: tests/unit_tests/test_executor_argparser.py (repo: chrisBrookes93/robotframework-slave, MIT license)
import unittest
import os
import tempfile

from rfremoterunner.executor_argparser import ExecutorArgumentParser


class TestExecutorArgumentParser(unittest.TestCase):

    def setUp(self):
        self.curr_dir = os.path.abspath(os.path.dirname(__file__))
        self.suite_dir = os.path.join(self.curr_dir, '..', 'integration_tests', 'resources')
        self.temp_cwd = os.getcwd()
        os.chdir(self.curr_dir)

    def tearDown(self):
        os.chdir(self.temp_cwd)
"""
Test that all arguments are set correctly in the ExecutorArgumentParser
"""
expected_host = '192.168.56.1'
expected_suites = self.suite_dir
expected_output_dir = '../../'
expected_output_xml = './out.xml'
expected_log_file = './l.html'
expected_report_file = './r.html'
expected_extension = 'html:tsv'
expected_include = 'ts_*'
expected_exclude = 'nts_*'
expected_test = 'a.b.c.*'
expected_suite = 'X.Y:X.Z:Y.A'.split(':')
expected_log_level = 'DEBUG'
input_args = [expected_host,
expected_suites,
'--outputdir', expected_output_dir,
'--output', expected_output_xml,
'--log', expected_log_file,
'--report', expected_report_file,
'--debug',
'--extension', expected_extension,
'--include', expected_include,
'--exclude', expected_exclude,
'--test', expected_test,
'--suite', 'X.Y:X.Z:Y.A',
'--loglevel', expected_log_level,
]
eap = ExecutorArgumentParser(input_args)
self.assertEqual(eap.host, expected_host)
self.assertEqual(eap.suites, [expected_suites])
self.assertEqual(eap.outputdir, expected_output_dir)
self.assertEqual(eap.output, expected_output_xml)
self.assertEqual(eap.log, expected_log_file)
self.assertEqual(eap.report, expected_report_file)
self.assertTrue(eap.debug)
self.assertEqual(eap.extension, expected_extension)
self.assertEqual(eap.include, expected_include)
self.assertEqual(eap.exclude, expected_exclude)
self.assertEqual(eap.test, expected_test)
self.assertEqual(eap.suite, expected_suite)
self.assertEqual(eap.loglevel, expected_log_level)
self.assertListEqual(
sorted(['loglevel', 'include', 'test', 'exclude', 'suite', 'extension']),
sorted(eap.robot_run_args.keys()))
def test_get_log_html_output_location_default(self):
"""
Test that get_log_html_output_location() returns the default location if one has not been specified
"""
input_args = ['127.0.0.1', self.suite_dir]
eap = ExecutorArgumentParser(input_args)
actual_val = eap.get_log_html_output_location()
expected_val = os.path.abspath(os.path.join(self.curr_dir, 'remote_log.html'))
self.assertEqual(expected_val, actual_val)
def test_get_log_html_output_location_outputdir_specified(self):
"""
Test that get_log_html_output_location() returns the correct path when one is specified in the config
"""
input_args = ['127.0.0.1',
self.suite_dir,
'--loglevel', 'DEBUG',
'--outputdir', '../../']
eap = ExecutorArgumentParser(input_args)
actual_val = eap.get_log_html_output_location()
expected_val = os.path.join(self.curr_dir, '..', '..', 'remote_log.html')
self.assertEqual(os.path.abspath(expected_val), os.path.abspath(actual_val))
def test_get_log_html_output_location_log_specified_relative(self):
"""
Test that get_log_html_output_location() returns the correct path when a relative one is specified in the config
"""
input_args = ['127.0.0.1',
self.suite_dir,
'--loglevel', 'DEBUG',
'--outputdir', '../../',
'--log', 'test_results/log.html']
eap = ExecutorArgumentParser(input_args)
actual_val = eap.get_log_html_output_location()
expected_val = os.path.join(self.curr_dir, '..', '..', 'test_results', 'log.html')
self.assertEqual(os.path.abspath(expected_val), os.path.abspath(actual_val))
def test_get_log_html_output_location_log_specified_absolute(self):
"""
Test that get_log_html_output_location() returns the correct path when an absolute one is specified in the
config
"""
temp_dir = tempfile.gettempdir()
expected_val = os.path.abspath(os.path.join(temp_dir, 'test_results', 'log.html'))
input_args = ['127.0.0.1',
self.suite_dir,
'--loglevel', 'DEBUG',
'--outputdir', '../../',
'--log', expected_val]
eap = ExecutorArgumentParser(input_args)
actual_val = eap.get_log_html_output_location()
self.assertEqual(expected_val, actual_val)
def test_get_report_html_output_location_default(self):
"""
Test that get_report_html_output_location() returns the default location if one has not been specified
"""
input_args = ['127.0.0.1', self.suite_dir]
eap = ExecutorArgumentParser(input_args)
actual_val = eap.get_report_html_output_location()
expected_val = os.path.join(self.curr_dir, 'remote_report.html')
self.assertEqual(expected_val, actual_val)
def test_get_report_html_output_location_outputdir_specified(self):
"""
Test that get_report_html_output_location() returns the correct path when one is specified in the config
"""
input_args = ['127.0.0.1',
self.suite_dir,
'--loglevel', 'DEBUG',
'--outputdir', '../../']
eap = ExecutorArgumentParser(input_args)
actual_val = eap.get_report_html_output_location()
expected_val = os.path.join(self.curr_dir, '..', '..', 'remote_report.html')
self.assertEqual(os.path.abspath(expected_val), os.path.abspath(actual_val))
    def test_get_report_html_output_location_report_specified_relative(self):
"""
Test that get_report_html_output_location() returns the correct path when a relative one is specified in the
config
"""
input_args = ['127.0.0.1',
self.suite_dir,
'--loglevel', 'DEBUG',
'--outputdir', '../../',
'--report', 'test_results/report.html']
eap = ExecutorArgumentParser(input_args)
actual_val = eap.get_report_html_output_location()
expected_val = os.path.join(self.curr_dir, '..', '..', 'test_results', 'report.html')
self.assertEqual(os.path.abspath(expected_val), os.path.abspath(actual_val))
    def test_get_report_html_output_location_report_specified_absolute(self):
"""
Test that get_report_html_output_location() returns the correct path when an absolute one is specified in the
config
"""
temp_dir = tempfile.gettempdir()
expected_val = os.path.abspath(os.path.join(temp_dir, 'test_results', 'report.html'))
input_args = ['127.0.0.1',
self.suite_dir,
'--loglevel', 'DEBUG',
'--outputdir', '../../',
'--report', expected_val]
eap = ExecutorArgumentParser(input_args)
actual_val = eap.get_report_html_output_location()
self.assertEqual(expected_val, actual_val)
def test_get_output_xml_output_location_default(self):
"""
Test that get_output_xml_output_location() returns the default location if one has not been specified
"""
input_args = ['127.0.0.1', self.suite_dir]
eap = ExecutorArgumentParser(input_args)
actual_val = eap.get_output_xml_output_location()
expected_val = os.path.join(self.curr_dir, 'remote_output.xml')
self.assertEqual(expected_val, actual_val)
def test_get_output_xml_output_location_outputdir_specified(self):
"""
Test that get_output_xml_output_location() returns the correct path when one is specified in the config
"""
input_args = ['127.0.0.1',
self.suite_dir,
'--loglevel', 'DEBUG',
'--outputdir', '../../']
eap = ExecutorArgumentParser(input_args)
actual_val = eap.get_output_xml_output_location()
expected_val = os.path.join(self.curr_dir, '..', '..', 'remote_output.xml')
self.assertEqual(os.path.abspath(expected_val), os.path.abspath(actual_val))
    def test_get_output_xml_output_location_output_specified_relative(self):
"""
Test that get_output_xml_output_location() returns the correct path when a relative one is specified in the
config
"""
input_args = ['127.0.0.1',
self.suite_dir,
'--loglevel', 'DEBUG',
'--outputdir', '../../',
'--output', 'test_results/output.xml']
eap = ExecutorArgumentParser(input_args)
actual_val = eap.get_output_xml_output_location()
expected_val = os.path.join(self.curr_dir, '..', '..', 'test_results', 'output.xml')
self.assertEqual(os.path.abspath(expected_val), os.path.abspath(actual_val))
    def test_get_output_xml_output_location_output_specified_absolute(self):
"""
Test that get_output_xml_output_location() returns the correct path when an absolute one is specified in the
config
"""
temp_dir = tempfile.gettempdir()
expected_val = os.path.abspath(os.path.join(temp_dir, 'test_results', 'output.xml'))
input_args = ['127.0.0.1',
self.suite_dir,
'--loglevel', 'DEBUG',
'--outputdir', '../../',
'--output', expected_val]
eap = ExecutorArgumentParser(input_args)
actual_val = eap.get_output_xml_output_location()
self.assertEqual(expected_val, actual_val)
| 46.885965 | 121 | 0.597194 | 1,196 | 10,690 | 5.026756 | 0.088629 | 0.083832 | 0.071856 | 0.050898 | 0.746673 | 0.740519 | 0.737026 | 0.73004 | 0.714737 | 0.701264 | 0 | 0.010666 | 0.289616 | 10,690 | 227 | 122 | 47.092511 | 0.781011 | 0.128906 | 0 | 0.481707 | 0 | 0 | 0.115952 | 0.007776 | 0 | 0 | 0 | 0 | 0.158537 | 1 | 0.091463 | false | 0 | 0.02439 | 0 | 0.121951 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
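The setUp()/tearDown() pair in TestExecutorArgumentParser above saves and restores the working directory so relative paths resolve the same way for every test. A minimal, self-contained sketch of that same pattern (class and test names here are illustrative, not from the file above):

```python
import os
import tempfile
import unittest


class TestChdirPattern(unittest.TestCase):
    def setUp(self):
        # Save the starting directory, then move into a fresh temp dir.
        self.saved_cwd = os.getcwd()
        self.work_dir = tempfile.mkdtemp()
        os.chdir(self.work_dir)

    def tearDown(self):
        # Restore the original working directory after every test.
        os.chdir(self.saved_cwd)

    def test_runs_in_work_dir(self):
        # getcwd() resolves symlinks, so compare against the real path.
        self.assertEqual(os.getcwd(), os.path.realpath(self.work_dir))
```

Because tearDown() always runs, a failing test body cannot leave later tests stranded in the wrong directory.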
0efad4fbbaf54dbf46854229ee772f69697d2b3a | 402 | py | Python | tests/test_extract_rss.py | NewsPipe/newscrawler | 1f3bf3e4cd98a5a174a9a48ff662db848011e403 | [
"Apache-2.0"
] | null | null | null | tests/test_extract_rss.py | NewsPipe/newscrawler | 1f3bf3e4cd98a5a174a9a48ff662db848011e403 | [
"Apache-2.0"
] | null | null | null | tests/test_extract_rss.py | NewsPipe/newscrawler | 1f3bf3e4cd98a5a174a9a48ff662db848011e403 | [
"Apache-2.0"
] | 1 | 2020-05-31T17:40:36.000Z | 2020-05-31T17:40:36.000Z | """
tests.test_extract_rss.py
~~~~~~~~~~~~~~~~~~~~
Test suite for the extract_rss.py module
"""
import pytest
from newscrawler.extract_rss import extract_rss
from newscrawler.utils import coerce_url
def test_zeit():
url = coerce_url("zeit.de")
assert isinstance(extract_rss(url), list)
def test_spiegel():
url = coerce_url("spiegel.de")
assert isinstance(extract_rss(url), list)
| 18.272727 | 47 | 0.706468 | 56 | 402 | 4.857143 | 0.410714 | 0.220588 | 0.088235 | 0.183824 | 0.257353 | 0.257353 | 0.257353 | 0 | 0 | 0 | 0 | 0 | 0.146766 | 402 | 21 | 48 | 19.142857 | 0.793003 | 0.218905 | 0 | 0.222222 | 0 | 0 | 0.055738 | 0 | 0 | 0 | 0 | 0 | 0.222222 | 1 | 0.222222 | false | 0 | 0.333333 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
1647cd86d947ad633ce60fd33d299f134ae70d42 | 420 | py | Python | inheritance/stack_of_strings.py | PetkoAndreev/Python-OOP | 2cc3094940cdf078f0ee60be938e883f843766e4 | [
"MIT"
] | 1 | 2021-05-27T07:59:17.000Z | 2021-05-27T07:59:17.000Z | inheritance/stack_of_strings.py | PetkoAndreev/Python-OOP | 2cc3094940cdf078f0ee60be938e883f843766e4 | [
"MIT"
] | null | null | null | inheritance/stack_of_strings.py | PetkoAndreev/Python-OOP | 2cc3094940cdf078f0ee60be938e883f843766e4 | [
"MIT"
] | null | null | null | class Stack:
def __init__(self):
self.data = []
def push(self, item):
self.data.append(item)
def pop(self):
return self.data.pop()
def peek(self):
return self.data[-1]
def is_empty(self):
return len(self.data) == 0
def __str__(self):
return f"[{', '.join(reversed(self.data))}]"
ss = Stack()
for x in range(1, 10):
    ss.push(str(x))
print(ss) | 18.26087 | 52 | 0.55 | 61 | 420 | 3.639344 | 0.459016 | 0.216216 | 0.126126 | 0.162162 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016556 | 0.280952 | 420 | 23 | 53 | 18.26087 | 0.718543 | 0 | 0 | 0 | 0 | 0 | 0.08076 | 0.068884 | 0 | 0 | 0 | 0 | 0 | 1 | 0.375 | false | 0 | 0 | 0.25 | 0.6875 | 0.0625 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
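The Stack above is a thin wrapper over a Python list with LIFO semantics. A short usage sketch (the class is re-declared so the snippet stands alone; the sample items are illustrative):

```python
class Stack:
    """String stack backed by a list; mirrors the class above."""

    def __init__(self):
        self.data = []

    def push(self, item):
        self.data.append(item)

    def pop(self):
        return self.data.pop()

    def peek(self):
        return self.data[-1]

    def is_empty(self):
        return len(self.data) == 0

    def __str__(self):
        # Render top-of-stack first, matching the original __str__.
        return f"[{', '.join(reversed(self.data))}]"


s = Stack()
for item in ("a", "b", "c"):
    s.push(item)

print(s.peek())      # "c" -- top of the stack, not removed
print(s.pop())       # "c" -- removed and returned
print(s)             # [b, a] -- remaining items, top first
print(s.is_empty())  # False
```

Note that pop() on an empty Stack raises IndexError, since it delegates directly to list.pop().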
166af2ab036137748c3cf028b5ccd6dff13e2480 | 3,233 | py | Python | app/pipeline/__init__.py | farbodab/flatteningthecurve | 692fd9c8d78355e1208ff85a2cd1038da11c392f | [
"MIT"
] | 1 | 2020-03-24T23:46:29.000Z | 2020-03-24T23:46:29.000Z | app/pipeline/__init__.py | farbodab/flatteningthecurve | 692fd9c8d78355e1208ff85a2cd1038da11c392f | [
"MIT"
] | 13 | 2021-02-08T20:51:14.000Z | 2022-03-12T00:43:30.000Z | app/pipeline/__init__.py | farbodab/flatteningthecurve | 692fd9c8d78355e1208ff85a2cd1038da11c392f | [
"MIT"
] | 3 | 2020-06-09T20:24:29.000Z | 2020-06-09T20:26:16.000Z | from flask import Blueprint
import click
bp = Blueprint('pipeline', __name__,cli_group='pipeline')
from app.data_in import routes as data_in
from app.data_process import routes as data_process
from app.data_transform import routes as data_transform
from app.data_export import routes as data_export
@bp.cli.command('gov')
@click.pass_context
def gov_ontario(ctx):
ctx.forward(data_in.get_public_ontario_gov_conposcovidloc)
ctx.forward(data_in.get_public_ontario_gov_covidtesting)
ctx.forward(data_in.get_public_ontario_gov_daily_change_in_cases_by_phu)
ctx.forward(data_in.get_public_ices_vaccination)
ctx.forward(data_in.get_public_ices_percent_positivity)
ctx.forward(data_process.process_public_ontario_gov_conposcovidloc)
ctx.forward(data_process.process_public_ontario_gov_covidtesting)
ctx.forward(data_process.process_public_ontario_gov_daily_change_in_cases_by_phu)
ctx.forward(data_process.process_restricted_moh_iphis)
ctx.forward(data_process.process_public_ices_percent_positivity)
ctx.forward(data_process.process_public_ices_vaccination)
ctx.forward(data_transform.transform_public_cases_ontario_confirmed_positive_cases)
ctx.forward(data_transform.transform_public_cases_ontario_covid_summary)
ctx.forward(data_transform.transform_public_cases_ontario_cases_seven_day_rolling_average)
ctx.forward(data_transform.transform_public_capacity_ontario_testing_24_hours)
ctx.forward(data_transform.transform_public_ices_percent_positivity)
ctx.forward(data_transform.transform_public_vaccination_phu)
ctx.forward(data_transform.transform_public_summary_ontario)
ctx.forward(data_transform.transform_confidential_moh_iphis)
ctx.forward(data_export.export_public_summary_ontario)
ctx.forward(data_export.export_public_cases_ontario_covid_summary)
ctx.forward(data_export.export_public_capacity_ontario_testing_24_hours)
ctx.forward(data_export.export_confidential_moh_iphis)
ctx.forward(data_export.export_public_ices_positivity)
@bp.cli.command('vaccine')
@click.pass_context
def vaccine(ctx):
ctx.forward(data_in.get_public_ontario_gov_vaccination)
ctx.forward(data_process.process_public_ontario_gov_vaccination)
ctx.forward(data_transform.transform_public_vaccination_ontario)
ctx.forward(data_export.export_public_vaccination_ontario)
@bp.cli.command('ccso')
@click.pass_context
def ccso(ctx):
ctx.forward(data_process.process_restricted_ccso_ccis)
ctx.forward(data_transform.transform_public_capacity_ontario_phu_icu_capacity)
ctx.forward(data_transform.transform_public_capacity_ontario_phu_icu_capacity_timeseries)
ctx.forward(data_export.export_public_capacity_ontario_phu_icu_capacity)
@bp.cli.command('rt')
@click.pass_context
def rt(ctx):
ctx.forward(data_in.get_public_ontario_gov_daily_change_in_cases_by_phu)
ctx.forward(data_process.process_public_ontario_gov_daily_change_in_cases_by_phu)
ctx.forward(data_transform.transform_public_cases_ontario_cases_seven_day_rolling_average)
ctx.forward(data_transform.transform_public_rt_canada_bettencourt_and_ribeiro_approach)
ctx.forward(data_export.export_public_rt_canada_bettencourt_and_ribeiro_approach)
| 46.855072 | 94 | 0.858645 | 460 | 3,233 | 5.513043 | 0.143478 | 0.145899 | 0.204259 | 0.117902 | 0.812303 | 0.797319 | 0.737382 | 0.552445 | 0.415221 | 0.306782 | 0 | 0.00134 | 0.0764 | 3,233 | 68 | 95 | 47.544118 | 0.847957 | 0 | 0 | 0.178571 | 0 | 0 | 0.009898 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.071429 | false | 0.071429 | 0.107143 | 0 | 0.178571 | 0.035714 | 0 | 0 | 0 | null | 0 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 5 |
166e3f3eb1bcb0ba0252a046cc4a136ff716be23 | 178 | py | Python | ingestion/Blocktrace/Networks/Eosio.py | mharrisb1/blocktrace | 3c54286d4f28c3b0610f577dfdbbf643953475fa | [
"MIT"
] | null | null | null | ingestion/Blocktrace/Networks/Eosio.py | mharrisb1/blocktrace | 3c54286d4f28c3b0610f577dfdbbf643953475fa | [
"MIT"
] | null | null | null | ingestion/Blocktrace/Networks/Eosio.py | mharrisb1/blocktrace | 3c54286d4f28c3b0610f577dfdbbf643953475fa | [
"MIT"
] | null | null | null | from abc import ABC, abstractmethod
from typing import Generator
class Eosio(ABC):
@abstractmethod
def blocktrace(self) -> Generator:
raise NotImplementedError
| 19.777778 | 38 | 0.741573 | 19 | 178 | 6.947368 | 0.684211 | 0.257576 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.202247 | 178 | 8 | 39 | 22.25 | 0.929577 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
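Eosio above declares blocktrace() as an abstract generator method, so a concrete network class must override it before it can be instantiated. A minimal sketch (DummyEosio and its yielded values are hypothetical, not part of the original package):

```python
from abc import ABC, abstractmethod
from typing import Generator


class Eosio(ABC):
    @abstractmethod
    def blocktrace(self) -> Generator:
        raise NotImplementedError


class DummyEosio(Eosio):
    """Hypothetical subclass yielding canned block numbers."""

    def blocktrace(self) -> Generator:
        yield from (1, 2, 3)


node = DummyEosio()
print(list(node.blocktrace()))  # [1, 2, 3]

# Instantiating the ABC directly fails at construction time.
try:
    Eosio()
except TypeError as exc:
    print(type(exc).__name__)  # TypeError
```

The ABC machinery turns a missing override into an immediate TypeError rather than a latent NotImplementedError at call time.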
167f03893b349059c5b1712f17dff26ed7c14161 | 533 | py | Python | tools/telemetry/telemetry/core/platform/profiling_controller.py | kjthegod/chromium | cf940f7f418436b77e15b1ea23e6fa100ca1c91a | [
"BSD-3-Clause-No-Nuclear-License-2014",
"BSD-3-Clause"
] | 925 | 2015-11-06T03:04:46.000Z | 2017-09-16T19:08:43.000Z | tools/telemetry/telemetry/core/platform/profiling_controller.py | kjthegod/chromium | cf940f7f418436b77e15b1ea23e6fa100ca1c91a | [
"BSD-3-Clause-No-Nuclear-License-2014",
"BSD-3-Clause"
] | 29 | 2015-11-09T17:37:28.000Z | 2017-08-16T17:50:11.000Z | tools/telemetry/telemetry/core/platform/profiling_controller.py | kjthegod/chromium | cf940f7f418436b77e15b1ea23e6fa100ca1c91a | [
"BSD-3-Clause-No-Nuclear-License-2014",
"BSD-3-Clause"
] | 51 | 2015-11-08T07:06:38.000Z | 2017-08-21T07:27:19.000Z | # Copyright 2014 The Chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
class ProfilingController(object):
def __init__(self, profiling_controller_backend):
self._profiling_controller_backend = profiling_controller_backend
def Start(self, profiler_name, base_output_file):
self._profiling_controller_backend.Start(
profiler_name, base_output_file)
def Stop(self):
return self._profiling_controller_backend.Stop()
| 33.3125 | 72 | 0.789869 | 71 | 533 | 5.605634 | 0.591549 | 0.238693 | 0.326633 | 0.301508 | 0.130653 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008811 | 0.148218 | 533 | 15 | 73 | 35.533333 | 0.867841 | 0.290807 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.375 | false | 0 | 0 | 0.125 | 0.625 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 5 |
168b6137b47131ce785a387367d4667a7ea3fe25 | 1,236 | py | Python | experiments/utils/ascii.py | tomdbar/naqs-for-quantum-chemistry | 2a0cba7adb1d07bd83a37af0fa822e1646fa987b | [
"MIT"
] | 6 | 2021-09-28T06:06:40.000Z | 2021-09-28T14:03:25.000Z | experiments/utils/ascii.py | tomdbar/naqs-for-quantum-chemistry | 2a0cba7adb1d07bd83a37af0fa822e1646fa987b | [
"MIT"
] | null | null | null | experiments/utils/ascii.py | tomdbar/naqs-for-quantum-chemistry | 2a0cba7adb1d07bd83a37af0fa822e1646fa987b | [
"MIT"
] | 1 | 2021-09-28T12:35:49.000Z | 2021-09-28T12:35:49.000Z | def success():
print('''
$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$' `$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$
$$$$$$$$$$$$$$$$$$$$$$$$$$$$' `$$$$$$$$$$$$$$$$$$$$$$$$$$$$
$$$'`$$$$$$$$$$$$$'`$$$$$$! !$$$$$$'`$$$$$$$$$$$$$'`$$$
$$$$ $$$$$$$$$$$ $$$$$$$ $$$$$$$ $$$$$$$$$$$ $$$$
$$$$. `$' \' \$` $$$$$$$! !$$$$$$$ '$/ `/ `$' .$$$$
$$$$$. !\ i i .$$$$$$$$ $$$$$$$$. i i /! .$$$$$
$$$$$$ `--`--.$$$$$$$$$ $$$$$$$$$.--'--' $$$$$$
$$$$$$L `$$$$$^^$$ $$^^$$$$$' J$$$$$$
$$$$$$$. .' ""~ $$$ $. .$ $$$ ~"" `. .$$$$$$$
$$$$$$$$. ; .e$$$$$! $$. .$$ !$$$$$e, ; .$$$$$$$$
$$$$$$$$$ `.$$$$$$$$$$$$ $$$. .$$$ $$$$$$$$$$$$.' $$$$$$$$$
$$$$$$$$ .$$$$$$$$$$$$$! $$`$$$$$$$$'$$ !$$$$$$$$$$$$$. $$$$$$$$
$JT&yd$ $$$$$$$$$$$$$$$$. $ $$ $ .$$$$$$$$$$$$$$$$ $by&TL$
$ $$ $
$. $$ .$
`$ $'
`$$$$$$$$'
''') | 61.8 | 79 | 0.025081 | 15 | 1,236 | 2.066667 | 0.733333 | 0.193548 | 0.193548 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.457929 | 1,236 | 20 | 80 | 61.8 | 0.046269 | 0 | 0 | 0 | 0 | 0 | 0.973323 | 0.140663 | 0 | 0 | 0 | 0 | 0 | 1 | 0.05 | true | 0 | 0 | 0 | 0.05 | 0.05 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
169d9823276556d62899bb725e523c733134f472 | 53 | py | Python | Codes examples in the book/chs16-20_code/mathproj/comp/numeric/n2.py | almazkun/The-Quick-Python-Book-Third-Edition | d405e45b757d53e851843aed30118949b42c2ee6 | [
"Apache-2.0"
] | 2 | 2019-05-04T12:33:09.000Z | 2020-02-28T20:06:27.000Z | Codes examples in the book/chs16-20_code/mathproj/comp/numeric/n2.py | almazkun/The-Quick-Python-Book-Third-Edition | d405e45b757d53e851843aed30118949b42c2ee6 | [
"Apache-2.0"
] | null | null | null | Codes examples in the book/chs16-20_code/mathproj/comp/numeric/n2.py | almazkun/The-Quick-Python-Book-Third-Edition | d405e45b757d53e851843aed30118949b42c2ee6 | [
"Apache-2.0"
] | 1 | 2020-02-28T19:33:28.000Z | 2020-02-28T19:33:28.000Z | def h():
return "Called function h in module n2"
| 17.666667 | 43 | 0.660377 | 9 | 53 | 3.888889 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025 | 0.245283 | 53 | 2 | 44 | 26.5 | 0.85 | 0 | 0 | 0 | 0 | 0 | 0.566038 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 5 |
16a9a13f62754fc9003f456e4a924ba3a18db725 | 66 | py | Python | tests/models/test_queries.py | sanjogpandasp/redash | 87e0cd9507658539d33bf985a9960a628a177e39 | [
"BSD-2-Clause"
] | 1 | 2018-02-10T07:40:50.000Z | 2018-02-10T07:40:50.000Z | tests/models/test_queries.py | connect2nelson/redash | 87e0cd9507658539d33bf985a9960a628a177e39 | [
"BSD-2-Clause"
] | null | null | null | tests/models/test_queries.py | connect2nelson/redash | 87e0cd9507658539d33bf985a9960a628a177e39 | [
"BSD-2-Clause"
] | null | null | null | from tests import BaseTestCase
# Add tests for change tracking
| 11 | 31 | 0.787879 | 9 | 66 | 5.777778 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.19697 | 66 | 5 | 32 | 13.2 | 0.981132 | 0.439394 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
bc6d90f4cc7690dd2f0a401df9233d1e755e4108 | 603 | py | Python | data/train/python/58cea8695758bcd039e43c411538b7ca5d394e08signals.py | harshp8l/deep-learning-lang-detection | 2a54293181c1c2b1a2b840ddee4d4d80177efb33 | [
"MIT"
] | 84 | 2017-10-25T15:49:21.000Z | 2021-11-28T21:25:54.000Z | data/train/python/58cea8695758bcd039e43c411538b7ca5d394e08signals.py | vassalos/deep-learning-lang-detection | cbb00b3e81bed3a64553f9c6aa6138b2511e544e | [
"MIT"
] | 5 | 2018-03-29T11:50:46.000Z | 2021-04-26T13:33:18.000Z | data/train/python/58cea8695758bcd039e43c411538b7ca5d394e08signals.py | vassalos/deep-learning-lang-detection | cbb00b3e81bed3a64553f9c6aa6138b2511e544e | [
"MIT"
] | 24 | 2017-11-22T08:31:00.000Z | 2022-03-27T01:22:31.000Z | import django.dispatch
added_member = django.dispatch.Signal(providing_args=["membership"])
invited_user = django.dispatch.Signal(providing_args=["membership"])
promoted_member = django.dispatch.Signal(providing_args=["membership"])
demoted_member = django.dispatch.Signal(providing_args=["membership"])
accepted_membership = django.dispatch.Signal(providing_args=["membership"])
rejected_membership = django.dispatch.Signal(providing_args=["membership"])
resent_invite = django.dispatch.Signal(providing_args=["membership"])
removed_membership = django.dispatch.Signal(providing_args=["team", "user"]) | 60.3 | 76 | 0.81592 | 68 | 603 | 7 | 0.279412 | 0.264706 | 0.336134 | 0.487395 | 0.802521 | 0.802521 | 0.531513 | 0 | 0 | 0 | 0 | 0 | 0.044776 | 603 | 10 | 76 | 60.3 | 0.826389 | 0 | 0 | 0 | 0 | 0 | 0.129139 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.111111 | 0 | 0.111111 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
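Each Signal object above implements an observer pattern: receivers register via connect() and are invoked by send(). Since the snippet depends on Django, here is a dependency-free sketch of the same dispatch idea (this toy class is illustrative and is not Django's implementation):

```python
class Signal:
    """Toy stand-in for django.dispatch.Signal: connect/send only."""

    def __init__(self):
        self._receivers = []

    def connect(self, receiver):
        self._receivers.append(receiver)

    def send(self, sender, **kwargs):
        # Returns (receiver, response) pairs, echoing the shape of
        # Django's Signal.send() return value.
        return [(r, r(sender=sender, **kwargs)) for r in self._receivers]


added_member = Signal()


def log_membership(sender, membership, **kwargs):
    return f"{sender}: added {membership}"


added_member.connect(log_membership)
responses = added_member.send(sender="Team", membership="alice")
print(responses[0][1])  # Team: added alice
```

Receivers accept **kwargs so that a signal can grow new arguments without breaking existing subscribers, the same convention Django's receivers follow.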
bca6ee2d166dd2619d6d661a6c7a906bf2ec59b8 | 32 | py | Python | tfbspline/util/__init__.py | psomers3/tf-bspline | 1145e385ec08bf84fd5e953bbd641471f7e1292a | [
"MIT"
] | 4 | 2020-01-03T08:08:21.000Z | 2021-01-08T21:37:45.000Z | tfbspline/util/__init__.py | psomers3/tf-bspline | 1145e385ec08bf84fd5e953bbd641471f7e1292a | [
"MIT"
] | null | null | null | tfbspline/util/__init__.py | psomers3/tf-bspline | 1145e385ec08bf84fd5e953bbd641471f7e1292a | [
"MIT"
] | 1 | 2021-04-05T15:15:58.000Z | 2021-04-05T15:15:58.000Z | from .bspline import interpolate | 32 | 32 | 0.875 | 4 | 32 | 7 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09375 | 32 | 1 | 32 | 32 | 0.965517 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
bcc08fde184c1fa7e22cac0e6fc6d079d7a093ce | 163 | py | Python | isu/models/actions/__init__.py | pecusys/isutils | 39fa92dc391cc430dcf1864f4c2f0212f0db58b6 | [
"MIT"
] | null | null | null | isu/models/actions/__init__.py | pecusys/isutils | 39fa92dc391cc430dcf1864f4c2f0212f0db58b6 | [
"MIT"
] | null | null | null | isu/models/actions/__init__.py | pecusys/isutils | 39fa92dc391cc430dcf1864f4c2f0212f0db58b6 | [
"MIT"
] | null | null | null | """
"""
from .jump import *
from .animation import *
from .audio import *
from .box import *
from .cursor import *
from .text import *
from .highlight import *
| 12.538462 | 24 | 0.674847 | 21 | 163 | 5.238095 | 0.428571 | 0.545455 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.202454 | 163 | 12 | 25 | 13.583333 | 0.846154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
bcd8b846cb5c11795bcdbcd07476e33557d8c00d | 4,216 | py | Python | z2/part3/updated_part2_batch/jm/parser_errors_2/291538679.py | kozakusek/ipp-2020-testy | 09aa008fa53d159672cc7cbf969a6b237e15a7b8 | [
"MIT"
] | 1 | 2020-04-16T12:13:47.000Z | 2020-04-16T12:13:47.000Z | z2/part3/updated_part2_batch/jm/parser_errors_2/291538679.py | kozakusek/ipp-2020-testy | 09aa008fa53d159672cc7cbf969a6b237e15a7b8 | [
"MIT"
] | 18 | 2020-03-06T17:50:15.000Z | 2020-05-19T14:58:30.000Z | z2/part3/updated_part2_batch/jm/parser_errors_2/291538679.py | kozakusek/ipp-2020-testy | 09aa008fa53d159672cc7cbf969a6b237e15a7b8 | [
"MIT"
] | 18 | 2020-03-06T17:45:13.000Z | 2020-06-09T19:18:31.000Z | from part1 import (
gamma_board,
gamma_busy_fields,
gamma_delete,
gamma_free_fields,
gamma_golden_move,
gamma_golden_possible,
gamma_move,
gamma_new,
)
"""
scenario: test_random_actions
uuid: 291538679
"""
"""
random actions, total chaos
"""
board = gamma_new(5, 5, 4, 2)
assert board is not None
assert gamma_move(board, 1, 1, 0) == 1
assert gamma_golden_move(board, 1, 2, 0) == 0
assert gamma_move(board, 2, 4, 2) == 1
assert gamma_move(board, 3, 2, 1) == 1
assert gamma_move(board, 4, 2, 2) == 1
assert gamma_free_fields(board, 4) == 21
assert gamma_golden_possible(board, 4) == 1
assert gamma_move(board, 1, 0, 0) == 1
assert gamma_free_fields(board, 1) == 20
assert gamma_golden_possible(board, 1) == 1
assert gamma_move(board, 2, 4, 3) == 1
assert gamma_move(board, 3, 0, 4) == 1
assert gamma_move(board, 3, 3, 2) == 0
assert gamma_move(board, 4, 1, 3) == 1
assert gamma_move(board, 1, 2, 0) == 1
assert gamma_move(board, 2, 0, 3) == 1
assert gamma_golden_move(board, 2, 0, 1) == 0
assert gamma_move(board, 3, 3, 2) == 0
assert gamma_move(board, 3, 1, 3) == 0
assert gamma_move(board, 4, 2, 3) == 1
assert gamma_move(board, 4, 4, 3) == 0
assert gamma_move(board, 1, 4, 3) == 0
assert gamma_move(board, 3, 4, 1) == 0
assert gamma_move(board, 3, 4, 3) == 0
assert gamma_move(board, 4, 2, 3) == 0
assert gamma_golden_move(board, 4, 0, 2) == 0
assert gamma_move(board, 1, 4, 0) == 1
assert gamma_move(board, 2, 2, 3) == 0
assert gamma_move(board, 3, 1, 4) == 1
assert gamma_move(board, 4, 1, 0) == 0
assert gamma_move(board, 1, 0, 4) == 0
assert gamma_move(board, 2, 1, 3) == 0
assert gamma_move(board, 2, 0, 3) == 0
board163068380 = gamma_board(board)
assert board163068380 is not None
assert board163068380 == ("33...\n" "244.2\n" "..4.2\n" "..3..\n" "111.1\n")
del board163068380
board163068380 = None
assert gamma_move(board, 3, 4, 2) == 0
assert gamma_move(board, 3, 1, 3) == 0
assert gamma_move(board, 4, 1, 3) == 0
assert gamma_move(board, 1, 2, 2) == 0
assert gamma_move(board, 2, 1, 3) == 0
assert gamma_move(board, 2, 2, 4) == 0
assert gamma_golden_possible(board, 2) == 1
assert gamma_move(board, 3, 1, 1) == 1
assert gamma_move(board, 3, 2, 1) == 0
assert gamma_busy_fields(board, 3) == 4
assert gamma_move(board, 4, 1, 4) == 0
assert gamma_move(board, 4, 3, 2) == 1
assert gamma_move(board, 1, 4, 4) == 0
assert gamma_move(board, 2, 4, 2) == 0
board790976973 = gamma_board(board)
assert board790976973 is not None
assert board790976973 == ("33...\n" "244.2\n" "..442\n" ".33..\n" "111.1\n")
del board790976973
board790976973 = None
assert gamma_move(board, 3, 0, 3) == 0
assert gamma_move(board, 3, 3, 3) == 0
assert gamma_move(board, 4, 0, 4) == 0
assert gamma_busy_fields(board, 4) == 4
assert gamma_move(board, 1, 1, 3) == 0
board960175527 = gamma_board(board)
assert board960175527 is not None
assert board960175527 == ("33...\n" "244.2\n" "..442\n" ".33..\n" "111.1\n")
del board960175527
board960175527 = None
assert gamma_move(board, 2, 4, 2) == 0
assert gamma_move(board, 2, 4, 4) == 1
assert gamma_move(board, 3, 3, 3) == 0
assert gamma_move(board, 3, 3, 4) == 0
assert gamma_golden_possible(board, 3) == 1
assert gamma_move(board, 4, 1, 4) == 0
assert gamma_move(board, 4, 1, 4) == 0
assert gamma_move(board, 1, 0, 4) == 0
assert gamma_move(board, 2, 2, 0) == 0
assert gamma_move(board, 3, 3, 3) == 0
assert gamma_move(board, 3, 1, 0) == 0
assert gamma_move(board, 4, 2, 3) == 0
assert gamma_golden_possible(board, 4) == 1
assert gamma_move(board, 1, 4, 0) == 0
assert gamma_move(board, 2, 3, 1) == 0
assert gamma_move(board, 2, 2, 1) == 0
assert gamma_free_fields(board, 2) == 4
assert gamma_move(board, 3, 0, 2) == 0
assert gamma_move(board, 3, 1, 1) == 0
assert gamma_move(board, 4, 1, 3) == 0
assert gamma_move(board, 4, 4, 1) == 1
assert gamma_move(board, 1, 2, 0) == 0
assert gamma_busy_fields(board, 1) == 4
assert gamma_move(board, 2, 1, 3) == 0
assert gamma_move(board, 2, 0, 4) == 0
assert gamma_free_fields(board, 3) == 4
assert gamma_move(board, 4, 1, 2) == 1
assert gamma_move(board, 1, 0, 3) == 0
assert gamma_move(board, 1, 1, 4) == 0
assert gamma_move(board, 2, 4, 3) == 0
assert gamma_move(board, 2, 4, 4) == 0
gamma_delete(board)
| 32.430769 | 76 | 0.670066 | 785 | 4,216 | 3.449682 | 0.053503 | 0.337149 | 0.376662 | 0.502216 | 0.789513 | 0.763294 | 0.627031 | 0.464549 | 0.345273 | 0.344165 | 0 | 0.142776 | 0.166034 | 4,216 | 129 | 77 | 32.682171 | 0.627418 | 0 | 0 | 0.207207 | 0 | 0 | 0.025436 | 0 | 0 | 0 | 0 | 0 | 0.810811 | 1 | 0 | false | 0 | 0.009009 | 0 | 0.009009 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
4c303e7df93164412fa5de58891614ac0744fbfc | 42,803 | py | Python | psr/admin.py | dennereed/paleocore | d6da6c39cde96050ee4b9e7213ec1200530cbeee | [
"MIT"
] | 1 | 2021-02-05T19:50:13.000Z | 2021-02-05T19:50:13.000Z | psr/admin.py | dennereed/paleocore | d6da6c39cde96050ee4b9e7213ec1200530cbeee | [
"MIT"
] | 59 | 2020-06-17T22:21:51.000Z | 2022-02-10T05:00:01.000Z | psr/admin.py | dennereed/paleocore | d6da6c39cde96050ee4b9e7213ec1200530cbeee | [
"MIT"
] | 2 | 2020-07-01T14:11:09.000Z | 2020-08-10T17:27:26.000Z | from django.contrib.gis import admin
from django.contrib.admin.widgets import AdminFileWidget
from django.urls import reverse, path
from django.utils.html import format_html
#from imagekit.admin import AdminThumbnail
from import_export import resources
import unicodecsv
from .models import * # import database models from models.py
import projects.admin
from django.forms import TextInput, Textarea # import custom form widgets
from mapwidgets.widgets import GooglePointFieldWidget
from .views import *
from .utilities import *
psrformfield = {
models.CharField: {'widget': TextInput(attrs={'size': '50'})},
models.TextField: {'widget': Textarea(attrs={'rows': 5, 'cols': 75})},
models.PointField: {"widget": GooglePointFieldWidget}
}
default_read_only_fields = ('id', 'geom', 'point_x', 'point_y', 'easting', 'northing', 'date_last_modified',
'name', 'date_created', 'last_import', 'collecting_method',)
default_occurrence_filter = ['geological_context', 'collector', 'finder',]
lithic_fields = ('dataclass', 'type1', 'type2', 'technique', 'form', 'raw_material', 'raw_material1',
'coretype', 'biftype', 'cortex', 'retedge', 'edgedamage', 'alteration', 'scarmorph',
'length_mm', 'width_mm', 'thick_mm', 'weight',
'platsurf', 'extplat', 'lip', 'pointimpact', 'platwidth', 'platthick', 'epa',
'scarlength', 'tqwidth', 'tqthick', 'midwidth', 'midthick', 'tipwidth','tipthick',
'lentowid', 'lentothick', 'roew1', 'roet1', 'roew3', 'roet3')
arch_export_fields = ['occurrence_ptr_id', 'archaeology_type',
'length_mm', 'width_mm', 'thick_mm', 'weight',
'archaeology_remarks', 'archaeology_notes']
bio_export_fields = ['occurrence_ptr_id', 'biology_type',
'sex', 'life_stage', 'size_class',
'taxon', 'verbatim_taxon', 'identification_qualifier', 'verbatim_identification_qualifier',
'taxonomy_remarks', 'type_status', 'fauna_notes']
geo_export_fields = ['occurrence_ptr_id', 'geology_type',
'dip', 'strike', 'color', 'texture']
agg_export_fields = ['occurrence_ptr_id', 'counts',
'screen_size', 'burning', 'bone', 'microfauna', 'molluscs', 'pebbles',
'smallplatforms', 'smalldebris',
'tinyplatforms', 'tinydebris',
'bull_find_remarks']
# Based on a pattern shared by user kazukiyo923 on the Django Forum
class ImagesInline(admin.TabularInline):
model = Image
extra = 1
readonly_fields = ("id", "image", "image_preview",)
fields = ("id", "image_preview", "image", "description", )
formfield_overrides = { models.TextField: {'widget': Textarea(attrs={'rows': 3, 'cols': 25})} }
class FilesInline(admin.TabularInline):
model = File
extra = 0
readonly_fields = ("id",)
# Define Occurrence resource class for django import-export
class OccurrenceResource(resources.ModelResource):
class Meta:
model = Occurrence
class OccurrenceAdmin(projects.admin.PaleoCoreOccurrenceAdmin):
resource_class = OccurrenceResource
change_list_template = 'admin/psr/psr_change_list.html'
readonly_fields = default_read_only_fields + (
'date_recorded', 'field_number', 'basis_of_record',
'found_by', 'collector', 'finder',
'field_id', 'item_type', 'find_type', 'name')
list_filter = ['item_type'] + default_occurrence_filter
search_fields = ['id', 'item_type', 'item_description', 'barcode', 'field_id']
list_per_page = 500
list_display = ('field_id', 'item_type', 'find_type', 'geological_context')
fieldsets = (
('Record Details', {
'fields': [('field_id', 'item_type', 'find_type', 'name'),
('item_description'),
('basis_of_record',),
('remarks',),
('date_recorded', 'date_created', 'date_last_modified')]
}),
('Find Details', {
'fields': [('collecting_method',),
('collector', 'finder', 'found_by'),
('item_count', 'field_number',),
]
}),
('Location', {
'fields': [('geom',), ('point')]
}),
('Problems', {
'fields': [('problem', 'problem_comment'),
],
'classes': ['collapse']
}),
)
inlines = [ImagesInline]
save_as = True
formfield_overrides = psrformfield
actions = ['export_simple_csv', 'subtype_arch', 'subtype_bio', 'subtype_geo', 'subtype_agg']
def export_simple_csv(self, request, queryset):
fields_to_export = ['id', 'field_id', 'find_type', 'item_description', 'item_type', 'item_count',
'finder', 'collector',
'basis_of_record', 'collecting_method', 'collection_remarks',
'date_collected', 'date_created', 'date_last_modified',
'problem', 'problem_comment', 'remarks', 'georeference_remarks',
'point', 'geom',]
response = HttpResponse(content_type='text/csv') # declare the response type
response['Content-Disposition'] = 'attachment; filename="PSR_Survey-Occurrences.csv"' # declare the file name
writer = unicodecsv.writer(response) # open a .csv writer
writer.writerow(fields_to_export + ['longitude', 'latitude', 'geological_context', 'found_by', 'recorded_by']) # write column headers
for o in queryset.order_by('field_id', 'geological_context_id'):
try:
row_list = [o.__dict__.get(k) for k in fields_to_export]
row_list.append(o.point_x())
row_list.append(o.point_y())
gc = GeologicalContext.objects.get(id=o.geological_context_id)
row_list.append(gc.name)
if o.found_by_id is not None:
fb = Person.objects.get(id=o.found_by_id)
row_list.append(fb.__str__())
if o.recorded_by_id is not None:
rb = Person.objects.get(id=o.recorded_by_id)
row_list.append(rb.__str__())
writer.writerow(row_list)
            except Exception:  # writerow needs an iterable; wrap the bare id
                writer.writerow([o.id])
return response
export_simple_csv.short_description = "Export simple report to csv"
def subtype_arch(self, request, queryset):
for q in queryset:
q.item_type = "Archaeological"
q.save()
occurrence2archaeology(q, survey=True)
subtype_arch.short_description = "Subtype selected as archaeological"
def subtype_bio(self, request, queryset):
for q in queryset:
q.item_type = "Biological"
q.save()
occurrence2biology(q, survey=True)
subtype_bio.short_description = "Subtype selected as biological"
def subtype_geo(self, request, queryset):
for q in queryset:
q.item_type = "Geological"
q.save()
occurrence2geo(q, survey=True)
subtype_geo.short_description = "Subtype selected as geological"
def subtype_agg(self, request, queryset):
for q in queryset:
q.item_type = "Aggregate"
q.save()
occurrence2aggr(q, survey=True)
subtype_agg.short_description = "Subtype selected as aggregate/bulk find"
def get_urls(self):
tool_item_urls = [
            path('import_data/', ImportShapefile.as_view())  # name comes in via the .views star import
]
return tool_item_urls + super(OccurrenceAdmin, self).get_urls()
class ExcavationOccurrenceResource(resources.ModelResource):
class Meta:
model = ExcavationOccurrence
class ExcavationOccurrenceAdmin(projects.admin.PaleoCoreOccurrenceAdmin):
resource_class = ExcavationOccurrenceResource
change_list_template = 'admin/psr/psr_change_list.html'
readonly_fields = ('id', 'point', 'date_collected', 'field_id',
'unit', 'type', 'cat_number', 'date_created',
'date_last_modified',
'excavator', 'geological_context', 'prism')
list_display = ('type', 'geological_context', 'cat_number')
list_filter = ['type', 'geological_context']
fieldsets = (
('Record Details', {
'fields': [('field_id', 'type',),
('cat_number', 'date_collected',),
('date_created', 'date_last_modified')]
}),
('Find Details', {
'fields': [('level', 'prism'),
('excavator'),
]
}),
('Location', {
'fields': [('point'), ('geological_context', 'unit')]
})
)
save_as = True
formfield_overrides = psrformfield
actions = ['export_simple_csv', 'subtype_arch', 'subtype_bio', 'subtype_geo', 'subtype_agg']
def export_simple_csv(self, request, queryset):
fields_to_export = ['id', 'field_id', 'cat_number', 'type', 'item_type',
'prism', 'point', 'unit', 'level', 'excavator',
'date_collected', 'date_created', 'date_last_modified']
response = HttpResponse(content_type='text/csv') # declare the response type
response['Content-Disposition'] = 'attachment; filename="PSR_Excavated-Occurrences.csv"' # declare the file name
writer = unicodecsv.writer(response) # open a .csv writer
writer.writerow(fields_to_export + ['longitude', 'latitude', 'geological_context', 'found_by']) # write column headers
for o in queryset.order_by('field_id', 'barcode'):
try:
row_list = [o.__dict__.get(k) for k in fields_to_export]
row_list.append(o.point_x())
row_list.append(o.point_y())
gc = GeologicalContext.objects.get(id=o.geological_context_id)
row_list.append(gc.name)
if o.found_by_id is not None:
fb = Person.objects.get(id=o.found_by_id)
row_list.append(fb.__str__())
writer.writerow(row_list)
            except Exception:  # writerow needs an iterable; wrap the bare id
                writer.writerow([o.id])
return response
export_simple_csv.short_description = "Export simple report to csv"
def subtype_arch(self, request, queryset):
for q in queryset:
q.item_type = "Archaeological"
q.save()
occurrence2archaeology(q, survey=True)
subtype_arch.short_description = "Subtype selected as archaeological"
def subtype_bio(self, request, queryset):
for q in queryset:
q.item_type = "Biological"
q.save()
occurrence2biology(q, survey=True)
subtype_bio.short_description = "Subtype selected as biological"
def subtype_geo(self, request, queryset):
for q in queryset:
q.item_type = "Geological"
q.save()
occurrence2geo(q, survey=True)
subtype_geo.short_description = "Subtype selected as geological"
def subtype_agg(self, request, queryset):
for q in queryset:
q.item_type = "Aggregate"
q.save()
occurrence2aggr(q, survey=True)
subtype_agg.short_description = "Subtype selected as aggregate/bulk find"
def get_urls(self):
tool_item_urls = [
            path('import_data/', ImportAccessDatabase.as_view()),  # name comes in via the .views star import
# path(r'^summary/$',permission_required('mlp.change_occurrence',
# login_url='login/')(self.views.Summary.as_view()),
# name="summary"),
]
return tool_item_urls + super(ExcavationOccurrenceAdmin, self).get_urls()
class GeologicalContextResource(resources.ModelResource):
class Meta:
model = GeologicalContext
class GeologicalContextAdmin(projects.admin.PaleoCoreLocalityAdminGoogle):
resource_class = GeologicalContextResource
change_list_template = 'admin/psr/psr_change_list.html'
readonly_fields = default_read_only_fields + ('date_collected', 'context_type', 'basis_of_record', 'recorded_by')
list_filter = ['context_type', 'basis_of_record', 'collecting_method', 'geology_type', 'sediment_presence']
list_per_page = 500
search_fields = ['id', 'item_scientific_name', 'item_description', 'barcode', 'cat_number', 'name', 'geology_type',
'context_type']
list_display = ('name', 'context_type', 'geology_type')
fieldsets = (
('Record Details', {
'fields': [('name', 'context_type',),
('basis_of_record', 'collecting_method'),
('recorded_by'),
('remarks',),
('date_collected', 'date_created', 'date_last_modified')]
}),
('Geology Details', {
'fields': [('geology_type'), ('description'),
]
}),
('Cave Details', {
'fields': [('dip', 'strike', 'color', 'texture', 'height', 'width', 'depth'),
('slope_character'), ('sediment_presence', 'sediment_character',),
('cave_mouth_character'), ('rockfall_character'), ('speleothem_character')
],
'classes': ['collapse']
}),
('Section Details', {
'fields': [('size_of_loess', 'loess_mean_thickness', 'loess_max_thickness'),
('loess_landscape_position', 'loess_surface_inclination'),
('loess_presence_coarse_components', 'loess_amount_coarse_components'),
('loess_number_sediment_layers', 'loess_number_soil_horizons', 'loess_number_cultural_horizons',
'loess_number_coarse_layers'),
],
'classes': ['collapse']
}),
('Location', {
'fields': [('geom', 'point')]
}),
('Problems', {
'fields': [('problem', 'problem_comment'),
('error_notes')
],
'classes': ['collapse']
}),
)
cave_attributes = ['dip', 'strike', 'color', 'texture', 'height', 'width', 'depth',
'slope_character', 'sediment_presence', 'sediment_character',
'cave_mouth_character', 'rockfall_character', 'speleothem_character']
profile_attributes = ['size_of_loess', 'loess_mean_thickness', 'loess_max_thickness',
'loess_landscape_position', 'loess_surface_inclination',
'loess_presence_coarse_components', 'loess_amount_coarse_components',
'loess_number_sediment_layers', 'loess_number_soil_horizons',
'loess_number_cultural_horizons', 'loess_number_coarse_layers',
'loess_presence_vertical_profile']
save_as = True
formfield_overrides = psrformfield
inlines = [ImagesInline]
def export_simple_csv(self, request, queryset):
fields_to_export = ['id', 'name', 'context_type', 'geology_type', 'description',
'dip', 'strike', 'color', 'texture', 'height', 'width', 'depth',
'slope_character', 'sediment_presence', 'sediment_character',
'cave_mouth_character', 'rockfall_character', 'speleothem_character',
'size_of_loess', 'loess_mean_thickness', 'loess_max_thickness',
'loess_landscape_position', 'loess_surface_inclination',
'loess_presence_coarse_components', 'loess_amount_coarse_components',
'loess_number_sediment_layers', 'loess_number_soil_horizons',
'loess_number_cultural_horizons', 'loess_number_coarse_layers',
'loess_presence_vertical_profile',
'point', 'geom','basis_of_record', 'collecting_method',
'date_collected', 'date_created', 'date_last_modified',
'problem', 'problem_comment', 'remarks', 'georeference_remarks']
response = HttpResponse(content_type='text/csv') # declare the response type
response['Content-Disposition'] = 'attachment; filename="PSR_Geological-Contexts.csv"' # declare the file name
writer = unicodecsv.writer(response) # open a .csv writer
writer.writerow(fields_to_export + ['longitude', 'latitude', 'found_by']) # write column headers
for o in queryset.order_by('field_id', 'barcode'):
try:
row_list = [o.__dict__.get(k) for k in fields_to_export]
row_list.append(o.point_x())
row_list.append(o.point_y())
if o.found_by_id is not None:
fb = Person.objects.get(id=o.found_by_id)
row_list.append(fb.__str__())
writer.writerow(row_list)
            except Exception:  # writerow needs an iterable; wrap the bare id
                writer.writerow([o.id])
return response
export_simple_csv.short_description = "Export simple report to csv"
def get_urls(self):
tool_item_urls = [
            path('import_data/', ImportShapefile.as_view()),  # name comes in via the .views star import
# path(r'^summary/$',permission_required('mlp.change_occurrence',
# login_url='login/')(self.views.Summary.as_view()),
# name="summary"),
]
return tool_item_urls + super(GeologicalContextAdmin, self).get_urls()
class LithicInline(admin.StackedInline):
model = Lithic
extra = 0
fk_name = 'archaeology_ptr'
readonly_fields = ['archaeology_ptr_id']
fields = lithic_fields
class CeramicInline(admin.StackedInline):
model = Ceramic
extra = 0
fk_name = 'archaeology_ptr'
readonly_fields = ['archaeology_ptr_id']
fields = ('type', 'decorated')
class BoneInline(admin.StackedInline):
model = Bone
extra = 0
fk_name = 'archaeology_ptr'
readonly_fields = ['archaeology_ptr_id']
fields = ('part', 'cutmarks', 'burning')
class ArchaeologyResource(resources.ModelResource):
class Meta:
model = Archaeology
class ArchaeologyAdmin(OccurrenceAdmin):
model = Archaeology
resource_class = ArchaeologyResource
# empty_value_display = '-empty-'
list_display = ('occurrence_ptr_id', 'archaeology_type', 'geological_context')
list_filter = ['archaeology_type'] + default_occurrence_filter
formfield_overrides = psrformfield
fieldsets = (OccurrenceAdmin.fieldsets[0],
('Find Details', {
'fields': [('archaeology_type', 'period'),
('length_mm', 'width_mm', 'thick_mm', 'weight', ),
('archaeology_remarks', ),
('collecting_method',),
('collector', 'finder', 'found_by'),
('item_count', 'field_number',),
]
}),
OccurrenceAdmin.fieldsets[2])
actions = ['export_simple_csv', 'subtype_lithic', 'subtype_bone', 'subtype_ceramic']
inlines = [LithicInline, BoneInline, CeramicInline, ImagesInline]
def export_simple_csv(self, request, queryset):
fields_to_export = arch_export_fields
occurrence_export = ['field_id', 'find_type', 'item_description', 'item_type', 'item_count',
'finder', 'collector', 'collecting_method', 'date_collected',
'point', 'geom']
lithic_export = list(lithic_fields)
ceramic_export = ['type', 'decorated']
bone_export = ['cutmarks', 'burning', 'part']
response = HttpResponse(content_type='text/csv') # declare the response type
response['Content-Disposition'] = 'attachment; filename="PSR_Survey-Archaeology.csv"' # declare the file name
writer = unicodecsv.writer(response) # open a .csv writer
writer.writerow(fields_to_export + ['longitude', 'latitude', 'found_by'] + occurrence_export + lithic_export + ceramic_export + bone_export +
['geological_context']) # write column headers
for o in queryset.order_by('field_id', 'barcode'):
try:
row_list = [o.__dict__.get(k) for k in fields_to_export]
row_list.append(o.point_x())
row_list.append(o.point_y())
if o.found_by_id is not None:
fb = Person.objects.get(id=o.found_by_id)
row_list.append(fb.__str__())
row_list = row_list + [o.__dict__.get(k) for k in occurrence_export]
if o.__dict__.get('find_type') in PSR_LITHIC_VOCABULARY:
l = Lithic.objects.get(id=o.__dict__.get('id'))
row_list = row_list + [l.__dict__.get(k) for k in lithic_export]
else:
row_list = row_list + [None for k in lithic_export]
if o.__dict__.get('find_type') in PSR_CERAMIC_VOCABULARY:
c = Ceramic.objects.get(id=o.__dict__.get('id'))
row_list = row_list + [c.__dict__.get(k) for k in ceramic_export]
else:
row_list = row_list + [None for k in ceramic_export]
if o.__dict__.get('find_type') in PSR_BONE_VOCABULARY:
                    b = Bone.objects.get(id=o.__dict__.get('id'))  # was Ceramic: copy-paste slip in the bone branch
                    row_list = row_list + [b.__dict__.get(k) for k in bone_export]
else:
row_list = row_list + [None for k in bone_export]
gc = GeologicalContext.objects.get(id=o.geological_context_id)
row_list.append(gc.name)
writer.writerow(row_list)
            except Exception:  # writerow needs an iterable; wrap the bare id
                writer.writerow([o.id])
return response
export_simple_csv.short_description = "Export simple report to csv"
def subtype_lithic(self, request, queryset):
for q in queryset:
archaeology2lithic(q, survey=True)
subtype_lithic.short_description = "Subtype selected as survey lithic(s)"
def subtype_bone(self, request, queryset):
for q in queryset:
archaeology2bone(q, survey=True)
subtype_bone.short_description = "Subtype selected as survey bone(s)"
def subtype_ceramic(self, request, queryset):
for q in queryset:
archaeology2ceramic(q, survey=True)
subtype_ceramic.short_description = "Subtype selected as survey ceramic(s)"
class BiologyResource(resources.ModelResource):
class Meta:
model = Biology
class BiologyAdmin(OccurrenceAdmin):
model = Biology
resource_class = BiologyResource
# empty_value_display = '-empty-'
list_display = ('occurrence_ptr_id', 'biology_type', 'geological_context')
list_filter = ['biology_type'] + default_occurrence_filter
formfield_overrides = psrformfield
fieldsets = (OccurrenceAdmin.fieldsets[0],
('Find Details', {
'fields': [('biology_type', ),
('sex', 'life_stage', 'size_class',),
('taxon', 'identification_qualifier'),
('collecting_method',),
('collector', 'finder', 'found_by'),
('item_count', 'field_number',),
]
}),
OccurrenceAdmin.fieldsets[2])
fields_to_export = bio_export_fields
actions = ['export_simple_csv']
def export_simple_csv(self, request, queryset):
fields_to_export = bio_export_fields
occurrence_export = ['field_id', 'find_type', 'item_description', 'item_type', 'item_count',
'finder', 'collector', 'collecting_method', 'date_collected',
'point', 'geom']
response = HttpResponse(content_type='text/csv') # declare the response type
response['Content-Disposition'] = 'attachment; filename="PSR_Survey-Bio.csv"' # declare the file name
writer = unicodecsv.writer(response) # open a .csv writer
writer.writerow(
fields_to_export + ['found_by'] + occurrence_export + ['geological_context']) # write column headers
for o in queryset.order_by('field_id', 'barcode'):
try:
row_list = [o.__dict__.get(k) for k in fields_to_export]
if o.found_by_id is not None:
fb = Person.objects.get(id=o.found_by_id)
row_list.append(fb.__str__())
row_list = row_list + [o.__dict__.get(k) for k in occurrence_export]
gc = GeologicalContext.objects.get(id=o.geological_context_id)
row_list.append(gc.name)
writer.writerow(row_list)
            except Exception:  # writerow needs an iterable; wrap the bare id
                writer.writerow([o.id])
return response
export_simple_csv.short_description = "Export simple report to csv"
class GeologyResource(resources.ModelResource):
class Meta:
model = Geology
class GeologyAdmin(OccurrenceAdmin):
model = Geology
resource_class = GeologyResource
# empty_value_display = '-empty-'
list_display = ('occurrence_ptr_id', 'geology_type', 'geological_context')
list_filter = ['geology_type'] + default_occurrence_filter
formfield_overrides = psrformfield
fieldsets = (OccurrenceAdmin.fieldsets[0],
('Find Details', {
'fields': [('geology_type',),
('dip', 'strike', 'color', 'texture'),
('collecting_method',),
('collector', 'finder', 'found_by'),
('item_count', 'field_number',),
]
}),
OccurrenceAdmin.fieldsets[2])
fields_to_export = geo_export_fields
actions = ['export_simple_csv']
def export_simple_csv(self, request, queryset):
fields_to_export = geo_export_fields
occurrence_export = ['field_id', 'find_type', 'item_description', 'item_type', 'item_count',
'finder', 'collector', 'collecting_method', 'date_collected',
'point', 'geom']
response = HttpResponse(content_type='text/csv') # declare the response type
response['Content-Disposition'] = 'attachment; filename="PSR_Survey-Geo.csv"' # declare the file name
writer = unicodecsv.writer(response) # open a .csv writer
writer.writerow(
fields_to_export + ['found_by'] + occurrence_export + ['geological_context']) # write column headers
for o in queryset.order_by('field_id', 'barcode'):
try:
row_list = [o.__dict__.get(k) for k in fields_to_export]
if o.found_by_id is not None:
fb = Person.objects.get(id=o.found_by_id)
row_list.append(fb.__str__())
row_list = row_list + [o.__dict__.get(k) for k in occurrence_export]
gc = GeologicalContext.objects.get(id=o.geological_context_id)
row_list.append(gc.name)
writer.writerow(row_list)
            except Exception:  # writerow needs an iterable; wrap the bare id
                writer.writerow([o.id])
return response
export_simple_csv.short_description = "Export simple report to csv"
class AggregateResource(resources.ModelResource):
class Meta:
model = Aggregate
class AggregateAdmin(OccurrenceAdmin):
model = Aggregate
resource_class = AggregateResource
# empty_value_display = '-empty-'
list_display = ('occurrence_ptr_id', 'screen_size', 'geological_context')
list_filter = default_occurrence_filter
formfield_overrides = psrformfield
fieldsets = (OccurrenceAdmin.fieldsets[0],
('Find Details', {
'fields': [('screen_size', 'counts'),
('burning', 'bone', 'microfauna', 'molluscs', 'pebbles'),
('smallplatforms', 'smalldebris', 'tinyplatforms', 'tinydebris'),
('collecting_method',),
('collector', 'finder', 'found_by'),
('item_count', 'field_number',),
]
}),
OccurrenceAdmin.fieldsets[2])
fields_to_export = agg_export_fields
actions = ['export_simple_csv']
def export_simple_csv(self, request, queryset):
fields_to_export = agg_export_fields
occurrence_export = ['field_id', 'find_type', 'item_description', 'item_type', 'item_count',
'finder', 'collector', 'collecting_method', 'date_collected',
'point', 'geom']
response = HttpResponse(content_type='text/csv') # declare the response type
response['Content-Disposition'] = 'attachment; filename="PSR_Survey-Aggregates.csv"' # declare the file name
writer = unicodecsv.writer(response) # open a .csv writer
writer.writerow(
fields_to_export + ['found_by'] + occurrence_export + ['geological_context']) # write column headers
for o in queryset.order_by('field_id', 'barcode'):
try:
row_list = [o.__dict__.get(k) for k in fields_to_export]
if o.found_by_id is not None:
fb = Person.objects.get(id=o.found_by_id)
row_list.append(fb.__str__())
row_list = row_list + [o.__dict__.get(k) for k in occurrence_export]
gc = GeologicalContext.objects.get(id=o.geological_context_id)
row_list.append(gc.name)
writer.writerow(row_list)
            except Exception:  # writerow needs an iterable; wrap the bare id
                writer.writerow([o.id])
return response
export_simple_csv.short_description = "Export simple report to csv"
class ExcavLithicInline(admin.StackedInline):
model = ExcavatedLithic
extra = 0
fk_name = 'excavatedarchaeology_ptr'
readonly_fields = ['excavatedarchaeology_ptr_id']
fields = lithic_fields
class ExcavCeramicInline(admin.StackedInline):
model = ExcavatedCeramic
extra = 0
fk_name = 'excavatedarchaeology_ptr'
readonly_fields = ['excavatedarchaeology_ptr_id']
fields = ('type', 'decorated')
class ExcavBoneInline(admin.StackedInline):
model = ExcavatedBone
extra = 0
fk_name = 'excavatedarchaeology_ptr'
readonly_fields = ['excavatedarchaeology_ptr_id']
fields = ('part', 'cutmarks', 'burning')
class ExcavatedArchaeologyResource(resources.ModelResource):
class Meta:
model = ExcavatedArchaeology
class ExcavatedArchaeologyAdmin(ExcavationOccurrenceAdmin):
model = ExcavatedArchaeology
resource_class = ExcavatedArchaeologyResource
# empty_value_display = '-empty-'
list_display = ('excavationoccurrence_ptr_id', 'archaeology_type', 'geological_context')
formfield_overrides = psrformfield
fieldsets = (ExcavationOccurrenceAdmin.fieldsets[0],
('Find Details', {
'fields': [('archaeology_type', 'period'),
('length_mm', 'width_mm', 'thick_mm', 'weight', ),
('archaeology_remarks', ),
('level', 'prism'),
('excavator'),
]
}),
ExcavationOccurrenceAdmin.fieldsets[2])
fields_to_export = arch_export_fields
inlines = [ExcavLithicInline, ExcavBoneInline, ExcavCeramicInline]
actions = ['export_simple_csv', 'subtype_lithic', 'subtype_bone', 'subtype_ceramic']
def export_simple_csv(self, request, queryset):
fields_to_export = arch_export_fields
occurrence_export = ['id', 'field_id', 'cat_number', 'type', 'item_type',
'prism', 'point', 'unit', 'level', 'excavator']
lithic_export = list(lithic_fields)
ceramic_export = ['type', 'decorated']
bone_export = ['cutmarks', 'burning', 'part']
response = HttpResponse(content_type='text/csv') # declare the response type
response['Content-Disposition'] = 'attachment; filename="PSR_Excavated-Archaeology.csv"' # declare the file name
writer = unicodecsv.writer(response) # open a .csv writer
writer.writerow(fields_to_export + ['found_by'] + occurrence_export + lithic_export + ceramic_export + bone_export+ ['geological_context']) # write column headers
for o in queryset.order_by('field_id', 'barcode'):
try:
row_list = [o.__dict__.get(k) for k in fields_to_export]
if o.found_by_id is not None:
fb = Person.objects.get(id=o.found_by_id)
row_list.append(fb.__str__())
row_list = row_list + [o.__dict__.get(k) for k in occurrence_export]
if o.__dict__.get('find_type') in PSR_LITHIC_VOCABULARY:
l = Lithic.objects.get(id=o.__dict__.get('id'))
row_list = row_list + [l.__dict__.get(k) for k in lithic_export]
else:
row_list = row_list + [None for k in lithic_export]
if o.__dict__.get('find_type') in PSR_CERAMIC_VOCABULARY:
c = Ceramic.objects.get(id=o.__dict__.get('id'))
row_list = row_list + [c.__dict__.get(k) for k in ceramic_export]
else:
row_list = row_list + [None for k in ceramic_export]
if o.__dict__.get('find_type') in PSR_BONE_VOCABULARY:
                    b = Bone.objects.get(id=o.__dict__.get('id'))  # was Ceramic: copy-paste slip in the bone branch
                    row_list = row_list + [b.__dict__.get(k) for k in bone_export]
else:
row_list = row_list + [None for k in bone_export]
gc = GeologicalContext.objects.get(id=o.geological_context_id)
row_list.append(gc.name)
writer.writerow(row_list)
            except Exception:  # writerow needs an iterable; wrap the bare id
                writer.writerow([o.id])
return response
export_simple_csv.short_description = "Export simple report to csv"
def subtype_lithic(self, request, queryset):
for q in queryset:
archaeology2lithic(q, survey=False)
subtype_lithic.short_description = "Subtype selected as excavated lithic(s)"
def subtype_bone(self, request, queryset):
for q in queryset:
archaeology2bone(q, survey=False)
subtype_bone.short_description = "Subtype selected as excavated bone(s)"
def subtype_ceramic(self, request, queryset):
for q in queryset:
archaeology2ceramic(q, survey=False)
subtype_ceramic.short_description = "Subtype selected as excavated ceramic(s)"
class ExcavatedBiologyResource(resources.ModelResource):
class Meta:
model = ExcavatedBiology
class ExcavatedBiologyAdmin(ExcavationOccurrenceAdmin):
model = ExcavatedBiology
resource_class = BiologyResource
# empty_value_display = '-empty-'
list_display = ('excavationoccurrence_ptr_id', 'biology_type', 'geological_context')
formfield_overrides = psrformfield
fieldsets = (ExcavationOccurrenceAdmin.fieldsets[0],
('Find Details', {
'fields': [('biology_type', ),
('sex', 'life_stage', 'size_class', ),
('taxon', 'identification_qualifier'),
('level', 'prism'),
('excavator'),
]
}),
ExcavationOccurrenceAdmin.fieldsets[2])
actions = ['export_simple_csv']
def export_simple_csv(self, request, queryset):
fields_to_export = bio_export_fields
occurrence_export = ['id', 'field_id', 'cat_number', 'type', 'item_type',
'prism', 'point', 'unit', 'level', 'excavator']
response = HttpResponse(content_type='text/csv') # declare the response type
response['Content-Disposition'] = 'attachment; filename="PSR_Excavated-Bio.csv"' # declare the file name
writer = unicodecsv.writer(response) # open a .csv writer
writer.writerow(fields_to_export + ['found_by'] + occurrence_export + ['geological_context']) # write column headers
for o in queryset.order_by('field_id', 'barcode'):
try:
row_list = [o.__dict__.get(k) for k in fields_to_export]
if o.found_by_id is not None:
fb = Person.objects.get(id=o.found_by_id)
row_list.append(fb.__str__())
row_list = row_list + [o.__dict__.get(k) for k in occurrence_export]
gc = GeologicalContext.objects.get(id=o.geological_context_id)
row_list.append(gc.name)
writer.writerow(row_list)
            except Exception:  # writerow needs an iterable; wrap the bare id
                writer.writerow([o.id])
return response
export_simple_csv.short_description = "Export simple report to csv"
class ExcavatedGeologyResource(resources.ModelResource):
class Meta:
model = ExcavatedGeology
class ExcavatedGeologyAdmin(ExcavationOccurrenceAdmin):
model = ExcavatedGeology
resource_class = ExcavatedGeologyResource
# empty_value_display = '-empty-'
list_display = ('excavationoccurrence_ptr_id', 'geology_type', 'geological_context')
formfield_overrides = psrformfield
fieldsets = (ExcavationOccurrenceAdmin.fieldsets[0],
('Find Details', {
'fields': [('geology_type'),
('color', 'texture'),
('level', 'prism'),
('excavator'),
]
}),
ExcavationOccurrenceAdmin.fieldsets[2])
actions = ['export_simple_csv']
def export_simple_csv(self, request, queryset):
fields_to_export = geo_export_fields
occurrence_export = ['id', 'field_id', 'cat_number', 'type', 'item_type',
'prism', 'point', 'unit', 'level', 'excavator']
response = HttpResponse(content_type='text/csv') # declare the response type
response['Content-Disposition'] = 'attachment; filename="PSR_Excavated-Geo.csv"' # declare the file name
writer = unicodecsv.writer(response) # open a .csv writer
writer.writerow(fields_to_export + ['found_by'] + occurrence_export + ['geological_context']) # write column headers
for o in queryset.order_by('field_id', 'barcode'):
try:
row_list = [o.__dict__.get(k) for k in fields_to_export]
if o.found_by_id is not None:
fb = Person.objects.get(id=o.found_by_id)
row_list.append(fb.__str__())
row_list = row_list + [o.__dict__.get(k) for k in occurrence_export]
gc = GeologicalContext.objects.get(id=o.geological_context_id)
row_list.append(gc.name)
writer.writerow(row_list)
            except Exception:  # writerow needs an iterable; wrap the bare id
                writer.writerow([o.id])
return response
export_simple_csv.short_description = "Export simple report to csv"
class ExcavatedAggregateResource(resources.ModelResource):
class Meta:
model = ExcavatedAggregate
class ExcavatedAggregateAdmin(ExcavationOccurrenceAdmin):
model = ExcavatedAggregate
resource_class = ExcavatedAggregateResource
# empty_value_display = '-empty-'
list_display = ('excavationoccurrence_ptr_id', 'screen_size', 'geological_context')
formfield_overrides = psrformfield
fieldsets = (ExcavationOccurrenceAdmin.fieldsets[0],
('Find Details', {
'fields': [('screen_size', 'counts'),
('burning', 'bone', 'microfauna', 'molluscs', 'pebbles'),
('smallplatforms', 'smalldebris', 'tinyplatforms', 'tinydebris'),
('level', 'prism'),
('excavator'),
]
}),
ExcavationOccurrenceAdmin.fieldsets[2])
actions = ['export_simple_csv']
def export_simple_csv(self, request, queryset):
fields_to_export = agg_export_fields
occurrence_export = ['id', 'field_id', 'cat_number', 'type', 'item_type',
'prism', 'point', 'unit', 'level', 'excavator']
response = HttpResponse(content_type='text/csv') # declare the response type
response['Content-Disposition'] = 'attachment; filename="PSR_Excavated-Buckets.csv"' # declare the file name
writer = unicodecsv.writer(response) # open a .csv writer
writer.writerow(fields_to_export + ['found_by'] + occurrence_export + ['geological_context']) # write column headers
for o in queryset.order_by('field_id', 'barcode'):
try:
row_list = [o.__dict__.get(k) for k in fields_to_export]
if o.found_by_id is not None:
fb = Person.objects.get(id=o.found_by_id)
row_list.append(fb.__str__())
row_list = row_list + [o.__dict__.get(k) for k in occurrence_export]
gc = GeologicalContext.objects.get(id=o.geological_context_id)
row_list.append(gc.name)
writer.writerow(row_list)
            except Exception:  # writerow needs an iterable; wrap the bare id
                writer.writerow([o.id])
return response
export_simple_csv.short_description = "Export simple report to csv"
admin.site.register(Biology, BiologyAdmin)
admin.site.register(Archaeology, ArchaeologyAdmin)
admin.site.register(Geology, GeologyAdmin)
admin.site.register(GeologicalContext, GeologicalContextAdmin)
admin.site.register(Occurrence, OccurrenceAdmin)
admin.site.register(ExcavationOccurrence, ExcavationOccurrenceAdmin)
admin.site.register(ExcavatedBiology, ExcavatedBiologyAdmin)
admin.site.register(ExcavatedArchaeology, ExcavatedArchaeologyAdmin)
admin.site.register(ExcavatedGeology, ExcavatedGeologyAdmin)
admin.site.register(Aggregate, AggregateAdmin)
admin.site.register(ExcavatedAggregate, ExcavatedAggregateAdmin)
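# ---------------------------------------------------------------------------
# Illustrative sketch, not part of the original module: every
# export_simple_csv action above repeats the same pattern -- write a header
# row, then one row per record from a fixed field list, with a minimal
# fallback row when anything goes wrong. The helper below restates that
# pattern with the stdlib csv module over plain dicts; the helper name is
# hypothetical and nothing in this module calls it.
import csv
import io


def _write_simple_report(records, fields):
    """Return CSV text with one row per record, falling back to the id."""
    buffer = io.StringIO()
    writer = csv.writer(buffer)
    writer.writerow(fields)  # header row, mirroring writer.writerow(fields_to_export)
    for record in records:
        try:
            writer.writerow([record[field] for field in fields])
        except Exception:
            # csv.writer.writerow expects an iterable, so wrap the bare id
            writer.writerow([record.get('id')])
    return buffer.getvalue()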
| 44.493763 | 172 | 0.594795 | 4,292 | 42,803 | 5.629543 | 0.089003 | 0.026653 | 0.021439 | 0.015065 | 0.799934 | 0.76343 | 0.740833 | 0.717987 | 0.710579 | 0.675896 | 0 | 0.002049 | 0.292993 | 42,803 | 961 | 173 | 44.540062 | 0.796378 | 0.040815 | 0 | 0.688295 | 0 | 0 | 0.229557 | 0.038739 | 0 | 0 | 0 | 0 | 0 | 1 | 0.035623 | false | 0 | 0.020356 | 0 | 0.291349 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
d5df3528f9e019483ed6f76dd4613e2311812a7c | 145 | py | Python | commanderbot_ext/vote/__init__.py | Arcensoth/commanderbot-ext | e140fda60c004e7b647e7f92638a1d0a76f29e87 | [
"MIT"
] | null | null | null | commanderbot_ext/vote/__init__.py | Arcensoth/commanderbot-ext | e140fda60c004e7b647e7f92638a1d0a76f29e87 | [
"MIT"
] | null | null | null | commanderbot_ext/vote/__init__.py | Arcensoth/commanderbot-ext | e140fda60c004e7b647e7f92638a1d0a76f29e87 | [
"MIT"
] | null | null | null |
from discord.ext import commands
from commanderbot_ext.vote.vote_cog import VoteCog
def setup(bot: commands.Bot):
    bot.add_cog(VoteCog(bot))
| 24.166667 | 50 | 0.793103 | 23 | 145 | 4.869565 | 0.565217 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.117241 | 145 | 6 | 51 | 24.166667 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.5 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
9128f738b7634781d65b07717183fdfa0a15a1cd | 4,339 | py | Python | tests/qrcode/test_limits.py | rsbondi/hermit | 0007d8077547484efa173295090775b3cd5ce75b | [
"Apache-2.0"
] | 1 | 2021-07-23T16:43:06.000Z | 2021-07-23T16:43:06.000Z | tests/qrcode/test_limits.py | rsbondi/hermit | 0007d8077547484efa173295090775b3cd5ce75b | [
"Apache-2.0"
] | null | null | null | tests/qrcode/test_limits.py | rsbondi/hermit | 0007d8077547484efa173295090775b3cd5ce75b | [
"Apache-2.0"
] | 1 | 2021-05-10T22:00:15.000Z | 2021-05-10T22:00:15.000Z |
import base64
import gzip
import random
import string
from numpy import array
import cv2
import pytest
import qrcode
#
# Version 40 QRcode Storage Limits with Low Error Correction
# Mixed Bytes: 23,648 (tests show this library is limited to 23,549)
# Numeric: 7,089
# Alphanumeric: 4,296
# Binary: 2,953
#
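The capacities listed in the comment block above can be captured as a small lookup table, useful for checking whether a payload will fit before attempting encoding (`QR40_L_CAPACITY` and `fits_in_qr40` are our names for this sketch, not part of the `qrcode` library):

```python
# Version-40 QR capacity at error-correction level L, per encoding mode.
# Character counts are the spec limits quoted in the comment block above.
QR40_L_CAPACITY = {
    'numeric': 7089,
    'alphanumeric': 4296,
    'binary': 2953,
}


def fits_in_qr40(mode, length):
    """Return True if `length` characters of `mode` data fit in one symbol."""
    return length <= QR40_L_CAPACITY[mode]
```

The tests below probe exactly these boundaries: each maximum/overflow pair sits one character on either side of the table entry.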
@pytest.mark.qrcode
class TestQRCodeStorage(object):

    def test_numeric_maximum(self):
        N = 7089
        data = ''.join([
            random.choice(string.digits)
            for _ in range(N)])
        qr = qrcode.QRCode(
            version=1,
            error_correction=qrcode.constants.ERROR_CORRECT_L,
            box_size=10,
            border=4,
        )
        qr.add_data(data)
        qr.make()
        assert True

    def test_numeric_overflow(self):
        N = 7090
        data = ''.join([
            random.choice(string.digits)
            for _ in range(N)])
        qr = qrcode.QRCode(
            version=1,
            error_correction=qrcode.constants.ERROR_CORRECT_L,
            box_size=10,
            border=4,
        )
        qr.add_data(data)
        with pytest.raises(qrcode.exceptions.DataOverflowError):
            qr.make()
        assert True

    def test_alphanumeric_maximum(self):
        N = 4296
        data = ''.join([
            random.choice(string.digits + string.ascii_uppercase)
            for _ in range(N)])
        qr = qrcode.QRCode(
            version=1,
            error_correction=qrcode.constants.ERROR_CORRECT_L,
            box_size=10,
            border=4,
        )
        qr.add_data(data)
        qr.make()
        assert True

    def test_alphanumeric_overflow(self):
        N = 4297
        data = ''.join([
            random.choice(string.digits + string.ascii_uppercase)
            for _ in range(N)])
        qr = qrcode.QRCode(
            version=1,
            error_correction=qrcode.constants.ERROR_CORRECT_L,
            box_size=10,
            border=4,
        )
        qr.add_data(data, optimize=0)
        with pytest.raises(qrcode.exceptions.DataOverflowError):
            qr.make()
        assert True

    def test_binary_maximum(self):
        N = 2953
        data = bytes(bytearray((random.getrandbits(8) for i in range(N))))
        qr = qrcode.QRCode(
            version=1,
            error_correction=qrcode.constants.ERROR_CORRECT_L,
            box_size=10,
            border=4,
        )
        qr.add_data(data)
        qr.make()
        assert True

    def test_binary_overflow(self):
        N = 2954
        data = bytes(bytearray((random.getrandbits(8) for i in range(N))))
        qr = qrcode.QRCode(
            version=1,
            error_correction=qrcode.constants.ERROR_CORRECT_L,
            box_size=10,
            border=4,
        )
        qr.add_data(data, optimize=0)
        with pytest.raises(qrcode.exceptions.DataOverflowError):
            qr.make()
        assert True

    def test_bits_maximum(self):
        N = 23549
        data = 1 << N
        qr = qrcode.QRCode(
            version=1,
            error_correction=qrcode.constants.ERROR_CORRECT_L,
            box_size=10,
            border=4,
        )
        qr.add_data(data)
        qr.make()
        assert True

    def test_bits_overflow(self):
        N = 23550
        data = 1 << N
        qr = qrcode.QRCode(
            version=1,
            error_correction=qrcode.constants.ERROR_CORRECT_L,
            box_size=10,
            border=4,
        )
        qr.add_data(data)
        with pytest.raises(qrcode.exceptions.DataOverflowError):
            qr.make()
        assert True

    def test_alphanumeric_compression_maximum(self):
        M = 4
        N = 2000
        data = ''.join([
            random.choice(string.digits + string.ascii_letters) * M
            for _ in range(N)])
        data = data.encode('utf-8')
        print(len(data))  # 8000 (N choices of M characters each)
        data = gzip.compress(data)
        data = base64.b32encode(data)
        print(len(data))
        qr = qrcode.QRCode(
            version=1,
            error_correction=qrcode.constants.ERROR_CORRECT_L,
            box_size=10,
            border=4,
        )
        qr.add_data(data)
        qr.make()
        assert len(data) < 4296
        assert True
| 26.457317 | 74 | 0.539756 | 486 | 4,339 | 4.670782 | 0.207819 | 0.038767 | 0.055507 | 0.08326 | 0.737885 | 0.737885 | 0.737885 | 0.723789 | 0.704846 | 0.704846 | 0 | 0.044444 | 0.367366 | 4,339 | 163 | 75 | 26.619632 | 0.782514 | 0.041484 | 0 | 0.690647 | 0 | 0 | 0.001205 | 0 | 0 | 0 | 0 | 0 | 0.071942 | 1 | 0.064748 | false | 0 | 0.057554 | 0 | 0.129496 | 0.014388 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
e66f938ad7719315769b3d3e9f745dcf94f93876 | 254 | py | Python | sample_applications/AgentlessIdpSample/Controller/ErrorController.py | pingidentity/pf-agentless-ik-sample-python | 59c0f73ce30f7518201707bda16bf04ca8c362e7 | [
"Apache-2.0"
] | null | null | null | sample_applications/AgentlessIdpSample/Controller/ErrorController.py | pingidentity/pf-agentless-ik-sample-python | 59c0f73ce30f7518201707bda16bf04ca8c362e7 | [
"Apache-2.0"
] | null | null | null | sample_applications/AgentlessIdpSample/Controller/ErrorController.py | pingidentity/pf-agentless-ik-sample-python | 59c0f73ce30f7518201707bda16bf04ca8c362e7 | [
"Apache-2.0"
] | null | null | null |
from flask import render_template

from sample_applications.AgentlessIdpSample.Controller.RequestController import RequestController


class ErrorController(RequestController):

    def handle(self, request):
        return render_template('Error.html')
| 25.4 | 97 | 0.818898 | 25 | 254 | 8.2 | 0.76 | 0.136585 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.122047 | 254 | 9 | 98 | 28.222222 | 0.919283 | 0 | 0 | 0 | 0 | 0 | 0.039526 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.4 | 0.2 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 5 |
e672439b9f9d536e32236c3336cd037042960d39 | 57 | py | Python | OurTasks/01/05/05.py | nakov/Python-Course-SoftUni | b6036064c259adbdae4e2d87b67230b9cf9ddefc | [
"MIT"
] | 6 | 2017-06-09T17:45:28.000Z | 2020-03-31T11:59:39.000Z | OurTasks/01/05/05.py | nakov/Python-Course-SoftUni | b6036064c259adbdae4e2d87b67230b9cf9ddefc | [
"MIT"
] | null | null | null | OurTasks/01/05/05.py | nakov/Python-Course-SoftUni | b6036064c259adbdae4e2d87b67230b9cf9ddefc | [
"MIT"
] | 1 | 2019-07-02T11:26:00.000Z | 2019-07-02T11:26:00.000Z |
import math

print("{0:.2f}".format(math.sqrt(12345)))
| 14.25 | 43 | 0.649123 | 9 | 57 | 4.111111 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137255 | 0.105263 | 57 | 3 | 44 | 19 | 0.588235 | 0 | 0 | 0 | 0 | 0 | 0.122807 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0.5 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 5 |
e675abd327b49f64493e9b622d289d60ba79be00 | 150 | py | Python | botworks/config_constants.py | doubleyuhtee/botworks | f9ffc9e77b0b924a42e19d71542976e7cf406725 | [
"MIT"
] | null | null | null | botworks/config_constants.py | doubleyuhtee/botworks | f9ffc9e77b0b924a42e19d71542976e7cf406725 | [
"MIT"
] | null | null | null | botworks/config_constants.py | doubleyuhtee/botworks | f9ffc9e77b0b924a42e19d71542976e7cf406725 | [
"MIT"
] | null | null | null |
SLEEP_TIME = "sleep_time"
ERROR_SLEEP_TIME = "error_sleep_time"
LOG_LEVEL = "log_level"
UNHEALTHY_HANDLER = "unhealthy"
HEALTHY_HANDLER = "healthy"
| 18.75 | 37 | 0.786667 | 20 | 150 | 5.4 | 0.4 | 0.333333 | 0.259259 | 0.351852 | 0.342593 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.113333 | 150 | 7 | 38 | 21.428571 | 0.81203 | 0 | 0 | 0 | 0 | 0 | 0.344595 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
e67695270cbc08d78233fa7bcd4451a7a757bb10 | 156 | py | Python | run_tests.py | cmpadden/microdot | a5a14ff9f327925108303cf0d7f7f69d9b36d1b1 | [
"MIT"
] | 3 | 2021-01-31T22:14:34.000Z | 2021-08-21T23:28:51.000Z | run_tests.py | dpgeorge/microdot | dfbe2edd797153fc9be40bc1928d93bdee7e7be5 | [
"MIT"
] | null | null | null | run_tests.py | dpgeorge/microdot | dfbe2edd797153fc9be40bc1928d93bdee7e7be5 | [
"MIT"
] | null | null | null |
import sys
sys.path.insert(0, 'microdot')
sys.path.insert(1, 'microdot-asyncio')
sys.path.insert(2, 'tests/libs')
import unittest
unittest.main('tests')
| 15.6 | 38 | 0.730769 | 24 | 156 | 4.75 | 0.541667 | 0.184211 | 0.342105 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.021127 | 0.089744 | 156 | 9 | 39 | 17.333333 | 0.78169 | 0 | 0 | 0 | 0 | 0 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
e6bc009f43c45787520fd30b8b09eb2b7474d956 | 25 | py | Python | SinglePackage/single_package/helpers_single_package.py | CJosephides/PythonApplicationStructures | b82385f7a35f3097eac08011d24d9d1429cee171 | [
"RSA-MD"
] | 1 | 2019-02-05T11:45:11.000Z | 2019-02-05T11:45:11.000Z | SinglePackage/single_package/helpers_single_package.py | CJosephides/PythonApplicationStructures | b82385f7a35f3097eac08011d24d9d1429cee171 | [
"RSA-MD"
] | null | null | null | SinglePackage/single_package/helpers_single_package.py | CJosephides/PythonApplicationStructures | b82385f7a35f3097eac08011d24d9d1429cee171 | [
"RSA-MD"
] | null | null | null |
class Helper():
    pass
| 8.333333 | 15 | 0.6 | 3 | 25 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.28 | 25 | 2 | 16 | 12.5 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 5 |
e6eee65d7ee1af4e888a25e45febc874ab83ca9e | 8,328 | py | Python | djsani/insurance/forms.py | carthagecollege/django-djsani | ad95158f443c9d4e0cd0cea2a99ebe7062d38ce5 | [
"BSD-3-Clause"
] | null | null | null | djsani/insurance/forms.py | carthagecollege/django-djsani | ad95158f443c9d4e0cd0cea2a99ebe7062d38ce5 | [
"BSD-3-Clause"
] | null | null | null | djsani/insurance/forms.py | carthagecollege/django-djsani | ad95158f443c9d4e0cd0cea2a99ebe7062d38ce5 | [
"BSD-3-Clause"
] | null | null | null |
# -*- coding: utf-8 -*-
"""Views for the insurance forms."""

from django import forms
from django.conf import settings
from django.core.validators import FileExtensionValidator

from djsani.insurance.models import StudentHealthInsurance
from djtools.fields import REQ_CSS
from djtools.fields import STATE_CHOICES
from djtools.fields.localflavor import USPhoneNumberField

POLICY_CHOICES = (
    ('', '---select---'),
    ('HMO', 'HMO'),
    ('PPO', 'PPO'),
    ('POS', 'POS'),
    ('Gov', 'Medicaid'),
    ('Mil', 'Military'),
    ('Int', 'International'),
)

ALLOWED_IMAGE_EXTENSIONS = settings.ALLOWED_IMAGE_EXTENSIONS
class AthleteForm(forms.ModelForm):
    """Insurance form for student athletes."""

    def __init__(self, *args, **kwargs):
        """Initialise the form with a manager and insurance object."""
        self.manager = kwargs.pop('manager', None)
        self.insurance = kwargs.pop('insurance', None)
        super(AthleteForm, self).__init__(*args, **kwargs)

    opt_out = forms.CharField(
        widget=forms.HiddenInput(), required=False,
    )
    primary_policy_holder = forms.CharField(
        label="Policy holder",
        max_length=128,
        required=False,
        widget=forms.TextInput(attrs=REQ_CSS),
    )
    primary_dob = forms.DateField(
        label="Birth date (policy holder)",
        help_text="Format: mm/dd/yyyy",
        required=False,
        widget=forms.TextInput(attrs={'class': 'required date'}),
    )
    primary_company = forms.CharField(
        label="Insurance company",
        max_length=128,
        required=False,
        widget=forms.TextInput(attrs=REQ_CSS),
    )
    primary_phone = USPhoneNumberField(
        label="Insurance phone number",
        max_length=12,
        help_text="Please provide the toll free customer service number",
        required=False,
        widget=forms.TextInput(attrs={'class': 'required'}),
    )
    primary_policy_address = forms.CharField(
        label="Insurance address",
        widget=forms.Textarea(),
        required=False,
    )
    primary_member_id = forms.CharField(
        label="Member ID",
        max_length=64,
        required=False,
        widget=forms.TextInput(attrs=REQ_CSS),
    )
    primary_group_no = forms.CharField(
        label="Group number",
        max_length=64,
        required=False,
        widget=forms.TextInput(attrs=REQ_CSS),
    )
    primary_policy_type = forms.CharField(
        label="Type of policy",
        required=False,
        widget=forms.Select(choices=POLICY_CHOICES, attrs=REQ_CSS),
    )
    primary_policy_state = forms.CharField(
        label="If Medicaid, in which state?",
        widget=forms.Select(choices=STATE_CHOICES),
        required=False,
    )
    primary_card_front = forms.FileField(
        label="Insurance Card Front",
        help_text="Photo/Scan of your insurance card",
        validators=[
            FileExtensionValidator(allowed_extensions=ALLOWED_IMAGE_EXTENSIONS),
        ],
        required=False,
    )
    primary_card_back = forms.FileField(
        label="Insurance Card Back",
        help_text="Photo/Scan of your insurance card",
        validators=[
            FileExtensionValidator(allowed_extensions=ALLOWED_IMAGE_EXTENSIONS),
        ],
        required=False,
    )
    # secondary
    secondary_policy_holder = forms.CharField(
        label="Policy holder",
        max_length=128,
        required=False,
        widget=forms.TextInput(attrs=REQ_CSS),
    )
    secondary_dob = forms.DateField(
        label="Birth date (policy holder)",
        help_text="Format: mm/dd/yyyy",
        required=False,
        widget=forms.TextInput(attrs={'class': 'required date'}),
    )
    secondary_company = forms.CharField(
        label="Insurance company",
        max_length=128,
        required=False,
        widget=forms.TextInput(attrs=REQ_CSS),
    )
    secondary_phone = USPhoneNumberField(
        label="Insurance phone number",
        max_length=12,
        help_text="Please provide the toll free customer service number",
        required=False,
        widget=forms.TextInput(attrs={'class': 'required phoneUS'}),
    )
    secondary_policy_address = forms.CharField(
        label="Insurance address",
        widget=forms.Textarea(),
        required=False,
    )
    secondary_member_id = forms.CharField(
        label="Member ID",
        max_length=64,
        required=False,
        widget=forms.TextInput(attrs=REQ_CSS),
    )
    secondary_group_no = forms.CharField(
        label="Group number",
        max_length=64,
        required=False,
        widget=forms.TextInput(attrs=REQ_CSS),
    )
    secondary_policy_type = forms.CharField(
        label="Type of policy",
        required=False,
        widget=forms.Select(choices=POLICY_CHOICES, attrs=REQ_CSS),
    )
    secondary_policy_state = forms.CharField(
        label="If Medicaid, in which state?",
        widget=forms.Select(choices=STATE_CHOICES),
        required=False,
    )
    # tertiary
    tertiary_policy_holder = forms.CharField(
        label="Policy holder",
        max_length=128,
        required=False,
        widget=forms.TextInput(attrs=REQ_CSS),
    )
    tertiary_dob = forms.DateField(
        label="Birth date (policy holder)",
        help_text="Format: mm/dd/yyyy",
        required=False,
        widget=forms.TextInput(attrs={'class': 'required date'}),
    )
    tertiary_company = forms.CharField(
        label="Insurance company",
        max_length=128,
        required=False,
        widget=forms.TextInput(attrs=REQ_CSS),
    )
    tertiary_phone = USPhoneNumberField(
        label="Insurance phone number",
        max_length=12,
        help_text="Please provide the toll free customer service number",
        required=False,
        widget=forms.TextInput(attrs={'class': 'required phoneUS'}),
    )
    tertiary_policy_address = forms.CharField(
        label="Insurance address",
        widget=forms.Textarea,
        required=False,
    )
    tertiary_member_id = forms.CharField(
        label="Member ID",
        max_length=64,
        required=False,
        widget=forms.TextInput(attrs=REQ_CSS),
    )
    tertiary_group_no = forms.CharField(
        label="Group number",
        max_length=64,
        required=False,
        widget=forms.TextInput(attrs=REQ_CSS),
    )
    tertiary_policy_type = forms.CharField(
        label="Type of policy",
        required=False,
        widget=forms.Select(choices=POLICY_CHOICES, attrs=REQ_CSS),
    )
    tertiary_policy_state = forms.CharField(
        label="If Medicaid, in which state?",
        widget=forms.Select(choices=STATE_CHOICES),
        required=False,
    )
    tertiary_card = forms.FileField(
        label="Insurance Card",
        help_text="Photo/Scan of your insurance card",
        validators=[
            FileExtensionValidator(allowed_extensions=ALLOWED_IMAGE_EXTENSIONS),
        ],
        required=False,
    )
    class Meta:
        """Attributes about the form class."""

        model = StudentHealthInsurance
        exclude = (
            'college_id',
            'created_at',
            'manager',
            'primary_card_front_status',
            'primary_card_back_status',
        )

    def clean(self):
        """Form validation for all fields."""
        cd = self.cleaned_data
        insurance = self.insurance
        manager = self.manager
        if manager.sports() and not cd['opt_out']:
            front = cd.get('primary_card_front')
            back = cd.get('primary_card_back')
            if not front or not back:
                if not insurance:
                    error = "Required Field"
                    self._errors['primary_card_front'] = self.error_class(
                        [error],
                    )
                    self._errors['primary_card_back'] = self.error_class(
                        [error],
                    )
        return self.cleaned_data
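The `clean()` method above makes the card uploads conditionally required: both fields are flagged whenever either upload is missing, but only for athletes who have not opted out and have no insurance record on file. The decision logic, stripped of Django, can be sketched as a plain function (the name `missing_card_errors` and its boolean parameters are ours, for illustration only):

```python
def missing_card_errors(is_athlete, opted_out, has_insurance, front, back):
    """Return the list of form fields that should receive a 'Required Field' error.

    Mirrors the clean() logic above: both card fields are flagged when either
    upload is missing, but only for athletes without an opt-out or an
    existing insurance record.
    """
    if is_athlete and not opted_out:
        if (not front or not back) and not has_insurance:
            return ['primary_card_front', 'primary_card_back']
    return []
```

This mirrors a common pattern in Django form validation: cross-field requirements that cannot be expressed per-field go in `clean()`, with errors attached to the individual fields.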
class StudentForm(AthleteForm):
    """Insurance form for students."""

    class Meta:
        """Attributes about the form class."""

        model = StudentHealthInsurance
        exclude = (
            'college_id',
            'created_at',
            'manager',
            'primary_card_front_status',
            'primary_card_back_status',
        )
| 31.191011 | 80 | 0.614673 | 861 | 8,328 | 5.770035 | 0.166086 | 0.081119 | 0.080314 | 0.101449 | 0.748188 | 0.724034 | 0.724034 | 0.724034 | 0.724034 | 0.724034 | 0 | 0.006144 | 0.276897 | 8,328 | 266 | 81 | 31.308271 | 0.818831 | 0.035183 | 0 | 0.551867 | 0 | 0 | 0.15906 | 0.012255 | 0 | 0 | 0 | 0 | 0 | 1 | 0.008299 | false | 0 | 0.029046 | 0 | 0.186722 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
fc685cc61c1f687f08051621ce4397b397bcd259 | 21 | py | Python | hello_world.py | cwatkins-physna/profiles-rest-api | f41ce527f7e39128310b5ad40e69ba9563cb2a24 | [
"MIT"
] | null | null | null | hello_world.py | cwatkins-physna/profiles-rest-api | f41ce527f7e39128310b5ad40e69ba9563cb2a24 | [
"MIT"
] | 6 | 2020-06-06T01:22:40.000Z | 2022-02-10T11:42:11.000Z | hello_world.py | cwatkins-physna/profiles-rest-api | f41ce527f7e39128310b5ad40e69ba9563cb2a24 | [
"MIT"
] | null | null | null |
print('Hello butts')
| 10.5 | 20 | 0.714286 | 3 | 21 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.095238 | 21 | 1 | 21 | 21 | 0.789474 | 0 | 0 | 0 | 0 | 0 | 0.52381 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 5 |
fc7425056b77e759c82e5d39da006bab784c3a30 | 222 | py | Python | env/lib/python3.6/site-packages/web3/utils/exception.py | bopopescu/smart_contracts7 | 40a487cb3843e86ab5e4cb50b1aafa2095f648cd | [
"Apache-2.0"
] | null | null | null | env/lib/python3.6/site-packages/web3/utils/exception.py | bopopescu/smart_contracts7 | 40a487cb3843e86ab5e4cb50b1aafa2095f648cd | [
"Apache-2.0"
] | null | null | null | env/lib/python3.6/site-packages/web3/utils/exception.py | bopopescu/smart_contracts7 | 40a487cb3843e86ab5e4cb50b1aafa2095f648cd | [
"Apache-2.0"
] | 1 | 2020-07-24T17:53:25.000Z | 2020-07-24T17:53:25.000Z |
import sys
# raise MyException() from original_exception compatibility
if sys.version_info.major == 2:
from .exception_py2 import raise_from # noqa: F401
else:
from .exception_py3 import raise_from # noqa: F401
| 27.75 | 59 | 0.761261 | 31 | 222 | 5.258065 | 0.580645 | 0.159509 | 0.184049 | 0.233129 | 0.282209 | 0 | 0 | 0 | 0 | 0 | 0 | 0.048913 | 0.171171 | 222 | 7 | 60 | 31.714286 | 0.836957 | 0.355856 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.6 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
fc8606d750db5ff5c4801f8e982dadc4d312b2a8 | 92 | py | Python | MyUser/admin.py | mrvafa/blog-api | cef0773b3199bfb785126ce82567fd7fc78e6c89 | [
"MIT"
] | null | null | null | MyUser/admin.py | mrvafa/blog-api | cef0773b3199bfb785126ce82567fd7fc78e6c89 | [
"MIT"
] | null | null | null | MyUser/admin.py | mrvafa/blog-api | cef0773b3199bfb785126ce82567fd7fc78e6c89 | [
"MIT"
] | null | null | null |
from django.contrib import admin
from MyUser.models import User
admin.site.register(User)
| 15.333333 | 32 | 0.815217 | 14 | 92 | 5.357143 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.119565 | 92 | 5 | 33 | 18.4 | 0.925926 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
fca6c5d49b30baa5b8fde44e4b4e2f786bb79954 | 1,178 | py | Python | spec/construct/test_fixed_struct.py | DarkShadow44/kaitai_struct_tests | 4bb13cef82965cca66dda2eb2b77cd64e9f70a12 | [
"MIT"
] | 11 | 2018-04-01T03:58:15.000Z | 2021-08-14T09:04:55.000Z | spec/construct/test_fixed_struct.py | DarkShadow44/kaitai_struct_tests | 4bb13cef82965cca66dda2eb2b77cd64e9f70a12 | [
"MIT"
] | 73 | 2016-07-20T10:27:15.000Z | 2020-12-17T18:56:46.000Z | spec/construct/test_fixed_struct.py | DarkShadow44/kaitai_struct_tests | 4bb13cef82965cca66dda2eb2b77cd64e9f70a12 | [
"MIT"
] | 37 | 2016-08-15T08:25:56.000Z | 2021-08-28T14:48:46.000Z |
# Autogenerated from KST: please remove this line if doing any edits by hand!

import unittest

from fixed_struct import _schema


class TestFixedStruct(unittest.TestCase):
    def test_fixed_struct(self):
        r = _schema.parse_file('src/fixed_struct.bin')

        self.assertEqual(r.hdr.uint8, 255)
        self.assertEqual(r.hdr.uint16, 65535)
        self.assertEqual(r.hdr.uint32, 4294967295)
        self.assertEqual(r.hdr.uint64, 18446744073709551615)
        self.assertEqual(r.hdr.sint8, -1)
        self.assertEqual(r.hdr.sint16, -1)
        self.assertEqual(r.hdr.sint32, -1)
        self.assertEqual(r.hdr.sint64, -1)
        self.assertEqual(r.hdr.uint16le, 66)
        self.assertEqual(r.hdr.uint32le, 66)
        self.assertEqual(r.hdr.uint64le, 66)
        self.assertEqual(r.hdr.sint16le, -66)
        self.assertEqual(r.hdr.sint32le, -66)
        self.assertEqual(r.hdr.sint64le, -66)
        self.assertEqual(r.hdr.uint16be, 66)
        self.assertEqual(r.hdr.uint32be, 66)
        self.assertEqual(r.hdr.uint64be, 66)
        self.assertEqual(r.hdr.sint16be, -66)
        self.assertEqual(r.hdr.sint32be, -66)
        self.assertEqual(r.hdr.sint64be, -66)
| 39.266667 | 77 | 0.66893 | 157 | 1,178 | 4.974522 | 0.363057 | 0.384123 | 0.409731 | 0.486556 | 0.398207 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 0.205433 | 1,178 | 29 | 78 | 40.62069 | 0.723291 | 0.063667 | 0 | 0 | 1 | 0 | 0.018165 | 0 | 0 | 0 | 0 | 0 | 0.8 | 1 | 0.04 | false | 0 | 0.08 | 0 | 0.16 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
5d81db198ef3ac98f3d761c1b242b341ff185836 | 152 | py | Python | apps/groups/admin.py | aldwyn/effigia | eb456656949bf68934530bbec9c15ebc6d0236b8 | [
"MIT"
] | 1 | 2018-11-15T05:17:30.000Z | 2018-11-15T05:17:30.000Z | apps/groups/admin.py | aldwyn/effigia | eb456656949bf68934530bbec9c15ebc6d0236b8 | [
"MIT"
] | 5 | 2021-06-09T17:20:01.000Z | 2022-03-11T23:18:06.000Z | apps/groups/admin.py | aldwyn/effigia | eb456656949bf68934530bbec9c15ebc6d0236b8 | [
"MIT"
] | 1 | 2018-10-05T19:03:27.000Z | 2018-10-05T19:03:27.000Z |
# -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.contrib import admin
from .models import Group
admin.site.register(Group)
| 19 | 39 | 0.769737 | 21 | 152 | 5.333333 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007576 | 0.131579 | 152 | 7 | 40 | 21.714286 | 0.840909 | 0.138158 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.75 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
5d836bb35426ba2295a5dcd4aa2535c6336263c7 | 59,681 | py | Python | ot/gromov.py | IloneM/POT | 1fdf697039d68bcd4632fd48e22254f15b3456ca | [
"MIT"
] | null | null | null | ot/gromov.py | IloneM/POT | 1fdf697039d68bcd4632fd48e22254f15b3456ca | [
"MIT"
] | null | null | null | ot/gromov.py | IloneM/POT | 1fdf697039d68bcd4632fd48e22254f15b3456ca | [
"MIT"
] | null | null | null |
# -*- coding: utf-8 -*-
"""
Gromov-Wasserstein and Fused-Gromov-Wasserstein solvers
"""
# Author: Erwan Vautier <erwan.vautier@gmail.com>
# Nicolas Courty <ncourty@irisa.fr>
# Rémi Flamary <remi.flamary@unice.fr>
# Titouan Vayer <titouan.vayer@irisa.fr>
#
# License: MIT License
import numpy as np
from .bregman import sinkhorn
from .utils import dist, UndefinedParameter, list_to_array
from .optim import cg
from .lp import emd_1d, emd
from .utils import check_random_state
from .backend import get_backend
def init_matrix(C1, C2, p, q, loss_fun='square_loss'):
    r"""Return loss matrices and tensors for Gromov-Wasserstein fast computation

    Returns the value of :math:`\mathcal{L}(\mathbf{C_1}, \mathbf{C_2}) \otimes \mathbf{T}` with the
    selected loss function as the loss function of Gromov-Wasserstein discrepancy.

    The matrices are computed as described in Proposition 1 in :ref:`[12] <references-init-matrix>`

    Where :

    - :math:`\mathbf{C_1}`: Metric cost matrix in the source space
    - :math:`\mathbf{C_2}`: Metric cost matrix in the target space
    - :math:`\mathbf{T}`: A coupling between those two spaces

    The square-loss function :math:`L(a, b) = |a - b|^2` is read as :

    .. math::

        L(a, b) = f_1(a) + f_2(b) - h_1(a) h_2(b)

        \mathrm{with} \ f_1(a) &= a^2

        f_2(b) &= b^2

        h_1(a) &= a

        h_2(b) &= 2b

    The kl-loss function :math:`L(a, b) = a \log\left(\frac{a}{b}\right) - a + b` is read as :

    .. math::

        L(a, b) = f_1(a) + f_2(b) - h_1(a) h_2(b)

        \mathrm{with} \ f_1(a) &= a \log(a) - a

        f_2(b) &= b

        h_1(a) &= a

        h_2(b) &= \log(b)

    Parameters
    ----------
    C1 : array-like, shape (ns, ns)
        Metric cost matrix in the source space
    C2 : array-like, shape (nt, nt)
        Metric cost matrix in the target space
    T : array-like, shape (ns, nt)
        Coupling between source and target spaces
    p : array-like, shape (ns,)

    Returns
    -------
    constC : array-like, shape (ns, nt)
        Constant :math:`\mathbf{C}` matrix in Eq. (6)
    hC1 : array-like, shape (ns, ns)
        :math:`\mathbf{h1}(\mathbf{C1})` matrix in Eq. (6)
    hC2 : array-like, shape (nt, nt)
        :math:`\mathbf{h2}(\mathbf{C2})` matrix in Eq. (6)


    .. _references-init-matrix:
    References
    ----------
    .. [12] Gabriel Peyré, Marco Cuturi, and Justin Solomon,
        "Gromov-Wasserstein averaging of kernel and distance matrices."
        International Conference on Machine Learning (ICML). 2016.
    """
    C1, C2, p, q = list_to_array(C1, C2, p, q)
    nx = get_backend(C1, C2, p, q)

    if loss_fun == 'square_loss':
        def f1(a):
            return (a**2)

        def f2(b):
            return (b**2)

        def h1(a):
            return a

        def h2(b):
            return 2 * b
    elif loss_fun == 'kl_loss':
        def f1(a):
            return a * nx.log(a + 1e-15) - a

        def f2(b):
            return b

        def h1(a):
            return a

        def h2(b):
            return nx.log(b + 1e-15)

    constC1 = nx.dot(
        nx.dot(f1(C1), nx.reshape(p, (-1, 1))),
        nx.ones((1, len(q)), type_as=q)
    )
    constC2 = nx.dot(
        nx.ones((len(p), 1), type_as=p),
        nx.dot(nx.reshape(q, (1, -1)), f2(C2).T)
    )
    constC = constC1 + constC2
    hC1 = h1(C1)
    hC2 = h2(C2)

    return constC, hC1, hC2
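The square-loss decomposition in the docstring above, :math:`L(a, b) = f_1(a) + f_2(b) - h_1(a) h_2(b)`, can be verified numerically: with :math:`f_1(a) = a^2`, :math:`f_2(b) = b^2`, :math:`h_1(a) = a`, :math:`h_2(b) = 2b`, the right-hand side collapses to :math:`a^2 + b^2 - 2ab = (a - b)^2`. A scalar sanity check:

```python
# Square-loss decomposition from Proposition 1 of [12].
def f1(a):
    return a ** 2


def f2(b):
    return b ** 2


def h1(a):
    return a


def h2(b):
    return 2 * b


# f1(a) + f2(b) - h1(a) * h2(b) should equal the direct loss (a - b)**2.
a, b = 3.0, 5.0
decomposed = f1(a) + f2(b) - h1(a) * h2(b)
direct = (a - b) ** 2
```

This decomposition is exactly what makes the fast computation possible: the `constC` term depends only on the marginals, so only the :math:`h_1(\mathbf{C_1}) \mathbf{T} h_2(\mathbf{C_2})^\top` product has to be recomputed for each new coupling.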
def tensor_product(constC, hC1, hC2, T):
    r"""Return the tensor for Gromov-Wasserstein fast computation

    The tensor is computed as described in Proposition 1 Eq. (6) in :ref:`[12] <references-tensor-product>`

    Parameters
    ----------
    constC : array-like, shape (ns, nt)
        Constant :math:`\mathbf{C}` matrix in Eq. (6)
    hC1 : array-like, shape (ns, ns)
        :math:`\mathbf{h1}(\mathbf{C1})` matrix in Eq. (6)
    hC2 : array-like, shape (nt, nt)
        :math:`\mathbf{h2}(\mathbf{C2})` matrix in Eq. (6)

    Returns
    -------
    tens : array-like, shape (`ns`, `nt`)
        :math:`\mathcal{L}(\mathbf{C_1}, \mathbf{C_2}) \otimes \mathbf{T}` tensor-matrix multiplication result


    .. _references-tensor-product:
    References
    ----------
    .. [12] Gabriel Peyré, Marco Cuturi, and Justin Solomon,
        "Gromov-Wasserstein averaging of kernel and distance matrices."
        International Conference on Machine Learning (ICML). 2016.
    """
    constC, hC1, hC2, T = list_to_array(constC, hC1, hC2, T)
    nx = get_backend(constC, hC1, hC2, T)

    A = - nx.dot(
        nx.dot(hC1, T), hC2.T
    )
    tens = constC + A
    # tens -= tens.min()
    return tens
def gwloss(constC, hC1, hC2, T):
    r"""Return the Loss for Gromov-Wasserstein

    The loss is computed as described in Proposition 1 Eq. (6) in :ref:`[12] <references-gwloss>`

    Parameters
    ----------
    constC : array-like, shape (ns, nt)
        Constant :math:`\mathbf{C}` matrix in Eq. (6)
    hC1 : array-like, shape (ns, ns)
        :math:`\mathbf{h1}(\mathbf{C1})` matrix in Eq. (6)
    hC2 : array-like, shape (nt, nt)
        :math:`\mathbf{h2}(\mathbf{C2})` matrix in Eq. (6)
    T : array-like, shape (ns, nt)
        Current value of transport matrix :math:`\mathbf{T}`

    Returns
    -------
    loss : float
        Gromov Wasserstein loss


    .. _references-gwloss:
    References
    ----------
    .. [12] Gabriel Peyré, Marco Cuturi, and Justin Solomon,
        "Gromov-Wasserstein averaging of kernel and distance matrices."
        International Conference on Machine Learning (ICML). 2016.
    """
    tens = tensor_product(constC, hC1, hC2, T)

    tens, T = list_to_array(tens, T)
    nx = get_backend(tens, T)

    return nx.sum(tens * T)
def gwggrad(constC, hC1, hC2, T):
    r"""Return the gradient for Gromov-Wasserstein

    The gradient is computed as described in Proposition 2 in :ref:`[12] <references-gwggrad>`

    Parameters
    ----------
    constC : array-like, shape (ns, nt)
        Constant :math:`\mathbf{C}` matrix in Eq. (6)
    hC1 : array-like, shape (ns, ns)
        :math:`\mathbf{h1}(\mathbf{C1})` matrix in Eq. (6)
    hC2 : array-like, shape (nt, nt)
        :math:`\mathbf{h2}(\mathbf{C2})` matrix in Eq. (6)
    T : array-like, shape (ns, nt)
        Current value of transport matrix :math:`\mathbf{T}`

    Returns
    -------
    grad : array-like, shape (`ns`, `nt`)
        Gromov Wasserstein gradient


    .. _references-gwggrad:
    References
    ----------
    .. [12] Gabriel Peyré, Marco Cuturi, and Justin Solomon,
        "Gromov-Wasserstein averaging of kernel and distance matrices."
        International Conference on Machine Learning (ICML). 2016.
    """
    return 2 * tensor_product(constC, hC1, hC2,
                              T)  # [12] Prop. 2 misses a 2 factor
def update_square_loss(p, lambdas, T, Cs):
    r"""
    Updates :math:`\mathbf{C}` according to the L2 Loss kernel with the `S` :math:`\mathbf{T}_s`
    couplings calculated at each iteration

    Parameters
    ----------
    p : array-like, shape (N,)
        Masses in the targeted barycenter.
    lambdas : list of float
        List of the `S` spaces' weights.
    T : list of S array-like of shape (ns,N)
        The `S` :math:`\mathbf{T}_s` couplings calculated at each iteration.
    Cs : list of S array-like, shape(ns,ns)
        Metric cost matrices.

    Returns
    ----------
    C : array-like, shape (`nt`, `nt`)
        Updated :math:`\mathbf{C}` matrix.
    """
    T = list_to_array(*T)
    Cs = list_to_array(*Cs)
    p = list_to_array(p)
    nx = get_backend(p, *T, *Cs)

    tmpsum = sum([
        lambdas[s] * nx.dot(
            nx.dot(T[s].T, Cs[s]),
            T[s]
        ) for s in range(len(T))
    ])
    ppt = nx.outer(p, p)

    return tmpsum / ppt
def update_kl_loss(p, lambdas, T, Cs):
r"""
Updates :math:`\mathbf{C}` according to the KL Loss kernel with the `S` :math:`\mathbf{T}_s` couplings calculated at each iteration
Parameters
----------
p : array-like, shape (N,)
Weights in the targeted barycenter.
lambdas : list of float
List of the `S` spaces' weights
T : list of S array-like of shape (ns,N)
The `S` :math:`\mathbf{T}_s` couplings calculated at each iteration.
Cs : list of S array-like, shape(ns,ns)
Metric cost matrices.
Returns
-------
C : array-like, shape (`N`, `N`)
Updated :math:`\mathbf{C}` matrix.
"""
Cs = list_to_array(*Cs)
T = list_to_array(*T)
p = list_to_array(p)
nx = get_backend(p, *T, *Cs)
tmpsum = sum([
lambdas[s] * nx.dot(
nx.dot(T[s].T, Cs[s]),
T[s]
) for s in range(len(T))
])
ppt = nx.outer(p, p)
return nx.exp(tmpsum / ppt)
def gromov_wasserstein(C1, C2, p, q, loss_fun='square_loss', log=False, armijo=False, **kwargs):
r"""
Returns the Gromov-Wasserstein transport between :math:`(\mathbf{C_1}, \mathbf{p})` and :math:`(\mathbf{C_2}, \mathbf{q})`
The function solves the following optimization problem:
.. math::
\mathbf{GW} = \mathop{\arg \min}_\mathbf{T} \quad \sum_{i,j,k,l}
L(\mathbf{C_1}_{i,k}, \mathbf{C_2}_{j,l}) \mathbf{T}_{i,j} \mathbf{T}_{k,l}
Where :
- :math:`\mathbf{C_1}`: Metric cost matrix in the source space
- :math:`\mathbf{C_2}`: Metric cost matrix in the target space
- :math:`\mathbf{p}`: distribution in the source space
- :math:`\mathbf{q}`: distribution in the target space
- `L`: loss function to account for the misfit between the similarity matrices
Parameters
----------
C1 : array-like, shape (ns, ns)
Metric cost matrix in the source space
C2 : array-like, shape (nt, nt)
Metric cost matrix in the target space
p : array-like, shape (ns,)
Distribution in the source space
q : array-like, shape (nt,)
Distribution in the target space
loss_fun : str
Loss function used for the solver, either 'square_loss' or 'kl_loss'
max_iter : int, optional
Max number of iterations
tol : float, optional
Stop threshold on error (>0)
verbose : bool, optional
Print information along iterations
log : bool, optional
record log if True
armijo : bool, optional
If True, the step of the line-search is found via an Armijo line search. Otherwise the closed form is used.
If there are convergence issues, use False.
**kwargs : dict
parameters can be directly passed to the ot.optim.cg solver
Returns
-------
T : array-like, shape (`ns`, `nt`)
Coupling between the two spaces that minimizes:
:math:`\sum_{i,j,k,l} L(\mathbf{C_1}_{i,k}, \mathbf{C_2}_{j,l}) \mathbf{T}_{i,j} \mathbf{T}_{k,l}`
log : dict
Convergence information and loss.
References
----------
.. [12] Gabriel Peyré, Marco Cuturi, and Justin Solomon,
"Gromov-Wasserstein averaging of kernel and distance matrices."
International Conference on Machine Learning (ICML). 2016.
.. [13] Mémoli, Facundo. Gromov–Wasserstein distances and the
metric approach to object matching. Foundations of computational
mathematics 11.4 (2011): 417-487.
"""
p, q = list_to_array(p, q)
p0, q0, C10, C20 = p, q, C1, C2
nx = get_backend(p0, q0, C10, C20)
p = nx.to_numpy(p)
q = nx.to_numpy(q)
C1 = nx.to_numpy(C10)
C2 = nx.to_numpy(C20)
constC, hC1, hC2 = init_matrix(C1, C2, p, q, loss_fun)
G0 = p[:, None] * q[None, :]
def f(G):
return gwloss(constC, hC1, hC2, G)
def df(G):
return gwggrad(constC, hC1, hC2, G)
if log:
res, log = cg(p, q, 0, 1, f, df, G0, log=True, armijo=armijo, C1=C1, C2=C2, constC=constC, **kwargs)
log['gw_dist'] = nx.from_numpy(gwloss(constC, hC1, hC2, res), type_as=C10)
log['u'] = nx.from_numpy(log['u'], type_as=C10)
log['v'] = nx.from_numpy(log['v'], type_as=C10)
return nx.from_numpy(res, type_as=C10), log
else:
return nx.from_numpy(cg(p, q, 0, 1, f, df, G0, armijo=armijo, C1=C1, C2=C2, constC=constC, log=False, **kwargs), type_as=C10)
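For intuition, the squared-loss decomposition from Proposition 1 of [12] that `init_matrix`, `gwloss` and `gwggrad` rely on (with :math:`f_1(a)=a^2`, :math:`f_2(b)=b^2`, :math:`h_1(a)=a`, :math:`h_2(b)=2b`) can be checked against a naive evaluation of the objective. A minimal NumPy sketch with hypothetical function names; it assumes `T` has marginals exactly `p` and `q`:

```python
import numpy as np

def gw_objective_naive(C1, C2, T):
    # direct O(ns^2 nt^2) evaluation of sum_{ijkl} (C1_ik - C2_jl)^2 T_ij T_kl
    L = (C1[:, None, :, None] - C2[None, :, None, :]) ** 2
    return np.einsum('ijkl,ij,kl->', L, T, T)

def gw_objective_decomposed(C1, C2, p, q, T):
    # constC = f1(C1) p 1^T + 1 q^T f2(C2)^T ; tens = constC - h1(C1) T h2(C2)^T
    constC = ((C1 ** 2) @ np.outer(p, np.ones_like(q))
              + np.outer(np.ones_like(p), q) @ (C2 ** 2).T)
    tens = constC - C1 @ T @ (2 * C2).T
    return np.sum(tens * T)

rng = np.random.RandomState(0)
ns, nt = 4, 3
p, q = np.full(ns, 1 / ns), np.full(nt, 1 / nt)
C1 = rng.rand(ns, ns); C1 = C1 + C1.T
C2 = rng.rand(nt, nt); C2 = C2 + C2.T
T = np.outer(p, q)  # any coupling with marginals p and q works
```

Both evaluations agree to machine precision, while the decomposed form only costs :math:`O(n_s^2 n_t + n_s n_t^2)`.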
def gromov_wasserstein2(C1, C2, p, q, loss_fun='square_loss', log=False, armijo=False, **kwargs):
r"""
Returns the Gromov-Wasserstein discrepancy between :math:`(\mathbf{C_1}, \mathbf{p})` and :math:`(\mathbf{C_2}, \mathbf{q})`
The function solves the following optimization problem:
.. math::
GW = \min_\mathbf{T} \quad \sum_{i,j,k,l}
L(\mathbf{C_1}_{i,k}, \mathbf{C_2}_{j,l}) \mathbf{T}_{i,j} \mathbf{T}_{k,l}
Where :
- :math:`\mathbf{C_1}`: Metric cost matrix in the source space
- :math:`\mathbf{C_2}`: Metric cost matrix in the target space
- :math:`\mathbf{p}`: distribution in the source space
- :math:`\mathbf{q}`: distribution in the target space
- `L`: loss function to account for the misfit between the similarity
matrices
Note that when using backends, this loss function is differentiable w.r.t. the
matrices and weights for the quadratic loss, using the gradients from [38]_.
Parameters
----------
C1 : array-like, shape (ns, ns)
Metric cost matrix in the source space
C2 : array-like, shape (nt, nt)
Metric cost matrix in the target space
p : array-like, shape (ns,)
Distribution in the source space.
q : array-like, shape (nt,)
Distribution in the target space.
loss_fun : str
loss function used for the solver either 'square_loss' or 'kl_loss'
max_iter : int, optional
Max number of iterations
tol : float, optional
Stop threshold on error (>0)
verbose : bool, optional
Print information along iterations
log : bool, optional
record log if True
armijo : bool, optional
If True, the step of the line-search is found via an Armijo line search. Otherwise the closed form is used.
If there are convergence issues, use False.
Returns
-------
gw_dist : float
Gromov-Wasserstein distance
log : dict
Convergence information and coupling matrix.
References
----------
.. [12] Gabriel Peyré, Marco Cuturi, and Justin Solomon,
"Gromov-Wasserstein averaging of kernel and distance matrices."
International Conference on Machine Learning (ICML). 2016.
.. [13] Mémoli, Facundo. Gromov–Wasserstein distances and the
metric approach to object matching. Foundations of computational
mathematics 11.4 (2011): 417-487.
.. [38] C. Vincent-Cuaz, T. Vayer, R. Flamary, M. Corneli, N. Courty, Online
Graph Dictionary Learning, International Conference on Machine Learning
(ICML), 2021.
"""
p, q = list_to_array(p, q)
p0, q0, C10, C20 = p, q, C1, C2
nx = get_backend(p0, q0, C10, C20)
p = nx.to_numpy(p)
q = nx.to_numpy(q)
C1 = nx.to_numpy(C10)
C2 = nx.to_numpy(C20)
constC, hC1, hC2 = init_matrix(C1, C2, p, q, loss_fun)
G0 = p[:, None] * q[None, :]
def f(G):
return gwloss(constC, hC1, hC2, G)
def df(G):
return gwggrad(constC, hC1, hC2, G)
T, log_gw = cg(p, q, 0, 1, f, df, G0, log=True, armijo=armijo, C1=C1, C2=C2, constC=constC, **kwargs)
T0 = nx.from_numpy(T, type_as=C10)
log_gw['gw_dist'] = nx.from_numpy(gwloss(constC, hC1, hC2, T), type_as=C10)
log_gw['u'] = nx.from_numpy(log_gw['u'], type_as=C10)
log_gw['v'] = nx.from_numpy(log_gw['v'], type_as=C10)
log_gw['T'] = T0
gw = log_gw['gw_dist']
if loss_fun == 'square_loss':
gC1 = nx.from_numpy(2 * C1 * (p[:, None] * p[None, :]) - 2 * T.dot(C2).dot(T.T))
gC2 = nx.from_numpy(2 * C2 * (q[:, None] * q[None, :]) - 2 * T.T.dot(C1).dot(T))
gw = nx.set_gradients(gw, (p0, q0, C10, C20),
(log_gw['u'], log_gw['v'], gC1, gC2))
if log:
return gw, log_gw
else:
return gw
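The closed-form gradient used above for 'square_loss', :math:`2\,\mathbf{C_1} \odot (\mathbf{p}\mathbf{p}^T) - 2\,\mathbf{T}\mathbf{C_2}\mathbf{T}^T`, is the partial gradient of the GW objective at fixed :math:`\mathbf{T}` (applied at the optimal plan via the envelope theorem, as in [38]). It can be sanity-checked against finite differences; a hypothetical NumPy-only sketch:

```python
import numpy as np

def gw_objective_naive(C1, C2, T):
    # sum_{ijkl} (C1_ik - C2_jl)^2 T_ij T_kl
    L = (C1[:, None, :, None] - C2[None, :, None, :]) ** 2
    return np.einsum('ijkl,ij,kl->', L, T, T)

def grad_C1(C1, C2, p, T):
    # closed form at fixed T: 2 * C1 * (p p^T) - 2 * T C2 T^T
    return 2 * C1 * np.outer(p, p) - 2 * T @ C2 @ T.T

rng = np.random.RandomState(1)
ns, nt = 3, 4
p, q = np.full(ns, 1 / ns), np.full(nt, 1 / nt)
C1 = rng.rand(ns, ns); C2 = rng.rand(nt, nt)
T = np.outer(p, q)
# finite-difference check of one coordinate of the gradient
eps = 1e-6
E = np.zeros_like(C1); E[0, 1] = eps
fd = (gw_objective_naive(C1 + E, C2, T) - gw_objective_naive(C1, C2, T)) / eps
```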
def fused_gromov_wasserstein(M, C1, C2, p, q, loss_fun='square_loss', alpha=0.5, armijo=False, log=False, **kwargs):
r"""
Computes the FGW transport between two graphs (see :ref:`[24] <references-fused-gromov-wasserstein>`)
.. math::
\gamma = \mathop{\arg \min}_\gamma \quad (1 - \alpha) \langle \gamma, \mathbf{M} \rangle_F +
\alpha \sum_{i,j,k,l} L(\mathbf{C_1}_{i,k}, \mathbf{C_2}_{j,l}) \mathbf{T}_{i,j} \mathbf{T}_{k,l}
s.t. \ \mathbf{\gamma} \mathbf{1} &= \mathbf{p}
\mathbf{\gamma}^T \mathbf{1} &= \mathbf{q}
\mathbf{\gamma} &\geq 0
where :
- :math:`\mathbf{M}` is the (`ns`, `nt`) metric cost matrix
- :math:`\mathbf{p}` and :math:`\mathbf{q}` are source and target weights (sum to 1)
- `L` is a loss function to account for the misfit between the similarity matrices
The algorithm used for solving the problem is conditional gradient as discussed in :ref:`[24] <references-fused-gromov-wasserstein>`
Parameters
----------
M : array-like, shape (ns, nt)
Metric cost matrix between features across domains
C1 : array-like, shape (ns, ns)
Metric cost matrix representative of the structure in the source space
C2 : array-like, shape (nt, nt)
Metric cost matrix representative of the structure in the target space
p : array-like, shape (ns,)
Distribution in the source space
q : array-like, shape (nt,)
Distribution in the target space
loss_fun : str, optional
Loss function used for the solver
alpha : float, optional
Trade-off parameter (0 < alpha < 1)
armijo : bool, optional
If True, the step of the line-search is found via an Armijo line search. Otherwise the closed form is used.
If there are convergence issues, use False.
log : bool, optional
record log if True
**kwargs : dict
parameters can be directly passed to the ot.optim.cg solver
Returns
-------
gamma : array-like, shape (`ns`, `nt`)
Optimal transportation matrix for the given parameters.
log : dict
Log dictionary returned only if `log` is True.
.. _references-fused-gromov-wasserstein:
References
----------
.. [24] Vayer Titouan, Chapel Laetitia, Flamary Rémi, Tavenard Romain
and Courty Nicolas "Optimal Transport for structured data with
application on graphs", International Conference on Machine Learning
(ICML). 2019.
"""
p, q = list_to_array(p, q)
p0, q0, C10, C20, M0 = p, q, C1, C2, M
nx = get_backend(p0, q0, C10, C20, M0)
p = nx.to_numpy(p)
q = nx.to_numpy(q)
C1 = nx.to_numpy(C10)
C2 = nx.to_numpy(C20)
M = nx.to_numpy(M0)
constC, hC1, hC2 = init_matrix(C1, C2, p, q, loss_fun)
G0 = p[:, None] * q[None, :]
def f(G):
return gwloss(constC, hC1, hC2, G)
def df(G):
return gwggrad(constC, hC1, hC2, G)
if log:
res, log = cg(p, q, (1 - alpha) * M, alpha, f, df, G0, armijo=armijo, C1=C1, C2=C2, constC=constC, log=True, **kwargs)
fgw_dist = nx.from_numpy(log['loss'][-1], type_as=C10)
log['fgw_dist'] = fgw_dist
log['u'] = nx.from_numpy(log['u'], type_as=C10)
log['v'] = nx.from_numpy(log['v'], type_as=C10)
return nx.from_numpy(res, type_as=C10), log
else:
return nx.from_numpy(cg(p, q, (1 - alpha) * M, alpha, f, df, G0, armijo=armijo, C1=C1, C2=C2, constC=constC, **kwargs), type_as=C10)
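The FGW objective solved above interpolates between a pure Wasserstein term (:math:`\alpha = 0`) and a pure Gromov-Wasserstein term (:math:`\alpha = 1`). A naive NumPy evaluation for the squared structure loss (hypothetical helper, for illustration only):

```python
import numpy as np

def fgw_objective_naive(M, C1, C2, T, alpha):
    # (1 - alpha) <T, M>_F + alpha * sum_{ijkl} (C1_ik - C2_jl)^2 T_ij T_kl
    lin = np.sum(M * T)
    L = (C1[:, None, :, None] - C2[None, :, None, :]) ** 2
    quad = np.einsum('ijkl,ij,kl->', L, T, T)
    return (1 - alpha) * lin + alpha * quad

rng = np.random.RandomState(2)
ns, nt = 3, 3
p, q = np.full(ns, 1 / ns), np.full(nt, 1 / nt)
M = rng.rand(ns, nt)
C1 = rng.rand(ns, ns); C2 = rng.rand(nt, nt)
T = np.outer(p, q)
```

At `alpha=0` the value reduces to the linear OT cost `<T, M>`, which is the boundary case worth checking first.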
def fused_gromov_wasserstein2(M, C1, C2, p, q, loss_fun='square_loss', alpha=0.5, armijo=False, log=False, **kwargs):
r"""
Computes the FGW distance between two graphs (see :ref:`[24] <references-fused-gromov-wasserstein2>`)
.. math::
\min_\gamma \quad (1 - \alpha) \langle \gamma, \mathbf{M} \rangle_F + \alpha \sum_{i,j,k,l}
L(\mathbf{C_1}_{i,k}, \mathbf{C_2}_{j,l}) \mathbf{T}_{i,j} \mathbf{T}_{k,l}
s.t. \ \mathbf{\gamma} \mathbf{1} &= \mathbf{p}
\mathbf{\gamma}^T \mathbf{1} &= \mathbf{q}
\mathbf{\gamma} &\geq 0
where :
- :math:`\mathbf{M}` is the (`ns`, `nt`) metric cost matrix
- :math:`\mathbf{p}` and :math:`\mathbf{q}` are source and target weights (sum to 1)
- `L` is a loss function to account for the misfit between the similarity matrices
The algorithm used for solving the problem is conditional gradient as
discussed in :ref:`[24] <references-fused-gromov-wasserstein2>`
Note that when using backends, this loss function is differentiable w.r.t. the
matrices and weights for the quadratic loss, using the gradients from [38]_.
Parameters
----------
M : array-like, shape (ns, nt)
Metric cost matrix between features across domains
C1 : array-like, shape (ns, ns)
Metric cost matrix representative of the structure in the source space.
C2 : array-like, shape (nt, nt)
Metric cost matrix representative of the structure in the target space.
p : array-like, shape (ns,)
Distribution in the source space.
q : array-like, shape (nt,)
Distribution in the target space.
loss_fun : str, optional
Loss function used for the solver.
alpha : float, optional
Trade-off parameter (0 < alpha < 1)
armijo : bool, optional
If True, the step of the line-search is found via an Armijo line search.
Otherwise the closed form is used. If there are convergence issues, use False.
log : bool, optional
Record log if True.
**kwargs : dict
Parameters can be directly passed to the ot.optim.cg solver.
Returns
-------
fgw_dist : float
Fused Gromov-Wasserstein distance for the given parameters.
log : dict
Log dictionary returned only if `log` is True.
.. _references-fused-gromov-wasserstein2:
References
----------
.. [24] Vayer Titouan, Chapel Laetitia, Flamary Rémi, Tavenard Romain
and Courty Nicolas
"Optimal Transport for structured data with application on graphs"
International Conference on Machine Learning (ICML). 2019.
.. [38] C. Vincent-Cuaz, T. Vayer, R. Flamary, M. Corneli, N. Courty, Online
Graph Dictionary Learning, International Conference on Machine Learning
(ICML), 2021.
"""
p, q = list_to_array(p, q)
p0, q0, C10, C20, M0 = p, q, C1, C2, M
nx = get_backend(p0, q0, C10, C20, M0)
p = nx.to_numpy(p)
q = nx.to_numpy(q)
C1 = nx.to_numpy(C10)
C2 = nx.to_numpy(C20)
M = nx.to_numpy(M0)
constC, hC1, hC2 = init_matrix(C1, C2, p, q, loss_fun)
G0 = p[:, None] * q[None, :]
def f(G):
return gwloss(constC, hC1, hC2, G)
def df(G):
return gwggrad(constC, hC1, hC2, G)
T, log_fgw = cg(p, q, (1 - alpha) * M, alpha, f, df, G0, armijo=armijo, C1=C1, C2=C2, constC=constC, log=True, **kwargs)
fgw_dist = nx.from_numpy(log_fgw['loss'][-1], type_as=C10)
T0 = nx.from_numpy(T, type_as=C10)
log_fgw['fgw_dist'] = fgw_dist
log_fgw['u'] = nx.from_numpy(log_fgw['u'], type_as=C10)
log_fgw['v'] = nx.from_numpy(log_fgw['v'], type_as=C10)
log_fgw['T'] = T0
if loss_fun == 'square_loss':
gC1 = nx.from_numpy(2 * C1 * (p[:, None] * p[None, :]) - 2 * T.dot(C2).dot(T.T))
gC2 = nx.from_numpy(2 * C2 * (q[:, None] * q[None, :]) - 2 * T.T.dot(C1).dot(T))
fgw_dist = nx.set_gradients(fgw_dist, (p0, q0, C10, C20, M0),
(log_fgw['u'], log_fgw['v'], alpha * gC1, alpha * gC2, (1 - alpha) * T0))
if log:
return fgw_dist, log_fgw
else:
return fgw_dist
def GW_distance_estimation(C1, C2, p, q, loss_fun, T,
nb_samples_p=None, nb_samples_q=None, std=True, random_state=None):
r"""
Returns an approximation of the Gromov-Wasserstein cost between :math:`(\mathbf{C_1}, \mathbf{p})` and :math:`(\mathbf{C_2}, \mathbf{q})`
with a fixed transport plan :math:`\mathbf{T}`.
The function gives an unbiased approximation of the following equation:
.. math::
GW = \sum_{i,j,k,l} L(\mathbf{C_{1}}_{i,k}, \mathbf{C_{2}}_{j,l}) \mathbf{T}_{i,j} \mathbf{T}_{k,l}
Where :
- :math:`\mathbf{C_1}`: Metric cost matrix in the source space
- :math:`\mathbf{C_2}`: Metric cost matrix in the target space
- `L` : Loss function to account for the misfit between the similarity matrices
- :math:`\mathbf{T}`: Matrix with marginal :math:`\mathbf{p}` and :math:`\mathbf{q}`
Parameters
----------
C1 : array-like, shape (ns, ns)
Metric cost matrix in the source space
C2 : array-like, shape (nt, nt)
Metric cost matrix in the target space
p : array-like, shape (ns,)
Distribution in the source space
q : array-like, shape (nt,)
Distribution in the target space
loss_fun : function: :math:`\mathbb{R} \times \mathbb{R} \mapsto \mathbb{R}`
Loss function used for the distance; the transport plan does not depend on the loss function
T : csr or array-like, shape (ns, nt)
Transport plan matrix, either a sparse csr or a dense matrix
nb_samples_p : int, optional
`nb_samples_p` is the number of samples (without replacement) along the first dimension of :math:`\mathbf{T}`
nb_samples_q : int, optional
`nb_samples_q` is the number of samples along the second dimension of :math:`\mathbf{T}`, for each sample along the first
std : bool, optional
If True, also return the standard deviation associated with the estimate of the Gromov-Wasserstein cost.
random_state : int or RandomState instance, optional
Fix the seed for reproducibility
Returns
-------
gw_dist : float
Gromov-Wasserstein cost
References
----------
.. [14] Kerdoncuff, Tanguy, Emonet, Rémi, Sebban, Marc
"Sampled Gromov Wasserstein."
Machine Learning Journal (MLJ). 2021.
"""
C1, C2, p, q = list_to_array(C1, C2, p, q)
nx = get_backend(C1, C2, p, q)
generator = check_random_state(random_state)
len_p = p.shape[0]
len_q = q.shape[0]
# It is always better to sample from the biggest distribution first.
if len_p < len_q:
p, q = q, p
len_p, len_q = len_q, len_p
C1, C2 = C2, C1
T = T.T
if nb_samples_p is None:
if nx.issparse(T):
# If T is sparse, it probably means that PoGroW was used, so the number of samples is reduced
nb_samples_p = min(int(5 * (len_p * np.log(len_p)) ** 0.5), len_p)
else:
nb_samples_p = len_p
else:
# Sampling along the first dimension is done without replacement, so cap the number of samples at len_p.
nb_samples_p = min(nb_samples_p, len_p)
if nb_samples_q is None:
nb_samples_q = 1
if std:
nb_samples_q = max(2, nb_samples_q)
index_k = np.zeros((nb_samples_p, nb_samples_q), dtype=int)
index_l = np.zeros((nb_samples_p, nb_samples_q), dtype=int)
index_i = generator.choice(len_p, size=nb_samples_p, p=p, replace=False)
index_j = generator.choice(len_p, size=nb_samples_p, p=p, replace=False)
for i in range(nb_samples_p):
if nx.issparse(T):
T_indexi = nx.reshape(nx.todense(T[index_i[i], :]), (-1,))
T_indexj = nx.reshape(nx.todense(T[index_j[i], :]), (-1,))
else:
T_indexi = T[index_i[i], :]
T_indexj = T[index_j[i], :]
# For each of the row sampled, the column is sampled.
index_k[i] = generator.choice(
len_q,
size=nb_samples_q,
p=T_indexi / nx.sum(T_indexi),
replace=True
)
index_l[i] = generator.choice(
len_q,
size=nb_samples_q,
p=T_indexj / nx.sum(T_indexj),
replace=True
)
list_value_sample = nx.stack([
loss_fun(
C1[np.ix_(index_i, index_j)],
C2[np.ix_(index_k[:, n], index_l[:, n])]
) for n in range(nb_samples_q)
], axis=2)
if std:
std_value = nx.sum(nx.std(list_value_sample, axis=2) ** 2) ** 0.5
return nx.mean(list_value_sample), std_value / (nb_samples_p * nb_samples_p)
else:
return nx.mean(list_value_sample)
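The estimator above exploits the fact that the GW cost is an expectation over two independent draws from the coupling: if :math:`(i,j) \sim \mathbf{T}` and :math:`(k,l) \sim \mathbf{T}` independently, then :math:`\mathbb{E}[L(\mathbf{C_1}_{ik}, \mathbf{C_2}_{jl})] = \sum_{ijkl} L(\mathbf{C_1}_{ik}, \mathbf{C_2}_{jl}) \mathbf{T}_{ij} \mathbf{T}_{kl}`. A simpler i.i.d. variant of this idea (the function above instead samples rows without replacement and conditions column draws on them), sketched in NumPy for the squared loss with a hypothetical name:

```python
import numpy as np

def gw_cost_mc(C1, C2, T, n_samples=100000, seed=0):
    # Unbiased Monte-Carlo estimate of sum_{ijkl} (C1_ik - C2_jl)^2 T_ij T_kl:
    # draw pairs (i, j) and (k, l) independently from the coupling T, average the loss.
    rng = np.random.RandomState(seed)
    ns, nt = T.shape
    flat = (T / T.sum()).ravel()
    ij = rng.choice(ns * nt, size=n_samples, p=flat)
    kl = rng.choice(ns * nt, size=n_samples, p=flat)
    i, j = np.divmod(ij, nt)
    k, l = np.divmod(kl, nt)
    return np.mean((C1[i, k] - C2[j, l]) ** 2)

rng = np.random.RandomState(3)
ns, nt = 4, 3
p, q = np.full(ns, 1 / ns), np.full(nt, 1 / nt)
C1 = rng.rand(ns, ns); C2 = rng.rand(nt, nt)
T = np.outer(p, q)
exact = np.einsum('ijkl,ij,kl->',
                  (C1[:, None, :, None] - C2[None, :, None, :]) ** 2, T, T)
estimate = gw_cost_mc(C1, C2, T)
```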
def pointwise_gromov_wasserstein(C1, C2, p, q, loss_fun,
alpha=1, max_iter=100, threshold_plan=0, log=False, verbose=False, random_state=None):
r"""
Returns the Gromov-Wasserstein transport between :math:`(\mathbf{C_1}, \mathbf{p})` and :math:`(\mathbf{C_2}, \mathbf{q})` using a stochastic Frank-Wolfe.
This method has a :math:`\mathcal{O}(\mathrm{max\_iter} \times PN^2)` time complexity with `P` the number of Sinkhorn iterations.
The function solves the following optimization problem:
.. math::
\mathbf{GW} = \mathop{\arg \min}_\mathbf{T} \quad \sum_{i,j,k,l}
L(\mathbf{C_1}_{i,k}, \mathbf{C_2}_{j,l}) \mathbf{T}_{i,j} \mathbf{T}_{k,l}
s.t. \ \mathbf{T} \mathbf{1} &= \mathbf{p}
\mathbf{T}^T \mathbf{1} &= \mathbf{q}
\mathbf{T} &\geq 0
Where :
- :math:`\mathbf{C_1}`: Metric cost matrix in the source space
- :math:`\mathbf{C_2}`: Metric cost matrix in the target space
- :math:`\mathbf{p}`: distribution in the source space
- :math:`\mathbf{q}`: distribution in the target space
- `L`: loss function to account for the misfit between the similarity matrices
Parameters
----------
C1 : array-like, shape (ns, ns)
Metric cost matrix in the source space
C2 : array-like, shape (nt, nt)
Metric cost matrix in the target space
p : array-like, shape (ns,)
Distribution in the source space
q : array-like, shape (nt,)
Distribution in the target space
loss_fun : function: :math:`\mathbb{R} \times \mathbb{R} \mapsto \mathbb{R}`
Loss function used for the distance; the transport plan does not depend on the loss function
alpha : float
Step of the Frank-Wolfe algorithm, should be between 0 and 1
max_iter : int, optional
Max number of iterations
threshold_plan : float, optional
Threshold below which values of the transport plan are set to zero. If above zero, the marginal constraints are violated.
verbose : bool, optional
Print information along iterations
log : bool, optional
If True, also return the estimated distance and its standard deviation.
random_state : int or RandomState instance, optional
Fix the seed for reproducibility
Returns
-------
T : array-like, shape (`ns`, `nt`)
Optimal coupling between the two spaces
References
----------
.. [14] Kerdoncuff, Tanguy, Emonet, Rémi, Sebban, Marc
"Sampled Gromov Wasserstein."
Machine Learning Journal (MLJ). 2021.
"""
C1, C2, p, q = list_to_array(C1, C2, p, q)
nx = get_backend(C1, C2, p, q)
len_p = p.shape[0]
len_q = q.shape[0]
generator = check_random_state(random_state)
index = np.zeros(2, dtype=int)
# Initialize with default marginal
index[0] = generator.choice(len_p, size=1, p=p)
index[1] = generator.choice(len_q, size=1, p=q)
T = nx.tocsr(emd_1d(C1[index[0]], C2[index[1]], a=p, b=q, dense=False))
best_gw_dist_estimated = np.inf
for cpt in range(max_iter):
index[0] = generator.choice(len_p, size=1, p=p)
T_index0 = nx.reshape(nx.todense(T[index[0], :]), (-1,))
index[1] = generator.choice(len_q, size=1, p=T_index0 / T_index0.sum())
if alpha == 1:
T = nx.tocsr(
emd_1d(C1[index[0]], C2[index[1]], a=p, b=q, dense=False)
)
else:
new_T = nx.tocsr(
emd_1d(C1[index[0]], C2[index[1]], a=p, b=q, dense=False)
)
T = (1 - alpha) * T + alpha * new_T
# To limit the number of non-zero entries, values below the threshold are set to 0.
T = nx.eliminate_zeros(T, threshold=threshold_plan)
if cpt % 10 == 0 or cpt == (max_iter - 1):
gw_dist_estimated = GW_distance_estimation(
C1=C1, C2=C2, loss_fun=loss_fun,
p=p, q=q, T=T, std=False, random_state=generator
)
if gw_dist_estimated < best_gw_dist_estimated:
best_gw_dist_estimated = gw_dist_estimated
best_T = nx.copy(T)
if verbose:
if cpt % 200 == 0:
print('{:5s}|{:12s}'.format('It.', 'Best gw estimated') + '\n' + '-' * 19)
print('{:5d}|{:8e}|'.format(cpt, best_gw_dist_estimated))
if log:
log = {}
log["gw_dist_estimated"], log["gw_dist_std"] = GW_distance_estimation(
C1=C1, C2=C2, loss_fun=loss_fun,
p=p, q=q, T=best_T, random_state=generator
)
return best_T, log
return best_T
def sampled_gromov_wasserstein(C1, C2, p, q, loss_fun,
nb_samples_grad=100, epsilon=1, max_iter=500, log=False, verbose=False,
random_state=None):
r"""
Returns the Gromov-Wasserstein transport between :math:`(\mathbf{C_1}, \mathbf{p})` and :math:`(\mathbf{C_2}, \mathbf{q})` using a 1-stochastic Frank-Wolfe.
This method has a :math:`\mathcal{O}(\mathrm{max\_iter} \times N \log(N))` time complexity by relying on the 1D Optimal Transport solver.
The function solves the following optimization problem:
.. math::
\mathbf{GW} = \mathop{\arg \min}_\mathbf{T} \quad \sum_{i,j,k,l}
L(\mathbf{C_1}_{i,k}, \mathbf{C_2}_{j,l}) \mathbf{T}_{i,j} \mathbf{T}_{k,l}
s.t. \ \mathbf{T} \mathbf{1} &= \mathbf{p}
\mathbf{T}^T \mathbf{1} &= \mathbf{q}
\mathbf{T} &\geq 0
Where :
- :math:`\mathbf{C_1}`: Metric cost matrix in the source space
- :math:`\mathbf{C_2}`: Metric cost matrix in the target space
- :math:`\mathbf{p}`: distribution in the source space
- :math:`\mathbf{q}`: distribution in the target space
- `L`: loss function to account for the misfit between the similarity matrices
Parameters
----------
C1 : array-like, shape (ns, ns)
Metric cost matrix in the source space
C2 : array-like, shape (nt, nt)
Metric cost matrix in the target space
p : array-like, shape (ns,)
Distribution in the source space
q : array-like, shape (nt,)
Distribution in the target space
loss_fun : function: :math:`\mathbb{R} \times \mathbb{R} \mapsto \mathbb{R}`
Loss function used for the distance; the transport plan does not depend on the loss function
nb_samples_grad : int
Number of samples to approximate the gradient
epsilon : float
Weight of the Kullback-Leibler regularization
max_iter : int, optional
Max number of iterations
verbose : bool, optional
Print information along iterations
log : bool, optional
If True, also return the estimated distance and its standard deviation.
random_state : int or RandomState instance, optional
Fix the seed for reproducibility
Returns
-------
T : array-like, shape (`ns`, `nt`)
Optimal coupling between the two spaces
References
----------
.. [14] Kerdoncuff, Tanguy, Emonet, Rémi, Sebban, Marc
"Sampled Gromov Wasserstein."
Machine Learning Journal (MLJ). 2021.
"""
C1, C2, p, q = list_to_array(C1, C2, p, q)
nx = get_backend(C1, C2, p, q)
len_p = p.shape[0]
len_q = q.shape[0]
generator = check_random_state(random_state)
# The most natural way to define nb_samples_grad is with a single integer.
if isinstance(nb_samples_grad, int):
if nb_samples_grad > len_p:
# As the sampling along the first dimension is done without replacement, the remainder is carried
# over to the second dimension.
nb_samples_grad_p, nb_samples_grad_q = len_p, nb_samples_grad // len_p
else:
nb_samples_grad_p, nb_samples_grad_q = nb_samples_grad, 1
else:
nb_samples_grad_p, nb_samples_grad_q = nb_samples_grad
T = nx.outer(p, q)
# continue_loop allows stopping the loop after several successive small modifications of T.
continue_loop = 0
# The gradient of GW is more complex if the two matrices are not symmetric.
C_are_symmetric = nx.allclose(C1, C1.T, rtol=1e-10, atol=1e-10) and nx.allclose(C2, C2.T, rtol=1e-10, atol=1e-10)
for cpt in range(max_iter):
index0 = generator.choice(len_p, size=nb_samples_grad_p, p=p, replace=False)
Lik = 0
for i, index0_i in enumerate(index0):
index1 = generator.choice(len_q,
size=nb_samples_grad_q,
p=T[index0_i, :] / nx.sum(T[index0_i, :]),
replace=False)
# If the matrices C are not symmetric, the gradient has 2 terms, thus the term is chosen randomly.
if (not C_are_symmetric) and generator.rand(1) > 0.5:
Lik += nx.mean(loss_fun(
C1[:, [index0[i]] * nb_samples_grad_q][:, None, :],
C2[:, index1][None, :, :]
), axis=2)
else:
Lik += nx.mean(loss_fun(
C1[[index0[i]] * nb_samples_grad_q, :][:, :, None],
C2[index1, :][:, None, :]
), axis=0)
max_Lik = nx.max(Lik)
if max_Lik == 0:
continue
# This division by the max is here to facilitate the choice of epsilon.
Lik /= max_Lik
if epsilon > 0:
# Set to infinity all the numbers below exp(-200) to avoid log of 0.
log_T = nx.log(nx.clip(T, np.exp(-200), 1))
log_T = nx.where(log_T == -200, -np.inf, log_T)
Lik = Lik - epsilon * log_T
try:
new_T = sinkhorn(a=p, b=q, M=Lik, reg=epsilon)
except (RuntimeWarning, UserWarning):
print("Warning caught in Sinkhorn: returning last stable T")
break
else:
new_T = emd(a=p, b=q, M=Lik)
change_T = nx.mean((T - new_T) ** 2)
if change_T <= 10e-20:
continue_loop += 1
if continue_loop > 100: # Number max of low modifications of T
T = nx.copy(new_T)
break
else:
continue_loop = 0
if verbose and cpt % 10 == 0:
if cpt % 200 == 0:
print('{:5s}|{:12s}'.format('It.', '||T_n - T_{n+1}||') + '\n' + '-' * 19)
print('{:5d}|{:8e}|'.format(cpt, change_T))
T = nx.copy(new_T)
if log:
log = {}
log["gw_dist_estimated"], log["gw_dist_std"] = GW_distance_estimation(
C1=C1, C2=C2, loss_fun=loss_fun,
p=p, q=q, T=T, random_state=generator
)
return T, log
return T
def entropic_gromov_wasserstein(C1, C2, p, q, loss_fun, epsilon,
max_iter=1000, tol=1e-9, verbose=False, log=False):
r"""
Returns the Gromov-Wasserstein transport between :math:`(\mathbf{C_1}, \mathbf{p})` and :math:`(\mathbf{C_2}, \mathbf{q})`
The function solves the following optimization problem:
.. math::
\mathbf{GW} = \mathop{\arg\min}_\mathbf{T} \quad \sum_{i,j,k,l} L(\mathbf{C_1}_{i,k}, \mathbf{C_2}_{j,l}) \mathbf{T}_{i,j} \mathbf{T}_{k,l} - \epsilon(H(\mathbf{T}))
s.t. \ \mathbf{T} \mathbf{1} &= \mathbf{p}
\mathbf{T}^T \mathbf{1} &= \mathbf{q}
\mathbf{T} &\geq 0
Where :
- :math:`\mathbf{C_1}`: Metric cost matrix in the source space
- :math:`\mathbf{C_2}`: Metric cost matrix in the target space
- :math:`\mathbf{p}`: distribution in the source space
- :math:`\mathbf{q}`: distribution in the target space
- `L`: loss function to account for the misfit between the similarity matrices
- `H`: entropy
Parameters
----------
C1 : array-like, shape (ns, ns)
Metric cost matrix in the source space
C2 : array-like, shape (nt, nt)
Metric cost matrix in the target space
p : array-like, shape (ns,)
Distribution in the source space
q : array-like, shape (nt,)
Distribution in the target space
loss_fun : string
Loss function used for the solver either 'square_loss' or 'kl_loss'
epsilon : float
Regularization term >0
max_iter : int, optional
Max number of iterations
tol : float, optional
Stop threshold on error (>0)
verbose : bool, optional
Print information along iterations
log : bool, optional
Record log if True.
Returns
-------
T : array-like, shape (`ns`, `nt`)
Optimal coupling between the two spaces
References
----------
.. [12] Gabriel Peyré, Marco Cuturi, and Justin Solomon,
"Gromov-Wasserstein averaging of kernel and distance matrices."
International Conference on Machine Learning (ICML). 2016.
"""
C1, C2, p, q = list_to_array(C1, C2, p, q)
nx = get_backend(C1, C2, p, q)
T = nx.outer(p, q)
constC, hC1, hC2 = init_matrix(C1, C2, p, q, loss_fun)
cpt = 0
err = 1
if log:
log = {'err': []}
if verbose and isinstance(verbose, int):
freqprint = verbose
else:
freqprint = 10
while (err > tol and cpt < max_iter):
Tprev = T
# compute the gradient
tens = gwggrad(constC, hC1, hC2, T)
T = sinkhorn(p, q, tens, epsilon, method='sinkhorn')
if cpt % freqprint == 0:
# we can speed up the process by checking the error only every
# freqprint iterations
err = nx.norm(T - Tprev)
if log:
log['err'].append(err)
if verbose:
if cpt % (freqprint * 20) == 0:
print('{:5s}|{:12s}'.format(
'It.', 'Err') + '\n' + '-' * 19)
print('{:5d}|{:8e}|'.format(cpt, err))
cpt += 1
if log:
log['gw_dist'] = gwloss(constC, hC1, hC2, T)
return T, log
else:
return T
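The loop above alternates evaluating the GW gradient and using it as the cost matrix of an entropic OT (Sinkhorn) projection, following [12]. A self-contained NumPy sketch for 'square_loss' (hypothetical names `sinkhorn_np` / `entropic_gw_np`, no log-stabilization, fixed iteration counts instead of a tolerance):

```python
import numpy as np

def sinkhorn_np(p, q, M, reg, n_iter=500):
    # minimal Sinkhorn iterations for entropic OT
    K = np.exp(-M / reg)
    u = np.ones_like(p)
    for _ in range(n_iter):
        v = q / (K.T @ u)
        u = p / (K @ v)
    return u[:, None] * K * v[None, :]

def entropic_gw_np(C1, C2, p, q, epsilon, n_outer=30):
    # alternate GW gradient evaluation and entropic projection, as in [12]
    T = np.outer(p, q)
    constC = ((C1 ** 2) @ np.outer(p, np.ones_like(q))
              + np.outer(np.ones_like(p), q) @ (C2 ** 2).T)
    for _ in range(n_outer):
        tens = 2 * (constC - C1 @ T @ (2 * C2).T)  # gwggrad for 'square_loss'
        T = sinkhorn_np(p, q, tens, epsilon)
    return T

rng = np.random.RandomState(4)
ns, nt = 5, 4
p, q = np.full(ns, 1 / ns), np.full(nt, 1 / nt)
C1 = rng.rand(ns, ns); C1 = (C1 + C1.T) / 2
C2 = rng.rand(nt, nt); C2 = (C2 + C2.T) / 2
T = entropic_gw_np(C1, C2, p, q, epsilon=1.0)
```

Each Sinkhorn projection keeps the iterate a valid coupling, which is why the marginals of the returned plan match `p` and `q`.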
def entropic_gromov_wasserstein2(C1, C2, p, q, loss_fun, epsilon,
max_iter=1000, tol=1e-9, verbose=False, log=False):
r"""
Returns the entropic Gromov-Wasserstein discrepancy between the two measured similarity matrices :math:`(\mathbf{C_1}, \mathbf{p})` and :math:`(\mathbf{C_2}, \mathbf{q})`
The function solves the following optimization problem:
.. math::
GW = \min_\mathbf{T} \quad \sum_{i,j,k,l} L(\mathbf{C_1}_{i,k}, \mathbf{C_2}_{j,l})
\mathbf{T}_{i,j} \mathbf{T}_{k,l} - \epsilon(H(\mathbf{T}))
Where :
- :math:`\mathbf{C_1}`: Metric cost matrix in the source space
- :math:`\mathbf{C_2}`: Metric cost matrix in the target space
- :math:`\mathbf{p}`: distribution in the source space
- :math:`\mathbf{q}`: distribution in the target space
- `L`: loss function to account for the misfit between the similarity matrices
- `H`: entropy
Parameters
----------
C1 : array-like, shape (ns, ns)
Metric cost matrix in the source space
C2 : array-like, shape (nt, nt)
Metric cost matrix in the target space
p : array-like, shape (ns,)
Distribution in the source space
q : array-like, shape (nt,)
Distribution in the target space
loss_fun : str
Loss function used for the solver either 'square_loss' or 'kl_loss'
epsilon : float
Regularization term >0
max_iter : int, optional
Max number of iterations
tol : float, optional
Stop threshold on error (>0)
verbose : bool, optional
Print information along iterations
log : bool, optional
Record log if True.
Returns
-------
gw_dist : float
Gromov-Wasserstein distance
References
----------
.. [12] Gabriel Peyré, Marco Cuturi, and Justin Solomon,
"Gromov-Wasserstein averaging of kernel and distance matrices."
International Conference on Machine Learning (ICML). 2016.
"""
gw, logv = entropic_gromov_wasserstein(
C1, C2, p, q, loss_fun, epsilon, max_iter, tol, verbose, log=True)
logv['T'] = gw
if log:
return logv['gw_dist'], logv
else:
return logv['gw_dist']
def entropic_gromov_barycenters(N, Cs, ps, p, lambdas, loss_fun, epsilon,
max_iter=1000, tol=1e-9, verbose=False, log=False, init_C=None, random_state=None):
r"""
Returns the Gromov-Wasserstein barycenters of `S` measured similarity matrices :math:`(\mathbf{C}_s)_{1 \leq s \leq S}`
The function solves the following optimization problem:
.. math::
\mathbf{C} = \mathop{\arg \min}_{\mathbf{C}\in \mathbb{R}^{N \times N}} \quad \sum_s \lambda_s \mathrm{GW}(\mathbf{C}, \mathbf{C}_s, \mathbf{p}, \mathbf{p}_s)
Where :
- :math:`\mathbf{C}_s`: metric cost matrix
- :math:`\mathbf{p}_s`: distribution
Parameters
----------
N : int
Size of the targeted barycenter
Cs : list of S array-like of shape (ns,ns)
Metric cost matrices
ps : list of S array-like of shape (ns,)
Sample weights in the `S` spaces
p : array-like, shape(N,)
Weights in the targeted barycenter
lambdas : list of float
List of the `S` spaces' weights.
loss_fun : str
Loss function used for the solver, either 'square_loss' or 'kl_loss'.
epsilon : float
Regularization term >0
max_iter : int, optional
Max number of iterations
tol : float, optional
Stop threshold on error (>0)
verbose : bool, optional
Print information along iterations.
log : bool, optional
Record log if True.
init_C : array-like, shape (N, N), optional
Initial value of the :math:`\mathbf{C}` matrix provided by the user. If None, a random SPD matrix is used.
random_state : int or RandomState instance, optional
Fix the seed for reproducibility
Returns
-------
C : array-like, shape (`N`, `N`)
Similarity matrix in the barycenter space (permutated arbitrarily)
References
----------
.. [12] Gabriel Peyré, Marco Cuturi, and Justin Solomon,
"Gromov-Wasserstein averaging of kernel and distance matrices."
International Conference on Machine Learning (ICML). 2016.
"""
Cs = list_to_array(*Cs)
ps = list_to_array(*ps)
p = list_to_array(p)
nx = get_backend(*Cs, *ps, p)
S = len(Cs)
# Initialization of C : random SPD matrix (if not provided by user)
if init_C is None:
generator = check_random_state(random_state)
xalea = generator.randn(N, 2)
C = dist(xalea, xalea)
C /= C.max()
C = nx.from_numpy(C, type_as=p)
else:
C = init_C
cpt = 0
err = 1
error = []
while (err > tol) and (cpt < max_iter):
Cprev = C
T = [entropic_gromov_wasserstein(Cs[s], C, ps[s], p, loss_fun, epsilon,
max_iter, 1e-4, verbose, log) for s in range(S)]
if loss_fun == 'square_loss':
C = update_square_loss(p, lambdas, T, Cs)
elif loss_fun == 'kl_loss':
C = update_kl_loss(p, lambdas, T, Cs)
if cpt % 10 == 0:
# we can speed up the process by checking for the error only all
# the 10th iterations
err = nx.norm(C - Cprev)
error.append(err)
if log:
log['err'].append(err)
if verbose:
if cpt % 200 == 0:
print('{:5s}|{:12s}'.format(
'It.', 'Err') + '\n' + '-' * 19)
print('{:5d}|{:8e}|'.format(cpt, err))
cpt += 1
return C
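# When ``init_C`` is None, the barycenter structure is seeded with the
# normalized pairwise-distance matrix of N random 2-D points.  A minimal
# NumPy sketch of that initialization (the helper name is illustrative, and
# ``dist`` is assumed to be squared Euclidean, POT's default metric):

```python
import numpy as np


def random_init_C(N, seed=0):
    """Sketch of the default barycenter initialization (illustrative only)."""
    rng = np.random.RandomState(seed)
    xalea = rng.randn(N, 2)                       # random 2-D point cloud
    diff = xalea[:, None, :] - xalea[None, :, :]
    C = np.sum(diff ** 2, axis=-1)                # pairwise squared distances
    return C / C.max()                            # rescale so max entry is 1


C0 = random_init_C(5)
```

The result is symmetric with a zero diagonal and entries in [0, 1], i.e. a
plausible similarity matrix to start the block coordinate descent from.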
def gromov_barycenters(N, Cs, ps, p, lambdas, loss_fun,
                       max_iter=1000, tol=1e-9, verbose=False, log=False,
                       init_C=None, random_state=None):
    r"""
    Returns the gromov-wasserstein barycenters of `S` measured similarity matrices :math:`(\mathbf{C}_s)_{1 \leq s \leq S}`

    The function solves the following optimization problem with block coordinate descent:

    .. math::

        \mathbf{C} = \mathop{\arg \min}_{\mathbf{C}\in \mathbb{R}^{N \times N}} \quad \sum_s \lambda_s \mathrm{GW}(\mathbf{C}, \mathbf{C}_s, \mathbf{p}, \mathbf{p}_s)

    Where :

    - :math:`\mathbf{C}_s`: metric cost matrix
    - :math:`\mathbf{p}_s`: distribution

    Parameters
    ----------
    N : int
        Size of the targeted barycenter
    Cs : list of S array-like of shape (ns, ns)
        Metric cost matrices
    ps : list of S array-like of shape (ns,)
        Sample weights in the `S` spaces
    p : array-like, shape (N,)
        Weights in the targeted barycenter
    lambdas : list of float
        List of the `S` spaces' weights
    loss_fun : callable
        Tensor-matrix multiplication function based on a specific loss function
    max_iter : int, optional
        Max number of iterations
    tol : float, optional
        Stop threshold on error (> 0).
    verbose : bool, optional
        Print information along iterations.
    log : bool, optional
        Record log if True.
    init_C : array-like, shape (N, N), optional
        Initial value for the :math:`\mathbf{C}` matrix. If None, a random
        distance matrix is used.
    random_state : int or RandomState instance, optional
        Fix the seed for reproducibility

    Returns
    -------
    C : array-like, shape (`N`, `N`)
        Similarity matrix in the barycenter space (permuted arbitrarily)
    log : dict
        Only returned when log=True. Contains the key 'err' with the list of
        successive errors.

    References
    ----------
    .. [12] Gabriel Peyré, Marco Cuturi, and Justin Solomon,
        "Gromov-Wasserstein averaging of kernel and distance matrices."
        International Conference on Machine Learning (ICML). 2016.
    """
    Cs = list_to_array(*Cs)
    ps = list_to_array(*ps)
    p = list_to_array(p)
    nx = get_backend(*Cs, *ps, p)

    S = len(Cs)

    # Initialization of C : random SPD matrix (if not provided by user)
    if init_C is None:
        generator = check_random_state(random_state)
        xalea = generator.randn(N, 2)
        C = dist(xalea, xalea)
        C /= C.max()
        C = nx.from_numpy(C, type_as=p)
    else:
        C = init_C

    cpt = 0
    err = 1

    error = []

    while (err > tol) and (cpt < max_iter):
        Cprev = C

        # the inner solvers must not return their own logs (log=False),
        # otherwise the updates below would receive (coupling, log) tuples
        T = [gromov_wasserstein(Cs[s], C, ps[s], p, loss_fun,
                                numItermax=max_iter, stopThr=1e-5,
                                verbose=verbose, log=False) for s in range(S)]
        if loss_fun == 'square_loss':
            C = update_square_loss(p, lambdas, T, Cs)
        elif loss_fun == 'kl_loss':
            C = update_kl_loss(p, lambdas, T, Cs)

        if cpt % 10 == 0:
            # we can speed up the process by checking for the error only
            # every 10th iteration
            err = nx.norm(C - Cprev)
            error.append(err)

            if verbose:
                if cpt % 200 == 0:
                    print('{:5s}|{:12s}'.format('It.', 'Err') + '\n' + '-' * 19)
                print('{:5d}|{:8e}|'.format(cpt, err))

        cpt += 1

    if log:
        return C, {'err': error}
    else:
        return C
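# The ``update_square_loss`` call inside the loop implements the closed-form
# update C <- (sum_s lambda_s T_s^T C_s T_s) / (p p^T) from [12].  A NumPy
# sketch (the helper name is illustrative): with a single input space and the
# diagonal coupling T = diag(p), the update leaves C1 unchanged, i.e. C1 is a
# fixed point of the iteration.

```python
import numpy as np


def update_square_loss_np(p, lambdas, T, Cs):
    """C <- (sum_s lambda_s T_s^T C_s T_s) / (p p^T), the square-loss update."""
    tmpsum = sum(lam * Ts.T @ C @ Ts for lam, Ts, C in zip(lambdas, T, Cs))
    return tmpsum / np.outer(p, p)


rng = np.random.RandomState(0)
C1 = np.abs(rng.randn(4, 4))
C1 = (C1 + C1.T) / 2                  # symmetric input structure matrix
p = np.full(4, 1 / 4)                 # uniform barycenter weights
C = update_square_loss_np(p, [1.0], [np.diag(p)], [C1])
```

Here `diag(p) C1 diag(p)` has entries `p_i C1_ij p_j`, and dividing by
`outer(p, p)` cancels the weights, recovering `C1` exactly.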
def fgw_barycenters(N, Ys, Cs, ps, lambdas, alpha, fixed_structure=False,
                    fixed_features=False, p=None, loss_fun='square_loss',
                    max_iter=100, tol=1e-9, verbose=False, log=False,
                    init_C=None, init_X=None, random_state=None):
    r"""Compute the fgw barycenter as presented in eq. (5) of :ref:`[24] <references-fgw-barycenters>`

    Parameters
    ----------
    N : int
        Desired number of samples of the target barycenter
    Ys : list of array-like, each element has shape (ns, d)
        Features of all samples
    Cs : list of array-like, each element has shape (ns, ns)
        Structure matrices of all samples
    ps : list of array-like, each element has shape (ns,)
        Masses of all samples.
    lambdas : list of float
        List of the `S` spaces' weights
    alpha : float
        Alpha parameter for the fgw distance
    fixed_structure : bool
        Whether to fix the structure of the barycenter during the updates
    fixed_features : bool
        Whether to fix the features of the barycenter during the updates
    p : array-like, shape (N,), optional
        Masses in the targeted barycenter. If None, uniform weights are used.
    loss_fun : str
        Loss function used for the solver, either 'square_loss' or 'kl_loss'
    max_iter : int, optional
        Max number of iterations
    tol : float, optional
        Stop threshold on error (> 0).
    verbose : bool, optional
        Print information along iterations.
    log : bool, optional
        Record log if True.
    init_C : array-like, shape (N, N), optional
        Initialization for the barycenter's structure matrix. If not set
        a random init is used.
    init_X : array-like, shape (N, d), optional
        Initialization for the barycenter's features. If not set a
        random init is used.
    random_state : int or RandomState instance, optional
        Fix the seed for reproducibility

    Returns
    -------
    X : array-like, shape (`N`, `d`)
        Barycenter's features
    C : array-like, shape (`N`, `N`)
        Barycenter's structure matrix
    log : dict
        Only returned when log=True. It contains the keys:

        - :math:`\mathbf{T}`: list of (`N`, `ns`) transport matrices
        - :math:`(\mathbf{M}_s)_s`: all distance matrices between the features of the barycenter and the other features :math:`(dist(\mathbf{X}, \mathbf{Y}_s))_s`, shape (`N`, `ns`)

    .. _references-fgw-barycenters:
    References
    ----------
    .. [24] Vayer Titouan, Chapel Laetitia, Flamary Rémi, Tavenard Romain
        and Courty Nicolas
        "Optimal Transport for structured data with application on graphs"
        International Conference on Machine Learning (ICML). 2019.
    """
    Cs = list_to_array(*Cs)
    ps = list_to_array(*ps)
    Ys = list_to_array(*Ys)
    p = list_to_array(p)
    nx = get_backend(*Cs, *Ys, *ps)

    S = len(Cs)
    d = Ys[0].shape[1]  # dimension of the node features

    if p is None:
        p = nx.ones(N, type_as=Cs[0]) / N

    if fixed_structure:
        if init_C is None:
            raise UndefinedParameter('If C is fixed it must be initialized')
        else:
            C = init_C
    else:
        if init_C is None:
            generator = check_random_state(random_state)
            xalea = generator.randn(N, 2)
            C = dist(xalea, xalea)
            C = nx.from_numpy(C, type_as=ps[0])
        else:
            C = init_C

    if fixed_features:
        if init_X is None:
            raise UndefinedParameter('If X is fixed it must be initialized')
        else:
            X = init_X
    else:
        if init_X is None:
            X = nx.zeros((N, d), type_as=ps[0])
        else:
            X = init_X

    T = [nx.outer(p, q) for q in ps]

    Ms = [dist(X, Ys[s]) for s in range(len(Ys))]

    cpt = 0
    err_feature = 1
    err_structure = 1

    if log:
        log_ = {}
        log_['err_feature'] = []
        log_['err_structure'] = []
        log_['Ts_iter'] = []

    while (err_feature > tol or err_structure > tol) and cpt < max_iter:
        Cprev = C
        Xprev = X

        if not fixed_features:
            Ys_temp = [y.T for y in Ys]
            X = update_feature_matrix(lambdas, Ys_temp, T, p).T
            Ms = [dist(X, Ys[s]) for s in range(len(Ys))]

        if not fixed_structure:
            if loss_fun == 'square_loss':
                T_temp = [t.T for t in T]
                C = update_structure_matrix(p, lambdas, T_temp, Cs)

        T = [fused_gromov_wasserstein(Ms[s], C, Cs[s], p, ps[s], loss_fun,
                                      alpha, numItermax=max_iter,
                                      stopThr=1e-5, verbose=verbose)
             for s in range(S)]

        # T is N, ns
        err_feature = nx.norm(X - nx.reshape(Xprev, (N, d)))
        err_structure = nx.norm(C - Cprev)
        if log:
            log_['err_feature'].append(err_feature)
            log_['err_structure'].append(err_structure)
            log_['Ts_iter'].append(T)

        if verbose:
            if cpt % 200 == 0:
                print('{:5s}|{:12s}'.format('It.', 'Err') + '\n' + '-' * 19)
            print('{:5d}|{:8e}|'.format(cpt, err_structure))
            print('{:5d}|{:8e}|'.format(cpt, err_feature))

        cpt += 1

    if log:
        log_['T'] = T  # from target to Ys
        log_['p'] = p
        log_['Ms'] = Ms
        return X, C, log_
    else:
        return X, C
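# The BCD loop above is warm-started with the independent (product) couplings
# T_s = p q_s^T, which satisfy both marginal constraints by construction.
# A quick NumPy check of that property:

```python
import numpy as np

p = np.full(3, 1 / 3)                  # barycenter weights (N = 3)
q = np.array([0.1, 0.2, 0.3, 0.4])     # weights of one input space (ns = 4)
T = np.outer(p, q)                     # product coupling, shape (N, ns)
```

Row sums of `T` recover `p` and column sums recover `q`, so the product
coupling is always a feasible starting point for the transport solvers.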
def update_structure_matrix(p, lambdas, T, Cs):
    r"""Updates :math:`\mathbf{C}` according to the L2 loss kernel with the `S` :math:`\mathbf{T}_s` couplings.

    It is calculated at each iteration.

    Parameters
    ----------
    p : array-like, shape (N,)
        Masses in the targeted barycenter.
    lambdas : list of float
        List of the `S` spaces' weights.
    T : list of S array-like of shape (ns, N)
        The `S` :math:`\mathbf{T}_s` couplings calculated at each iteration.
    Cs : list of S array-like, shape (ns, ns)
        Metric cost matrices.

    Returns
    -------
    C : array-like, shape (`N`, `N`)
        Updated :math:`\mathbf{C}` matrix.
    """
    p = list_to_array(p)
    T = list_to_array(*T)
    Cs = list_to_array(*Cs)
    nx = get_backend(*Cs, *T, p)

    tmpsum = sum([
        lambdas[s] * nx.dot(
            nx.dot(T[s].T, Cs[s]),
            T[s]
        ) for s in range(len(T))
    ])
    ppt = nx.outer(p, p)
    return tmpsum / ppt
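# Since each T_s has shape (ns, N), the products T_s^T C_s T_s project every
# input space onto the common (N, N) barycenter space, and symmetry of the
# C_s is preserved.  A NumPy sketch with a deliberately rectangular coupling
# (all names and sizes are illustrative):

```python
import numpy as np

rng = np.random.RandomState(0)
ns, N = 5, 3
C_s = rng.rand(ns, ns)
C_s = (C_s + C_s.T) / 2                # symmetric input structure
T_s = rng.rand(ns, N)
T_s /= T_s.sum()                       # any nonnegative coupling will do
p = np.full(N, 1 / N)

# single-space (lambda = 1) structure update
C = (T_s.T @ C_s @ T_s) / np.outer(p, p)
```

The output lives in the barycenter space regardless of `ns`, which is what
lets barycenters average spaces of different sizes.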
def update_feature_matrix(lambdas, Ys, Ts, p):
    r"""Updates the features with respect to the `S` :math:`\mathbf{T}_s` couplings.

    See "Solving the barycenter problem with Block Coordinate Descent (BCD)"
    in :ref:`[24] <references-update-feature-matrix>`, calculated at each iteration.

    Parameters
    ----------
    p : array-like, shape (N,)
        Masses in the targeted barycenter.
    lambdas : list of float
        List of the `S` spaces' weights.
    Ts : list of S array-like, shape (N, ns)
        The `S` :math:`\mathbf{T}_s` couplings calculated at each iteration.
    Ys : list of S array-like, shape (d, ns)
        The features.

    Returns
    -------
    X : array-like, shape (`d`, `N`)

    .. _references-update-feature-matrix:
    References
    ----------
    .. [24] Vayer Titouan, Chapel Laetitia, Flamary Rémi, Tavenard Romain and Courty Nicolas
        "Optimal Transport for structured data with application on graphs"
        International Conference on Machine Learning (ICML). 2019.
    """
    p = list_to_array(p)
    Ts = list_to_array(*Ts)
    Ys = list_to_array(*Ys)
    nx = get_backend(*Ys, *Ts, p)

    p = 1. / p
    tmpsum = sum([
        lambdas[s] * nx.dot(Ys[s], Ts[s].T) * p[None, :]
        for s in range(len(Ts))
    ])
    return tmpsum
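# A NumPy sketch of the same BCD feature update,
# X <- (sum_s lambda_s Y_s T_s^T) diag(1 / p); the helper name is
# illustrative.  With a single space and the coupling T = diag(p)
# (so ns == N), the input features are recovered exactly.

```python
import numpy as np


def update_feature_matrix_np(lambdas, Ys, Ts, p):
    """X <- sum_s lambda_s Y_s T_s^T diag(1 / p), the BCD feature update."""
    inv_p = 1.0 / p
    return sum(lam * (Y @ T.T) * inv_p[None, :]
               for lam, Y, T in zip(lambdas, Ys, Ts))


p = np.full(4, 1 / 4)
Y = np.random.RandomState(0).randn(2, 4)   # features, shape (d, ns) with d = 2
X = update_feature_matrix_np([1.0], [Y], [np.diag(p)], p)
```

`Y @ diag(p)` scales each column by `p_j`, and the trailing `1 / p` factor
undoes that scaling, so `X` equals `Y` in this degenerate case.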