hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
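The header above describes one record per source file: the file `content` plus repository metadata (star/issue/fork event windows) and a battery of `qsc_*` quality signals. As a rough illustration of how rows with this schema might be filtered, here is a stdlib-only sketch; the two rows and the threshold values are invented for the example, not taken from this dump:

```python
# Minimal sketch: filtering dump rows by a few of the quality-signal columns.
# The rows and thresholds below are illustrative, not values from this dump.
rows = [
    {"hexsha": "d4d95342", "lang": "Python",
     "alphanum_fraction": 0.675079, "max_line_length": 83},
    {"hexsha": "d4e3e40c", "lang": "Python",
     "alphanum_fraction": 0.630315, "max_line_length": 1255},
]

def keep(row, max_line=1000, min_alnum=0.25):
    """Drop rows with extremely long lines or mostly non-alphanumeric text."""
    return (row["max_line_length"] <= max_line
            and row["alphanum_fraction"] >= min_alnum)

kept = [r["hexsha"] for r in rows if keep(r)]
print(kept)  # -> ['d4d95342'] (second row fails the max_line_length filter)
```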
d4d95342f41061299c64490ea9ab6705d5a84b95 | 634 | py | Python | chapter-6/holdings/holdings/clients.py | wallacei/microservices-in-action-copy | f9840464a1f9ec40622989e9e5377742246244f3 | [
"MIT"
] | 115 | 2017-11-06T08:12:07.000Z | 2022-02-25T09:56:59.000Z | chapter-6/holdings/holdings/clients.py | wallacei/microservices-in-action-copy | f9840464a1f9ec40622989e9e5377742246244f3 | [
"MIT"
] | 12 | 2017-08-05T14:51:35.000Z | 2020-12-01T11:05:14.000Z | chapter-6/holdings/holdings/clients.py | wallacei/microservices-in-action-copy | f9840464a1f9ec40622989e9e5377742246244f3 | [
"MIT"
] | 82 | 2017-08-05T09:41:12.000Z | 2022-02-18T00:57:39.000Z | import logging

import requests
from tenacity import before_log, retry, stop_after_attempt


class MarketDataClient(object):
    logger = logging.getLogger(__name__)
    base_url = 'http://market-data:8000'

    def _make_request(self, url):
        response = requests.get(
            f"{self.base_url}/{url}", headers={'content-type': 'application/json'})
        return response.json()

    @retry(stop=stop_after_attempt(3),
           before=before_log(logger, logging.DEBUG))
    def all_prices(self):
        return self._make_request("prices")

    def price(self, code):
        return self._make_request(f"prices/{code}")
| 26.416667 | 83 | 0.675079 | 79 | 634 | 5.177215 | 0.531646 | 0.080685 | 0.07824 | 0.102689 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009862 | 0.200315 | 634 | 23 | 84 | 27.565217 | 0.796844 | 0 | 0 | 0 | 0 | 0 | 0.143533 | 0.033123 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1875 | false | 0 | 0.1875 | 0.125 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
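The `MarketDataClient` in the row above leans on tenacity's `@retry(stop=stop_after_attempt(3), ...)` to re-issue failed requests. A stdlib-only sketch of that stopping rule follows; the decorator name here is made up for illustration, and real code would use tenacity itself:

```python
import functools
import logging

def retry_stop_after_attempt(max_attempts):
    """Toy stand-in for tenacity's @retry(stop=stop_after_attempt(n)):
    re-invoke the wrapped callable until it succeeds or n attempts are used."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return func(*args, **kwargs)
                except Exception:
                    logging.debug("attempt %d/%d failed", attempt, max_attempts)
                    if attempt == max_attempts:
                        raise  # attempts exhausted: surface the last error
        return wrapper
    return decorator

calls = []

@retry_stop_after_attempt(3)
def flaky():
    calls.append(1)
    if len(calls) < 3:
        raise ConnectionError("service unavailable")
    return "ok"

print(flaky(), len(calls))  # succeeds on the third attempt: ok 3
```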
d4d9e6d71e69698ba77a6289f74120c3c4f42de9 | 1,528 | py | Python | openclean/profiling/tests.py | remram44/openclean-core | 8c09c8302cadbb3bb02c959907f91a3ae343f939 | [
"BSD-3-Clause"
] | 4 | 2021-04-20T09:06:26.000Z | 2021-11-20T20:31:28.000Z | openclean/profiling/tests.py | remram44/openclean-core | 8c09c8302cadbb3bb02c959907f91a3ae343f939 | [
"BSD-3-Clause"
] | 14 | 2021-01-19T19:23:16.000Z | 2021-04-28T14:31:03.000Z | openclean/profiling/tests.py | remram44/openclean-core | 8c09c8302cadbb3bb02c959907f91a3ae343f939 | [
"BSD-3-Clause"
] | 5 | 2021-08-24T11:57:21.000Z | 2022-03-17T04:39:04.000Z | # This file is part of the Data Cleaning Library (openclean).
#
# Copyright (C) 2018-2021 New York University.
#
# openclean is released under the Revised BSD License. See file LICENSE for
# full license details.

"""Helper class for testing profiling functionality."""

from collections import Counter

from openclean.data.types import Scalar
from openclean.profiling.base import DataStreamProfiler


class ValueCounter(DataStreamProfiler):
    """Test profiler that collects the values and counts that are passed to it
    in a Counter.
    """
    def __init__(self):
        """Create the internal counter variable."""
        self.counter = None

    def close(self) -> Counter:
        """Return the counter at the end of the stream.

        Returns
        -------
        collections.Counter
        """
        return self.counter

    def consume(self, value: Scalar, count: int):
        """Add value and count to the internal counter.

        Parameters
        ----------
        value: scalar
            Scalar column value from a dataset that is part of the data stream
            that is being profiled.
        count: int
            Frequency of the value. Note that this count only relates to the
            given value and does not necessarily represent the total number of
            occurrences of the value in the stream.
        """
        self.counter[value] += count

    def open(self):
        """Initialize an empty counter at the beginning of the stream."""
        self.counter = Counter()
| 29.960784 | 78 | 0.646597 | 190 | 1,528 | 5.178947 | 0.489474 | 0.030488 | 0.01626 | 0.022358 | 0.030488 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00724 | 0.276832 | 1,528 | 50 | 79 | 30.56 | 0.883258 | 0.584424 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.25 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
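The `ValueCounter` in the row above is driven through the `open`/`consume`/`close` stream-profiler lifecycle. The sketch below replays that lifecycle standalone, with the `openclean` base class dropped so it runs without the library installed:

```python
from collections import Counter

class ValueCounter:
    """Sketch of the ValueCounter above, minus the openclean base class,
    so the open()/consume()/close() lifecycle can be exercised standalone."""
    def __init__(self):
        self.counter = None

    def open(self):
        # Called once at the start of the stream.
        self.counter = Counter()

    def consume(self, value, count):
        # Called for each (value, frequency) pair in the stream.
        self.counter[value] += count

    def close(self):
        # Called once at the end of the stream; returns the collected counts.
        return self.counter

profiler = ValueCounter()
profiler.open()
for value, count in [("NY", 3), ("CA", 2), ("NY", 1)]:
    profiler.consume(value, count)
result = profiler.close()
print(result)  # -> Counter({'NY': 4, 'CA': 2})
```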
d4e3e40c747dc3f18235ac1cf9b360b007d513c8 | 15,286 | py | Python | tests/extension/types_/axi_/slave_read/test_types_axi_slave_read.py | akmaru/veriloggen | 74f998139e8cf613f7703fa4cffd571bbf069bbc | [
"Apache-2.0"
] | null | null | null | tests/extension/types_/axi_/slave_read/test_types_axi_slave_read.py | akmaru/veriloggen | 74f998139e8cf613f7703fa4cffd571bbf069bbc | [
"Apache-2.0"
] | null | null | null | tests/extension/types_/axi_/slave_read/test_types_axi_slave_read.py | akmaru/veriloggen | 74f998139e8cf613f7703fa4cffd571bbf069bbc | [
"Apache-2.0"
] | null | null | null | from __future__ import absolute_import
from __future__ import print_function
import veriloggen
import types_axi_slave_read
expected_verilog = """
module test;
reg CLK;
reg RST;
reg [32-1:0] myaxi_awaddr;
reg [8-1:0] myaxi_awlen;
reg [3-1:0] myaxi_awsize;
reg [2-1:0] myaxi_awburst;
reg [1-1:0] myaxi_awlock;
reg [4-1:0] myaxi_awcache;
reg [3-1:0] myaxi_awprot;
reg [4-1:0] myaxi_awqos;
reg [2-1:0] myaxi_awuser;
reg myaxi_awvalid;
wire myaxi_awready;
reg [32-1:0] myaxi_wdata;
reg [4-1:0] myaxi_wstrb;
reg myaxi_wlast;
reg myaxi_wvalid;
wire myaxi_wready;
wire [2-1:0] myaxi_bresp;
wire myaxi_bvalid;
reg myaxi_bready;
reg [32-1:0] myaxi_araddr;
reg [8-1:0] myaxi_arlen;
reg [3-1:0] myaxi_arsize;
reg [2-1:0] myaxi_arburst;
reg [1-1:0] myaxi_arlock;
reg [4-1:0] myaxi_arcache;
reg [3-1:0] myaxi_arprot;
reg [4-1:0] myaxi_arqos;
reg [2-1:0] myaxi_aruser;
reg myaxi_arvalid;
wire myaxi_arready;
wire [32-1:0] myaxi_rdata;
wire [2-1:0] myaxi_rresp;
wire myaxi_rlast;
wire myaxi_rvalid;
reg myaxi_rready;
reg [32-1:0] _axi_awaddr;
reg [8-1:0] _axi_awlen;
wire [3-1:0] _axi_awsize;
wire [2-1:0] _axi_awburst;
wire [1-1:0] _axi_awlock;
wire [4-1:0] _axi_awcache;
wire [3-1:0] _axi_awprot;
wire [4-1:0] _axi_awqos;
wire [2-1:0] _axi_awuser;
reg _axi_awvalid;
wire _axi_awready;
reg [32-1:0] _axi_wdata;
reg [4-1:0] _axi_wstrb;
reg _axi_wlast;
reg _axi_wvalid;
wire _axi_wready;
wire [2-1:0] _axi_bresp;
wire _axi_bvalid;
wire _axi_bready;
reg [32-1:0] _axi_araddr;
reg [8-1:0] _axi_arlen;
wire [3-1:0] _axi_arsize;
wire [2-1:0] _axi_arburst;
wire [1-1:0] _axi_arlock;
wire [4-1:0] _axi_arcache;
wire [3-1:0] _axi_arprot;
wire [4-1:0] _axi_arqos;
wire [2-1:0] _axi_aruser;
reg _axi_arvalid;
wire _axi_arready;
wire [32-1:0] _axi_rdata;
wire [2-1:0] _axi_rresp;
wire _axi_rlast;
wire _axi_rvalid;
wire _axi_rready;
assign _axi_awsize = 2;
assign _axi_awburst = 1;
assign _axi_awlock = 0;
assign _axi_awcache = 3;
assign _axi_awprot = 0;
assign _axi_awqos = 0;
assign _axi_awuser = 0;
assign _axi_bready = 1;
assign _axi_arsize = 2;
assign _axi_arburst = 1;
assign _axi_arlock = 0;
assign _axi_arcache = 3;
assign _axi_arprot = 0;
assign _axi_arqos = 0;
assign _axi_aruser = 0;
wire [32-1:0] _tmp_0;
assign _tmp_0 = _axi_awaddr;
always @(*) begin
myaxi_awaddr = _tmp_0;
end
wire [8-1:0] _tmp_1;
assign _tmp_1 = _axi_awlen;
always @(*) begin
myaxi_awlen = _tmp_1;
end
wire [3-1:0] _tmp_2;
assign _tmp_2 = _axi_awsize;
always @(*) begin
myaxi_awsize = _tmp_2;
end
wire [2-1:0] _tmp_3;
assign _tmp_3 = _axi_awburst;
always @(*) begin
myaxi_awburst = _tmp_3;
end
wire [1-1:0] _tmp_4;
assign _tmp_4 = _axi_awlock;
always @(*) begin
myaxi_awlock = _tmp_4;
end
wire [4-1:0] _tmp_5;
assign _tmp_5 = _axi_awcache;
always @(*) begin
myaxi_awcache = _tmp_5;
end
wire [3-1:0] _tmp_6;
assign _tmp_6 = _axi_awprot;
always @(*) begin
myaxi_awprot = _tmp_6;
end
wire [4-1:0] _tmp_7;
assign _tmp_7 = _axi_awqos;
always @(*) begin
myaxi_awqos = _tmp_7;
end
wire [2-1:0] _tmp_8;
assign _tmp_8 = _axi_awuser;
always @(*) begin
myaxi_awuser = _tmp_8;
end
wire _tmp_9;
assign _tmp_9 = _axi_awvalid;
always @(*) begin
myaxi_awvalid = _tmp_9;
end
assign _axi_awready = myaxi_awready;
wire [32-1:0] _tmp_10;
assign _tmp_10 = _axi_wdata;
always @(*) begin
myaxi_wdata = _tmp_10;
end
wire [4-1:0] _tmp_11;
assign _tmp_11 = _axi_wstrb;
always @(*) begin
myaxi_wstrb = _tmp_11;
end
wire _tmp_12;
assign _tmp_12 = _axi_wlast;
always @(*) begin
myaxi_wlast = _tmp_12;
end
wire _tmp_13;
assign _tmp_13 = _axi_wvalid;
always @(*) begin
myaxi_wvalid = _tmp_13;
end
assign _axi_wready = myaxi_wready;
assign _axi_bresp = myaxi_bresp;
assign _axi_bvalid = myaxi_bvalid;
wire _tmp_14;
assign _tmp_14 = _axi_bready;
always @(*) begin
myaxi_bready = _tmp_14;
end
wire [32-1:0] _tmp_15;
assign _tmp_15 = _axi_araddr;
always @(*) begin
myaxi_araddr = _tmp_15;
end
wire [8-1:0] _tmp_16;
assign _tmp_16 = _axi_arlen;
always @(*) begin
myaxi_arlen = _tmp_16;
end
wire [3-1:0] _tmp_17;
assign _tmp_17 = _axi_arsize;
always @(*) begin
myaxi_arsize = _tmp_17;
end
wire [2-1:0] _tmp_18;
assign _tmp_18 = _axi_arburst;
always @(*) begin
myaxi_arburst = _tmp_18;
end
wire [1-1:0] _tmp_19;
assign _tmp_19 = _axi_arlock;
always @(*) begin
myaxi_arlock = _tmp_19;
end
wire [4-1:0] _tmp_20;
assign _tmp_20 = _axi_arcache;
always @(*) begin
myaxi_arcache = _tmp_20;
end
wire [3-1:0] _tmp_21;
assign _tmp_21 = _axi_arprot;
always @(*) begin
myaxi_arprot = _tmp_21;
end
wire [4-1:0] _tmp_22;
assign _tmp_22 = _axi_arqos;
always @(*) begin
myaxi_arqos = _tmp_22;
end
wire [2-1:0] _tmp_23;
assign _tmp_23 = _axi_aruser;
always @(*) begin
myaxi_aruser = _tmp_23;
end
wire _tmp_24;
assign _tmp_24 = _axi_arvalid;
always @(*) begin
myaxi_arvalid = _tmp_24;
end
assign _axi_arready = myaxi_arready;
assign _axi_rdata = myaxi_rdata;
assign _axi_rresp = myaxi_rresp;
assign _axi_rlast = myaxi_rlast;
assign _axi_rvalid = myaxi_rvalid;
wire _tmp_25;
assign _tmp_25 = _axi_rready;
always @(*) begin
myaxi_rready = _tmp_25;
end
reg [32-1:0] fsm;
localparam fsm_init = 0;
reg [32-1:0] sum;
reg [9-1:0] _tmp_26;
reg __axi_cond_0_1;
reg [9-1:0] _tmp_27;
reg __axi_cond_1_1;
assign _axi_rready = (fsm == 1) || (fsm == 3);
main
uut
(
.CLK(CLK),
.RST(RST),
.myaxi_awaddr(myaxi_awaddr),
.myaxi_awlen(myaxi_awlen),
.myaxi_awsize(myaxi_awsize),
.myaxi_awburst(myaxi_awburst),
.myaxi_awlock(myaxi_awlock),
.myaxi_awcache(myaxi_awcache),
.myaxi_awprot(myaxi_awprot),
.myaxi_awqos(myaxi_awqos),
.myaxi_awuser(myaxi_awuser),
.myaxi_awvalid(myaxi_awvalid),
.myaxi_awready(myaxi_awready),
.myaxi_wdata(myaxi_wdata),
.myaxi_wstrb(myaxi_wstrb),
.myaxi_wlast(myaxi_wlast),
.myaxi_wvalid(myaxi_wvalid),
.myaxi_wready(myaxi_wready),
.myaxi_bresp(myaxi_bresp),
.myaxi_bvalid(myaxi_bvalid),
.myaxi_bready(myaxi_bready),
.myaxi_araddr(myaxi_araddr),
.myaxi_arlen(myaxi_arlen),
.myaxi_arsize(myaxi_arsize),
.myaxi_arburst(myaxi_arburst),
.myaxi_arlock(myaxi_arlock),
.myaxi_arcache(myaxi_arcache),
.myaxi_arprot(myaxi_arprot),
.myaxi_arqos(myaxi_arqos),
.myaxi_aruser(myaxi_aruser),
.myaxi_arvalid(myaxi_arvalid),
.myaxi_arready(myaxi_arready),
.myaxi_rdata(myaxi_rdata),
.myaxi_rresp(myaxi_rresp),
.myaxi_rlast(myaxi_rlast),
.myaxi_rvalid(myaxi_rvalid),
.myaxi_rready(myaxi_rready)
);
initial begin
$dumpfile("uut.vcd");
$dumpvars(0, uut, CLK, RST, myaxi_awaddr, myaxi_awlen, myaxi_awsize, myaxi_awburst, myaxi_awlock, myaxi_awcache, myaxi_awprot, myaxi_awqos, myaxi_awuser, myaxi_awvalid, myaxi_awready, myaxi_wdata, myaxi_wstrb, myaxi_wlast, myaxi_wvalid, myaxi_wready, myaxi_bresp, myaxi_bvalid, myaxi_bready, myaxi_araddr, myaxi_arlen, myaxi_arsize, myaxi_arburst, myaxi_arlock, myaxi_arcache, myaxi_arprot, myaxi_arqos, myaxi_aruser, myaxi_arvalid, myaxi_arready, myaxi_rdata, myaxi_rresp, myaxi_rlast, myaxi_rvalid, myaxi_rready, _axi_awaddr, _axi_awlen, _axi_awsize, _axi_awburst, _axi_awlock, _axi_awcache, _axi_awprot, _axi_awqos, _axi_awuser, _axi_awvalid, _axi_awready, _axi_wdata, _axi_wstrb, _axi_wlast, _axi_wvalid, _axi_wready, _axi_bresp, _axi_bvalid, _axi_bready, _axi_araddr, _axi_arlen, _axi_arsize, _axi_arburst, _axi_arlock, _axi_arcache, _axi_arprot, _axi_arqos, _axi_aruser, _axi_arvalid, _axi_arready, _axi_rdata, _axi_rresp, _axi_rlast, _axi_rvalid, _axi_rready, _tmp_0, _tmp_1, _tmp_2, _tmp_3, _tmp_4, _tmp_5, _tmp_6, _tmp_7, _tmp_8, _tmp_9, _tmp_10, _tmp_11, _tmp_12, _tmp_13, _tmp_14, _tmp_15, _tmp_16, _tmp_17, _tmp_18, _tmp_19, _tmp_20, _tmp_21, _tmp_22, _tmp_23, _tmp_24, _tmp_25, fsm, sum, _tmp_26, __axi_cond_0_1, _tmp_27, __axi_cond_1_1);
end
initial begin
CLK = 0;
forever begin
#5 CLK = !CLK;
end
end
initial begin
RST = 0;
_axi_awaddr = 0;
_axi_awlen = 0;
_axi_awvalid = 0;
_axi_wdata = 0;
_axi_wstrb = 0;
_axi_wlast = 0;
_axi_wvalid = 0;
_axi_araddr = 0;
_axi_arlen = 0;
_axi_arvalid = 0;
fsm = fsm_init;
sum = 0;
_tmp_26 = 0;
__axi_cond_0_1 = 0;
_tmp_27 = 0;
__axi_cond_1_1 = 0;
#100;
RST = 1;
#100;
RST = 0;
#100000;
$finish;
end
always @(posedge CLK) begin
if(RST) begin
_axi_awaddr <= 0;
_axi_awlen <= 0;
_axi_awvalid <= 0;
_axi_wdata <= 0;
_axi_wstrb <= 0;
_axi_wlast <= 0;
_axi_wvalid <= 0;
_axi_araddr <= 0;
_axi_arlen <= 0;
_axi_arvalid <= 0;
_tmp_26 <= 0;
__axi_cond_0_1 <= 0;
_tmp_27 <= 0;
__axi_cond_1_1 <= 0;
end else begin
if(__axi_cond_0_1) begin
_axi_arvalid <= 0;
end
if(__axi_cond_1_1) begin
_axi_arvalid <= 0;
end
_axi_awaddr <= 0;
_axi_awlen <= 0;
_axi_awvalid <= 0;
_axi_wdata <= 0;
_axi_wstrb <= 0;
_axi_wlast <= 0;
_axi_wvalid <= 0;
if((fsm == 0) && ((_axi_arready || !_axi_arvalid) && (_tmp_26 == 0))) begin
_axi_araddr <= 1024;
_axi_arlen <= 63;
_axi_arvalid <= 1;
_tmp_26 <= 64;
end
__axi_cond_0_1 <= 1;
if(_axi_arvalid && !_axi_arready) begin
_axi_arvalid <= _axi_arvalid;
end
if(_axi_rready && _axi_rvalid && (_tmp_26 > 0)) begin
_tmp_26 <= _tmp_26 - 1;
end
if((fsm == 2) && ((_axi_arready || !_axi_arvalid) && (_tmp_27 == 0))) begin
_axi_araddr <= 2048;
_axi_arlen <= 127;
_axi_arvalid <= 1;
_tmp_27 <= 128;
end
__axi_cond_1_1 <= 1;
if(_axi_arvalid && !_axi_arready) begin
_axi_arvalid <= _axi_arvalid;
end
if(_axi_rready && _axi_rvalid && (_tmp_27 > 0)) begin
_tmp_27 <= _tmp_27 - 1;
end
end
end
localparam fsm_1 = 1;
localparam fsm_2 = 2;
localparam fsm_3 = 3;
localparam fsm_4 = 4;
localparam fsm_5 = 5;
always @(posedge CLK) begin
if(RST) begin
fsm <= fsm_init;
sum <= 0;
end else begin
case(fsm)
fsm_init: begin
if(_axi_arready || !_axi_arvalid) begin
fsm <= fsm_1;
end
end
fsm_1: begin
if(_axi_rready && _axi_rvalid) begin
sum <= sum + _axi_rdata;
end
if(_axi_rready && _axi_rvalid && _axi_rlast) begin
fsm <= fsm_2;
end
end
fsm_2: begin
if(_axi_arready || !_axi_arvalid) begin
fsm <= fsm_3;
end
end
fsm_3: begin
if(_axi_rready && _axi_rvalid) begin
sum <= sum + _axi_rdata;
end
if(_axi_rready && _axi_rvalid && _axi_rlast) begin
fsm <= fsm_4;
end
end
fsm_4: begin
$display("sum=%d expected_sum=%d", sum, 92064);
fsm <= fsm_5;
end
endcase
end
end
endmodule
module main
(
input CLK,
input RST,
input [32-1:0] myaxi_awaddr,
input [8-1:0] myaxi_awlen,
input [3-1:0] myaxi_awsize,
input [2-1:0] myaxi_awburst,
input [1-1:0] myaxi_awlock,
input [4-1:0] myaxi_awcache,
input [3-1:0] myaxi_awprot,
input [4-1:0] myaxi_awqos,
input [2-1:0] myaxi_awuser,
input myaxi_awvalid,
output myaxi_awready,
input [32-1:0] myaxi_wdata,
input [4-1:0] myaxi_wstrb,
input myaxi_wlast,
input myaxi_wvalid,
output myaxi_wready,
output [2-1:0] myaxi_bresp,
output reg myaxi_bvalid,
input myaxi_bready,
input [32-1:0] myaxi_araddr,
input [8-1:0] myaxi_arlen,
input [3-1:0] myaxi_arsize,
input [2-1:0] myaxi_arburst,
input [1-1:0] myaxi_arlock,
input [4-1:0] myaxi_arcache,
input [3-1:0] myaxi_arprot,
input [4-1:0] myaxi_arqos,
input [2-1:0] myaxi_aruser,
input myaxi_arvalid,
output myaxi_arready,
output reg [32-1:0] myaxi_rdata,
output [2-1:0] myaxi_rresp,
output reg myaxi_rlast,
output reg myaxi_rvalid,
input myaxi_rready
);
assign myaxi_bresp = 0;
assign myaxi_rresp = 0;
assign myaxi_awready = 0;
assign myaxi_wready = 0;
reg [32-1:0] fsm;
localparam fsm_init = 0;
reg [9-1:0] _tmp_0;
reg [32-1:0] _tmp_1;
reg _tmp_2;
reg _tmp_3;
assign myaxi_arready = (fsm == 0) && !_tmp_2 && _tmp_3;
reg [32-1:0] rdata;
reg _tmp_4;
reg _myaxi_cond_0_1;
always @(posedge CLK) begin
if(RST) begin
myaxi_bvalid <= 0;
_tmp_3 <= 0;
_tmp_1 <= 0;
_tmp_0 <= 0;
_tmp_2 <= 0;
myaxi_rdata <= 0;
myaxi_rvalid <= 0;
myaxi_rlast <= 0;
_tmp_4 <= 0;
_myaxi_cond_0_1 <= 0;
end else begin
if(_myaxi_cond_0_1) begin
myaxi_rvalid <= 0;
myaxi_rlast <= 0;
_tmp_4 <= 0;
end
if(myaxi_bvalid && myaxi_bready) begin
myaxi_bvalid <= 0;
end
if(myaxi_wvalid && myaxi_wready && myaxi_wlast) begin
myaxi_bvalid <= 1;
end
_tmp_3 <= myaxi_arvalid;
if(myaxi_arready && myaxi_arvalid) begin
_tmp_1 <= myaxi_araddr;
_tmp_0 <= myaxi_arlen + 1;
end
_tmp_2 <= myaxi_arready && myaxi_arvalid;
if((fsm == 1) && ((_tmp_0 > 0) && (myaxi_rready || !myaxi_rvalid) && (_tmp_0 > 0))) begin
myaxi_rdata <= rdata;
myaxi_rvalid <= 1;
myaxi_rlast <= 0;
_tmp_0 <= _tmp_0 - 1;
end
if((fsm == 1) && ((_tmp_0 > 0) && (myaxi_rready || !myaxi_rvalid) && (_tmp_0 > 0)) && (_tmp_0 == 1)) begin
myaxi_rlast <= 1;
_tmp_4 <= 1;
end
_myaxi_cond_0_1 <= 1;
if(myaxi_rvalid && !myaxi_rready) begin
myaxi_rvalid <= myaxi_rvalid;
myaxi_rlast <= myaxi_rlast;
_tmp_4 <= _tmp_4;
end
end
end
localparam fsm_1 = 1;
localparam fsm_2 = 2;
always @(posedge CLK) begin
if(RST) begin
fsm <= fsm_init;
rdata <= 0;
end else begin
case(fsm)
fsm_init: begin
if(_tmp_2) begin
rdata <= _tmp_1 >> 2;
end
if(_tmp_2) begin
fsm <= fsm_1;
end
end
fsm_1: begin
if((_tmp_0 > 0) && (myaxi_rready || !myaxi_rvalid)) begin
rdata <= rdata + 1;
end
if(_tmp_4) begin
fsm <= fsm_2;
end
end
fsm_2: begin
fsm <= fsm_init;
end
endcase
end
end
endmodule
"""
def test():
    veriloggen.reset()
    test_module = types_axi_slave_read.mkTest()
    code = test_module.to_verilog()

    from pyverilog.vparser.parser import VerilogParser
    from pyverilog.ast_code_generator.codegen import ASTCodeGenerator

    parser = VerilogParser()
    expected_ast = parser.parse(expected_verilog)
    codegen = ASTCodeGenerator()
    expected_code = codegen.visit(expected_ast)

    assert(expected_code == code)
| 23.699225 | 1,255 | 0.630315 | 2,322 | 15,286 | 3.689922 | 0.059001 | 0.024043 | 0.037582 | 0.009804 | 0.476657 | 0.186391 | 0.152428 | 0.14204 | 0.14204 | 0.1243 | 0 | 0.066573 | 0.258079 | 15,286 | 644 | 1,256 | 23.736025 | 0.688916 | 0 | 0 | 0.33156 | 0 | 0.005319 | 0.961272 | 0.065027 | 0 | 0 | 0 | 0 | 0.001773 | 1 | 0.001773 | false | 0 | 0.010638 | 0 | 0.012411 | 0.001773 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
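The `test()` function in the row above compares generated Verilog against the expectation by parsing the expected source with pyverilog and regenerating it, so both sides pass through the same printer before the string comparison. A crude stdlib stand-in for that normalize-then-compare idea (whitespace collapsing in place of a real parser round-trip):

```python
import re

def normalize(src):
    """Toy stand-in for the parse-then-regenerate step in the test above:
    collapse whitespace so two formattings of the same code compare equal.
    (The real test uses pyverilog's VerilogParser and ASTCodeGenerator.)"""
    lines = [re.sub(r"\s+", " ", line).strip() for line in src.splitlines()]
    return "\n".join(line for line in lines if line)

# Two renderings of the same module differ only in whitespace.
a = "module main;\n  reg CLK;\nendmodule\n"
b = "module main;\nreg   CLK;\n\nendmodule"
print(normalize(a) == normalize(b))  # -> True
```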
d4f53a6bcfe05e611929e885704509e9deec3431 | 13,825 | py | Python | desktop/core/ext-py/kerberos-1.3.0/pysrc/kerberos.py | yetsun/hue | 2e48f0cc70e233ee0e1b40733d4b2a18d8836c66 | [
"Apache-2.0"
] | 5,079 | 2015-01-01T03:39:46.000Z | 2022-03-31T07:38:22.000Z | pysrc/kerberos.py | ass-a2s/ccs-pykerberos | 24b0ca75a03f283c089f90befe88299158d3dc18 | [
"Apache-2.0"
] | 1,623 | 2015-01-01T08:06:24.000Z | 2022-03-30T19:48:52.000Z | pysrc/kerberos.py | ass-a2s/ccs-pykerberos | 24b0ca75a03f283c089f90befe88299158d3dc18 | [
"Apache-2.0"
] | 2,033 | 2015-01-04T07:18:02.000Z | 2022-03-28T19:55:47.000Z | ##
# Copyright (c) 2006-2018 Apple Inc. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
##
"""
PyKerberos Function Description.
"""
class KrbError(Exception):
pass
class BasicAuthError(KrbError):
pass
class GSSError(KrbError):
pass
def checkPassword(user, pswd, service, default_realm):
"""
This function provides a simple way to verify that a user name and password
match those normally used for Kerberos authentication.
It does this by checking that the supplied user name and password can be
used to get a ticket for the supplied service.
If the user name does not contain a realm, then the default realm supplied
is used.
For this to work properly the Kerberos must be configured properly on this
machine.
That will likely mean ensuring that the edu.mit.Kerberos preference file
has the correct realms and KDCs listed.
IMPORTANT: This method is vulnerable to KDC spoofing attacks and it should
only used for testing. Do not use this in any production system - your
security could be compromised if you do.
@param user: A string containing the Kerberos user name.
A realm may be included by appending an C{"@"} followed by the realm
string to the actual user id.
If no realm is supplied, then the realm set in the default_realm
argument will be used.
@param pswd: A string containing the password for the user.
@param service: A string containing the Kerberos service to check access
for.
This will be of the form C{"sss/xx.yy.zz"}, where C{"sss"} is the
service identifier (e.g., C{"http"}, C{"krbtgt"}), and C{"xx.yy.zz"} is
the hostname of the server.
@param default_realm: A string containing the default realm to use if one
is not supplied in the user argument.
Note that Kerberos realms are normally all uppercase (e.g.,
C{"EXAMPLE.COM"}).
@return: True if authentication succeeds, false otherwise.
"""
def changePassword(user, oldpswd, newpswd):
"""
This function allows to change the user password on the KDC.
@param user: A string containing the Kerberos user name.
A realm may be included by appending a C{"@"} followed by the realm
string to the actual user id.
If no realm is supplied, then the realm set in the default_realm
argument will be used.
@param oldpswd: A string containing the old (current) password for the
user.
@param newpswd: A string containing the new password for the user.
@return: True if password changing succeeds, false otherwise.
"""
def getServerPrincipalDetails(service, hostname):
"""
This function returns the service principal for the server given a service
type and hostname.
Details are looked up via the C{/etc/keytab} file.
@param service: A string containing the Kerberos service type for the
server.
@param hostname: A string containing the hostname of the server.
@return: A string containing the service principal.
"""
"""
GSSAPI Function Result Codes:
-1 : Error
0 : GSSAPI step continuation (only returned by 'Step' function)
1 : GSSAPI step complete, or function return OK
"""
# Some useful result codes
AUTH_GSS_CONTINUE = 0
AUTH_GSS_COMPLETE = 1
# Some useful gss flags
GSS_C_DELEG_FLAG = 1
GSS_C_MUTUAL_FLAG = 2
GSS_C_REPLAY_FLAG = 4
GSS_C_SEQUENCE_FLAG = 8
GSS_C_CONF_FLAG = 16
GSS_C_INTEG_FLAG = 32
GSS_C_ANON_FLAG = 64
GSS_C_PROT_READY_FLAG = 128
GSS_C_TRANS_FLAG = 256
def authGSSClientInit(service, **kwargs):
"""
Initializes a context for GSSAPI client-side authentication with the given
service principal.
L{authGSSClientClean} must be called after this function returns an OK
result to dispose of the context once all GSSAPI operations are complete.
@param service: A string containing the service principal in the form
C{"type@fqdn"}.
@param principal: Optional string containing the client principal in the
form C{"user@realm"}.
@param gssflags: Optional integer used to set GSS flags.
(e.g. C{GSS_C_DELEG_FLAG|GSS_C_MUTUAL_FLAG|GSS_C_SEQUENCE_FLAG} will
allow for forwarding credentials to the remote host)
@param delegated: Optional server context containing delegated credentials
@param mech_oid: Optional GGS mech OID
@return: A tuple of (result, context) where result is the result code (see
above) and context is an opaque value that will need to be passed to
subsequent functions.
"""
def authGSSClientClean(context):
"""
Destroys the context for GSSAPI client-side authentication. This function
is provided for compatibility with earlier versions of PyKerberos but does
nothing. The context object destroys itself when it is reclaimed.
@param context: The context object returned from L{authGSSClientInit}.
@return: A result code (see above).
"""
def authGSSClientInquireCred(context):
"""
Get the current user name, if any, without a client-side GSSAPI step.
If the principal has already been authenticated via completed client-side
GSSAPI steps then the user name of the authenticated principal is kept. The
user name will be available via authGSSClientUserName.
@param context: The context object returned from L{authGSSClientInit}.
@return: A result code (see above).
"""
"""
Address Types for Channel Bindings
https://docs.oracle.com/cd/E19455-01/806-3814/6jcugr7dp/index.html#reference-9
"""
GSS_C_AF_UNSPEC = 0
GSS_C_AF_LOCAL = 1
GSS_C_AF_INET = 2
GSS_C_AF_IMPLINK = 3
GSS_C_AF_PUP = 4
GSS_C_AF_CHAOS = 5
GSS_C_AF_NS = 6
GSS_C_AF_NBS = 7
GSS_C_AF_ECMA = 8
GSS_C_AF_DATAKIT = 9
GSS_C_AF_CCITT = 10
GSS_C_AF_SNA = 11
GSS_C_AF_DECnet = 12
GSS_C_AF_DLI = 13
GSS_C_AF_LAT = 14
GSS_C_AF_HYLINK = 15
GSS_C_AF_APPLETALK = 16
GSS_C_AF_BSC = 17
GSS_C_AF_DSS = 18
GSS_C_AF_OSI = 19
GSS_C_AF_X25 = 21
GSS_C_AF_NULLADDR = 255
def channelBindings(**kwargs):
"""
Builds a gss_channel_bindings_struct which can be used to pass onto
L{authGSSClientStep} to bind onto the auth. Details on Channel Bindings
can be foud at https://tools.ietf.org/html/rfc5929. More details on the
struct can be found at
https://docs.oracle.com/cd/E19455-01/806-3814/overview-52/index.html
@param initiator_addrtype: Optional integer used to set the
initiator_addrtype, defaults to GSS_C_AF_UNSPEC if not set
@param initiator_address: Optional byte string containing the
initiator_address
@param acceptor_addrtype: Optional integer used to set the
acceptor_addrtype, defaults to GSS_C_AF_UNSPEC if not set
@param acceptor_address: Optional byte string containing the
acceptor_address
@param application_data: Optional byte string containing the
application_data. An example would be 'tls-server-end-point:{cert-hash}'
where {cert-hash} is the hash of the server's certificate
@return: A tuple of (result, gss_channel_bindings_struct) where result is
the result code and gss_channel_bindings_struct is the channel bindings
structure that can be passed onto L{authGSSClientStep}
"""
def authGSSClientStep(context, challenge, **kwargs):
"""
Processes a single GSSAPI client-side step using the supplied server data.
@param context: The context object returned from L{authGSSClientInit}.
@param challenge: A string containing the base64-encoded server data (which
may be empty for the first step).
@param channel_bindings: Optional channel bindings to bind onto the auth
request. This struct can be built using :{channelBindings}
and if not specified it will pass along GSS_C_NO_CHANNEL_BINDINGS as
a default.
@return: A result code (see above).
"""
def authGSSClientResponse(context):
"""
Get the client response from the last successful GSSAPI client-side step.
@param context: The context object returned from L{authGSSClientInit}.
@return: A string containing the base64-encoded client data to be sent to
the server.
"""
def authGSSClientResponseConf(context):
"""
Determine whether confidentiality was enabled in the previously unwrapped
buffer.
@param context: The context object returned from L{authGSSClientInit}.
@return: C{1} if confidentiality was enabled in the previously unwrapped
buffer, C{0} otherwise.
"""
def authGSSClientUserName(context):
"""
Get the user name of the principal authenticated via the now complete
GSSAPI client-side operations, or the current user name obtained via
authGSSClientInquireCred. This method must only be called after
authGSSClientStep or authGSSClientInquireCred return a complete response
code.
@param context: The context object returned from L{authGSSClientInit}.
@return: A string containing the user name.
"""
def authGSSClientUnwrap(context, challenge):
"""
Perform the client side GSSAPI unwrap step.
@param context: The context object returned from L{authGSSClientInit}.
@param challenge: A string containing the base64-encoded server data.
@return: A result code (see above).
"""
def authGSSClientWrap(context, data, user=None, protect=0):
"""
Perform the client side GSSAPI wrap step.
@param context: The context object returned from L{authGSSClientInit}.
@param data: The result of the L{authGSSClientResponse} after the
L{authGSSClientUnwrap}.
@param user: The user to authorize.
@param protect: If C{0}, then just provide integrity protection.
If C{1}, then provide confidentiality as well.
@return: A result code (see above).
"""
def authGSSServerInit(service):
"""
Initializes a context for GSSAPI server-side authentication with the given
service principal.
authGSSServerClean must be called after this function returns an OK result
to dispose of the context once all GSSAPI operations are complete.
@param service: A string containing the service principal in the form
C{"type@fqdn"}. To initialize the context for the purpose of accepting
delegated credentials, pass the literal string C{"DELEGATE"}.
@return: A tuple of (result, context) where result is the result code (see
above) and context is an opaque value that will need to be passed to
subsequent functions.
"""
def authGSSServerClean(context):
"""
Destroys the context for GSSAPI server-side authentication. This function
is provided for compatibility with earlier versions of PyKerberos but does
nothing. The context object destroys itself when it is reclaimed.
@param context: The context object returned from L{authGSSServerInit}.
@return: A result code (see above).
"""
def authGSSServerStep(context, challenge):
"""
Processes a single GSSAPI server-side step using the supplied client data.
@param context: The context object returned from L{authGSSServerInit}.
@param challenge: A string containing the base64-encoded client data.
@return: A result code (see above).
"""
def authGSSServerResponse(context):
"""
Get the server response from the last successful GSSAPI server-side step.
@param context: The context object returned from L{authGSSServerInit}.
@return: A string containing the base64-encoded server data to be sent to
the client.
"""
def authGSSServerHasDelegated(context):
"""
Checks whether a server context has delegated credentials.
@param context: The context object returned from L{authGSSServerInit}.
@return: A bool saying whether delegated credentials are available.
"""
def authGSSServerUserName(context):
"""
Get the user name of the principal trying to authenticate to the server.
This method must only be called after L{authGSSServerStep} returns a
complete or continue response code.
@param context: The context object returned from L{authGSSServerInit}.
@return: A string containing the user name.
"""
def authGSSServerTargetName(context):
"""
Get the target name if the server did not supply its own credentials.
This method must only be called after L{authGSSServerStep} returns a
complete or continue response code.
@param context: The context object returned from L{authGSSServerInit}.
@return: A string containing the target name.
"""
def authGSSServerStoreDelegate(context):
"""
Save the ticket sent to the server in the file C{/tmp/krb5_pyserv_XXXXXX}.
This method must only be called after L{authGSSServerStep} returns a
complete or continue response code.
@param context: The context object returned from L{authGSSServerInit}.
@return: A result code (see above).
"""
def authGSSServerCacheName(context):
"""
Get the name of the credential cache created with
L{authGSSServerStoreDelegate}.
This method must only be called after L{authGSSServerStoreDelegate}.
@param context: The context object returned from L{authGSSServerInit}.
@return: A string containing the cache name.
"""
# ==== tests/unit_tests/test_scaling.py (AppliedMechanics/AMfe, BSD-3-Clause) ====
"""Test routines for scaling module"""
from unittest import TestCase
import numpy as np
from numpy.testing import assert_array_almost_equal
from scipy.sparse import csr_matrix
from amfeti.scaling import MultiplicityScaling
from ..tools import CustomDictAssertTest
class MultiplicityScalingTest(TestCase):
def setUp(self):
self.scaling_dict = {'subA': MultiplicityScaling(),
'subB': MultiplicityScaling(),
'subC': MultiplicityScaling(),
'subD': MultiplicityScaling(),
'subE': MultiplicityScaling()}
self.B_dict = {'subA': {'interfaceAD': csr_matrix(np.array([[0, 0, 0, 1, 0],
[0, 0, 0, 0, 1]])),
'interfaceAE': csr_matrix(np.array([[1, 0, 0, 0, 0],
[0, 0, 0, 1, 0]])),
'interfaceAB': csr_matrix(np.array([[0, 0, 0, 1, 0]]))},
'subB': {'interfaceBD': csr_matrix(np.array([[1, 0, 0, 0],
[0, 0, 0, 1]])),
'interfaceBC': csr_matrix(np.array([[0, 1, 0, 0],
[0, 0, 1, 0]])),
'interfaceAB': csr_matrix(np.array([[0, 0, 0, -1]])),
'interfaceBE': csr_matrix(np.array([[0, 0, 1, 0],
[0, 0, 0, 1]]))},
'subC': {'interfaceBC': csr_matrix(np.array([[0, 0, 0, -1],
[0, -1, 0, 0]])),
'interfaceCE': csr_matrix(np.array([[1, 0, 0, 0],
[0, 1, 0, 0]]))},
'subD': {'interfaceAD': csr_matrix(np.array([[-1, 0, 0, 0, 0],
[0, -1, 0, 0, 0]])),
'interfaceBD': csr_matrix(np.array([[0, 0, 0, 0, -1],
[-1, 0, 0, 0, 0]])),
'interfaceDE': csr_matrix(np.array([[1, 0, 0, 0, 0]]))},
'subE': {'interfaceAE': csr_matrix(np.array([[0, 0, 0, -1, 0, 0],
[0, 0, 0, 0, -1, 0]])),
'interfaceBE': csr_matrix(np.array([[0, -1, 0, 0, 0, 0],
[0, 0, 0, 0, -1, 0]])),
'interfaceCE': csr_matrix(np.array([[0, 0, 0, 0, 0, -1],
[0, -1, 0, 0, 0, 0]])),
'interfaceDE': csr_matrix(np.array([[0, 0, 0, 0, -1, 0]]))}}
self.custom_asserter = CustomDictAssertTest()
def tearDown(self):
pass
def test_update(self):
scaling_mat_dict_desired = {'subA': {'interfaceAD': csr_matrix(np.array([[1/4, 0],
[0, 1/2]])),
'interfaceAE': csr_matrix(np.array([[1/2, 0],
[0, 1/4]])),
'interfaceAB': csr_matrix(np.array([[1/4]]))},
'subB': {'interfaceBD': csr_matrix(np.array([[1/2, 0],
[0, 1/4]])),
'interfaceBC': csr_matrix(np.array([[1/2, 0],
[0, 1/3]])),
'interfaceAB': csr_matrix(np.array([[1/4]])),
'interfaceBE': csr_matrix(np.array([[1/3, 0],
[0, 1/4]]))},
'subC': {'interfaceBC': csr_matrix(np.array([[1/2, 0],
[0, 1/3]])),
'interfaceCE': csr_matrix(np.array([[1/2, 0],
[0, 1/3]]))},
'subD': {'interfaceAD': csr_matrix(np.array([[1/4, 0],
[0, 1/2]])),
'interfaceBD': csr_matrix(np.array([[1/2, 0],
[0, 1/4]])),
'interfaceDE': csr_matrix(np.array([[1/4]]))},
'subE': {'interfaceAE': csr_matrix(np.array([[1/2, 0],
[0, 1/4]])),
'interfaceBE': csr_matrix(np.array([[1/3, 0],
[0, 1/4]])),
'interfaceCE': csr_matrix(np.array([[1/2, 0],
[0, 1/3]])),
'interfaceDE': csr_matrix(np.array([[1/4]]))}}
for subs, scaling in self.scaling_dict.items():
scaling.update(self.B_dict[subs])
scaling_mat_desired = scaling_mat_dict_desired[subs]
for interface, int_scaling in scaling.scaling_dict.items():
assert_array_almost_equal(int_scaling.todense(), scaling_mat_desired[interface].todense())
def test_apply(self):
gap_dict = {'subA': {'interfaceAD': np.array([2, 2]),
'interfaceAE': np.array([2, 2]),
'interfaceAB': np.array([2])},
'subB': {'interfaceBD': np.array([2, 2]),
'interfaceBC': np.array([2, 2]),
'interfaceAB': np.array([2]),
'interfaceBE': np.array([2, 2])},
'subC': {'interfaceBC': np.array([2, 2]),
'interfaceCE': np.array([2, 2])},
'subD': {'interfaceAD': np.array([2, 2]),
'interfaceBD': np.array([2, 2]),
'interfaceDE': np.array([2])},
'subE': {'interfaceAE': np.array([2, 2]),
'interfaceBE': np.array([2, 2]),
'interfaceCE': np.array([2, 2]),
'interfaceDE': np.array([2])}}
gap_dict_desired = {'subA': {'interfaceAD': np.array([1/2, 1]),
'interfaceAE': np.array([1, 1/2]),
'interfaceAB': np.array([1/2])},
'subB': {'interfaceBD': np.array([1, 1/2]),
'interfaceBC': np.array([1, 2/3]),
'interfaceAB': np.array([1/2]),
'interfaceBE': np.array([2/3, 1/2])},
'subC': {'interfaceBC': np.array([1, 2/3]),
'interfaceCE': np.array([1, 2/3])},
'subD': {'interfaceAD': np.array([1/2, 1]),
'interfaceBD': np.array([1, 1/2]),
'interfaceDE': np.array([1/2])},
'subE': {'interfaceAE': np.array([1, 1/2]),
'interfaceBE': np.array([2/3, 1/2]),
'interfaceCE': np.array([1, 2/3]),
'interfaceDE': np.array([1/2])}}
for subs, scaling in self.scaling_dict.items():
scaling.update(self.B_dict[subs])
gap_dict_actual = scaling.apply(gap_dict[subs])
self.custom_asserter.assert_dict_almost_equal(gap_dict_actual, gap_dict_desired[subs])
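The expected factors in these tests (1/2, 1/3, 1/4) are consistent with the usual multiplicity rule: each interface degree of freedom is weighted by the reciprocal of the number of subdomains that share it. A standalone sketch of that rule on made-up connectivity data (not the AMfeti API):

```python
# Toy connectivity: which subdomains share each interface DOF.
dof_owners = {
    "dof_a": ["subA", "subD"],                  # multiplicity 2
    "dof_b": ["subB", "subC", "subE"],          # multiplicity 3
    "dof_c": ["subA", "subB", "subD", "subE"],  # multiplicity 4
}
# Multiplicity scaling: weight = 1 / (number of sharing subdomains).
weights = {dof: 1.0 / len(owners) for dof, owners in dof_owners.items()}
```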
# ==== app/models2.py (migot01/Maintraq, MIT) ====
import uuid
from werkzeug.security import generate_password_hash, check_password_hash
from app.helpers import create_user, check_email,get_title,create_request
from datetime import datetime, timedelta
import jwt
from app import app
class User(object):
""" """
def __init__(self, email=None, first_name=None, last_name=None, password=None):
self.email = email
self.password = password
self.first_name = first_name
self.last_name = last_name
def save(self):
"""
creates a new user and appends to the list
"""
if check_email(self.email) is False:
            # return relevant message for email already registered
return ("Fail", "Email already exists")
else:
            # insert user, storing a hashed password rather than plaintext
            self.password = generate_password_hash(self.password)
res = create_user(self.email, self.first_name, self.last_name, self.password)
return ("Success", "User created!")
def generate_token(self, email, is_admin, UserID, role):
"""
Generates the Auth Token for the currently logging in user
:returns: string
"""
try:
payload = {
'exp': datetime.utcnow() + timedelta(minutes=60),
'iat': datetime.utcnow(),
'sub': email,
'user_id': UserID,
'role': role
}
return jwt.encode(
payload,
app.config["SECRET_KEY"],
algorithm='HS256'
)
except Exception as e:
return e
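For reference, a token of the kind generate_token produces can be reconstructed with the standard library alone: jwt.encode with HS256 is essentially base64url(header).base64url(payload) signed with HMAC-SHA256. This sketch is for illustration only (PyJWT remains the right tool), and the payload values are made up:

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> bytes:
    # JWT uses unpadded base64url segments.
    return base64.urlsafe_b64encode(data).rstrip(b"=")

def hs256_jwt(payload: dict, secret: bytes) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"},
                               separators=(",", ":")).encode())
    body = b64url(json.dumps(payload, separators=(",", ":")).encode())
    signing_input = header + b"." + body
    sig = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return (signing_input + b"." + sig).decode()

token = hs256_jwt({"sub": "user@example.com", "user_id": 1}, b"secret")
```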
class Requests(object):
"""request model. stores all request data"""
def __init__(self, title=None, location=None, body=None,UserID=None):
self.title = title
self.location = location
self.body = body
self.UserID = UserID
def add_requests(self):
"""Adds a new request to the requests dictionary"""
if get_title(self.title):
return ("Fail", "Title already exists")
else:
res = create_request(self.title,self.location,self.body,self.UserID)
return ("Success", res)
# ==== basic/samples/pdf/__init__.py (fplust/python3-cookbook, Apache-2.0) ====
#!/usr/bin/env python
# -*- encoding: utf-8 -*-
"""
Topic: sample
Desc :
Working with PDFs in Python:
1. To read content, use slate 0.4.1 (based on PDFMiner); it works only on Python 2
2. For all kinds of PDF operations (reading content, merging, splitting, rotating, extracting pages, etc.), use PyPDF2
Currently the best approach is to use PDFMiner; once it is installed:
pdf2txt.py -o pc.txt /home/mango/work/perfect.pdf
then analyze the resulting pc.txt file yourself
"""
# ==== intro-python/parsing-json/nested_data.py (Truxton-Marley/dne-dna-code, MIT) ====
#!/usr/bin/env python
"""Working with nested data hands-on exercise / coding challenge."""
import json
import os
import pprint
# Get the absolute path for the directory where this file is located "here"
here = os.path.abspath(os.path.dirname(__file__))
with open(os.path.join(here, "interfaces.json")) as file:
# TODO: Parse the contents of the JSON file into a variable
json_text = file.read()
# print("JSON text is a", type(json_text))
# print(json_text)
json_data = json.loads(json_text)
# print("JSON data is a", type(json_data))
print("\nJSON file as a python dict:\n")
pprint.pprint(json_data)
# TODO: Loop through the interfaces in the JSON data and print out each
# interface's name, ip, and netmask.
print("The following IPv4 interfaces are configured:\n")
for interface in json_data["ietf-interfaces:interfaces"]["interface"]:
print(interface["name"] + ":")
print(" IP:", interface["ietf-ip:ipv4"]["address"][0]["ip"])
print(" SM:", interface["ietf-ip:ipv4"]["address"][0]["netmask"], "\n") | 31.676471 | 80 | 0.673166 | 160 | 1,077 | 4.45625 | 0.43125 | 0.067321 | 0.036466 | 0.030856 | 0.075736 | 0.075736 | 0 | 0 | 0 | 0 | 0 | 0.005663 | 0.18013 | 1,077 | 34 | 80 | 31.676471 | 0.801812 | 0.389044 | 0 | 0 | 0 | 0 | 0.297214 | 0.040248 | 0 | 0 | 0 | 0.029412 | 0 | 1 | 0 | false | 0 | 0.214286 | 0 | 0.214286 | 0.5 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
# ==== sklearn_protocols/__init__.py (hmasdev/sklearn-protocols, MIT) ====
from .model import (
RegressorProtocol,
ClassifierProtocol,
ClassifierWithPredictProbaProtocol,
TransformerProtocol,
ClusterProtocol,
)
__version__ = "0.0"
__all__ = [
RegressorProtocol.__name__,
ClassifierProtocol.__name__,
ClassifierWithPredictProbaProtocol.__name__,
TransformerProtocol.__name__,
ClusterProtocol.__name__,
]
# ==== hsx_01/meiduo_mall/meiduo_mall/apps/goods/urls.py (hsx9527/test01, MIT) ====
from django.urls import re_path
from .views import ListView, HotGoodsView
urlpatterns = [
re_path(r'^list/(?P<category_id>\d+)/skus/$', ListView.as_view()),
re_path(r'^hot/(?P<category_id>\d+)/$', HotGoodsView.as_view()),
# re_path(r'^search/$', MySearchView()),
]
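The named group in these routes can be exercised with plain re (the category id below is arbitrary; Django strips the leading slash before matching, so the path has no leading "/"):

```python
import re

# Same pattern as the first route above.
pattern = re.compile(r'^list/(?P<category_id>\d+)/skus/$')
match = pattern.match('list/115/skus/')
category_id = match.group('category_id')
```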
# ==== kubeflow_batch_predict/dataflow/_aggregators.py (yupbank/batch-predict, Apache-2.0) ====
# Copyright 2018 Google Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Metric library for Kubeflow batch prediction dataflow transforms.
"""
# TODO(user): Get rid of this file and instead just use counters and other
# metrics directly in the body of the pertinent DoFns.
from apache_beam.metrics import Metrics
class AggregatorName(object):
"""Names of the metrics."""
ML_PREDICTIONS = "ml-predictions"
ML_FAILED_PREDICTIONS = "ml-failed-predictions"
# The aggregator config.
CONFIG_ = [AggregatorName.ML_PREDICTIONS, AggregatorName.ML_FAILED_PREDICTIONS]
def CreateAggregatorsDict(namespace="main"):
"""Creates metrics dict."""
return {name: Metrics.counter(namespace, name) for name in CONFIG_}
# ==== contact/models.py (agiledesign2/django-base-ajax, MIT) ====
from django.db import models
# Create your models here.
class Contact(models.Model):
name = models.CharField(max_length = 100)
email = models.EmailField()
message = models.TextField()
timestamp = models.DateTimeField(auto_now_add = True)
def __str__(self):
        return self.name
# ==== chemdataextractor/scrape/__init__.py (edbeard/chemdataextractor-uvvis2018, MIT) ====
# -*- coding: utf-8 -*-
"""
chemdataextractor.scrape
~~~~~~~~~~~~~~~~~~~~~~~~
Declarative scraping framework for extracting structured data from HTML and XML documents.
:copyright: Copyright 2016 by Matt Swain.
:license: MIT, see LICENSE file for more details.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals
#: Block level HTML elements
BLOCK_ELEMENTS = {
'p', 'h1', 'h2', 'h3', 'h4', 'h5', 'h6', 'ul', 'ol', 'pre', 'dd', 'dl', 'div', 'noscript', 'blockquote', 'form',
'hr', 'table', 'fieldset', 'address', 'article', 'aside', 'audio', 'canvas', 'figcaption', 'figure', 'footer',
'header', 'hgroup', 'output', 'section', 'body', 'head', 'title', 'tr', 'td', 'th', 'thead', 'tfoot', 'dt', 'li',
'tbody',
}
#: Inline level HTML elements
INLINE_ELEMENTS = {
'b', 'big', 'i', 'small', 'tt', 'abbr', 'acronym', 'cite', 'code', 'dfn', 'em', 'kbd', 'strong', 'samp', 'var',
'a', 'bdo', 'br', 'img', 'map', 'object', 'q', 'script', 'span', 'sub', 'sup', 'button', 'input', 'label',
'select', 'textarea', 'blink', 'font', 'marquee', 'nobr', 's', 'strike', 'u', 'wbr',
}
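A typical use of element sets like these is deciding whether a tag introduces a break when flattening markup to text. A minimal sketch with abridged copies of the sets (not the library's own logic):

```python
# Abridged copies of the two sets, just for the demo.
BLOCK = {'p', 'div', 'table', 'ul'}
INLINE = {'span', 'b', 'a', 'code'}

def display_mode(tag):
    # Classify a tag name by its default display behaviour.
    if tag in BLOCK:
        return 'block'
    if tag in INLINE:
        return 'inline'
    return 'unknown'
```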
from .clean import Cleaner, clean, clean_html, clean_markup
from .entity import Entity, EntityList, DocumentEntity
from .fields import StringField, IntField, FloatField, BoolField, DateTimeField, EntityField, UrlField
from .scraper import HtmlFormat, XmlFormat, GetRequester, PostRequester, UrlScraper, RssScraper, SearchScraper
from .selector import Selector, SelectorList
from .pub.nlm import NlmXmlDocument
from .pub.rsc import RscHtmlDocument
from .pub.springer import SpringerXmlDocument
# ==== tests/parsing/test_contract_items.py (Morloth1274/EVE-Online-POCO-manager, MIT) ====
from tests.compat import unittest
from tests.utils import make_api_result
from evelink.parsing import contract_items as evelink_c
class ContractItemsTestCase(unittest.TestCase):
def test_parse_contract_items(self):
api_result, _, _ = make_api_result("char/contract_items.xml")
result = evelink_c.parse_contract_items(api_result)
self.assertEqual(result, [
{'id': 779703190, 'quantity': 490, 'type_id': 17867, 'action': 'offered', 'singleton': False},
{'id': 779703191, 'quantity': 60, 'type_id': 17868, 'action': 'offered', 'singleton': False},
{'id': 779703192, 'quantity': 8360, 'type_id': 1228, 'action': 'offered', 'singleton': False},
{'id': 779703193, 'quantity': 16617, 'type_id': 1228, 'action': 'offered', 'singleton': False},
{'id': 779703194, 'quantity': 1, 'type_id': 973, 'action': 'offered', 'singleton': True, 'raw_quantity': -2},
])
# ==== newest/state.py (larsjsol/Newest, MIT) ====
#!/usr/bin/env python
from datetime import datetime, timezone
from zmq.utils.jsonapi import dumps, loads
from dateutil.parser import parser
class State:
"""
The current known state of software versions.
"""
protocol_version = (0, 1)
def __init__(self):
self.software_versions = {}
def serialize(self):
"""
Return a representation that is suitable for
transmitting over the network.
"""
_dict = {}
_dict['protocol_version'] = State.protocol_version
_dict['software_versions'] = [v.to_dict() for v in self.software_versions.values()]
return dumps(_dict)
@staticmethod
def deserialize(json):
"""
Deserialize a json into a State object.
"""
state = State()
_dict = loads(json)
if tuple(_dict['protocol_version']) > State.protocol_version:
raise Exception('Protocol version {} not supported'.format(_dict['protocol_version']))
for ver in _dict['software_versions']:
state.update(ver['name'], ver['version'], parser().parse(ver['last_updated']))
return state
def update(self, software_name, version_id, last_updated=None):
"""
Update the state with a new software version.
"""
self.software_versions[software_name] = SoftwareVersion(software_name, version_id, last_updated)
def __str__(self):
return '\n'.join([str(k) for k in self.software_versions.values()])
class SoftwareVersion:
"""
    The latest version for a single piece of a software package.
    name (string): name of the software.
    version (string): a string (often a version number or a hash) identifying this version.
    last_updated (datetime): last time this entry was updated
"""
def __init__(self, name, version_id, last_updated=None):
self.name = name
self.version = version_id
if last_updated:
self.last_updated = last_updated
else:
self.last_updated = datetime.now(timezone.utc)
def __str__(self):
return "{} {} last updated: {}".format(self.name, self.version, self.last_updated.isoformat())
def to_dict(self):
"""
Return this object as a dict. Helper function for serialization
"""
return {'name': self.name, 'version': self.version, 'last_updated': self.last_updated.isoformat()}
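The wire format produced by serialize can be illustrated with stdlib json, which here stands in for zmq.utils.jsonapi (a thin wrapper over a JSON implementation); the software entry below is made up:

```python
import json
from datetime import datetime, timezone

# Hand-built payload in the same shape State.serialize produces.
payload = {
    "protocol_version": [0, 1],
    "software_versions": [
        {"name": "demo-app", "version": "1.2.3",
         "last_updated": datetime.now(timezone.utc).isoformat()},
    ],
}
wire = json.dumps(payload)
decoded = json.loads(wire)
```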
# ==== proxylist/base.py (lorien/proxylist, MIT) ====
"""
Module to load proxy list and select/rotate proxies from it.
"""
from __future__ import absolute_import
import re
import itertools
import logging
from random import randint
import six
from six.moves.urllib.request import urlopen
__all__ = ('Proxy', 'InvalidProxyLine', 'FileProxySource',
'WebProxySource', 'ListProxySource', 'ProxyList')
RE_SIMPLE_PROXY = re.compile(r'^([^:]+):([^:]+)$')
RE_AUTH_PROXY = re.compile(r'^([^:]+):([^:]+):([^:]+):([^:]+)$')
PROXY_STANDARD_ATTRS = ('host', 'port', 'username', 'password')
logger = logging.getLogger('proxylist')
class Proxy(object):
__slots__ = ('host', 'port', 'username', 'password',
'proxy_type', 'meta')
def __init__(self, host=None, port=None, username=None,
password=None, proxy_type=None, meta=None):
self.host = host
self.port = port
self.username = username
self.password = password
self.proxy_type = proxy_type
if meta is None:
self.meta = {}
else:
self.meta = meta
def __str__(self):
return '<Proxy %s:%s>' % (self.host, self.port)
def address(self):
return '%s:%s' % (self.host, self.port)
def get_address(self):
# TODO: deprecation warning
return self.address()
def userpwd(self):
if self.username:
return '%s:%s' % (self.username, self.password or '')
else:
return None
class ProxyListError(Exception):
pass
class InvalidProxyLine(ProxyListError):
pass
class NotEnoughData(ProxyListError):
pass
def parse_proxy_line(line):
"""
Parse proxy details from the raw text line.
The text line could be in one of the following formats:
* host:port
* host:port:username:password
"""
line = line.strip()
match = RE_SIMPLE_PROXY.search(line)
if match:
return match.group(1), match.group(2), None, None
match = RE_AUTH_PROXY.search(line)
if match:
host, port, user, pwd = match.groups()
return host, port, user, pwd
raise InvalidProxyLine('Invalid proxy line: %s' % line)
def parse_raw_list_data(data, proxy_type='http', proxy_userpwd=None,
item_format=None):
"Iterate over proxy servers found in the raw data"
if not isinstance(data, six.text_type):
data = data.decode('utf-8')
if item_format is None:
rex_format = None
else:
strict_item_format = '^%s$' % item_format.lstrip('^').rstrip('$')
rex_format = re.compile(strict_item_format, re.I)
for orig_line in data.splitlines():
line = orig_line.strip()
        # If no format is defined then try a few common
        # formats to extract proxy data from each item in the list
if rex_format is None:
line = line.replace(' ', '')
if line and not line.startswith('#'):
try:
host, port, username, password = parse_proxy_line(line)
except InvalidProxyLine as ex:
logger.error(ex)
continue
else:
proxy = Proxy(host, port, username, password, proxy_type)
else:
continue
else:
match = rex_format.match(line)
if match is None:
ex = InvalidProxyLine('Proxy line %s does not match format %s'
% (line, item_format))
logger.error(ex)
continue
data = match.groupdict()
meta = dict((x, y) for x, y in data.items()
if x not in PROXY_STANDARD_ATTRS)
proxy = Proxy(data['host'], data['port'],
data.get('username'), data.get('password'),
proxy_type=proxy_type, meta=meta)
for key in ('host', 'port', 'username', 'password', 'proxy_type'):
if key in data:
del data[key]
if proxy.username is None and proxy_userpwd is not None:
username, password = proxy_userpwd.split(':')
proxy.username = username
proxy.password = password
yield proxy
class BaseProxySource(object):
def __init__(self, proxy_type='http', proxy_userpwd=None,
item_format=None, filter_callback=None, **kwargs):
self.proxy_type = proxy_type
self.item_format = item_format
self.proxy_userpwd = proxy_userpwd
if filter_callback is not None:
self.filter_callback = filter_callback
else:
self.filter_callback = lambda x: True
def load_raw_data(self):
raise NotImplementedError
def load(self):
data = self.load_raw_data()
return list(filter(self.filter_callback, parse_raw_list_data(
data,
proxy_type=self.proxy_type,
proxy_userpwd=self.proxy_userpwd,
item_format=self.item_format,
)))
class FileProxySource(BaseProxySource):
"Proxy source that loads list from the file"
def __init__(self, path, **kwargs):
self.path = path
super(FileProxySource, self).__init__(**kwargs)
def load_raw_data(self):
with open(self.path) as inp:
return inp.read()
class WebProxySource(BaseProxySource):
"Proxy source that loads list from web resource"
def __init__(self, url, timeout=5, try_count=3, **kwargs):
self.url = url
self.timeout = timeout
self.try_count = try_count
super(WebProxySource, self).__init__(**kwargs)
def load_raw_data(self):
for count in range(self.try_count):
try:
data = urlopen(url=self.url, timeout=self.timeout).read()
except Exception as ex: # TODO: more specific exceptions
if count >= (self.try_count - 1):
raise
else:
return data.decode('utf-8')
class ListProxySource(BaseProxySource):
"""That proxy source that loads list from
python list of strings"""
def __init__(self, items, **kwargs):
self.items = items
super(ListProxySource, self).__init__(**kwargs)
def load_raw_data(self):
return '\n'.join(self.items)
class ProxyList(object):
"""
Class to work with proxy list.
"""
def __init__(self, source=None, from_file=None, from_url=None, meta=None):
"""
        Create a ProxyList object and optionally initialize the proxy source.
        Only one of (source, from_file, from_url)
        may be defined
"""
assert sum(1 for x in (source, from_file, from_url)
if x is not None) <= 1
if from_file:
source = FileProxySource(from_file)
elif from_url:
source = WebProxySource(from_url)
self.set_source(source)
self.meta = meta or {}
def set_source(self, source):
"Set the proxy source and use it to load proxy list"
self._source = source
if source:
self.load()
else:
self._list = []
self._list_iter = None
def load_file(self, path, **kwargs):
"Load proxy list from file"
self.set_source(FileProxySource(path, **kwargs))
def load_url(self, url, **kwargs):
"Load proxy list from web document"
self.set_source(WebProxySource(url, **kwargs))
def load_list(self, items, **kwargs):
"Load proxy list from python list"
self.set_source(ListProxySource(items, **kwargs))
def load(self):
"Load proxy list from configured proxy source"
self._list = self._source.load()
self._list_iter = itertools.cycle(self._list)
def random(self):
"Return random proxy"
idx = randint(0, len(self._list) - 1)
return self._list[idx]
def next(self):
"Return next proxy"
return next(self._list_iter)
def size(self):
"Return number of proxies in the list"
return len(self._list)
def __iter__(self):
return iter(self._list)
def __len__(self):
return len(self._list)
def __getitem__(self, key):
return self._list[key]
# Deprecated stuff
def get_next_proxy(self):
"Return next proxy"
# TODO: deprecation warning
return self.next()
def get_random_proxy(self):
"Return random proxy"
# TODO: deprecation warning
return self.random()
| 30.365248 | 78 | 0.588929 | 1,022 | 8,563 | 4.748532 | 0.188845 | 0.025963 | 0.016073 | 0.029672 | 0.174943 | 0.115393 | 0.073975 | 0.037503 | 0.017721 | 0 | 0 | 0.001852 | 0.306435 | 8,563 | 281 | 79 | 30.47331 | 0.815289 | 0.127759 | 0 | 0.173267 | 0 | 0 | 0.101589 | 0.004228 | 0 | 0 | 0 | 0.010676 | 0.004951 | 1 | 0.148515 | false | 0.069307 | 0.034653 | 0.034653 | 0.326733 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
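The `parse_proxy_line` function in proxylist/base.py above recognizes two line formats (`host:port` and `host:port:username:password`). A standalone sketch of the same regex logic (re-implemented here rather than importing the module, with made-up proxy addresses) shows how each format splits:

```python
import re

# Same two formats parse_proxy_line above recognizes.
RE_SIMPLE = re.compile(r'^([^:]+):([^:]+)$')
RE_AUTH = re.compile(r'^([^:]+):([^:]+):([^:]+):([^:]+)$')

def parse(line):
    """Return (host, port, username, password) from a raw proxy line."""
    line = line.strip()
    m = RE_SIMPLE.search(line)
    if m:
        return m.group(1), m.group(2), None, None
    m = RE_AUTH.search(line)
    if m:
        return m.groups()
    raise ValueError('Invalid proxy line: %s' % line)

print(parse('10.0.0.1:8080'))              # ('10.0.0.1', '8080', None, None)
print(parse('10.0.0.2:3128:user:secret'))  # ('10.0.0.2', '3128', 'user', 'secret')
```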
07a03fb145604472b82a3c923653a7dffcd6b60c | 12,805 | py | Python | dagobah/daemon/api.py | usertesting/dagobah | a60f921900f68708d19c68227349081d31737cd6 | [
"WTFPL"
] | 574 | 2015-01-01T23:43:17.000Z | 2022-03-29T13:15:57.000Z | dagobah/daemon/api.py | yutiansut/dagobah | e624180c2291034960302c9e0b818b65b5a7ee11 | [
"WTFPL"
] | 72 | 2015-01-09T22:10:52.000Z | 2020-10-06T05:13:06.000Z | dagobah/daemon/api.py | yutiansut/dagobah | e624180c2291034960302c9e0b818b65b5a7ee11 | [
"WTFPL"
] | 157 | 2015-01-04T02:40:50.000Z | 2021-12-20T08:13:22.000Z | """ HTTP API methods for Dagobah daemon. """
import StringIO
import json
from flask import request, abort, send_file
from flask_login import login_required
from .daemon import app
from .util import validate_dict, api_call, allowed_file
dagobah = app.config['dagobah']
@app.route('/api/jobs', methods=['GET'])
@login_required
@api_call
def get_jobs():
return dagobah._serialize().get('jobs', {})
@app.route('/api/job', methods=['GET'])
@login_required
@api_call
def get_job():
args = dict(request.args)
if not validate_dict(args,
required=['job_name'],
job_name=str):
abort(400)
job = dagobah.get_job(args['job_name'])
if not job:
abort(400)
return job._serialize()
@app.route('/api/logs', methods=['GET'])
@login_required
@api_call
def get_run_log_history():
args = dict(request.args)
if not validate_dict(args,
required=['job_name', 'task_name'],
job_name=str,
task_name=str):
abort(400)
job = dagobah.get_job(args['job_name'])
task = job.tasks.get(args['task_name'], None)
if not task:
abort(400)
return task.get_run_log_history()
@app.route('/api/log', methods=['GET'])
@login_required
@api_call
def get_log():
args = dict(request.args)
if not validate_dict(args,
required=['job_name', 'task_name', 'log_id'],
job_name=str,
task_name=str,
log_id=str):
abort(400)
job = dagobah.get_job(args['job_name'])
task = job.tasks.get(args['task_name'], None)
if not task:
abort(400)
return task.get_run_log(args['log_id'])
@app.route('/api/head', methods=['GET'])
@login_required
@api_call
def head_task():
args = dict(request.args)
if not validate_dict(args,
required=['job_name', 'task_name'],
job_name=str,
task_name=str,
stream=str,
num_lines=int):
abort(400)
job = dagobah.get_job(args['job_name'])
task = job.tasks.get(args['task_name'], None)
if not task:
abort(400)
call_args = {}
for key in ['stream', 'num_lines']:
if key in args:
call_args[key] = args[key]
return task.head(**call_args)
@app.route('/api/tail', methods=['GET'])
@login_required
@api_call
def tail_task():
args = dict(request.args)
if not validate_dict(args,
required=['job_name', 'task_name'],
job_name=str,
task_name=str,
stream=str,
num_lines=int):
abort(400)
job = dagobah.get_job(args['job_name'])
task = job.tasks.get(args['task_name'], None)
if not task:
abort(400)
call_args = {}
for key in ['stream', 'num_lines']:
if key in args:
call_args[key] = args[key]
return task.tail(**call_args)
@app.route('/api/add_job', methods=['POST'])
@login_required
@api_call
def add_job():
args = dict(request.form)
if not validate_dict(args,
required=['job_name'],
job_name=str):
abort(400)
dagobah.add_job(args['job_name'])
@app.route('/api/delete_job', methods=['POST'])
@login_required
@api_call
def delete_job():
args = dict(request.form)
if not validate_dict(args,
required=['job_name'],
job_name=str):
abort(400)
dagobah.delete_job(args['job_name'])
@app.route('/api/start_job', methods=['POST'])
@login_required
@api_call
def start_job():
args = dict(request.form)
if not validate_dict(args,
required=['job_name'],
job_name=str):
abort(400)
job = dagobah.get_job(args['job_name'])
job.start()
@app.route('/api/retry_job', methods=['POST'])
@login_required
@api_call
def retry_job():
args = dict(request.form)
if not validate_dict(args,
required=['job_name'],
job_name=str):
abort(400)
job = dagobah.get_job(args['job_name'])
job.retry()
@app.route('/api/add_task_to_job', methods=['POST'])
@login_required
@api_call
def add_task_to_job():
args = dict(request.form)
if not validate_dict(args,
required=['job_name', 'task_command', 'task_name'],
job_name=str,
task_command=str,
task_name=str,
task_target=str):
abort(400)
dagobah.add_task_to_job(args['job_name'],
args['task_command'],
args['task_name'],
hostname=args.get("task_target", None))
@app.route('/api/delete_task', methods=['POST'])
@login_required
@api_call
def delete_task():
args = dict(request.form)
if not validate_dict(args,
required=['job_name', 'task_name'],
job_name=str,
task_name=str):
abort(400)
job = dagobah.get_job(args['job_name'])
job.delete_task(args['task_name'])
@app.route('/api/add_dependency', methods=['POST'])
@login_required
@api_call
def add_dependency():
args = dict(request.form)
if not validate_dict(args,
required=['job_name',
'from_task_name',
'to_task_name'],
job_name=str,
from_task_name=str,
to_task_name=str):
abort(400)
job = dagobah.get_job(args['job_name'])
job.add_dependency(args['from_task_name'], args['to_task_name'])
@app.route('/api/delete_dependency', methods=['POST'])
@login_required
@api_call
def delete_dependency():
args = dict(request.form)
if not validate_dict(args,
required=['job_name',
'from_task_name',
'to_task_name'],
job_name=str,
from_task_name=str,
to_task_name=str):
abort(400)
job = dagobah.get_job(args['job_name'])
job.delete_dependency(args['from_task_name'], args['to_task_name'])
@app.route('/api/schedule_job', methods=['POST'])
@login_required
@api_call
def schedule_job():
args = dict(request.form)
if not validate_dict(args,
required=['job_name', 'cron_schedule'],
job_name=str,
cron_schedule=str):
abort(400)
if args['cron_schedule'] == '':
args['cron_schedule'] = None
job = dagobah.get_job(args['job_name'])
job.schedule(args['cron_schedule'])
@app.route('/api/stop_scheduler', methods=['POST'])
@login_required
@api_call
def stop_scheduler():
dagobah.scheduler.stop()
@app.route('/api/restart_scheduler', methods=['POST'])
@login_required
@api_call
def restart_scheduler():
dagobah.scheduler.restart()
@app.route('/api/terminate_all_tasks', methods=['POST'])
@login_required
@api_call
def terminate_all_tasks():
args = dict(request.form)
if not validate_dict(args,
required=['job_name'],
job_name=str):
abort(400)
job = dagobah.get_job(args['job_name'])
job.terminate_all()
@app.route('/api/kill_all_tasks', methods=['POST'])
@login_required
@api_call
def kill_all_tasks():
args = dict(request.form)
if not validate_dict(args,
required=['job_name'],
job_name=str):
abort(400)
job = dagobah.get_job(args['job_name'])
job.kill_all()
@app.route('/api/terminate_task', methods=['POST'])
@login_required
@api_call
def terminate_task():
args = dict(request.form)
if not validate_dict(args,
required=['job_name', 'task_name'],
job_name=str,
task_name=str):
abort(400)
job = dagobah.get_job(args['job_name'])
task = job.tasks.get(args['task_name'], None)
if not task:
abort(400)
task.terminate()
@app.route('/api/kill_task', methods=['POST'])
@login_required
@api_call
def kill_task():
args = dict(request.form)
if not validate_dict(args,
required=['job_name', 'task_name'],
job_name=str,
task_name=str):
abort(400)
job = dagobah.get_job(args['job_name'])
task = job.tasks.get(args['task_name'], None)
if not task:
abort(400)
task.kill()
@app.route('/api/edit_job', methods=['POST'])
@login_required
@api_call
def edit_job():
args = dict(request.form)
if not validate_dict(args,
required=['job_name'],
job_name=str,
name=str):
abort(400)
job = dagobah.get_job(args['job_name'])
del args['job_name']
job.edit(**args)
@app.route('/api/update_job_notes', methods=['POST'])
@login_required
@api_call
def update_job_notes():
args = dict(request.form)
if not validate_dict(args,
required=['job_name', 'notes'],
job_name=str,
notes=str):
abort(400)
job = dagobah.get_job(args['job_name'])
job.update_job_notes(args['notes'])
@app.route('/api/edit_task', methods=['POST'])
@login_required
@api_call
def edit_task():
args = dict(request.form)
if not validate_dict(args,
required=['job_name', 'task_name'],
job_name=str,
task_name=str,
name=str,
command=str,
soft_timeout=int,
hard_timeout=int,
hostname=str):
abort(400)
job = dagobah.get_job(args['job_name'])
task = job.tasks.get(args['task_name'], None)
if not task:
abort(400)
# validate host
if 'hostname' in args and args.get('hostname') not in dagobah.get_hosts():
# Check for empty host, if so then task is no longer remote
if not args.get('hostname'):
args['hostname'] = None
else:
abort(400)
del args['job_name']
del args['task_name']
job.edit_task(task.name, **args)
@app.route('/api/set_soft_timeout', methods=['POST'])
@login_required
@api_call
def set_soft_timeout():
args = dict(request.form)
if not validate_dict(args,
required=['job_name', 'task_name', 'soft_timeout'],
job_name=str,
task_name=str,
soft_timeout=int):
abort(400)
job = dagobah.get_job(args['job_name'])
task = job.tasks.get(args['task_name'], None)
if not task:
abort(400)
task.set_soft_timeout(args['soft_timeout'])
@app.route('/api/set_hard_timeout', methods=['POST'])
@login_required
@api_call
def set_hard_timeout():
args = dict(request.form)
if not validate_dict(args,
required=['job_name', 'task_name', 'hard_timeout'],
job_name=str,
task_name=str,
hard_timeout=int):
abort(400)
job = dagobah.get_job(args['job_name'])
task = job.tasks.get(args['task_name'], None)
if not task:
abort(400)
task.set_hard_timeout(args['hard_timeout'])
@app.route('/api/export_job', methods=['GET'])
@login_required
def export_job():
args = dict(request.args)
if not validate_dict(args,
required=['job_name'],
job_name=str):
abort(400)
job = dagobah.get_job(args['job_name'])
to_send = StringIO.StringIO()
to_send.write(json.dumps(job._serialize(strict_json=True)))
to_send.write('\n')
to_send.seek(0)
return send_file(to_send,
attachment_filename='%s.json' % job.name,
as_attachment=True)
@app.route('/api/import_job', methods=['POST'])
@login_required
@api_call
def import_job():
file = request.files['file']
if (file and allowed_file(file.filename, ['json'])):
dagobah.add_job_from_json(file.read(), destructive=True)
@app.route('/api/hosts', methods=['GET'])
@login_required
@api_call
def get_hosts():
return dagobah.get_hosts()
| 26.456612 | 78 | 0.552284 | 1,567 | 12,805 | 4.271857 | 0.068283 | 0.078428 | 0.047655 | 0.083657 | 0.734389 | 0.709591 | 0.703167 | 0.672393 | 0.550941 | 0.513295 | 0 | 0.012066 | 0.31394 | 12,805 | 483 | 79 | 26.511387 | 0.749915 | 0.00859 | 0 | 0.658031 | 0 | 0 | 0.119335 | 0.010326 | 0 | 0 | 0 | 0 | 0 | 1 | 0.07513 | false | 0 | 0.020725 | 0.005181 | 0.11658 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
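Nearly every handler in dagobah/daemon/api.py above gates on `validate_dict(args, required=[...], key=type, ...)`. The real helper lives in dagobah/daemon/util.py and is not shown here; the following is a hypothetical minimal re-implementation of the checking pattern the handlers rely on (required keys must be present, and any key given a type keyword must be convertible to that type):

```python
def validate_dict(args, required=None, **types):
    """Sketch of the request-argument check used by the API handlers:
    reject if a required key is missing or a value fails type conversion."""
    for key in (required or []):
        if key not in args:
            return False
    for key, typ in types.items():
        if key in args:
            try:
                typ(args[key])
            except (TypeError, ValueError):
                return False
    return True

print(validate_dict({'job_name': 'etl'}, required=['job_name'], job_name=str))  # True
print(validate_dict({}, required=['job_name'], job_name=str))                   # False
```

Under this reading, the handlers respond with HTTP 400 whenever the submitted form data fails either check.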
07a24619073ef83553511b2ec9964a1d2db5833d | 185 | py | Python | src/check.py | AnnFernandes/Handwritten-Line-Text-Recognition-using-Deep-Learning-with-Tensorflow | f04dc4a9c57eb50228dead30cf5bd414ecb8606d | [
"Apache-2.0"
] | 1 | 2019-11-29T07:10:05.000Z | 2019-11-29T07:10:05.000Z | src/check.py | AnnFernandes/Handwritten-Line-Text-Recognition-using-Deep-Learning-with-Tensorflow | f04dc4a9c57eb50228dead30cf5bd414ecb8606d | [
"Apache-2.0"
] | null | null | null | src/check.py | AnnFernandes/Handwritten-Line-Text-Recognition-using-Deep-Learning-with-Tensorflow | f04dc4a9c57eb50228dead30cf5bd414ecb8606d | [
"Apache-2.0"
] | null | null | null | import numpy as np
batch_size = 10
img_size = (800, 64)
random_matrix = np.random.random((batch_size, img_size[0], img_size[1]))
print(random_matrix)
print(random_matrix.shape) | 23.125 | 73 | 0.735135 | 31 | 185 | 4.129032 | 0.516129 | 0.164063 | 0.265625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.056604 | 0.140541 | 185 | 8 | 74 | 23.125 | 0.748428 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.166667 | 0.333333 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
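The check.py script above only prints the batch it builds; the properties it is sanity-checking — a `(batch_size, width, height)` array of uniform samples in `[0, 1)` — can be asserted directly, as in this sketch:

```python
import numpy as np

batch_size = 10
img_size = (800, 64)
random_matrix = np.random.random((batch_size, img_size[0], img_size[1]))

# batch_size stacked images of width 800 and height 64,
# each entry drawn uniformly from [0, 1).
assert random_matrix.shape == (10, 800, 64)
assert float(random_matrix.min()) >= 0.0
assert float(random_matrix.max()) < 1.0
print(random_matrix.shape)  # (10, 800, 64)
```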
07bcdd6e0b82c5c3cdc6e0362b69c6492c13915f | 26,852 | py | Python | demo/apm.py | kingdavid72/APMonitor | e03760f196a25f77c7c73082d1db652fac0b3a01 | [
"BSD-2-Clause-FreeBSD"
] | 4 | 2020-05-19T06:23:39.000Z | 2022-02-19T17:42:17.000Z | apm.py | KGkiotsalitis/bus-dispatching-in-rolling-horizons | 22594644615519e6ef62cfb83ffca1e51e8238cb | [
"MIT"
] | 1 | 2019-12-19T01:28:28.000Z | 2019-12-19T19:13:13.000Z | apm.py | KGkiotsalitis/bus-dispatching-in-rolling-horizons | 22594644615519e6ef62cfb83ffca1e51e8238cb | [
"MIT"
] | 9 | 2018-05-19T06:15:19.000Z | 2022-02-19T17:42:35.000Z | # Import
import csv
import math
import os
import random
import string
import time
import webbrowser
from contextlib import closing
import sys
# Get Python version
ver = sys.version_info[0]
#print('Version: '+str(ver))
if ver==2: # Python 2
import urllib
else: # Python 3+
import urllib.request, urllib.parse, urllib.error
#import socket
if ver==2: # Python 2
def cmd(server, app, aline):
'''Send a request to the server \n \
server = address of server \n \
app = application name \n \
aline = line to send to server \n'''
try:
# Web-server URL address
url_base = string.strip(server) + '/online/apm_line.php'
app = app.lower()
app.replace(" ", "")
params = urllib.urlencode({'p': app, 'a': aline})
f = urllib.urlopen(url_base, params)
# Stream solution output
if(aline=='solve'):
line = ''
while True:
char = f.read(1)
if not char:
break
elif char == '\n':
print(line)
line = ''
else:
line += char
# Send request to web-server
response = f.read()
except:
response = 'Failed to connect to server'
return response
def load_model(server,app,filename):
'''Load APM model file \n \
server = address of server \n \
app = application name \n \
filename = APM file name'''
# Load APM File
f = open(filename,'r')
aline = f.read()
f.close()
app = app.lower()
app.replace(" ","")
response = cmd(server,app,' '+aline)
return
def load_data(server,app,filename):
'''Load CSV data file \n \
server = address of server \n \
app = application name \n \
filename = CSV file name'''
# Load CSV File
f = open(filename,'r')
aline = f.read()
f.close()
app = app.lower()
app.replace(" ","")
response = cmd(server,app,'csv '+aline)
return
def get_ip(server):
'''Get current IP address \n \
server = address of server'''
# get ip address for web-address lookup
url_base = string.strip(server) + '/ip.php'
f = urllib.urlopen(url_base)
ip = string.strip(f.read())
return ip
def apm_t0(server,app,mode):
'''Retrieve restart file \n \
server = address of server \n \
app = application name \n \
mode = {'ss','mpu','rto','sim','est','ctl'} '''
# Retrieve IP address
ip = get_ip(server)
# Web-server URL address
app = app.lower()
app.replace(" ","")
url = string.strip(server) + '/online/' + ip + '_' + app + '/' + string.strip(mode) + '.t0'
f = urllib.urlopen(url)
# Send request to web-server
solution = f.read()
return solution
def get_solution(server,app):
'''Retrieve solution results\n \
server = address of server \n \
app = application name '''
# Retrieve IP address
ip = get_ip(server)
# Web-server URL address
app = app.lower()
app.replace(" ","")
url = string.strip(server) + '/online/' + ip + '_' + app + '/results.csv'
f = urllib.urlopen(url)
# Send request to web-server
solution = f.read()
# Write the file
sol_file = 'solution_' + app + '.csv'
fh = open(sol_file,'w')
# possible problem here if file isn't able to open (see MATLAB equivalent)
fh.write(solution.replace('\r',''))
fh.close()
# Use array package
from array import array
# Import CSV file from web server
with closing(urllib.urlopen(url)) as f:
reader = csv.reader(f, delimiter=',')
y={}
for row in reader:
if len(row)==2:
y[row[0]] = float(row[1])
else:
y[row[0]] = array('f', [float(col) for col in row[1:]])
# Return solution
return y
def get_file(server,app,filename):
'''Retrieve any file from web-server\n \
server = address of server \n \
app = application name '''
# Retrieve IP address
ip = get_ip(server)
# Web-server URL address
app = app.lower()
app.replace(" ","")
url = string.strip(server) + '/online/' + ip + '_' + app + '/' + filename
f = urllib.urlopen(url)
# Send request to web-server
file = f.read()
# Write the file
fh = open(filename,'w')
fh.write(file.replace('\r',''))
fh.close()
return (file)
def set_option(server,app,name,value):
'''Load APM option \n \
server = address of server \n \
app = application name \n \
name = {FV,MV,SV,CV}.option \n \
value = numeric value of option '''
aline = 'option %s = %f' %(name,value)
app = app.lower()
app.replace(" ","")
response = cmd(server,app,aline)
return response
def web(server,app):
'''Open APM web viewer in local browser \n \
server = address of server \n \
app = application name '''
# Retrieve IP address
ip = get_ip(server)
# Web-server URL address
app = app.lower()
app.replace(" ","")
url = string.strip(server) + '/online/' + ip + '_' + app + '/' + ip + '_' + app + '_oper.htm'
webbrowser.get().open_new_tab(url)
return url
def web_var(server,app):
'''Open APM web viewer in local browser \n \
server = address of server \n \
app = application name '''
# Retrieve IP address
ip = get_ip(server)
# Web-server URL address
app = app.lower()
app.replace(" ","")
url = string.strip(server) + '/online/' + ip + '_' + app + '/' + ip + '_' + app + '_var.htm'
webbrowser.get().open_new_tab(url)
return url
def web_root(server,app):
'''Open APM root folder \n \
server = address of server \n \
app = application name '''
# Retrieve IP address
ip = get_ip(server)
# Web-server URL address
app = app.lower()
app.replace(" ","")
url = string.strip(server) + '/online/' + ip + '_' + app + '/'
webbrowser.get().open_new_tab(url)
return url
def classify(server,app,type,aline):
'''Classify parameter or variable as FV, MV, SV, or CV \n \
server = address of server \n \
app = application name \n \
type = {FV,MV,SV,CV} \n \
aline = parameter or variable name '''
x = 'info' + ' ' + type + ', ' + aline
app = app.lower()
app.replace(" ","")
response = cmd(server,app,x)
return response
def csv_data(filename):
'''Load CSV File into Python
A = csv_data(filename)
Function csv_data extracts data from a comma
separated value (csv) file and returns it
to the array A'''
try:
f = open(filename, 'rb')
reader = csv.reader(f)
headers = reader.next()
c = [float] * (len(headers))
A = {}
for h in headers:
A[h] = []
for row in reader:
for h, v, conv in zip(headers, row, c):
A[h].append(conv(v))
except ValueError:
A = {}
return A
def csv_lookup(name,replay):
'''Lookup Index of CSV Column \n \
name = parameter or variable name \n \
replay = csv replay data to search'''
header = replay[0]
try:
i = header.index(string.strip(name))
except ValueError:
i = -1 # no match
return i
def csv_element(name,row,replay):
'''Retrieve CSV Element \n \
name = parameter or variable name \n \
row = row of csv file \n \
replay = csv replay data to search'''
# get row number
if (row>len(replay)): row = len(replay)-1
# get column number
col = csv_lookup(name,replay)
if (col>=0): value = float(replay[row][col])
else: value = float('nan')
return value
def get_attribute(server,app,name):
'''Retrieve options for FV, MV, SV, or CV \n \
server = address of server \n \
app = application name \n \
name = {FV,MV,SV,CV}.{MEAS,MODEL,NEWVAL} \n \n \
Valid name combinations \n \
{FV,MV,CV}.MEAS \n \
{SV,CV}.MODEL \n \
{FV,MV}.NEWVAL '''
# Web-server URL address
url_base = string.strip(server) + '/online/get_tag.php'
app = app.lower()
app.replace(" ","")
params = urllib.urlencode({'p':app,'n':name})
f = urllib.urlopen(url_base,params)
# Send request to web-server
value = eval(f.read())
return value
def load_meas(server,app,name,value):
'''Transfer measurement to server for FV, MV, or CV \n \
server = address of server \n \
app = application name \n \
name = name of {FV,MV,CV} '''
# Web-server URL address
url_base = string.strip(server) + '/online/meas.php'
app = app.lower()
app.replace(" ","")
params = urllib.urlencode({'p':app,'n':name+'.MEAS','v':value})
f = urllib.urlopen(url_base,params)
# Send request to web-server
response = f.read()
return response
else: # Python 3+
def cmd(server,app,aline):
'''Send a request to the server \n \
server = address of server \n \
app = application name \n \
aline = line to send to server \n'''
try:
# Web-server URL address
url_base = server.strip() + '/online/apm_line.php'
app = app.lower()
app.replace(" ","")
params = urllib.parse.urlencode({'p':app,'a':aline})
en_params = params.encode()
f = urllib.request.urlopen(url_base,en_params)
# Stream solution output
if(aline=='solve'):
line = ''
while True:
en_char = f.read(1)
char = en_char.decode()
if not char:
break
elif char == '\n':
print(line)
line = ''
else:
line += char
# Send request to web-server
en_response = f.read()
response = en_response.decode()
except:
response = 'Failed to connect to server'
return response
def load_model(server,app,filename):
'''Load APM model file \n \
server = address of server \n \
app = application name \n \
filename = APM file name'''
# Load APM File
f = open(filename,'r')
aline = f.read()
f.close()
app = app.lower()
app.replace(" ","")
response = cmd(server,app,' '+aline)
return
def load_data(server,app,filename):
'''Load CSV data file \n \
server = address of server \n \
app = application name \n \
filename = CSV file name'''
# Load CSV File
f = open(filename,'r')
aline = f.read()
f.close()
app = app.lower()
app.replace(" ","")
response = cmd(server,app,'csv '+aline)
return
def get_ip(server):
'''Get current IP address \n \
server = address of server'''
# get ip address for web-address lookup
url_base = server.strip() + '/ip.php'
f = urllib.request.urlopen(url_base)
fip = f.read()
ip = fip.decode().strip()
return ip
def apm_t0(server,app,mode):
'''Retrieve restart file \n \
server = address of server \n \
app = application name \n \
mode = {'ss','mpu','rto','sim','est','ctl'} '''
# Retrieve IP address
ip = get_ip(server)
# Web-server URL address
app = app.lower()
app.replace(" ","")
url = server.strip() + '/online/' + ip + '_' + app + '/' + mode.strip() + '.t0'
f = urllib.request.urlopen(url)
# Send request to web-server
solution = f.read()
return solution
def get_solution(server,app):
'''Retrieve solution results\n \
server = address of server \n \
app = application name '''
# Retrieve IP address
ip = get_ip(server)
# Web-server URL address
app = app.lower()
app.replace(" ","")
url = server.strip() + '/online/' + ip + '_' + app + '/results.csv'
f = urllib.request.urlopen(url)
# Send request to web-server
solution = f.read()
# Write the file
sol_file = 'solution_' + app + '.csv'
fh = open(sol_file,'w')
# possible problem here if file isn't able to open (see MATLAB equivalent)
en_solution = solution.decode().replace('\r','')
fh.write(en_solution)
fh.close()
# Use array package
from array import array
# Import CSV file from web server
with closing(urllib.request.urlopen(url)) as f:
fr = f.read()
de_f = fr.decode()
reader = csv.reader(de_f.splitlines(), delimiter=',')
y={}
for row in reader:
if len(row)==2:
y[row[0]] = float(row[1])
else:
y[row[0]] = array('f', [float(col) for col in row[1:]])
# Return solution
return y
def get_file(server,app,filename):
'''Retrieve any file from web-server\n \
server = address of server \n \
app = application name '''
# Retrieve IP address
ip = get_ip(server)
# Web-server URL address
app = app.lower()
app.replace(" ","")
url = server.strip() + '/online/' + ip + '_' + app + '/' + filename
f = urllib.request.urlopen(url)
# Send request to web-server
file = f.read()
# Write the file
fh = open(filename,'w')
en_file = file.decode().replace('\r','')
fh.write(en_file)
fh.close()
return (file)
def set_option(server,app,name,value):
'''Load APM option \n \
server = address of server \n \
app = application name \n \
name = {FV,MV,SV,CV}.option \n \
value = numeric value of option '''
aline = 'option %s = %f' %(name,value)
app = app.lower()
app.replace(" ","")
response = cmd(server,app,aline)
return response
def web(server,app):
'''Open APM web viewer in local browser \n \
server = address of server \n \
app = application name '''
# Retrieve IP address
ip = get_ip(server)
# Web-server URL address
app = app.lower()
app.replace(" ","")
url = server.strip() + '/online/' + ip + '_' + app + '/' + ip + '_' + app + '_oper.htm'
webbrowser.get().open_new_tab(url)
return url
def web_var(server,app):
'''Open APM web viewer in local browser \n \
server = address of server \n \
app = application name '''
# Retrieve IP address
ip = get_ip(server)
# Web-server URL address
app = app.lower()
app.replace(" ","")
url = server.strip() + '/online/' + ip + '_' + app + '/' + ip + '_' + app + '_var.htm'
webbrowser.get().open_new_tab(url)
return url
def web_root(server,app):
'''Open APM root folder \n \
server = address of server \n \
app = application name '''
# Retrieve IP address
ip = get_ip(server)
# Web-server URL address
app = app.lower()
app.replace(" ","")
url = server.strip() + '/online/' + ip + '_' + app + '/'
webbrowser.get().open_new_tab(url)
return url
def classify(server,app,type,aline):
'''Classify parameter or variable as FV, MV, SV, or CV \n \
server = address of server \n \
app = application name \n \
type = {FV,MV,SV,CV} \n \
aline = parameter or variable name '''
x = 'info' + ' ' + type + ', ' + aline
app = app.lower()
app.replace(" ","")
response = cmd(server,app,x)
return response
def csv_data(filename):
'''Load CSV File into Python
A = csv_data(filename)
Function csv_data extracts data from a comma
separated value (csv) file and returns it
to the array A'''
try:
            f = open(filename, 'r', newline='')  # text mode: csv.reader in Python 3 expects str rows, not bytes
reader = csv.reader(f)
headers = next(reader)
c = [float] * (len(headers))
A = {}
for h in headers:
A[h] = []
for row in reader:
for h, v, conv in zip(headers, row, c):
A[h].append(conv(v))
except ValueError:
A = {}
return A
def csv_lookup(name, replay):
    '''Lookup Index of CSV Column \n \
       name = parameter or variable name \n \
       replay = csv replay data to search'''
    header = replay[0]
    try:
        i = header.index(name.strip())
    except ValueError:
        i = -1  # no match
    return i


def csv_element(name, row, replay):
    '''Retrieve CSV Element \n \
       name = parameter or variable name \n \
       row = row of csv file \n \
       replay = csv replay data to search'''
    # clamp the row number (>= avoids an off-by-one IndexError when row == len(replay))
    if (row >= len(replay)): row = len(replay) - 1
    # get column number
    col = csv_lookup(name, replay)
    if (col >= 0): value = float(replay[row][col])
    else: value = float('nan')
    return value
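The column lookup above can be exercised in isolation. A standalone sketch with the two helpers inlined (so it runs without the rest of the module) and an invented replay table:

```python
# Standalone sketch of csv_lookup/csv_element from above; the replay data is invented.
def csv_lookup(name, replay):
    header = replay[0]
    try:
        return header.index(name.strip())
    except ValueError:
        return -1  # no match


def csv_element(name, row, replay):
    if row >= len(replay):
        row = len(replay) - 1  # clamp to the last available row
    col = csv_lookup(name, replay)
    return float(replay[row][col]) if col >= 0 else float('nan')


replay = [['time', 'T1'],   # header row, as produced by csv.reader
          ['0.0', '20.5'],
          ['1.0', '21.0']]
print(csv_element('T1', 2, replay))       # 21.0
print(csv_element('T1', 99, replay))      # 21.0 (row clamped)
print(csv_element('missing', 1, replay))  # nan
```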
def get_attribute(server, app, name):
    '''Retrieve options for FV, MV, SV, or CV \n \
       server = address of server \n \
       app = application name \n \
       name = {FV,MV,SV,CV}.{MEAS,MODEL,NEWVAL} \n \n \
       Valid name combinations \n \
       {FV,MV,CV}.MEAS \n \
       {SV,CV}.MODEL \n \
       {FV,MV}.NEWVAL '''
    # Web-server URL address
    url_base = server.strip() + '/online/get_tag.php'
    app = app.lower()
    app = app.replace(" ", "")
    params = urllib.parse.urlencode({'p': app, 'n': name})
    params_en = params.encode()
    # Send request to web-server
    f = urllib.request.urlopen(url_base, params_en)
    value = eval(f.read())
    return value


def load_meas(server, app, name, value):
    '''Transfer measurement to server for FV, MV, or CV \n \
       server = address of server \n \
       app = application name \n \
       name = name of {FV,MV,CV} '''
    # Web-server URL address
    url_base = server.strip() + '/online/meas.php'
    app = app.lower()
    app = app.replace(" ", "")
    params = urllib.parse.urlencode({'p': app, 'n': name + '.MEAS', 'v': value})
    params_en = params.encode()
    # Send request to web-server
    f = urllib.request.urlopen(url_base, params_en)
    response = f.read()
    return response
def solve(app, imode):
    '''
    APM Solver for simulation, estimation, and optimization with both
    static (steady-state) and dynamic models. The dynamic modes can solve
    index 2+ DAEs without numerical differentiation.

    y = solve(app,imode)

    Function solve uploads the model file (apm) and optionally
    a data file (csv) with the same name to the web-server and performs
    a forward-time stepping integration of ODE or DAE equations
    with the following arguments:

    Input:  app   = model (apm) and data file (csv) name
            imode = simulation mode {1..7}
                          steady-state  dynamic  sequential
              simulate         1           4         7
              estimate         2           5         8  (under dev)
              optimize         3           6         9  (under dev)

    Output: y.names  = names of all variables
            y.values = tables of values corresponding to y.names
            y.nvar   = number of variables
            y.x      = combined variables and values but variable
                       names may be modified to make them valid
                       characters (e.g. replace '[' with '')
    '''
    # server and application file names
    server = 'http://byu.apmonitor.com'
    app = app.lower()
    app = app.replace(" ", "")
    app_model = app + '.apm'
    app_data = app + '.csv'
    # randomize the application name
    from random import randint
    app = app + '_' + str(randint(1000, 9999))
    # clear previous application
    cmd(server, app, 'clear all')
    try:
        # load model file
        load_model(server, app, app_model)
    except:
        # report the original file name, not the randomized application name
        msg = 'Model file ' + app_model + ' does not exist'
        print(msg)
        return []
    # check if data file exists (optional)
    try:
        # load data file
        load_data(server, app, app_data)
    except:
        # data file is optional
        print('Optional data file ' + app_data + ' does not exist')
    # default options
    # use or don't use web viewer (renamed so it no longer shadows the web() function)
    use_web = False
    if use_web:
        set_option(server, app, 'nlc.web', 2)
    else:
        set_option(server, app, 'nlc.web', 0)
    # internal nodes in the collocation (between 2 and 6)
    set_option(server, app, 'nlc.nodes', 3)
    # sensitivity analysis (default: 0 - off)
    set_option(server, app, 'nlc.sensitivity', 0)
    # simulation mode (1=ss, 2=mpu, 3=rto)
    # (4=sim, 5=est, 6=nlc, 7=sqs)
    set_option(server, app, 'nlc.imode', imode)
    # attempt solution
    solver_output = cmd(server, app, 'solve')
    # check for successful solution
    status = get_attribute(server, app, 'nlc.appstatus')
    if status == 1:
        # open web viewer if selected
        if use_web:
            web(server, app)
        # retrieve solution and solution.csv
        z = get_solution(server, app)
        return z
    else:
        print(solver_output)
        print('Error: Did not converge to a solution')
        return []
def plotter(y, subplots=1, save=False, filename='solution', format='png'):
    '''
    The plotter will go through each of the variables in the output y and
    create plots for them. The number of vertical subplots can be
    specified and the plots can be saved in the same folder.
    This functionality is dependent on matplotlib, so this library must
    be installed on the computer for the automatic plotter to work.

    The input y should be the output from the apm solution. This can be
    retrieved from the server using the following line of code:

    y = get_solution(server, app)
    '''
    try:
        import matplotlib.pyplot as plt
        var_size = len(y)
        colors = ['r-', 'g-', 'k-', 'b-']
        color_pick = 0
        if subplots > 9:
            subplots = 9
        j = 1
        pltcount = 0
        start = True
        for i in range(var_size):
            if list(y)[i] != 'time' and list(y)[i][:3] != 'slk':
                if j == 1:
                    if not start:
                        plt.xlabel('time')
                    start = False
                    if save:
                        if pltcount != 0:
                            plt.savefig(filename + str(pltcount) + '.' + format,
                                        format=format)
                        pltcount += 1
                    plt.figure()
                else:
                    plt.gca().axes.get_xaxis().set_ticklabels([])
                plt.subplot(100 * subplots + 10 + j)
                plt.plot(y['time'], y[list(y)[i]], colors[color_pick], linewidth=2.0)
                if color_pick == 3:
                    color_pick = 0
                else:
                    color_pick += 1
                plt.ylabel(list(y)[i])
                if subplots == 1:
                    plt.title(list(y)[i])
                if j == subplots or i + 2 == var_size:
                    j = 1
                else:
                    j += 1
        plt.xlabel('time')
        if save:
            plt.savefig('plots/' + filename + str(pltcount) + '.' + format, format=format)
        if pltcount <= 20:
            plt.show()
    except ImportError:
        print('Dependent packages not imported.')
        print('Please install matplotlib package to use plotting features.')
    except:
        print('Graphs not created. Double check that the')
        print('simulation/optimization was successful')
# This code adds back compatibility with previous versions
apm = cmd
apm_load = load_model
csv_load = load_data
apm_ip = get_ip
apm_sol = get_solution
apm_get = get_file
apm_option = set_option
apm_web = web
apm_web_var = web_var
apm_web_root = web_root
apm_info = classify
apm_tag = get_attribute
apm_meas = load_meas
apm_solve = solve
# main.py - curi0sity-97/photo-organizer (MIT)
from library_organizer import read_google_photos, read_manifest, retrieve_metadata, check_manifest_result, \
    update_google_photos_images
import argparse
from datetime import datetime


def perform_sync():
    tcur = datetime.now()
    manifest = read_manifest()
    check_manifest_result(manifest)
    update_google_photos_images(manifest)
    # update_google_photos_images()
    print(f"Total time passed {(datetime.now() - tcur).total_seconds()} s")


if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument('-s', '--sync', help='Sync local Picasa files to Google Photos', action="store_true")
    args = parser.parse_args()
    if args.sync:
        perform_sync()  # the original was missing the call parentheses
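The `store_true` flag wiring used above, reduced to a self-contained sketch (the flag name matches the script; the explicit argv lists are for illustration only):

```python
import argparse

# Minimal reproduction of the --sync boolean flag pattern.
parser = argparse.ArgumentParser()
parser.add_argument('-s', '--sync', help='Run the sync step', action="store_true")

# Parse explicit argv lists instead of sys.argv so the behavior is easy to see.
print(parser.parse_args(['--sync']).sync)  # True
print(parser.parse_args(['-s']).sync)      # True
print(parser.parse_args([]).sync)          # False (store_true defaults to False)
```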
# stepper/sample.py - kotaproj/esp32mpy_recipe (MIT)
from step import Stepper
MOTOR_STEPS = (2048)
PIN_MOTOR_1 = (27)
PIN_MOTOR_2 = (26)
PIN_MOTOR_3 = (25)
PIN_MOTOR_4 = (23)
my_motor = Stepper(MOTOR_STEPS, PIN_MOTOR_1,
                   PIN_MOTOR_2, PIN_MOTOR_3, PIN_MOTOR_4)

for i in range(50, 150, 1):
    my_motor.set_speed(i / 10)
    my_motor.step(16)
for i in range(150, 50, -1):
    my_motor.set_speed(i / 10)
    my_motor.step(-16)
# # my_motor.step(512)
# for _ in range(5):
# my_motor.step(2048*3)
# my_motor.step(-2048*3)
# pyutils/utils.py - eltrompetero/innovation (MIT)
# ====================================================================================== #
# Useful functions for analyzing corp data.
# Author: Eddie Lee, edlee@santafe.edu
# ====================================================================================== #
import numpy as np
import pandas as pd
from fastparquet import ParquetFile
import snappy
import os
import datetime as dt
from warnings import warn
from multiprocess import Pool, cpu_count
from threadpoolctl import threadpool_limits
import dill as pickle
import duckdb as db
from itertools import combinations
DATADR = os.path.expanduser('~')+'/Dropbox/Research/corporations/starter_packet'
def db_conn():
    return db.connect(database=':memory:', read_only=False)


def snappy_decompress(data, uncompressed_size):
    return snappy.decompress(data)


def topic_added_dates():
    """Dates on which new topics were added according to the "topic taxonomy.csv".

    Returns
    -------
    ndarray
    """
    df = pd.read_csv('%s/topic taxonomy.csv' % DATADR)
    udates, count = np.unique(df['Active Date'], return_counts=True)
    udates = np.array([dt.datetime.strptime(d, '%m/%d/%Y').date() for d in udates])
    return udates


def bin_laplace(y, nbins, center=1):
    """Bin statistics from a Laplace distribution by using log bins spaced around the center.

    Parameters
    ----------
    y : ndarray
    nbins : int
    center : float, 1.

    Returns
    -------
    ndarray
        Counts in bins.
    ndarray
        Bin edges.
    ndarray
        Bin centers.
    """
    logy = np.log(y)
    bins = np.linspace(0, np.abs(logy).max() + 1e-6, nbins // 2)
    bins = np.concatenate((-bins[1:][::-1], bins)) + np.log(center)
    n = np.histogram(logy, bins)[0]
    return n, np.exp(bins), np.exp(bins[:-1] + (bins[1] - bins[0]) / 2)


def log_hist(y, nbins=20):
    """Log histogram on discrete domain. Assuming min value is 1.

    Parameters
    ----------
    y : ndarray
    nbins : int, 20

    Returns
    -------
    ndarray
        Normalized frequency.
    ndarray
        Bin midpoints.
    ndarray
        Bin edges.
    """
    bins = np.unique(np.around(np.logspace(0, np.log10(y.max() + 1), nbins)).astype(int))
    p = np.histogram(y, bins)[0]
    p = p / p.sum() / np.floor(np.diff(bins))
    xmid = np.exp((np.log(bins[:-1]) + np.log(bins[1:])) / 2)
    return p, xmid, bins
# shoppingcart/shoppingcart_pb2_grpc.py - sleipnir/python-support (Apache-2.0)
# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!
import grpc

from google.protobuf import empty_pb2 as google_dot_protobuf_dot_empty__pb2
from shoppingcart import shoppingcart_pb2 as shoppingcart_dot_shoppingcart__pb2


class ShoppingCartStub(object):
  # missing associated documentation comment in .proto file
  pass

  def __init__(self, channel):
    """Constructor.

    Args:
      channel: A grpc.Channel.
    """
    self.AddItem = channel.unary_unary(
        '/com.example.shoppingcart.ShoppingCart/AddItem',
        request_serializer=shoppingcart_dot_shoppingcart__pb2.AddLineItem.SerializeToString,
        response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
        )
    self.RemoveItem = channel.unary_unary(
        '/com.example.shoppingcart.ShoppingCart/RemoveItem',
        request_serializer=shoppingcart_dot_shoppingcart__pb2.RemoveLineItem.SerializeToString,
        response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
        )
    self.GetCart = channel.unary_unary(
        '/com.example.shoppingcart.ShoppingCart/GetCart',
        request_serializer=shoppingcart_dot_shoppingcart__pb2.GetShoppingCart.SerializeToString,
        response_deserializer=shoppingcart_dot_shoppingcart__pb2.Cart.FromString,
        )


class ShoppingCartServicer(object):
  # missing associated documentation comment in .proto file
  pass

  def AddItem(self, request, context):
    # missing associated documentation comment in .proto file
    pass
    context.set_code(grpc.StatusCode.UNIMPLEMENTED)
    context.set_details('Method not implemented!')
    raise NotImplementedError('Method not implemented!')

  def RemoveItem(self, request, context):
    # missing associated documentation comment in .proto file
    pass
    context.set_code(grpc.StatusCode.UNIMPLEMENTED)
    context.set_details('Method not implemented!')
    raise NotImplementedError('Method not implemented!')

  def GetCart(self, request, context):
    # missing associated documentation comment in .proto file
    pass
    context.set_code(grpc.StatusCode.UNIMPLEMENTED)
    context.set_details('Method not implemented!')
    raise NotImplementedError('Method not implemented!')


def add_ShoppingCartServicer_to_server(servicer, server):
  rpc_method_handlers = {
      'AddItem': grpc.unary_unary_rpc_method_handler(
          servicer.AddItem,
          request_deserializer=shoppingcart_dot_shoppingcart__pb2.AddLineItem.FromString,
          response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
      ),
      'RemoveItem': grpc.unary_unary_rpc_method_handler(
          servicer.RemoveItem,
          request_deserializer=shoppingcart_dot_shoppingcart__pb2.RemoveLineItem.FromString,
          response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
      ),
      'GetCart': grpc.unary_unary_rpc_method_handler(
          servicer.GetCart,
          request_deserializer=shoppingcart_dot_shoppingcart__pb2.GetShoppingCart.FromString,
          response_serializer=shoppingcart_dot_shoppingcart__pb2.Cart.SerializeToString,
      ),
  }
  generic_handler = grpc.method_handlers_generic_handler(
      'com.example.shoppingcart.ShoppingCart', rpc_method_handlers)
  server.add_generic_rpc_handlers((generic_handler,))
# naiads/utils/Utils.py - royarzun/clara-std-services (MIT)
# coding=utf-8
import os
from array import array
from time import strftime
def convert_to_root_array(list_array):
    return array('d', list_array)


def create_filename(path, filename):
    timestamp_str = strftime("%Y%m%d%H%M%S")
    return path + filename + "_" + timestamp_str + ".root"


def get_limits(list_array):
    return [float(list_array[0]), float(list_array[-1])]


def set_output_folder(path):
    if os.path.isdir(path) and os.access(path, os.W_OK):
        return path if path.endswith("/") else path + "/"
    else:
        raise IOError("Output directory not writable.")
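The timestamped-filename helper above, re-inlined as a runnable sketch (the `/tmp/` path and `run` base name are invented for the demo):

```python
from time import strftime

# Standalone copy of create_filename: path + name + 14-digit timestamp + ".root"
def create_filename(path, filename):
    timestamp_str = strftime("%Y%m%d%H%M%S")
    return path + filename + "_" + timestamp_str + ".root"


name = create_filename("/tmp/", "run")
print(name)  # e.g. /tmp/run_20240101120000.root (timestamp depends on when it runs)
```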
# examples/distilling/evaluation/project_verification/build_indexing.py - liuxianming/DistillingCaffe (BSD-2-Clause)
"""Build indexing using extracted features
"""
import numpy as np
import os, os.path
import pickle
import scipy.spatial
from scipy.spatial.distance import pdist, squareform
"""Build index
Given feature matrix feature_mat,
k is the number of nearest neighbors to choose
"""
def build_index(feature_mat, k):
    sample_count_ = feature_mat.shape[0]  # original had the typo feature_mat.shap(0)
    index = np.zeros((sample_count_, k), dtype=int)  # numpy.int is deprecated; plain int works
    for idx in range(sample_count_):
        feature = feature_mat[idx, ...]
        # The source file is truncated at this assignment. The completion below is
        # an assumption, not the original code: Euclidean distance from this sample
        # to every sample, keeping the k nearest neighbors (skipping self at rank 0).
        dist = np.linalg.norm(feature_mat - feature, axis=1)
        index[idx, :] = np.argsort(dist)[1:k + 1]
    return index
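The same nearest-neighbor indexing idea in a dependency-free form (pure Python, invented sample data), useful for sanity-checking the logic without NumPy; the `build_index_py` name is made up to avoid clashing with the function above:

```python
# Pure-Python sketch of k-nearest-neighbor indexing (no NumPy required).
def build_index_py(features, k):
    """For each sample, return the indices of its k nearest neighbors
    (squared Euclidean distance), excluding the sample itself."""
    index = []
    for i, a in enumerate(features):
        # squared Euclidean distance from sample i to every sample j
        dists = [(sum((x - y) ** 2 for x, y in zip(a, b)), j)
                 for j, b in enumerate(features)]
        dists.sort()
        index.append([j for _, j in dists[1:k + 1]])  # rank 0 is the sample itself
    return index


features = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 5.0)]
print(build_index_py(features, 1))  # [[1], [0], [3], [2]]
```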
# Chapter09/state_2.py - shoshan/Clean-Code-in-Python (MIT)
"""Clean Code in Python - Chapter 9: Common Design Patterns
> State
"""
import abc

from log import logger
from state_1 import InvalidTransitionError


class MergeRequestState(abc.ABC):
    def __init__(self, merge_request):
        self._merge_request = merge_request

    @abc.abstractmethod
    def open(self):
        ...

    @abc.abstractmethod
    def close(self):
        ...

    @abc.abstractmethod
    def merge(self):
        ...

    def __str__(self):
        return self.__class__.__name__


class Open(MergeRequestState):
    def open(self):
        self._merge_request.approvals = 0

    def close(self):
        self._merge_request.approvals = 0
        self._merge_request.state = Closed

    def merge(self):
        logger.info("merging %s", self._merge_request)
        logger.info("deleting branch %s", self._merge_request.source_branch)
        self._merge_request.state = Merged


class Closed(MergeRequestState):
    def open(self):
        logger.info("reopening closed merge request %s", self._merge_request)
        self._merge_request.state = Open

    def close(self):
        """Current state."""

    def merge(self):
        raise InvalidTransitionError("can't merge a closed request")


class Merged(MergeRequestState):
    def open(self):
        raise InvalidTransitionError("already merged request")

    def close(self):
        raise InvalidTransitionError("already merged request")

    def merge(self):
        """Current state."""


class MergeRequest:
    def __init__(self, source_branch: str, target_branch: str) -> None:
        self.source_branch = source_branch
        self.target_branch = target_branch
        self._state: MergeRequestState
        self.approvals = 0
        self.state = Open

    @property
    def state(self):
        return self._state

    @state.setter
    def state(self, new_state_cls):
        self._state = new_state_cls(self)

    @property
    def status(self):
        return str(self.state)

    def __getattr__(self, method):
        return getattr(self.state, method)

    def __str__(self):
        return f"{self.target_branch}:{self.source_branch}"
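The same State idea in a stripped-down, self-contained form: a two-state door whose behavior is delegated to state objects, so it runs without the book's `log`/`state_1` modules (the `Door`/`DoorOpen`/`DoorClosed` names are invented for the sketch):

```python
# Minimal standalone State-pattern sketch: transitions return the next state object.
class DoorClosed:
    def open(self):
        return DoorOpen()

    def close(self):
        return self  # already closed; stay in this state


class DoorOpen:
    def open(self):
        return self  # already open; stay in this state

    def close(self):
        return DoorClosed()


class Door:
    """Context object: forwards behavior to its current state."""

    def __init__(self):
        self._state = DoorClosed()

    def open(self):
        self._state = self._state.open()

    def close(self):
        self._state = self._state.close()

    @property
    def status(self):
        return type(self._state).__name__


door = Door()
print(door.status)  # DoorClosed
door.open()
print(door.status)  # DoorOpen
```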
# krit/invitations/receivers.py - huroncg/krit-teams (BSD-3-Clause)
from django.dispatch import receiver
from krit.registration.models import SignupCodeResult
from krit.registration.signals import signup_code_used
from .models import Invitation
@receiver(signup_code_used, sender=SignupCodeResult)
def handle_signup_code_used(sender, **kwargs):
    result = kwargs.get("signup_code_result")
    try:
        invite = result.signup_code.invitation
        invite.accept(result.user)
    except Invitation.DoesNotExist:
        pass
# tests/type/test_predicate.py - KingDarBoja/graphql-core (MIT)
from pytest import raises  # type: ignore
from graphql.language import DirectiveLocation
from graphql.type import (
    GraphQLArgument,
    GraphQLDeprecatedDirective,
    GraphQLBoolean,
    GraphQLDirective,
    GraphQLEnumType,
    GraphQLFloat,
    GraphQLID,
    GraphQLIncludeDirective,
    GraphQLInputField,
    GraphQLInputObjectType,
    GraphQLInt,
    GraphQLInterfaceType,
    GraphQLList,
    GraphQLNonNull,
    GraphQLObjectType,
    GraphQLScalarType,
    GraphQLSkipDirective,
    GraphQLString,
    GraphQLUnionType,
    assert_abstract_type,
    assert_composite_type,
    assert_directive,
    assert_enum_type,
    assert_input_object_type,
    assert_input_type,
    assert_interface_type,
    assert_leaf_type,
    assert_list_type,
    assert_named_type,
    assert_non_null_type,
    assert_nullable_type,
    assert_object_type,
    assert_output_type,
    assert_scalar_type,
    assert_type,
    assert_union_type,
    assert_wrapping_type,
    get_named_type,
    get_nullable_type,
    is_abstract_type,
    is_composite_type,
    is_directive,
    is_enum_type,
    is_input_object_type,
    is_input_type,
    is_interface_type,
    is_leaf_type,
    is_list_type,
    is_named_type,
    is_required_argument,
    is_required_input_field,
    is_non_null_type,
    is_nullable_type,
    is_object_type,
    is_output_type,
    is_scalar_type,
    is_specified_directive,
    is_specified_scalar_type,
    is_type,
    is_union_type,
    is_wrapping_type,
)
ObjectType = GraphQLObjectType("Object", {})
InterfaceType = GraphQLInterfaceType("Interface", {})
UnionType = GraphQLUnionType("Union", types=[ObjectType])
EnumType = GraphQLEnumType("Enum", values={"foo": {}})
InputObjectType = GraphQLInputObjectType("InputObject", {})
ScalarType = GraphQLScalarType("Scalar")
Directive = GraphQLDirective("Directive", [DirectiveLocation.QUERY])
def describe_type_predicates():
    def describe_is_type():
        def returns_true_for_unwrapped_types():
            assert is_type(GraphQLString) is True
            assert_type(GraphQLString)
            assert is_type(ObjectType) is True
            assert_type(ObjectType)

        def returns_true_for_wrapped_types():
            assert is_type(GraphQLNonNull(GraphQLString)) is True
            assert_type(GraphQLNonNull(GraphQLString))

        def returns_false_for_type_classes_rather_than_instance():
            assert is_type(GraphQLObjectType) is False
            with raises(TypeError):
                assert_type(GraphQLObjectType)

        def returns_false_for_random_garbage():
            assert is_type({"what": "is this"}) is False
            with raises(TypeError):
                assert_type({"what": "is this"})

    def describe_is_scalar_type():
        def returns_true_for_spec_defined_scalar():
            assert is_scalar_type(GraphQLString) is True
            assert_scalar_type(GraphQLString)

        def returns_true_for_custom_scalar():
            assert is_scalar_type(ScalarType) is True
            assert_scalar_type(ScalarType)

        def returns_false_for_scalar_class_rather_than_instance():
            assert is_scalar_type(GraphQLScalarType) is False
            with raises(TypeError):
                assert_scalar_type(GraphQLScalarType)

        def returns_false_for_wrapped_scalar():
            assert is_scalar_type(GraphQLList(ScalarType)) is False
            with raises(TypeError):
                assert_scalar_type(GraphQLList(ScalarType))

        def returns_false_for_non_scalar():
            assert is_scalar_type(EnumType) is False
            with raises(TypeError):
                assert_scalar_type(EnumType)
            assert is_scalar_type(Directive) is False
            with raises(TypeError):
                assert_scalar_type(Directive)

        def returns_false_for_random_garbage():
            assert is_scalar_type(None) is False
            with raises(TypeError):
                assert_scalar_type(None)
            assert is_scalar_type({"what": "is this"}) is False
            with raises(TypeError):
                assert_scalar_type({"what": "is this"})

    def describe_is_specified_scalar_type():
        def returns_true_for_specified_scalars():
            assert is_specified_scalar_type(GraphQLString) is True
            assert is_specified_scalar_type(GraphQLInt) is True
            assert is_specified_scalar_type(GraphQLFloat) is True
            assert is_specified_scalar_type(GraphQLBoolean) is True
            assert is_specified_scalar_type(GraphQLID) is True

    def describe_is_object_type():
        def returns_true_for_object_type():
            assert is_object_type(ObjectType) is True
            assert_object_type(ObjectType)

        def returns_false_for_wrapped_object_type():
            assert is_object_type(GraphQLList(ObjectType)) is False
            with raises(TypeError):
                assert_object_type(GraphQLList(ObjectType))

        def returns_false_for_non_object_type():
            assert is_scalar_type(InterfaceType) is False
            with raises(TypeError):
                assert_scalar_type(InterfaceType)

    def describe_is_interface_type():
        def returns_true_for_interface_type():
            assert is_interface_type(InterfaceType) is True
            assert_interface_type(InterfaceType)

        def returns_false_for_wrapped_interface_type():
            assert is_interface_type(GraphQLList(InterfaceType)) is False
            with raises(TypeError):
                assert_interface_type(GraphQLList(InterfaceType))

        def returns_false_for_non_interface_type():
            assert is_interface_type(ObjectType) is False
            with raises(TypeError):
                assert_interface_type(ObjectType)

    def describe_is_union_type():
        def returns_true_for_union_type():
            assert is_union_type(UnionType) is True
            assert_union_type(UnionType)

        def returns_false_for_wrapped_union_type():
            assert is_union_type(GraphQLList(UnionType)) is False
            with raises(TypeError):
                assert_union_type(GraphQLList(UnionType))

        def returns_false_for_non_union_type():
            assert is_union_type(ObjectType) is False
            with raises(TypeError):
                assert_union_type(ObjectType)

    def describe_is_enum_type():
        def returns_true_for_enum_type():
            assert is_enum_type(EnumType) is True
            assert_enum_type(EnumType)

        def returns_false_for_wrapped_enum_type():
            assert is_enum_type(GraphQLList(EnumType)) is False
            with raises(TypeError):
                assert_enum_type(GraphQLList(EnumType))

        def returns_false_for_non_enum_type():
            assert is_enum_type(ScalarType) is False
            with raises(TypeError):
                assert_enum_type(ScalarType)

    def describe_is_input_object_type():
        def returns_true_for_input_object_type():
            assert is_input_object_type(InputObjectType) is True
            assert_input_object_type(InputObjectType)

        def returns_false_for_wrapped_input_object_type():
            assert is_input_object_type(GraphQLList(InputObjectType)) is False
            with raises(TypeError):
                assert_input_object_type(GraphQLList(InputObjectType))

        def returns_false_for_non_input_object_type():
            assert is_input_object_type(ObjectType) is False
            with raises(TypeError):
                assert_input_object_type(ObjectType)

    def describe_is_list_type():
        def returns_true_for_a_list_wrapped_type():
            assert is_list_type(GraphQLList(ObjectType)) is True
            assert_list_type(GraphQLList(ObjectType))

        def returns_false_for_a_unwrapped_type():
            assert is_list_type(ObjectType) is False
            with raises(TypeError):
                assert_list_type(ObjectType)

        def returns_false_for_a_non_list_wrapped_type():
            assert is_list_type(GraphQLNonNull(GraphQLList(ObjectType))) is False
            with raises(TypeError):
                assert_list_type(GraphQLNonNull(GraphQLList(ObjectType)))

    def describe_is_non_null_type():
        def returns_true_for_a_non_null_wrapped_type():
            assert is_non_null_type(GraphQLNonNull(ObjectType)) is True
            assert_non_null_type(GraphQLNonNull(ObjectType))

        def returns_false_for_an_unwrapped_type():
            assert is_non_null_type(ObjectType) is False
            with raises(TypeError):
                assert_non_null_type(ObjectType)

        def returns_false_for_a_not_non_null_wrapped_type():
            assert is_non_null_type(GraphQLList(GraphQLNonNull(ObjectType))) is False
            with raises(TypeError):
                assert_non_null_type(GraphQLList(GraphQLNonNull(ObjectType)))

    def describe_is_input_type():
        def _assert_input_type(type_):
            assert is_input_type(type_) is True
            assert_input_type(type_)

        def returns_true_for_an_input_type():
            _assert_input_type(GraphQLString)
            _assert_input_type(EnumType)
            _assert_input_type(InputObjectType)

        def returns_true_for_a_wrapped_input_type():
            _assert_input_type(GraphQLList(GraphQLString))
            _assert_input_type(GraphQLList(EnumType))
            _assert_input_type(GraphQLList(InputObjectType))
            _assert_input_type(GraphQLNonNull(GraphQLString))
            _assert_input_type(GraphQLNonNull(EnumType))
            _assert_input_type(GraphQLNonNull(InputObjectType))

        def _assert_non_input_type(type_):
            assert is_input_type(type_) is False
            with raises(TypeError):
                assert_input_type(type_)

        def returns_false_for_an_output_type():
            _assert_non_input_type(ObjectType)
            _assert_non_input_type(InterfaceType)
            _assert_non_input_type(UnionType)

        def returns_false_for_a_wrapped_output_type():
            _assert_non_input_type(GraphQLList(ObjectType))
            _assert_non_input_type(GraphQLList(InterfaceType))
            _assert_non_input_type(GraphQLList(UnionType))
            _assert_non_input_type(GraphQLNonNull(ObjectType))
            _assert_non_input_type(GraphQLNonNull(InterfaceType))
            _assert_non_input_type(GraphQLNonNull(UnionType))

    def describe_is_output_type():
        def _assert_output_type(type_):
            assert is_output_type(type_) is True
            assert_output_type(type_)

        def returns_true_for_an_output_type():
            _assert_output_type(GraphQLString)
            _assert_output_type(ObjectType)
            _assert_output_type(InterfaceType)
            _assert_output_type(UnionType)
            _assert_output_type(EnumType)

        def returns_true_for_a_wrapped_output_type():
            _assert_output_type(GraphQLList(GraphQLString))
            _assert_output_type(GraphQLList(ObjectType))
            _assert_output_type(GraphQLList(InterfaceType))
            _assert_output_type(GraphQLList(UnionType))
            _assert_output_type(GraphQLList(EnumType))
            _assert_output_type(GraphQLNonNull(GraphQLString))
            _assert_output_type(GraphQLNonNull(ObjectType))
            _assert_output_type(GraphQLNonNull(InterfaceType))
            _assert_output_type(GraphQLNonNull(UnionType))
            _assert_output_type(GraphQLNonNull(EnumType))

        def _assert_non_output_type(type_):
            assert is_output_type(type_) is False
            with raises(TypeError):
                assert_output_type(type_)

        def returns_false_for_an_input_type():
            _assert_non_output_type(InputObjectType)

        def returns_false_for_a_wrapped_input_type():
            _assert_non_output_type(GraphQLList(InputObjectType))
            _assert_non_output_type(GraphQLNonNull(InputObjectType))

    def describe_is_leaf_type():
        def returns_true_for_scalar_and_enum_types():
            assert is_leaf_type(ScalarType) is True
            assert_leaf_type(ScalarType)
            assert is_leaf_type(EnumType) is True
            assert_leaf_type(EnumType)

        def returns_false_for_wrapped_leaf_type():
            assert is_leaf_type(GraphQLList(ScalarType)) is False
            with raises(TypeError):
                assert_leaf_type(GraphQLList(ScalarType))

        def returns_false_for_non_leaf_type():
            assert is_leaf_type(ObjectType) is False
            with raises(TypeError):
                assert_leaf_type(ObjectType)

        def returns_false_for_wrapped_non_leaf_type():
            assert is_leaf_type(GraphQLList(ObjectType)) is False
with raises(TypeError):
assert_leaf_type(GraphQLList(ObjectType))
def describe_is_composite_type():
def returns_true_for_object_interface_and_union_types():
assert is_composite_type(ObjectType) is True
assert_composite_type(ObjectType)
assert is_composite_type(InterfaceType) is True
assert_composite_type(InterfaceType)
assert is_composite_type(UnionType) is True
assert_composite_type(UnionType)
def returns_false_for_wrapped_composite_type():
assert is_composite_type(GraphQLList(ObjectType)) is False
with raises(TypeError):
assert_composite_type(GraphQLList(ObjectType))
def returns_false_for_non_composite_type():
assert is_composite_type(InputObjectType) is False
with raises(TypeError):
assert_composite_type(InputObjectType)
def returns_false_for_wrapped_non_composite_type():
assert is_composite_type(GraphQLList(InputObjectType)) is False
with raises(TypeError):
assert_composite_type(GraphQLList(InputObjectType))
def describe_is_abstract_type():
def returns_true_for_interface_and_union_types():
assert is_abstract_type(InterfaceType) is True
assert_abstract_type(InterfaceType)
assert is_abstract_type(UnionType) is True
assert_abstract_type(UnionType)
def returns_false_for_wrapped_abstract_type():
assert is_abstract_type(GraphQLList(InterfaceType)) is False
with raises(TypeError):
assert_abstract_type(GraphQLList(InterfaceType))
def returns_false_for_non_abstract_type():
assert is_abstract_type(ObjectType) is False
with raises(TypeError):
assert_abstract_type(ObjectType)
def returns_false_for_wrapped_non_abstract_type():
assert is_abstract_type(GraphQLList(ObjectType)) is False
with raises(TypeError):
assert_abstract_type(GraphQLList(ObjectType))
def describe_is_wrapping_type():
def returns_true_for_list_and_non_null_types():
assert is_wrapping_type(GraphQLList(ObjectType)) is True
assert_wrapping_type(GraphQLList(ObjectType))
assert is_wrapping_type(GraphQLNonNull(ObjectType)) is True
assert_wrapping_type(GraphQLNonNull(ObjectType))
def returns_false_for_unwrapped_types():
assert is_wrapping_type(ObjectType) is False
with raises(TypeError):
assert_wrapping_type(ObjectType)
def describe_is_nullable_type():
def returns_true_for_unwrapped_types():
assert is_nullable_type(ObjectType) is True
assert_nullable_type(ObjectType)
def returns_true_for_list_of_non_null_types():
assert is_nullable_type(GraphQLList(GraphQLNonNull(ObjectType))) is True
assert_nullable_type(GraphQLList(GraphQLNonNull(ObjectType)))
def returns_false_for_non_null_types():
assert is_nullable_type(GraphQLNonNull(ObjectType)) is False
with raises(TypeError):
assert_nullable_type(GraphQLNonNull(ObjectType))
def describe_get_nullable_type():
def returns_none_for_no_type():
assert get_nullable_type(None) is None
def returns_self_for_a_nullable_type():
assert get_nullable_type(ObjectType) is ObjectType
list_of_obj = GraphQLList(ObjectType)
assert get_nullable_type(list_of_obj) is list_of_obj
def unwraps_non_null_type():
assert get_nullable_type(GraphQLNonNull(ObjectType)) is ObjectType
def describe_is_named_type():
def returns_true_for_unwrapped_types():
assert is_named_type(ObjectType) is True
assert_named_type(ObjectType)
def returns_false_for_list_and_non_null_types():
assert is_named_type(GraphQLList(ObjectType)) is False
with raises(TypeError):
assert_named_type(GraphQLList(ObjectType))
assert is_named_type(GraphQLNonNull(ObjectType)) is False
with raises(TypeError):
assert_named_type(GraphQLNonNull(ObjectType))
def describe_get_named_type():
def returns_none_for_no_type():
assert get_named_type(None) is None
def returns_self_for_an_unwrapped_type():
assert get_named_type(ObjectType) is ObjectType
def unwraps_wrapper_types():
assert get_named_type(GraphQLNonNull(ObjectType)) is ObjectType
assert get_named_type(GraphQLList(ObjectType)) is ObjectType
def unwraps_deeply_wrapper_types():
assert (
get_named_type(GraphQLNonNull(GraphQLList(GraphQLNonNull(ObjectType))))
is ObjectType
)
def describe_is_required_argument():
def returns_true_for_required_arguments():
required_arg = GraphQLArgument(GraphQLNonNull(GraphQLString))
assert is_required_argument(required_arg) is True
def returns_false_for_optional_arguments():
opt_arg1 = GraphQLArgument(GraphQLString)
assert is_required_argument(opt_arg1) is False
opt_arg2 = GraphQLArgument(GraphQLString, default_value=None)
assert is_required_argument(opt_arg2) is False
opt_arg3 = GraphQLArgument(GraphQLList(GraphQLNonNull(GraphQLString)))
assert is_required_argument(opt_arg3) is False
opt_arg4 = GraphQLArgument(
GraphQLNonNull(GraphQLString), default_value="default"
)
assert is_required_argument(opt_arg4) is False
def describe_is_required_input_field():
def returns_true_for_required_input_field():
required_field = GraphQLInputField(GraphQLNonNull(GraphQLString))
assert is_required_input_field(required_field) is True
def returns_false_for_optional_input_field():
opt_field1 = GraphQLInputField(GraphQLString)
assert is_required_input_field(opt_field1) is False
opt_field2 = GraphQLInputField(GraphQLString, default_value=None)
assert is_required_input_field(opt_field2) is False
opt_field3 = GraphQLInputField(GraphQLList(GraphQLNonNull(GraphQLString)))
assert is_required_input_field(opt_field3) is False
opt_field4 = GraphQLInputField(
GraphQLNonNull(GraphQLString), default_value="default"
)
assert is_required_input_field(opt_field4) is False
def describe_directive_predicates():
def describe_is_directive():
def returns_true_for_spec_defined_directive():
assert is_directive(GraphQLSkipDirective) is True
assert_directive(GraphQLSkipDirective)
def returns_true_for_custom_directive():
assert is_directive(Directive) is True
assert_directive(Directive)
def returns_false_for_directive_class_rather_than_instance():
assert is_directive(GraphQLDirective) is False
with raises(TypeError):
assert_directive(GraphQLScalarType)
def returns_false_for_non_directive():
assert is_directive(EnumType) is False
with raises(TypeError):
assert_directive(EnumType)
assert is_directive(ScalarType) is False
with raises(TypeError):
assert_directive(ScalarType)
def returns_false_for_random_garbage():
assert is_directive(None) is False
with raises(TypeError):
assert_directive(None)
assert is_directive({"what": "is this"}) is False
with raises(TypeError):
assert_directive({"what": "is this"})
def describe_is_specified_directive():
def returns_true_for_specified_directives():
assert is_specified_directive(GraphQLIncludeDirective) is True
assert is_specified_directive(GraphQLSkipDirective) is True
assert is_specified_directive(GraphQLDeprecatedDirective) is True
def returns_false_for_custom_directive():
assert is_specified_directive(Directive) is False
6af81b43f58a6e1bffafb616654f3b0c80ca1ba0 | 12,137 | py | Python | airbyte-integrations/connectors/source-google-analytics-v4/unit_tests/unit_test.py | OTRI-Unipd/OTRI-airbyte | ["MIT"]

#
# Copyright (c) 2021 Airbyte, Inc., all rights reserved.
#

import json
import logging
from pathlib import Path
from unittest.mock import MagicMock, patch
from urllib.parse import unquote

import pendulum
import pytest
from airbyte_cdk.models import ConfiguredAirbyteCatalog
from airbyte_cdk.sources.streams.http.auth import NoAuth
from freezegun import freeze_time
from source_google_analytics_v4.source import (
    DATA_IS_NOT_GOLDEN_MSG,
    RESULT_IS_SAMPLED_MSG,
    GoogleAnalyticsV4IncrementalObjectsBase,
    GoogleAnalyticsV4Stream,
    GoogleAnalyticsV4TypesList,
    SourceGoogleAnalyticsV4,
)


def read_file(file_name):
    parent_location = Path(__file__).absolute().parent
    file = open(parent_location / file_name).read()
    return file


expected_metrics_dimensions_type_map = (
    {"ga:users": "INTEGER", "ga:newUsers": "INTEGER"},
    {"ga:date": "STRING", "ga:country": "STRING"},
)


@pytest.fixture
def mock_metrics_dimensions_type_list_link(requests_mock):
    requests_mock.get(
        "https://www.googleapis.com/analytics/v3/metadata/ga/columns",
        json=json.loads(read_file("metrics_dimensions_type_list.json")),
    )


@pytest.fixture
def mock_auth_call(requests_mock):
    yield requests_mock.post(
        "https://oauth2.googleapis.com/token",
        json={"access_token": "", "expires_in": 0},
    )


@pytest.fixture
def mock_auth_check_connection(requests_mock):
    yield requests_mock.post(
        "https://analyticsreporting.googleapis.com/v4/reports:batchGet",
        json={"data": {"test": "value"}},
    )


@pytest.fixture
def mock_unknown_metrics_or_dimensions_error(requests_mock):
    yield requests_mock.post(
        "https://analyticsreporting.googleapis.com/v4/reports:batchGet",
        status_code=400,
        json={"error": {"message": "Unknown metrics or dimensions"}},
    )


@pytest.fixture
def mock_api_returns_no_records(requests_mock):
    """API returns empty data for given date based slice"""
    yield requests_mock.post(
        "https://analyticsreporting.googleapis.com/v4/reports:batchGet",
        json=json.loads(read_file("empty_response.json")),
    )


@pytest.fixture
def mock_api_returns_valid_records(requests_mock):
    """API returns valid data for given date based slice"""
    yield requests_mock.post(
        "https://analyticsreporting.googleapis.com/v4/reports:batchGet",
        json=json.loads(read_file("response_with_records.json")),
    )


@pytest.fixture
def mock_api_returns_sampled_results(requests_mock):
    """API returns valid data for given date based slice"""
    yield requests_mock.post(
        "https://analyticsreporting.googleapis.com/v4/reports:batchGet",
        json=json.loads(read_file("response_with_sampling.json")),
    )


@pytest.fixture
def mock_api_returns_is_data_golden_false(requests_mock):
    """API returns valid data for given date based slice"""
    yield requests_mock.post(
        "https://analyticsreporting.googleapis.com/v4/reports:batchGet",
        json=json.loads(read_file("response_is_data_golden_false.json")),
    )


@pytest.fixture()
def test_config():
    test_config = json.loads(read_file("../integration_tests/sample_config.json"))
    test_config["authenticator"] = NoAuth()
    test_config["metrics"] = []
    test_config["dimensions"] = []
    test_config["credentials"] = {
        "type": "Service",
    }
    return test_config


def test_metrics_dimensions_type_list(mock_metrics_dimensions_type_list_link):
    test_metrics, test_dimensions = GoogleAnalyticsV4TypesList().read_records(sync_mode=None)
    # compare as a tuple; the original `assert a, b` form only checked `a` and used `b` as the message
    assert (test_metrics, test_dimensions) == expected_metrics_dimensions_type_map


def get_metrics_dimensions_mapping():
    test_metrics_dimensions_map = {
        "metric": [("ga:users", "integer"), ("ga:newUsers", "integer")],
        "dimension": [("ga:dimension", "string")],
    }
    for field_type, attribute_expected_pairs in test_metrics_dimensions_map.items():
        for attribute_expected_pair in attribute_expected_pairs:
            attribute, expected = attribute_expected_pair
            yield field_type, attribute, expected


@pytest.mark.parametrize("metrics_dimensions_mapping", get_metrics_dimensions_mapping())
def test_lookup_metrics_dimensions_data_type(test_config, metrics_dimensions_mapping, mock_metrics_dimensions_type_list_link):
    field_type, attribute, expected = metrics_dimensions_mapping
    g = GoogleAnalyticsV4Stream(config=test_config)
    test = g.lookup_data_type(field_type, attribute)
    assert test == expected


def test_data_is_not_golden_is_logged_as_warning(
    mock_api_returns_is_data_golden_false,
    test_config,
    mock_metrics_dimensions_type_list_link,
    mock_auth_call,
    caplog,
):
    source = SourceGoogleAnalyticsV4()
    del test_config["custom_reports"]
    catalog = ConfiguredAirbyteCatalog.parse_obj(json.loads(read_file("./configured_catalog.json")))
    list(source.read(logging.getLogger(), test_config, catalog))
    assert DATA_IS_NOT_GOLDEN_MSG in caplog.text


def test_sampled_result_is_logged_as_warning(
    mock_api_returns_sampled_results,
    test_config,
    mock_metrics_dimensions_type_list_link,
    mock_auth_call,
    caplog,
):
    source = SourceGoogleAnalyticsV4()
    del test_config["custom_reports"]
    catalog = ConfiguredAirbyteCatalog.parse_obj(json.loads(read_file("./configured_catalog.json")))
    list(source.read(logging.getLogger(), test_config, catalog))
    assert RESULT_IS_SAMPLED_MSG in caplog.text


def test_no_regressions_for_result_is_sampled_and_data_is_golden_warnings(
    mock_api_returns_valid_records,
    test_config,
    mock_metrics_dimensions_type_list_link,
    mock_auth_call,
    caplog,
):
    source = SourceGoogleAnalyticsV4()
    del test_config["custom_reports"]
    catalog = ConfiguredAirbyteCatalog.parse_obj(json.loads(read_file("./configured_catalog.json")))
    list(source.read(logging.getLogger(), test_config, catalog))
    assert RESULT_IS_SAMPLED_MSG not in caplog.text
    assert DATA_IS_NOT_GOLDEN_MSG not in caplog.text


@patch("source_google_analytics_v4.source.jwt")
def test_check_connection_fails_jwt(
    jwt_encode_mock,
    mocker,
    mock_metrics_dimensions_type_list_link,
    mock_auth_call,
    mock_api_returns_no_records,
):
    """
    check_connection fails because the API returns no records,
    so we assume the user doesn't have permission to read the requested `view`
    """
    test_config = json.loads(read_file("../integration_tests/sample_config.json"))
    del test_config["custom_reports"]
    test_config["credentials"] = {
        "auth_type": "Service",
        "credentials_json": '{"client_email": "", "private_key": "", "private_key_id": ""}',
    }
    source = SourceGoogleAnalyticsV4()
    is_success, msg = source.check_connection(MagicMock(), test_config)
    assert is_success is False
    assert (
        msg == f"Please check the permissions for the requested view_id: {test_config['view_id']}. Cannot retrieve data from that view ID."
    )
    jwt_encode_mock.encode.assert_called()
    assert mock_auth_call.called
    assert mock_api_returns_no_records.called


@patch("source_google_analytics_v4.source.jwt")
def test_check_connection_success_jwt(
    jwt_encode_mock,
    mocker,
    mock_metrics_dimensions_type_list_link,
    mock_auth_call,
    mock_api_returns_valid_records,
):
    """
    check_connection succeeds because the API returns valid records for the latest date based slice,
    so we assume the user has permission to read the requested `view`
    """
    test_config = json.loads(read_file("../integration_tests/sample_config.json"))
    del test_config["custom_reports"]
    test_config["credentials"] = {
        "auth_type": "Service",
        "credentials_json": '{"client_email": "", "private_key": "", "private_key_id": ""}',
    }
    source = SourceGoogleAnalyticsV4()
    is_success, msg = source.check_connection(MagicMock(), test_config)
    assert is_success is True
    assert msg is None
    jwt_encode_mock.encode.assert_called()
    assert mock_auth_call.called
    assert mock_api_returns_valid_records.called


@patch("source_google_analytics_v4.source.jwt")
def test_check_connection_fails_oauth(
    jwt_encode_mock,
    mocker,
    mock_metrics_dimensions_type_list_link,
    mock_auth_call,
    mock_api_returns_no_records,
):
    """
    check_connection fails because the API returns no records,
    so we assume the user doesn't have permission to read the requested `view`
    """
    test_config = json.loads(read_file("../integration_tests/sample_config.json"))
    del test_config["custom_reports"]
    test_config["credentials"] = {
        "auth_type": "Client",
        "client_id": "client_id_val",
        "client_secret": "client_secret_val",
        "refresh_token": "refresh_token_val",
    }
    source = SourceGoogleAnalyticsV4()
    is_success, msg = source.check_connection(MagicMock(), test_config)
    assert is_success is False
    assert (
        msg == f"Please check the permissions for the requested view_id: {test_config['view_id']}. Cannot retrieve data from that view ID."
    )
    jwt_encode_mock.encode.assert_not_called()
    assert "https://www.googleapis.com/auth/analytics.readonly" in unquote(mock_auth_call.last_request.body)
    assert "client_id_val" in unquote(mock_auth_call.last_request.body)
    assert "client_secret_val" in unquote(mock_auth_call.last_request.body)
    assert "refresh_token_val" in unquote(mock_auth_call.last_request.body)
    assert mock_auth_call.called
    assert mock_api_returns_no_records.called


@patch("source_google_analytics_v4.source.jwt")
def test_check_connection_success_oauth(
    jwt_encode_mock,
    mocker,
    mock_metrics_dimensions_type_list_link,
    mock_auth_call,
    mock_api_returns_valid_records,
):
    """
    check_connection succeeds because the API returns valid records for the latest date based slice,
    so we assume the user has permission to read the requested `view`
    """
    test_config = json.loads(read_file("../integration_tests/sample_config.json"))
    del test_config["custom_reports"]
    test_config["credentials"] = {
        "auth_type": "Client",
        "client_id": "client_id_val",
        "client_secret": "client_secret_val",
        "refresh_token": "refresh_token_val",
    }
    source = SourceGoogleAnalyticsV4()
    is_success, msg = source.check_connection(MagicMock(), test_config)
    assert is_success is True
    assert msg is None
    jwt_encode_mock.encode.assert_not_called()
    assert "https://www.googleapis.com/auth/analytics.readonly" in unquote(mock_auth_call.last_request.body)
    assert "client_id_val" in unquote(mock_auth_call.last_request.body)
    assert "client_secret_val" in unquote(mock_auth_call.last_request.body)
    assert "refresh_token_val" in unquote(mock_auth_call.last_request.body)
    assert mock_auth_call.called
    assert mock_api_returns_valid_records.called


def test_unknown_metrics_or_dimensions_error_validation(mock_metrics_dimensions_type_list_link, mock_unknown_metrics_or_dimensions_error):
    records = GoogleAnalyticsV4Stream(MagicMock()).read_records(sync_mode=None)
    assert records


@freeze_time("2021-11-30")
def test_stream_slices_limited_by_current_date(test_config):
    g = GoogleAnalyticsV4IncrementalObjectsBase(config=test_config)
    stream_state = {"ga_date": "2050-05-01"}
    slices = g.stream_slices(stream_state=stream_state)
    current_date = pendulum.now().date().strftime("%Y-%m-%d")
    assert len(slices) == 1
    assert slices[0]["startDate"] == slices[0]["endDate"]
    assert slices[0]["endDate"] == current_date


@freeze_time("2021-11-30")
def test_stream_slices_start_from_current_date_if_abnornal_state_is_passed(test_config):
    g = GoogleAnalyticsV4IncrementalObjectsBase(config=test_config)
    stream_state = {"ga_date": "2050-05-01"}
    slices = g.stream_slices(stream_state=stream_state)
    current_date = pendulum.now().date().strftime("%Y-%m-%d")
    assert len(slices) == 1
    assert slices[0]["startDate"] == slices[0]["endDate"]
    assert slices[0]["startDate"] == current_date
6afde18a959ce6ca0da861f708621da744f7a487 | 828 | py | Python | mandala/all.py | amakelov/mandala | ["Apache-2.0"]

import typing
from .core.utils import CompatArg, AsType, AsTransient, AsDelayedStorage, Mut, Skip, Mark
from .core.bases import unwrap, detached
from .core.config import CoreConfig
from .core.wrap import wrap_detached
from .storages.kv_impl.sqlite_impl import SQLiteStorage
from .storages.kv_impl.joblib_impl import JoblibStorage
from .ui.storage import Storage
from .ui.execution import wrap
from .ui.context import (
    context, run, query, transient, delete, define, noop, retrace, capture
)
from .ui.funcs import op, superop
from .ui.vars import Var, Query, BuiltinVars
from .util.logging_ut import set_logging_level
from .queries.rel_weaver import ValQuery, ListQuery, DictQuery
from .queries.rel_weaver import MakeList
from .queries.qfunc import qfunc
IndexQuery = BuiltinVars.IndexQuery
KeyQuery = BuiltinVars.KeyQuery
ed056decfe224db19bed102a648393e902392ba5 | 391 | py | Python | halo_bian/bian/domain/event.py | halo-framework/halo-bian | ["MIT"]

import uuid
from halo_app.domain.event import AbsHaloEvent
from halo_bian.bian.app.context import BianContext
from halo_bian.bian.bian import ActionTerms
class AbsBianEvent(AbsHaloEvent):
    action_term = None

    def __init__(self, context: BianContext, name: str, action_term: ActionTerms):
        super(AbsBianEvent, self).__init__(context, name)
        self.action_term = action_term
ed0b5866a5e35e7e83e13af79731c62bee8aaaed | 249 | py | Python | unnecessary_math.py | MarioRguezz/ISOExposition | ["Apache-2.0"]

'''
The function shows how the code runs, what it expects, and what its output is.
If the output does not match, the test will fail.
'''


def multiply(a, b):
    """
    >>> multiply(4, 3)
    12
    >>> multiply('a', 3)
    'aaa'
    """
    return a * b
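The docstring examples above are doctests: each `>>>` line is executed and its actual output is compared against the expected output written below it. A minimal sketch of running those embedded examples programmatically with the standard library's `doctest` module (the `multiply` definition is repeated here so the snippet is self-contained):

```python
import doctest


def multiply(a, b):
    """
    >>> multiply(4, 3)
    12
    >>> multiply('a', 3)
    'aaa'
    """
    return a * b


# Collect the doctests embedded in multiply's docstring and run them.
finder = doctest.DocTestFinder()
runner = doctest.DocTestRunner()
for test in finder.find(multiply, "multiply"):
    runner.run(test)

# Both examples pass: 12 == 4 * 3 and 'aaa' == 'a' * 3.
print(runner.tries, runner.failures)  # 2 0
```

Running `python -m doctest <file> -v` on the module achieves the same check from the command line.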
ed0f26876e0fd5437d37b574e27eba0cead983b1 | 1,373 | py | Python | pyllars/sparse_vector.py | bmmalone/pyllars | ["MIT"]

"""
This class is a thin wrapper around scipy.sparse.lil_matrix to reduce
the notational burden when dealing with sparse vectors. Internally, they
are simply stored as sparse matrices.

By default, the sparse vectors are created as integer row matrices. The
scipy.sparse.lil_matrix representation is used.

THIS CLASS HAS NOT BEEN TESTED EXTENSIVELY.
"""

import scipy.sparse


class lil_sparse_vector(scipy.sparse.lil_matrix):
    def __init__(self, size, dtype=int):
        super(lil_sparse_vector, self).__init__((size, 1), dtype=dtype)

    def __getitem__(self, index):
        """ This getter grabs the value stored in the vector at index.

        Args:
            index (int): The index

        Returns:
            self.dtype: The value stored at index
        """
        return super().__getitem__((index, 0))

    def __setitem__(self, index, value):
        """ This setter puts value into the stored vector at index.

        Args:
            index (int): The index

            value (self.dtype): The value to set. Type checking IS NOT performed

        Returns:
            None
        """
        super().__setitem__((index, 0), value)

    def __len__(self):
        """ This property returns the length of the first dimension of the
        stored matrix.
        """
        return super().shape[0]
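As a usage illustration of the wrapper class above — single-index access and `len()` instead of the `(i, 0)` matrix notation — here is a minimal sketch, assuming SciPy is installed (the class is repeated so the snippet is self-contained):

```python
import scipy.sparse


class lil_sparse_vector(scipy.sparse.lil_matrix):
    """Copy of the wrapper above so this sketch is self-contained."""

    def __init__(self, size, dtype=int):
        # store the vector as a (size, 1) sparse matrix
        super(lil_sparse_vector, self).__init__((size, 1), dtype=dtype)

    def __getitem__(self, index):
        # map vector index i to matrix index (i, 0)
        return super().__getitem__((index, 0))

    def __setitem__(self, index, value):
        super().__setitem__((index, 0), value)

    def __len__(self):
        return super().shape[0]


# A length-10 integer sparse vector: only explicitly set entries use memory.
v = lil_sparse_vector(10)
v[3] = 7
v[8] = 2

print(len(v))      # length of the vector, not the number of stored entries
print(v[3], v[8])  # the values that were set
print(v[0])        # unset entries read as zero
```

The same operations on a bare `lil_matrix` would require `m[3, 0] = 7` and `m.shape[0]`, which is the notational burden the docstring refers to.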
ed14bae8e800f94eaee259b21f08df06cf533069 | 782 | py | Python | Back-end/iot_smart_home/accounts/migrations/0002_alter_contractor_resume_and_more.py | sepahanit/IOT-SMARTHOME-IUT (also erfanbahrami/iot-smart-home-api) | ["MIT"]

# Generated by Django 4.0.1 on 2022-01-13 17:16
from django.db import migrations
import django_bleach.models


class Migration(migrations.Migration):

    dependencies = [
        ('accounts', '0001_initial'),
    ]

    operations = [
        migrations.AlterField(
            model_name='contractor',
            name='resume',
            field=django_bleach.models.BleachField(blank=True, null=True),
        ),
        migrations.AlterField(
            model_name='contractor',
            name='under_service_area',
            field=django_bleach.models.BleachField(blank=True, null=True),
        ),
        migrations.AlterField(
            model_name='user',
            name='address',
            field=django_bleach.models.BleachField(blank=True),
        ),
    ]
ed1c39f973097676df05e0114f0a0bf76361532f | 3,271 | py | Python | PhdTesterExample/phdTesterExample/models.py | Koldar/phdTester | ["MIT"]

from typing import Dict
import string_utils
import phdTester as phd
class SortTestContext(phd.AbstractTestContext):
def __init__(self, ut: "SortAlgorithm" = None, te: "SortEnvironment" = None):
super().__init__(ut, te)
@property
def ut(self) -> "SortAlgorithm":
return self._ut
@property
def te(self) -> "SortEnvironment":
return self._te
class SortAlgorithm(phd.AbstractStuffUnderTest):
# @property
# def key_alias(self) -> Dict[str, str]:
# return {
# "algorithm": "a",
# }
# @property
# def value_alias(self) -> Dict[str, str]:
# # No alias
# return {}
def __init__(self):
phd.AbstractStuffUnderTest.__init__(self)
self.algorithm: str = None
self.shrinkFactor: float = None
# def get_label(self) -> str:
# return f"{self.algorithm}"
class SortEnvironment(phd.AbstractTestingEnvironment):
def get_order_key(self) -> str:
return "_".join(map(lambda o: f"{o}={str(self.get_option(o))}", self.options()))
# @property
# def key_alias(self) -> Dict[str, str]:
# return {
# "sequenceSize": "ss",
# "sequenceType": "st",
# "lowerBound": "lb",
# "upperBound": "ub",
# "run": "r",
# }
#
# @property
# def value_alias(self) -> Dict[str, str]:
# return {}
def __init__(self):
phd.AbstractTestingEnvironment.__init__(self)
self.sequenceSize: int = None
self.sequenceType: str = None
self.lowerBound: int = None
self.upperBound: int = None
self.run: int = None
def get_label(self) -> str:
return f"size={self.sequenceSize} type={self.sequenceType} lb={self.lowerBound} ub={self.upperBound} run={self.run}"
class SortSettings(phd.AbstractTestingGlobalSettings):
def __init__(self):
phd.AbstractTestingGlobalSettings.__init__(self)
self.buildDirectory = None
self.logLevel = None
class SortTestContextMask(phd.AbstractTestContextMask):
def __init__(self, ut: "SortAlgorithmMask", te: "SortEnvironmentMask"):
phd.AbstractTestContextMask.__init__(self, ut=ut, te=te)
@property
def ut(self) -> "SortAlgorithmMask":
return self._ut
@property
def te(self) -> "SortEnvironmentMask":
return self._te
class SortAlgorithmMask(phd.AbstractStuffUnderTestMask):
def __init__(self):
phd.AbstractStuffUnderTestMask.__init__(self)
self.algorithm: "phd.ITestContextMaskOption" = None
self.shrinkFactor: "phd.ITestContextMaskOption" = None
class SortEnvironmentMask(phd.AbstractTestEnvironmentMask):
def __init__(self):
phd.AbstractTestEnvironmentMask.__init__(self)
self.sequenceSize: phd.ITestContextMaskOption = None
self.sequenceType: phd.ITestContextMaskOption = None
self.lowerBound: phd.ITestContextMaskOption = None
self.upperBound: phd.ITestContextMaskOption = None
self.run: phd.ITestContextMaskOption = None
class PerformanceCsvRow(phd.AbstractCSVRow):
def __init__(self):
phd.AbstractCSVRow.__init__(self)
self.run: int = None
self.time: int = None
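`SortEnvironment.get_order_key` above serializes every option as a `name=value` pair joined with underscores. A minimal, dependency-free sketch of that pattern — `sorted()` is added here for deterministic output and is an assumption not present in the original, which iterates `self.options()` in its own order:

```python
# Sketch of the order-key pattern from SortEnvironment.get_order_key:
# each option becomes "name=value", joined with "_" into one sortable key.
def order_key(options: dict) -> str:
    return "_".join(f"{name}={value}" for name, value in sorted(options.items()))

print(order_key({"sequenceSize": 100, "run": 3}))  # run=3_sequenceSize=100
```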
| 26.811475 | 124 | 0.638337 | 323 | 3,271 | 6.219814 | 0.219814 | 0.059731 | 0.043803 | 0.041812 | 0.170234 | 0.134395 | 0.131409 | 0.102539 | 0.038825 | 0 | 0 | 0 | 0.244879 | 3,271 | 121 | 125 | 27.033058 | 0.81336 | 0.152858 | 0 | 0.262295 | 0 | 0.016393 | 0.115118 | 0.046995 | 0 | 0 | 0 | 0 | 0 | 1 | 0.229508 | false | 0 | 0.04918 | 0.098361 | 0.508197 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
ed1dd2c09eb6b1500a5c2ec26b7d25a4d8bde913 | 1,589 | py | Python | tests/pyxl_original/test_errors.py | adrienbrunet/mixt | d725ec752ce430d135e993bc988bfdf2b8457c4b | [
"MIT"
] | 27 | 2018-06-04T19:11:42.000Z | 2022-02-23T22:46:39.000Z | tests/pyxl_original/test_errors.py | adrienbrunet/mixt | d725ec752ce430d135e993bc988bfdf2b8457c4b | [
"MIT"
] | 7 | 2018-06-09T15:27:51.000Z | 2021-03-11T20:00:35.000Z | tests/pyxl_original/test_errors.py | adrienbrunet/mixt | d725ec752ce430d135e993bc988bfdf2b8457c4b | [
"MIT"
] | 3 | 2018-07-29T10:20:02.000Z | 2021-11-18T19:55:07.000Z | # coding: mixt
import pytest
from mixt.codec.register import pyxl_decode
from mixt.exceptions import ParserError
def test_malformed_if():
with pytest.raises(ParserError):
pyxl_decode(b"""
<Fragment>
<if cond="{true}">foo</if>
this is incorrect!
<else>bar</else>
</Fragment>""")
def test_invalid_if_prop():
with pytest.raises(ParserError):
pyxl_decode(b"""
<Fragment>
<if cond="{true}" foo="bar">foo</if>
</Fragment>""")
def test_missing_if_cond():
with pytest.raises(ParserError):
pyxl_decode(b"""
<Fragment>
<if>foo</if>
</Fragment>""")
with pytest.raises(ParserError):
pyxl_decode(b"""
<Fragment>
<if foo="bar">foo</if>
</Fragment>""")
def test_multiple_else():
with pytest.raises(ParserError):
pyxl_decode(b"""
<Fragment>
<if cond="{true}">foo</if>
<else>bar</else>
<else>baz</else>
</Fragment>""")
def test_nested_else():
with pytest.raises(ParserError):
pyxl_decode(b"""
<Fragment>
<if cond="{true}">foo</if>
<else><else>bar</else></else>
</Fragment>""")
def test_else_with_prop():
with pytest.raises(ParserError):
pyxl_decode(b"""
<Fragment>
<if cond="{true}">foo</if>
<else foo="bar">bar</else>
</Fragment>""")
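Every test above follows the same shape: feed malformed markup to `pyxl_decode` and expect a `ParserError`. A pytest-free, self-contained sketch of that pattern, using a toy stand-in parser (hypothetical — this is not mixt's real parsing logic):

```python
# Toy stand-in for pyxl_decode, mimicking test_multiple_else above:
# a second <else> clause is treated as a parse error.
class ParserError(Exception):
    pass

def toy_decode(src: bytes) -> str:
    if src.count(b"<else>") > 1:
        raise ParserError("multiple <else> clauses")
    return src.decode()

try:
    toy_decode(b"<if>foo</if><else>bar</else><else>baz</else>")
except ParserError as exc:
    print("rejected:", exc)
```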
| 25.629032 | 52 | 0.499056 | 163 | 1,589 | 4.723926 | 0.202454 | 0.103896 | 0.145455 | 0.245455 | 0.622078 | 0.622078 | 0.622078 | 0.562338 | 0.562338 | 0.562338 | 0 | 0 | 0.348647 | 1,589 | 61 | 53 | 26.04918 | 0.743961 | 0.007552 | 0 | 0.68 | 0 | 0 | 0.537143 | 0.074286 | 0 | 0 | 0 | 0 | 0 | 1 | 0.12 | true | 0 | 0.06 | 0 | 0.18 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
ed25e613d23cb8d2d264542f6b5ae8f44bf1adf0 | 25,550 | py | Python | sdk/python/pulumi_azure_native/certificateregistration/v20200601/app_service_certificate_order.py | polivbr/pulumi-azure-native | 09571f3bf6bdc4f3621aabefd1ba6c0d4ecfb0e7 | [
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_azure_native/certificateregistration/v20200601/app_service_certificate_order.py | polivbr/pulumi-azure-native | 09571f3bf6bdc4f3621aabefd1ba6c0d4ecfb0e7 | [
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_azure_native/certificateregistration/v20200601/app_service_certificate_order.py | polivbr/pulumi-azure-native | 09571f3bf6bdc4f3621aabefd1ba6c0d4ecfb0e7 | [
"Apache-2.0"
] | null | null | null | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi SDK Generator. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from ... import _utilities
from . import outputs
from ._enums import *
from ._inputs import *
__all__ = ['AppServiceCertificateOrderArgs', 'AppServiceCertificateOrder']
@pulumi.input_type
class AppServiceCertificateOrderArgs:
def __init__(__self__, *,
product_type: pulumi.Input['CertificateProductType'],
resource_group_name: pulumi.Input[str],
auto_renew: Optional[pulumi.Input[bool]] = None,
certificate_order_name: Optional[pulumi.Input[str]] = None,
certificates: Optional[pulumi.Input[Mapping[str, pulumi.Input['AppServiceCertificateArgs']]]] = None,
csr: Optional[pulumi.Input[str]] = None,
distinguished_name: Optional[pulumi.Input[str]] = None,
key_size: Optional[pulumi.Input[int]] = None,
kind: Optional[pulumi.Input[str]] = None,
location: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
validity_in_years: Optional[pulumi.Input[int]] = None):
"""
The set of arguments for constructing a AppServiceCertificateOrder resource.
:param pulumi.Input['CertificateProductType'] product_type: Certificate product type.
:param pulumi.Input[str] resource_group_name: Name of the resource group to which the resource belongs.
:param pulumi.Input[bool] auto_renew: <code>true</code> if the certificate should be automatically renewed when it expires; otherwise, <code>false</code>.
:param pulumi.Input[str] certificate_order_name: Name of the certificate order.
:param pulumi.Input[Mapping[str, pulumi.Input['AppServiceCertificateArgs']]] certificates: State of the Key Vault secret.
:param pulumi.Input[str] csr: Last CSR that was created for this order.
:param pulumi.Input[str] distinguished_name: Certificate distinguished name.
:param pulumi.Input[int] key_size: Certificate key size.
:param pulumi.Input[str] kind: Kind of resource.
:param pulumi.Input[str] location: Resource Location.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: Resource tags.
:param pulumi.Input[int] validity_in_years: Duration in years (must be between 1 and 3).
"""
pulumi.set(__self__, "product_type", product_type)
pulumi.set(__self__, "resource_group_name", resource_group_name)
if auto_renew is None:
auto_renew = True
if auto_renew is not None:
pulumi.set(__self__, "auto_renew", auto_renew)
if certificate_order_name is not None:
pulumi.set(__self__, "certificate_order_name", certificate_order_name)
if certificates is not None:
pulumi.set(__self__, "certificates", certificates)
if csr is not None:
pulumi.set(__self__, "csr", csr)
if distinguished_name is not None:
pulumi.set(__self__, "distinguished_name", distinguished_name)
if key_size is None:
key_size = 2048
if key_size is not None:
pulumi.set(__self__, "key_size", key_size)
if kind is not None:
pulumi.set(__self__, "kind", kind)
if location is not None:
pulumi.set(__self__, "location", location)
if tags is not None:
pulumi.set(__self__, "tags", tags)
if validity_in_years is None:
validity_in_years = 1
if validity_in_years is not None:
pulumi.set(__self__, "validity_in_years", validity_in_years)
@property
@pulumi.getter(name="productType")
def product_type(self) -> pulumi.Input['CertificateProductType']:
"""
Certificate product type.
"""
return pulumi.get(self, "product_type")
@product_type.setter
def product_type(self, value: pulumi.Input['CertificateProductType']):
pulumi.set(self, "product_type", value)
@property
@pulumi.getter(name="resourceGroupName")
def resource_group_name(self) -> pulumi.Input[str]:
"""
Name of the resource group to which the resource belongs.
"""
return pulumi.get(self, "resource_group_name")
@resource_group_name.setter
def resource_group_name(self, value: pulumi.Input[str]):
pulumi.set(self, "resource_group_name", value)
@property
@pulumi.getter(name="autoRenew")
def auto_renew(self) -> Optional[pulumi.Input[bool]]:
"""
<code>true</code> if the certificate should be automatically renewed when it expires; otherwise, <code>false</code>.
"""
return pulumi.get(self, "auto_renew")
@auto_renew.setter
def auto_renew(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "auto_renew", value)
@property
@pulumi.getter(name="certificateOrderName")
def certificate_order_name(self) -> Optional[pulumi.Input[str]]:
"""
Name of the certificate order.
"""
return pulumi.get(self, "certificate_order_name")
@certificate_order_name.setter
def certificate_order_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "certificate_order_name", value)
@property
@pulumi.getter
def certificates(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input['AppServiceCertificateArgs']]]]:
"""
State of the Key Vault secret.
"""
return pulumi.get(self, "certificates")
@certificates.setter
def certificates(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input['AppServiceCertificateArgs']]]]):
pulumi.set(self, "certificates", value)
@property
@pulumi.getter
def csr(self) -> Optional[pulumi.Input[str]]:
"""
Last CSR that was created for this order.
"""
return pulumi.get(self, "csr")
@csr.setter
def csr(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "csr", value)
@property
@pulumi.getter(name="distinguishedName")
def distinguished_name(self) -> Optional[pulumi.Input[str]]:
"""
Certificate distinguished name.
"""
return pulumi.get(self, "distinguished_name")
@distinguished_name.setter
def distinguished_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "distinguished_name", value)
@property
@pulumi.getter(name="keySize")
def key_size(self) -> Optional[pulumi.Input[int]]:
"""
Certificate key size.
"""
return pulumi.get(self, "key_size")
@key_size.setter
def key_size(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "key_size", value)
@property
@pulumi.getter
def kind(self) -> Optional[pulumi.Input[str]]:
"""
Kind of resource.
"""
return pulumi.get(self, "kind")
@kind.setter
def kind(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "kind", value)
@property
@pulumi.getter
def location(self) -> Optional[pulumi.Input[str]]:
"""
Resource Location.
"""
return pulumi.get(self, "location")
@location.setter
def location(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "location", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
Resource tags.
"""
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
@property
@pulumi.getter(name="validityInYears")
def validity_in_years(self) -> Optional[pulumi.Input[int]]:
"""
Duration in years (must be between 1 and 3).
"""
return pulumi.get(self, "validity_in_years")
@validity_in_years.setter
def validity_in_years(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "validity_in_years", value)
class AppServiceCertificateOrder(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
auto_renew: Optional[pulumi.Input[bool]] = None,
certificate_order_name: Optional[pulumi.Input[str]] = None,
certificates: Optional[pulumi.Input[Mapping[str, pulumi.Input[pulumi.InputType['AppServiceCertificateArgs']]]]] = None,
csr: Optional[pulumi.Input[str]] = None,
distinguished_name: Optional[pulumi.Input[str]] = None,
key_size: Optional[pulumi.Input[int]] = None,
kind: Optional[pulumi.Input[str]] = None,
location: Optional[pulumi.Input[str]] = None,
product_type: Optional[pulumi.Input['CertificateProductType']] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
validity_in_years: Optional[pulumi.Input[int]] = None,
__props__=None):
"""
SSL certificate purchase order.
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[bool] auto_renew: <code>true</code> if the certificate should be automatically renewed when it expires; otherwise, <code>false</code>.
:param pulumi.Input[str] certificate_order_name: Name of the certificate order.
:param pulumi.Input[Mapping[str, pulumi.Input[pulumi.InputType['AppServiceCertificateArgs']]]] certificates: State of the Key Vault secret.
:param pulumi.Input[str] csr: Last CSR that was created for this order.
:param pulumi.Input[str] distinguished_name: Certificate distinguished name.
:param pulumi.Input[int] key_size: Certificate key size.
:param pulumi.Input[str] kind: Kind of resource.
:param pulumi.Input[str] location: Resource Location.
:param pulumi.Input['CertificateProductType'] product_type: Certificate product type.
:param pulumi.Input[str] resource_group_name: Name of the resource group to which the resource belongs.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] tags: Resource tags.
:param pulumi.Input[int] validity_in_years: Duration in years (must be between 1 and 3).
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: AppServiceCertificateOrderArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
SSL certificate purchase order.
:param str resource_name: The name of the resource.
:param AppServiceCertificateOrderArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(AppServiceCertificateOrderArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
auto_renew: Optional[pulumi.Input[bool]] = None,
certificate_order_name: Optional[pulumi.Input[str]] = None,
certificates: Optional[pulumi.Input[Mapping[str, pulumi.Input[pulumi.InputType['AppServiceCertificateArgs']]]]] = None,
csr: Optional[pulumi.Input[str]] = None,
distinguished_name: Optional[pulumi.Input[str]] = None,
key_size: Optional[pulumi.Input[int]] = None,
kind: Optional[pulumi.Input[str]] = None,
location: Optional[pulumi.Input[str]] = None,
product_type: Optional[pulumi.Input['CertificateProductType']] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
validity_in_years: Optional[pulumi.Input[int]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = AppServiceCertificateOrderArgs.__new__(AppServiceCertificateOrderArgs)
if auto_renew is None:
auto_renew = True
__props__.__dict__["auto_renew"] = auto_renew
__props__.__dict__["certificate_order_name"] = certificate_order_name
__props__.__dict__["certificates"] = certificates
__props__.__dict__["csr"] = csr
__props__.__dict__["distinguished_name"] = distinguished_name
if key_size is None:
key_size = 2048
__props__.__dict__["key_size"] = key_size
__props__.__dict__["kind"] = kind
__props__.__dict__["location"] = location
if product_type is None and not opts.urn:
raise TypeError("Missing required property 'product_type'")
__props__.__dict__["product_type"] = product_type
if resource_group_name is None and not opts.urn:
raise TypeError("Missing required property 'resource_group_name'")
__props__.__dict__["resource_group_name"] = resource_group_name
__props__.__dict__["tags"] = tags
if validity_in_years is None:
validity_in_years = 1
__props__.__dict__["validity_in_years"] = validity_in_years
__props__.__dict__["app_service_certificate_not_renewable_reasons"] = None
__props__.__dict__["domain_verification_token"] = None
__props__.__dict__["expiration_time"] = None
__props__.__dict__["intermediate"] = None
__props__.__dict__["is_private_key_external"] = None
__props__.__dict__["last_certificate_issuance_time"] = None
__props__.__dict__["name"] = None
__props__.__dict__["next_auto_renewal_time_stamp"] = None
__props__.__dict__["provisioning_state"] = None
__props__.__dict__["root"] = None
__props__.__dict__["serial_number"] = None
__props__.__dict__["signed_certificate"] = None
__props__.__dict__["status"] = None
__props__.__dict__["type"] = None
alias_opts = pulumi.ResourceOptions(aliases=[pulumi.Alias(type_="azure-nextgen:certificateregistration/v20200601:AppServiceCertificateOrder"), pulumi.Alias(type_="azure-native:certificateregistration:AppServiceCertificateOrder"), pulumi.Alias(type_="azure-nextgen:certificateregistration:AppServiceCertificateOrder"), pulumi.Alias(type_="azure-native:certificateregistration/v20150801:AppServiceCertificateOrder"), pulumi.Alias(type_="azure-nextgen:certificateregistration/v20150801:AppServiceCertificateOrder"), pulumi.Alias(type_="azure-native:certificateregistration/v20180201:AppServiceCertificateOrder"), pulumi.Alias(type_="azure-nextgen:certificateregistration/v20180201:AppServiceCertificateOrder"), pulumi.Alias(type_="azure-native:certificateregistration/v20190801:AppServiceCertificateOrder"), pulumi.Alias(type_="azure-nextgen:certificateregistration/v20190801:AppServiceCertificateOrder"), pulumi.Alias(type_="azure-native:certificateregistration/v20200901:AppServiceCertificateOrder"), pulumi.Alias(type_="azure-nextgen:certificateregistration/v20200901:AppServiceCertificateOrder"), pulumi.Alias(type_="azure-native:certificateregistration/v20201001:AppServiceCertificateOrder"), pulumi.Alias(type_="azure-nextgen:certificateregistration/v20201001:AppServiceCertificateOrder"), pulumi.Alias(type_="azure-native:certificateregistration/v20201201:AppServiceCertificateOrder"), pulumi.Alias(type_="azure-nextgen:certificateregistration/v20201201:AppServiceCertificateOrder"), pulumi.Alias(type_="azure-native:certificateregistration/v20210101:AppServiceCertificateOrder"), pulumi.Alias(type_="azure-nextgen:certificateregistration/v20210101:AppServiceCertificateOrder"), pulumi.Alias(type_="azure-native:certificateregistration/v20210115:AppServiceCertificateOrder"), pulumi.Alias(type_="azure-nextgen:certificateregistration/v20210115:AppServiceCertificateOrder"), pulumi.Alias(type_="azure-native:certificateregistration/v20210201:AppServiceCertificateOrder"), 
pulumi.Alias(type_="azure-nextgen:certificateregistration/v20210201:AppServiceCertificateOrder")])
opts = pulumi.ResourceOptions.merge(opts, alias_opts)
super(AppServiceCertificateOrder, __self__).__init__(
'azure-native:certificateregistration/v20200601:AppServiceCertificateOrder',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None) -> 'AppServiceCertificateOrder':
"""
Get an existing AppServiceCertificateOrder resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = AppServiceCertificateOrderArgs.__new__(AppServiceCertificateOrderArgs)
__props__.__dict__["app_service_certificate_not_renewable_reasons"] = None
__props__.__dict__["auto_renew"] = None
__props__.__dict__["certificates"] = None
__props__.__dict__["csr"] = None
__props__.__dict__["distinguished_name"] = None
__props__.__dict__["domain_verification_token"] = None
__props__.__dict__["expiration_time"] = None
__props__.__dict__["intermediate"] = None
__props__.__dict__["is_private_key_external"] = None
__props__.__dict__["key_size"] = None
__props__.__dict__["kind"] = None
__props__.__dict__["last_certificate_issuance_time"] = None
__props__.__dict__["location"] = None
__props__.__dict__["name"] = None
__props__.__dict__["next_auto_renewal_time_stamp"] = None
__props__.__dict__["product_type"] = None
__props__.__dict__["provisioning_state"] = None
__props__.__dict__["root"] = None
__props__.__dict__["serial_number"] = None
__props__.__dict__["signed_certificate"] = None
__props__.__dict__["status"] = None
__props__.__dict__["tags"] = None
__props__.__dict__["type"] = None
__props__.__dict__["validity_in_years"] = None
return AppServiceCertificateOrder(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="appServiceCertificateNotRenewableReasons")
def app_service_certificate_not_renewable_reasons(self) -> pulumi.Output[Sequence[str]]:
"""
Reasons why App Service Certificate is not renewable at the current moment.
"""
return pulumi.get(self, "app_service_certificate_not_renewable_reasons")
@property
@pulumi.getter(name="autoRenew")
def auto_renew(self) -> pulumi.Output[Optional[bool]]:
"""
<code>true</code> if the certificate should be automatically renewed when it expires; otherwise, <code>false</code>.
"""
return pulumi.get(self, "auto_renew")
@property
@pulumi.getter
def certificates(self) -> pulumi.Output[Optional[Mapping[str, 'outputs.AppServiceCertificateResponse']]]:
"""
State of the Key Vault secret.
"""
return pulumi.get(self, "certificates")
@property
@pulumi.getter
def csr(self) -> pulumi.Output[Optional[str]]:
"""
Last CSR that was created for this order.
"""
return pulumi.get(self, "csr")
@property
@pulumi.getter(name="distinguishedName")
def distinguished_name(self) -> pulumi.Output[Optional[str]]:
"""
Certificate distinguished name.
"""
return pulumi.get(self, "distinguished_name")
@property
@pulumi.getter(name="domainVerificationToken")
def domain_verification_token(self) -> pulumi.Output[str]:
"""
Domain verification token.
"""
return pulumi.get(self, "domain_verification_token")
@property
@pulumi.getter(name="expirationTime")
def expiration_time(self) -> pulumi.Output[str]:
"""
Certificate expiration time.
"""
return pulumi.get(self, "expiration_time")
@property
@pulumi.getter
def intermediate(self) -> pulumi.Output['outputs.CertificateDetailsResponse']:
"""
Intermediate certificate.
"""
return pulumi.get(self, "intermediate")
@property
@pulumi.getter(name="isPrivateKeyExternal")
def is_private_key_external(self) -> pulumi.Output[bool]:
"""
<code>true</code> if private key is external; otherwise, <code>false</code>.
"""
return pulumi.get(self, "is_private_key_external")
@property
@pulumi.getter(name="keySize")
def key_size(self) -> pulumi.Output[Optional[int]]:
"""
Certificate key size.
"""
return pulumi.get(self, "key_size")
@property
@pulumi.getter
def kind(self) -> pulumi.Output[Optional[str]]:
"""
Kind of resource.
"""
return pulumi.get(self, "kind")
@property
@pulumi.getter(name="lastCertificateIssuanceTime")
def last_certificate_issuance_time(self) -> pulumi.Output[str]:
"""
Certificate last issuance time.
"""
return pulumi.get(self, "last_certificate_issuance_time")
@property
@pulumi.getter
def location(self) -> pulumi.Output[str]:
"""
Resource Location.
"""
return pulumi.get(self, "location")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
Resource Name.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter(name="nextAutoRenewalTimeStamp")
def next_auto_renewal_time_stamp(self) -> pulumi.Output[str]:
"""
Time stamp when the certificate would be auto renewed next
"""
return pulumi.get(self, "next_auto_renewal_time_stamp")
@property
@pulumi.getter(name="productType")
def product_type(self) -> pulumi.Output[str]:
"""
Certificate product type.
"""
return pulumi.get(self, "product_type")
@property
@pulumi.getter(name="provisioningState")
def provisioning_state(self) -> pulumi.Output[str]:
"""
Status of certificate order.
"""
return pulumi.get(self, "provisioning_state")
@property
@pulumi.getter
def root(self) -> pulumi.Output['outputs.CertificateDetailsResponse']:
"""
Root certificate.
"""
return pulumi.get(self, "root")
@property
@pulumi.getter(name="serialNumber")
def serial_number(self) -> pulumi.Output[str]:
"""
Current serial number of the certificate.
"""
return pulumi.get(self, "serial_number")
@property
@pulumi.getter(name="signedCertificate")
def signed_certificate(self) -> pulumi.Output['outputs.CertificateDetailsResponse']:
"""
Signed certificate.
"""
return pulumi.get(self, "signed_certificate")
@property
@pulumi.getter
def status(self) -> pulumi.Output[str]:
"""
Current order status.
"""
return pulumi.get(self, "status")
@property
@pulumi.getter
def tags(self) -> pulumi.Output[Optional[Mapping[str, str]]]:
"""
Resource tags.
"""
return pulumi.get(self, "tags")
@property
@pulumi.getter
def type(self) -> pulumi.Output[str]:
"""
Resource type.
"""
return pulumi.get(self, "type")
@property
@pulumi.getter(name="validityInYears")
def validity_in_years(self) -> pulumi.Output[Optional[int]]:
"""
Duration in years (must be between 1 and 3).
"""
return pulumi.get(self, "validity_in_years")
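The `_internal_init` above applies three None-coalescing defaults before registering the resource: `auto_renew=True`, `key_size=2048`, and `validity_in_years=1`. A Pulumi-free sketch of just that defaulting step (the function name `apply_defaults` is illustrative; the default values are taken from the code above):

```python
# Pulumi-free sketch of the None-coalescing defaults used by
# AppServiceCertificateOrder's constructor.
def apply_defaults(auto_renew=None, key_size=None, validity_in_years=None):
    return {
        "auto_renew": True if auto_renew is None else auto_renew,
        "key_size": 2048 if key_size is None else key_size,
        "validity_in_years": 1 if validity_in_years is None else validity_in_years,
    }

print(apply_defaults(key_size=4096))
```

This mirrors the generated SDK's convention: explicit `None` from the caller means "use the service default", while any other value (including falsy ones like `False` or `0`) is passed through unchanged.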
| 43.900344 | 2,081 | 0.658395 | 2,696 | 25,550 | 5.914318 | 0.083086 | 0.069677 | 0.064346 | 0.042897 | 0.746943 | 0.647538 | 0.575353 | 0.450298 | 0.415303 | 0.362935 | 0 | 0.009114 | 0.231311 | 25,550 | 581 | 2,082 | 43.975904 | 0.802749 | 0.170568 | 0 | 0.46281 | 1 | 0 | 0.206009 | 0.136423 | 0 | 0 | 0 | 0 | 0 | 1 | 0.14876 | false | 0.002755 | 0.022039 | 0 | 0.278237 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
ed32f0ac7ceafd4f8aabdbb9f27efe91014c9553 | 216 | py | Python | discord_data/common.py | seanbreckenridge/discord_data | 92a71f4362567a39b7d41b81702c07868a85f182 | [
"Apache-2.0"
] | 6 | 2020-10-27T01:23:01.000Z | 2021-10-02T21:27:47.000Z | discord_data/common.py | seanbreckenridge/discord_data | 92a71f4362567a39b7d41b81702c07868a85f182 | [
"Apache-2.0"
] | 3 | 2021-03-20T04:32:06.000Z | 2022-02-10T04:28:22.000Z | discord_data/common.py | seanbreckenridge/discord_data | 92a71f4362567a39b7d41b81702c07868a85f182 | [
"Apache-2.0"
] | 1 | 2021-10-02T23:52:21.000Z | 2021-10-02T23:52:21.000Z | from typing import Union
from pathlib import Path
PathIsh = Union[str, Path]
def expand_path(path: PathIsh) -> Path:
if isinstance(path, str):
path = Path(path)
return path.expanduser().absolute()
| 19.636364 | 39 | 0.689815 | 29 | 216 | 5.103448 | 0.517241 | 0.162162 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.203704 | 216 | 10 | 40 | 21.6 | 0.860465 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.285714 | 0 | 0.571429 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
ed456a3edcb26f25ba9644d33b232d93aaa7ed8e | 1,379 | py | Python | examples/chanlun/feature/03/ex_setup_commands.py | garywangiam02/vnpy | fbb168bf977d95ae874e92a3655c6c893db16a1f | [
"MIT"
] | null | null | null | examples/chanlun/feature/03/ex_setup_commands.py | garywangiam02/vnpy | fbb168bf977d95ae874e92a3655c6c893db16a1f | [
"MIT"
] | null | null | null | examples/chanlun/feature/03/ex_setup_commands.py | garywangiam02/vnpy | fbb168bf977d95ae874e92a3655c6c893db16a1f | [
"MIT"
] | null | null | null | # coding=utf-8
from pytdx.parser.base import BaseParser
from pytdx.helper import get_datetime, get_volume, get_price
from collections import OrderedDict
import struct
class ExSetupCmd1(BaseParser):
def setup(self):
# self.send_pkg = bytearray.fromhex("01 01 48 65 00 01 52 00 52 00 54 24 1f 32 c6 e5"
# "d5 3d fb 41 1f 32 c6 e5 d5 3d fb 41 1f 32 c6 e5"
# "d5 3d fb 41 1f 32 c6 e5 d5 3d fb 41 1f 32 c6 e5"
# "d5 3d fb 41 1f 32 c6 e5 d5 3d fb 41 1f 32 c6 e5"
# "d5 3d fb 41 1f 32 c6 e5 d5 3d fb 41 cc e1 6d ff"
# "d5 ba 3f b8 cb c5 7a 05 4f 77 48 ea")
self.send_pkg = bytearray.fromhex("01 01 48 65 00 01 52 00 52 00 54 24"
"FC F0 0E 92 F3 C8 37 83 1F 32 C6 E5 D5 3D FB 41 CD 9C"
"F2 07 FC D0 3C F6 F2 F7 A4 77 47 83 1D 59 9D CC 1F 91"
"D5 55 82 DC 09 07 EE 29 DD FE 4C 28 1F 32 C6 E5 D5 3D"
"FB 41 1F 32 C6 E5 D5 3D FB 41 F3 43 87 E6 68 A9 2A A3"
"70 11 E4 9C D2 6E B0 1A")
def parseResponse(self, body_buf):
pass
| 53.038462 | 97 | 0.460479 | 231 | 1,379 | 2.722944 | 0.454545 | 0.069952 | 0.104928 | 0.139905 | 0.441971 | 0.441971 | 0.441971 | 0.441971 | 0.416534 | 0.416534 | 0 | 0.333333 | 0.497462 | 1,379 | 25 | 98 | 55.16 | 0.574315 | 0.373459 | 0 | 0 | 0 | 0 | 0.315421 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0.071429 | 0.285714 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
ed4fbfac36b74f5d9d3d7d151dd4dfd5ee7c227f | 3,635 | py | Python | SMT2/Torres/torres.py | luispozas/PR | 34a36ec09d9a1bc9f5cc99882243da5f453c24e8 | [
"MIT"
] | 1 | 2021-06-06T18:28:48.000Z | 2021-06-06T18:28:48.000Z | SMT2/Torres/torres.py | luispozas/PR | 34a36ec09d9a1bc9f5cc99882243da5f453c24e8 | [
"MIT"
] | null | null | null | SMT2/Torres/torres.py | luispozas/PR | 34a36ec09d9a1bc9f5cc99882243da5f453c24e8 | [
"MIT"
] | null | null | null | #!/usr/bin/python3
import sys
# altura : height of the tower
# disp   : available pieces
# Colors: Blue = 0, Red = 1, Green = 2
altura = int(input())
disp = [int(x) for x in input().split()]
def torre (i):
return "torre_"+str(i)
def setlogic(l):
return "(set-logic "+ l +")"
def intvar(v):
return "(declare-fun "+v+" () Int)"
def bool2int(b):
return "(ite "+b+" 1 0 )"
def addand(a1,a2):
return "(and "+a1+" "+a2+" )"
def addor(a1,a2):
return "(or "+a1+" "+a2+" )"
def addnot(a):
return "(not "+a+" )"
def addexists(a):
if len(a) == 0:
return "false"
elif len(a) == 1:
return a[0]
else :
x = a.pop()
return "(or " + x + " " + addexists(a) + " )"
def addeq(a1,a2):
return "(= "+a1+" "+a2+" )"
def addle(a1,a2):
return "(<= "+a1+" "+a2+" )"
def addge(a1,a2):
return "(>= "+a1+" "+a2+" )"
def addlt(a1,a2):
return "(< "+a1+" "+a2+" )"
def addgt(a1,a2):
return "(> "+a1+" "+a2+" )"
def addplus(a1,a2):
return "(+ "+a1+" "+a2+" )"
def addassert(a):
return "(assert "+a+" )"
def addsum(a):
if len(a) == 0:
return "0"
elif len(a) == 1:
return a[0]
else :
x = a.pop()
return "(+ " + x + " " + addsum(a) + " )"
def checksat():
print("(check-sat)")
def getmodel():
print("(get-model)")
def getvalue(l):
print("(get-value " + l + " )")
################################
# generate an SMT-LIB 2 file
################################
print("(set-option :produce-models true)")
print(setlogic("QF_LIA"))
# declare the solution variables
for i in range(altura):
print(intvar(torre(i)))
# end of declarations
#constraint forall (i in 0..altura-1) (0 <= torre_i);
#constraint forall (i in 0..altura-1) (torre_i <= 2);
for i in range(altura):  # equivalent to range(0, altura)
print(addassert(addle("0",torre(i))))
print(addassert(addle(torre(i),"2")))
#end constraint
# No two consecutive green pieces
#constraint forall (i in 0..altura-2) (torre_i!=2 \/ torre_i+1!=2);
for i in range(altura-1):
c1 = addnot(addeq(torre(i),"2"))
c2 = addnot(addeq(torre(i+1),"2"))
print(addassert(addor(c1,c2)))
# end constraint
# At every prefix of the tower, blue pieces >= green pieces
#constraint forall (i in 0..altura-1) (( sum (j in 0..i ) ( bool2int(torre_j=0) )) >=
#( sum (j in 0..i ) ( bool2int(torre_j=2) )));
for i in range(altura):
suma = []
sumv = []
for j in range(i+1):
suma.append(bool2int(addeq(torre(j),"0")))
sumv.append(bool2int(addeq(torre(j),"2")))
print(addassert(addge(addsum(suma),addsum(sumv))))
#fin constraint
#No mas piezas de las disponibles
#constraint forall (c in 0..2) (sum (i in 0..altura-1 ) ( bool2int(torre_i=c) ) <= disp[c]);
for c in range(3):
sumc = []
for i in range(altura):
sumc.append(bool2int(addeq(torre(i),str(c))))
print(addassert(addle(addsum(sumc),str(disp[c]))))
#fin constraint
#Piezas rojas >= Piezas azules + Piezas verdes
#constraint ( sum (i in 0..altura-1 where (torre[i]=Rojo)) ( 1 )) >=
# ( sum (i in 0..altura-1 ) ( bool2int(torre[i]=Azul \/ torre[i]=Verde) ));
#Lo expresamos como
#sum (i in 0..altura-1 ) ( bool2int(torre[i]!=Rojo) ) <= altura div 2
sumc = []
for i in range(altura):
sumc.append(bool2int(addnot(addeq(torre(i),"1"))))
print(addassert(addle(addsum(sumc),str(altura//2))))
#fin constraint
#Empieza con rojo
#constraint torre[0] = Rojo;
print(addassert(addeq(torre(0),"1")))
checksat()
#getmodel()
for i in reversed(range(altura)):
getvalue("("+torre(i)+")")
exit(0)
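The script above builds SMT-LIB purely by string concatenation. A self-contained sketch (re-declaring a few of the combinators) of the S-expression that the "no two consecutive greens" constraint expands to for `i = 0`; `torre_0`/`torre_1` and the color code `2` for green follow the script's own conventions:

```python
# Minimal re-declaration of the combinators above, to show the S-expressions they emit.
def torre(i):
    return "torre_" + str(i)

def addeq(a1, a2):
    return "(= " + a1 + " " + a2 + " )"

def addnot(a):
    return "(not " + a + " )"

def addor(a1, a2):
    return "(or " + a1 + " " + a2 + " )"

def addassert(a):
    return "(assert " + a + " )"

# "No two consecutive greens" for i = 0:
c1 = addnot(addeq(torre(0), "2"))
c2 = addnot(addeq(torre(1), "2"))
constraint = addassert(addor(c1, c2))
print(constraint)
# -> (assert (or (not (= torre_0 2 ) ) (not (= torre_1 2 ) ) ) )
```

The trailing spaces before each closing paren come straight from the combinators; SMT-LIB tokenizers accept the extra whitespace.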
# ---- openproblems/api/load.py (dburkhardt/SingleCellOpenProblems, MIT) ----
from . import utils
def load_dataset(task_name, function_name, test):
    """Load a dataset for a task."""
    fun = utils.get_function(task_name, "datasets", function_name)
    return fun(test=test)


def main(args):
    """Run the ``load`` subcommand."""
    adata = load_dataset(args.task, args.name, args.test)
    adata.write_h5ad(args.output)
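`utils.get_function` resolves a dataset callable by name. A hedged stdlib sketch of that kind of string-to-callable dispatch; this illustrates the pattern, not the actual openproblems implementation:

```python
import importlib


def get_function(module_name, function_name):
    """Resolve a callable given a dotted module name and an attribute name."""
    module = importlib.import_module(module_name)
    return getattr(module, function_name)


# Toy usage against the stdlib (a stand-in for the task/dataset lookup):
sqrt = get_function("math", "sqrt")
print(sqrt(9.0))  # -> 3.0
```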
# ---- prof_mod.py (VandyChris/PythonForDataAnalysis, MIT) ----
from numpy.random import randn
def add_and_sum(x, y):
    added = x + y
    summed = added.sum(axis=1)
    return summed


def call_function():
    x = randn(1000, 1000)
    y = randn(1000, 1000)
    return add_and_sum(x, y)

# ---- pytorch_lightning/plugins/environments/cluster_environment.py (Apache-2.0) ----
# Copyright The PyTorch Lightning team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from abc import ABC, abstractmethod
class ClusterEnvironment(ABC):
    """ Specification of a cluster environment. """

    @abstractmethod
    def creates_children(self) -> bool:
        """ Whether the environment creates the subprocesses or not. """

    @abstractmethod
    def master_address(self) -> str:
        """ The master address through which all processes connect and communicate. """

    @abstractmethod
    def master_port(self) -> int:
        """ An open and configured port in the master node through which all processes communicate. """

    @abstractmethod
    def world_size(self) -> int:
        """ The number of processes across all devices and nodes. """

    @abstractmethod
    def set_world_size(self, size: int) -> None:
        pass

    @abstractmethod
    def global_rank(self) -> int:
        """ The rank (index) of the currently running process across all nodes and devices. """

    @abstractmethod
    def set_global_rank(self, rank: int) -> None:
        pass

    @abstractmethod
    def local_rank(self) -> int:
        """ The rank (index) of the currently running process inside of the current node. """

    @abstractmethod
    def node_rank(self) -> int:
        """ The rank (index) of the node on which the current process runs. """

    def teardown(self) -> None:
        """ Clean up any state set after execution finishes. """
        pass
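A minimal concrete subclass makes the contract above tangible. The sketch below re-declares a trimmed two-method version of the interface and fills it from environment variables; it is an illustration of the idea, not one of the library's shipped environments:

```python
import os
from abc import ABC, abstractmethod


class MiniClusterEnvironment(ABC):
    """Trimmed stand-in for ClusterEnvironment (two methods only)."""

    @abstractmethod
    def master_address(self) -> str:
        ...

    @abstractmethod
    def master_port(self) -> int:
        ...


class EnvVarEnvironment(MiniClusterEnvironment):
    """Hypothetical environment reading rendezvous info from env vars."""

    def master_address(self) -> str:
        return os.environ.get("MASTER_ADDR", "127.0.0.1")

    def master_port(self) -> int:
        return int(os.environ.get("MASTER_PORT", "12910"))


env = EnvVarEnvironment()
print(env.master_address(), env.master_port())
```

Because the base class leaves the abstract methods unimplemented, forgetting one in a subclass raises `TypeError` at instantiation time rather than failing later mid-training.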
# ---- md_translate/utils.py (ilyachch/docs-trans-app, MIT) ----
from translators import apis  # type: ignore
from md_translate import const
from md_translate.exceptions import UnknownServiceError
def get_translator_by_service_name(service_name: str) -> apis.Tse:
    translator_class = const.TRANSLATOR_BY_SERVICE_NAME.get(service_name)
    if translator_class is None:
        raise UnknownServiceError(service_name)
    return translator_class
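The function above is a lookup-table dispatch that fails loudly on unknown keys instead of returning `None`. A self-contained sketch of the same pattern with a hypothetical service table (the real `TRANSLATOR_BY_SERVICE_NAME` maps to translator classes, not strings):

```python
class UnknownServiceError(Exception):
    """Raised when a service name is not in the lookup table."""


# Hypothetical table; stand-in string values instead of translator classes.
TRANSLATOR_BY_SERVICE_NAME = {
    "google": "GoogleTranslator",
    "bing": "BingTranslator",
}


def get_translator_by_service_name(service_name):
    translator_class = TRANSLATOR_BY_SERVICE_NAME.get(service_name)
    if translator_class is None:
        raise UnknownServiceError(service_name)
    return translator_class


print(get_translator_by_service_name("google"))  # -> GoogleTranslator
```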
# ---- spark_auto_mapper_fhir/value_sets/precision_estimate_type.py (imranq2/SparkAutoMapper.FHIR, Apache-2.0) ----
from __future__ import annotations
from spark_auto_mapper_fhir.fhir_types.uri import FhirUri
from spark_auto_mapper_fhir.value_sets.generic_type import GenericTypeCode
from spark_auto_mapper.type_definitions.defined_types import AutoMapperTextInputType
# This file is auto-generated by generate_classes so do not edit manually
# noinspection PyPep8Naming
class PrecisionEstimateTypeCode(GenericTypeCode):
    """
    PrecisionEstimateType
    From: http://terminology.hl7.org/CodeSystem/precision-estimate-type in valuesets.xml
        Method of reporting variability of estimates, such as confidence intervals,
        interquartile range or standard deviation.
    """

    def __init__(self, value: AutoMapperTextInputType):
        super().__init__(value=value)

    """
    http://terminology.hl7.org/CodeSystem/precision-estimate-type
    """
    codeset: FhirUri = "http://terminology.hl7.org/CodeSystem/precision-estimate-type"


class PrecisionEstimateTypeCodeValues:
    """
    confidence interval.
    From: http://terminology.hl7.org/CodeSystem/precision-estimate-type in valuesets.xml
    """
    ConfidenceInterval = PrecisionEstimateTypeCode("CI")
    """
    interquartile range.
    From: http://terminology.hl7.org/CodeSystem/precision-estimate-type in valuesets.xml
    """
    InterquartileRange = PrecisionEstimateTypeCode("IQR")
    """
    standard deviation.
    From: http://terminology.hl7.org/CodeSystem/precision-estimate-type in valuesets.xml
    """
    StandardDeviation = PrecisionEstimateTypeCode("SD")
    """
    standard error.
    From: http://terminology.hl7.org/CodeSystem/precision-estimate-type in valuesets.xml
    """
    StandardError = PrecisionEstimateTypeCode("SE")
# ---- Project4/views_templates.py (sockduct/Udacity-FSND, MIT) ----
#!/usr/bin/env python
# -*- coding: ascii -*-
###################################################################################################
#
# Python version(s) used/tested:
# * Python 2.7.12-32 on Ubuntu 16.04.2 LTS
# * Python 2.7.13-32 on Windows 7
# * Python 3.5.2-32 on Ubuntu 16.04.2 LTS
# * Python 3.6.1-32 on Windows 7
#
# Notes on Style:
# * PEP 8 followed with maximum line length of 99 characters (allowable
# per: https://www.python.org/dev/peps/pep-0008/#maximum-line-length)
# * Per above, comments and docstrings must be wrapped at 72 characters
# * Interpreting this as just the comment/docstring text and not the
# leading quotes or '# '
#
#
# Template version used: 0.1.2
#
# -------------------------------------------------------------------------------------------------
#
# Issues/Planned Improvements:
# * TBD
#
'''<module/program description> - triple quotes should end on this line if a
one-liner...'''
# Future Imports - Must be first, provides Python 2/3 interoperability
from __future__ import print_function # print(<strings...>, file=sys.stdout, end='\n')
from __future__ import division # 3/2 == 1.5, 3//2 == 1
from __future__ import absolute_import # prevent implicit relative imports in v2.x
from __future__ import unicode_literals # all string literals treated as unicode strings
# Imports
from flask import Flask, jsonify, request, url_for, abort, g, render_template, make_response
from flask_httpauth import HTTPBasicAuth
from functools import update_wrapper
import httplib2  # needed below for token verification against Google
import json
from models import Base, Item, User
from oauth2client.client import flow_from_clientsecrets
from oauth2client.client import FlowExchangeError
import os
from pprint import pprint
import ratelimit
import requests
# SQLAlchemy extension to map classes to database tables
from sqlalchemy.ext.declarative import declarative_base
# SQLAlchemy database handle to interact with underlying database
from sqlalchemy.orm import sessionmaker
# x
from sqlalchemy.orm import relationship
# SQLAlchemy module to connect to underlying database
from sqlalchemy import create_engine
import time
# Globals
# Note: Consider using function/class/method default parameters instead
# of global constants where it makes sense
# SQLAlchemy setup - create an instance of a connection to the underlying
# database
# Default database - SQLite:
DB_PATH = os.path.join(os.path.dirname(__file__), 'catalog.db')
engine = create_engine('sqlite:///' + DB_PATH)
# Use PostgreSQL, with user catalog:
# engine = create_engine('postgresql+psycopg2://catalog:NEKpPllvkcVEP4W9QzyIgDbKH15NM1I96BclRWG5@/catalog')
# Not sure what this does or if it's needed
Base.metadata.bind = engine
# Create ORM handle to underlying database
DBSession = sessionmaker(bind=engine)
# Used to interact with underlying database
session = DBSession()
#
# Flask setup
app = Flask(__name__)
auth = HTTPBasicAuth()
#
# OAuth setup
OAUTH_CLIENT_FILE = 'client_secret_google.json'
OAUTH_CLIENT_FILE_PATH = os.path.join(os.path.dirname(__file__), OAUTH_CLIENT_FILE)
CLIENT_ID = json.loads(open(OAUTH_CLIENT_FILE_PATH).read())['web']['client_id']
# Metadata
__author__ = 'James R. Small'
__contact__ = 'james<dot>r<dot>small<at>att<dot>com'
__date__ = 'July 28, 2017'
__version__ = '0.0.1'
# Integrate these:
@auth.verify_password
def verify_password(username, password):
    user = session.query(User).filter_by(username=username).first()
    # Don't want to notify agent if username not found or password verification
    # failed - this would constitute a security vulnerability
    if not user or not user.verify_password(password):
        return False
    else:
        g.user = user
        return True
#
# Another version:
@auth.verify_password
def verify_password(username_or_token, password):
    # Try to see if it's a token first
    user_id = User.verify_auth_token(username_or_token)
    if user_id:
        user = session.query(User).filter_by(id=user_id).one()
    else:
        user = session.query(User).filter_by(username=username_or_token).first()
        if not user or not user.verify_password(password):
            return False
    g.user = user
    return True
#
# Don't like this approach where token is in username, what about using separate
# header like GitHub does?
@auth.verify_password
def verify_password(username_or_token, password):
    # First check if it's a token
    # Debugging
    print('Received: {}:{}'.format(username_or_token, password))
    user_id = User.verify_auth_token(username_or_token)
    if user_id:
        # Debugging
        print('Validated by token')
        user = session.query(User).filter_by(id=user_id).one()
    else:
        # Debugging
        print('Trying to validate by username/password...')
        user = session.query(User).filter_by(username=username_or_token).first()
        if not user or not user.verify_password(password):
            # Debugging
            print('Failed to validate auth credentials')
            return False
        # Debugging
        print('Validated by username/password')
    # Successful authentication
    g.user = user
    return True
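`user.verify_password` and `hash_password` live on the User model (not shown in this file; this course code typically wraps passlib). A stdlib sketch of the underlying salted-hash idea using `hashlib.pbkdf2_hmac`; an illustration of the mechanism, not the models.py implementation:

```python
import hashlib
import hmac
import os


def hash_password(password, salt=None, iterations=100_000):
    """Return (salt, digest) for a password using PBKDF2-HMAC-SHA256."""
    salt = os.urandom(16) if salt is None else salt
    digest = hashlib.pbkdf2_hmac('sha256', password.encode(), salt, iterations)
    return salt, digest


def verify_password(password, salt, digest, iterations=100_000):
    """Recompute the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac('sha256', password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)


salt, digest = hash_password("s3cret")
print(verify_password("s3cret", salt, digest))  # -> True
print(verify_password("wrong", salt, digest))   # -> False
```

The constant-time comparison matters: a naive `==` on digests can leak timing information to an attacker probing the login endpoint.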
@app.route('/clientOAuth')
def start():
    return render_template('clientOAuth.html')


@app.route('/oauth/<provider>', methods=['POST'])
def login(provider):
    # print('Request: {}'.format(request))
    # print('Request introspection:')
    # pprint(request.__dict__)
    # print('-=-' * 25)
    # STEP 1 - Parse the auth code
    # Use this way if running seafood_test.py script
    ## auth_code = request.json.get('auth_code')
    # Use this way if coming from browser
    auth_code = request.data
    print('Step 1 - Complete, received auth code {}'.format(auth_code))
    # print('-=-' * 25)
    if provider == 'google':
        # STEP 2 - Exchange for a token
        try:
            # Upgrade the authorization code into a credentials object
            oauth_flow = flow_from_clientsecrets(OAUTH_CLIENT_FILE, scope='',
                                                 redirect_uri='postmessage')
            ## oauth_flow = flow_from_clientsecrets(OAUTH_CLIENT_FILE, scope='')
            ## oauth_flow.redirect_uri = 'postmessage'
            credentials = oauth_flow.step2_exchange(auth_code)
        except FlowExchangeError:
            response = make_response(json.dumps('Failed to upgrade the authorization code.'), 401)
            response.headers['Content-Type'] = 'application/json'
            return response
        # Check that the access token is valid.
        access_token = credentials.access_token
        url = ('https://www.googleapis.com/oauth2/v1/tokeninfo?access_token=%s' % access_token)
        h = httplib2.Http()
        result = json.loads(h.request(url, 'GET')[1])
        # If there was an error in the access token info, abort.
        if result.get('error') is not None:
            response = make_response(json.dumps(result.get('error')), 500)
            response.headers['Content-Type'] = 'application/json'
        # # Verify that the access token is used for the intended user.
        # gplus_id = credentials.id_token['sub']
        # if result['user_id'] != gplus_id:
        #     response = make_response(json.dumps("Token's user ID doesn't match given user ID."),
        #                              401)
        #     response.headers['Content-Type'] = 'application/json'
        #     return response
        # # Verify that the access token is valid for this app.
        # if result['issued_to'] != CLIENT_ID:
        #     response = make_response(json.dumps("Token's client ID does not match app's."), 401)
        #     response.headers['Content-Type'] = 'application/json'
        #     return response
        # stored_credentials = login_session.get('credentials')
        # stored_gplus_id = login_session.get('gplus_id')
        # if stored_credentials is not None and gplus_id == stored_gplus_id:
        #     response = make_response(json.dumps('Current user is already connected.'), 200)
        #     response.headers['Content-Type'] = 'application/json'
        #     return response
        print('Step 2 Complete! Access Token : {}'.format(credentials.access_token))
        # STEP 3 - Find User or make a new one
        # Get user info
        h = httplib2.Http()
        userinfo_url = "https://www.googleapis.com/oauth2/v1/userinfo"
        params = {'access_token': credentials.access_token, 'alt': 'json'}
        answer = requests.get(userinfo_url, params=params)
        data = answer.json()
        name = data['name']
        picture = data['picture']
        email = data['email']
        print('Received:\nName: {}\nPicture: {}\nEmail: {}\n'.format(name, picture, email))
        # See if the user exists; if not, make a new one
        user = session.query(User).filter_by(email=email).first()
        if not user:
            print('Creating database entry for user...')
            user = User(username=name, picture=picture, email=email)
            session.add(user)
            session.commit()
        else:
            print('User already in database')
        # STEP 4 - Make token
        token = user.generate_auth_token(600)
        # STEP 5 - Send back token to the client
        print('Generated auth token: {}'.format(token))
        return jsonify({'token': token.decode('ascii')})
        # return jsonify({'token': token.decode('ascii'), 'duration': 600})
    else:
        return 'Unrecognized Provider'
# /token route to get a token for a user with login credentials
@app.route('/token')
@auth.login_required
def get_auth_token():
    token = g.user.generate_auth_token()
    # token.decode(<str>) converts the ASCII "byte-string" to Unicode
    # Believe Python 2.x only but not sure
    return jsonify({'token': token.decode('ascii')})
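`generate_auth_token`/`verify_auth_token` also live on the User model (the course material typically uses itsdangerous). A stdlib sketch of a signed, expiring token; this is an assumption about the general shape of the mechanism, not the models.py code:

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"hypothetical-secret-key"  # stand-in for app.secret_key


def generate_auth_token(user_id, expires_in=600):
    """Serialize {id, exp} and append an HMAC signature over the payload."""
    payload = json.dumps({"id": user_id, "exp": int(time.time()) + expires_in}).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest().encode()
    return base64.urlsafe_b64encode(payload) + b"." + sig


def verify_auth_token(token):
    """Return the user id, or None if the token is tampered or expired."""
    encoded_payload, sig = token.split(b".", 1)
    payload = base64.urlsafe_b64decode(encoded_payload)
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest().encode()
    if not hmac.compare_digest(sig, expected):
        return None  # tampered token
    data = json.loads(payload.decode())
    if data["exp"] < time.time():
        return None  # expired token
    return data["id"]


token = generate_auth_token(42)
print(verify_auth_token(token))  # -> 42
```

Splitting on `b"."` is safe here because neither the base64 alphabet nor a hex digest contains a dot.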
# ADD a /users route here
@app.route('/users', methods=['POST'])
def registerUser():
    try:
        username = request.json.get('username', '')
        password = request.json.get('password', '')
    except AttributeError as err:
        username = password = None
    if not username or not password:
        print('Missing required parameters (username, password).')
        abort(400)
    user = session.query(User).filter_by(username=username).first()
    if user:
        print('User already exists.')
        return (jsonify({'message': 'User already exists.'}), 200,
                {'Location': url_for('get_user', id=user.id, _external=True)})
    user = User(username=username)
    user.hash_password(password)
    session.add(user)
    session.commit()
    return (jsonify({'username': user.username}), 201,
            {'Location': url_for('get_user', id=user.id, _external=True)})


@app.route('/api/users/<int:id>')
def get_user(id):
    user = session.query(User).filter_by(id=id).one()
    if not user:
        abort(400)
    else:
        return jsonify({'username': user.username})
@app.route('/resource')
@auth.login_required
def get_resource():
    return jsonify({'data': 'Hello, %s!' % g.user.username})


# NOTE: the Bagel model must also be imported from models for this route to work
@app.route('/bagels', methods=['GET', 'POST'])
# protect this route with a required login
@auth.login_required
def showAllBagels():
    if request.method == 'GET':
        print('Hello {}!'.format(g.user.username))
        bagels = session.query(Bagel).all()
        return jsonify(bagels=[bagel.serialize for bagel in bagels])
    elif request.method == 'POST':
        name = request.json.get('name')
        description = request.json.get('description')
        picture = request.json.get('picture')
        price = request.json.get('price')
        newBagel = Bagel(name=name, description=description, picture=picture, price=price)
        session.add(newBagel)
        session.commit()
        return jsonify(newBagel.serialize)


@app.route('/rate-limited')
@ratelimit.ratelimit(limit=300, per=30 * 1)  # Limit to 300 requests per 30 seconds
def index():
    return jsonify({'response': 'This is a rate limited response'})
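The `ratelimit` decorator comes from a local ratelimit.py module (Redis-backed in the course material). A self-contained sketch of the same fixed-window idea with an in-process counter; it reuses the file's `update_wrapper` import and is an illustration, not that module:

```python
import time
from functools import update_wrapper


def ratelimit(limit, per):
    """Allow at most `limit` calls per `per`-second window (in-memory, per-process)."""
    state = {"window_start": 0.0, "count": 0}

    def decorator(fn):
        def wrapper(*args, **kwargs):
            now = time.time()
            if now - state["window_start"] >= per:
                # New window: reset the counter.
                state["window_start"] = now
                state["count"] = 0
            state["count"] += 1
            if state["count"] > limit:
                return "429 Too Many Requests"  # a real Flask app would abort(429)
            return fn(*args, **kwargs)
        return update_wrapper(wrapper, fn)
    return decorator


@ratelimit(limit=2, per=30)
def index():
    return "ok"


print(index(), index(), index())  # third call trips the limit
```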
# NOTE: the Product model must also be imported from models for these routes to work
@app.route('/products', methods=['GET', 'POST'])
@auth.login_required
def showAllProducts():
    print('Request: {}'.format(request))
    if request.method == 'GET':
        products = session.query(Product).all()
        return jsonify(products=[p.serialize for p in products])
    if request.method == 'POST':
        name = request.json.get('name')
        category = request.json.get('category')
        price = request.json.get('price')
        newItem = Product(name=name, category=category, price=price)
        session.add(newItem)
        session.commit()
        return jsonify(newItem.serialize)


@app.route('/products/<category>')
@auth.login_required
def showCategoriedProducts(category):
    if category == 'fruit':
        fruit_items = session.query(Product).filter_by(category='fruit').all()
        return jsonify(fruit_products=[f.serialize for f in fruit_items])
    if category == 'legume':
        legume_items = session.query(Product).filter_by(category='legume').all()
        return jsonify(legume_products=[l.serialize for l in legume_items])
    if category == 'vegetable':
        vegetable_items = session.query(Product).filter_by(category='vegetable').all()
        return jsonify(produce_products=[p.serialize for p in vegetable_items])


@app.route('/catalog')
@ratelimit.ratelimit(limit=60, per=60 * 1)  # Limit to 60 requests per 60 seconds
def getCatalog():
    items = session.query(Item).all()
    # Populate an empty database
    if items == []:
        item1 = Item(name="Pineapple", price="$2.50",
                     picture=("https://upload.wikimedia.org/wikipedia/commons/c/"
                              "cb/Pineapple_and_cross_section.jpg"),
                     description="Organically Grown in Hawai'i")
        session.add(item1)
        item2 = Item(name="Carrots", price="$1.99",
                     picture=("http://media.mercola.com/assets/images/food-facts/"
                              "carrot-fb.jpg"), description="High in Vitamin A")
        session.add(item2)
        item3 = Item(name="Aluminum Foil", price="$3.50", picture=(
            "http://images.wisegeek.com/aluminum-foil.jpg"), description="300 feet long")
        session.add(item3)
        item4 = Item(name="Eggs", price="$2.00", picture=(
            "http://whatsyourdeal.com/grocery-coupons/wp-content/uploads/"
            "2015/01/eggs.png"), description="Farm Fresh Organic Eggs")
        session.add(item4)
        item5 = Item(name="Bananas", price="$2.15", picture=(
            "http://dreamatico.com/data_images/banana/banana-3.jpg"),
            description="Fresh, delicious, and full of potassium")
        session.add(item5)
        session.commit()
        items = session.query(Item).all()
    return jsonify(catalog=[i.serialize for i in items])


if __name__ == '__main__':
    # app.config['SECRET_KEY'] = ''.join(random.choice(string.ascii_uppercase + string.digits)
    #                                    for x in xrange(32))
    app.secret_key = os.urandom(40)
    app.debug = True
    app.run(host='0.0.0.0', port=5000)
    # Watch out for multi-threaded interaction with your database!!!
    ## app.run(host='0.0.0.0', port=5000, threaded=True)
    ## app.run(host='0.0.0.0', port=5000, processes=3)
###################################################################################################
# Post coding
#
# Only test for Python 3 compatibility: pylint --py3k <script>.py
# pylint <script>.py
# Score should be >= 8.0
# Alternatives:
# pep8, flake8
#
# python warning options:
# * -Qwarnall - Believe check for old division usage
# * -t - issue warnings about inconsistent tab usage
# * -3 - warn about Python 3.x incompatibilities
#
# python3 warning options:
# * -b - issue warnings about mixing strings and bytes
#
# Future:
# * Testing - doctest/unittest/pytest/other
# * Logging
#
# ---- flavio/io/instanceio.py (Felicia56/flavio, MIT) ----
"""Functions to load and dump class instances from and to YAML dictionaries
or streams."""
import flavio
import voluptuous as vol
import yaml
from collections import OrderedDict
class YAMLLoadable(object):
    """Base class for objects that can be loaded and dumped from and to
    a dict or YAML stream."""

    # these class attributes should be overwritten by child classes
    _input_schema_dict = {}
    _output_schema_dict = {}

    @classmethod
    def input_schema(cls):
        return vol.Schema(cls._input_schema_dict, extra=vol.ALLOW_EXTRA)

    @classmethod
    def output_schema(cls):
        return vol.Schema(cls._output_schema_dict, extra=vol.REMOVE_EXTRA)

    @classmethod
    def load_dict(cls, d, **kwargs):
        """Instantiate an object from a YAML dictionary."""
        schema = cls.input_schema()
        return cls(**schema(d), **kwargs)

    @classmethod
    def load(cls, f, **kwargs):
        """Instantiate an object from a YAML string or stream."""
        d = flavio.io.yaml.load_include(f)
        return cls.load_dict(d, **kwargs)

    def get_yaml_dict(self):
        """Dump the object to a YAML dictionary."""
        d = self.__dict__.copy()
        schema = self.output_schema()
        d = schema(d)
        # remove NoneTypes and empty lists
        d = {k: v for k, v in d.items() if v is not None and v != []}
        return d

    def dump(self, stream=None, **kwargs):
        """Dump the object to a YAML string or stream."""
        d = self.get_yaml_dict()
        return yaml.dump(d, stream=stream, **kwargs)
def coerce_observable_tuple(obs):
    """Force an arbitrary observable representation into the tuple representation."""
    return flavio.Observable.argument_format(obs, format='tuple')


def coerce_observable_dict(obs):
    """Force an arbitrary observable representation into the dict representation."""
    return flavio.Observable.argument_format(obs, format='dict')


def coerce_par_obj(par_obj_dict):
    """Coerce a dictionary of parameter constraints into a `ParameterConstraints`
    instance taking `flavio.default_parameters` as starting point"""
    par_obj = flavio.default_parameters.copy()
    return flavio.ParameterConstraints.from_yaml_dict(par_obj_dict,
                                                      instance=par_obj)


def ensurelist(v):
    """Coerce NoneType to an empty list; require an actual list otherwise."""
    if isinstance(v, list):
        return v
    elif v is None:
        return []
    else:
        raise ValueError("Unexpected form of list: {}".format(v))


def get_par_diff(par_obj):
    """Return a dictionary representation of a ParameterConstraints instance
    that only contains constraints that are not identical to ones in
    `default_parameters`."""
    dict_default = flavio.default_parameters.get_yaml_dict()
    dict_par = par_obj.get_yaml_dict()
    return [c for c in dict_par if c not in dict_default]


def list_deduplicate(lst):
    """Remove duplicate elements from a list but keep the order
    (keep the first occurring element of duplicates). List elements must be
    hashable."""
    return list(OrderedDict.fromkeys(lst))
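Two of the helpers above are easy to sanity-check in isolation. The sketch below re-declares `list_deduplicate` and the None/empty-list post-filter step from `get_yaml_dict`:

```python
from collections import OrderedDict


def list_deduplicate(lst):
    """Remove duplicates, keeping first-occurrence order (elements must be hashable)."""
    return list(OrderedDict.fromkeys(lst))


def strip_empty(d):
    """Mirror get_yaml_dict's post-filter: drop None values and empty lists."""
    return {k: v for k, v in d.items() if v is not None and v != []}


print(list_deduplicate([3, 1, 3, 2, 1]))                    # -> [3, 1, 2]
print(strip_empty({"a": 1, "b": None, "c": [], "d": [0]}))  # -> {'a': 1, 'd': [0]}
```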
# ---- pysnmp/Juniper-System-Clock-MIB.py (agustinhenze/mibs.snmplabs.com, Apache-2.0) ----
#
# PySNMP MIB module Juniper-System-Clock-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/Juniper-System-Clock-MIB
# Produced by pysmi-0.3.4 at Mon Apr 29 19:53:45 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
OctetString, ObjectIdentifier, Integer = mibBuilder.importSymbols("ASN1", "OctetString", "ObjectIdentifier", "Integer")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ValueRangeConstraint, ConstraintsUnion, ValueSizeConstraint, ConstraintsIntersection, SingleValueConstraint = mibBuilder.importSymbols("ASN1-REFINEMENT", "ValueRangeConstraint", "ConstraintsUnion", "ValueSizeConstraint", "ConstraintsIntersection", "SingleValueConstraint")
juniMibs, = mibBuilder.importSymbols("Juniper-MIBs", "juniMibs")
JuniEnable, = mibBuilder.importSymbols("Juniper-TC", "JuniEnable")
NotificationGroup, ObjectGroup, ModuleCompliance = mibBuilder.importSymbols("SNMPv2-CONF", "NotificationGroup", "ObjectGroup", "ModuleCompliance")
Counter64, TimeTicks, ObjectIdentity, Unsigned32, Counter32, Integer32, IpAddress, NotificationType, Gauge32, MibScalar, MibTable, MibTableRow, MibTableColumn, Bits, MibIdentifier, iso, ModuleIdentity = mibBuilder.importSymbols("SNMPv2-SMI", "Counter64", "TimeTicks", "ObjectIdentity", "Unsigned32", "Counter32", "Integer32", "IpAddress", "NotificationType", "Gauge32", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "Bits", "MibIdentifier", "iso", "ModuleIdentity")
TextualConvention, TruthValue, RowStatus, DateAndTime, DisplayString = mibBuilder.importSymbols("SNMPv2-TC", "TextualConvention", "TruthValue", "RowStatus", "DateAndTime", "DisplayString")
juniSysClockMIB = ModuleIdentity((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56))
juniSysClockMIB.setRevisions(('2007-03-22 14:00', '2005-12-14 14:01', '2003-09-15 14:01', '2003-09-12 13:37', '2002-04-04 14:56',))
if mibBuilder.loadTexts: juniSysClockMIB.setLastUpdated('200512141401Z')
if mibBuilder.loadTexts: juniSysClockMIB.setOrganization('Juniper Networks, Inc.')
class JuniSysClockMonth(TextualConvention, Integer32):
status = 'current'
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12))
namedValues = NamedValues(("january", 1), ("february", 2), ("march", 3), ("april", 4), ("may", 5), ("june", 6), ("july", 7), ("august", 8), ("september", 9), ("october", 10), ("november", 11), ("december", 12))
class JuniSysClockWeekOfTheMonth(TextualConvention, Integer32):
status = 'current'
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 5, 6))
namedValues = NamedValues(("weekFirst", 0), ("weekOne", 1), ("weekTwo", 2), ("weekThree", 3), ("weekFour", 4), ("weekFive", 5), ("weekLast", 6))
class JuniSysClockDayOfTheWeek(TextualConvention, Integer32):
status = 'current'
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 5, 6))
namedValues = NamedValues(("sunday", 0), ("monday", 1), ("tuesday", 2), ("wednesday", 3), ("thursday", 4), ("friday", 5), ("saturday", 6))
class JuniSysClockHour(TextualConvention, Integer32):
status = 'current'
subtypeSpec = Integer32.subtypeSpec + ValueRangeConstraint(0, 23)
class JuniSysClockMinute(TextualConvention, Integer32):
status = 'current'
subtypeSpec = Integer32.subtypeSpec + ValueRangeConstraint(0, 59)
class JuniNtpTimeStamp(TextualConvention, OctetString):
reference = "D.L. Mills, 'Network Time Protocol (Version 3)', RFC-1305, March 1992. J. Postel & J. Reynolds, 'NVT ASCII character set', RFC-854, May 1983."
status = 'current'
subtypeSpec = OctetString.subtypeSpec + ValueSizeConstraint(0, 21)
class JuniNtpClockSignedTime(TextualConvention, OctetString):
reference = "D.L. Mills, 'Network Time Protocol (Version 3)', RFC-1305, March 1992. J. Postel & J. Reynolds, 'NVT ASCII character set', RFC-854, May 1983."
status = 'current'
subtypeSpec = OctetString.subtypeSpec + ValueSizeConstraint(0, 11)
class JuniNtpClockUnsignedTime(TextualConvention, OctetString):
    reference = "D.L. Mills, 'Network Time Protocol (Version 3)', RFC-1305, March 1992. J. Postel & J. Reynolds, 'NVT ASCII character set', RFC-854, May 1983."
status = 'current'
subtypeSpec = OctetString.subtypeSpec + ValueSizeConstraint(0, 11)
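The textual conventions above refine base ASN.1 types with range and size constraints. A minimal standalone sketch (hypothetical helpers, not pysnmp's actual constraint machinery) of what those refinements mean at runtime:

```python
# Hypothetical validators mimicking the semantics of ValueRangeConstraint
# and ValueSizeConstraint used by the textual conventions above.

def check_range(value, low, high):
    """Mimic ValueRangeConstraint: accept only low <= value <= high."""
    if not low <= value <= high:
        raise ValueError(f"{value} outside [{low}, {high}]")
    return value

def check_size(octets, low, high):
    """Mimic ValueSizeConstraint: accept only octet strings of length low..high."""
    if not low <= len(octets) <= high:
        raise ValueError(f"length {len(octets)} outside [{low}, {high}]")
    return octets

# JuniSysClockHour permits 0..23; JuniNtpTimeStamp permits 0..21 octets.
check_range(23, 0, 23)
check_size(b"19:53:45.000 UTC", 0, 21)
```

Agents enforce these constraints on SET requests, so an out-of-range value (e.g. hour 24) is rejected before it ever reaches the clock subsystem.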
juniSysClockObjects = MibIdentifier((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 1))
juniNtpObjects = MibIdentifier((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2))
juniSysClockTime = MibIdentifier((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 1, 1))
juniSysClockDst = MibIdentifier((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 1, 2))
juniSysClockDateAndTime = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 1, 1, 1), DateAndTime()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniSysClockDateAndTime.setStatus('current')
juniSysClockTimeZoneName = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 1, 1, 2), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 63))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniSysClockTimeZoneName.setStatus('current')
juniSysClockDstName = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 1, 2, 1), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 63))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniSysClockDstName.setStatus('current')
juniSysClockDstOffset = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 1, 2, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 1440)).clone(60)).setUnits('minutes').setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniSysClockDstOffset.setStatus('current')
juniSysClockDstStatus = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 1, 2, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3))).clone(namedValues=NamedValues(("off", 0), ("recurrent", 1), ("absolute", 2), ("recognizedUS", 3)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniSysClockDstStatus.setStatus('current')
juniSysClockDstAbsoluteStartTime = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 1, 2, 4), DateAndTime()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniSysClockDstAbsoluteStartTime.setStatus('current')
juniSysClockDstAbsoluteStopTime = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 1, 2, 5), DateAndTime()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniSysClockDstAbsoluteStopTime.setStatus('current')
juniSysClockDstRecurStartMonth = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 1, 2, 6), JuniSysClockMonth().clone('march')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniSysClockDstRecurStartMonth.setStatus('current')
juniSysClockDstRecurStartWeek = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 1, 2, 7), JuniSysClockWeekOfTheMonth().clone('weekTwo')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniSysClockDstRecurStartWeek.setStatus('current')
juniSysClockDstRecurStartDay = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 1, 2, 8), JuniSysClockDayOfTheWeek().clone('sunday')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniSysClockDstRecurStartDay.setStatus('current')
juniSysClockDstRecurStartHour = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 1, 2, 9), JuniSysClockHour().clone(1)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniSysClockDstRecurStartHour.setStatus('current')
juniSysClockDstRecurStartMinute = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 1, 2, 10), JuniSysClockMinute()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniSysClockDstRecurStartMinute.setStatus('current')
juniSysClockDstRecurStopMonth = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 1, 2, 11), JuniSysClockMonth().clone('november')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniSysClockDstRecurStopMonth.setStatus('current')
juniSysClockDstRecurStopWeek = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 1, 2, 12), JuniSysClockWeekOfTheMonth().clone('weekFirst')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniSysClockDstRecurStopWeek.setStatus('current')
juniSysClockDstRecurStopDay = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 1, 2, 13), JuniSysClockDayOfTheWeek().clone('sunday')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniSysClockDstRecurStopDay.setStatus('current')
juniSysClockDstRecurStopHour = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 1, 2, 14), JuniSysClockHour().clone(2)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniSysClockDstRecurStopHour.setStatus('current')
juniSysClockDstRecurStopMinute = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 1, 2, 15), JuniSysClockMinute()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniSysClockDstRecurStopMinute.setStatus('current')
juniNtpSysClock = MibIdentifier((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 1))
juniNtpClient = MibIdentifier((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 2))
juniNtpServer = MibIdentifier((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 3))
juniNtpPeers = MibIdentifier((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4))
juniNtpAccessGroup = MibIdentifier((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 5))
juniNtpSysClockState = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 1, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 5))).clone(namedValues=NamedValues(("neverFrequencyCalibrated", 0), ("frequencyCalibrated", 1), ("setToServerTime", 2), ("frequencyCalibrationIsGoingOn", 3), ("synchronized", 4), ("spikeDetected", 5)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpSysClockState.setStatus('current')
juniNtpSysClockOffsetError = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 1, 2), JuniNtpClockSignedTime()).setUnits('seconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpSysClockOffsetError.setStatus('deprecated')
juniNtpSysClockFrequencyError = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 1, 3), Integer32()).setUnits('ppm').setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpSysClockFrequencyError.setStatus('deprecated')
juniNtpSysClockRootDelay = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 1, 4), JuniNtpClockSignedTime()).setUnits('seconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpSysClockRootDelay.setStatus('current')
juniNtpSysClockRootDispersion = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 1, 5), JuniNtpClockUnsignedTime()).setUnits('seconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpSysClockRootDispersion.setStatus('current')
juniNtpSysClockStratumNumber = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 1, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-1, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpSysClockStratumNumber.setStatus('current')
juniNtpSysClockLastUpdateTime = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 1, 7), JuniNtpTimeStamp()).setUnits('seconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpSysClockLastUpdateTime.setStatus('current')
juniNtpSysClockLastUpdateServer = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 1, 8), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpSysClockLastUpdateServer.setStatus('current')
juniNtpSysClockOffsetErrorNew = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 1, 9), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 25))).setUnits('seconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpSysClockOffsetErrorNew.setStatus('current')
juniNtpSysClockFrequencyErrorNew = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 1, 10), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 25))).setUnits('ppm').setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpSysClockFrequencyErrorNew.setStatus('current')
juniNtpClientAdminStatus = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 2, 1), JuniEnable().clone('disable')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniNtpClientAdminStatus.setStatus('current')
juniNtpClientSystemRouterIndex = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 2, 2), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpClientSystemRouterIndex.setStatus('current')
juniNtpClientPacketSourceIfIndex = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 2, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 2147483647))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniNtpClientPacketSourceIfIndex.setStatus('current')
juniNtpClientBroadcastDelay = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 2, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 999999)).clone(3000)).setUnits('microseconds').setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniNtpClientBroadcastDelay.setStatus('current')
juniNtpClientIfTable = MibTable((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 2, 5), )
if mibBuilder.loadTexts: juniNtpClientIfTable.setStatus('current')
juniNtpClientIfEntry = MibTableRow((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 2, 5, 1), ).setIndexNames((0, "Juniper-System-Clock-MIB", "juniNtpClientIfRouterIndex"), (0, "Juniper-System-Clock-MIB", "juniNtpClientIfIfIndex"))
if mibBuilder.loadTexts: juniNtpClientIfEntry.setStatus('current')
juniNtpClientIfRouterIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 2, 5, 1, 1), Unsigned32())
if mibBuilder.loadTexts: juniNtpClientIfRouterIndex.setStatus('current')
juniNtpClientIfIfIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 2, 5, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 2147483647)))
if mibBuilder.loadTexts: juniNtpClientIfIfIndex.setStatus('current')
juniNtpClientIfDisable = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 2, 5, 1, 3), TruthValue()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: juniNtpClientIfDisable.setStatus('current')
juniNtpClientIfIsBroadcastClient = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 2, 5, 1, 4), TruthValue()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: juniNtpClientIfIsBroadcastClient.setStatus('current')
juniNtpClientIfIsBroadcastServer = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 2, 5, 1, 5), TruthValue()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: juniNtpClientIfIsBroadcastServer.setStatus('current')
juniNtpClientIfIsBroadcastServerVersion = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 2, 5, 1, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 4)).clone(3)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniNtpClientIfIsBroadcastServerVersion.setStatus('current')
juniNtpClientIfIsBroadcastServerDelay = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 2, 5, 1, 7), Integer32().subtype(subtypeSpec=ValueRangeConstraint(4, 17)).clone(6)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniNtpClientIfIsBroadcastServerDelay.setStatus('current')
juniNtpServerStratumNumber = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 3, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 255)).clone(8)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniNtpServerStratumNumber.setStatus('current')
juniNtpServerAdminStatus = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 3, 2), JuniEnable()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniNtpServerAdminStatus.setStatus('current')
juniNtpPeerCfgTable = MibTable((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 1), )
if mibBuilder.loadTexts: juniNtpPeerCfgTable.setStatus('current')
juniNtpPeerCfgEntry = MibTableRow((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 1, 1), ).setIndexNames((0, "Juniper-System-Clock-MIB", "juniNtpClientIfRouterIndex"), (0, "Juniper-System-Clock-MIB", "juniNtpPeerCfgIpAddress"))
if mibBuilder.loadTexts: juniNtpPeerCfgEntry.setStatus('current')
juniNtpPeerCfgIpAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 1, 1, 1), IpAddress())
if mibBuilder.loadTexts: juniNtpPeerCfgIpAddress.setStatus('current')
juniNtpPeerCfgNtpVersion = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 1, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 4))).setMaxAccess("readcreate")
if mibBuilder.loadTexts: juniNtpPeerCfgNtpVersion.setStatus('current')
juniNtpPeerCfgPacketSourceIfIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 1, 1, 3), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 2147483647))).setMaxAccess("readcreate")
if mibBuilder.loadTexts: juniNtpPeerCfgPacketSourceIfIndex.setStatus('current')
juniNtpPeerCfgIsPreferred = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 1, 1, 4), TruthValue()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: juniNtpPeerCfgIsPreferred.setStatus('current')
juniNtpPeerCfgRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 1, 1, 5), RowStatus()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: juniNtpPeerCfgRowStatus.setStatus('current')
juniNtpPeerTable = MibTable((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 2), )
if mibBuilder.loadTexts: juniNtpPeerTable.setStatus('current')
juniNtpPeerEntry = MibTableRow((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 2, 1), ).setIndexNames((0, "Juniper-System-Clock-MIB", "juniNtpClientIfRouterIndex"), (0, "Juniper-System-Clock-MIB", "juniNtpPeerCfgIpAddress"))
if mibBuilder.loadTexts: juniNtpPeerEntry.setStatus('current')
juniNtpPeerState = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 2, 1, 1), OctetString().subtype(subtypeSpec=ValueSizeConstraint(1, 1)).setFixedLength(1)).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpPeerState.setStatus('current')
juniNtpPeerStratumNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 2, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpPeerStratumNumber.setStatus('current')
juniNtpPeerAssociationMode = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 2, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2))).clone(namedValues=NamedValues(("broadcastServer", 0), ("multicastServer", 1), ("unicastServer", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpPeerAssociationMode.setStatus('current')
juniNtpPeerBroadcastInterval = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 2, 1, 4), Integer32()).setUnits('seconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpPeerBroadcastInterval.setStatus('current')
juniNtpPeerPolledInterval = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 2, 1, 5), Integer32()).setUnits('seconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpPeerPolledInterval.setStatus('current')
juniNtpPeerPollingInterval = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 2, 1, 6), Integer32()).setUnits('seconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpPeerPollingInterval.setStatus('current')
juniNtpPeerDelay = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 2, 1, 7), JuniNtpClockSignedTime()).setUnits('seconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpPeerDelay.setStatus('current')
juniNtpPeerDispersion = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 2, 1, 8), JuniNtpClockUnsignedTime()).setUnits('seconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpPeerDispersion.setStatus('current')
juniNtpPeerOffsetError = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 2, 1, 9), JuniNtpClockSignedTime()).setUnits('seconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpPeerOffsetError.setStatus('current')
juniNtpPeerReachability = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 2, 1, 10), OctetString().subtype(subtypeSpec=ValueSizeConstraint(1, 1)).setFixedLength(1)).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpPeerReachability.setStatus('current')
juniNtpPeerRootDelay = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 2, 1, 11), JuniNtpClockSignedTime()).setUnits('seconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpPeerRootDelay.setStatus('current')
juniNtpPeerRootDispersion = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 2, 1, 12), JuniNtpClockUnsignedTime()).setUnits('seconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpPeerRootDispersion.setStatus('current')
juniNtpPeerRootSyncDistance = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 2, 1, 13), JuniNtpClockSignedTime()).setUnits('seconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpPeerRootSyncDistance.setStatus('current')
juniNtpPeerRootTime = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 2, 1, 14), JuniNtpTimeStamp()).setUnits('seconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpPeerRootTime.setStatus('current')
juniNtpPeerRootTimeUpdateServer = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 2, 1, 15), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpPeerRootTimeUpdateServer.setStatus('current')
juniNtpPeerReceiveTime = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 2, 1, 16), JuniNtpTimeStamp()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpPeerReceiveTime.setStatus('current')
juniNtpPeerTransmitTime = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 2, 1, 17), JuniNtpTimeStamp()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpPeerTransmitTime.setStatus('current')
juniNtpPeerRequestTime = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 2, 1, 18), JuniNtpTimeStamp()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpPeerRequestTime.setStatus('current')
juniNtpPeerPrecision = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 2, 1, 19), JuniNtpClockSignedTime()).setUnits('seconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpPeerPrecision.setStatus('current')
juniNtpPeerLastUpdateTime = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 2, 1, 20), Unsigned32()).setUnits('seconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpPeerLastUpdateTime.setStatus('current')
juniNtpPeerFilterRegisterTable = MibTable((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 3), )
if mibBuilder.loadTexts: juniNtpPeerFilterRegisterTable.setStatus('current')
juniNtpPeerFilterRegisterEntry = MibTableRow((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 3, 1), ).setIndexNames((0, "Juniper-System-Clock-MIB", "juniNtpPeerCfgIpAddress"), (0, "Juniper-System-Clock-MIB", "juniNtpPeerFilterIndex"))
if mibBuilder.loadTexts: juniNtpPeerFilterRegisterEntry.setStatus('current')
juniNtpPeerFilterIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 3, 1, 1), Unsigned32())
if mibBuilder.loadTexts: juniNtpPeerFilterIndex.setStatus('current')
juniNtpPeerFilterOffset = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 3, 1, 2), JuniNtpClockSignedTime()).setUnits('seconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpPeerFilterOffset.setStatus('current')
juniNtpPeerFilterDelay = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 3, 1, 3), JuniNtpClockSignedTime()).setUnits('seconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpPeerFilterDelay.setStatus('current')
juniNtpPeerFilterDispersion = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 4, 3, 1, 4), JuniNtpClockUnsignedTime()).setUnits('seconds').setMaxAccess("readonly")
if mibBuilder.loadTexts: juniNtpPeerFilterDispersion.setStatus('current')
juniNtpRouterAccessGroupPeer = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 5, 1), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniNtpRouterAccessGroupPeer.setStatus('current')
juniNtpRouterAccessGroupServe = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 5, 2), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniNtpRouterAccessGroupServe.setStatus('current')
juniNtpRouterAccessGroupServeOnly = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 5, 3), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniNtpRouterAccessGroupServeOnly.setStatus('current')
juniNtpRouterAccessGroupQueryOnly = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 5, 4), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniNtpRouterAccessGroupQueryOnly.setStatus('current')
juniNtpTraps = MibIdentifier((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 0))
juniNtpFrequencyCalibrationStart = NotificationType((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 0, 1)).setObjects(("Juniper-System-Clock-MIB", "juniNtpSysClockFrequencyError"))
if mibBuilder.loadTexts: juniNtpFrequencyCalibrationStart.setStatus('current')
juniNtpFrequencyCalibrationEnd = NotificationType((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 0, 2)).setObjects(("Juniper-System-Clock-MIB", "juniNtpSysClockFrequencyError"))
if mibBuilder.loadTexts: juniNtpFrequencyCalibrationEnd.setStatus('current')
juniNtpTimeSynUp = NotificationType((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 0, 3))
if mibBuilder.loadTexts: juniNtpTimeSynUp.setStatus('current')
juniNtpTimeSynDown = NotificationType((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 0, 4))
if mibBuilder.loadTexts: juniNtpTimeSynDown.setStatus('current')
juniNtpTimeServerSynUp = NotificationType((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 0, 5)).setObjects(("Juniper-System-Clock-MIB", "juniNtpPeerCfgIsPreferred"))
if mibBuilder.loadTexts: juniNtpTimeServerSynUp.setStatus('current')
juniNtpTimeServerSynDown = NotificationType((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 0, 6)).setObjects(("Juniper-System-Clock-MIB", "juniNtpPeerCfgIsPreferred"))
if mibBuilder.loadTexts: juniNtpTimeServerSynDown.setStatus('current')
juniNtpFirstSystemClockSet = NotificationType((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 0, 7)).setObjects(("Juniper-System-Clock-MIB", "juniNtpSysClockOffsetError"), ("Juniper-System-Clock-MIB", "juniNtpSysClockState"))
if mibBuilder.loadTexts: juniNtpFirstSystemClockSet.setStatus('current')
juniNtpClockOffSetLimitCrossed = NotificationType((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 2, 0, 8)).setObjects(("Juniper-System-Clock-MIB", "juniNtpSysClockOffsetError"), ("Juniper-System-Clock-MIB", "juniNtpSysClockState"))
if mibBuilder.loadTexts: juniNtpClockOffSetLimitCrossed.setStatus('current')
juniSysClockConformance = MibIdentifier((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 3))
juniSysClockCompliances = MibIdentifier((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 3, 1))
juniSysClockGroups = MibIdentifier((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 3, 2))
juniSysClockCompliance = ModuleCompliance((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 3, 1, 1)).setObjects(("Juniper-System-Clock-MIB", "juniSysClockTimeGroup"), ("Juniper-System-Clock-MIB", "juniSysClockDstGroup"), ("Juniper-System-Clock-MIB", "juniNtpSysClockGroup"), ("Juniper-System-Clock-MIB", "juniNtpClientGroup"), ("Juniper-System-Clock-MIB", "juniNtpServerGroup"), ("Juniper-System-Clock-MIB", "juniNtpPeersGroup"), ("Juniper-System-Clock-MIB", "juniNtpAccessGroupGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniSysClockCompliance = juniSysClockCompliance.setStatus('obsolete')
juniSysClockCompliance2 = ModuleCompliance((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 3, 1, 2)).setObjects(("Juniper-System-Clock-MIB", "juniSysClockTimeGroup"), ("Juniper-System-Clock-MIB", "juniSysClockDstGroup"), ("Juniper-System-Clock-MIB", "juniNtpSysClockGroup"), ("Juniper-System-Clock-MIB", "juniNtpClientGroup"), ("Juniper-System-Clock-MIB", "juniNtpServerGroup"), ("Juniper-System-Clock-MIB", "juniNtpPeersGroup"), ("Juniper-System-Clock-MIB", "juniNtpAccessGroupGroup"), ("Juniper-System-Clock-MIB", "juniNtpNotificationGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniSysClockCompliance2 = juniSysClockCompliance2.setStatus('obsolete')
juniSysClockCompliance3 = ModuleCompliance((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 3, 1, 3)).setObjects(("Juniper-System-Clock-MIB", "juniSysClockTimeGroup"), ("Juniper-System-Clock-MIB", "juniSysClockDstGroup"), ("Juniper-System-Clock-MIB", "juniNtpSysClockGroup2"), ("Juniper-System-Clock-MIB", "juniNtpClientGroup"), ("Juniper-System-Clock-MIB", "juniNtpServerGroup"), ("Juniper-System-Clock-MIB", "juniNtpPeersGroup"), ("Juniper-System-Clock-MIB", "juniNtpAccessGroupGroup"), ("Juniper-System-Clock-MIB", "juniNtpNotificationGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniSysClockCompliance3 = juniSysClockCompliance3.setStatus('current')
juniSysClockTimeGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 3, 2, 1)).setObjects(("Juniper-System-Clock-MIB", "juniSysClockDateAndTime"), ("Juniper-System-Clock-MIB", "juniSysClockTimeZoneName"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniSysClockTimeGroup = juniSysClockTimeGroup.setStatus('current')
juniSysClockDstGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 3, 2, 2)).setObjects(("Juniper-System-Clock-MIB", "juniSysClockDstName"), ("Juniper-System-Clock-MIB", "juniSysClockDstOffset"), ("Juniper-System-Clock-MIB", "juniSysClockDstStatus"), ("Juniper-System-Clock-MIB", "juniSysClockDstAbsoluteStartTime"), ("Juniper-System-Clock-MIB", "juniSysClockDstAbsoluteStopTime"), ("Juniper-System-Clock-MIB", "juniSysClockDstRecurStartMonth"), ("Juniper-System-Clock-MIB", "juniSysClockDstRecurStartWeek"), ("Juniper-System-Clock-MIB", "juniSysClockDstRecurStartDay"), ("Juniper-System-Clock-MIB", "juniSysClockDstRecurStartHour"), ("Juniper-System-Clock-MIB", "juniSysClockDstRecurStartMinute"), ("Juniper-System-Clock-MIB", "juniSysClockDstRecurStopMonth"), ("Juniper-System-Clock-MIB", "juniSysClockDstRecurStopWeek"), ("Juniper-System-Clock-MIB", "juniSysClockDstRecurStopDay"), ("Juniper-System-Clock-MIB", "juniSysClockDstRecurStopHour"), ("Juniper-System-Clock-MIB", "juniSysClockDstRecurStopMinute"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniSysClockDstGroup = juniSysClockDstGroup.setStatus('current')
juniNtpSysClockGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 3, 2, 3)).setObjects(("Juniper-System-Clock-MIB", "juniNtpSysClockState"), ("Juniper-System-Clock-MIB", "juniNtpSysClockOffsetError"), ("Juniper-System-Clock-MIB", "juniNtpSysClockFrequencyError"), ("Juniper-System-Clock-MIB", "juniNtpSysClockRootDelay"), ("Juniper-System-Clock-MIB", "juniNtpSysClockRootDispersion"), ("Juniper-System-Clock-MIB", "juniNtpSysClockStratumNumber"), ("Juniper-System-Clock-MIB", "juniNtpSysClockLastUpdateTime"), ("Juniper-System-Clock-MIB", "juniNtpSysClockLastUpdateServer"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniNtpSysClockGroup = juniNtpSysClockGroup.setStatus('obsolete')
juniNtpClientGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 3, 2, 4)).setObjects(("Juniper-System-Clock-MIB", "juniNtpClientAdminStatus"), ("Juniper-System-Clock-MIB", "juniNtpClientSystemRouterIndex"), ("Juniper-System-Clock-MIB", "juniNtpClientPacketSourceIfIndex"), ("Juniper-System-Clock-MIB", "juniNtpClientBroadcastDelay"), ("Juniper-System-Clock-MIB", "juniNtpClientIfDisable"), ("Juniper-System-Clock-MIB", "juniNtpClientIfIsBroadcastClient"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniNtpClientGroup = juniNtpClientGroup.setStatus('current')
juniNtpServerGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 3, 2, 5)).setObjects(("Juniper-System-Clock-MIB", "juniNtpServerAdminStatus"), ("Juniper-System-Clock-MIB", "juniNtpServerStratumNumber"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniNtpServerGroup = juniNtpServerGroup.setStatus('current')
juniNtpPeersGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 3, 2, 6)).setObjects(("Juniper-System-Clock-MIB", "juniNtpPeerState"), ("Juniper-System-Clock-MIB", "juniNtpPeerStratumNumber"), ("Juniper-System-Clock-MIB", "juniNtpPeerAssociationMode"), ("Juniper-System-Clock-MIB", "juniNtpPeerBroadcastInterval"), ("Juniper-System-Clock-MIB", "juniNtpPeerPolledInterval"), ("Juniper-System-Clock-MIB", "juniNtpPeerPollingInterval"), ("Juniper-System-Clock-MIB", "juniNtpPeerDelay"), ("Juniper-System-Clock-MIB", "juniNtpPeerDispersion"), ("Juniper-System-Clock-MIB", "juniNtpPeerOffsetError"), ("Juniper-System-Clock-MIB", "juniNtpPeerReachability"), ("Juniper-System-Clock-MIB", "juniNtpPeerPrecision"), ("Juniper-System-Clock-MIB", "juniNtpPeerRootDelay"), ("Juniper-System-Clock-MIB", "juniNtpPeerRootDispersion"), ("Juniper-System-Clock-MIB", "juniNtpPeerRootSyncDistance"), ("Juniper-System-Clock-MIB", "juniNtpPeerRootTime"), ("Juniper-System-Clock-MIB", "juniNtpPeerRootTimeUpdateServer"), ("Juniper-System-Clock-MIB", "juniNtpPeerReceiveTime"), ("Juniper-System-Clock-MIB", "juniNtpPeerTransmitTime"), ("Juniper-System-Clock-MIB", "juniNtpPeerRequestTime"), ("Juniper-System-Clock-MIB", "juniNtpPeerFilterOffset"), ("Juniper-System-Clock-MIB", "juniNtpPeerFilterDelay"), ("Juniper-System-Clock-MIB", "juniNtpPeerFilterDispersion"), ("Juniper-System-Clock-MIB", "juniNtpPeerCfgNtpVersion"), ("Juniper-System-Clock-MIB", "juniNtpPeerCfgPacketSourceIfIndex"), ("Juniper-System-Clock-MIB", "juniNtpPeerCfgIsPreferred"), ("Juniper-System-Clock-MIB", "juniNtpPeerCfgRowStatus"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniNtpPeersGroup = juniNtpPeersGroup.setStatus('obsolete')
juniNtpAccessGroupGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 3, 2, 7)).setObjects(("Juniper-System-Clock-MIB", "juniNtpRouterAccessGroupPeer"), ("Juniper-System-Clock-MIB", "juniNtpRouterAccessGroupServe"), ("Juniper-System-Clock-MIB", "juniNtpRouterAccessGroupServeOnly"), ("Juniper-System-Clock-MIB", "juniNtpRouterAccessGroupQueryOnly"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniNtpAccessGroupGroup = juniNtpAccessGroupGroup.setStatus('current')
juniNtpNotificationGroup = NotificationGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 3, 2, 8)).setObjects(("Juniper-System-Clock-MIB", "juniNtpFrequencyCalibrationStart"), ("Juniper-System-Clock-MIB", "juniNtpFrequencyCalibrationEnd"), ("Juniper-System-Clock-MIB", "juniNtpTimeSynUp"), ("Juniper-System-Clock-MIB", "juniNtpTimeSynDown"), ("Juniper-System-Clock-MIB", "juniNtpTimeServerSynUp"), ("Juniper-System-Clock-MIB", "juniNtpTimeServerSynDown"), ("Juniper-System-Clock-MIB", "juniNtpFirstSystemClockSet"), ("Juniper-System-Clock-MIB", "juniNtpClockOffSetLimitCrossed"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniNtpNotificationGroup = juniNtpNotificationGroup.setStatus('current')
juniNtpSysClockGroup2 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 3, 2, 9)).setObjects(("Juniper-System-Clock-MIB", "juniNtpSysClockState"), ("Juniper-System-Clock-MIB", "juniNtpSysClockRootDelay"), ("Juniper-System-Clock-MIB", "juniNtpSysClockRootDispersion"), ("Juniper-System-Clock-MIB", "juniNtpSysClockStratumNumber"), ("Juniper-System-Clock-MIB", "juniNtpSysClockLastUpdateTime"), ("Juniper-System-Clock-MIB", "juniNtpSysClockLastUpdateServer"), ("Juniper-System-Clock-MIB", "juniNtpSysClockOffsetErrorNew"), ("Juniper-System-Clock-MIB", "juniNtpSysClockFrequencyErrorNew"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniNtpSysClockGroup2 = juniNtpSysClockGroup2.setStatus('current')
juniNtpSysClockDeprecatedGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 3, 2, 10)).setObjects(("Juniper-System-Clock-MIB", "juniNtpSysClockOffsetError"), ("Juniper-System-Clock-MIB", "juniNtpSysClockFrequencyError"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniNtpSysClockDeprecatedGroup = juniNtpSysClockDeprecatedGroup.setStatus('deprecated')
juniNtpPeersGroup1 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 3, 2, 11)).setObjects(("Juniper-System-Clock-MIB", "juniNtpPeerState"), ("Juniper-System-Clock-MIB", "juniNtpPeerStratumNumber"), ("Juniper-System-Clock-MIB", "juniNtpPeerAssociationMode"), ("Juniper-System-Clock-MIB", "juniNtpPeerBroadcastInterval"), ("Juniper-System-Clock-MIB", "juniNtpPeerPolledInterval"), ("Juniper-System-Clock-MIB", "juniNtpPeerPollingInterval"), ("Juniper-System-Clock-MIB", "juniNtpPeerDelay"), ("Juniper-System-Clock-MIB", "juniNtpPeerDispersion"), ("Juniper-System-Clock-MIB", "juniNtpPeerOffsetError"), ("Juniper-System-Clock-MIB", "juniNtpPeerReachability"), ("Juniper-System-Clock-MIB", "juniNtpPeerPrecision"), ("Juniper-System-Clock-MIB", "juniNtpPeerRootDelay"), ("Juniper-System-Clock-MIB", "juniNtpPeerRootDispersion"), ("Juniper-System-Clock-MIB", "juniNtpPeerRootSyncDistance"), ("Juniper-System-Clock-MIB", "juniNtpPeerRootTime"), ("Juniper-System-Clock-MIB", "juniNtpPeerRootTimeUpdateServer"), ("Juniper-System-Clock-MIB", "juniNtpPeerReceiveTime"), ("Juniper-System-Clock-MIB", "juniNtpPeerTransmitTime"), ("Juniper-System-Clock-MIB", "juniNtpPeerRequestTime"), ("Juniper-System-Clock-MIB", "juniNtpPeerFilterOffset"), ("Juniper-System-Clock-MIB", "juniNtpPeerFilterDelay"), ("Juniper-System-Clock-MIB", "juniNtpPeerFilterDispersion"), ("Juniper-System-Clock-MIB", "juniNtpPeerCfgNtpVersion"), ("Juniper-System-Clock-MIB", "juniNtpPeerCfgPacketSourceIfIndex"), ("Juniper-System-Clock-MIB", "juniNtpPeerCfgIsPreferred"), ("Juniper-System-Clock-MIB", "juniNtpPeerCfgRowStatus"), ("Juniper-System-Clock-MIB", "juniNtpPeerLastUpdateTime"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniNtpPeersGroup1 = juniNtpPeersGroup1.setStatus('current')
mibBuilder.exportSymbols("Juniper-System-Clock-MIB", juniSysClockDstAbsoluteStartTime=juniSysClockDstAbsoluteStartTime, juniNtpPeerPrecision=juniNtpPeerPrecision, juniNtpClientIfEntry=juniNtpClientIfEntry, juniNtpPeerFilterIndex=juniNtpPeerFilterIndex, juniNtpClientGroup=juniNtpClientGroup, juniNtpPeerBroadcastInterval=juniNtpPeerBroadcastInterval, juniNtpClientIfDisable=juniNtpClientIfDisable, juniSysClockObjects=juniSysClockObjects, juniSysClockDstGroup=juniSysClockDstGroup, juniNtpSysClockGroup2=juniNtpSysClockGroup2, juniNtpPeerRootTimeUpdateServer=juniNtpPeerRootTimeUpdateServer, juniNtpTimeServerSynUp=juniNtpTimeServerSynUp, juniNtpSysClock=juniNtpSysClock, juniNtpClientAdminStatus=juniNtpClientAdminStatus, juniNtpTimeSynDown=juniNtpTimeSynDown, juniSysClockDstRecurStartHour=juniSysClockDstRecurStartHour, juniNtpPeerRootDelay=juniNtpPeerRootDelay, juniNtpSysClockState=juniNtpSysClockState, juniNtpServerStratumNumber=juniNtpServerStratumNumber, juniSysClockCompliance3=juniSysClockCompliance3, juniNtpPeerFilterDelay=juniNtpPeerFilterDelay, juniNtpServer=juniNtpServer, juniNtpPeerCfgIsPreferred=juniNtpPeerCfgIsPreferred, JuniSysClockMonth=JuniSysClockMonth, juniNtpPeerReachability=juniNtpPeerReachability, juniNtpPeersGroup1=juniNtpPeersGroup1, juniNtpPeerRootDispersion=juniNtpPeerRootDispersion, juniNtpClockOffSetLimitCrossed=juniNtpClockOffSetLimitCrossed, JuniSysClockWeekOfTheMonth=JuniSysClockWeekOfTheMonth, juniSysClockDstRecurStopMinute=juniSysClockDstRecurStopMinute, juniNtpPeerFilterDispersion=juniNtpPeerFilterDispersion, juniSysClockDstAbsoluteStopTime=juniSysClockDstAbsoluteStopTime, JuniNtpClockSignedTime=JuniNtpClockSignedTime, juniNtpClientIfIsBroadcastServerDelay=juniNtpClientIfIsBroadcastServerDelay, JuniNtpClockUnsignedTime=JuniNtpClockUnsignedTime, JuniSysClockHour=JuniSysClockHour, juniNtpPeerDispersion=juniNtpPeerDispersion, juniNtpPeerDelay=juniNtpPeerDelay, juniNtpPeerTransmitTime=juniNtpPeerTransmitTime, 
juniSysClockTimeGroup=juniSysClockTimeGroup, juniNtpPeerCfgRowStatus=juniNtpPeerCfgRowStatus, juniSysClockGroups=juniSysClockGroups, PYSNMP_MODULE_ID=juniSysClockMIB, juniNtpSysClockDeprecatedGroup=juniNtpSysClockDeprecatedGroup, juniNtpPeerFilterOffset=juniNtpPeerFilterOffset, juniNtpSysClockOffsetErrorNew=juniNtpSysClockOffsetErrorNew, juniNtpPeerCfgPacketSourceIfIndex=juniNtpPeerCfgPacketSourceIfIndex, juniSysClockCompliance2=juniSysClockCompliance2, juniSysClockDstName=juniSysClockDstName, juniNtpPeerCfgIpAddress=juniNtpPeerCfgIpAddress, juniSysClockDstRecurStopDay=juniSysClockDstRecurStopDay, juniNtpClientPacketSourceIfIndex=juniNtpClientPacketSourceIfIndex, juniSysClockDstOffset=juniSysClockDstOffset, juniSysClockDstRecurStartDay=juniSysClockDstRecurStartDay, juniNtpAccessGroup=juniNtpAccessGroup, juniNtpAccessGroupGroup=juniNtpAccessGroupGroup, juniNtpPeerPolledInterval=juniNtpPeerPolledInterval, juniNtpSysClockRootDelay=juniNtpSysClockRootDelay, juniNtpTimeSynUp=juniNtpTimeSynUp, juniSysClockMIB=juniSysClockMIB, juniSysClockDstRecurStartMinute=juniSysClockDstRecurStartMinute, juniNtpFrequencyCalibrationEnd=juniNtpFrequencyCalibrationEnd, juniNtpPeerRootSyncDistance=juniNtpPeerRootSyncDistance, juniNtpTraps=juniNtpTraps, juniNtpFrequencyCalibrationStart=juniNtpFrequencyCalibrationStart, juniSysClockDstRecurStartWeek=juniSysClockDstRecurStartWeek, juniNtpRouterAccessGroupQueryOnly=juniNtpRouterAccessGroupQueryOnly, juniNtpPeerAssociationMode=juniNtpPeerAssociationMode, juniNtpSysClockRootDispersion=juniNtpSysClockRootDispersion, juniNtpClientIfIfIndex=juniNtpClientIfIfIndex, juniNtpPeerReceiveTime=juniNtpPeerReceiveTime, juniSysClockDst=juniSysClockDst, juniNtpSysClockOffsetError=juniNtpSysClockOffsetError, juniSysClockDstRecurStopHour=juniSysClockDstRecurStopHour, juniNtpPeerRequestTime=juniNtpPeerRequestTime, juniNtpPeerStratumNumber=juniNtpPeerStratumNumber, juniNtpFirstSystemClockSet=juniNtpFirstSystemClockSet, juniNtpPeerTable=juniNtpPeerTable, 
juniNtpRouterAccessGroupServe=juniNtpRouterAccessGroupServe, juniNtpSysClockFrequencyError=juniNtpSysClockFrequencyError, juniSysClockDstRecurStopWeek=juniSysClockDstRecurStopWeek, juniNtpSysClockFrequencyErrorNew=juniNtpSysClockFrequencyErrorNew, juniSysClockDstRecurStartMonth=juniSysClockDstRecurStartMonth, juniNtpClientIfIsBroadcastServer=juniNtpClientIfIsBroadcastServer, juniNtpSysClockGroup=juniNtpSysClockGroup, JuniSysClockDayOfTheWeek=JuniSysClockDayOfTheWeek, juniNtpPeerCfgNtpVersion=juniNtpPeerCfgNtpVersion, juniNtpServerAdminStatus=juniNtpServerAdminStatus, juniSysClockTime=juniSysClockTime, juniNtpPeerCfgEntry=juniNtpPeerCfgEntry, juniNtpPeerFilterRegisterTable=juniNtpPeerFilterRegisterTable, juniNtpSysClockLastUpdateServer=juniNtpSysClockLastUpdateServer, juniSysClockCompliances=juniSysClockCompliances, JuniSysClockMinute=JuniSysClockMinute, juniSysClockTimeZoneName=juniSysClockTimeZoneName, juniNtpTimeServerSynDown=juniNtpTimeServerSynDown, juniNtpPeerFilterRegisterEntry=juniNtpPeerFilterRegisterEntry, juniNtpClientIfIsBroadcastServerVersion=juniNtpClientIfIsBroadcastServerVersion, juniNtpClientIfTable=juniNtpClientIfTable, juniNtpPeerRootTime=juniNtpPeerRootTime, juniNtpClientIfRouterIndex=juniNtpClientIfRouterIndex, juniNtpClientSystemRouterIndex=juniNtpClientSystemRouterIndex, juniSysClockDstStatus=juniSysClockDstStatus, juniNtpPeerOffsetError=juniNtpPeerOffsetError, juniNtpClientBroadcastDelay=juniNtpClientBroadcastDelay, juniNtpClient=juniNtpClient, juniNtpPeers=juniNtpPeers, juniNtpRouterAccessGroupPeer=juniNtpRouterAccessGroupPeer, juniNtpSysClockLastUpdateTime=juniNtpSysClockLastUpdateTime, juniNtpServerGroup=juniNtpServerGroup, juniNtpPeersGroup=juniNtpPeersGroup, juniNtpClientIfIsBroadcastClient=juniNtpClientIfIsBroadcastClient, juniNtpPeerCfgTable=juniNtpPeerCfgTable, juniNtpPeerLastUpdateTime=juniNtpPeerLastUpdateTime, juniSysClockDstRecurStopMonth=juniSysClockDstRecurStopMonth, juniNtpObjects=juniNtpObjects, 
juniNtpPeerState=juniNtpPeerState, JuniNtpTimeStamp=JuniNtpTimeStamp, juniNtpSysClockStratumNumber=juniNtpSysClockStratumNumber, juniNtpPeerPollingInterval=juniNtpPeerPollingInterval, juniSysClockConformance=juniSysClockConformance, juniNtpNotificationGroup=juniNtpNotificationGroup, juniNtpPeerEntry=juniNtpPeerEntry, juniSysClockCompliance=juniSysClockCompliance, juniNtpRouterAccessGroupServeOnly=juniNtpRouterAccessGroupServeOnly, juniSysClockDateAndTime=juniSysClockDateAndTime)
| 147.101695 | 6,403 | 0.771494 | 4,468 | 43,395 | 7.492614 | 0.077216 | 0.058249 | 0.080652 | 0.094094 | 0.476626 | 0.437823 | 0.396392 | 0.38274 | 0.334678 | 0.310422 | 0 | 0.065523 | 0.077152 | 43,395 | 294 | 6,404 | 147.602041 | 0.77042 | 0.007881 | 0 | 0.101449 | 0 | 0.01087 | 0.238454 | 0.154911 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.028986 | 0 | 0.137681 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
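The group definitions above all pass their OIDs as plain Python tuples. A minimal, library-free sketch of rendering those tuples the way an SNMP tool would display them (the `oid_to_str` helper and the `GROUPS` table are illustrative, not part of pysnmp; the OID values are copied from the definitions above):

```python
def oid_to_str(oid):
    """Join an OID tuple such as (1, 3, 6, 1, ...) into dotted notation."""
    return ".".join(str(arc) for arc in oid)

# OIDs copied from the ObjectGroup/NotificationGroup definitions above.
GROUPS = {
    "juniNtpPeersGroup":        (1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 3, 2, 6),
    "juniNtpAccessGroupGroup":  (1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 3, 2, 7),
    "juniNtpNotificationGroup": (1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 3, 2, 8),
    "juniNtpSysClockGroup2":    (1, 3, 6, 1, 4, 1, 4874, 2, 2, 56, 3, 2, 9),
}

for name, oid in GROUPS.items():
    print(name, oid_to_str(oid))
```

All four groups share the `1.3.6.1.4.1.4874.2.2.56.3.2` prefix — the conformance-groups subtree of this enterprise MIB — and differ only in the final arc.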
ed8717d8100b40bd671ecfd19df8577f56d71f23 | 9,634 | py | Python | Providers/nxOMSAutomationWorker/automationworker/3.x/worker/configuration3.py | romit-kumar/PowerShell-DSC-for-Linux | de82b0f1c2191fbc695bb22523f0f7780c1a3114 | [
"MIT"
] | null | null | null | Providers/nxOMSAutomationWorker/automationworker/3.x/worker/configuration3.py | romit-kumar/PowerShell-DSC-for-Linux | de82b0f1c2191fbc695bb22523f0f7780c1a3114 | [
"MIT"
] | null | null | null | Providers/nxOMSAutomationWorker/automationworker/3.x/worker/configuration3.py | romit-kumar/PowerShell-DSC-for-Linux | de82b0f1c2191fbc695bb22523f0f7780c1a3114 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# ====================================
# Copyright (c) Microsoft Corporation. All rights reserved.
# ====================================
import importHelper
importHelper.install_aliases()
import configparser
import os
import sys
import serializerfactory
json = serializerfactory.get_serializer(sys.version_info)
CONFIG_ENV_KEY = "WORKERCONF"
WORKER_REQUIRED_CONFIG_SECTION = "worker-required"
WORKER_OPTIONAL_CONFIG_SECTION = "worker-optional"
METADATA_CONFIG_SECTION = "metadata"
# manually set configuration values
SOURCE_DIRECTORY_PATH = "source_directory_path"
COMPONENT = "component"
# required configuration keys
CERT_PATH = "jrds_cert_path"
KEY_PATH = "jrds_key_path"
BASE_URI = "jrds_base_uri"
ACCOUNT_ID = "account_id"
MACHINE_ID = "machine_id"
HYBRID_WORKER_GROUP_NAME = "hybrid_worker_group_name"
WORKER_VERSION = "worker_version"
WORKING_DIRECTORY_PATH = "working_directory_path"
# optional configuration keys
DEBUG_TRACES = "debug_traces"
BYPASS_CERTIFICATE_VERIFICATION = "bypass_certificate_verification"
ENFORCE_RUNBOOK_SIGNATURE_VALIDATION = "enforce_runbook_signature_validation"
GPG_PUBLIC_KEYRING_PATH = "gpg_public_keyring_path"
STATE_DIRECTORY_PATH = "state_directory_path"
JRDS_POLLING_FREQUENCY = "jrds_polling_frequency"
PROXY_CONFIGURATION_PATH = "proxy_configuration_path"
# optional metadata configuration keys
VM_ID = "vm_id"
IS_AZURE_VM = "is_azure_vm"
WORKER_TYPE = "worker_type"
# optional configuration default values
DEFAULT_EMPTY = ""
DEFAULT_DEBUG_TRACES = "false"
DEFAUTL_BYPASS_CERTIFICATE_VERIFICATION = "false"
DEFAULT_ENFORCE_RUNBOOK_SIGNATURE_VALIDATION = "true"
DEFAULT_GPG_PUBLIC_KEYRING_PATH = DEFAULT_EMPTY
DEFAULT_STATE_DIRECTORY_PATH = DEFAULT_EMPTY
DEFAULT_PROXY_CONFIGURATION_PATH = DEFAULT_EMPTY
DEFAULT_UNKNOWN = "Unknown"
DEFAULT_VM_ID = DEFAULT_UNKNOWN
DEFAULT_WORKER_TYPE = DEFAULT_UNKNOWN
DEFAULT_COMPONENT = DEFAULT_UNKNOWN
DEFAULT_WORKER_VERSION = "1.7.6.0"
DEFAULT_JRDS_POLLING_FREQUENCY = "15"
# state configuration keys
STATE_PID = "pid"
STATE_RESOURCE_VERSION = "resource_version"
STATE_WORKSPACE_ID = "workspace_id"
STATE_WORKER_VERSION = "worker_version"
# other configuration keys (optional and most likely not used by the worker)
AGENT_ID = "agent_id"
WORKSPACE_ID = "workspace_id"
REGISTRATION_ENDPOINT = "registration_endpoint"
CERTIFICATE_THUMBPRINT = "jrds_cert_thumbprint"
worker_configuration_file_path = DEFAULT_EMPTY
def read_and_set_configuration(configuration_file_path):
"""Reads the worker configuration from the file at config_path and sets the read configuration to
the env variable.
The configuration is read from the path, put into a dictionary which is then serialized and set in the env
variable.
Notes:
The WORKER_VERSION has to be set manually for now.
The COMPONENT has to be set manually at the entry point of each component (worker/sandbox).
Args:
configuration_file_path: string, the configuration file path.
"""
global worker_configuration_file_path
worker_configuration_file_path = configuration_file_path
clear_config()
# init and set default values for optional configuration keys
config = configparser.SafeConfigParser({DEBUG_TRACES: DEFAULT_DEBUG_TRACES,
BYPASS_CERTIFICATE_VERIFICATION: DEFAUTL_BYPASS_CERTIFICATE_VERIFICATION,
ENFORCE_RUNBOOK_SIGNATURE_VALIDATION: DEFAULT_ENFORCE_RUNBOOK_SIGNATURE_VALIDATION,
GPG_PUBLIC_KEYRING_PATH: DEFAULT_GPG_PUBLIC_KEYRING_PATH,
STATE_DIRECTORY_PATH: DEFAULT_STATE_DIRECTORY_PATH,
JRDS_POLLING_FREQUENCY: DEFAULT_JRDS_POLLING_FREQUENCY,
PROXY_CONFIGURATION_PATH: DEFAULT_PROXY_CONFIGURATION_PATH,
WORKER_TYPE: DEFAULT_WORKER_TYPE,
VM_ID: DEFAULT_VM_ID,
IS_AZURE_VM: "False"})
# load the worker configuration file
config.read(configuration_file_path)
# create the configuration dictionary
# read required configuration values
configuration = {CERT_PATH: os.path.abspath(config.get(WORKER_REQUIRED_CONFIG_SECTION, CERT_PATH)),
KEY_PATH: os.path.abspath(config.get(WORKER_REQUIRED_CONFIG_SECTION, KEY_PATH)),
BASE_URI: config.get(WORKER_REQUIRED_CONFIG_SECTION, BASE_URI),
ACCOUNT_ID: config.get(WORKER_REQUIRED_CONFIG_SECTION, ACCOUNT_ID),
MACHINE_ID: config.get(WORKER_REQUIRED_CONFIG_SECTION, MACHINE_ID),
HYBRID_WORKER_GROUP_NAME: config.get(WORKER_REQUIRED_CONFIG_SECTION, HYBRID_WORKER_GROUP_NAME),
WORKING_DIRECTORY_PATH: os.path.abspath(
config.get(WORKER_REQUIRED_CONFIG_SECTION, WORKING_DIRECTORY_PATH)),
SOURCE_DIRECTORY_PATH: os.path.dirname(os.path.realpath(__file__)),
WORKER_VERSION: DEFAULT_WORKER_VERSION,
COMPONENT: DEFAULT_COMPONENT}
# read optional configuration section
configuration.update({DEBUG_TRACES: config.getboolean(WORKER_OPTIONAL_CONFIG_SECTION, DEBUG_TRACES),
BYPASS_CERTIFICATE_VERIFICATION: config.getboolean(WORKER_OPTIONAL_CONFIG_SECTION,
BYPASS_CERTIFICATE_VERIFICATION),
ENFORCE_RUNBOOK_SIGNATURE_VALIDATION: config.getboolean(WORKER_OPTIONAL_CONFIG_SECTION,
ENFORCE_RUNBOOK_SIGNATURE_VALIDATION),
GPG_PUBLIC_KEYRING_PATH: config.get(WORKER_OPTIONAL_CONFIG_SECTION, GPG_PUBLIC_KEYRING_PATH),
STATE_DIRECTORY_PATH: config.get(WORKER_OPTIONAL_CONFIG_SECTION, STATE_DIRECTORY_PATH),
JRDS_POLLING_FREQUENCY: config.getint(WORKER_OPTIONAL_CONFIG_SECTION, JRDS_POLLING_FREQUENCY),
PROXY_CONFIGURATION_PATH: config.get(WORKER_OPTIONAL_CONFIG_SECTION,
PROXY_CONFIGURATION_PATH),
WORKER_TYPE: config.get(METADATA_CONFIG_SECTION, WORKER_TYPE),
VM_ID: config.get(METADATA_CONFIG_SECTION, VM_ID),
IS_AZURE_VM: config.getboolean(METADATA_CONFIG_SECTION, IS_AZURE_VM)})
# set the worker conf to env var
set_config(configuration)
def set_config(configuration):
"""Sets the worker configuration to the env variable.
This method merges the provided dictionary into any existing value already stored in the environment variable.
Args:
configuration: dictionary(string), the configuration key value pairs.
"""
try:
env_config = os.environ[CONFIG_ENV_KEY]
config = json.loads(env_config)
config.update(configuration)
configuration = config
except KeyError:
pass
os.environ[CONFIG_ENV_KEY] = json.dumps(configuration)
def clear_config():
try:
del os.environ[CONFIG_ENV_KEY]
except Exception:
pass
def get_value(key):
"""Gets a specific value from the configuration value in the environment variable.
The serialized configuration is read from the environment variable and the value for the given key is returned.
Args:
key: string, the configuration key value.
Returns:
The configuration value.
"""
try:
return json.loads(os.environ[CONFIG_ENV_KEY])[key]
except KeyError:
raise KeyError("Configuration environment variable not found. [key=" + key + "].")
def get_jrds_get_sandbox_actions_polling_freq():
return get_value(JRDS_POLLING_FREQUENCY)
def get_jrds_get_job_actions_polling_freq():
return get_value(JRDS_POLLING_FREQUENCY)
def get_component():
return get_value(COMPONENT)
def get_jrds_cert_path():
return get_value(CERT_PATH)
def get_jrds_key_path():
return get_value(KEY_PATH)
def get_jrds_base_uri():
return get_value(BASE_URI)
def get_account_id():
return get_value(ACCOUNT_ID)
def get_machine_id():
return get_value(MACHINE_ID)
def get_hybrid_worker_name():
return get_value(HYBRID_WORKER_GROUP_NAME)
def get_worker_version():
return get_value(WORKER_VERSION)
def get_worker_configuration_file_path():
return worker_configuration_file_path
def get_working_directory_path():
return get_value(WORKING_DIRECTORY_PATH)
def get_debug_traces():
return get_value(DEBUG_TRACES)
def get_verify_certificates():
return get_value(BYPASS_CERTIFICATE_VERIFICATION)
def get_source_directory_path():
return get_value(SOURCE_DIRECTORY_PATH)
def get_enforce_runbook_signature_validation():
return get_value(ENFORCE_RUNBOOK_SIGNATURE_VALIDATION)
def get_gpg_public_keyrings_path():
"""Return a list of string representing keyring path."""
keyring_list = str(get_value(GPG_PUBLIC_KEYRING_PATH)).split(",")
sanitized_list = list(map(str.strip, keyring_list))
return sanitized_list
def get_state_directory_path():
return get_value(STATE_DIRECTORY_PATH)
def get_temporary_request_directory_path():
return os.path.join(get_working_directory_path(), "req_tmp")
def get_proxy_configuration_path():
return get_value(PROXY_CONFIGURATION_PATH)
def get_worker_type():
return get_value(WORKER_TYPE)
| 35.160584 | 127 | 0.711542 | 1,117 | 9,634 | 5.724261 | 0.163832 | 0.044729 | 0.039412 | 0.04645 | 0.371286 | 0.251329 | 0.1736 | 0.097279 | 0.072412 | 0.072412 | 0 | 0.000933 | 0.221403 | 9,634 | 273 | 128 | 35.289377 | 0.851486 | 0.172929 | 0 | 0.057692 | 0 | 0 | 0.075777 | 0.028528 | 0 | 0 | 0 | 0 | 0 | 1 | 0.160256 | false | 0.051282 | 0.038462 | 0.128205 | 0.339744 | 0.00641 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 2 |
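configuration3.py keeps the whole worker configuration as one JSON blob in a single environment variable, so spawned components inherit it without re-reading the file. The read-merge-write cycle of `set_config`/`get_value` can be sketched with just the standard library (stdlib `json` here stands in for the repo's serializerfactory; the key name mirrors the module's `CONFIG_ENV_KEY`):

```python
import json
import os

CONFIG_ENV_KEY = "WORKERCONF"  # same env-var name the module above uses

def set_config(configuration):
    """Merge the given dict into the JSON blob stored in the env variable."""
    try:
        merged = json.loads(os.environ[CONFIG_ENV_KEY])
        merged.update(configuration)
        configuration = merged
    except KeyError:
        pass  # first write: nothing to merge with
    os.environ[CONFIG_ENV_KEY] = json.dumps(configuration)

def get_value(key):
    """Look a single key up in the deserialized configuration."""
    return json.loads(os.environ[CONFIG_ENV_KEY])[key]

set_config({"jrds_polling_frequency": 15})
set_config({"component": "sandbox"})        # merges, does not overwrite
print(get_value("jrds_polling_frequency"))  # -> 15
print(get_value("component"))               # -> sandbox
```

The merge-before-write step is what lets each component (worker, sandbox) layer its own keys, such as `component`, on top of the shared configuration.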
71ec21ae09ea080c972190abbeb176bbfa00fb1d | 2,702 | py | Python | registry/domain/models/organization_member.py | vinthedark/snet-marketplace-service | 66ed9d093b00f09d3e28ef4d86c4e4c125037d06 | [
"MIT"
] | 14 | 2019-02-12T09:14:52.000Z | 2021-03-11T18:42:22.000Z | registry/domain/models/organization_member.py | vinthedark/snet-marketplace-service | 66ed9d093b00f09d3e28ef4d86c4e4c125037d06 | [
"MIT"
] | 1,079 | 2019-01-10T04:31:24.000Z | 2022-03-29T06:16:42.000Z | registry/domain/models/organization_member.py | vinthedark/snet-marketplace-service | 66ed9d093b00f09d3e28ef4d86c4e4c125037d06 | [
"MIT"
] | 20 | 2018-12-18T13:06:41.000Z | 2021-09-17T11:13:01.000Z | from uuid import uuid4
from common.utils import datetime_to_string
class OrganizationMember(object):
def __init__(self, org_uuid, username, status, role, address=None,
invite_code=None, transaction_hash=None, invited_on=None, updated_on=None):
self.__role = role
self.__org_uuid = org_uuid
self.__username = username
self.__status = status
self.__address = address
self.__invite_code = invite_code
self.__transaction_hash = transaction_hash
self.__invited_on = invited_on
self.__updated_on = updated_on
@property
def org_uuid(self):
return self.__org_uuid
@property
def username(self):
return self.__username
@property
def status(self):
return self.__status
@property
def address(self):
return self.__address
@property
def role(self):
return self.__role
@property
def invite_code(self):
return self.__invite_code
@property
def transaction_hash(self):
return self.__transaction_hash
@property
def invited_on(self):
return self.__invited_on
@property
def updated_on(self):
return self.__updated_on
def set_transaction_hash(self, transaction_hash):
self.__transaction_hash = transaction_hash
def __repr__(self):
return "Item(%s, %s,%s)" % (self.address, self.username, self.role)
def __eq__(self, other):
if isinstance(other, OrganizationMember):
return self.address == other.address and self.username == other.username and self.role == other.role
else:
return False
def __ne__(self, other):
return not self.__eq__(other)
def __hash__(self):
return hash(self.__repr__())
def to_response(self):
member_dict = {
"username": self.username,
"address": self.address,
"status": self.status,
"role": self.role,
"invited_on": "",
"updated_on": ""
}
if self.invited_on is not None:
member_dict["invited_on"] = datetime_to_string(self.invited_on)
if self.updated_on is not None:
member_dict["updated_on"] = datetime_to_string(self.updated_on)
return member_dict
def generate_invite_code(self):
self.__invite_code = uuid4().hex
def set_status(self, status):
self.__status = status
def set_invited_on(self, invited_on):
self.__invited_on = invited_on
def set_updated_on(self, updated_on):
self.__updated_on = updated_on
def set_role(self, role):
self.__role = role
| 26.490196 | 112 | 0.633235 | 323 | 2,702 | 4.863777 | 0.151703 | 0.074475 | 0.080204 | 0.028644 | 0.203692 | 0.057288 | 0 | 0 | 0 | 0 | 0 | 0.001028 | 0.279793 | 2,702 | 101 | 113 | 26.752475 | 0.806269 | 0 | 0 | 0.24359 | 0 | 0 | 0.029608 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.269231 | false | 0 | 0.025641 | 0.153846 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
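OrganizationMember defines `__eq__` over `(address, username, role)` and derives `__hash__` from `__repr__`, which is built from the same three fields, so equal members also hash equally and collapse in sets and dict keys. The contract in miniature (the `Member` class below is a cut-down, hypothetical stand-in, not the repo's class):

```python
class Member:
    """Minimal stand-in mirroring OrganizationMember's equality contract."""
    def __init__(self, address, username, role):
        self.address = address
        self.username = username
        self.role = role

    def __repr__(self):
        return "Item(%s, %s, %s)" % (self.address, self.username, self.role)

    def __eq__(self, other):
        if isinstance(other, Member):
            return (self.address == other.address
                    and self.username == other.username
                    and self.role == other.role)
        return False

    def __ne__(self, other):
        return not self.__eq__(other)

    def __hash__(self):
        # Hashing the repr keeps hash() consistent with __eq__, since the
        # repr is built from exactly the fields __eq__ compares.
        return hash(self.__repr__())

a = Member("0xabc", "alice", "owner")
b = Member("0xabc", "alice", "owner")
print(a == b, len({a, b}))  # equal members collapse in a set
```

Keeping `__hash__` derived from the same fields as `__eq__` is what makes membership lists deduplicate correctly when members arrive from both the invite flow and the on-chain transaction.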
71fb812cd1919ad2115f85e01a4e729bce16b4a0 | 16,780 | py | Python | pysnmp/CTRON-SFPS-CALL-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 11 | 2021-02-02T16:27:16.000Z | 2021-08-31T06:22:49.000Z | pysnmp/CTRON-SFPS-CALL-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 75 | 2021-02-24T17:30:31.000Z | 2021-12-08T00:01:18.000Z | pysnmp/CTRON-SFPS-CALL-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module CTRON-SFPS-CALL-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/CTRON-SFPS-CALL-MIB
# Produced by pysmi-0.3.4 at Mon Apr 29 18:15:09 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
OctetString, ObjectIdentifier, Integer = mibBuilder.importSymbols("ASN1", "OctetString", "ObjectIdentifier", "Integer")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ConstraintsUnion, SingleValueConstraint, ConstraintsIntersection, ValueSizeConstraint, ValueRangeConstraint = mibBuilder.importSymbols("ASN1-REFINEMENT", "ConstraintsUnion", "SingleValueConstraint", "ConstraintsIntersection", "ValueSizeConstraint", "ValueRangeConstraint")
sfpsSapAPI, sfpsCallTableStats, sfpsSap, sfpsCallByTuple = mibBuilder.importSymbols("CTRON-SFPS-INCLUDE-MIB", "sfpsSapAPI", "sfpsCallTableStats", "sfpsSap", "sfpsCallByTuple")
ModuleCompliance, NotificationGroup = mibBuilder.importSymbols("SNMPv2-CONF", "ModuleCompliance", "NotificationGroup")
Counter32, MibIdentifier, Gauge32, ModuleIdentity, IpAddress, TimeTicks, Integer32, MibScalar, MibTable, MibTableRow, MibTableColumn, NotificationType, ObjectIdentity, iso, Unsigned32, Counter64, Bits = mibBuilder.importSymbols("SNMPv2-SMI", "Counter32", "MibIdentifier", "Gauge32", "ModuleIdentity", "IpAddress", "TimeTicks", "Integer32", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "NotificationType", "ObjectIdentity", "iso", "Unsigned32", "Counter64", "Bits")
DisplayString, TextualConvention = mibBuilder.importSymbols("SNMPv2-TC", "DisplayString", "TextualConvention")
class HexInteger(Integer32):
pass
sfpsSapTable = MibTable((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 1), )
if mibBuilder.loadTexts: sfpsSapTable.setStatus('mandatory')
sfpsSapTableEntry = MibTableRow((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 1, 1), ).setIndexNames((0, "CTRON-SFPS-CALL-MIB", "sfpsSapTableTag"), (0, "CTRON-SFPS-CALL-MIB", "sfpsSapTableHash"), (0, "CTRON-SFPS-CALL-MIB", "sfpsSapTableHashIndex"))
if mibBuilder.loadTexts: sfpsSapTableEntry.setStatus('mandatory')
sfpsSapTableTag = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 1, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsSapTableTag.setStatus('mandatory')
sfpsSapTableHash = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 1, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsSapTableHash.setStatus('mandatory')
sfpsSapTableHashIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 1, 1, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsSapTableHashIndex.setStatus('mandatory')
sfpsSapTableSourceCP = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 1, 1, 4), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsSapTableSourceCP.setStatus('mandatory')
sfpsSapTableDestCP = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 1, 1, 5), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsSapTableDestCP.setStatus('mandatory')
sfpsSapTableSAP = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 1, 1, 6), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsSapTableSAP.setStatus('mandatory')
sfpsSapTableOperStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 1, 1, 7), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("other", 1), ("disable", 2), ("enable", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsSapTableOperStatus.setStatus('mandatory')
sfpsSapTableAdminStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 1, 1, 8), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("other", 1), ("disable", 2), ("enable", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsSapTableAdminStatus.setStatus('mandatory')
sfpsSapTableStateTime = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 1, 1, 9), TimeTicks()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsSapTableStateTime.setStatus('mandatory')
sfpsSapTableDescription = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 1, 1, 10), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsSapTableDescription.setStatus('mandatory')
sfpsSapTableNumAccepted = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 1, 1, 11), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsSapTableNumAccepted.setStatus('mandatory')
sfpsSapTableNumDropped = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 1, 1, 12), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsSapTableNumDropped.setStatus('mandatory')
sfpsSapTableUnicastSap = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 1, 1, 13), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: sfpsSapTableUnicastSap.setStatus('mandatory')
sfpsSapTableNVStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 1, 1, 14), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4))).clone(namedValues=NamedValues(("other", 1), ("disable", 2), ("enable", 3), ("unset", 4)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsSapTableNVStatus.setStatus('mandatory')
sfpsSapAPIVerb = MibScalar((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 2, 1), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11))).clone(namedValues=NamedValues(("getStatus", 1), ("next", 2), ("first", 3), ("disable", 4), ("disableInNvram", 5), ("enable", 6), ("enableInNvram", 7), ("clearFromNvram", 8), ("clearAllNvram", 9), ("resetStats", 10), ("resetAllStats", 11)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: sfpsSapAPIVerb.setStatus('mandatory')
sfpsSapAPISourceCP = MibScalar((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 2, 2), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: sfpsSapAPISourceCP.setStatus('mandatory')
sfpsSapAPIDestCP = MibScalar((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 2, 3), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: sfpsSapAPIDestCP.setStatus('mandatory')
sfpsSapAPISAP = MibScalar((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 2, 4), DisplayString()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: sfpsSapAPISAP.setStatus('mandatory')
sfpsSapAPINVStatus = MibScalar((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 2, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4))).clone(namedValues=NamedValues(("other", 1), ("disable", 2), ("enable", 3), ("unset", 4)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsSapAPINVStatus.setStatus('mandatory')
sfpsSapAPIAdminStatus = MibScalar((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 2, 6), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("other", 1), ("disable", 2), ("enable", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsSapAPIAdminStatus.setStatus('mandatory')
sfpsSapAPIOperStatus = MibScalar((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 2, 7), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("other", 1), ("disable", 2), ("enable", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsSapAPIOperStatus.setStatus('mandatory')
sfpsSapAPINvSet = MibScalar((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 2, 8), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsSapAPINvSet.setStatus('mandatory')
sfpsSapAPINVTotal = MibScalar((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 2, 9), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: sfpsSapAPINVTotal.setStatus('mandatory')
sfpsSapAPINumAccept = MibScalar((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 2, 10), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsSapAPINumAccept.setStatus('mandatory')
sfpsSapAPINvDiscard = MibScalar((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 2, 11), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsSapAPINvDiscard.setStatus('mandatory')
sfpsSapAPIDefaultStatus = MibScalar((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 2, 12), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("other", 1), ("disable", 2), ("enable", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsSapAPIDefaultStatus.setStatus('mandatory')
sfpsCallByTupleTable = MibTable((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 5, 1), )
if mibBuilder.loadTexts: sfpsCallByTupleTable.setStatus('mandatory')
sfpsCallByTupleEntry = MibTableRow((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 5, 1, 1), ).setIndexNames((0, "CTRON-SFPS-CALL-MIB", "sfpsCallByTupleInPort"), (0, "CTRON-SFPS-CALL-MIB", "sfpsCallByTupleSrcHash"), (0, "CTRON-SFPS-CALL-MIB", "sfpsCallByTupleDstHash"), (0, "CTRON-SFPS-CALL-MIB", "sfpsCallByTupleHashIndex"))
if mibBuilder.loadTexts: sfpsCallByTupleEntry.setStatus('mandatory')
sfpsCallByTupleInPort = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 5, 1, 1, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsCallByTupleInPort.setStatus('mandatory')
sfpsCallByTupleSrcHash = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 5, 1, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsCallByTupleSrcHash.setStatus('mandatory')
sfpsCallByTupleDstHash = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 5, 1, 1, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsCallByTupleDstHash.setStatus('mandatory')
sfpsCallByTupleHashIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 5, 1, 1, 4), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsCallByTupleHashIndex.setStatus('mandatory')
sfpsCallByTupleBotSrcType = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 5, 1, 1, 5), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsCallByTupleBotSrcType.setStatus('mandatory')
sfpsCallByTupleBotSrcAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 5, 1, 1, 6), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsCallByTupleBotSrcAddress.setStatus('mandatory')
sfpsCallByTupleBotDstType = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 5, 1, 1, 7), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsCallByTupleBotDstType.setStatus('mandatory')
sfpsCallByTupleBotDstAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 5, 1, 1, 8), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsCallByTupleBotDstAddress.setStatus('mandatory')
sfpsCallByTupleTopSrcType = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 5, 1, 1, 9), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsCallByTupleTopSrcType.setStatus('mandatory')
sfpsCallByTupleTopSrcAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 5, 1, 1, 10), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsCallByTupleTopSrcAddress.setStatus('mandatory')
sfpsCallByTupleTopDstType = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 5, 1, 1, 11), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsCallByTupleTopDstType.setStatus('mandatory')
sfpsCallByTupleTopDstAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 5, 1, 1, 12), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsCallByTupleTopDstAddress.setStatus('mandatory')
sfpsCallByTupleCallProcName = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 5, 1, 1, 13), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsCallByTupleCallProcName.setStatus('mandatory')
sfpsCallByTupleCallTag = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 5, 1, 1, 14), HexInteger()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsCallByTupleCallTag.setStatus('mandatory')
sfpsCallByTupleCallState = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 5, 1, 1, 15), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsCallByTupleCallState.setStatus('mandatory')
sfpsCallByTupleTimeRemaining = MibTableColumn((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 5, 1, 1, 16), TimeTicks()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsCallByTupleTimeRemaining.setStatus('mandatory')
sfpsCallTableStatsRam = MibScalar((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 6, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsCallTableStatsRam.setStatus('mandatory')
sfpsCallTableStatsSize = MibScalar((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 6, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsCallTableStatsSize.setStatus('mandatory')
sfpsCallTableStatsInUse = MibScalar((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 6, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsCallTableStatsInUse.setStatus('mandatory')
sfpsCallTableStatsMax = MibScalar((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 6, 4), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsCallTableStatsMax.setStatus('mandatory')
sfpsCallTableStatsTotMisses = MibScalar((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 6, 5), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsCallTableStatsTotMisses.setStatus('mandatory')
sfpsCallTableStatsMissStart = MibScalar((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 6, 7), TimeTicks()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsCallTableStatsMissStart.setStatus('mandatory')
sfpsCallTableStatsMissStop = MibScalar((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 6, 8), TimeTicks()).setMaxAccess("readonly")
if mibBuilder.loadTexts: sfpsCallTableStatsMissStop.setStatus('mandatory')
sfpsCallTableStatsLastMiss = MibScalar((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 5, 1, 6, 9), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: sfpsCallTableStatsLastMiss.setStatus('mandatory')
mibBuilder.exportSymbols("CTRON-SFPS-CALL-MIB", sfpsSapTableDestCP=sfpsSapTableDestCP, sfpsCallTableStatsTotMisses=sfpsCallTableStatsTotMisses, sfpsSapTableEntry=sfpsSapTableEntry, sfpsCallByTupleInPort=sfpsCallByTupleInPort, sfpsCallByTupleTable=sfpsCallByTupleTable, sfpsCallByTupleBotSrcType=sfpsCallByTupleBotSrcType, sfpsCallByTupleBotSrcAddress=sfpsCallByTupleBotSrcAddress, sfpsSapTableAdminStatus=sfpsSapTableAdminStatus, sfpsSapTableUnicastSap=sfpsSapTableUnicastSap, sfpsSapAPIVerb=sfpsSapAPIVerb, sfpsSapAPISAP=sfpsSapAPISAP, sfpsSapTableSourceCP=sfpsSapTableSourceCP, sfpsSapAPINvSet=sfpsSapAPINvSet, sfpsSapAPINumAccept=sfpsSapAPINumAccept, sfpsCallByTupleTopSrcType=sfpsCallByTupleTopSrcType, sfpsSapTableStateTime=sfpsSapTableStateTime, sfpsCallByTupleBotDstAddress=sfpsCallByTupleBotDstAddress, sfpsCallByTupleTopSrcAddress=sfpsCallByTupleTopSrcAddress, sfpsCallByTupleTopDstType=sfpsCallByTupleTopDstType, sfpsSapAPIAdminStatus=sfpsSapAPIAdminStatus, sfpsCallByTupleEntry=sfpsCallByTupleEntry, sfpsCallByTupleTopDstAddress=sfpsCallByTupleTopDstAddress, sfpsSapAPINVTotal=sfpsSapAPINVTotal, sfpsCallTableStatsMax=sfpsCallTableStatsMax, sfpsCallTableStatsRam=sfpsCallTableStatsRam, sfpsCallTableStatsSize=sfpsCallTableStatsSize, sfpsSapTableHash=sfpsSapTableHash, sfpsSapTableNVStatus=sfpsSapTableNVStatus, sfpsCallByTupleCallTag=sfpsCallByTupleCallTag, sfpsSapAPINvDiscard=sfpsSapAPINvDiscard, sfpsCallByTupleTimeRemaining=sfpsCallByTupleTimeRemaining, sfpsSapTableDescription=sfpsSapTableDescription, HexInteger=HexInteger, sfpsSapTableNumDropped=sfpsSapTableNumDropped, sfpsSapAPIDefaultStatus=sfpsSapAPIDefaultStatus, sfpsSapTable=sfpsSapTable, sfpsCallByTupleSrcHash=sfpsCallByTupleSrcHash, sfpsSapAPIDestCP=sfpsSapAPIDestCP, sfpsSapTableTag=sfpsSapTableTag, sfpsCallByTupleDstHash=sfpsCallByTupleDstHash, sfpsSapAPIOperStatus=sfpsSapAPIOperStatus, sfpsSapAPINVStatus=sfpsSapAPINVStatus, sfpsCallTableStatsMissStart=sfpsCallTableStatsMissStart, 
sfpsCallByTupleBotDstType=sfpsCallByTupleBotDstType, sfpsCallByTupleCallState=sfpsCallByTupleCallState, sfpsCallByTupleCallProcName=sfpsCallByTupleCallProcName, sfpsSapTableHashIndex=sfpsSapTableHashIndex, sfpsSapTableSAP=sfpsSapTableSAP, sfpsSapTableNumAccepted=sfpsSapTableNumAccepted, sfpsCallByTupleHashIndex=sfpsCallByTupleHashIndex, sfpsSapTableOperStatus=sfpsSapTableOperStatus, sfpsCallTableStatsMissStop=sfpsCallTableStatsMissStop, sfpsCallTableStatsLastMiss=sfpsCallTableStatsLastMiss, sfpsCallTableStatsInUse=sfpsCallTableStatsInUse, sfpsSapAPISourceCP=sfpsSapAPISourceCP)
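Every object exported above is registered under a numeric OID tuple. As a side note, such a tuple renders to the standard dotted notation with a one-liner (illustrative helper only, not part of the generated module; the name `oid_to_str` is made up here):

```python
# Illustrative only: render a pysnmp-style OID tuple in dotted notation.
def oid_to_str(oid):
    return ".".join(str(component) for component in oid)

# e.g. oid_to_str((1, 3, 6, 1, 4, 1, 52, 4, 2, 4, 2, 2, 2, 2, 2, 5))
```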
# --- backend/treeckle/users/migrations/0010_auto_20210708_1620.py (CAPTxTreeckle/Treeckle-3.0, MIT) ---
# Generated by Django 3.2.3 on 2021-07-08 16:20
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('users', '0009_gmailauthentication'),
]
operations = [
migrations.RemoveField(
model_name='user',
name='third_party_authenticator',
),
migrations.RemoveField(
model_name='user',
name='third_party_id',
),
migrations.DeleteModel(
name='GmailAuthentication',
),
]
# --- scripts/m2.py (byu-iot-security/cheeziot_webserver, MIT) ---
import pymongo
import os
import datetime
from base64 import decodestring
from bson.objectid import ObjectId
from pymongo import MongoClient
config_file = open("config", 'r')
collection = ''
database = ''
# Parse the configuration (config) file
for line in config_file:
    field, val = line.split("=", 1)  # split on the first '=' only
    if field == "COLLECTION":
        collection = val.rstrip()
    elif field == "DATABASE":
        database = val.rstrip()
print(collection)
print(database)
client = MongoClient('localhost', 27017)
# TODO: Only retrieve records with image data
# Assume that all records have an image, for now
# Get a hardcoded entry
# entry = client[database][collection].find_one({"_id": ObjectId("58a61687870a765994850d5a")})
# Sort from newest to oldest based on the kaa timestamp, and return the newest record
# For sorting nested fields, see http://stackoverflow.com/questions/12031507/mongodb-sorting-by-nested-object-value
entry = client[database][collection].find().sort("header.timestamp", pymongo.DESCENDING).limit(1)[0]
# Get the most recent image record according to _id
# The _id field will contain an implicit timestamp in it
# See http://stackoverflow.com/questions/4421207/mongodb-how-to-get-the-last-n-records
# entry = client[database][collection].find().sort("_id", pymongo.DESCENDING).limit(1)[0]
# NOTE: find_one() and find().limit(1) aren't perfectly interchangeable
# See http://dba.stackexchange.com/questions/7573/difference-between-mongodbs-find-and-findone-calls
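The comments above note that `_id` carries an implicit creation timestamp. A minimal stdlib sketch of why that works (the helper name is made up; it assumes a standard 12-byte ObjectId whose first 4 bytes are a big-endian Unix timestamp):

```python
import datetime

def objectid_timestamp(oid_hex):
    # The first 8 hex chars of an ObjectId encode its creation time as a
    # big-endian Unix timestamp, which is why sorting on _id approximates
    # newest-first ordering.
    seconds = int(oid_hex[:8], 16)
    return datetime.datetime.utcfromtimestamp(seconds)

# e.g. the hardcoded id above decodes to a date in February 2017:
# objectid_timestamp("58a61687870a765994850d5a")
```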
# # Other tests
# cursor = client[database][collection].find().sort("_id", pymongo.DESCENDING)
# cursor = client[database][collection].find()
# print cursor[0].get("_id")
print("-----------------------")
print(entry.get("_id"))
person_name = entry.get("event").get("person_name")
if person_name:
name = person_name.rstrip()
else:
name = "?"
if os.path.isfile("public/faces.html"):
os.remove("public/faces.html")
# Construct the faces.html page to be served to a client.
last_seen = open("public/faces.html", "w")
last_seen.write("<!doctype html>\n")
last_seen.write(" <head>\n")
last_seen.write(" <title>Faces of the Clyde</title>\n")
last_seen.write(" </head>\n")
last_seen.write(" <body>\n")
last_seen.write(" <img src=\"images/faces.png\">\n")
last_seen.write(" <div>\n")
last_seen.write(" <img src=\"images/test_out.bmp\" width=\"200\" height=\"200\">\n")
name_string = " <font size = \"6\" face=\"Courier New\">" + name + "</b>\n"
last_seen.write(" </div>\n")
last_seen.write(" <div>\n")
last_seen.write(name_string)
last_seen.write(" </div>\n")
last_seen.write(" </body>\n")
last_seen.write("</html>\n")
last_seen.close()
raw_image_data = entry.get("event").get("image_data")
# If test_out.bmp already exists, delete it
if os.path.isfile("public/images/test_out.bmp"):
    os.remove("public/images/test_out.bmp")
# Write the decoded image bytes out; open() replaces the non-portable
# file() builtin, and the with-block guarantees the handle is closed
with open("public/images/test_out.bmp", "wb") as f:
    for i in raw_image_data:
        f.write(decodestring(i))
# --- Level1/Lessons12977/wowo0709.py (StudyForCoding/ProgrammersLevel, MIT) ---
# Making primes
from math import sqrt
def solution(nums):
answer = 0
for i in range(len(nums)):
for j in range(i+1,len(nums)):
for k in range(i+1,j):
flag = 1
num = nums[i] + nums[k] + nums[j]
for n in range(2,int(sqrt(num))+1):
if num % n == 0:
flag = 0
break
if flag: answer += 1
return answer
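The triple loop above can be written equivalently with `itertools.combinations` (sketched here under the assumption that the task is counting triples of distinct elements whose sum is prime; renamed `count_prime_sums` so it does not shadow `solution`):

```python
from itertools import combinations
from math import isqrt

def is_prime(n):
    # Trial division up to sqrt(n); sums of three distinct inputs are >= 3.
    return n > 1 and all(n % d for d in range(2, isqrt(n) + 1))

def count_prime_sums(nums):
    # Count every 3-element combination whose sum is prime.
    return sum(is_prime(a + b + c) for a, b, c in combinations(nums, 3))
```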
'''
Test 1 〉 passed (3.45ms, 10.2MB)
Test 2 〉 passed (4.57ms, 10.2MB)
Test 3 〉 passed (0.88ms, 10.2MB)
Test 4 〉 passed (0.74ms, 10.2MB)
Test 5 〉 passed (3.00ms, 10.3MB)
Test 6 〉 passed (4.09ms, 10.3MB)
Test 7 〉 passed (0.26ms, 10.1MB)
Test 8 〉 passed (10.36ms, 10.2MB)
Test 9 〉 passed (1.19ms, 10.3MB)
Test 10 〉 passed (9.91ms, 10.3MB)
Test 11 〉 passed (0.12ms, 10.1MB)
Test 12 〉 passed (0.07ms, 10.2MB)
Test 13 〉 passed (0.16ms, 10.2MB)
Test 14 〉 passed (0.06ms, 10.3MB)
Test 15 〉 passed (0.06ms, 10.1MB)
Test 16 〉 passed (11.10ms, 10.2MB)
Test 17 〉 passed (9.69ms, 10.3MB)
Test 18 〉 passed (0.14ms, 10.2MB)
Test 19 〉 passed (0.01ms, 10.2MB)
Test 20 〉 passed (15.05ms, 10.3MB)
Test 21 〉 passed (14.49ms, 10.2MB)
Test 22 〉 passed (2.29ms, 10.2MB)
Test 23 〉 passed (0.01ms, 10.3MB)
Test 24 〉 passed (11.77ms, 10.2MB)
Test 25 〉 passed (11.82ms, 10.2MB)
Test 26 〉 passed (0.01ms, 10.2MB)
'''
# --- java/java2py/java2python-0.5.1/java2python/lang/JavaLexer.py (DanielMabadeje/Artificial-Intelligence-Deep-Learning-Machine-Learning-Tutorials, Apache-2.0) ---
# $ANTLR 3.1.3 Mar 18, 2009 10:09:25 Java.g 2012-01-29 13:54:05
import sys
from antlr3 import *
from antlr3.compat import set, frozenset
# for convenience in actions
HIDDEN = BaseRecognizer.HIDDEN
# token types
PACKAGE=84
EXPONENT=173
STAR=49
WHILE=103
MOD=32
MOD_ASSIGN=33
CASE=58
CHAR=60
NEW=82
DO=64
GENERIC_TYPE_PARAM_LIST=138
CLASS_INSTANCE_INITIALIZER=121
ARRAY_ELEMENT_ACCESS=115
FOR_CONDITION=129
NOT=34
VAR_DECLARATION=160
ANNOTATION_METHOD_DECL=109
EOF=-1
DIV_ASSIGN=14
BREAK=56
LOGICAL_AND=26
BIT_SHIFT_RIGHT_ASSIGN=9
UNARY_PLUS=159
TYPE=157
FINAL=70
INC=21
RPAREN=43
IMPORT=78
STRING_LITERAL=170
FOR_UPDATE=132
FLOATING_POINT_LITERAL=168
CAST_EXPR=118
NOT_EQUAL=35
VOID_METHOD_DECL=163
RETURN=88
THIS=95
DOUBLE=65
VOID=101
ENUM_TOP_LEVEL_SCOPE=125
SUPER=92
COMMENT=181
ANNOTATION_INIT_KEY_LIST=107
JAVA_ID_START=178
FLOAT_TYPE_SUFFIX=174
PRE_DEC=149
RBRACK=41
IMPLEMENTS_CLAUSE=140
SWITCH_BLOCK_LABEL_LIST=154
LINE_COMMENT=182
PRIVATE=85
STATIC=90
BLOCK_SCOPE=117
ANNOTATION_INIT_DEFAULT_KEY=106
SWITCH=93
NULL=83
VAR_DECLARATOR=161
MINUS_ASSIGN=31
ELSE=66
STRICTFP=91
CHARACTER_LITERAL=169
PRE_INC=150
ANNOTATION_LIST=108
ELLIPSIS=17
NATIVE=81
OCTAL_ESCAPE=177
UNARY_MINUS=158
THROWS=97
LCURLY=23
INT=79
FORMAL_PARAM_VARARG_DECL=135
METHOD_CALL=144
ASSERT=54
TRY=100
INTERFACE_TOP_LEVEL_SCOPE=139
SHIFT_LEFT=45
WS=180
SHIFT_RIGHT=47
FORMAL_PARAM_STD_DECL=134
LOCAL_MODIFIER_LIST=142
OR=36
LESS_THAN=25
SHIFT_RIGHT_ASSIGN=48
EXTENDS_BOUND_LIST=127
JAVA_SOURCE=143
CATCH=59
FALSE=69
INTEGER_TYPE_SUFFIX=172
DECIMAL_LITERAL=167
THROW=96
FOR_INIT=131
PROTECTED=86
DEC=12
CLASS=61
LBRACK=22
BIT_SHIFT_RIGHT=8
THROWS_CLAUSE=156
GREATER_OR_EQUAL=19
FOR=73
LOGICAL_NOT=27
THIS_CONSTRUCTOR_CALL=155
FLOAT=72
JAVADOC_COMMENT=183
ABSTRACT=53
AND=4
POST_DEC=147
AND_ASSIGN=5
ANNOTATION_SCOPE=110
MODIFIER_LIST=145
STATIC_ARRAY_CREATOR=152
LPAREN=29
IF=74
AT=7
CONSTRUCTOR_DECL=124
ESCAPE_SEQUENCE=175
LABELED_STATEMENT=141
UNICODE_ESCAPE=176
BOOLEAN=55
SYNCHRONIZED=94
EXPR=126
CLASS_TOP_LEVEL_SCOPE=123
IMPLEMENTS=75
CONTINUE=62
COMMA=11
TRANSIENT=98
XOR_ASSIGN=52
EQUAL=18
LOGICAL_OR=28
ARGUMENT_LIST=112
QUALIFIED_TYPE_IDENT=151
IDENT=164
PLUS=38
ANNOTATION_INIT_BLOCK=105
HEX_LITERAL=165
DOT=15
SHIFT_LEFT_ASSIGN=46
FORMAL_PARAM_LIST=133
GENERIC_TYPE_ARG_LIST=137
DOTSTAR=16
ANNOTATION_TOP_LEVEL_SCOPE=111
BYTE=57
XOR=51
JAVA_ID_PART=179
GREATER_THAN=20
VOLATILE=102
PARENTESIZED_EXPR=146
LESS_OR_EQUAL=24
ARRAY_DECLARATOR_LIST=114
CLASS_STATIC_INITIALIZER=122
DEFAULT=63
OCTAL_LITERAL=166
HEX_DIGIT=171
SHORT=89
INSTANCEOF=76
MINUS=30
SEMI=44
TRUE=99
EXTENDS_CLAUSE=128
STAR_ASSIGN=50
VAR_DECLARATOR_LIST=162
COLON=10
ARRAY_DECLARATOR=113
OR_ASSIGN=37
ENUM=67
QUESTION=40
FINALLY=71
RCURLY=42
ASSIGN=6
PLUS_ASSIGN=39
ANNOTATION_INIT_ARRAY_ELEMENT=104
FUNCTION_METHOD_DECL=136
INTERFACE=77
DIV=13
POST_INC=148
LONG=80
CLASS_CONSTRUCTOR_CALL=120
PUBLIC=87
EXTENDS=68
FOR_EACH=130
ARRAY_INITIALIZER=116
CATCH_CLAUSE_LIST=119
SUPER_CONSTRUCTOR_CALL=153
class JavaLexer(Lexer):
grammarFileName = "Java.g"
antlr_version = version_str_to_tuple("3.1.3 Mar 18, 2009 10:09:25")
antlr_version_str = "3.1.3 Mar 18, 2009 10:09:25"
def __init__(self, input=None, state=None):
if state is None:
state = RecognizerSharedState()
super(JavaLexer, self).__init__(input, state)
self.dfa29 = self.DFA29(
self, 29,
eot = self.DFA29_eot,
eof = self.DFA29_eof,
min = self.DFA29_min,
max = self.DFA29_max,
accept = self.DFA29_accept,
special = self.DFA29_special,
transition = self.DFA29_transition
)
# $ANTLR start "AND"
def mAND(self, ):
try:
_type = AND
_channel = DEFAULT_CHANNEL
# Java.g:7:5: ( '&' )
# Java.g:7:7: '&'
pass
self.match(38)
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "AND"
# $ANTLR start "AND_ASSIGN"
def mAND_ASSIGN(self, ):
try:
_type = AND_ASSIGN
_channel = DEFAULT_CHANNEL
# Java.g:8:12: ( '&=' )
# Java.g:8:14: '&='
pass
self.match("&=")
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "AND_ASSIGN"
# $ANTLR start "ASSIGN"
def mASSIGN(self, ):
try:
_type = ASSIGN
_channel = DEFAULT_CHANNEL
# Java.g:9:8: ( '=' )
# Java.g:9:10: '='
pass
self.match(61)
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "ASSIGN"
# $ANTLR start "AT"
def mAT(self, ):
try:
_type = AT
_channel = DEFAULT_CHANNEL
# Java.g:10:4: ( '@' )
# Java.g:10:6: '@'
pass
self.match(64)
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "AT"
# $ANTLR start "BIT_SHIFT_RIGHT"
def mBIT_SHIFT_RIGHT(self, ):
try:
_type = BIT_SHIFT_RIGHT
_channel = DEFAULT_CHANNEL
# Java.g:11:17: ( '>>>' )
# Java.g:11:19: '>>>'
pass
self.match(">>>")
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "BIT_SHIFT_RIGHT"
# $ANTLR start "BIT_SHIFT_RIGHT_ASSIGN"
def mBIT_SHIFT_RIGHT_ASSIGN(self, ):
try:
_type = BIT_SHIFT_RIGHT_ASSIGN
_channel = DEFAULT_CHANNEL
# Java.g:12:24: ( '>>>=' )
# Java.g:12:26: '>>>='
pass
self.match(">>>=")
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "BIT_SHIFT_RIGHT_ASSIGN"
# $ANTLR start "COLON"
def mCOLON(self, ):
try:
_type = COLON
_channel = DEFAULT_CHANNEL
# Java.g:13:7: ( ':' )
# Java.g:13:9: ':'
pass
self.match(58)
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "COLON"
# $ANTLR start "COMMA"
def mCOMMA(self, ):
try:
_type = COMMA
_channel = DEFAULT_CHANNEL
# Java.g:14:7: ( ',' )
# Java.g:14:9: ','
pass
self.match(44)
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "COMMA"
# $ANTLR start "DEC"
def mDEC(self, ):
try:
_type = DEC
_channel = DEFAULT_CHANNEL
# Java.g:15:5: ( '--' )
# Java.g:15:7: '--'
pass
self.match("--")
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "DEC"
# $ANTLR start "DIV"
def mDIV(self, ):
try:
_type = DIV
_channel = DEFAULT_CHANNEL
# Java.g:16:5: ( '/' )
# Java.g:16:7: '/'
pass
self.match(47)
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "DIV"
# $ANTLR start "DIV_ASSIGN"
def mDIV_ASSIGN(self, ):
try:
_type = DIV_ASSIGN
_channel = DEFAULT_CHANNEL
# Java.g:17:12: ( '/=' )
# Java.g:17:14: '/='
pass
self.match("/=")
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "DIV_ASSIGN"
# $ANTLR start "DOT"
def mDOT(self, ):
try:
_type = DOT
_channel = DEFAULT_CHANNEL
# Java.g:18:5: ( '.' )
# Java.g:18:7: '.'
pass
self.match(46)
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "DOT"
# $ANTLR start "DOTSTAR"
def mDOTSTAR(self, ):
try:
_type = DOTSTAR
_channel = DEFAULT_CHANNEL
# Java.g:19:9: ( '.*' )
# Java.g:19:11: '.*'
pass
self.match(".*")
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "DOTSTAR"
# $ANTLR start "ELLIPSIS"
def mELLIPSIS(self, ):
try:
_type = ELLIPSIS
_channel = DEFAULT_CHANNEL
# Java.g:20:10: ( '...' )
# Java.g:20:12: '...'
pass
self.match("...")
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "ELLIPSIS"
# $ANTLR start "EQUAL"
def mEQUAL(self, ):
try:
_type = EQUAL
_channel = DEFAULT_CHANNEL
# Java.g:21:7: ( '==' )
# Java.g:21:9: '=='
pass
self.match("==")
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "EQUAL"
# $ANTLR start "GREATER_OR_EQUAL"
def mGREATER_OR_EQUAL(self, ):
try:
_type = GREATER_OR_EQUAL
_channel = DEFAULT_CHANNEL
# Java.g:22:18: ( '>=' )
# Java.g:22:20: '>='
pass
self.match(">=")
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "GREATER_OR_EQUAL"
# $ANTLR start "GREATER_THAN"
def mGREATER_THAN(self, ):
try:
_type = GREATER_THAN
_channel = DEFAULT_CHANNEL
# Java.g:23:14: ( '>' )
# Java.g:23:16: '>'
pass
self.match(62)
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "GREATER_THAN"
# $ANTLR start "INC"
def mINC(self, ):
try:
_type = INC
_channel = DEFAULT_CHANNEL
# Java.g:24:5: ( '++' )
# Java.g:24:7: '++'
pass
self.match("++")
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "INC"
# $ANTLR start "LBRACK"
def mLBRACK(self, ):
try:
_type = LBRACK
_channel = DEFAULT_CHANNEL
# Java.g:25:8: ( '[' )
# Java.g:25:10: '['
pass
self.match(91)
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "LBRACK"
# $ANTLR start "LCURLY"
def mLCURLY(self, ):
try:
_type = LCURLY
_channel = DEFAULT_CHANNEL
# Java.g:26:8: ( '{' )
# Java.g:26:10: '{'
pass
self.match(123)
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "LCURLY"
# $ANTLR start "LESS_OR_EQUAL"
def mLESS_OR_EQUAL(self, ):
try:
_type = LESS_OR_EQUAL
_channel = DEFAULT_CHANNEL
# Java.g:27:15: ( '<=' )
# Java.g:27:17: '<='
pass
self.match("<=")
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "LESS_OR_EQUAL"
# $ANTLR start "LESS_THAN"
def mLESS_THAN(self, ):
try:
_type = LESS_THAN
_channel = DEFAULT_CHANNEL
# Java.g:28:11: ( '<' )
# Java.g:28:13: '<'
pass
self.match(60)
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "LESS_THAN"
# $ANTLR start "LOGICAL_AND"
def mLOGICAL_AND(self, ):
try:
_type = LOGICAL_AND
_channel = DEFAULT_CHANNEL
# Java.g:29:13: ( '&&' )
# Java.g:29:15: '&&'
pass
self.match("&&")
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "LOGICAL_AND"
# $ANTLR start "LOGICAL_NOT"
def mLOGICAL_NOT(self, ):
try:
_type = LOGICAL_NOT
_channel = DEFAULT_CHANNEL
# Java.g:30:13: ( '!' )
# Java.g:30:15: '!'
pass
self.match(33)
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "LOGICAL_NOT"
# $ANTLR start "LOGICAL_OR"
def mLOGICAL_OR(self, ):
try:
_type = LOGICAL_OR
_channel = DEFAULT_CHANNEL
# Java.g:31:12: ( '||' )
# Java.g:31:14: '||'
pass
self.match("||")
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "LOGICAL_OR"
# $ANTLR start "LPAREN"
def mLPAREN(self, ):
try:
_type = LPAREN
_channel = DEFAULT_CHANNEL
# Java.g:32:8: ( '(' )
# Java.g:32:10: '('
pass
self.match(40)
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "LPAREN"
# $ANTLR start "MINUS"
def mMINUS(self, ):
try:
_type = MINUS
_channel = DEFAULT_CHANNEL
# Java.g:33:7: ( '-' )
# Java.g:33:9: '-'
pass
self.match(45)
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "MINUS"
# $ANTLR start "MINUS_ASSIGN"
def mMINUS_ASSIGN(self, ):
try:
_type = MINUS_ASSIGN
_channel = DEFAULT_CHANNEL
# Java.g:34:14: ( '-=' )
# Java.g:34:16: '-='
pass
self.match("-=")
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "MINUS_ASSIGN"
# $ANTLR start "MOD"
def mMOD(self, ):
try:
_type = MOD
_channel = DEFAULT_CHANNEL
# Java.g:35:5: ( '%' )
# Java.g:35:7: '%'
pass
self.match(37)
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "MOD"
# $ANTLR start "MOD_ASSIGN"
def mMOD_ASSIGN(self, ):
try:
_type = MOD_ASSIGN
_channel = DEFAULT_CHANNEL
# Java.g:36:12: ( '%=' )
# Java.g:36:14: '%='
pass
self.match("%=")
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "MOD_ASSIGN"
# $ANTLR start "NOT"
def mNOT(self, ):
try:
_type = NOT
_channel = DEFAULT_CHANNEL
# Java.g:37:5: ( '~' )
# Java.g:37:7: '~'
pass
self.match(126)
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "NOT"
# $ANTLR start "NOT_EQUAL"
def mNOT_EQUAL(self, ):
try:
_type = NOT_EQUAL
_channel = DEFAULT_CHANNEL
# Java.g:38:11: ( '!=' )
# Java.g:38:13: '!='
pass
self.match("!=")
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "NOT_EQUAL"
# $ANTLR start "OR"
def mOR(self, ):
try:
_type = OR
_channel = DEFAULT_CHANNEL
# Java.g:39:4: ( '|' )
# Java.g:39:6: '|'
pass
self.match(124)
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "OR"
# $ANTLR start "OR_ASSIGN"
def mOR_ASSIGN(self, ):
try:
_type = OR_ASSIGN
_channel = DEFAULT_CHANNEL
# Java.g:40:11: ( '|=' )
# Java.g:40:13: '|='
pass
self.match("|=")
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "OR_ASSIGN"
# $ANTLR start "PLUS"
def mPLUS(self, ):
try:
_type = PLUS
_channel = DEFAULT_CHANNEL
# Java.g:41:6: ( '+' )
# Java.g:41:8: '+'
pass
self.match(43)
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "PLUS"
# $ANTLR start "PLUS_ASSIGN"
def mPLUS_ASSIGN(self, ):
try:
_type = PLUS_ASSIGN
_channel = DEFAULT_CHANNEL
# Java.g:42:13: ( '+=' )
# Java.g:42:15: '+='
pass
self.match("+=")
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "PLUS_ASSIGN"
# $ANTLR start "QUESTION"
def mQUESTION(self, ):
try:
_type = QUESTION
_channel = DEFAULT_CHANNEL
# Java.g:43:10: ( '?' )
# Java.g:43:12: '?'
pass
self.match(63)
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "QUESTION"
# $ANTLR start "RBRACK"
def mRBRACK(self, ):
try:
_type = RBRACK
_channel = DEFAULT_CHANNEL
# Java.g:44:8: ( ']' )
# Java.g:44:10: ']'
pass
self.match(93)
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "RBRACK"
# $ANTLR start "RCURLY"
def mRCURLY(self, ):
try:
_type = RCURLY
_channel = DEFAULT_CHANNEL
# Java.g:45:8: ( '}' )
# Java.g:45:10: '}'
pass
self.match(125)
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "RCURLY"
# $ANTLR start "RPAREN"
def mRPAREN(self, ):
try:
_type = RPAREN
_channel = DEFAULT_CHANNEL
# Java.g:46:8: ( ')' )
# Java.g:46:10: ')'
pass
self.match(41)
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "RPAREN"
# $ANTLR start "SEMI"
def mSEMI(self, ):
try:
_type = SEMI
_channel = DEFAULT_CHANNEL
# Java.g:47:6: ( ';' )
# Java.g:47:8: ';'
pass
self.match(59)
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "SEMI"
# $ANTLR start "SHIFT_LEFT"
def mSHIFT_LEFT(self, ):
try:
_type = SHIFT_LEFT
_channel = DEFAULT_CHANNEL
# Java.g:48:12: ( '<<' )
# Java.g:48:14: '<<'
pass
self.match("<<")
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "SHIFT_LEFT"
# $ANTLR start "SHIFT_LEFT_ASSIGN"
def mSHIFT_LEFT_ASSIGN(self, ):
try:
_type = SHIFT_LEFT_ASSIGN
_channel = DEFAULT_CHANNEL
# Java.g:49:19: ( '<<=' )
# Java.g:49:21: '<<='
pass
self.match("<<=")
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "SHIFT_LEFT_ASSIGN"
# $ANTLR start "SHIFT_RIGHT"
def mSHIFT_RIGHT(self, ):
try:
_type = SHIFT_RIGHT
_channel = DEFAULT_CHANNEL
# Java.g:50:13: ( '>>' )
# Java.g:50:15: '>>'
pass
self.match(">>")
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "SHIFT_RIGHT"
# $ANTLR start "SHIFT_RIGHT_ASSIGN"
def mSHIFT_RIGHT_ASSIGN(self, ):
try:
_type = SHIFT_RIGHT_ASSIGN
_channel = DEFAULT_CHANNEL
# Java.g:51:20: ( '>>=' )
# Java.g:51:22: '>>='
pass
self.match(">>=")
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "SHIFT_RIGHT_ASSIGN"
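The `>`-family rules above (GREATER_THAN, GREATER_OR_EQUAL, SHIFT_RIGHT, SHIFT_RIGHT_ASSIGN, BIT_SHIFT_RIGHT, BIT_SHIFT_RIGHT_ASSIGN) only disambiguate because the generated DFA prefers the longest match. A stdlib-only sketch of that longest-match rule (illustrative; this is not the ANTLR runtime, and the names here are made up):

```python
# Candidates ordered longest-first, so '>>>=' beats '>>>', '>>=', '>>', '>=', '>'.
GT_OPERATORS = (">>>=", ">>>", ">>=", ">>", ">=", ">")

def match_gt_operator(text, pos=0):
    # Return the longest '>'-family operator starting at pos, else None.
    for op in GT_OPERATORS:
        if text.startswith(op, pos):
            return op
    return None
```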
# $ANTLR start "STAR"
def mSTAR(self, ):
try:
_type = STAR
_channel = DEFAULT_CHANNEL
# Java.g:52:6: ( '*' )
# Java.g:52:8: '*'
pass
self.match(42)
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "STAR"
# $ANTLR start "STAR_ASSIGN"
def mSTAR_ASSIGN(self, ):
try:
_type = STAR_ASSIGN
_channel = DEFAULT_CHANNEL
# Java.g:53:13: ( '*=' )
# Java.g:53:15: '*='
pass
self.match("*=")
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "STAR_ASSIGN"
# $ANTLR start "XOR"
def mXOR(self, ):
try:
_type = XOR
_channel = DEFAULT_CHANNEL
# Java.g:54:5: ( '^' )
# Java.g:54:7: '^'
pass
self.match(94)
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "XOR"
# $ANTLR start "XOR_ASSIGN"
def mXOR_ASSIGN(self, ):
try:
_type = XOR_ASSIGN
_channel = DEFAULT_CHANNEL
# Java.g:55:12: ( '^=' )
# Java.g:55:14: '^='
pass
self.match("^=")
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "XOR_ASSIGN"
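The single-character token rules above call `self.match()` with a raw code point, while multi-character operators pass a string. As a quick sketch (not part of the generated lexer), the code points used above decode as follows:

```python
# Code points passed to self.match() by the single-character rules above.
# Each pair below is certain: chr() maps the integer to the matched character.
for code in (125, 41, 59, 42, 94):
    print(code, repr(chr(code)))  # 125 -> '}', 41 -> ')', 59 -> ';', 42 -> '*', 94 -> '^'
```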
    # $ANTLR start "ABSTRACT"
    def mABSTRACT(self, ):
        try:
            _type = ABSTRACT
            _channel = DEFAULT_CHANNEL
            # Java.g:56:10: ( 'abstract' )
            # Java.g:56:12: 'abstract'
            pass
            self.match("abstract")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "ABSTRACT"

    # $ANTLR start "ASSERT"
    def mASSERT(self, ):
        try:
            _type = ASSERT
            _channel = DEFAULT_CHANNEL
            # Java.g:57:8: ( 'assert' )
            # Java.g:57:10: 'assert'
            pass
            self.match("assert")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "ASSERT"

    # $ANTLR start "BOOLEAN"
    def mBOOLEAN(self, ):
        try:
            _type = BOOLEAN
            _channel = DEFAULT_CHANNEL
            # Java.g:58:9: ( 'boolean' )
            # Java.g:58:11: 'boolean'
            pass
            self.match("boolean")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "BOOLEAN"

    # $ANTLR start "BREAK"
    def mBREAK(self, ):
        try:
            _type = BREAK
            _channel = DEFAULT_CHANNEL
            # Java.g:59:7: ( 'break' )
            # Java.g:59:9: 'break'
            pass
            self.match("break")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "BREAK"

    # $ANTLR start "BYTE"
    def mBYTE(self, ):
        try:
            _type = BYTE
            _channel = DEFAULT_CHANNEL
            # Java.g:60:6: ( 'byte' )
            # Java.g:60:8: 'byte'
            pass
            self.match("byte")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "BYTE"

    # $ANTLR start "CASE"
    def mCASE(self, ):
        try:
            _type = CASE
            _channel = DEFAULT_CHANNEL
            # Java.g:61:6: ( 'case' )
            # Java.g:61:8: 'case'
            pass
            self.match("case")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "CASE"

    # $ANTLR start "CATCH"
    def mCATCH(self, ):
        try:
            _type = CATCH
            _channel = DEFAULT_CHANNEL
            # Java.g:62:7: ( 'catch' )
            # Java.g:62:9: 'catch'
            pass
            self.match("catch")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "CATCH"

    # $ANTLR start "CHAR"
    def mCHAR(self, ):
        try:
            _type = CHAR
            _channel = DEFAULT_CHANNEL
            # Java.g:63:6: ( 'char' )
            # Java.g:63:8: 'char'
            pass
            self.match("char")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "CHAR"

    # $ANTLR start "CLASS"
    def mCLASS(self, ):
        try:
            _type = CLASS
            _channel = DEFAULT_CHANNEL
            # Java.g:64:7: ( 'class' )
            # Java.g:64:9: 'class'
            pass
            self.match("class")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "CLASS"

    # $ANTLR start "CONTINUE"
    def mCONTINUE(self, ):
        try:
            _type = CONTINUE
            _channel = DEFAULT_CHANNEL
            # Java.g:65:10: ( 'continue' )
            # Java.g:65:12: 'continue'
            pass
            self.match("continue")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "CONTINUE"

    # $ANTLR start "DEFAULT"
    def mDEFAULT(self, ):
        try:
            _type = DEFAULT
            _channel = DEFAULT_CHANNEL
            # Java.g:66:9: ( 'default' )
            # Java.g:66:11: 'default'
            pass
            self.match("default")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "DEFAULT"

    # $ANTLR start "DO"
    def mDO(self, ):
        try:
            _type = DO
            _channel = DEFAULT_CHANNEL
            # Java.g:67:4: ( 'do' )
            # Java.g:67:6: 'do'
            pass
            self.match("do")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "DO"

    # $ANTLR start "DOUBLE"
    def mDOUBLE(self, ):
        try:
            _type = DOUBLE
            _channel = DEFAULT_CHANNEL
            # Java.g:68:8: ( 'double' )
            # Java.g:68:10: 'double'
            pass
            self.match("double")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "DOUBLE"

    # $ANTLR start "ELSE"
    def mELSE(self, ):
        try:
            _type = ELSE
            _channel = DEFAULT_CHANNEL
            # Java.g:69:6: ( 'else' )
            # Java.g:69:8: 'else'
            pass
            self.match("else")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "ELSE"

    # $ANTLR start "ENUM"
    def mENUM(self, ):
        try:
            _type = ENUM
            _channel = DEFAULT_CHANNEL
            # Java.g:70:6: ( 'enum' )
            # Java.g:70:8: 'enum'
            pass
            self.match("enum")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "ENUM"

    # $ANTLR start "EXTENDS"
    def mEXTENDS(self, ):
        try:
            _type = EXTENDS
            _channel = DEFAULT_CHANNEL
            # Java.g:71:9: ( 'extends' )
            # Java.g:71:11: 'extends'
            pass
            self.match("extends")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "EXTENDS"

    # $ANTLR start "FALSE"
    def mFALSE(self, ):
        try:
            _type = FALSE
            _channel = DEFAULT_CHANNEL
            # Java.g:72:7: ( 'false' )
            # Java.g:72:9: 'false'
            pass
            self.match("false")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "FALSE"

    # $ANTLR start "FINAL"
    def mFINAL(self, ):
        try:
            _type = FINAL
            _channel = DEFAULT_CHANNEL
            # Java.g:73:7: ( 'final' )
            # Java.g:73:9: 'final'
            pass
            self.match("final")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "FINAL"

    # $ANTLR start "FINALLY"
    def mFINALLY(self, ):
        try:
            _type = FINALLY
            _channel = DEFAULT_CHANNEL
            # Java.g:74:9: ( 'finally' )
            # Java.g:74:11: 'finally'
            pass
            self.match("finally")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "FINALLY"

    # $ANTLR start "FLOAT"
    def mFLOAT(self, ):
        try:
            _type = FLOAT
            _channel = DEFAULT_CHANNEL
            # Java.g:75:7: ( 'float' )
            # Java.g:75:9: 'float'
            pass
            self.match("float")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "FLOAT"

    # $ANTLR start "FOR"
    def mFOR(self, ):
        try:
            _type = FOR
            _channel = DEFAULT_CHANNEL
            # Java.g:76:5: ( 'for' )
            # Java.g:76:7: 'for'
            pass
            self.match("for")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "FOR"

    # $ANTLR start "IF"
    def mIF(self, ):
        try:
            _type = IF
            _channel = DEFAULT_CHANNEL
            # Java.g:77:4: ( 'if' )
            # Java.g:77:6: 'if'
            pass
            self.match("if")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "IF"

    # $ANTLR start "IMPLEMENTS"
    def mIMPLEMENTS(self, ):
        try:
            _type = IMPLEMENTS
            _channel = DEFAULT_CHANNEL
            # Java.g:78:12: ( 'implements' )
            # Java.g:78:14: 'implements'
            pass
            self.match("implements")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "IMPLEMENTS"

    # $ANTLR start "INSTANCEOF"
    def mINSTANCEOF(self, ):
        try:
            _type = INSTANCEOF
            _channel = DEFAULT_CHANNEL
            # Java.g:79:12: ( 'instanceof' )
            # Java.g:79:14: 'instanceof'
            pass
            self.match("instanceof")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "INSTANCEOF"

    # $ANTLR start "INTERFACE"
    def mINTERFACE(self, ):
        try:
            _type = INTERFACE
            _channel = DEFAULT_CHANNEL
            # Java.g:80:11: ( 'interface' )
            # Java.g:80:13: 'interface'
            pass
            self.match("interface")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "INTERFACE"

    # $ANTLR start "IMPORT"
    def mIMPORT(self, ):
        try:
            _type = IMPORT
            _channel = DEFAULT_CHANNEL
            # Java.g:81:8: ( 'import' )
            # Java.g:81:10: 'import'
            pass
            self.match("import")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "IMPORT"

    # $ANTLR start "INT"
    def mINT(self, ):
        try:
            _type = INT
            _channel = DEFAULT_CHANNEL
            # Java.g:82:5: ( 'int' )
            # Java.g:82:7: 'int'
            pass
            self.match("int")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "INT"

    # $ANTLR start "LONG"
    def mLONG(self, ):
        try:
            _type = LONG
            _channel = DEFAULT_CHANNEL
            # Java.g:83:6: ( 'long' )
            # Java.g:83:8: 'long'
            pass
            self.match("long")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "LONG"

    # $ANTLR start "NATIVE"
    def mNATIVE(self, ):
        try:
            _type = NATIVE
            _channel = DEFAULT_CHANNEL
            # Java.g:84:8: ( 'native' )
            # Java.g:84:10: 'native'
            pass
            self.match("native")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "NATIVE"

    # $ANTLR start "NEW"
    def mNEW(self, ):
        try:
            _type = NEW
            _channel = DEFAULT_CHANNEL
            # Java.g:85:5: ( 'new' )
            # Java.g:85:7: 'new'
            pass
            self.match("new")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "NEW"

    # $ANTLR start "NULL"
    def mNULL(self, ):
        try:
            _type = NULL
            _channel = DEFAULT_CHANNEL
            # Java.g:86:6: ( 'null' )
            # Java.g:86:8: 'null'
            pass
            self.match("null")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "NULL"

    # $ANTLR start "PACKAGE"
    def mPACKAGE(self, ):
        try:
            _type = PACKAGE
            _channel = DEFAULT_CHANNEL
            # Java.g:87:9: ( 'package' )
            # Java.g:87:11: 'package'
            pass
            self.match("package")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "PACKAGE"

    # $ANTLR start "PRIVATE"
    def mPRIVATE(self, ):
        try:
            _type = PRIVATE
            _channel = DEFAULT_CHANNEL
            # Java.g:88:9: ( 'private' )
            # Java.g:88:11: 'private'
            pass
            self.match("private")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "PRIVATE"

    # $ANTLR start "PROTECTED"
    def mPROTECTED(self, ):
        try:
            _type = PROTECTED
            _channel = DEFAULT_CHANNEL
            # Java.g:89:11: ( 'protected' )
            # Java.g:89:13: 'protected'
            pass
            self.match("protected")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "PROTECTED"

    # $ANTLR start "PUBLIC"
    def mPUBLIC(self, ):
        try:
            _type = PUBLIC
            _channel = DEFAULT_CHANNEL
            # Java.g:90:8: ( 'public' )
            # Java.g:90:10: 'public'
            pass
            self.match("public")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "PUBLIC"

    # $ANTLR start "RETURN"
    def mRETURN(self, ):
        try:
            _type = RETURN
            _channel = DEFAULT_CHANNEL
            # Java.g:91:8: ( 'return' )
            # Java.g:91:10: 'return'
            pass
            self.match("return")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "RETURN"

    # $ANTLR start "SHORT"
    def mSHORT(self, ):
        try:
            _type = SHORT
            _channel = DEFAULT_CHANNEL
            # Java.g:92:7: ( 'short' )
            # Java.g:92:9: 'short'
            pass
            self.match("short")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "SHORT"

    # $ANTLR start "STATIC"
    def mSTATIC(self, ):
        try:
            _type = STATIC
            _channel = DEFAULT_CHANNEL
            # Java.g:93:8: ( 'static' )
            # Java.g:93:10: 'static'
            pass
            self.match("static")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "STATIC"

    # $ANTLR start "STRICTFP"
    def mSTRICTFP(self, ):
        try:
            _type = STRICTFP
            _channel = DEFAULT_CHANNEL
            # Java.g:94:10: ( 'strictfp' )
            # Java.g:94:12: 'strictfp'
            pass
            self.match("strictfp")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "STRICTFP"

    # $ANTLR start "SUPER"
    def mSUPER(self, ):
        try:
            _type = SUPER
            _channel = DEFAULT_CHANNEL
            # Java.g:95:7: ( 'super' )
            # Java.g:95:9: 'super'
            pass
            self.match("super")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "SUPER"

    # $ANTLR start "SWITCH"
    def mSWITCH(self, ):
        try:
            _type = SWITCH
            _channel = DEFAULT_CHANNEL
            # Java.g:96:8: ( 'switch' )
            # Java.g:96:10: 'switch'
            pass
            self.match("switch")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "SWITCH"

    # $ANTLR start "SYNCHRONIZED"
    def mSYNCHRONIZED(self, ):
        try:
            _type = SYNCHRONIZED
            _channel = DEFAULT_CHANNEL
            # Java.g:97:14: ( 'synchronized' )
            # Java.g:97:16: 'synchronized'
            pass
            self.match("synchronized")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "SYNCHRONIZED"

    # $ANTLR start "THIS"
    def mTHIS(self, ):
        try:
            _type = THIS
            _channel = DEFAULT_CHANNEL
            # Java.g:98:6: ( 'this' )
            # Java.g:98:8: 'this'
            pass
            self.match("this")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "THIS"

    # $ANTLR start "THROW"
    def mTHROW(self, ):
        try:
            _type = THROW
            _channel = DEFAULT_CHANNEL
            # Java.g:99:7: ( 'throw' )
            # Java.g:99:9: 'throw'
            pass
            self.match("throw")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "THROW"

    # $ANTLR start "THROWS"
    def mTHROWS(self, ):
        try:
            _type = THROWS
            _channel = DEFAULT_CHANNEL
            # Java.g:100:8: ( 'throws' )
            # Java.g:100:10: 'throws'
            pass
            self.match("throws")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "THROWS"

    # $ANTLR start "TRANSIENT"
    def mTRANSIENT(self, ):
        try:
            _type = TRANSIENT
            _channel = DEFAULT_CHANNEL
            # Java.g:101:11: ( 'transient' )
            # Java.g:101:13: 'transient'
            pass
            self.match("transient")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "TRANSIENT"

    # $ANTLR start "TRUE"
    def mTRUE(self, ):
        try:
            _type = TRUE
            _channel = DEFAULT_CHANNEL
            # Java.g:102:6: ( 'true' )
            # Java.g:102:8: 'true'
            pass
            self.match("true")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "TRUE"

    # $ANTLR start "TRY"
    def mTRY(self, ):
        try:
            _type = TRY
            _channel = DEFAULT_CHANNEL
            # Java.g:103:5: ( 'try' )
            # Java.g:103:7: 'try'
            pass
            self.match("try")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "TRY"

    # $ANTLR start "VOID"
    def mVOID(self, ):
        try:
            _type = VOID
            _channel = DEFAULT_CHANNEL
            # Java.g:104:6: ( 'void' )
            # Java.g:104:8: 'void'
            pass
            self.match("void")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "VOID"

    # $ANTLR start "VOLATILE"
    def mVOLATILE(self, ):
        try:
            _type = VOLATILE
            _channel = DEFAULT_CHANNEL
            # Java.g:105:10: ( 'volatile' )
            # Java.g:105:12: 'volatile'
            pass
            self.match("volatile")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "VOLATILE"

    # $ANTLR start "WHILE"
    def mWHILE(self, ):
        try:
            _type = WHILE
            _channel = DEFAULT_CHANNEL
            # Java.g:106:7: ( 'while' )
            # Java.g:106:9: 'while'
            pass
            self.match("while")
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "WHILE"

    # $ANTLR start "HEX_LITERAL"
    def mHEX_LITERAL(self, ):
        try:
            _type = HEX_LITERAL
            _channel = DEFAULT_CHANNEL
            # Java.g:985:13: ( '0' ( 'x' | 'X' ) ( HEX_DIGIT )+ ( INTEGER_TYPE_SUFFIX )? )
            # Java.g:985:15: '0' ( 'x' | 'X' ) ( HEX_DIGIT )+ ( INTEGER_TYPE_SUFFIX )?
            pass
            self.match(48)
            if self.input.LA(1) == 88 or self.input.LA(1) == 120:
                self.input.consume()
            else:
                mse = MismatchedSetException(None, self.input)
                self.recover(mse)
                raise mse
            # Java.g:985:29: ( HEX_DIGIT )+
            cnt1 = 0
            while True: #loop1
                alt1 = 2
                LA1_0 = self.input.LA(1)
                if ((48 <= LA1_0 <= 57) or (65 <= LA1_0 <= 70) or (97 <= LA1_0 <= 102)) :
                    alt1 = 1
                if alt1 == 1:
                    # Java.g:985:29: HEX_DIGIT
                    pass
                    self.mHEX_DIGIT()
                else:
                    if cnt1 >= 1:
                        break #loop1
                    eee = EarlyExitException(1, self.input)
                    raise eee
                cnt1 += 1
            # Java.g:985:40: ( INTEGER_TYPE_SUFFIX )?
            alt2 = 2
            LA2_0 = self.input.LA(1)
            if (LA2_0 == 76 or LA2_0 == 108) :
                alt2 = 1
            if alt2 == 1:
                # Java.g:985:40: INTEGER_TYPE_SUFFIX
                pass
                self.mINTEGER_TYPE_SUFFIX()
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "HEX_LITERAL"
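The loop above recognizes the rule `HEX_LITERAL : '0' ('x'|'X') HEX_DIGIT+ INTEGER_TYPE_SUFFIX?`, testing raw code points (48 = `'0'`, 88/120 = `'X'`/`'x'`, 76/108 = `'L'`/`'l'`). As a sketch (not generated code), the same language can be written as a regular expression:

```python
import re

# Regex equivalent of the HEX_LITERAL rule above (a sketch, not part of
# the generated lexer): '0', 'x' or 'X', one or more hex digits, then an
# optional long suffix 'l'/'L'.
HEX_LITERAL = re.compile(r"0[xX][0-9a-fA-F]+[lL]?")

print(bool(HEX_LITERAL.fullmatch("0xCAFEL")))  # valid: digits plus suffix
print(bool(HEX_LITERAL.fullmatch("0x")))       # invalid: the '+' requires at least one digit
```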
    # $ANTLR start "DECIMAL_LITERAL"
    def mDECIMAL_LITERAL(self, ):
        try:
            _type = DECIMAL_LITERAL
            _channel = DEFAULT_CHANNEL
            # Java.g:987:17: ( ( '0' | '1' .. '9' ( '0' .. '9' )* ) ( INTEGER_TYPE_SUFFIX )? )
            # Java.g:987:19: ( '0' | '1' .. '9' ( '0' .. '9' )* ) ( INTEGER_TYPE_SUFFIX )?
            pass
            # Java.g:987:19: ( '0' | '1' .. '9' ( '0' .. '9' )* )
            alt4 = 2
            LA4_0 = self.input.LA(1)
            if (LA4_0 == 48) :
                alt4 = 1
            elif ((49 <= LA4_0 <= 57)) :
                alt4 = 2
            else:
                nvae = NoViableAltException("", 4, 0, self.input)
                raise nvae
            if alt4 == 1:
                # Java.g:987:20: '0'
                pass
                self.match(48)
            elif alt4 == 2:
                # Java.g:987:26: '1' .. '9' ( '0' .. '9' )*
                pass
                self.matchRange(49, 57)
                # Java.g:987:35: ( '0' .. '9' )*
                while True: #loop3
                    alt3 = 2
                    LA3_0 = self.input.LA(1)
                    if ((48 <= LA3_0 <= 57)) :
                        alt3 = 1
                    if alt3 == 1:
                        # Java.g:987:35: '0' .. '9'
                        pass
                        self.matchRange(48, 57)
                    else:
                        break #loop3
            # Java.g:987:46: ( INTEGER_TYPE_SUFFIX )?
            alt5 = 2
            LA5_0 = self.input.LA(1)
            if (LA5_0 == 76 or LA5_0 == 108) :
                alt5 = 1
            if alt5 == 1:
                # Java.g:987:46: INTEGER_TYPE_SUFFIX
                pass
                self.mINTEGER_TYPE_SUFFIX()
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "DECIMAL_LITERAL"

    # $ANTLR start "OCTAL_LITERAL"
    def mOCTAL_LITERAL(self, ):
        try:
            _type = OCTAL_LITERAL
            _channel = DEFAULT_CHANNEL
            # Java.g:989:15: ( '0' ( '0' .. '7' )+ ( INTEGER_TYPE_SUFFIX )? )
            # Java.g:989:17: '0' ( '0' .. '7' )+ ( INTEGER_TYPE_SUFFIX )?
            pass
            self.match(48)
            # Java.g:989:21: ( '0' .. '7' )+
            cnt6 = 0
            while True: #loop6
                alt6 = 2
                LA6_0 = self.input.LA(1)
                if ((48 <= LA6_0 <= 55)) :
                    alt6 = 1
                if alt6 == 1:
                    # Java.g:989:22: '0' .. '7'
                    pass
                    self.matchRange(48, 55)
                else:
                    if cnt6 >= 1:
                        break #loop6
                    eee = EarlyExitException(6, self.input)
                    raise eee
                cnt6 += 1
            # Java.g:989:33: ( INTEGER_TYPE_SUFFIX )?
            alt7 = 2
            LA7_0 = self.input.LA(1)
            if (LA7_0 == 76 or LA7_0 == 108) :
                alt7 = 1
            if alt7 == 1:
                # Java.g:989:33: INTEGER_TYPE_SUFFIX
                pass
                self.mINTEGER_TYPE_SUFFIX()
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "OCTAL_LITERAL"

    # $ANTLR start "HEX_DIGIT"
    def mHEX_DIGIT(self, ):
        try:
            # Java.g:992:11: ( ( '0' .. '9' | 'a' .. 'f' | 'A' .. 'F' ) )
            # Java.g:992:13: ( '0' .. '9' | 'a' .. 'f' | 'A' .. 'F' )
            pass
            if (48 <= self.input.LA(1) <= 57) or (65 <= self.input.LA(1) <= 70) or (97 <= self.input.LA(1) <= 102):
                self.input.consume()
            else:
                mse = MismatchedSetException(None, self.input)
                self.recover(mse)
                raise mse
        finally:
            pass
    # $ANTLR end "HEX_DIGIT"

    # $ANTLR start "INTEGER_TYPE_SUFFIX"
    def mINTEGER_TYPE_SUFFIX(self, ):
        try:
            # Java.g:995:21: ( ( 'l' | 'L' ) )
            # Java.g:995:23: ( 'l' | 'L' )
            pass
            if self.input.LA(1) == 76 or self.input.LA(1) == 108:
                self.input.consume()
            else:
                mse = MismatchedSetException(None, self.input)
                self.recover(mse)
                raise mse
        finally:
            pass
    # $ANTLR end "INTEGER_TYPE_SUFFIX"

    # $ANTLR start "FLOATING_POINT_LITERAL"
    def mFLOATING_POINT_LITERAL(self, ):
        try:
            _type = FLOATING_POINT_LITERAL
            _channel = DEFAULT_CHANNEL
            # Java.g:998:5: ( ( '0' .. '9' )+ ( DOT ( '0' .. '9' )* ( EXPONENT )? ( FLOAT_TYPE_SUFFIX )? | EXPONENT ( FLOAT_TYPE_SUFFIX )? | FLOAT_TYPE_SUFFIX ) | DOT ( '0' .. '9' )+ ( EXPONENT )? ( FLOAT_TYPE_SUFFIX )? )
            alt17 = 2
            LA17_0 = self.input.LA(1)
            if ((48 <= LA17_0 <= 57)) :
                alt17 = 1
            elif (LA17_0 == 46) :
                alt17 = 2
            else:
                nvae = NoViableAltException("", 17, 0, self.input)
                raise nvae
            if alt17 == 1:
                # Java.g:998:9: ( '0' .. '9' )+ ( DOT ( '0' .. '9' )* ( EXPONENT )? ( FLOAT_TYPE_SUFFIX )? | EXPONENT ( FLOAT_TYPE_SUFFIX )? | FLOAT_TYPE_SUFFIX )
                pass
                # Java.g:998:9: ( '0' .. '9' )+
                cnt8 = 0
                while True: #loop8
                    alt8 = 2
                    LA8_0 = self.input.LA(1)
                    if ((48 <= LA8_0 <= 57)) :
                        alt8 = 1
                    if alt8 == 1:
                        # Java.g:998:10: '0' .. '9'
                        pass
                        self.matchRange(48, 57)
                    else:
                        if cnt8 >= 1:
                            break #loop8
                        eee = EarlyExitException(8, self.input)
                        raise eee
                    cnt8 += 1
                # Java.g:999:9: ( DOT ( '0' .. '9' )* ( EXPONENT )? ( FLOAT_TYPE_SUFFIX )? | EXPONENT ( FLOAT_TYPE_SUFFIX )? | FLOAT_TYPE_SUFFIX )
                alt13 = 3
                LA13 = self.input.LA(1)
                if LA13 == 46:
                    alt13 = 1
                elif LA13 == 69 or LA13 == 101:
                    alt13 = 2
                elif LA13 == 68 or LA13 == 70 or LA13 == 100 or LA13 == 102:
                    alt13 = 3
                else:
                    nvae = NoViableAltException("", 13, 0, self.input)
                    raise nvae
                if alt13 == 1:
                    # Java.g:1000:13: DOT ( '0' .. '9' )* ( EXPONENT )? ( FLOAT_TYPE_SUFFIX )?
                    pass
                    self.mDOT()
                    # Java.g:1000:17: ( '0' .. '9' )*
                    while True: #loop9
                        alt9 = 2
                        LA9_0 = self.input.LA(1)
                        if ((48 <= LA9_0 <= 57)) :
                            alt9 = 1
                        if alt9 == 1:
                            # Java.g:1000:18: '0' .. '9'
                            pass
                            self.matchRange(48, 57)
                        else:
                            break #loop9
                    # Java.g:1000:29: ( EXPONENT )?
                    alt10 = 2
                    LA10_0 = self.input.LA(1)
                    if (LA10_0 == 69 or LA10_0 == 101) :
                        alt10 = 1
                    if alt10 == 1:
                        # Java.g:1000:29: EXPONENT
                        pass
                        self.mEXPONENT()
                    # Java.g:1000:39: ( FLOAT_TYPE_SUFFIX )?
                    alt11 = 2
                    LA11_0 = self.input.LA(1)
                    if (LA11_0 == 68 or LA11_0 == 70 or LA11_0 == 100 or LA11_0 == 102) :
                        alt11 = 1
                    if alt11 == 1:
                        # Java.g:1000:39: FLOAT_TYPE_SUFFIX
                        pass
                        self.mFLOAT_TYPE_SUFFIX()
                elif alt13 == 2:
                    # Java.g:1001:13: EXPONENT ( FLOAT_TYPE_SUFFIX )?
                    pass
                    self.mEXPONENT()
                    # Java.g:1001:22: ( FLOAT_TYPE_SUFFIX )?
                    alt12 = 2
                    LA12_0 = self.input.LA(1)
                    if (LA12_0 == 68 or LA12_0 == 70 or LA12_0 == 100 or LA12_0 == 102) :
                        alt12 = 1
                    if alt12 == 1:
                        # Java.g:1001:22: FLOAT_TYPE_SUFFIX
                        pass
                        self.mFLOAT_TYPE_SUFFIX()
                elif alt13 == 3:
                    # Java.g:1002:13: FLOAT_TYPE_SUFFIX
                    pass
                    self.mFLOAT_TYPE_SUFFIX()
            elif alt17 == 2:
                # Java.g:1004:9: DOT ( '0' .. '9' )+ ( EXPONENT )? ( FLOAT_TYPE_SUFFIX )?
                pass
                self.mDOT()
                # Java.g:1004:13: ( '0' .. '9' )+
                cnt14 = 0
                while True: #loop14
                    alt14 = 2
                    LA14_0 = self.input.LA(1)
                    if ((48 <= LA14_0 <= 57)) :
                        alt14 = 1
                    if alt14 == 1:
                        # Java.g:1004:14: '0' .. '9'
                        pass
                        self.matchRange(48, 57)
                    else:
                        if cnt14 >= 1:
                            break #loop14
                        eee = EarlyExitException(14, self.input)
                        raise eee
                    cnt14 += 1
                # Java.g:1004:25: ( EXPONENT )?
                alt15 = 2
                LA15_0 = self.input.LA(1)
                if (LA15_0 == 69 or LA15_0 == 101) :
                    alt15 = 1
                if alt15 == 1:
                    # Java.g:1004:25: EXPONENT
                    pass
                    self.mEXPONENT()
                # Java.g:1004:35: ( FLOAT_TYPE_SUFFIX )?
                alt16 = 2
                LA16_0 = self.input.LA(1)
                if (LA16_0 == 68 or LA16_0 == 70 or LA16_0 == 100 or LA16_0 == 102) :
                    alt16 = 1
                if alt16 == 1:
                    # Java.g:1004:35: FLOAT_TYPE_SUFFIX
                    pass
                    self.mFLOAT_TYPE_SUFFIX()
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "FLOATING_POINT_LITERAL"
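The `alt17`/`alt13` decisions above pick between a literal that starts with digits (then a dot, an exponent, or just a suffix) and one that starts with a dot. As a sketch under those same alternatives (not generated code), the accepted shapes can be expressed as a regex:

```python
import re

# Sketch of the FLOATING_POINT_LITERAL alternatives decided by alt17/alt13:
#   digits '.' digits? exp? suffix?  |  digits exp suffix?  |  digits suffix
#   |  '.' digits exp? suffix?
EXP = r"[eE][+-]?[0-9]+"
SUF = r"[fFdD]"
FLOAT_LITERAL = re.compile(
    r"[0-9]+\.[0-9]*({exp})?({suf})?|[0-9]+({exp})({suf})?"
    r"|[0-9]+({suf})|\.[0-9]+({exp})?({suf})?".format(exp=EXP, suf=SUF)
)

for text in ("1.5e3f", ".5", "1e10", "1f"):
    print(text, bool(FLOAT_LITERAL.fullmatch(text)))
```

Note that a bare `1` is deliberately rejected here, matching the grammar: without a dot, exponent, or suffix it is a DECIMAL_LITERAL, not a floating-point literal.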
    # $ANTLR start "EXPONENT"
    def mEXPONENT(self, ):
        try:
            # Java.g:1008:10: ( ( 'e' | 'E' ) ( '+' | '-' )? ( '0' .. '9' )+ )
            # Java.g:1008:12: ( 'e' | 'E' ) ( '+' | '-' )? ( '0' .. '9' )+
            pass
            if self.input.LA(1) == 69 or self.input.LA(1) == 101:
                self.input.consume()
            else:
                mse = MismatchedSetException(None, self.input)
                self.recover(mse)
                raise mse
            # Java.g:1008:22: ( '+' | '-' )?
            alt18 = 2
            LA18_0 = self.input.LA(1)
            if (LA18_0 == 43 or LA18_0 == 45) :
                alt18 = 1
            if alt18 == 1:
                # Java.g:
                pass
                if self.input.LA(1) == 43 or self.input.LA(1) == 45:
                    self.input.consume()
                else:
                    mse = MismatchedSetException(None, self.input)
                    self.recover(mse)
                    raise mse
            # Java.g:1008:33: ( '0' .. '9' )+
            cnt19 = 0
            while True: #loop19
                alt19 = 2
                LA19_0 = self.input.LA(1)
                if ((48 <= LA19_0 <= 57)) :
                    alt19 = 1
                if alt19 == 1:
                    # Java.g:1008:34: '0' .. '9'
                    pass
                    self.matchRange(48, 57)
                else:
                    if cnt19 >= 1:
                        break #loop19
                    eee = EarlyExitException(19, self.input)
                    raise eee
                cnt19 += 1
        finally:
            pass
    # $ANTLR end "EXPONENT"

    # $ANTLR start "FLOAT_TYPE_SUFFIX"
    def mFLOAT_TYPE_SUFFIX(self, ):
        try:
            # Java.g:1011:19: ( ( 'f' | 'F' | 'd' | 'D' ) )
            # Java.g:1011:21: ( 'f' | 'F' | 'd' | 'D' )
            pass
            if self.input.LA(1) == 68 or self.input.LA(1) == 70 or self.input.LA(1) == 100 or self.input.LA(1) == 102:
                self.input.consume()
            else:
                mse = MismatchedSetException(None, self.input)
                self.recover(mse)
                raise mse
        finally:
            pass
    # $ANTLR end "FLOAT_TYPE_SUFFIX"

    # $ANTLR start "CHARACTER_LITERAL"
    def mCHARACTER_LITERAL(self, ):
        try:
            _type = CHARACTER_LITERAL
            _channel = DEFAULT_CHANNEL
            # Java.g:1014:5: ( '\\'' ( ESCAPE_SEQUENCE | ~ ( '\\'' | '\\\\' ) ) '\\'' )
            # Java.g:1014:9: '\\'' ( ESCAPE_SEQUENCE | ~ ( '\\'' | '\\\\' ) ) '\\''
            pass
            self.match(39)
            # Java.g:1014:14: ( ESCAPE_SEQUENCE | ~ ( '\\'' | '\\\\' ) )
            alt20 = 2
            LA20_0 = self.input.LA(1)
            if (LA20_0 == 92) :
                alt20 = 1
            elif ((0 <= LA20_0 <= 38) or (40 <= LA20_0 <= 91) or (93 <= LA20_0 <= 65535)) :
                alt20 = 2
            else:
                nvae = NoViableAltException("", 20, 0, self.input)
                raise nvae
            if alt20 == 1:
                # Java.g:1014:16: ESCAPE_SEQUENCE
                pass
                self.mESCAPE_SEQUENCE()
            elif alt20 == 2:
                # Java.g:1014:34: ~ ( '\\'' | '\\\\' )
                pass
                if (0 <= self.input.LA(1) <= 38) or (40 <= self.input.LA(1) <= 91) or (93 <= self.input.LA(1) <= 65535):
                    self.input.consume()
                else:
                    mse = MismatchedSetException(None, self.input)
                    self.recover(mse)
                    raise mse
            self.match(39)
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "CHARACTER_LITERAL"

    # $ANTLR start "STRING_LITERAL"
    def mSTRING_LITERAL(self, ):
        try:
            _type = STRING_LITERAL
            _channel = DEFAULT_CHANNEL
            # Java.g:1018:5: ( '\"' ( ESCAPE_SEQUENCE | ~ ( '\\\\' | '\"' ) )* '\"' )
            # Java.g:1018:8: '\"' ( ESCAPE_SEQUENCE | ~ ( '\\\\' | '\"' ) )* '\"'
            pass
            self.match(34)
            # Java.g:1018:12: ( ESCAPE_SEQUENCE | ~ ( '\\\\' | '\"' ) )*
            while True: #loop21
                alt21 = 3
                LA21_0 = self.input.LA(1)
                if (LA21_0 == 92) :
                    alt21 = 1
                elif ((0 <= LA21_0 <= 33) or (35 <= LA21_0 <= 91) or (93 <= LA21_0 <= 65535)) :
                    alt21 = 2
                if alt21 == 1:
                    # Java.g:1018:14: ESCAPE_SEQUENCE
                    pass
                    self.mESCAPE_SEQUENCE()
                elif alt21 == 2:
                    # Java.g:1018:32: ~ ( '\\\\' | '\"' )
                    pass
                    if (0 <= self.input.LA(1) <= 33) or (35 <= self.input.LA(1) <= 91) or (93 <= self.input.LA(1) <= 65535):
                        self.input.consume()
                    else:
                        mse = MismatchedSetException(None, self.input)
                        self.recover(mse)
                        raise mse
                else:
                    break #loop21
            self.match(34)
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "STRING_LITERAL"

    # $ANTLR start "ESCAPE_SEQUENCE"
    def mESCAPE_SEQUENCE(self, ):
        try:
            # Java.g:1023:5: ( '\\\\' ( 'b' | 't' | 'n' | 'f' | 'r' | '\\\"' | '\\'' | '\\\\' ) | UNICODE_ESCAPE | OCTAL_ESCAPE )
            alt22 = 3
            LA22_0 = self.input.LA(1)
            if (LA22_0 == 92) :
                LA22 = self.input.LA(2)
                if LA22 == 34 or LA22 == 39 or LA22 == 92 or LA22 == 98 or LA22 == 102 or LA22 == 110 or LA22 == 114 or LA22 == 116:
                    alt22 = 1
                elif LA22 == 117:
                    alt22 = 2
                elif LA22 == 48 or LA22 == 49 or LA22 == 50 or LA22 == 51 or LA22 == 52 or LA22 == 53 or LA22 == 54 or LA22 == 55:
                    alt22 = 3
                else:
                    nvae = NoViableAltException("", 22, 1, self.input)
                    raise nvae
            else:
                nvae = NoViableAltException("", 22, 0, self.input)
                raise nvae
            if alt22 == 1:
                # Java.g:1023:9: '\\\\' ( 'b' | 't' | 'n' | 'f' | 'r' | '\\\"' | '\\'' | '\\\\' )
                pass
                self.match(92)
                if self.input.LA(1) == 34 or self.input.LA(1) == 39 or self.input.LA(1) == 92 or self.input.LA(1) == 98 or self.input.LA(1) == 102 or self.input.LA(1) == 110 or self.input.LA(1) == 114 or self.input.LA(1) == 116:
                    self.input.consume()
                else:
                    mse = MismatchedSetException(None, self.input)
                    self.recover(mse)
                    raise mse
            elif alt22 == 2:
                # Java.g:1024:9: UNICODE_ESCAPE
                pass
                self.mUNICODE_ESCAPE()
            elif alt22 == 3:
                # Java.g:1025:9: OCTAL_ESCAPE
                pass
                self.mOCTAL_ESCAPE()
        finally:
            pass
    # $ANTLR end "ESCAPE_SEQUENCE"
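The `LA(2)` dispatch above classifies what follows a backslash: code points 34, 39, 92, 98, 102, 110, 114, 116 are `"` `'` `\` `b` `f` `n` `r` `t`, 117 is `u`, and 48..55 are the octal digits. A sketch of that decision as a standalone predicate (`classify_escape` is a hypothetical helper, not part of the generated lexer):

```python
# Sketch of the alt22 decision in mESCAPE_SEQUENCE: given the character
# after '\\', pick the escape family the generated code would take.
def classify_escape(second_char):
    if second_char in '"\'\\bfnrt':
        return "simple"    # alt22 == 1: \b \t \n \f \r \" \' \\
    if second_char == "u":
        return "unicode"   # alt22 == 2: \uXXXX
    if second_char in "01234567":
        return "octal"     # alt22 == 3: \0 .. \377
    raise ValueError("no viable alternative")  # mirrors NoViableAltException

print(classify_escape("n"))  # simple
print(classify_escape("u"))  # unicode
print(classify_escape("3"))  # octal
```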
    # $ANTLR start "OCTAL_ESCAPE"
    def mOCTAL_ESCAPE(self, ):
        try:
            # Java.g:1030:5: ( '\\\\' ( '0' .. '3' ) ( '0' .. '7' ) ( '0' .. '7' ) | '\\\\' ( '0' .. '7' ) ( '0' .. '7' ) | '\\\\' ( '0' .. '7' ) )
            alt23 = 3
            LA23_0 = self.input.LA(1)
            if (LA23_0 == 92) :
                LA23_1 = self.input.LA(2)
                if ((48 <= LA23_1 <= 51)) :
                    LA23_2 = self.input.LA(3)
                    if ((48 <= LA23_2 <= 55)) :
                        LA23_4 = self.input.LA(4)
                        if ((48 <= LA23_4 <= 55)) :
                            alt23 = 1
                        else:
                            alt23 = 2
                    else:
                        alt23 = 3
                elif ((52 <= LA23_1 <= 55)) :
                    LA23_3 = self.input.LA(3)
                    if ((48 <= LA23_3 <= 55)) :
                        alt23 = 2
                    else:
                        alt23 = 3
                else:
                    nvae = NoViableAltException("", 23, 1, self.input)
                    raise nvae
            else:
                nvae = NoViableAltException("", 23, 0, self.input)
                raise nvae
            if alt23 == 1:
                # Java.g:1030:9: '\\\\' ( '0' .. '3' ) ( '0' .. '7' ) ( '0' .. '7' )
                pass
                self.match(92)
                # Java.g:1030:14: ( '0' .. '3' )
                # Java.g:1030:15: '0' .. '3'
                pass
                self.matchRange(48, 51)
                # Java.g:1030:25: ( '0' .. '7' )
                # Java.g:1030:26: '0' .. '7'
                pass
                self.matchRange(48, 55)
                # Java.g:1030:36: ( '0' .. '7' )
                # Java.g:1030:37: '0' .. '7'
                pass
                self.matchRange(48, 55)
            elif alt23 == 2:
                # Java.g:1031:9: '\\\\' ( '0' .. '7' ) ( '0' .. '7' )
                pass
                self.match(92)
                # Java.g:1031:14: ( '0' .. '7' )
                # Java.g:1031:15: '0' .. '7'
                pass
                self.matchRange(48, 55)
                # Java.g:1031:25: ( '0' .. '7' )
                # Java.g:1031:26: '0' .. '7'
                pass
                self.matchRange(48, 55)
            elif alt23 == 3:
                # Java.g:1032:9: '\\\\' ( '0' .. '7' )
                pass
                self.match(92)
                # Java.g:1032:14: ( '0' .. '7' )
                # Java.g:1032:15: '0' .. '7'
                pass
                self.matchRange(48, 55)
        finally:
            pass
    # $ANTLR end "OCTAL_ESCAPE"

    # $ANTLR start "UNICODE_ESCAPE"
    def mUNICODE_ESCAPE(self, ):
        try:
            # Java.g:1037:5: ( '\\\\' 'u' HEX_DIGIT HEX_DIGIT HEX_DIGIT HEX_DIGIT )
            # Java.g:1037:9: '\\\\' 'u' HEX_DIGIT HEX_DIGIT HEX_DIGIT HEX_DIGIT
            pass
            self.match(92)
            self.match(117)
            self.mHEX_DIGIT()
            self.mHEX_DIGIT()
            self.mHEX_DIGIT()
            self.mHEX_DIGIT()
        finally:
            pass
    # $ANTLR end "UNICODE_ESCAPE"

    # $ANTLR start "IDENT"
    def mIDENT(self, ):
        try:
            _type = IDENT
            _channel = DEFAULT_CHANNEL
            # Java.g:1041:5: ( JAVA_ID_START ( JAVA_ID_PART )* )
            # Java.g:1041:9: JAVA_ID_START ( JAVA_ID_PART )*
            pass
            self.mJAVA_ID_START()
            # Java.g:1041:23: ( JAVA_ID_PART )*
            while True: #loop24
                alt24 = 2
                LA24_0 = self.input.LA(1)
                if (LA24_0 == 36 or (48 <= LA24_0 <= 57) or (65 <= LA24_0 <= 90) or LA24_0 == 95 or (97 <= LA24_0 <= 122) or (192 <= LA24_0 <= 214) or (216 <= LA24_0 <= 246) or (248 <= LA24_0 <= 8191) or (12352 <= LA24_0 <= 12687) or (13056 <= LA24_0 <= 13183) or (13312 <= LA24_0 <= 15661) or (19968 <= LA24_0 <= 40959) or (63744 <= LA24_0 <= 64255)) :
                    alt24 = 1
                if alt24 == 1:
                    # Java.g:1041:24: JAVA_ID_PART
                    pass
                    self.mJAVA_ID_PART()
                else:
                    break #loop24
            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass
    # $ANTLR end "IDENT"

    # $ANTLR start "JAVA_ID_START"
    def mJAVA_ID_START(self, ):
        try:
            # Java.g:1046:5: ( '\\u0024' | '\\u0041' .. '\\u005a' | '\\u005f' | '\\u0061' .. '\\u007a' | '\\u00c0' .. '\\u00d6' | '\\u00d8' .. '\\u00f6' | '\\u00f8' .. '\\u00ff' | '\\u0100' .. '\\u1fff' | '\\u3040' .. '\\u318f' | '\\u3300' .. '\\u337f' | '\\u3400' .. '\\u3d2d' | '\\u4e00' .. '\\u9fff' | '\\uf900' .. '\\ufaff' )
            # Java.g:
            pass
            if self.input.LA(1) == 36 or (65 <= self.input.LA(1) <= 90) or self.input.LA(1) == 95 or (97 <= self.input.LA(1) <= 122) or (192 <= self.input.LA(1) <= 214) or (216 <= self.input.LA(1) <= 246) or (248 <= self.input.LA(1) <= 8191) or (12352 <= self.input.LA(1) <= 12687) or (13056 <= self.input.LA(1) <= 13183) or (13312 <= self.input.LA(1) <= 15661) or (19968 <= self.input.LA(1) <= 40959) or (63744 <= self.input.LA(1) <= 64255):
                self.input.consume()
            else:
                mse = MismatchedSetException(None, self.input)
                self.recover(mse)
                raise mse
        finally:
            pass
    # $ANTLR end "JAVA_ID_START"

    # $ANTLR start "JAVA_ID_PART"
    def mJAVA_ID_PART(self, ):
        try:
            # Java.g:1063:5: ( JAVA_ID_START | '\\u0030' .. '\\u0039' )
            # Java.g:
            pass
            if self.input.LA(1) == 36 or (48 <= self.input.LA(1) <= 57) or (65 <= self.input.LA(1) <= 90) or self.input.LA(1) == 95 or (97 <= self.input.LA(1) <= 122) or (192 <= self.input.LA(1) <= 214) or (216 <= self.input.LA(1) <= 246) or (248 <= self.input.LA(1) <= 8191) or (12352 <= self.input.LA(1) <= 12687) or (13056 <= self.input.LA(1) <= 13183) or (13312 <= self.input.LA(1) <= 15661) or (19968 <= self.input.LA(1) <= 40959) or (63744 <= self.input.LA(1) <= 64255):
                self.input.consume()
            else:
                mse = MismatchedSetException(None, self.input)
                self.recover(mse)
                raise mse
        finally:
            pass
    # $ANTLR end "JAVA_ID_PART"
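The decimal code-point ranges tested in `mJAVA_ID_START`/`mJAVA_ID_PART` above are the hex ranges from the grammar comment (e.g. 12352..12687 is U+3040..U+318F). A sketch of the same membership test as plain predicates (`is_java_id_start`/`is_java_id_part` are hypothetical helpers, not part of the generated lexer):

```python
# The (lo, hi) code-point ranges from the JAVA_ID_START rule above,
# with the adjacent \u00f8..\u00ff and \u0100..\u1fff ranges merged
# exactly as the generated condition merges them (248..8191).
ID_START_RANGES = [
    (0x24, 0x24), (0x41, 0x5A), (0x5F, 0x5F), (0x61, 0x7A),
    (0xC0, 0xD6), (0xD8, 0xF6), (0xF8, 0x1FFF), (0x3040, 0x318F),
    (0x3300, 0x337F), (0x3400, 0x3D2D), (0x4E00, 0x9FFF), (0xF900, 0xFAFF),
]

def is_java_id_start(ch):
    cp = ord(ch)
    return any(lo <= cp <= hi for lo, hi in ID_START_RANGES)

def is_java_id_part(ch):
    # JAVA_ID_PART adds the decimal digits '\u0030'..'\u0039' to the start set.
    return is_java_id_start(ch) or "0" <= ch <= "9"

print(is_java_id_start("$"), is_java_id_start("9"), is_java_id_part("9"))
```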
# $ANTLR start "WS"
def mWS(self, ):
try:
_type = WS
_channel = DEFAULT_CHANNEL
# Java.g:1067:5: ( ( ' ' | '\\r' | '\\t' | '\\u000C' | '\\n' ) )
# Java.g:1067:8: ( ' ' | '\\r' | '\\t' | '\\u000C' | '\\n' )
pass
if (9 <= self.input.LA(1) <= 10) or (12 <= self.input.LA(1) <= 13) or self.input.LA(1) == 32:
self.input.consume()
else:
mse = MismatchedSetException(None, self.input)
self.recover(mse)
raise mse
#action start
_channel = HIDDEN
#action end
self._state.type = _type
self._state.channel = _channel
finally:
pass
# $ANTLR end "WS"
    # $ANTLR start "COMMENT"
    def mCOMMENT(self, ):
        try:
            _type = COMMENT
            _channel = DEFAULT_CHANNEL

            # Java.g:1074:5: ( '/*' ~ ( '*' ) ( options {greedy=false; } : . )* '*/' )
            # Java.g:1074:9: '/*' ~ ( '*' ) ( options {greedy=false; } : . )* '*/'
            pass
            self.match("/*")
            if (0 <= self.input.LA(1) <= 41) or (43 <= self.input.LA(1) <= 65535):
                self.input.consume()
            else:
                mse = MismatchedSetException(None, self.input)
                self.recover(mse)
                raise mse

            # Java.g:1074:21: ( options {greedy=false; } : . )*
            while True: #loop25
                alt25 = 2
                LA25_0 = self.input.LA(1)

                if (LA25_0 == 42) :
                    LA25_1 = self.input.LA(2)

                    if (LA25_1 == 47) :
                        alt25 = 2
                    elif ((0 <= LA25_1 <= 46) or (48 <= LA25_1 <= 65535)) :
                        alt25 = 1
                elif ((0 <= LA25_0 <= 41) or (43 <= LA25_0 <= 65535)) :
                    alt25 = 1

                if alt25 == 1:
                    # Java.g:1074:49: .
                    pass
                    self.matchAny()
                else:
                    break #loop25

            self.match("*/")
            #action start
            _channel = HIDDEN
            #action end

            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass

    # $ANTLR end "COMMENT"
    # $ANTLR start "LINE_COMMENT"
    def mLINE_COMMENT(self, ):
        try:
            _type = LINE_COMMENT
            _channel = DEFAULT_CHANNEL

            # Java.g:1081:5: ( '//' (~ ( '\\n' | '\\r' ) )* ( '\\r' )? '\\n' )
            # Java.g:1081:7: '//' (~ ( '\\n' | '\\r' ) )* ( '\\r' )? '\\n'
            pass
            self.match("//")
            # Java.g:1081:12: (~ ( '\\n' | '\\r' ) )*
            while True: #loop26
                alt26 = 2
                LA26_0 = self.input.LA(1)

                if ((0 <= LA26_0 <= 9) or (11 <= LA26_0 <= 12) or (14 <= LA26_0 <= 65535)) :
                    alt26 = 1

                if alt26 == 1:
                    # Java.g:1081:12: ~ ( '\\n' | '\\r' )
                    pass
                    if (0 <= self.input.LA(1) <= 9) or (11 <= self.input.LA(1) <= 12) or (14 <= self.input.LA(1) <= 65535):
                        self.input.consume()
                    else:
                        mse = MismatchedSetException(None, self.input)
                        self.recover(mse)
                        raise mse

                else:
                    break #loop26

            # Java.g:1081:26: ( '\\r' )?
            alt27 = 2
            LA27_0 = self.input.LA(1)

            if (LA27_0 == 13) :
                alt27 = 1

            if alt27 == 1:
                # Java.g:1081:26: '\\r'
                pass
                self.match(13)

            self.match(10)
            #action start
            _channel = HIDDEN
            #action end

            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass

    # $ANTLR end "LINE_COMMENT"
    # $ANTLR start "JAVADOC_COMMENT"
    def mJAVADOC_COMMENT(self, ):
        try:
            _type = JAVADOC_COMMENT
            _channel = DEFAULT_CHANNEL

            # Java.g:1088:5: ( '/**' ( options {greedy=false; } : . )* '*/' )
            # Java.g:1088:9: '/**' ( options {greedy=false; } : . )* '*/'
            pass
            self.match("/**")
            # Java.g:1088:15: ( options {greedy=false; } : . )*
            while True: #loop28
                alt28 = 2
                LA28_0 = self.input.LA(1)

                if (LA28_0 == 42) :
                    LA28_1 = self.input.LA(2)

                    if (LA28_1 == 47) :
                        alt28 = 2
                    elif ((0 <= LA28_1 <= 46) or (48 <= LA28_1 <= 65535)) :
                        alt28 = 1
                elif ((0 <= LA28_0 <= 41) or (43 <= LA28_0 <= 65535)) :
                    alt28 = 1

                if alt28 == 1:
                    # Java.g:1088:43: .
                    pass
                    self.matchAny()
                else:
                    break #loop28

            self.match("*/")
            #action start
            _channel = HIDDEN
            #action end

            self._state.type = _type
            self._state.channel = _channel
        finally:
            pass

    # $ANTLR end "JAVADOC_COMMENT"
    def mTokens(self):
        # Java.g:1:8: ( AND | AND_ASSIGN | ASSIGN | AT | BIT_SHIFT_RIGHT | BIT_SHIFT_RIGHT_ASSIGN | COLON | COMMA | DEC | DIV | DIV_ASSIGN | DOT | DOTSTAR | ELLIPSIS | EQUAL | GREATER_OR_EQUAL | GREATER_THAN | INC | LBRACK | LCURLY | LESS_OR_EQUAL | LESS_THAN | LOGICAL_AND | LOGICAL_NOT | LOGICAL_OR | LPAREN | MINUS | MINUS_ASSIGN | MOD | MOD_ASSIGN | NOT | NOT_EQUAL | OR | OR_ASSIGN | PLUS | PLUS_ASSIGN | QUESTION | RBRACK | RCURLY | RPAREN | SEMI | SHIFT_LEFT | SHIFT_LEFT_ASSIGN | SHIFT_RIGHT | SHIFT_RIGHT_ASSIGN | STAR | STAR_ASSIGN | XOR | XOR_ASSIGN | ABSTRACT | ASSERT | BOOLEAN | BREAK | BYTE | CASE | CATCH | CHAR | CLASS | CONTINUE | DEFAULT | DO | DOUBLE | ELSE | ENUM | EXTENDS | FALSE | FINAL | FINALLY | FLOAT | FOR | IF | IMPLEMENTS | INSTANCEOF | INTERFACE | IMPORT | INT | LONG | NATIVE | NEW | NULL | PACKAGE | PRIVATE | PROTECTED | PUBLIC | RETURN | SHORT | STATIC | STRICTFP | SUPER | SWITCH | SYNCHRONIZED | THIS | THROW | THROWS | TRANSIENT | TRUE | TRY | VOID | VOLATILE | WHILE | HEX_LITERAL | DECIMAL_LITERAL | OCTAL_LITERAL | FLOATING_POINT_LITERAL | CHARACTER_LITERAL | STRING_LITERAL | IDENT | WS | COMMENT | LINE_COMMENT | JAVADOC_COMMENT )
        alt29 = 111
        alt29 = self.dfa29.predict(self.input)
        if alt29 == 1:
            # Java.g:1:10: AND
            pass
            self.mAND()
        elif alt29 == 2:
            # Java.g:1:14: AND_ASSIGN
            pass
            self.mAND_ASSIGN()
        elif alt29 == 3:
            # Java.g:1:25: ASSIGN
            pass
            self.mASSIGN()
        elif alt29 == 4:
            # Java.g:1:32: AT
            pass
            self.mAT()
        elif alt29 == 5:
            # Java.g:1:35: BIT_SHIFT_RIGHT
            pass
            self.mBIT_SHIFT_RIGHT()
        elif alt29 == 6:
            # Java.g:1:51: BIT_SHIFT_RIGHT_ASSIGN
            pass
            self.mBIT_SHIFT_RIGHT_ASSIGN()
        elif alt29 == 7:
            # Java.g:1:74: COLON
            pass
            self.mCOLON()
        elif alt29 == 8:
            # Java.g:1:80: COMMA
            pass
            self.mCOMMA()
        elif alt29 == 9:
            # Java.g:1:86: DEC
            pass
            self.mDEC()
        elif alt29 == 10:
            # Java.g:1:90: DIV
            pass
            self.mDIV()
        elif alt29 == 11:
            # Java.g:1:94: DIV_ASSIGN
            pass
            self.mDIV_ASSIGN()
        elif alt29 == 12:
            # Java.g:1:105: DOT
            pass
            self.mDOT()
        elif alt29 == 13:
            # Java.g:1:109: DOTSTAR
            pass
            self.mDOTSTAR()
        elif alt29 == 14:
            # Java.g:1:117: ELLIPSIS
            pass
            self.mELLIPSIS()
        elif alt29 == 15:
            # Java.g:1:126: EQUAL
            pass
            self.mEQUAL()
        elif alt29 == 16:
            # Java.g:1:132: GREATER_OR_EQUAL
            pass
            self.mGREATER_OR_EQUAL()
        elif alt29 == 17:
            # Java.g:1:149: GREATER_THAN
            pass
            self.mGREATER_THAN()
        elif alt29 == 18:
            # Java.g:1:162: INC
            pass
            self.mINC()
        elif alt29 == 19:
            # Java.g:1:166: LBRACK
            pass
            self.mLBRACK()
        elif alt29 == 20:
            # Java.g:1:173: LCURLY
            pass
            self.mLCURLY()
        elif alt29 == 21:
            # Java.g:1:180: LESS_OR_EQUAL
            pass
            self.mLESS_OR_EQUAL()
        elif alt29 == 22:
            # Java.g:1:194: LESS_THAN
            pass
            self.mLESS_THAN()
        elif alt29 == 23:
            # Java.g:1:204: LOGICAL_AND
            pass
            self.mLOGICAL_AND()
        elif alt29 == 24:
            # Java.g:1:216: LOGICAL_NOT
            pass
            self.mLOGICAL_NOT()
        elif alt29 == 25:
            # Java.g:1:228: LOGICAL_OR
            pass
            self.mLOGICAL_OR()
        elif alt29 == 26:
            # Java.g:1:239: LPAREN
            pass
            self.mLPAREN()
        elif alt29 == 27:
            # Java.g:1:246: MINUS
            pass
            self.mMINUS()
        elif alt29 == 28:
            # Java.g:1:252: MINUS_ASSIGN
            pass
            self.mMINUS_ASSIGN()
        elif alt29 == 29:
            # Java.g:1:265: MOD
            pass
            self.mMOD()
        elif alt29 == 30:
            # Java.g:1:269: MOD_ASSIGN
            pass
            self.mMOD_ASSIGN()
        elif alt29 == 31:
            # Java.g:1:280: NOT
            pass
            self.mNOT()
        elif alt29 == 32:
            # Java.g:1:284: NOT_EQUAL
            pass
            self.mNOT_EQUAL()
        elif alt29 == 33:
            # Java.g:1:294: OR
            pass
            self.mOR()
        elif alt29 == 34:
            # Java.g:1:297: OR_ASSIGN
            pass
            self.mOR_ASSIGN()
        elif alt29 == 35:
            # Java.g:1:307: PLUS
            pass
            self.mPLUS()
        elif alt29 == 36:
            # Java.g:1:312: PLUS_ASSIGN
            pass
            self.mPLUS_ASSIGN()
        elif alt29 == 37:
            # Java.g:1:324: QUESTION
            pass
            self.mQUESTION()
        elif alt29 == 38:
            # Java.g:1:333: RBRACK
            pass
            self.mRBRACK()
        elif alt29 == 39:
            # Java.g:1:340: RCURLY
            pass
            self.mRCURLY()
        elif alt29 == 40:
            # Java.g:1:347: RPAREN
            pass
            self.mRPAREN()
        elif alt29 == 41:
            # Java.g:1:354: SEMI
            pass
            self.mSEMI()
        elif alt29 == 42:
            # Java.g:1:359: SHIFT_LEFT
            pass
            self.mSHIFT_LEFT()
        elif alt29 == 43:
            # Java.g:1:370: SHIFT_LEFT_ASSIGN
            pass
            self.mSHIFT_LEFT_ASSIGN()
        elif alt29 == 44:
            # Java.g:1:388: SHIFT_RIGHT
            pass
            self.mSHIFT_RIGHT()
        elif alt29 == 45:
            # Java.g:1:400: SHIFT_RIGHT_ASSIGN
            pass
            self.mSHIFT_RIGHT_ASSIGN()
        elif alt29 == 46:
            # Java.g:1:419: STAR
            pass
            self.mSTAR()
        elif alt29 == 47:
            # Java.g:1:424: STAR_ASSIGN
            pass
            self.mSTAR_ASSIGN()
        elif alt29 == 48:
            # Java.g:1:436: XOR
            pass
            self.mXOR()
        elif alt29 == 49:
            # Java.g:1:440: XOR_ASSIGN
            pass
            self.mXOR_ASSIGN()
        elif alt29 == 50:
            # Java.g:1:451: ABSTRACT
            pass
            self.mABSTRACT()
        elif alt29 == 51:
            # Java.g:1:460: ASSERT
            pass
            self.mASSERT()
        elif alt29 == 52:
            # Java.g:1:467: BOOLEAN
            pass
            self.mBOOLEAN()
        elif alt29 == 53:
            # Java.g:1:475: BREAK
            pass
            self.mBREAK()
        elif alt29 == 54:
            # Java.g:1:481: BYTE
            pass
            self.mBYTE()
        elif alt29 == 55:
            # Java.g:1:486: CASE
            pass
            self.mCASE()
        elif alt29 == 56:
            # Java.g:1:491: CATCH
            pass
            self.mCATCH()
        elif alt29 == 57:
            # Java.g:1:497: CHAR
            pass
            self.mCHAR()
        elif alt29 == 58:
            # Java.g:1:502: CLASS
            pass
            self.mCLASS()
        elif alt29 == 59:
            # Java.g:1:508: CONTINUE
            pass
            self.mCONTINUE()
        elif alt29 == 60:
            # Java.g:1:517: DEFAULT
            pass
            self.mDEFAULT()
        elif alt29 == 61:
            # Java.g:1:525: DO
            pass
            self.mDO()
        elif alt29 == 62:
            # Java.g:1:528: DOUBLE
            pass
            self.mDOUBLE()
        elif alt29 == 63:
            # Java.g:1:535: ELSE
            pass
            self.mELSE()
        elif alt29 == 64:
            # Java.g:1:540: ENUM
            pass
            self.mENUM()
        elif alt29 == 65:
            # Java.g:1:545: EXTENDS
            pass
            self.mEXTENDS()
        elif alt29 == 66:
            # Java.g:1:553: FALSE
            pass
            self.mFALSE()
        elif alt29 == 67:
            # Java.g:1:559: FINAL
            pass
            self.mFINAL()
        elif alt29 == 68:
            # Java.g:1:565: FINALLY
            pass
            self.mFINALLY()
        elif alt29 == 69:
            # Java.g:1:573: FLOAT
            pass
            self.mFLOAT()
        elif alt29 == 70:
            # Java.g:1:579: FOR
            pass
            self.mFOR()
        elif alt29 == 71:
            # Java.g:1:583: IF
            pass
            self.mIF()
        elif alt29 == 72:
            # Java.g:1:586: IMPLEMENTS
            pass
            self.mIMPLEMENTS()
        elif alt29 == 73:
            # Java.g:1:597: INSTANCEOF
            pass
            self.mINSTANCEOF()
        elif alt29 == 74:
            # Java.g:1:608: INTERFACE
            pass
            self.mINTERFACE()
        elif alt29 == 75:
            # Java.g:1:618: IMPORT
            pass
            self.mIMPORT()
        elif alt29 == 76:
            # Java.g:1:625: INT
            pass
            self.mINT()
        elif alt29 == 77:
            # Java.g:1:629: LONG
            pass
            self.mLONG()
        elif alt29 == 78:
            # Java.g:1:634: NATIVE
            pass
            self.mNATIVE()
        elif alt29 == 79:
            # Java.g:1:641: NEW
            pass
            self.mNEW()
        elif alt29 == 80:
            # Java.g:1:645: NULL
            pass
            self.mNULL()
        elif alt29 == 81:
            # Java.g:1:650: PACKAGE
            pass
            self.mPACKAGE()
        elif alt29 == 82:
            # Java.g:1:658: PRIVATE
            pass
            self.mPRIVATE()
        elif alt29 == 83:
            # Java.g:1:666: PROTECTED
            pass
            self.mPROTECTED()
        elif alt29 == 84:
            # Java.g:1:676: PUBLIC
            pass
            self.mPUBLIC()
        elif alt29 == 85:
            # Java.g:1:683: RETURN
            pass
            self.mRETURN()
        elif alt29 == 86:
            # Java.g:1:690: SHORT
            pass
            self.mSHORT()
        elif alt29 == 87:
            # Java.g:1:696: STATIC
            pass
            self.mSTATIC()
        elif alt29 == 88:
            # Java.g:1:703: STRICTFP
            pass
            self.mSTRICTFP()
        elif alt29 == 89:
            # Java.g:1:712: SUPER
            pass
            self.mSUPER()
        elif alt29 == 90:
            # Java.g:1:718: SWITCH
            pass
            self.mSWITCH()
        elif alt29 == 91:
            # Java.g:1:725: SYNCHRONIZED
            pass
            self.mSYNCHRONIZED()
        elif alt29 == 92:
            # Java.g:1:738: THIS
            pass
            self.mTHIS()
        elif alt29 == 93:
            # Java.g:1:743: THROW
            pass
            self.mTHROW()
        elif alt29 == 94:
            # Java.g:1:749: THROWS
            pass
            self.mTHROWS()
        elif alt29 == 95:
            # Java.g:1:756: TRANSIENT
            pass
            self.mTRANSIENT()
        elif alt29 == 96:
            # Java.g:1:766: TRUE
            pass
            self.mTRUE()
        elif alt29 == 97:
            # Java.g:1:771: TRY
            pass
            self.mTRY()
        elif alt29 == 98:
            # Java.g:1:775: VOID
            pass
            self.mVOID()
        elif alt29 == 99:
            # Java.g:1:780: VOLATILE
            pass
            self.mVOLATILE()
        elif alt29 == 100:
            # Java.g:1:789: WHILE
            pass
            self.mWHILE()
        elif alt29 == 101:
            # Java.g:1:795: HEX_LITERAL
            pass
            self.mHEX_LITERAL()
        elif alt29 == 102:
            # Java.g:1:807: DECIMAL_LITERAL
            pass
            self.mDECIMAL_LITERAL()
        elif alt29 == 103:
            # Java.g:1:823: OCTAL_LITERAL
            pass
            self.mOCTAL_LITERAL()
        elif alt29 == 104:
            # Java.g:1:837: FLOATING_POINT_LITERAL
            pass
            self.mFLOATING_POINT_LITERAL()
        elif alt29 == 105:
            # Java.g:1:860: CHARACTER_LITERAL
            pass
            self.mCHARACTER_LITERAL()
        elif alt29 == 106:
            # Java.g:1:878: STRING_LITERAL
            pass
            self.mSTRING_LITERAL()
        elif alt29 == 107:
            # Java.g:1:893: IDENT
            pass
            self.mIDENT()
        elif alt29 == 108:
            # Java.g:1:899: WS
            pass
            self.mWS()
        elif alt29 == 109:
            # Java.g:1:902: COMMENT
            pass
            self.mCOMMENT()
        elif alt29 == 110:
            # Java.g:1:910: LINE_COMMENT
            pass
            self.mLINE_COMMENT()
        elif alt29 == 111:
            # Java.g:1:923: JAVADOC_COMMENT
            pass
            self.mJAVADOC_COMMENT()
    # lookup tables for DFA #29

    DFA29_eot = DFA.unpack(
        u"\1\uffff\1\61\1\63\1\uffff\1\66\2\uffff\1\71\1\75\1\100\1\104\2"
        u"\uffff\1\107\1\111\1\114\1\uffff\1\116\6\uffff\1\120\1\122\17\55"
        u"\2\173\11\uffff\1\177\21\uffff\1\u0083\14\uffff\12\55\1\u0090\7"
        u"\55\1\u0098\23\55\1\uffff\1\u00b3\1\uffff\1\173\1\u00b5\6\uffff"
        u"\14\55\1\uffff\6\55\1\u00c8\1\uffff\2\55\1\u00cd\2\55\1\u00d0\20"
        u"\55\1\u00e1\3\55\3\uffff\4\55\1\u00e9\1\u00ea\1\55\1\u00ec\4\55"
        u"\1\u00f1\1\u00f2\4\55\1\uffff\4\55\1\uffff\1\u00fb\1\55\1\uffff"
        u"\1\u00fd\13\55\1\u0109\2\55\1\u010c\1\uffff\1\u010d\5\55\1\u0113"
        u"\2\uffff\1\u0114\1\uffff\1\u0115\3\55\2\uffff\1\55\1\u011a\1\u011c"
        u"\1\u011d\4\55\1\uffff\1\55\1\uffff\5\55\1\u0128\2\55\1\u012b\2"
        u"\55\1\uffff\1\u012f\1\55\2\uffff\1\55\1\u0132\1\55\1\u0134\1\55"
        u"\3\uffff\2\55\1\u0138\1\55\1\uffff\1\55\2\uffff\1\55\1\u013c\2"
        u"\55\1\u013f\3\55\1\u0143\1\u0144\1\uffff\1\u0145\1\55\1\uffff\1"
        u"\u0147\1\55\1\u0149\1\uffff\2\55\1\uffff\1\55\1\uffff\1\u014d\1"
        u"\55\1\u014f\1\uffff\1\u0150\1\u0151\1\55\1\uffff\2\55\1\uffff\1"
        u"\u0155\1\u0156\1\55\3\uffff\1\55\1\uffff\1\55\1\uffff\2\55\1\u015c"
        u"\1\uffff\1\u015d\3\uffff\3\55\2\uffff\1\55\1\u0162\2\55\1\u0165"
        u"\2\uffff\2\55\1\u0168\1\u0169\1\uffff\1\55\1\u016b\1\uffff\1\u016c"
        u"\1\u016d\2\uffff\1\55\3\uffff\1\55\1\u0170\1\uffff"
        )
    DFA29_eof = DFA.unpack(
        u"\u0171\uffff"
        )
    DFA29_min = DFA.unpack(
        u"\1\11\1\46\1\75\1\uffff\1\75\2\uffff\1\55\2\52\1\53\2\uffff\1\74"
        u"\2\75\1\uffff\1\75\6\uffff\2\75\1\142\1\157\1\141\1\145\1\154\1"
        u"\141\1\146\1\157\2\141\1\145\2\150\1\157\1\150\2\56\11\uffff\1"
        u"\75\6\uffff\1\0\12\uffff\1\75\14\uffff\2\163\1\157\1\145\1\164"
        u"\1\163\2\141\1\156\1\146\1\44\1\163\1\165\1\164\1\154\1\156\1\157"
        u"\1\162\1\44\1\160\1\163\1\156\1\164\1\167\1\154\1\143\1\151\1\142"
        u"\1\164\1\157\1\141\1\160\1\151\1\156\1\151\1\141\2\151\1\uffff"
        u"\1\56\1\uffff\1\56\1\75\6\uffff\1\164\1\145\1\154\1\141\2\145\1"
        u"\143\1\162\1\163\1\164\1\141\1\142\1\uffff\1\145\1\155\1\145\1"
        u"\163\2\141\1\44\1\uffff\1\154\1\164\1\44\1\147\1\151\1\44\1\154"
        u"\1\153\1\166\1\164\1\154\1\165\1\162\1\164\1\151\1\145\1\164\1"
        u"\143\1\163\1\157\1\156\1\145\1\44\1\144\1\141\1\154\3\uffff\2\162"
        u"\1\145\1\153\2\44\1\150\1\44\1\163\1\151\1\165\1\154\2\44\1\156"
        u"\1\145\1\154\1\164\1\uffff\1\145\1\162\1\141\1\162\1\uffff\1\44"
        u"\1\166\1\uffff\1\44\2\141\1\145\1\151\1\162\1\164\1\151\1\143\1"
        u"\162\1\143\1\150\1\44\1\167\1\163\1\44\1\uffff\1\44\1\164\1\145"
        u"\1\141\1\164\1\141\1\44\2\uffff\1\44\1\uffff\1\44\1\156\1\154\1"
        u"\145\2\uffff\1\144\3\44\1\155\1\164\1\156\1\146\1\uffff\1\145\1"
        u"\uffff\1\147\1\164\2\143\1\156\1\44\1\143\1\164\1\44\1\150\1\162"
        u"\1\uffff\1\44\1\151\2\uffff\1\151\1\44\1\143\1\44\1\156\3\uffff"
        u"\1\165\1\164\1\44\1\163\1\uffff\1\171\2\uffff\1\145\1\44\1\143"
        u"\1\141\1\44\2\145\1\164\2\44\1\uffff\1\44\1\146\1\uffff\1\44\1"
        u"\157\1\44\1\uffff\1\145\1\154\1\uffff\1\164\1\uffff\1\44\1\145"
        u"\1\44\1\uffff\2\44\1\156\1\uffff\1\145\1\143\1\uffff\2\44\1\145"
        u"\3\uffff\1\160\1\uffff\1\156\1\uffff\1\156\1\145\1\44\1\uffff\1"
        u"\44\3\uffff\1\164\1\157\1\145\2\uffff\1\144\1\44\1\151\1\164\1"
        u"\44\2\uffff\1\163\1\146\2\44\1\uffff\1\172\1\44\1\uffff\2\44\2"
        u"\uffff\1\145\3\uffff\1\144\1\44\1\uffff"
        )
    DFA29_max = DFA.unpack(
        u"\1\ufaff\2\75\1\uffff\1\76\2\uffff\2\75\1\71\1\75\2\uffff\2\75"
        u"\1\174\1\uffff\1\75\6\uffff\2\75\1\163\1\171\2\157\1\170\1\157"
        u"\1\156\1\157\2\165\1\145\1\171\1\162\1\157\1\150\1\170\1\146\11"
        u"\uffff\1\76\6\uffff\1\uffff\12\uffff\1\75\14\uffff\2\163\1\157"
        u"\1\145\2\164\2\141\1\156\1\146\1\ufaff\1\163\1\165\1\164\1\154"
        u"\1\156\1\157\1\162\1\ufaff\1\160\1\164\1\156\1\164\1\167\1\154"
        u"\1\143\1\157\1\142\1\164\1\157\1\162\1\160\1\151\1\156\1\162\1"
        u"\171\1\154\1\151\1\uffff\1\146\1\uffff\1\146\1\75\6\uffff\1\164"
        u"\1\145\1\154\1\141\2\145\1\143\1\162\1\163\1\164\1\141\1\142\1"
        u"\uffff\1\145\1\155\1\145\1\163\2\141\1\ufaff\1\uffff\1\157\1\164"
        u"\1\ufaff\1\147\1\151\1\ufaff\1\154\1\153\1\166\1\164\1\154\1\165"
        u"\1\162\1\164\1\151\1\145\1\164\1\143\1\163\1\157\1\156\1\145\1"
        u"\ufaff\1\144\1\141\1\154\3\uffff\2\162\1\145\1\153\2\ufaff\1\150"
        u"\1\ufaff\1\163\1\151\1\165\1\154\2\ufaff\1\156\1\145\1\154\1\164"
        u"\1\uffff\1\145\1\162\1\141\1\162\1\uffff\1\ufaff\1\166\1\uffff"
        u"\1\ufaff\2\141\1\145\1\151\1\162\1\164\1\151\1\143\1\162\1\143"
        u"\1\150\1\ufaff\1\167\1\163\1\ufaff\1\uffff\1\ufaff\1\164\1\145"
        u"\1\141\1\164\1\141\1\ufaff\2\uffff\1\ufaff\1\uffff\1\ufaff\1\156"
        u"\1\154\1\145\2\uffff\1\144\3\ufaff\1\155\1\164\1\156\1\146\1\uffff"
        u"\1\145\1\uffff\1\147\1\164\2\143\1\156\1\ufaff\1\143\1\164\1\ufaff"
        u"\1\150\1\162\1\uffff\1\ufaff\1\151\2\uffff\1\151\1\ufaff\1\143"
        u"\1\ufaff\1\156\3\uffff\1\165\1\164\1\ufaff\1\163\1\uffff\1\171"
        u"\2\uffff\1\145\1\ufaff\1\143\1\141\1\ufaff\2\145\1\164\2\ufaff"
        u"\1\uffff\1\ufaff\1\146\1\uffff\1\ufaff\1\157\1\ufaff\1\uffff\1"
        u"\145\1\154\1\uffff\1\164\1\uffff\1\ufaff\1\145\1\ufaff\1\uffff"
        u"\2\ufaff\1\156\1\uffff\1\145\1\143\1\uffff\2\ufaff\1\145\3\uffff"
        u"\1\160\1\uffff\1\156\1\uffff\1\156\1\145\1\ufaff\1\uffff\1\ufaff"
        u"\3\uffff\1\164\1\157\1\145\2\uffff\1\144\1\ufaff\1\151\1\164\1"
        u"\ufaff\2\uffff\1\163\1\146\2\ufaff\1\uffff\1\172\1\ufaff\1\uffff"
        u"\2\ufaff\2\uffff\1\145\3\uffff\1\144\1\ufaff\1\uffff"
        )
    DFA29_accept = DFA.unpack(
        u"\3\uffff\1\4\1\uffff\1\7\1\10\4\uffff\1\23\1\24\3\uffff\1\32\1"
        u"\uffff\1\37\1\45\1\46\1\47\1\50\1\51\23\uffff\1\151\1\152\1\153"
        u"\1\154\1\2\1\27\1\1\1\17\1\3\1\uffff\1\20\1\21\1\11\1\34\1\33\1"
        u"\13\1\uffff\1\156\1\12\1\15\1\16\1\14\1\150\1\22\1\44\1\43\1\25"
        u"\1\uffff\1\26\1\40\1\30\1\31\1\42\1\41\1\36\1\35\1\57\1\56\1\61"
        u"\1\60\46\uffff\1\145\1\uffff\1\146\2\uffff\1\55\1\54\1\155\1\157"
        u"\1\53\1\52\14\uffff\1\75\7\uffff\1\107\32\uffff\1\147\1\6\1\5\22"
        u"\uffff\1\106\4\uffff\1\114\2\uffff\1\117\20\uffff\1\141\7\uffff"
        u"\1\66\1\67\1\uffff\1\71\4\uffff\1\77\1\100\10\uffff\1\115\1\uffff"
        u"\1\120\13\uffff\1\134\2\uffff\1\140\1\142\5\uffff\1\65\1\70\1\72"
        u"\4\uffff\1\102\1\uffff\1\103\1\105\12\uffff\1\126\2\uffff\1\131"
        u"\3\uffff\1\135\2\uffff\1\144\1\uffff\1\63\3\uffff\1\76\3\uffff"
        u"\1\113\2\uffff\1\116\3\uffff\1\124\1\125\1\127\1\uffff\1\132\1"
        u"\uffff\1\136\3\uffff\1\64\1\uffff\1\74\1\101\1\104\3\uffff\1\121"
        u"\1\122\5\uffff\1\62\1\73\4\uffff\1\130\2\uffff\1\143\2\uffff\1"
        u"\112\1\123\1\uffff\1\137\1\110\1\111\2\uffff\1\133"
        )
    DFA29_special = DFA.unpack(
        u"\73\uffff\1\0\u0135\uffff"
        )
    DFA29_transition = [
        DFA.unpack(u"\2\56\1\uffff\2\56\22\uffff\1\56\1\16\1\54\1\uffff\1"
        u"\55\1\21\1\1\1\53\1\20\1\26\1\30\1\12\1\6\1\7\1\11\1\10\1\51\11"
        u"\52\1\5\1\27\1\15\1\2\1\4\1\23\1\3\32\55\1\13\1\uffff\1\24\1\31"
        u"\1\55\1\uffff\1\32\1\33\1\34\1\35\1\36\1\37\2\55\1\40\2\55\1\41"
        u"\1\55\1\42\1\55\1\43\1\55\1\44\1\45\1\46\1\55\1\47\1\50\3\55\1"
        u"\14\1\17\1\25\1\22\101\uffff\27\55\1\uffff\37\55\1\uffff\u1f08"
        u"\55\u1040\uffff\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e"
        u"\55\u10d2\uffff\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\60\26\uffff\1\57"),
        DFA.unpack(u"\1\62"),
        DFA.unpack(u""),
        DFA.unpack(u"\1\65\1\64"),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u"\1\67\17\uffff\1\70"),
        DFA.unpack(u"\1\73\4\uffff\1\74\15\uffff\1\72"),
        DFA.unpack(u"\1\76\3\uffff\1\77\1\uffff\12\101"),
        DFA.unpack(u"\1\102\21\uffff\1\103"),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u"\1\106\1\105"),
        DFA.unpack(u"\1\110"),
        DFA.unpack(u"\1\113\76\uffff\1\112"),
        DFA.unpack(u""),
        DFA.unpack(u"\1\115"),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u"\1\117"),
        DFA.unpack(u"\1\121"),
        DFA.unpack(u"\1\123\20\uffff\1\124"),
        DFA.unpack(u"\1\125\2\uffff\1\126\6\uffff\1\127"),
        DFA.unpack(u"\1\130\6\uffff\1\131\3\uffff\1\132\2\uffff\1\133"),
        DFA.unpack(u"\1\134\11\uffff\1\135"),
        DFA.unpack(u"\1\136\1\uffff\1\137\11\uffff\1\140"),
        DFA.unpack(u"\1\141\7\uffff\1\142\2\uffff\1\143\2\uffff\1\144"),
        DFA.unpack(u"\1\145\6\uffff\1\146\1\147"),
        DFA.unpack(u"\1\150"),
        DFA.unpack(u"\1\151\3\uffff\1\152\17\uffff\1\153"),
        DFA.unpack(u"\1\154\20\uffff\1\155\2\uffff\1\156"),
        DFA.unpack(u"\1\157"),
        DFA.unpack(u"\1\160\13\uffff\1\161\1\162\1\uffff\1\163\1\uffff\1"
        u"\164"),
        DFA.unpack(u"\1\165\11\uffff\1\166"),
        DFA.unpack(u"\1\167"),
        DFA.unpack(u"\1\170"),
        DFA.unpack(u"\1\101\1\uffff\10\172\2\101\12\uffff\3\101\21\uffff"
        u"\1\171\13\uffff\3\101\21\uffff\1\171"),
        DFA.unpack(u"\1\101\1\uffff\12\174\12\uffff\3\101\35\uffff\3\101"),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u"\1\176\1\175"),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u"\52\u0080\1\u0081\uffd5\u0080"),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u"\1\u0082"),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u"\1\u0084"),
        DFA.unpack(u"\1\u0085"),
        DFA.unpack(u"\1\u0086"),
        DFA.unpack(u"\1\u0087"),
        DFA.unpack(u"\1\u0088"),
        DFA.unpack(u"\1\u0089\1\u008a"),
        DFA.unpack(u"\1\u008b"),
        DFA.unpack(u"\1\u008c"),
        DFA.unpack(u"\1\u008d"),
        DFA.unpack(u"\1\u008e"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\24\55\1\u008f\5\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08"
        u"\55\u1040\uffff\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e"
        u"\55\u10d2\uffff\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u0091"),
        DFA.unpack(u"\1\u0092"),
        DFA.unpack(u"\1\u0093"),
        DFA.unpack(u"\1\u0094"),
        DFA.unpack(u"\1\u0095"),
        DFA.unpack(u"\1\u0096"),
        DFA.unpack(u"\1\u0097"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u0099"),
        DFA.unpack(u"\1\u009a\1\u009b"),
        DFA.unpack(u"\1\u009c"),
        DFA.unpack(u"\1\u009d"),
        DFA.unpack(u"\1\u009e"),
        DFA.unpack(u"\1\u009f"),
        DFA.unpack(u"\1\u00a0"),
        DFA.unpack(u"\1\u00a1\5\uffff\1\u00a2"),
        DFA.unpack(u"\1\u00a3"),
        DFA.unpack(u"\1\u00a4"),
        DFA.unpack(u"\1\u00a5"),
        DFA.unpack(u"\1\u00a6\20\uffff\1\u00a7"),
        DFA.unpack(u"\1\u00a8"),
        DFA.unpack(u"\1\u00a9"),
        DFA.unpack(u"\1\u00aa"),
        DFA.unpack(u"\1\u00ab\10\uffff\1\u00ac"),
        DFA.unpack(u"\1\u00ad\23\uffff\1\u00ae\3\uffff\1\u00af"),
        DFA.unpack(u"\1\u00b0\2\uffff\1\u00b1"),
        DFA.unpack(u"\1\u00b2"),
        DFA.unpack(u""),
        DFA.unpack(u"\1\101\1\uffff\10\172\2\101\12\uffff\3\101\35\uffff"
        u"\3\101"),
        DFA.unpack(u""),
        DFA.unpack(u"\1\101\1\uffff\12\174\12\uffff\3\101\35\uffff\3\101"),
        DFA.unpack(u"\1\u00b4"),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u"\1\u00b6"),
        DFA.unpack(u"\1\u00b7"),
        DFA.unpack(u"\1\u00b8"),
        DFA.unpack(u"\1\u00b9"),
        DFA.unpack(u"\1\u00ba"),
        DFA.unpack(u"\1\u00bb"),
        DFA.unpack(u"\1\u00bc"),
        DFA.unpack(u"\1\u00bd"),
        DFA.unpack(u"\1\u00be"),
        DFA.unpack(u"\1\u00bf"),
        DFA.unpack(u"\1\u00c0"),
        DFA.unpack(u"\1\u00c1"),
        DFA.unpack(u""),
        DFA.unpack(u"\1\u00c2"),
        DFA.unpack(u"\1\u00c3"),
        DFA.unpack(u"\1\u00c4"),
        DFA.unpack(u"\1\u00c5"),
        DFA.unpack(u"\1\u00c6"),
        DFA.unpack(u"\1\u00c7"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u""),
        DFA.unpack(u"\1\u00c9\2\uffff\1\u00ca"),
        DFA.unpack(u"\1\u00cb"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\4\55\1\u00cc\25\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08"
        u"\55\u1040\uffff\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e"
        u"\55\u10d2\uffff\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u00ce"),
        DFA.unpack(u"\1\u00cf"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u00d1"),
        DFA.unpack(u"\1\u00d2"),
        DFA.unpack(u"\1\u00d3"),
        DFA.unpack(u"\1\u00d4"),
        DFA.unpack(u"\1\u00d5"),
        DFA.unpack(u"\1\u00d6"),
        DFA.unpack(u"\1\u00d7"),
        DFA.unpack(u"\1\u00d8"),
        DFA.unpack(u"\1\u00d9"),
        DFA.unpack(u"\1\u00da"),
        DFA.unpack(u"\1\u00db"),
        DFA.unpack(u"\1\u00dc"),
        DFA.unpack(u"\1\u00dd"),
        DFA.unpack(u"\1\u00de"),
        DFA.unpack(u"\1\u00df"),
        DFA.unpack(u"\1\u00e0"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u00e2"),
        DFA.unpack(u"\1\u00e3"),
        DFA.unpack(u"\1\u00e4"),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u"\1\u00e5"),
        DFA.unpack(u"\1\u00e6"),
        DFA.unpack(u"\1\u00e7"),
        DFA.unpack(u"\1\u00e8"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u00eb"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u00ed"),
        DFA.unpack(u"\1\u00ee"),
        DFA.unpack(u"\1\u00ef"),
        DFA.unpack(u"\1\u00f0"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u00f3"),
        DFA.unpack(u"\1\u00f4"),
        DFA.unpack(u"\1\u00f5"),
        DFA.unpack(u"\1\u00f6"),
        DFA.unpack(u""),
        DFA.unpack(u"\1\u00f7"),
        DFA.unpack(u"\1\u00f8"),
        DFA.unpack(u"\1\u00f9"),
        DFA.unpack(u"\1\u00fa"),
        DFA.unpack(u""),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u00fc"),
        DFA.unpack(u""),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u00fe"),
        DFA.unpack(u"\1\u00ff"),
        DFA.unpack(u"\1\u0100"),
        DFA.unpack(u"\1\u0101"),
        DFA.unpack(u"\1\u0102"),
        DFA.unpack(u"\1\u0103"),
        DFA.unpack(u"\1\u0104"),
        DFA.unpack(u"\1\u0105"),
        DFA.unpack(u"\1\u0106"),
        DFA.unpack(u"\1\u0107"),
        DFA.unpack(u"\1\u0108"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u010a"),
        DFA.unpack(u"\1\u010b"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u""),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u010e"),
        DFA.unpack(u"\1\u010f"),
        DFA.unpack(u"\1\u0110"),
        DFA.unpack(u"\1\u0111"),
        DFA.unpack(u"\1\u0112"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u""),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u0116"),
        DFA.unpack(u"\1\u0117"),
        DFA.unpack(u"\1\u0118"),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u"\1\u0119"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\13\55\1\u011b\16\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08"
        u"\55\u1040\uffff\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e"
        u"\55\u10d2\uffff\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u011e"),
        DFA.unpack(u"\1\u011f"),
        DFA.unpack(u"\1\u0120"),
        DFA.unpack(u"\1\u0121"),
        DFA.unpack(u""),
        DFA.unpack(u"\1\u0122"),
        DFA.unpack(u""),
        DFA.unpack(u"\1\u0123"),
        DFA.unpack(u"\1\u0124"),
        DFA.unpack(u"\1\u0125"),
        DFA.unpack(u"\1\u0126"),
        DFA.unpack(u"\1\u0127"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u0129"),
        DFA.unpack(u"\1\u012a"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u012c"),
        DFA.unpack(u"\1\u012d"),
        DFA.unpack(u""),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\22\55\1\u012e\7\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08"
        u"\55\u1040\uffff\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e"
        u"\55\u10d2\uffff\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u0130"),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u"\1\u0131"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u0133"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u0135"),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u"\1\u0136"),
        DFA.unpack(u"\1\u0137"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u0139"),
        DFA.unpack(u""),
        DFA.unpack(u"\1\u013a"),
        DFA.unpack(u""),
        DFA.unpack(u""),
        DFA.unpack(u"\1\u013b"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u013d"),
        DFA.unpack(u"\1\u013e"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u0140"),
        DFA.unpack(u"\1\u0141"),
        DFA.unpack(u"\1\u0142"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u""),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u0146"),
        DFA.unpack(u""),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u0148"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u""),
        DFA.unpack(u"\1\u014a"),
        DFA.unpack(u"\1\u014b"),
        DFA.unpack(u""),
        DFA.unpack(u"\1\u014c"),
        DFA.unpack(u""),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u014e"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u""),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
        u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
        u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
        u"\u5200\55\u5900\uffff\u0200\55"),
        DFA.unpack(u"\1\u0152"),
        DFA.unpack(u""),
        DFA.unpack(u"\1\u0153"),
        DFA.unpack(u"\1\u0154"),
        DFA.unpack(u""),
        DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
u"\u5200\55\u5900\uffff\u0200\55"),
DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
u"\u5200\55\u5900\uffff\u0200\55"),
DFA.unpack(u"\1\u0157"),
DFA.unpack(u""),
DFA.unpack(u""),
DFA.unpack(u""),
DFA.unpack(u"\1\u0158"),
DFA.unpack(u""),
DFA.unpack(u"\1\u0159"),
DFA.unpack(u""),
DFA.unpack(u"\1\u015a"),
DFA.unpack(u"\1\u015b"),
DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
u"\u5200\55\u5900\uffff\u0200\55"),
DFA.unpack(u""),
DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
u"\u5200\55\u5900\uffff\u0200\55"),
DFA.unpack(u""),
DFA.unpack(u""),
DFA.unpack(u""),
DFA.unpack(u"\1\u015e"),
DFA.unpack(u"\1\u015f"),
DFA.unpack(u"\1\u0160"),
DFA.unpack(u""),
DFA.unpack(u""),
DFA.unpack(u"\1\u0161"),
DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
u"\u5200\55\u5900\uffff\u0200\55"),
DFA.unpack(u"\1\u0163"),
DFA.unpack(u"\1\u0164"),
DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
u"\u5200\55\u5900\uffff\u0200\55"),
DFA.unpack(u""),
DFA.unpack(u""),
DFA.unpack(u"\1\u0166"),
DFA.unpack(u"\1\u0167"),
DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
u"\u5200\55\u5900\uffff\u0200\55"),
DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
u"\u5200\55\u5900\uffff\u0200\55"),
DFA.unpack(u""),
DFA.unpack(u"\1\u016a"),
DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
u"\u5200\55\u5900\uffff\u0200\55"),
DFA.unpack(u""),
DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
u"\u5200\55\u5900\uffff\u0200\55"),
DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
u"\u5200\55\u5900\uffff\u0200\55"),
DFA.unpack(u""),
DFA.unpack(u""),
DFA.unpack(u"\1\u016e"),
DFA.unpack(u""),
DFA.unpack(u""),
DFA.unpack(u""),
DFA.unpack(u"\1\u016f"),
DFA.unpack(u"\1\55\13\uffff\12\55\7\uffff\32\55\4\uffff\1\55\1\uffff"
u"\32\55\105\uffff\27\55\1\uffff\37\55\1\uffff\u1f08\55\u1040\uffff"
u"\u0150\55\u0170\uffff\u0080\55\u0080\uffff\u092e\55\u10d2\uffff"
u"\u5200\55\u5900\uffff\u0200\55"),
DFA.unpack(u"")
]
# class definition for DFA #29

class DFA29(DFA):
    pass

    def specialStateTransition(self_, s, input):
        # convince pylint that my self_ magic is ok ;)
        # pylint: disable-msg=E0213

        # pretend we are a member of the recognizer
        # thus semantic predicates can be evaluated
        self = self_.recognizer

        _s = s

        if s == 0:
            LA29_59 = input.LA(1)

            s = -1
            if ((0 <= LA29_59 <= 41) or (43 <= LA29_59 <= 65535)):
                s = 128
            elif (LA29_59 == 42):
                s = 129

            if s >= 0:
                return s

        nvae = NoViableAltException(self_.getDescription(), 29, _s, input)
        self_.error(nvae)
        raise nvae
def main(argv, stdin=sys.stdin, stdout=sys.stdout, stderr=sys.stderr):
    from antlr3.main import LexerMain
    main = LexerMain(JavaLexer)
    main.stdin = stdin
    main.stdout = stdout
    main.stderr = stderr
    main.execute(argv)


if __name__ == '__main__':
    main(sys.argv)
test = {
'name': 'multiples_3',
'points': 1,
'suites': [
{
'cases': [
{
'code': r"""
scm> (car multiples-of-three)
3
scm> (list? (cdr multiples-of-three)) ; Check to make sure variable contains a stream
#f
scm> (list? (cdr (cdr-stream multiples-of-three))) ; Check to make sure rest of stream is a stream
#f
scm> (equal? (first-k multiples-of-three 5) '(3 6 9 12 15))
#t
scm> (equal? (first-k multiples-of-three 10) '(3 6 9 12 15 18 21 24 27 30))
#t
scm> (length (first-k multiples-of-three 100))
100
""",
'hidden': False,
'locked': False
}
],
'scored': True,
'setup': r"""
scm> (load-all ".")
scm> (define (first-k s k) (if (or (null? s) (= k 0)) nil (cons (car s) (first-k (cdr-stream s) (- k 1)))))
scm> (define (length lst) (if (null? lst) 0 (+ 1 (length (cdr lst)))))
""",
'teardown': '',
'type': 'scheme'
}
]
}
#
# PySNMP MIB module NBS-OTNPM-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/NBS-OTNPM-MIB
# Produced by pysmi-0.3.4 at Wed May 1 14:17:31 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
Integer, ObjectIdentifier, OctetString = mibBuilder.importSymbols("ASN1", "Integer", "ObjectIdentifier", "OctetString")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ConstraintsUnion, ValueSizeConstraint, SingleValueConstraint, ValueRangeConstraint, ConstraintsIntersection = mibBuilder.importSymbols("ASN1-REFINEMENT", "ConstraintsUnion", "ValueSizeConstraint", "SingleValueConstraint", "ValueRangeConstraint", "ConstraintsIntersection")
InterfaceIndex, ifAlias = mibBuilder.importSymbols("IF-MIB", "InterfaceIndex", "ifAlias")
nbs, WritableU64, Unsigned64 = mibBuilder.importSymbols("NBS-MIB", "nbs", "WritableU64", "Unsigned64")
NotificationGroup, ModuleCompliance = mibBuilder.importSymbols("SNMPv2-CONF", "NotificationGroup", "ModuleCompliance")
Unsigned32, IpAddress, Bits, Counter64, ObjectIdentity, Counter32, iso, Integer32, MibIdentifier, TimeTicks, ModuleIdentity, NotificationType, Gauge32, MibScalar, MibTable, MibTableRow, MibTableColumn = mibBuilder.importSymbols("SNMPv2-SMI", "Unsigned32", "IpAddress", "Bits", "Counter64", "ObjectIdentity", "Counter32", "iso", "Integer32", "MibIdentifier", "TimeTicks", "ModuleIdentity", "NotificationType", "Gauge32", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn")
DisplayString, TextualConvention = mibBuilder.importSymbols("SNMPv2-TC", "DisplayString", "TextualConvention")
nbsOtnpmMib = ModuleIdentity((1, 3, 6, 1, 4, 1, 629, 222))
if mibBuilder.loadTexts: nbsOtnpmMib.setLastUpdated('201401230000Z')
if mibBuilder.loadTexts: nbsOtnpmMib.setOrganization('NBS')
if mibBuilder.loadTexts: nbsOtnpmMib.setContactInfo('For technical support, please contact your service channel')
if mibBuilder.loadTexts: nbsOtnpmMib.setDescription('OTN Performance Monitoring and user-controlled statistics')
nbsOtnpmThresholdsGrp = ObjectIdentity((1, 3, 6, 1, 4, 1, 629, 222, 1))
if mibBuilder.loadTexts: nbsOtnpmThresholdsGrp.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmThresholdsGrp.setDescription('Maximum considered safe by user')
nbsOtnpmCurrentGrp = ObjectIdentity((1, 3, 6, 1, 4, 1, 629, 222, 2))
if mibBuilder.loadTexts: nbsOtnpmCurrentGrp.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmCurrentGrp.setDescription('Subtotals and statistics for sample now underway')
nbsOtnpmHistoricGrp = ObjectIdentity((1, 3, 6, 1, 4, 1, 629, 222, 3))
if mibBuilder.loadTexts: nbsOtnpmHistoricGrp.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmHistoricGrp.setDescription('Totals and final statistics for a previous sample')
nbsOtnpmRunningGrp = ObjectIdentity((1, 3, 6, 1, 4, 1, 629, 222, 4))
if mibBuilder.loadTexts: nbsOtnpmRunningGrp.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmRunningGrp.setDescription('Totals and statistics since (boot-up) protocol configuration')
nbsOtnAlarmsGrp = ObjectIdentity((1, 3, 6, 1, 4, 1, 629, 222, 80))
if mibBuilder.loadTexts: nbsOtnAlarmsGrp.setStatus('current')
if mibBuilder.loadTexts: nbsOtnAlarmsGrp.setDescription('OTN alarms')
nbsOtnStatsGrp = ObjectIdentity((1, 3, 6, 1, 4, 1, 629, 222, 90))
if mibBuilder.loadTexts: nbsOtnStatsGrp.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsGrp.setDescription('User-controlled OTN alarms and statistics')
nbsOtnpmEventsGrp = ObjectIdentity((1, 3, 6, 1, 4, 1, 629, 222, 100))
if mibBuilder.loadTexts: nbsOtnpmEventsGrp.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmEventsGrp.setDescription('Threshold crossing events')
nbsOtnpmTraps = ObjectIdentity((1, 3, 6, 1, 4, 1, 629, 222, 100, 0))
if mibBuilder.loadTexts: nbsOtnpmTraps.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmTraps.setDescription('Threshold crossing Traps or Notifications')
class NbsOtnAlarmId(TextualConvention, Integer32):
description = 'OTN alarm id, also used to identify a mask bit'
status = 'current'
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53))
namedValues = NamedValues(("aLOS", 1), ("aLOF", 2), ("aOOF", 3), ("aLOM", 4), ("aOOM", 5), ("aRxLOL", 6), ("aTxLOL", 7), ("aOtuAIS", 8), ("aSectBDI", 9), ("aSectBIAE", 10), ("aSectIAE", 11), ("aSectTIM", 12), ("aOduAIS", 13), ("aOduOCI", 14), ("aOduLCK", 15), ("aPathBDI", 16), ("aPathTIM", 17), ("aTcm1BDI", 18), ("aTcm2BDI", 19), ("aTcm3BDI", 20), ("aTcm4BDI", 21), ("aTcm5BDI", 22), ("aTcm6BDI", 23), ("aTcm1BIAE", 24), ("aTcm2BIAE", 25), ("aTcm3BIAE", 26), ("aTcm4BIAE", 27), ("aTcm5BIAE", 28), ("aTcm6BIAE", 29), ("aTcm1IAE", 30), ("aTcm2IAE", 31), ("aTcm3IAE", 32), ("aTcm4IAE", 33), ("aTcm5IAE", 34), ("aTcm6IAE", 35), ("aTcm1LTC", 36), ("aTcm2LTC", 37), ("aTcm3LTC", 38), ("aTcm4LTC", 39), ("aTcm5LTC", 40), ("aTcm6LTC", 41), ("aTcm1TIM", 42), ("aTcm2TIM", 43), ("aTcm3TIM", 44), ("aTcm4TIM", 45), ("aTcm5TIM", 46), ("aTcm6TIM", 47), ("aFwdSF", 48), ("aFwdSD", 49), ("aBwdSF", 50), ("aBwdSD", 51), ("aPTM", 52), ("aCSF", 53))
class NbsOtnAlarmMask(TextualConvention, OctetString):
description = 'OTN alarm mask, encoded within an octet string. The bit assigned to a particular alarm (id from NbsOtnAlarmId) is calculated by: index = id/8; bit = id%8; where the leftmost bit (msb) is deemed as bit 0. The mask length is either full-size or zero if not supported.'
status = 'current'
subtypeSpec = OctetString.subtypeSpec + ValueSizeConstraint(0, 7)
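# Illustrative sketch (not part of the pysmi-generated module): decoding an
# NbsOtnAlarmMask octet string per the convention above -- index = id/8,
# bit = id%8, with the leftmost bit (msb) of each octet counted as bit 0.
# The helper name is hypothetical.

```python
def otn_alarm_is_set(mask: bytes, alarm_id: int) -> bool:
    """True if the bit for the given NbsOtnAlarmId is set in the mask.

    A zero-length mask means the alarm mask is not supported.
    """
    index, bit = divmod(alarm_id, 8)
    if index >= len(mask):
        return False
    # bit 0 is the msb of the octet, so shift 0x80 right by the bit position
    return bool(mask[index] & (0x80 >> bit))

# Example: alarm id 1 (aLOS) maps to octet 0, msb-relative bit 1, i.e. 0x40
```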
nbsOtnpmThresholdsTable = MibTable((1, 3, 6, 1, 4, 1, 629, 222, 1, 1), )
if mibBuilder.loadTexts: nbsOtnpmThresholdsTable.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmThresholdsTable.setDescription('OTN Performance Monitoring thresholds')
nbsOtnpmThresholdsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 629, 222, 1, 1, 1), ).setIndexNames((0, "NBS-OTNPM-MIB", "nbsOtnpmThresholdsIfIndex"), (0, "NBS-OTNPM-MIB", "nbsOtnpmThresholdsInterval"), (0, "NBS-OTNPM-MIB", "nbsOtnpmThresholdsScope"))
if mibBuilder.loadTexts: nbsOtnpmThresholdsEntry.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmThresholdsEntry.setDescription('Performance monitoring thresholds for a particular interface')
nbsOtnpmThresholdsIfIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 1, 1, 1, 1), InterfaceIndex()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmThresholdsIfIndex.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmThresholdsIfIndex.setDescription('The mib2 ifIndex')
nbsOtnpmThresholdsInterval = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 1, 1, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("quarterHour", 1), ("twentyfourHour", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmThresholdsInterval.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmThresholdsInterval.setDescription('Indicates the sampling period to which these thresholds apply')
nbsOtnpmThresholdsScope = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 1, 1, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7, 8))).clone(namedValues=NamedValues(("tcm1", 1), ("tcm2", 2), ("tcm3", 3), ("tcm4", 4), ("tcm5", 5), ("tcm6", 6), ("section", 7), ("path", 8)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmThresholdsScope.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmThresholdsScope.setDescription('This object specifies the network segment to which these thresholds apply.')
nbsOtnpmThresholdsEs = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 1, 1, 1, 10), Unsigned32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: nbsOtnpmThresholdsEs.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmThresholdsEs.setDescription('Persistent. The number of Errored Seconds (ES) which, if met or exceeded at the end of the nbsOtnpmThresholdsInterval period, should trigger the nbsOtnpmTrapsEs event notification. The reserved value 0 disables notifications for this event.')
nbsOtnpmThresholdsEsrSig = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 1, 1, 1, 11), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: nbsOtnpmThresholdsEsrSig.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmThresholdsEsrSig.setDescription('Persistent. The significand of the Errored Seconds Ratio (ESR) threshold, which is calculated by: nbsOtnpmThresholdsEsrSig x 10^nbsOtnpmThresholdsEsrExp An ESR that meets or exceeds this threshold at the end of the nbsOtnpmThresholdsInterval period triggers the nbsOtnpmTrapsEsr event notification. The reserved value 0 disables notifications for this event.')
nbsOtnpmThresholdsEsrExp = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 1, 1, 1, 12), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-2147483648, 2147483647))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: nbsOtnpmThresholdsEsrExp.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmThresholdsEsrExp.setDescription('Persistent. The exponent of the Errored Seconds Ratio (ESR) threshold; see nbsOtnpmThresholdsEsrSig. Not supported value: 0x80000000')
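# Illustrative sketch (not part of the pysmi-generated module): the ESR, SESR
# and BBER thresholds above are each encoded as a significand/exponent pair,
# threshold = sig * 10^exp, with sig == 0 reserved to disable the
# notification. The helper name is hypothetical.

```python
def otn_ratio_exceeded(sig: int, exp: int, observed: float) -> bool:
    """True if an observed ratio meets or exceeds the sig * 10**exp
    threshold; a significand of 0 disables the notification entirely."""
    if sig == 0:
        return False
    return observed >= sig * (10.0 ** exp)

# Example: sig=15, exp=-4 encodes a threshold ratio of 0.0015
```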
nbsOtnpmThresholdsSes = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 1, 1, 1, 13), Unsigned32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: nbsOtnpmThresholdsSes.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmThresholdsSes.setDescription('Persistent. The number of Severely Errored Seconds (SES) which, if met or exceeded at the end of the nbsOtnpmThresholdsInterval period, should trigger the nbsOtnpmTrapsSes event notification. The reserved value 0 disables notifications for this event.')
nbsOtnpmThresholdsSesrSig = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 1, 1, 1, 14), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: nbsOtnpmThresholdsSesrSig.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmThresholdsSesrSig.setDescription('Persistent. The significand of the Severely Errored Seconds Ratio (SESR) threshold, which is calculated by: nbsOtnpmThresholdsSesrSig x 10^nbsOtnpmThresholdsSesrExp A SESR that meets or exceeds this threshold at the end of the nbsOtnpmThresholdsInterval period triggers the nbsOtnpmTrapsSesr notification. The reserved value 0 disables notifications for this event.')
nbsOtnpmThresholdsSesrExp = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 1, 1, 1, 15), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-2147483648, 2147483647))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: nbsOtnpmThresholdsSesrExp.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmThresholdsSesrExp.setDescription('Persistent. The exponent of the Severely Errored Seconds Ratio (SESR) threshold; see nbsOtnpmThresholdsSesrSig. Not supported value: 0x80000000')
nbsOtnpmThresholdsBbe = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 1, 1, 1, 16), WritableU64()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: nbsOtnpmThresholdsBbe.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmThresholdsBbe.setDescription('Persistent. The number of Background Block Errors (BBE) which, if met or exceeded at the end of the nbsOtnpmThresholdsInterval period, should trigger the nbsOtnpmTrapsBbe event notification. The reserved value 0 disables notifications for this event.')
nbsOtnpmThresholdsBberSig = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 1, 1, 1, 17), Integer32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: nbsOtnpmThresholdsBberSig.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmThresholdsBberSig.setDescription('Persistent. The significand of the Background Block Errors Ratio (BBER) threshold, which is calculated by: nbsOtnpmThresholdsBberSig x 10^nbsOtnpmThresholdsBberExp A BBER that meets or exceeds this threshold at the end of the nbsOtnpmThresholdsInterval period triggers the nbsOtnpmTrapsBber notification. The reserved value 0 disables notifications for this event.')
nbsOtnpmThresholdsBberExp = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 1, 1, 1, 18), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-2147483648, 2147483647))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: nbsOtnpmThresholdsBberExp.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmThresholdsBberExp.setDescription('Persistent. The exponent of the Background Block Errors Ratio (BBER) threshold; see nbsOtnpmThresholdsBberSig. Not supported value: 0x80000000')
nbsOtnpmThresholdsUas = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 1, 1, 1, 19), Unsigned32()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: nbsOtnpmThresholdsUas.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmThresholdsUas.setDescription('Persistent. The number of Unavailable Seconds (UAS) which, if met or exceeded at the end of the nbsOtnpmThresholdsInterval period, should trigger the nbsOtnpmTrapsUas event notification. The reserved value 0 disables notifications for this event.')
nbsOtnpmThresholdsFc = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 1, 1, 1, 20), WritableU64()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: nbsOtnpmThresholdsFc.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmThresholdsFc.setDescription('Persistent. The number of Failure Counts (FC) which, if met or exceeded at the end of the nbsOtnpmThresholdsInterval period, should trigger the nbsOtnpmTrapsFc event notification. The reserved value 0 disables notifications for this event.')
nbsOtnpmCurrentTable = MibTable((1, 3, 6, 1, 4, 1, 629, 222, 2, 3), )
if mibBuilder.loadTexts: nbsOtnpmCurrentTable.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmCurrentTable.setDescription('All OTN Performance Monitoring statistics for the nbsOtnpmCurrentInterval now underway.')
nbsOtnpmCurrentEntry = MibTableRow((1, 3, 6, 1, 4, 1, 629, 222, 2, 3, 1), ).setIndexNames((0, "NBS-OTNPM-MIB", "nbsOtnpmCurrentIfIndex"), (0, "NBS-OTNPM-MIB", "nbsOtnpmCurrentInterval"), (0, "NBS-OTNPM-MIB", "nbsOtnpmCurrentScope"))
if mibBuilder.loadTexts: nbsOtnpmCurrentEntry.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmCurrentEntry.setDescription('OTN Performance Monitoring statistics for a specific port/ interface and nbsOtnpmCurrentInterval.')
nbsOtnpmCurrentIfIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 2, 3, 1, 1), InterfaceIndex()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmCurrentIfIndex.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmCurrentIfIndex.setDescription('The mib2 ifIndex')
nbsOtnpmCurrentInterval = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 2, 3, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("quarterHour", 1), ("twentyfourHour", 2)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmCurrentInterval.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmCurrentInterval.setDescription('Indicates the sampling period of statistic')
nbsOtnpmCurrentScope = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 2, 3, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7, 8))).clone(namedValues=NamedValues(("tcm1", 1), ("tcm2", 2), ("tcm3", 3), ("tcm4", 4), ("tcm5", 5), ("tcm6", 6), ("section", 7), ("path", 8)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmCurrentScope.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmCurrentScope.setDescription("Indicates statistic's network segment")
nbsOtnpmCurrentDate = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 2, 3, 1, 5), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmCurrentDate.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmCurrentDate.setDescription('The date (UTC) this interval began, represented by an eight digit decimal number: yyyymmdd')
nbsOtnpmCurrentTime = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 2, 3, 1, 6), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmCurrentTime.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmCurrentTime.setDescription('The time (UTC) this interval began, represented by a six digit decimal number: hhmmss')
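# Illustrative sketch (not part of the pysmi-generated module): combining the
# nbsOtnpmCurrentDate (yyyymmdd) and nbsOtnpmCurrentTime (hhmmss) decimal
# integers into a UTC datetime. The helper name is hypothetical.

```python
from datetime import datetime

def otn_interval_start(date_val: int, time_val: int) -> datetime:
    """Decode the interval start from its yyyymmdd and hhmmss integers."""
    # zero-pad both fields so early-morning times keep their leading zeros
    return datetime.strptime("%08d%06d" % (date_val, time_val), "%Y%m%d%H%M%S")
```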
nbsOtnpmCurrentEs = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 2, 3, 1, 10), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmCurrentEs.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmCurrentEs.setDescription('The number of Errored Seconds (ES) in this interval so far.')
nbsOtnpmCurrentEsrSig = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 2, 3, 1, 11), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmCurrentEsrSig.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmCurrentEsrSig.setDescription('The significand of the current Errored Seconds Ratio (ESR), which is calculated by: nbsOtnpmCurrentEsrSig x 10^nbsOtnpmCurrentEsrExp')
nbsOtnpmCurrentEsrExp = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 2, 3, 1, 12), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-2147483648, 2147483647))).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmCurrentEsrExp.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmCurrentEsrExp.setDescription('The exponent of the current Errored Seconds Ratio (ESR); see nbsOtnpmCurrentEsrSig. Not supported value: 0x80000000')
nbsOtnpmCurrentSes = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 2, 3, 1, 13), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmCurrentSes.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmCurrentSes.setDescription('The number of Severely Errored Seconds (SES) in this interval so far')
nbsOtnpmCurrentSesrSig = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 2, 3, 1, 14), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmCurrentSesrSig.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmCurrentSesrSig.setDescription('The significand of the current Severely Errored Seconds Ratio (SESR), which is calculated by: nbsOtnpmCurrentSesrSig x 10^nbsOtnpmCurrentSesrExp')
nbsOtnpmCurrentSesrExp = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 2, 3, 1, 15), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-2147483648, 2147483647))).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmCurrentSesrExp.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmCurrentSesrExp.setDescription('The exponent of the current Severely Errored Seconds Ratio (SESR); see nbsOtnpmCurrentSesrSig. Not supported value: 0x80000000')
nbsOtnpmCurrentBbe = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 2, 3, 1, 16), Unsigned64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmCurrentBbe.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmCurrentBbe.setDescription('The number of Background Block Errors (BBE) so far, i.e. the count of Bit Interleave Parity (BIP8) errors.')
nbsOtnpmCurrentBberSig = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 2, 3, 1, 17), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmCurrentBberSig.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmCurrentBberSig.setDescription('The significand of the current Background Block Errors (BBER), which is calculated by: nbsOtnpmCurrentBberSig x 10^nbsOtnpmCurrentBberExp')
nbsOtnpmCurrentBberExp = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 2, 3, 1, 18), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-2147483648, 2147483647))).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmCurrentBberExp.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmCurrentBberExp.setDescription('The exponent of the current Background Block Errors Ratio (BBER); see nbsOtnpmCurrentBberSig. Not supported value: 0x80000000')
nbsOtnpmCurrentUas = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 2, 3, 1, 19), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmCurrentUas.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmCurrentUas.setDescription('The number of Unavailable Seconds (UAS) so far')
nbsOtnpmCurrentFc = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 2, 3, 1, 20), Unsigned64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmCurrentFc.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmCurrentFc.setDescription('The number of Failure Counts (FC) so far, i.e. the count of Backward Error Indication (BEI) errors.')
nbsOtnpmCurrentAlarmsSupported = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 2, 3, 1, 100), NbsOtnAlarmMask()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmCurrentAlarmsSupported.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmCurrentAlarmsSupported.setDescription('The mask of OTN alarms that are supported.')
nbsOtnpmCurrentAlarmsRaised = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 2, 3, 1, 101), NbsOtnAlarmMask()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmCurrentAlarmsRaised.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmCurrentAlarmsRaised.setDescription('The mask of OTN alarms that are currently raised.')
nbsOtnpmCurrentAlarmsChanged = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 2, 3, 1, 102), NbsOtnAlarmMask()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmCurrentAlarmsChanged.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmCurrentAlarmsChanged.setDescription('The mask of OTN alarms that have changed so far, i.e. alarms that have transitioned at least once from clear to raised or from raised to clear.')
nbsOtnpmHistoricTable = MibTable((1, 3, 6, 1, 4, 1, 629, 222, 3, 3), )
if mibBuilder.loadTexts: nbsOtnpmHistoricTable.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmHistoricTable.setDescription('All OTN Performance Monitoring statistics for past nbsOtnpmHistoricInterval periods.')
nbsOtnpmHistoricEntry = MibTableRow((1, 3, 6, 1, 4, 1, 629, 222, 3, 3, 1), ).setIndexNames((0, "NBS-OTNPM-MIB", "nbsOtnpmHistoricIfIndex"), (0, "NBS-OTNPM-MIB", "nbsOtnpmHistoricInterval"), (0, "NBS-OTNPM-MIB", "nbsOtnpmHistoricScope"), (0, "NBS-OTNPM-MIB", "nbsOtnpmHistoricSample"))
if mibBuilder.loadTexts: nbsOtnpmHistoricEntry.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmHistoricEntry.setDescription('OTN Performance Monitoring statistics for a specific port/ interface and nbsOtnpmHistoricInterval.')
nbsOtnpmHistoricIfIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 3, 3, 1, 1), InterfaceIndex()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmHistoricIfIndex.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmHistoricIfIndex.setDescription('The mib2 ifIndex')
nbsOtnpmHistoricInterval = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 3, 3, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2))).clone(namedValues=NamedValues(("quarterHour", 1), ("twentyfourHour", 2))))
if mibBuilder.loadTexts: nbsOtnpmHistoricInterval.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmHistoricInterval.setDescription('Indicates the sampling period of this statistic')
nbsOtnpmHistoricScope = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 3, 3, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7, 8))).clone(namedValues=NamedValues(("tcm1", 1), ("tcm2", 2), ("tcm3", 3), ("tcm4", 4), ("tcm5", 5), ("tcm6", 6), ("section", 7), ("path", 8))))
if mibBuilder.loadTexts: nbsOtnpmHistoricScope.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmHistoricScope.setDescription("Indicates the statistic's network segment")
nbsOtnpmHistoricSample = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 3, 3, 1, 4), Integer32())
if mibBuilder.loadTexts: nbsOtnpmHistoricSample.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmHistoricSample.setDescription('Indicates the sample number of this statistic. The most recent sample is numbered 1, the next previous 2, and so on until the oldest sample.')
nbsOtnpmHistoricDate = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 3, 3, 1, 5), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmHistoricDate.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmHistoricDate.setDescription('The date (UTC) the interval began, represented by an eight digit decimal number: yyyymmdd')
nbsOtnpmHistoricTime = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 3, 3, 1, 6), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmHistoricTime.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmHistoricTime.setDescription('The time (UTC) the interval began, represented by a six digit decimal number: hhmmss')
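The interval start is split across two Integer32 columns, a yyyymmdd date and an hhmmss time, both UTC. A minimal helper (names are illustrative, not part of the MIB) can recombine the two values into a single timestamp:

```python
from datetime import datetime

def decode_interval_start(date_val: int, time_val: int) -> datetime:
    """Combine an nbsOtnpmHistoricDate-style value (yyyymmdd) and an
    nbsOtnpmHistoricTime-style value (hhmmss) into one UTC datetime."""
    year, rem = divmod(date_val, 10000)      # 20240131 -> 2024, 0131
    month, day = divmod(rem, 100)            # 0131 -> 1, 31
    hour, rem = divmod(time_val, 10000)      # 154500 -> 15, 4500
    minute, second = divmod(rem, 100)        # 4500 -> 45, 0
    return datetime(year, month, day, hour, minute, second)
```

The same encoding is used by the Running, Alarms, and Stats tables below, so one decoder covers all of them.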
nbsOtnpmHistoricEs = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 3, 3, 1, 10), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmHistoricEs.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmHistoricEs.setDescription('The final count of Errored Seconds (ES) for this interval')
nbsOtnpmHistoricEsrSig = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 3, 3, 1, 11), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmHistoricEsrSig.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmHistoricEsrSig.setDescription('The significand of the final Errored Seconds Ratio (ESR) for this interval, which is calculated by: nbsOtnpmHistoricEsrSig x 10^nbsOtnpmHistoricEsrExp')
nbsOtnpmHistoricEsrExp = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 3, 3, 1, 12), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-2147483648, 2147483647))).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmHistoricEsrExp.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmHistoricEsrExp.setDescription('The exponent of the final Errored Seconds Ratio (ESR) for this interval; see nbsOtnpmHistoricEsrSig. Not supported value: 0x80000000')
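Each ratio (ESR, SESR, BBER) is reported as a significand/exponent pair, with 0x80000000 in the exponent marking "not supported". As a signed Integer32 that sentinel reads back as -2147483648. A small sketch of reconstructing the ratio from such a pair:

```python
# 0x80000000 interpreted as a signed Integer32 is -2147483648
NOT_SUPPORTED = -0x80000000

def ratio_from_sig_exp(sig: int, exp: int):
    """Reconstruct a ratio such as ESR from its significand/exponent pair,
    e.g. nbsOtnpmHistoricEsrSig x 10^nbsOtnpmHistoricEsrExp.
    Returns None when the exponent carries the 'not supported' sentinel."""
    if exp == NOT_SUPPORTED:
        return None
    return sig * 10 ** exp
```

The same decoding applies to the SESR and BBER sig/exp pairs in this table and in the Running table.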
nbsOtnpmHistoricSes = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 3, 3, 1, 13), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmHistoricSes.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmHistoricSes.setDescription('The final count of Severely Errored Seconds (SES) in this interval')
nbsOtnpmHistoricSesrSig = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 3, 3, 1, 14), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmHistoricSesrSig.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmHistoricSesrSig.setDescription('The significand of the final Severely Errored Seconds Ratio (SESR) for this interval, which is calculated by: nbsOtnpmHistoricSesrSig x 10^nbsOtnpmHistoricSesrExp')
nbsOtnpmHistoricSesrExp = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 3, 3, 1, 15), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-2147483648, 2147483647))).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmHistoricSesrExp.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmHistoricSesrExp.setDescription('The exponent of the final Severely Errored Seconds Ratio (SESR) for this interval; see nbsOtnpmHistoricSesrSig. Not supported value: 0x80000000')
nbsOtnpmHistoricBbe = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 3, 3, 1, 16), Unsigned64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmHistoricBbe.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmHistoricBbe.setDescription('The final count of Background Block Errors (BBE), i.e. the count of Bit Interleave Parity (BIP8) errors.')
nbsOtnpmHistoricBberSig = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 3, 3, 1, 17), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmHistoricBberSig.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmHistoricBberSig.setDescription('The significand of the final Background Block Errors Ratio (BBER) for this interval, which is calculated by: nbsOtnpmHistoricBberSig x 10^nbsOtnpmHistoricBberExp')
nbsOtnpmHistoricBberExp = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 3, 3, 1, 18), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-2147483648, 2147483647))).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmHistoricBberExp.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmHistoricBberExp.setDescription('The exponent of the final Background Block Errors Ratio (BBER) for this interval; see nbsOtnpmHistoricBberSig. Not supported value: 0x80000000')
nbsOtnpmHistoricUas = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 3, 3, 1, 19), Unsigned32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmHistoricUas.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmHistoricUas.setDescription('The final count of Unavailable Seconds (UAS)')
nbsOtnpmHistoricFc = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 3, 3, 1, 20), Unsigned64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmHistoricFc.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmHistoricFc.setDescription('The final number of Failure Counts (FC), i.e. the count of Backward Error Indication (BEI) errors.')
nbsOtnpmHistoricAlarmsSupported = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 3, 3, 1, 100), NbsOtnAlarmMask()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmHistoricAlarmsSupported.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmHistoricAlarmsSupported.setDescription('The mask of OTN alarms that were supported.')
nbsOtnpmHistoricAlarmsRaised = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 3, 3, 1, 101), NbsOtnAlarmMask()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmHistoricAlarmsRaised.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmHistoricAlarmsRaised.setDescription('The mask of OTN alarms that were raised at the end of this interval.')
nbsOtnpmHistoricAlarmsChanged = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 3, 3, 1, 102), NbsOtnAlarmMask()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmHistoricAlarmsChanged.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmHistoricAlarmsChanged.setDescription('The mask of OTN alarms that changed in this interval, i.e. alarms that transitioned at least once from clear to raised or from raised to clear.')
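The *AlarmsChanged columns record any alarm that transitioned at least once between clear and raised. One way an observer could maintain such a mask (an illustrative sketch only; the agent's actual implementation is not published) is to OR the XOR of successive raised-mask snapshots into a running changed mask:

```python
def update_changed_mask(prev_raised: int, curr_raised: int, changed: int) -> int:
    """Accumulate a changed-alarms mask in the style of
    nbsOtnpmHistoricAlarmsChanged: any bit that differs between two
    consecutive raised-mask snapshots has transitioned at least once,
    so it is OR-ed into the running changed mask."""
    return changed | (prev_raised ^ curr_raised)
```

Bits already set in `changed` stay set even if the alarm later returns to its original state, matching the "transitioned at least once" wording.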
nbsOtnpmRunningTable = MibTable((1, 3, 6, 1, 4, 1, 629, 222, 4, 3), )
if mibBuilder.loadTexts: nbsOtnpmRunningTable.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmRunningTable.setDescription('All OTN Performance Monitoring statistics since (boot-up) protocol configuration.')
nbsOtnpmRunningEntry = MibTableRow((1, 3, 6, 1, 4, 1, 629, 222, 4, 3, 1), ).setIndexNames((0, "NBS-OTNPM-MIB", "nbsOtnpmRunningIfIndex"), (0, "NBS-OTNPM-MIB", "nbsOtnpmRunningScope"))
if mibBuilder.loadTexts: nbsOtnpmRunningEntry.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmRunningEntry.setDescription('OTN Performance Monitoring statistics for a specific port/interface.')
nbsOtnpmRunningIfIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 4, 3, 1, 1), InterfaceIndex()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmRunningIfIndex.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmRunningIfIndex.setDescription('The mib2 ifIndex')
nbsOtnpmRunningScope = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 4, 3, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5, 6, 7, 8))).clone(namedValues=NamedValues(("tcm1", 1), ("tcm2", 2), ("tcm3", 3), ("tcm4", 4), ("tcm5", 5), ("tcm6", 6), ("section", 7), ("path", 8))))
if mibBuilder.loadTexts: nbsOtnpmRunningScope.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmRunningScope.setDescription("Indicates the statistic's network segment")
nbsOtnpmRunningDate = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 4, 3, 1, 5), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmRunningDate.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmRunningDate.setDescription('The date (UTC) of protocol configuration, represented by an eight digit decimal number: yyyymmdd')
nbsOtnpmRunningTime = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 4, 3, 1, 6), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmRunningTime.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmRunningTime.setDescription('The time (UTC) of protocol configuration, represented by a six digit decimal number: hhmmss')
nbsOtnpmRunningEs = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 4, 3, 1, 10), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmRunningEs.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmRunningEs.setDescription('The number of Errored Seconds (ES) since protocol configuration.')
nbsOtnpmRunningEsrSig = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 4, 3, 1, 11), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmRunningEsrSig.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmRunningEsrSig.setDescription('The significand of the running Errored Seconds Ratio (ESR), which is calculated by: nbsOtnpmRunningEsrSig x 10^nbsOtnpmRunningEsrExp')
nbsOtnpmRunningEsrExp = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 4, 3, 1, 12), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-2147483648, 2147483647))).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmRunningEsrExp.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmRunningEsrExp.setDescription('The exponent of the running Errored Seconds Ratio (ESR); see nbsOtnpmRunningEsrSig. Not supported value: 0x80000000')
nbsOtnpmRunningSes = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 4, 3, 1, 13), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmRunningSes.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmRunningSes.setDescription('The number of Severely Errored Seconds (SES) since protocol configuration')
nbsOtnpmRunningSesrSig = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 4, 3, 1, 14), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmRunningSesrSig.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmRunningSesrSig.setDescription('The significand of the running Severely Errored Seconds Ratio (SESR), which is calculated by: nbsOtnpmRunningSesrSig x 10^nbsOtnpmRunningSesrExp')
nbsOtnpmRunningSesrExp = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 4, 3, 1, 15), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-2147483648, 2147483647))).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmRunningSesrExp.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmRunningSesrExp.setDescription('The exponent of the running Severely Errored Seconds Ratio (SESR); see nbsOtnpmRunningSesrSig. Not supported value: 0x80000000')
nbsOtnpmRunningBbe = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 4, 3, 1, 16), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmRunningBbe.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmRunningBbe.setDescription('The number of Background Block Errors (BBE) since protocol configuration, i.e. the count of Bit Interleave Parity (BIP8) errors.')
nbsOtnpmRunningBberSig = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 4, 3, 1, 17), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmRunningBberSig.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmRunningBberSig.setDescription('The significand of the running Background Block Errors Ratio (BBER), which is calculated by: nbsOtnpmRunningBberSig x 10^nbsOtnpmRunningBberExp')
nbsOtnpmRunningBberExp = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 4, 3, 1, 18), Integer32().subtype(subtypeSpec=ValueRangeConstraint(-2147483648, 2147483647))).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmRunningBberExp.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmRunningBberExp.setDescription('The exponent of the running Background Block Errors Ratio (BBER); see nbsOtnpmRunningBberSig. Not supported value: 0x80000000')
nbsOtnpmRunningUas = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 4, 3, 1, 19), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmRunningUas.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmRunningUas.setDescription('The number of Unavailable Seconds (UAS) since protocol configuration')
nbsOtnpmRunningFc = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 4, 3, 1, 20), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmRunningFc.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmRunningFc.setDescription('The number of Failure Counts (FC) since protocol configuration, i.e. the count of Backward Error Indication (BEI) errors.')
nbsOtnpmRunningAlarmsSupported = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 4, 3, 1, 100), NbsOtnAlarmMask()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmRunningAlarmsSupported.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmRunningAlarmsSupported.setDescription('The mask of OTN alarms that are supported.')
nbsOtnpmRunningAlarmsRaised = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 4, 3, 1, 101), NbsOtnAlarmMask()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmRunningAlarmsRaised.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmRunningAlarmsRaised.setDescription('The mask of OTN alarms that are currently raised.')
nbsOtnpmRunningAlarmsChanged = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 4, 3, 1, 102), NbsOtnAlarmMask()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnpmRunningAlarmsChanged.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmRunningAlarmsChanged.setDescription('The mask of OTN alarms that changed since protocol configuration, i.e. alarms that transitioned at least once from clear to raised or from raised to clear.')
nbsOtnAlarmsTable = MibTable((1, 3, 6, 1, 4, 1, 629, 222, 80, 3), )
if mibBuilder.loadTexts: nbsOtnAlarmsTable.setStatus('current')
if mibBuilder.loadTexts: nbsOtnAlarmsTable.setDescription('OTN alarm monitoring scoreboard, showing for each possible alarm if it is currently raised and if it has changed since monitoring began (or was cleared). The latter indicator may be cleared at anytime without affecting normal performance monitoring activity.')
nbsOtnAlarmsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 629, 222, 80, 3, 1), ).setIndexNames((0, "NBS-OTNPM-MIB", "nbsOtnAlarmsIfIndex"))
if mibBuilder.loadTexts: nbsOtnAlarmsEntry.setStatus('current')
if mibBuilder.loadTexts: nbsOtnAlarmsEntry.setDescription('OTN alarm monitoring scoreboard for a specific port/interface.')
nbsOtnAlarmsIfIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 80, 3, 1, 1), InterfaceIndex()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnAlarmsIfIndex.setStatus('current')
if mibBuilder.loadTexts: nbsOtnAlarmsIfIndex.setDescription('The mib2 ifIndex')
nbsOtnAlarmsDate = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 80, 3, 1, 5), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnAlarmsDate.setStatus('current')
if mibBuilder.loadTexts: nbsOtnAlarmsDate.setDescription('The date (UTC) OTN alarm monitoring began (was cleared), represented by an eight digit decimal number: yyyymmdd')
nbsOtnAlarmsTime = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 80, 3, 1, 6), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnAlarmsTime.setStatus('current')
if mibBuilder.loadTexts: nbsOtnAlarmsTime.setDescription('The time (UTC) OTN alarm monitoring began (was cleared), represented by a six digit decimal number: hhmmss')
nbsOtnAlarmsSpan = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 80, 3, 1, 7), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnAlarmsSpan.setStatus('current')
if mibBuilder.loadTexts: nbsOtnAlarmsSpan.setDescription('The amount of time (deci-sec) since nbsOtnAlarmsDate and nbsOtnAlarmsTime.')
nbsOtnAlarmsState = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 80, 3, 1, 8), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("notSupported", 1), ("monitoring", 2), ("clearing", 3)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: nbsOtnAlarmsState.setStatus('current')
if mibBuilder.loadTexts: nbsOtnAlarmsState.setDescription("This object reads 'notSupported' if the port is not configured with an OTN protocol. Otherwise it reads 'monitoring' to indicate that supported OTN alarms are actively reported in nbsOtnAlarmsRaised and nbsOtnAlarmsChanged. Writing 'clearing' to this object clears nbsOtnAlarmsChanged.")
nbsOtnAlarmsSupported = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 80, 3, 1, 100), NbsOtnAlarmMask()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnAlarmsSupported.setStatus('current')
if mibBuilder.loadTexts: nbsOtnAlarmsSupported.setDescription('The mask of OTN alarms that are supported on this port.')
nbsOtnAlarmsRaised = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 80, 3, 1, 101), NbsOtnAlarmMask()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnAlarmsRaised.setStatus('current')
if mibBuilder.loadTexts: nbsOtnAlarmsRaised.setDescription('The mask of OTN alarms that are currently raised.')
nbsOtnAlarmsChanged = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 80, 3, 1, 102), NbsOtnAlarmMask()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnAlarmsChanged.setStatus('current')
if mibBuilder.loadTexts: nbsOtnAlarmsChanged.setDescription('The mask of OTN alarms that have changed since nbsOtnAlarmsDate and nbsOtnAlarmsTime, i.e. alarms that have transitioned at least once from clear to raised or from raised to clear.')
nbsOtnAlarmsRcvdFTFL = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 80, 3, 1, 110), OctetString().subtype(subtypeSpec=ValueSizeConstraint(0, 256))).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnAlarmsRcvdFTFL.setStatus('current')
if mibBuilder.loadTexts: nbsOtnAlarmsRcvdFTFL.setDescription('The current Fault Type Fault Location information received on the given port. The length will be zero when there is no fault code in either the forward or backward field. Otherwise, the full 256 bytes will be provided; see ITU-T G.709, section 15.8.2.5.')
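A manager reading nbsOtnAlarmsRcvdFTFL therefore sees either an empty string or the full 256-byte FTFL message. The sketch below splits the message into its two direction fields; the layout assumed here (forward field in bytes 0-127, backward field in bytes 128-255, fault indication code in the first byte of each field) follows the usual reading of ITU-T G.709 section 15.8.2.5 and should be verified against the standard before use:

```python
def parse_ftfl(octets: bytes):
    """Split a 256-byte FTFL message (nbsOtnAlarmsRcvdFTFL) into its
    forward and backward fields. An empty value means no fault code in
    either direction. Field layout is an assumption from ITU-T G.709
    section 15.8.2.5, not stated by this MIB."""
    if len(octets) == 0:
        return None  # no fault in either direction
    if len(octets) != 256:
        raise ValueError("FTFL must be empty or exactly 256 bytes")
    forward, backward = octets[:128], octets[128:]
    return {
        "forward_fault_code": forward[0],    # first byte of the forward field
        "backward_fault_code": backward[0],  # first byte of the backward field
        "forward": forward,
        "backward": backward,
    }
```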
nbsOtnStatsTable = MibTable((1, 3, 6, 1, 4, 1, 629, 222, 90, 3), )
if mibBuilder.loadTexts: nbsOtnStatsTable.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsTable.setDescription('OTN alarms and statistics monitoring managed per user discretion. This monitoring may be started, stopped, and cleared as desired without affecting the normal performance monitoring activity.')
nbsOtnStatsEntry = MibTableRow((1, 3, 6, 1, 4, 1, 629, 222, 90, 3, 1), ).setIndexNames((0, "NBS-OTNPM-MIB", "nbsOtnStatsIfIndex"))
if mibBuilder.loadTexts: nbsOtnStatsEntry.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsEntry.setDescription('User-controlled OTN monitoring for a specific port/interface.')
nbsOtnStatsIfIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 90, 3, 1, 1), InterfaceIndex()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnStatsIfIndex.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsIfIndex.setDescription('The mib2 ifIndex')
nbsOtnStatsDate = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 90, 3, 1, 5), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnStatsDate.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsDate.setDescription('The date (UTC) OTN statistics collection began, represented by an eight digit decimal number: yyyymmdd')
nbsOtnStatsTime = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 90, 3, 1, 6), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnStatsTime.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsTime.setDescription('The time (UTC) OTN statistics collection began, represented by a six digit decimal number: hhmmss')
nbsOtnStatsSpan = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 90, 3, 1, 7), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnStatsSpan.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsSpan.setDescription('The amount of time (deci-sec) statistics collection has been underway since nbsOtnStatsDate and nbsOtnStatsTime, or if stopped, the duration of the prior collection.')
nbsOtnStatsState = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 90, 3, 1, 8), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4))).clone(namedValues=NamedValues(("notSupported", 1), ("counting", 2), ("clearing", 3), ("stopped", 4)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: nbsOtnStatsState.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsState.setDescription("Writing 'stopped' to this object stops (pauses) OTN statistics collection. Re-configuring this port to a non-OTN protocol sets this object to 'stopped' automatically. Writing 'counting' to this object starts (resumes) OTN statistics collection if this port is configured with an OTN protocol. Writing 'clearing' to this object clears all statistical counters.")
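The nbsOtnStatsState write semantics just described can be modeled as a small transition function. This is only an illustrative sketch of the behavior the description promises, not the agent's implementation; counter clearing itself happens outside this function:

```python
# Named values from nbsOtnStatsState
NOT_SUPPORTED, COUNTING, CLEARING, STOPPED = 1, 2, 3, 4

def apply_stats_write(current_state: int, write_val: int, otn_configured: bool) -> int:
    """Model the described write behavior: 'stopped' always pauses
    collection, 'counting' starts/resumes only when the port runs an
    OTN protocol, and 'clearing' zeroes the counters (done elsewhere)
    without this sketch changing the running/stopped state."""
    if write_val == STOPPED:
        return STOPPED
    if write_val == COUNTING:
        return COUNTING if otn_configured else current_state
    if write_val == CLEARING:
        return current_state  # counters cleared; state assumed unchanged
    raise ValueError("not a writable value")
```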
nbsOtnStatsErrCntSectBEI = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 90, 3, 1, 21), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnStatsErrCntSectBEI.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsErrCntSectBEI.setDescription('The count of section Backward Error Indication errors detected since OTN statistics collection began.')
nbsOtnStatsErrCntPathBEI = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 90, 3, 1, 22), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnStatsErrCntPathBEI.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsErrCntPathBEI.setDescription('The count of path Backward Error Indication errors detected since OTN statistics collection began.')
nbsOtnStatsErrCntTcm1BEI = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 90, 3, 1, 23), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnStatsErrCntTcm1BEI.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsErrCntTcm1BEI.setDescription('The count of TCM1 Backward Error Indication errors detected since OTN statistics collection began.')
nbsOtnStatsErrCntTcm2BEI = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 90, 3, 1, 24), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnStatsErrCntTcm2BEI.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsErrCntTcm2BEI.setDescription('The count of TCM2 Backward Error Indication errors detected since OTN statistics collection began.')
nbsOtnStatsErrCntTcm3BEI = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 90, 3, 1, 25), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnStatsErrCntTcm3BEI.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsErrCntTcm3BEI.setDescription('The count of TCM3 Backward Error Indication errors detected since OTN statistics collection began.')
nbsOtnStatsErrCntTcm4BEI = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 90, 3, 1, 26), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnStatsErrCntTcm4BEI.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsErrCntTcm4BEI.setDescription('The count of TCM4 Backward Error Indication errors detected since OTN statistics collection began.')
nbsOtnStatsErrCntTcm5BEI = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 90, 3, 1, 27), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnStatsErrCntTcm5BEI.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsErrCntTcm5BEI.setDescription('The count of TCM5 Backward Error Indication errors detected since OTN statistics collection began.')
nbsOtnStatsErrCntTcm6BEI = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 90, 3, 1, 28), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnStatsErrCntTcm6BEI.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsErrCntTcm6BEI.setDescription('The count of TCM6 Backward Error Indication errors detected since OTN statistics collection began.')
nbsOtnStatsErrCntSectBIP8 = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 90, 3, 1, 31), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnStatsErrCntSectBIP8.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsErrCntSectBIP8.setDescription('The count of section Bit Interleave Parity errors detected since OTN statistics collection began.')
nbsOtnStatsErrCntPathBIP8 = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 90, 3, 1, 32), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnStatsErrCntPathBIP8.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsErrCntPathBIP8.setDescription('The count of path Bit Interleave Parity errors detected since OTN statistics collection began.')
nbsOtnStatsErrCntTcm1BIP8 = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 90, 3, 1, 33), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnStatsErrCntTcm1BIP8.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsErrCntTcm1BIP8.setDescription('The count of TCM1 Bit Interleave Parity errors detected since OTN statistics collection began.')
nbsOtnStatsErrCntTcm2BIP8 = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 90, 3, 1, 34), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnStatsErrCntTcm2BIP8.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsErrCntTcm2BIP8.setDescription('The count of TCM2 Bit Interleave Parity errors detected since OTN statistics collection began.')
nbsOtnStatsErrCntTcm3BIP8 = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 90, 3, 1, 35), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnStatsErrCntTcm3BIP8.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsErrCntTcm3BIP8.setDescription('The count of TCM3 Bit Interleave Parity errors detected since OTN statistics collection began.')
nbsOtnStatsErrCntTcm4BIP8 = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 90, 3, 1, 36), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnStatsErrCntTcm4BIP8.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsErrCntTcm4BIP8.setDescription('The count of TCM4 Bit Interleave Parity errors detected since OTN statistics collection began.')
nbsOtnStatsErrCntTcm5BIP8 = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 90, 3, 1, 37), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnStatsErrCntTcm5BIP8.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsErrCntTcm5BIP8.setDescription('The count of TCM5 Bit Interleave Parity errors detected since OTN statistics collection began.')
nbsOtnStatsErrCntTcm6BIP8 = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 90, 3, 1, 38), Counter64()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnStatsErrCntTcm6BIP8.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsErrCntTcm6BIP8.setDescription('The count of TCM6 Bit Interleave Parity errors detected since OTN statistics collection began.')
nbsOtnStatsAlarmsSupported = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 90, 3, 1, 100), NbsOtnAlarmMask()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnStatsAlarmsSupported.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsAlarmsSupported.setDescription('The mask of OTN alarms that are supported.')
nbsOtnStatsAlarmsRaised = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 90, 3, 1, 101), NbsOtnAlarmMask()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnStatsAlarmsRaised.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsAlarmsRaised.setDescription('The mask of OTN alarms that are currently raised.')
nbsOtnStatsAlarmsChanged = MibTableColumn((1, 3, 6, 1, 4, 1, 629, 222, 90, 3, 1, 102), NbsOtnAlarmMask()).setMaxAccess("readonly")
if mibBuilder.loadTexts: nbsOtnStatsAlarmsChanged.setStatus('current')
if mibBuilder.loadTexts: nbsOtnStatsAlarmsChanged.setDescription('The mask of OTN alarms that have changed since OTN statistics collection began, i.e. alarms that have transitioned at least once from clear to raised or from raised to clear.')
nbsOtnpmTrapsEs = NotificationType((1, 3, 6, 1, 4, 1, 629, 222, 100, 0, 10)).setObjects(("NBS-OTNPM-MIB", "nbsOtnpmCurrentIfIndex"), ("IF-MIB", "ifAlias"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentInterval"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentScope"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentEs"))
if mibBuilder.loadTexts: nbsOtnpmTrapsEs.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmTrapsEs.setDescription('Sent at the conclusion of an nbsOtnpmThresholdsInterval if nbsOtnpmThresholdsEs is non-zero and less than or equal to nbsOtnpmCurrentEs.')
nbsOtnpmTrapsEsr = NotificationType((1, 3, 6, 1, 4, 1, 629, 222, 100, 0, 11)).setObjects(("NBS-OTNPM-MIB", "nbsOtnpmCurrentIfIndex"), ("IF-MIB", "ifAlias"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentInterval"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentScope"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentEsrSig"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentEsrExp"))
if mibBuilder.loadTexts: nbsOtnpmTrapsEsr.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmTrapsEsr.setDescription('Sent at the conclusion of an nbsOtnpmThresholdsInterval if nbsOtnpmThresholdsEsr is non-zero and less than or equal to nbsOtnpmCurrentEsr.')
nbsOtnpmTrapsSes = NotificationType((1, 3, 6, 1, 4, 1, 629, 222, 100, 0, 12)).setObjects(("NBS-OTNPM-MIB", "nbsOtnpmCurrentIfIndex"), ("IF-MIB", "ifAlias"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentInterval"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentScope"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentSes"))
if mibBuilder.loadTexts: nbsOtnpmTrapsSes.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmTrapsSes.setDescription('Sent at the conclusion of an nbsOtnpmThresholdsInterval if nbsOtnpmThresholdsSes is non-zero and less than or equal to nbsOtnpmCurrentSes.')
nbsOtnpmTrapsSesr = NotificationType((1, 3, 6, 1, 4, 1, 629, 222, 100, 0, 13)).setObjects(("NBS-OTNPM-MIB", "nbsOtnpmCurrentIfIndex"), ("IF-MIB", "ifAlias"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentInterval"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentScope"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentSesrSig"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentSesrExp"))
if mibBuilder.loadTexts: nbsOtnpmTrapsSesr.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmTrapsSesr.setDescription('Sent at the conclusion of an nbsOtnpmThresholdsInterval if nbsOtnpmThresholdsSesr is non-zero and less than or equal to nbsOtnpmCurrentSesr.')
nbsOtnpmTrapsBbe = NotificationType((1, 3, 6, 1, 4, 1, 629, 222, 100, 0, 14)).setObjects(("NBS-OTNPM-MIB", "nbsOtnpmCurrentIfIndex"), ("IF-MIB", "ifAlias"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentInterval"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentScope"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentBbe"))
if mibBuilder.loadTexts: nbsOtnpmTrapsBbe.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmTrapsBbe.setDescription('Sent at the conclusion of an nbsOtnpmThresholdsInterval if nbsOtnpmThresholdsBbe is non-zero and less than or equal to nbsOtnpmCurrentBbe.')
nbsOtnpmTrapsBber = NotificationType((1, 3, 6, 1, 4, 1, 629, 222, 100, 0, 15)).setObjects(("NBS-OTNPM-MIB", "nbsOtnpmCurrentIfIndex"), ("IF-MIB", "ifAlias"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentInterval"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentScope"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentBberSig"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentBberExp"))
if mibBuilder.loadTexts: nbsOtnpmTrapsBber.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmTrapsBber.setDescription('Sent at the conclusion of an nbsOtnpmThresholdsInterval if nbsOtnpmThresholdsBber is non-zero and less than or equal to nbsOtnpmCurrentBber.')
nbsOtnpmTrapsUas = NotificationType((1, 3, 6, 1, 4, 1, 629, 222, 100, 0, 16)).setObjects(("NBS-OTNPM-MIB", "nbsOtnpmCurrentIfIndex"), ("IF-MIB", "ifAlias"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentInterval"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentScope"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentUas"))
if mibBuilder.loadTexts: nbsOtnpmTrapsUas.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmTrapsUas.setDescription('Sent at the conclusion of an nbsOtnpmThresholdsInterval if nbsOtnpmThresholdsUas is non-zero and less than or equal to nbsOtnpmCurrentUas.')
nbsOtnpmTrapsFc = NotificationType((1, 3, 6, 1, 4, 1, 629, 222, 100, 0, 17)).setObjects(("NBS-OTNPM-MIB", "nbsOtnpmCurrentIfIndex"), ("IF-MIB", "ifAlias"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentInterval"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentScope"), ("NBS-OTNPM-MIB", "nbsOtnpmCurrentFc"))
if mibBuilder.loadTexts: nbsOtnpmTrapsFc.setStatus('current')
if mibBuilder.loadTexts: nbsOtnpmTrapsFc.setDescription('Sent at the conclusion of an nbsOtnpmThresholdsInterval if nbsOtnpmThresholdsFc is non-zero and less than or equal to nbsOtnpmCurrentFc.')
mibBuilder.exportSymbols("NBS-OTNPM-MIB", nbsOtnpmRunningIfIndex=nbsOtnpmRunningIfIndex, nbsOtnpmHistoricGrp=nbsOtnpmHistoricGrp, nbsOtnpmCurrentSes=nbsOtnpmCurrentSes, nbsOtnpmMib=nbsOtnpmMib, nbsOtnpmRunningTable=nbsOtnpmRunningTable, nbsOtnAlarmsRcvdFTFL=nbsOtnAlarmsRcvdFTFL, nbsOtnpmRunningGrp=nbsOtnpmRunningGrp, nbsOtnpmCurrentFc=nbsOtnpmCurrentFc, nbsOtnStatsErrCntTcm6BEI=nbsOtnStatsErrCntTcm6BEI, nbsOtnpmCurrentTime=nbsOtnpmCurrentTime, nbsOtnpmThresholdsTable=nbsOtnpmThresholdsTable, nbsOtnpmRunningBbe=nbsOtnpmRunningBbe, nbsOtnStatsErrCntTcm1BEI=nbsOtnStatsErrCntTcm1BEI, nbsOtnAlarmsTable=nbsOtnAlarmsTable, nbsOtnpmThresholdsSes=nbsOtnpmThresholdsSes, nbsOtnAlarmsTime=nbsOtnAlarmsTime, nbsOtnpmThresholdsEs=nbsOtnpmThresholdsEs, nbsOtnAlarmsDate=nbsOtnAlarmsDate, nbsOtnpmCurrentGrp=nbsOtnpmCurrentGrp, nbsOtnStatsGrp=nbsOtnStatsGrp, nbsOtnpmCurrentEs=nbsOtnpmCurrentEs, nbsOtnpmHistoricEsrSig=nbsOtnpmHistoricEsrSig, nbsOtnAlarmsState=nbsOtnAlarmsState, nbsOtnStatsErrCntTcm4BEI=nbsOtnStatsErrCntTcm4BEI, nbsOtnpmThresholdsIfIndex=nbsOtnpmThresholdsIfIndex, nbsOtnpmHistoricSes=nbsOtnpmHistoricSes, nbsOtnpmCurrentIfIndex=nbsOtnpmCurrentIfIndex, nbsOtnpmCurrentBbe=nbsOtnpmCurrentBbe, nbsOtnpmCurrentEntry=nbsOtnpmCurrentEntry, nbsOtnpmRunningEsrExp=nbsOtnpmRunningEsrExp, nbsOtnAlarmsSpan=nbsOtnAlarmsSpan, nbsOtnStatsErrCntTcm2BEI=nbsOtnStatsErrCntTcm2BEI, nbsOtnpmCurrentBberExp=nbsOtnpmCurrentBberExp, nbsOtnpmCurrentInterval=nbsOtnpmCurrentInterval, nbsOtnStatsAlarmsRaised=nbsOtnStatsAlarmsRaised, nbsOtnpmRunningDate=nbsOtnpmRunningDate, nbsOtnpmCurrentSesrSig=nbsOtnpmCurrentSesrSig, nbsOtnpmRunningAlarmsSupported=nbsOtnpmRunningAlarmsSupported, nbsOtnpmRunningUas=nbsOtnpmRunningUas, nbsOtnAlarmsRaised=nbsOtnAlarmsRaised, nbsOtnStatsErrCntTcm2BIP8=nbsOtnStatsErrCntTcm2BIP8, nbsOtnpmThresholdsSesrSig=nbsOtnpmThresholdsSesrSig, nbsOtnpmHistoricBbe=nbsOtnpmHistoricBbe, nbsOtnpmHistoricUas=nbsOtnpmHistoricUas, nbsOtnpmCurrentDate=nbsOtnpmCurrentDate, 
nbsOtnpmHistoricIfIndex=nbsOtnpmHistoricIfIndex, nbsOtnpmRunningFc=nbsOtnpmRunningFc, nbsOtnpmEventsGrp=nbsOtnpmEventsGrp, nbsOtnStatsErrCntSectBEI=nbsOtnStatsErrCntSectBEI, nbsOtnStatsErrCntTcm6BIP8=nbsOtnStatsErrCntTcm6BIP8, nbsOtnpmHistoricSesrExp=nbsOtnpmHistoricSesrExp, nbsOtnpmThresholdsInterval=nbsOtnpmThresholdsInterval, nbsOtnpmThresholdsFc=nbsOtnpmThresholdsFc, nbsOtnpmRunningAlarmsChanged=nbsOtnpmRunningAlarmsChanged, nbsOtnpmRunningEntry=nbsOtnpmRunningEntry, nbsOtnStatsAlarmsSupported=nbsOtnStatsAlarmsSupported, nbsOtnpmThresholdsBbe=nbsOtnpmThresholdsBbe, NbsOtnAlarmId=NbsOtnAlarmId, nbsOtnpmTrapsEs=nbsOtnpmTrapsEs, nbsOtnpmHistoricBberExp=nbsOtnpmHistoricBberExp, nbsOtnpmCurrentEsrExp=nbsOtnpmCurrentEsrExp, nbsOtnpmTrapsEsr=nbsOtnpmTrapsEsr, nbsOtnStatsEntry=nbsOtnStatsEntry, nbsOtnpmHistoricScope=nbsOtnpmHistoricScope, nbsOtnStatsErrCntTcm5BEI=nbsOtnStatsErrCntTcm5BEI, nbsOtnpmTrapsSesr=nbsOtnpmTrapsSesr, nbsOtnpmCurrentBberSig=nbsOtnpmCurrentBberSig, nbsOtnpmThresholdsGrp=nbsOtnpmThresholdsGrp, nbsOtnpmThresholdsSesrExp=nbsOtnpmThresholdsSesrExp, nbsOtnAlarmsEntry=nbsOtnAlarmsEntry, nbsOtnpmCurrentAlarmsSupported=nbsOtnpmCurrentAlarmsSupported, nbsOtnpmRunningTime=nbsOtnpmRunningTime, nbsOtnStatsState=nbsOtnStatsState, nbsOtnpmRunningEs=nbsOtnpmRunningEs, nbsOtnStatsErrCntTcm3BEI=nbsOtnStatsErrCntTcm3BEI, nbsOtnStatsErrCntSectBIP8=nbsOtnStatsErrCntSectBIP8, nbsOtnAlarmsIfIndex=nbsOtnAlarmsIfIndex, nbsOtnpmRunningBberSig=nbsOtnpmRunningBberSig, nbsOtnpmHistoricSample=nbsOtnpmHistoricSample, nbsOtnpmThresholdsEsrSig=nbsOtnpmThresholdsEsrSig, nbsOtnStatsErrCntTcm5BIP8=nbsOtnStatsErrCntTcm5BIP8, nbsOtnStatsErrCntTcm1BIP8=nbsOtnStatsErrCntTcm1BIP8, nbsOtnpmRunningBberExp=nbsOtnpmRunningBberExp, nbsOtnpmCurrentScope=nbsOtnpmCurrentScope, nbsOtnpmRunningEsrSig=nbsOtnpmRunningEsrSig, nbsOtnpmTrapsBbe=nbsOtnpmTrapsBbe, nbsOtnpmHistoricEsrExp=nbsOtnpmHistoricEsrExp, nbsOtnpmRunningSesrExp=nbsOtnpmRunningSesrExp, nbsOtnpmHistoricDate=nbsOtnpmHistoricDate, 
nbsOtnpmCurrentEsrSig=nbsOtnpmCurrentEsrSig, nbsOtnStatsErrCntTcm3BIP8=nbsOtnStatsErrCntTcm3BIP8, nbsOtnpmThresholdsBberSig=nbsOtnpmThresholdsBberSig, nbsOtnStatsTime=nbsOtnStatsTime, nbsOtnpmHistoricBberSig=nbsOtnpmHistoricBberSig, NbsOtnAlarmMask=NbsOtnAlarmMask, nbsOtnpmHistoricTable=nbsOtnpmHistoricTable, nbsOtnpmRunningSes=nbsOtnpmRunningSes, nbsOtnpmHistoricAlarmsRaised=nbsOtnpmHistoricAlarmsRaised, nbsOtnpmRunningSesrSig=nbsOtnpmRunningSesrSig, nbsOtnStatsIfIndex=nbsOtnStatsIfIndex, nbsOtnStatsSpan=nbsOtnStatsSpan, nbsOtnpmCurrentAlarmsRaised=nbsOtnpmCurrentAlarmsRaised, nbsOtnpmHistoricEs=nbsOtnpmHistoricEs, nbsOtnpmThresholdsEntry=nbsOtnpmThresholdsEntry, nbsOtnpmRunningAlarmsRaised=nbsOtnpmRunningAlarmsRaised, nbsOtnpmCurrentUas=nbsOtnpmCurrentUas, nbsOtnpmThresholdsScope=nbsOtnpmThresholdsScope, nbsOtnpmTrapsSes=nbsOtnpmTrapsSes, nbsOtnpmThresholdsEsrExp=nbsOtnpmThresholdsEsrExp, nbsOtnpmCurrentTable=nbsOtnpmCurrentTable, nbsOtnpmHistoricTime=nbsOtnpmHistoricTime, nbsOtnAlarmsGrp=nbsOtnAlarmsGrp, nbsOtnpmTrapsUas=nbsOtnpmTrapsUas, nbsOtnpmHistoricAlarmsSupported=nbsOtnpmHistoricAlarmsSupported, nbsOtnpmTraps=nbsOtnpmTraps, nbsOtnpmCurrentSesrExp=nbsOtnpmCurrentSesrExp, nbsOtnpmTrapsFc=nbsOtnpmTrapsFc, PYSNMP_MODULE_ID=nbsOtnpmMib, nbsOtnpmHistoricFc=nbsOtnpmHistoricFc, nbsOtnAlarmsSupported=nbsOtnAlarmsSupported, nbsOtnAlarmsChanged=nbsOtnAlarmsChanged, nbsOtnStatsTable=nbsOtnStatsTable, nbsOtnStatsErrCntPathBEI=nbsOtnStatsErrCntPathBEI, nbsOtnpmTrapsBber=nbsOtnpmTrapsBber, nbsOtnpmHistoricAlarmsChanged=nbsOtnpmHistoricAlarmsChanged, nbsOtnpmCurrentAlarmsChanged=nbsOtnpmCurrentAlarmsChanged, nbsOtnStatsErrCntPathBIP8=nbsOtnStatsErrCntPathBIP8, nbsOtnpmHistoricSesrSig=nbsOtnpmHistoricSesrSig, nbsOtnpmRunningScope=nbsOtnpmRunningScope, nbsOtnpmThresholdsBberExp=nbsOtnpmThresholdsBberExp, nbsOtnStatsDate=nbsOtnStatsDate, nbsOtnStatsErrCntTcm4BIP8=nbsOtnStatsErrCntTcm4BIP8, nbsOtnpmHistoricEntry=nbsOtnpmHistoricEntry, 
nbsOtnpmHistoricInterval=nbsOtnpmHistoricInterval, nbsOtnStatsAlarmsChanged=nbsOtnStatsAlarmsChanged, nbsOtnpmThresholdsUas=nbsOtnpmThresholdsUas)
| 144.244755 | 6,082 | 0.794493 | 6,902 | 61,881 | 7.122863 | 0.089829 | 0.065416 | 0.114479 | 0.010821 | 0.557361 | 0.424556 | 0.367581 | 0.346935 | 0.323644 | 0.235243 | 0 | 0.059626 | 0.088008 | 61,881 | 428 | 6,083 | 144.581776 | 0.8115 | 0.005171 | 0 | 0.004773 | 0 | 0.124105 | 0.318527 | 0.036879 | 0 | 0 | 0.00195 | 0 | 0 | 1 | 0 | false | 0 | 0.019093 | 0 | 0.040573 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
9c2c845d503fa13d0189a5f8e591e18afc37b8a0 | 1,056 | py | Python | tests/authorization/models.py | ninumedia/django-tastypie | 4e4de4dddff9feeda4bed9d1e334ed286c409792 | [
"BSD-3-Clause"
] | null | null | null | tests/authorization/models.py | ninumedia/django-tastypie | 4e4de4dddff9feeda4bed9d1e334ed286c409792 | [
"BSD-3-Clause"
] | 2 | 2017-04-07T16:28:25.000Z | 2017-05-23T04:52:17.000Z | tests/authorization/models.py | ninumedia/django-tastypie | 4e4de4dddff9feeda4bed9d1e334ed286c409792 | [
"BSD-3-Clause"
] | null | null | null | from django.contrib.auth.models import User
from django.contrib.sites.models import Site
import datetime

from django.db import models
class AuthorProfile(models.Model):
user = models.OneToOneField(User, related_name='author_profile')
short_bio = models.CharField(max_length=255, blank=True, default='')
bio = models.TextField(blank=True, default='')
# We'll use the ``sites`` the author is assigned to as a way to control
# the permissions.
sites = models.ManyToManyField(Site, related_name='author_profiles')
def __unicode__(self):
return u"Profile: {0}".format(self.user.get_full_name())
class Article(models.Model):
# We'll also use the ``authors`` to control perms.
authors = models.ManyToManyField(AuthorProfile, related_name='articles')
title = models.CharField(max_length=255)
    url = models.URLField()
content = models.TextField(blank=True, default='')
added_on = models.DateTimeField(default=datetime.datetime.now)
def __unicode__(self):
return u"{0} - {1}".format(self.title, self.url)
| 37.714286 | 76 | 0.71875 | 140 | 1,056 | 5.285714 | 0.471429 | 0.040541 | 0.064865 | 0.064865 | 0.213514 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010124 | 0.158144 | 1,056 | 27 | 77 | 39.111111 | 0.822272 | 0.127841 | 0 | 0.111111 | 0 | 0 | 0.06325 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.166667 | 0.111111 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
9c31b5c31906504c93ffb05ba04ed22705e74525 | 418 | py | Python | src/spaceone/core/service/__init__.py | stat-kwon/python-core | 387fda0a57705c1329846d1bea83b7b92f24cabe | [
"Apache-2.0"
] | 14 | 2020-06-01T08:17:43.000Z | 2022-01-13T22:37:50.000Z | src/spaceone/core/service/__init__.py | stat-kwon/python-core | 387fda0a57705c1329846d1bea83b7b92f24cabe | [
"Apache-2.0"
] | 7 | 2020-08-11T23:05:59.000Z | 2022-01-12T05:08:49.000Z | src/spaceone/core/service/__init__.py | stat-kwon/python-core | 387fda0a57705c1329846d1bea83b7b92f24cabe | [
"Apache-2.0"
] | 11 | 2020-06-01T08:17:49.000Z | 2021-11-25T08:26:37.000Z | from spaceone.core.service.utils import *
from spaceone.core.service.service import *
__all__ = ['BaseService', 'transaction', 'authentication_handler', 'authorization_handler', 'mutation_handler',
'event_handler', 'check_required', 'append_query_filter', 'change_tag_filter', 'change_timestamp_value',
'change_timestamp_filter', 'change_date_value', 'append_keyword_filter', 'change_only_key']
| 59.714286 | 115 | 0.763158 | 46 | 418 | 6.434783 | 0.586957 | 0.162162 | 0.108108 | 0.155405 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.11244 | 418 | 6 | 116 | 69.666667 | 0.797844 | 0 | 0 | 0 | 0 | 0 | 0.578947 | 0.260766 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
9c3652ed616a97f1fd60550284c0a6d47b41caa9 | 1,665 | py | Python | songbook/__main__.py | mathiaswk/songbook | 703cecc1d24d1065fd6218f48451877580c817ef | [
"MIT"
] | null | null | null | songbook/__main__.py | mathiaswk/songbook | 703cecc1d24d1065fd6218f48451877580c817ef | [
"MIT"
] | null | null | null | songbook/__main__.py | mathiaswk/songbook | 703cecc1d24d1065fd6218f48451877580c817ef | [
"MIT"
] | null | null | null | """
songbook.__main__
~~~~~~~~~~~~~~~~~
    Command-line entry point for creating a songbook.
"""
import argparse
from songbook import __version__
from songbook.cli import cli
from songbook.gui import gui
def _build_parser():
parser = argparse.ArgumentParser(prog='songbook', description='Create a songbook.')
parser.add_argument(
'-p', '--style', action='store',
help="used for making a new pagenumber style")
parser.add_argument(
'-s', '--number_style', action='store',
help="used to set the pagenumber style in the tex file")
parser.add_argument(
'-n', '--name', action='store',
help="used to set the name")
parser.add_argument(
'-a', '--author', action='store',
        help="used to set the author")
parser.add_argument(
'-l', '--logo', action='store',
help="used to get a logo on the front page")
parser.add_argument(
'-e', '--empty', action='store',
help="used to not have a front page")
parser.add_argument(
'-t', '--twosided', action='store_true')
# TODO
parser.add_argument('-v', '--version', action='version', version=f'%(prog)s {__version__}')
parser.add_argument('-i', '--interactive', action='store_true')
parser.add_argument('--order', choices=['fixed', 'sorted', 'random'])
parser.add_argument('--seed', action='store')
parser.add_argument('song', nargs='*')
return parser
def main():
parser = _build_parser()
args = parser.parse_args()
    if args.interactive:
        gui.run()
    else:
        cli.run()
if __name__ == '__main__':
main()
| 28.706897 | 95 | 0.609009 | 209 | 1,665 | 4.660287 | 0.397129 | 0.110883 | 0.209446 | 0.117043 | 0.206366 | 0.080082 | 0.055441 | 0 | 0 | 0 | 0 | 0 | 0.224024 | 1,665 | 57 | 96 | 29.210526 | 0.75387 | 0.066667 | 0 | 0.170732 | 0 | 0 | 0.293351 | 0 | 0 | 0 | 0 | 0.017544 | 0 | 1 | 0.04878 | false | 0 | 0.097561 | 0 | 0.170732 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
9c424d7754343c590e38c919248eeb281e754f55 | 838 | py | Python | q2_api_client/clients/mobile_ws/v2_client.py | jcook00/q2-api-client | 4431af164eb4baf52e26e8842e017cad1609a279 | [
"BSD-2-Clause"
] | null | null | null | q2_api_client/clients/mobile_ws/v2_client.py | jcook00/q2-api-client | 4431af164eb4baf52e26e8842e017cad1609a279 | [
"BSD-2-Clause"
] | null | null | null | q2_api_client/clients/mobile_ws/v2_client.py | jcook00/q2-api-client | 4431af164eb4baf52e26e8842e017cad1609a279 | [
"BSD-2-Clause"
] | null | null | null | from q2_api_client.clients.base_q2_client import BaseQ2Client
from q2_api_client.endpoints.mobile_ws_endpoints import V2Endpoint
class V2Client(BaseQ2Client):
def get_commercial_tax_payments(self):
"""GET /mobilews/v2/commercial/taxpayment
:return: Response object
:rtype: requests.Response
"""
endpoint = V2Endpoint.COMMERCIAL_TAX_PAYMENT.value
return self._get(url=self._build_url(endpoint))
def get_commercial_tax_payment(self, tax_payment_id):
"""GET /mobilews/v2/commercial/taxpayment/{id}
:param str tax_payment_id: path parameter
:return: Response object
:rtype: requests.Response
"""
endpoint = V2Endpoint.COMMERCIAL_TAX_PAYMENT_ID.value.format(id=tax_payment_id)
return self._get(url=self._build_url(endpoint))
| 33.52 | 87 | 0.717184 | 102 | 838 | 5.588235 | 0.382353 | 0.105263 | 0.084211 | 0.052632 | 0.519298 | 0.403509 | 0.403509 | 0.403509 | 0.277193 | 0.277193 | 0 | 0.01632 | 0.195704 | 838 | 24 | 88 | 34.916667 | 0.829377 | 0.272076 | 0 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.222222 | 0 | 0.777778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
9c4e08a81fa2cafd46af4c8d0253db5eaf0073bb | 19,808 | py | Python | src/ssh_audit/kexdh.py | stefanjay/ssh-audit | ee811fa3b40df378423c445fec4521ea2c0ecba1 | [
"MIT"
] | 1 | 2021-06-04T19:55:21.000Z | 2021-06-04T19:55:21.000Z | src/ssh_audit/kexdh.py | stefanjay/ssh-audit | ee811fa3b40df378423c445fec4521ea2c0ecba1 | [
"MIT"
] | null | null | null | src/ssh_audit/kexdh.py | stefanjay/ssh-audit | ee811fa3b40df378423c445fec4521ea2c0ecba1 | [
"MIT"
] | null | null | null | """
The MIT License (MIT)
Copyright (C) 2017-2020 Joe Testa (jtesta@positronsecurity.com)
Copyright (C) 2017 Andris Raugulis (moo@arthepsy.eu)
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
"""
import binascii
import os
import random
import struct
# pylint: disable=unused-import
from typing import Dict, List, Set, Sequence, Tuple, Iterable # noqa: F401
from typing import Callable, Optional, Union, Any # noqa: F401
from ssh_audit.protocol import Protocol
from ssh_audit.ssh_socket import SSH_Socket
class KexDH: # pragma: nocover
def __init__(self, kex_name: str, hash_alg: str, g: int, p: int) -> None:
self.__kex_name = kex_name
self.__hash_alg = hash_alg
self.__g = 0
self.__p = 0
self.__q = 0
self.__x = 0
self.__e = 0
self.set_params(g, p)
self.__ed25519_pubkey: Optional[bytes] = None
self.__hostkey_type: Optional[bytes] = None
self.__hostkey_e = 0
self.__hostkey_n = 0
self.__hostkey_n_len = 0 # Length of the host key modulus.
self.__ca_n_len = 0 # Length of the CA key modulus (if hostkey is a cert).
def set_params(self, g: int, p: int) -> None:
self.__g = g
self.__p = p
self.__q = (self.__p - 1) // 2
self.__x = 0
self.__e = 0
def send_init(self, s: SSH_Socket, init_msg: int = Protocol.MSG_KEXDH_INIT) -> None:
r = random.SystemRandom()
self.__x = r.randrange(2, self.__q)
self.__e = pow(self.__g, self.__x, self.__p)
s.write_byte(init_msg)
s.write_mpint2(self.__e)
s.send_packet()
# Parse a KEXDH_REPLY or KEXDH_GEX_REPLY message from the server. This
# contains the host key, among other things. Function returns the host
# key blob (from which the fingerprint can be calculated).
def recv_reply(self, s: 'SSH_Socket', parse_host_key_size: bool = True) -> Optional[bytes]:
packet_type, payload = s.read_packet(2)
# Skip any & all MSG_DEBUG messages.
while packet_type == Protocol.MSG_DEBUG:
packet_type, payload = s.read_packet(2)
if packet_type != -1 and packet_type not in [Protocol.MSG_KEXDH_REPLY, Protocol.MSG_KEXDH_GEX_REPLY]: # pylint: disable=no-else-raise
# TODO: change Exception to something more specific.
raise Exception('Expected MSG_KEXDH_REPLY (%d) or MSG_KEXDH_GEX_REPLY (%d), but got %d instead.' % (Protocol.MSG_KEXDH_REPLY, Protocol.MSG_KEXDH_GEX_REPLY, packet_type))
elif packet_type == -1:
# A connection error occurred. We can't parse anything, so just
# return. The host key modulus (and perhaps certificate modulus)
# will remain at length 0.
return None
hostkey_len = 0 # pylint: disable=unused-variable
hostkey_type_len = hostkey_e_len = 0 # pylint: disable=unused-variable
key_id_len = principles_len = 0 # pylint: disable=unused-variable
critical_options_len = extensions_len = 0 # pylint: disable=unused-variable
nonce_len = ca_key_len = ca_key_type_len = 0 # pylint: disable=unused-variable
ca_key_len = ca_key_type_len = ca_key_e_len = 0 # pylint: disable=unused-variable
key_id = principles = None # pylint: disable=unused-variable
critical_options = extensions = None # pylint: disable=unused-variable
nonce = ca_key = ca_key_type = None # pylint: disable=unused-variable
ca_key_e = ca_key_n = None # pylint: disable=unused-variable
# Get the host key blob, F, and signature.
ptr = 0
hostkey, hostkey_len, ptr = KexDH.__get_bytes(payload, ptr)
# If we are not supposed to parse the host key size (i.e.: it is a type that is of fixed size such as ed25519), then stop here.
if not parse_host_key_size:
return hostkey
_, _, ptr = KexDH.__get_bytes(payload, ptr)
_, _, ptr = KexDH.__get_bytes(payload, ptr)
# Now pick apart the host key blob.
# Get the host key type (i.e.: 'ssh-rsa', 'ssh-ed25519', etc).
ptr = 0
self.__hostkey_type, hostkey_type_len, ptr = KexDH.__get_bytes(hostkey, ptr)
# If this is an RSA certificate, skip over the nonce.
if self.__hostkey_type.startswith(b'ssh-rsa-cert-v0'):
nonce, nonce_len, ptr = KexDH.__get_bytes(hostkey, ptr)
# The public key exponent.
hostkey_e, hostkey_e_len, ptr = KexDH.__get_bytes(hostkey, ptr)
self.__hostkey_e = int(binascii.hexlify(hostkey_e), 16)
# Here is the modulus size & actual modulus of the host key public key.
hostkey_n, self.__hostkey_n_len, ptr = KexDH.__get_bytes(hostkey, ptr)
self.__hostkey_n = int(binascii.hexlify(hostkey_n), 16)
# If this is an RSA certificate, continue parsing to extract the CA
# key.
if self.__hostkey_type.startswith(b'ssh-rsa-cert-v0'):
# Skip over the serial number.
ptr += 8
# Get the certificate type.
cert_type = int(binascii.hexlify(hostkey[ptr:ptr + 4]), 16)
ptr += 4
# Only SSH2_CERT_TYPE_HOST (2) makes sense in this context.
if cert_type == 2:
# Skip the key ID (this is the serial number of the
# certificate).
key_id, key_id_len, ptr = KexDH.__get_bytes(hostkey, ptr)
                # The principals: the hostnames/users this certificate is valid for.
principles, principles_len, ptr = KexDH.__get_bytes(hostkey, ptr)
# Skip over the timestamp that this certificate is valid after.
ptr += 8
# Skip over the timestamp that this certificate is valid before.
ptr += 8
                # TODO: validate the principals and time range.
# The critical options.
critical_options, critical_options_len, ptr = KexDH.__get_bytes(hostkey, ptr)
# Certificate extensions.
extensions, extensions_len, ptr = KexDH.__get_bytes(hostkey, ptr)
# Another nonce.
nonce, nonce_len, ptr = KexDH.__get_bytes(hostkey, ptr)
# Finally, we get to the CA key.
ca_key, ca_key_len, ptr = KexDH.__get_bytes(hostkey, ptr)
# Last in the host key blob is the CA signature. It isn't
# interesting to us, so we won't bother parsing any further.
# The CA key has the modulus, however...
ptr = 0
# 'ssh-rsa', 'rsa-sha2-256', etc.
ca_key_type, ca_key_type_len, ptr = KexDH.__get_bytes(ca_key, ptr)
# CA's public key exponent.
ca_key_e, ca_key_e_len, ptr = KexDH.__get_bytes(ca_key, ptr)
# CA's modulus. Bingo.
ca_key_n, self.__ca_n_len, ptr = KexDH.__get_bytes(ca_key, ptr)
return hostkey
@staticmethod
def __get_bytes(buf: bytes, ptr: int) -> Tuple[bytes, int, int]:
num_bytes = struct.unpack('>I', buf[ptr:ptr + 4])[0]
ptr += 4
return buf[ptr:ptr + num_bytes], num_bytes, ptr + num_bytes
# Converts a modulus length in bytes to its size in bits, after some
# possible adjustments.
@staticmethod
def __adjust_key_size(size: int) -> int:
size = size * 8
# Actual keys are observed to be about 8 bits bigger than expected
# (i.e.: 1024-bit keys have a 1032-bit modulus). Check if this is
# the case, and subtract 8 if so. This simply improves readability
# in the UI.
if (size >> 3) % 2 != 0:
size = size - 8
return size
# Returns the size of the hostkey, in bits.
def get_hostkey_size(self) -> int:
return KexDH.__adjust_key_size(self.__hostkey_n_len)
# Returns the size of the CA key, in bits.
def get_ca_size(self) -> int:
return KexDH.__adjust_key_size(self.__ca_n_len)
# Returns the size of the DH modulus, in bits.
def get_dh_modulus_size(self) -> int:
# -2 to account for the '0b' prefix in the string.
return len(bin(self.__p)) - 2
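The byte-to-bit adjustment performed by `__adjust_key_size` above can be exercised on its own. The sketch below reimplements that logic as a free function (`adjust_key_size` is a hypothetical stand-in for the private static method, since name mangling makes it awkward to call directly):

```python
def adjust_key_size(modulus_len_bytes: int) -> int:
    # Convert the modulus length from bytes to bits.
    size = modulus_len_bytes * 8
    # Observed moduli are often 8 bits larger than the nominal key size
    # (a 1024-bit key yields a 1032-bit modulus), so trim the extra byte
    # when the size in bytes is odd. This only improves UI readability.
    if (size >> 3) % 2 != 0:
        size = size - 8
    return size

print(adjust_key_size(129))  # 1032-bit modulus reported as 1024
print(adjust_key_size(256))  # 2048-bit modulus reported unchanged as 2048
```

Note that even-byte-length moduli (128, 256, 512 bytes) pass through unchanged, while the common "one extra byte" case is rounded down.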
class KexGroup1(KexDH): # pragma: nocover
def __init__(self) -> None:
# rfc2409: second oakley group
p = int('ffffffffffffffffc90fdaa22168c234c4c6628b80dc1cd129024e088a67cc74020bbea63b139b22514a08798e3404ddef9519b3cd3a431b302b0a6df25f14374fe1356d6d51c245e485b576625e7ec6f44c42e9a637ed6b0bff5cb6f406b7edee386bfb5a899fa5ae9f24117c4b1fe649286651ece65381ffffffffffffffff', 16)
super(KexGroup1, self).__init__('KexGroup1', 'sha1', 2, p)
class KexGroup14(KexDH): # pragma: nocover
def __init__(self, hash_alg: str) -> None:
# rfc3526: 2048-bit modp group
p = int('ffffffffffffffffc90fdaa22168c234c4c6628b80dc1cd129024e088a67cc74020bbea63b139b22514a08798e3404ddef9519b3cd3a431b302b0a6df25f14374fe1356d6d51c245e485b576625e7ec6f44c42e9a637ed6b0bff5cb6f406b7edee386bfb5a899fa5ae9f24117c4b1fe649286651ece45b3dc2007cb8a163bf0598da48361c55d39a69163fa8fd24cf5f83655d23dca3ad961c62f356208552bb9ed529077096966d670c354e4abc9804f1746c08ca18217c32905e462e36ce3be39e772c180e86039b2783a2ec07a28fb5c55df06f4c52c9de2bcbf6955817183995497cea956ae515d2261898fa051015728e5a8aacaa68ffffffffffffffff', 16)
super(KexGroup14, self).__init__('KexGroup14', hash_alg, 2, p)
class KexGroup14_SHA1(KexGroup14):
def __init__(self) -> None:
super(KexGroup14_SHA1, self).__init__('sha1')
class KexGroup14_SHA256(KexGroup14):
def __init__(self) -> None:
super(KexGroup14_SHA256, self).__init__('sha256')
class KexGroup16_SHA512(KexDH):
def __init__(self) -> None:
# rfc3526: 4096-bit modp group
p = int('ffffffffffffffffc90fdaa22168c234c4c6628b80dc1cd129024e088a67cc74020bbea63b139b22514a08798e3404ddef9519b3cd3a431b302b0a6df25f14374fe1356d6d51c245e485b576625e7ec6f44c42e9a637ed6b0bff5cb6f406b7edee386bfb5a899fa5ae9f24117c4b1fe649286651ece45b3dc2007cb8a163bf0598da48361c55d39a69163fa8fd24cf5f83655d23dca3ad961c62f356208552bb9ed529077096966d670c354e4abc9804f1746c08ca18217c32905e462e36ce3be39e772c180e86039b2783a2ec07a28fb5c55df06f4c52c9de2bcbf6955817183995497cea956ae515d2261898fa051015728e5a8aaac42dad33170d04507a33a85521abdf1cba64ecfb850458dbef0a8aea71575d060c7db3970f85a6e1e4c7abf5ae8cdb0933d71e8c94e04a25619dcee3d2261ad2ee6bf12ffa06d98a0864d87602733ec86a64521f2b18177b200cbbe117577a615d6c770988c0bad946e208e24fa074e5ab3143db5bfce0fd108e4b82d120a92108011a723c12a787e6d788719a10bdba5b2699c327186af4e23c1a946834b6150bda2583e9ca2ad44ce8dbbbc2db04de8ef92e8efc141fbecaa6287c59474e6bc05d99b2964fa090c3a2233ba186515be7ed1f612970cee2d7afb81bdd762170481cd0069127d5b05aa993b4ea988d8fddc186ffb7dc90a6c08f4df435c934063199ffffffffffffffff', 16)
super(KexGroup16_SHA512, self).__init__('KexGroup16_SHA512', 'sha512', 2, p)
class KexGroup18_SHA512(KexDH):
def __init__(self) -> None:
# rfc3526: 8192-bit modp group
        p = int('ffffffffffffffffc90fdaa22168c234c4c6628b80dc1cd129024e088a67cc74020bbea63b139b22514a08798e3404ddef9519b3cd3a431b302b0a6df25f14374fe1356d6d51c245e485b576625e7ec6f44c42e9a637ed6b0bff5cb6f406b7edee386bfb5a899fa5ae9f24117c4b1fe649286651ece45b3dc2007cb8a163bf0598da48361c55d39a69163fa8fd24cf5f83655d23dca3ad961c62f356208552bb9ed529077096966d670c354e4abc9804f1746c08ca18217c32905e462e36ce3be39e772c180e86039b2783a2ec07a28fb5c55df06f4c52c9de2bcbf6955817183995497cea956ae515d2261898fa051015728e5a8aaac42dad33170d04507a33a85521abdf1cba64ecfb850458dbef0a8aea71575d060c7db3970f85a6e1e4c7abf5ae8cdb0933d71e8c94e04a25619dcee3d2261ad2ee6bf12ffa06d98a0864d87602733ec86a64521f2b18177b200cbbe117577a615d6c770988c0bad946e208e24fa074e5ab3143db5bfce0fd108e4b82d120a92108011a723c12a787e6d788719a10bdba5b2699c327186af4e23c1a946834b6150bda2583e9ca2ad44ce8dbbbc2db04de8ef92e8efc141fbecaa6287c59474e6bc05d99b2964fa090c3a2233ba186515be7ed1f612970cee2d7afb81bdd762170481cd0069127d5b05aa993b4ea988d8fddc186ffb7dc90a6c08f4df435c93402849236c3fab4d27c7026c1d4dcb2602646dec9751e763dba37bdf8ff9406ad9e530ee5db382f413001aeb06a53ed9027d831179727b0865a8918da3edbebcf9b14ed44ce6cbaced4bb1bdb7f1447e6cc254b332051512bd7af426fb8f401378cd2bf5983ca01c64b92ecf032ea15d1721d03f482d7ce6e74fef6d55e702f46980c82b5a84031900b1c9e59e7c97fbec7e8f323a97a7e36cc88be0f1d45b7ff585ac54bd407b22b4154aacc8f6d7ebf48e1d814cc5ed20f8037e0a79715eef29be32806a1d58bb7c5da76f550aa3d8a1fbff0eb19ccb1a313d55cda56c9ec2ef29632387fe8d76e3c0468043e8f663f4860ee12bf2d5b0b7474d6e694f91e6dbe115974a3926f12fee5e438777cb6a932df8cd8bec4d073b931ba3bc832b68d9dd300741fa7bf8afc47ed2576f6936ba424663aab639c5ae4f5683423b4742bf1c978238f16cbe39d652de3fdb8befc848ad922222e04a4037c0713eb57a81a23f0c73473fc646cea306b4bcbc8862f8385ddfa9d4b7fa2c087e879683303ed5bdd3a062b3cf5b3a278a66d2a13f83f44f82ddf310ee074ab6a364597e899a0255dc164f31cc50846851df9ab48195ded7ea1b1d510bd7ee74d73faf36bc31ecfa268359046f4eb879f924009438b481c6cd7889a002ed5ee382bc9190da6fc026e479558e4475677e9aa9e3050e2765694dfc81f56e880b96e7160c980dd98edd3dfffffffffffffffff', 16)
super(KexGroup18_SHA512, self).__init__('KexGroup18_SHA512', 'sha512', 2, p)
class KexCurve25519_SHA256(KexDH):
def __init__(self) -> None:
super(KexCurve25519_SHA256, self).__init__('KexCurve25519_SHA256', 'sha256', 0, 0)
# To start an ED25519 kex, we simply send a random 256-bit number as the
# public key.
def send_init(self, s: 'SSH_Socket', init_msg: int = Protocol.MSG_KEXDH_INIT) -> None:
self.__ed25519_pubkey = os.urandom(32)
s.write_byte(init_msg)
s.write_string(self.__ed25519_pubkey)
s.send_packet()
class KexNISTP256(KexDH):
def __init__(self) -> None:
super(KexNISTP256, self).__init__('KexNISTP256', 'sha256', 0, 0)
# Because the server checks that the value sent here is valid (i.e.: it lies
# on the curve, among other things), we would have to write a lot of code
# or import an elliptic curve library in order to randomly generate a
# valid elliptic point each time. Hence, we will simply send a static
# value, which is enough for us to extract the server's host key.
def send_init(self, s: 'SSH_Socket', init_msg: int = Protocol.MSG_KEXDH_INIT) -> None:
s.write_byte(init_msg)
s.write_string(b'\x04\x0b\x60\x44\x9f\x8a\x11\x9e\xc7\x81\x0c\xa9\x98\xfc\xb7\x90\xaa\x6b\x26\x8c\x12\x4a\xc0\x09\xbb\xdf\xc4\x2c\x4c\x2c\x99\xb6\xe1\x71\xa0\xd4\xb3\x62\x47\x74\xb3\x39\x0c\xf2\x88\x4a\x84\x6b\x3b\x15\x77\xa5\x77\xd2\xa9\xc9\x94\xf9\xd5\x66\x19\xcd\x02\x34\xd1')
s.send_packet()
class KexNISTP384(KexDH):
def __init__(self) -> None:
super(KexNISTP384, self).__init__('KexNISTP384', 'sha256', 0, 0)
# See comment for KexNISTP256.send_init().
def send_init(self, s: 'SSH_Socket', init_msg: int = Protocol.MSG_KEXDH_INIT) -> None:
s.write_byte(init_msg)
s.write_string(b'\x04\xe2\x9b\x84\xce\xa1\x39\x50\xfe\x1e\xa3\x18\x70\x1c\xe2\x7a\xe4\xb5\x6f\xdf\x93\x9f\xd4\xf4\x08\xcc\x9b\x02\x10\xa4\xca\x77\x9c\x2e\x51\x44\x1d\x50\x7a\x65\x4e\x7e\x2f\x10\x2d\x2d\x4a\x32\xc9\x8e\x18\x75\x90\x6c\x19\x10\xda\xcc\xa8\xe9\xf4\xc4\x3a\x53\x80\x35\xf4\x97\x9c\x04\x16\xf9\x5a\xdc\xcc\x05\x94\x29\xfa\xc4\xd6\x87\x4e\x13\x21\xdb\x3d\x12\xac\xbd\x20\x3b\x60\xff\xe6\x58\x42')
s.send_packet()
class KexNISTP521(KexDH):
def __init__(self) -> None:
super(KexNISTP521, self).__init__('KexNISTP521', 'sha256', 0, 0)
# See comment for KexNISTP256.send_init().
def send_init(self, s: 'SSH_Socket', init_msg: int = Protocol.MSG_KEXDH_INIT) -> None:
s.write_byte(init_msg)
s.write_string(b'\x04\x01\x02\x90\x29\xe9\x8f\xa8\x04\xaf\x1c\x00\xf9\xc6\x29\xc0\x39\x74\x8e\xea\x47\x7e\x7c\xf7\x15\x6e\x43\x3b\x59\x13\x53\x43\xb0\xae\x0b\xe7\xe6\x7c\x55\x73\x52\xa5\x2a\xc1\x42\xde\xfc\xf4\x1f\x8b\x5a\x8d\xfa\xcd\x0a\x65\x77\xa8\xce\x68\xd2\xc6\x26\xb5\x3f\xee\x4b\x01\x7b\xd2\x96\x23\x69\x53\xc7\x01\xe1\x0d\x39\xe9\x87\x49\x3b\xc8\xec\xda\x0c\xf9\xca\xad\x89\x42\x36\x6f\x93\x78\x78\x31\x55\x51\x09\x51\xc0\x96\xd7\xea\x61\xbf\xc2\x44\x08\x80\x43\xed\xc6\xbb\xfb\x94\xbd\xf8\xdf\x2b\xd8\x0b\x2e\x29\x1b\x8c\xc4\x8a\x04\x2d\x3a')
s.send_packet()

class KexGroupExchange(KexDH):
    def __init__(self, classname: str, hash_alg: str) -> None:
        super(KexGroupExchange, self).__init__(classname, hash_alg, 0, 0)

    def send_init(self, s: 'SSH_Socket', init_msg: int = Protocol.MSG_KEXDH_GEX_REQUEST) -> None:
        self.send_init_gex(s)

    # The group exchange starts with sending a message to the server with
    # the minimum, preferred, and maximum number of bits for the DH group.
    # The server responds with a generator and prime modulus that matches that,
    # then the handshake continues on like a normal DH handshake (except the
    # SSH message types differ).
    def send_init_gex(self, s: 'SSH_Socket', minbits: int = 1024, prefbits: int = 2048, maxbits: int = 8192) -> None:

        # Send the initial group exchange request.  Tell the server what range
        # of modulus sizes we will accept, along with our preference.
        s.write_byte(Protocol.MSG_KEXDH_GEX_REQUEST)
        s.write_int(minbits)
        s.write_int(prefbits)
        s.write_int(maxbits)
        s.send_packet()

        packet_type, payload = s.read_packet(2)
        if packet_type not in [Protocol.MSG_KEXDH_GEX_GROUP, Protocol.MSG_DEBUG]:
            # TODO: replace with a better exception type.
            raise Exception('Expected MSG_KEXDH_GEX_GROUP (%d), but got %d instead.' % (Protocol.MSG_KEXDH_GEX_GROUP, packet_type))
        # Skip any & all MSG_DEBUG messages.
        while packet_type == Protocol.MSG_DEBUG:
            packet_type, payload = s.read_packet(2)

        # Parse the modulus (p) and generator (g) values from the server.
        ptr = 0
        p_len = struct.unpack('>I', payload[ptr:ptr + 4])[0]
        ptr += 4
        p = int(binascii.hexlify(payload[ptr:ptr + p_len]), 16)
        ptr += p_len
        g_len = struct.unpack('>I', payload[ptr:ptr + 4])[0]
        ptr += 4
        g = int(binascii.hexlify(payload[ptr:ptr + g_len]), 16)
        ptr += g_len

        # Now that we got the generator and modulus, perform the DH exchange
        # like usual.
        super(KexGroupExchange, self).set_params(g, p)
        super(KexGroupExchange, self).send_init(s, Protocol.MSG_KEXDH_GEX_INIT)

class KexGroupExchange_SHA1(KexGroupExchange):
    def __init__(self) -> None:
        super(KexGroupExchange_SHA1, self).__init__('KexGroupExchange_SHA1', 'sha1')


class KexGroupExchange_SHA256(KexGroupExchange):
    def __init__(self) -> None:
        super(KexGroupExchange_SHA256, self).__init__('KexGroupExchange_SHA256', 'sha256')
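The group-exchange parser above decodes the server's MSG_KEXDH_GEX_GROUP payload as two length-prefixed big-endian integers (a 4-byte length followed by that many bytes). That framing can be sketched standalone; the helper names here are mine, not ssh-audit's:

```python
import binascii
import struct

def read_ssh_string(payload: bytes, ptr: int):
    """Read one length-prefixed SSH field; return (value, new offset)."""
    (length,) = struct.unpack('>I', payload[ptr:ptr + 4])
    ptr += 4
    return payload[ptr:ptr + length], ptr + length

def parse_gex_group(payload: bytes):
    """Parse the prime modulus p and generator g from a GEX_GROUP payload."""
    p_bytes, ptr = read_ssh_string(payload, 0)
    g_bytes, _ = read_ssh_string(payload, ptr)
    return int(binascii.hexlify(p_bytes), 16), int(binascii.hexlify(g_bytes), 16)

# Tiny fake payload: p = 0x11 (17), g = 0x02.
payload = struct.pack('>I', 1) + b'\x11' + struct.pack('>I', 1) + b'\x02'
print(parse_gex_group(payload))  # (17, 2)
```

The `int(binascii.hexlify(...), 16)` step is the same trick the original uses to turn a raw big-endian byte string into a Python integer.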
9c5707ff8012fd28dc76181051dd92cc37e3c79f | 2,107 | py | Python | Chatbot_DockerVersion/webapp/requirements/mindmeld/tests/test_request.py | ptrckhmmr/ChatBotforCulturalInstitutions | c3da1a6d142e306c2e3183ba5609553e15a0e124 | [
"Apache-2.0"
] | 1 | 2020-12-24T13:28:35.000Z | 2020-12-24T13:28:35.000Z | Chatbot_DockerVersion/webapp/requirements/mindmeld/tests/test_request.py | ptrckhmmr/ChatBotforCulturalInstitutions | c3da1a6d142e306c2e3183ba5609553e15a0e124 | [
"Apache-2.0"
] | null | null | null | Chatbot_DockerVersion/webapp/requirements/mindmeld/tests/test_request.py | ptrckhmmr/ChatBotforCulturalInstitutions | c3da1a6d142e306c2e3183ba5609553e15a0e124 | [
"Apache-2.0"
] | null | null | null |
import pytest
from attr.exceptions import FrozenInstanceError

from mindmeld.components.request import Request, Params, FrozenParams


@pytest.fixture
def request():
    return Request(domain='some_domain', intent='some_intent', entities=(), text='some_text')


def test_domain(request):
    with pytest.raises(FrozenInstanceError):
        request.domain = 'new_domain'


def test_intent(request):
    with pytest.raises(FrozenInstanceError):
        request.intent = 'new_intent'


def test_entities(request):
    with pytest.raises(FrozenInstanceError):
        request.entities = ('some_entity',)


def test_text(request):
    with pytest.raises(FrozenInstanceError):
        request.text = 'some_text'


def test_frame(request):
    with pytest.raises(FrozenInstanceError):
        request.frame = {'key': 'value'}


def test_params(request):
    with pytest.raises(FrozenInstanceError):
        request.params = {'key': 'value'}


def test_context(request):
    with pytest.raises(FrozenInstanceError):
        request.context = {'key': 'value'}


def test_nbest(request):
    with pytest.raises(FrozenInstanceError):
        request.confidences = {'key': 'value'}
    with pytest.raises(FrozenInstanceError):
        request.nbest_transcripts_text = ['some_text']
    with pytest.raises(FrozenInstanceError):
        request.nbest_transcripts_entities = [{'key': 'value'}]
    with pytest.raises(FrozenInstanceError):
        request.nbest_aligned_entities = [{'key': 'value'}]


def test_immutability_of_request_and_params():
    """Test the immutability of the request and params objects"""
    with pytest.raises(FrozenInstanceError):
        params = FrozenParams()
        params.allowed_intents = []

    with pytest.raises(TypeError):
        params = FrozenParams()
        params.dynamic_resource['a'] = 'b'

    with pytest.raises(FrozenInstanceError):
        request = Request()
        request.params = Params()

    with pytest.raises(TypeError):
        request = Request()
        request.frame['a'] = 'b'
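The behavior these tests exercise — attribute rebinding on a frozen `attrs` class raising `FrozenInstanceError`, while a contained dict stays mutable unless it is itself replaced by an immutable mapping — has a standard-library analog in `dataclasses`, which raises its own `FrozenInstanceError`. A minimal sketch (not mindmeld's `Request` class):

```python
import dataclasses
from dataclasses import dataclass, field

@dataclass(frozen=True)
class FrozenRequest:
    domain: str = ''
    intent: str = ''
    frame: dict = field(default_factory=dict)

req = FrozenRequest(domain='some_domain', intent='some_intent')
try:
    req.domain = 'new_domain'  # rebinding an attribute is blocked
except dataclasses.FrozenInstanceError as exc:
    print('blocked:', type(exc).__name__)

req.frame['a'] = 'b'  # the contained dict is still mutable
print(req.frame)      # {'a': 'b'}
```

This is why the original suite needs the separate `TypeError` checks: `FrozenParams` presumably stores `dynamic_resource` as an immutable mapping, so item assignment fails with `TypeError` rather than `FrozenInstanceError`.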
9c6296b20a014354603e01eea7e45b5500fe9c61 | 4,878 | py | Python | src/fidesctl/cli/commands/generate.py | ethyca/fides | 03f4521cba31f01e482d3b7d02f6f1f59abaae74 | [
"Apache-2.0"
] | 153 | 2021-11-01T20:59:33.000Z | 2022-03-31T02:31:50.000Z | src/fidesctl/cli/commands/generate.py | ethyca/fides | 03f4521cba31f01e482d3b7d02f6f1f59abaae74 | [
"Apache-2.0"
] | 205 | 2021-11-03T12:33:03.000Z | 2022-03-31T20:33:14.000Z | src/fidesctl/cli/commands/generate.py | ethyca/fides | 03f4521cba31f01e482d3b7d02f6f1f59abaae74 | [
"Apache-2.0"
] | 24 | 2021-11-03T15:08:14.000Z | 2022-03-30T05:51:45.000Z |
"""Contains the generate group of CLI commands for Fidesctl."""
import click

from fidesctl.cli.options import (
    aws_access_key_id_option,
    aws_region_option,
    aws_secret_access_key_option,
    connection_string_option,
    credentials_id_option,
    include_null_flag,
    okta_org_url_option,
    okta_token_option,
)
from fidesctl.cli.utils import (
    handle_aws_credentials_options,
    handle_database_credentials_options,
    handle_okta_credentials_options,
    with_analytics,
)
from fidesctl.core import dataset as _dataset
from fidesctl.core import system as _system

@click.group(name="generate")
@click.pass_context
def generate(ctx: click.Context) -> None:
    """
    Generate fidesctl resource types
    """


@generate.group(name="dataset")
@click.pass_context
def generate_dataset(ctx: click.Context) -> None:
    """
    Generate fidesctl Dataset resources
    """

@generate_dataset.command(name="db")
@click.pass_context
@click.argument("output_filename", type=str)
@credentials_id_option
@connection_string_option
@include_null_flag
@with_analytics
def generate_dataset_db(
    ctx: click.Context,
    output_filename: str,
    connection_string: str,
    credentials_id: str,
    include_null: bool,
) -> None:
    """
    Connect to a database directly via a SQLAlchemy-style connection string and
    generate a dataset manifest file that consists of every schema/table/field.
    The connection string can be supplied as an option or as a credentials
    reference in the fidesctl config.

    This is a one-time operation that does not track the state of the database.
    It will need to be run again if the database schema changes.
    """
    actual_connection_string = handle_database_credentials_options(
        fides_config=ctx.obj["CONFIG"],
        connection_string=connection_string,
        credentials_id=credentials_id,
    )

    _dataset.generate_dataset_db(
        connection_string=actual_connection_string,
        file_name=output_filename,
        include_null=include_null,
    )

@generate.group(name="system")
@click.pass_context
def generate_system(ctx: click.Context) -> None:
    """
    Generate fidesctl System resources
    """

@generate_system.command(name="okta")
@click.pass_context
@click.argument("output_filename", type=str)
@credentials_id_option
@okta_org_url_option
@okta_token_option
@include_null_flag
@with_analytics
def generate_system_okta(
    ctx: click.Context,
    output_filename: str,
    credentials_id: str,
    token: str,
    org_url: str,
    include_null: bool,
) -> None:
    """
    Generate systems for your Okta applications. Connect to an Okta admin
    account by providing an organization url and auth token, or a credentials
    reference in the fidesctl config. The auth token and organization url can
    also be supplied by setting environment variables as defined by the Okta
    python sdk.

    This is a one-time operation that does not track the state of the Okta
    resources. It will need to be run again if the tracked resources change.
    """
    okta_config = handle_okta_credentials_options(
        fides_config=ctx.obj["CONFIG"],
        token=token,
        org_url=org_url,
        credentials_id=credentials_id,
    )

    _system.generate_system_okta(
        okta_config=okta_config,
        file_name=output_filename,
        include_null=include_null,
    )

@generate_system.command(name="aws")
@click.pass_context
@click.argument("output_filename", type=str)
@credentials_id_option
@aws_access_key_id_option
@aws_secret_access_key_option
@aws_region_option
@include_null_flag
@click.option("-o", "--organization", type=str, default="default_organization")
@with_analytics
def generate_system_aws(
    ctx: click.Context,
    output_filename: str,
    include_null: bool,
    organization: str,
    credentials_id: str,
    access_key_id: str,
    secret_access_key: str,
    region: str,
) -> None:
    """
    Connect to an AWS account and generate a system manifest file that consists
    of every tracked resource. Credentials can be supplied as options, as a
    credentials reference in the fidesctl config, or via boto3 environment
    configuration.

    Tracked resources: [Redshift, RDS]

    This is a one-time operation that does not track the state of the AWS
    resources. It will need to be run again if the tracked resources change.
    """
    config = ctx.obj["CONFIG"]
    aws_config = handle_aws_credentials_options(
        fides_config=config,
        access_key_id=access_key_id,
        secret_access_key=secret_access_key,
        region=region,
        credentials_id=credentials_id,
    )

    _system.generate_system_aws(
        file_name=output_filename,
        include_null=include_null,
        organization_key=organization,
        aws_config=aws_config,
        url=config.cli.server_url,
        headers=config.user.request_headers,
    )
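Each subcommand above funnels its connection settings through a `handle_*_credentials_options` helper. The precedence those helpers implement — an explicit CLI option wins, otherwise a `credentials_id` reference is looked up in the config — can be sketched as a pure function (hypothetical names, not the fidesctl implementation):

```python
def resolve_connection_string(option_value, credentials_id, config_credentials):
    """Prefer the explicit CLI option; fall back to the config reference."""
    if option_value:
        return option_value
    if credentials_id:
        try:
            return config_credentials[credentials_id]["connection_string"]
        except KeyError:
            raise SystemExit(f"credentials_id '{credentials_id}' not found in config")
    raise SystemExit("supply --connection-string or --credentials-id")

config = {"app_db": {"connection_string": "postgresql://db:5432/app"}}
print(resolve_connection_string("sqlite:///local.db", None, config))  # explicit option wins
print(resolve_connection_string(None, "app_db", config))              # config fallback
```

Keeping this resolution out of the command bodies is what lets all three `generate` subcommands share one option-handling pattern.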
9c71942acd9c4d02bbf5f89b10912c5dba214492 | 5,682 | py | Python | espnet/nets/asr_interface.py | kqwyf/espnet | 78057f3ad8f69998cfa8b8c88fe51c5943f67e41 | [
"Apache-2.0"
] | 3 | 2021-05-27T13:33:37.000Z | 2021-10-06T05:52:20.000Z | espnet/nets/asr_interface.py | kqwyf/espnet | 78057f3ad8f69998cfa8b8c88fe51c5943f67e41 | [
"Apache-2.0"
] | null | null | null | espnet/nets/asr_interface.py | kqwyf/espnet | 78057f3ad8f69998cfa8b8c88fe51c5943f67e41 | [
"Apache-2.0"
] | 2 | 2021-11-30T07:42:44.000Z | 2021-12-01T07:10:01.000Z |
"""ASR Interface module."""
import argparse

from espnet.bin.asr_train import get_parser
from espnet.utils.dynamic_import import dynamic_import
from espnet.utils.fill_missing_args import fill_missing_args

class ASRInterface:
    """ASR Interface for ESPnet model implementation."""

    @staticmethod
    def add_arguments(parser):
        """Add arguments to parser."""
        return parser

    @classmethod
    def build(cls, idim: int, odim: int, **kwargs):
        """Initialize this class with python-level args.

        Args:
            idim (int): The number of an input feature dim.
            odim (int): The number of output vocab.

        Returns:
            ASRInterface: A new instance of ASRInterface.

        """

        def wrap(parser):
            return get_parser(parser, required=False)

        args = argparse.Namespace(**kwargs)
        args = fill_missing_args(args, wrap)
        args = fill_missing_args(args, cls.add_arguments)
        return cls(idim, odim, args)
    def forward(self, xs, ilens, ys):
        """Compute loss for training.

        :param xs:
            For pytorch, batch of padded source sequences torch.Tensor (B, Tmax, idim)
            For chainer, list of source sequences chainer.Variable
        :param ilens: batch of lengths of source sequences (B)
            For pytorch, torch.Tensor
            For chainer, list of int
        :param ys:
            For pytorch, batch of padded source sequences torch.Tensor (B, Lmax)
            For chainer, list of source sequences chainer.Variable
        :return: loss value
        :rtype: torch.Tensor for pytorch, chainer.Variable for chainer
        """
        raise NotImplementedError("forward method is not implemented")
    def recognize(self, x, recog_args, char_list=None, rnnlm=None):
        """Recognize x for evaluation.

        :param ndarray x: input acoustic feature (B, T, D) or (T, D)
        :param namespace recog_args: argument namespace containing options
        :param list char_list: list of characters
        :param torch.nn.Module rnnlm: language model module
        :return: N-best decoding results
        :rtype: list
        """
        raise NotImplementedError("recognize method is not implemented")
    def recognize_batch(self, x, recog_args, char_list=None, rnnlm=None):
        """Beam search implementation for batch.

        :param torch.Tensor x: encoder hidden state sequences (B, Tmax, Henc)
        :param namespace recog_args: argument namespace containing options
        :param list char_list: list of characters
        :param torch.nn.Module rnnlm: language model module
        :return: N-best decoding results
        :rtype: list
        """
        raise NotImplementedError("Batch decoding is not supported yet.")
    def calculate_all_attentions(self, xs, ilens, ys):
        """Calculate attention.

        :param list xs: list of padded input sequences [(T1, idim), (T2, idim), ...]
        :param ndarray ilens: batch of lengths of input sequences (B)
        :param list ys: list of character id sequence tensor [(L1), (L2), (L3), ...]
        :return: attention weights (B, Lmax, Tmax)
        :rtype: float ndarray
        """
        raise NotImplementedError("calculate_all_attentions method is not implemented")
    def calculate_all_ctc_probs(self, xs, ilens, ys):
        """Calculate CTC probability.

        :param list xs: list of padded input sequences [(T1, idim), (T2, idim), ...]
        :param ndarray ilens: batch of lengths of input sequences (B)
        :param list ys: list of character id sequence tensor [(L1), (L2), (L3), ...]
        :return: CTC probabilities (B, Tmax, vocab)
        :rtype: float ndarray
        """
        raise NotImplementedError("calculate_all_ctc_probs method is not implemented")
    @property
    def attention_plot_class(self):
        """Get attention plot class."""
        from espnet.asr.asr_utils import PlotAttentionReport

        return PlotAttentionReport

    @property
    def ctc_plot_class(self):
        """Get CTC plot class."""
        from espnet.asr.asr_utils import PlotCTCReport

        return PlotCTCReport
    def encode(self, feat):
        """Encode feature in `beam_search` (optional).

        Args:
            feat (numpy.ndarray): input feature (T, D)

        Returns:
            torch.Tensor for pytorch, chainer.Variable for chainer:
                encoded feature (T, D)

        """
        raise NotImplementedError("encode method is not implemented")
    def scorers(self):
        """Get scorers for `beam_search` (optional).

        Returns:
            dict[str, ScorerInterface]: dict of `ScorerInterface` objects

        """
        raise NotImplementedError("scorers method is not implemented")

predefined_asr = {
    "pytorch": {
        "rnn": "espnet.nets.pytorch_backend.e2e_asr:E2E",
        "transducer": "espnet.nets.pytorch_backend.e2e_asr_transducer:E2E",
        "transformer": "espnet.nets.pytorch_backend.e2e_asr_transformer:E2E",
    },
    "chainer": {
        "rnn": "espnet.nets.chainer_backend.e2e_asr:E2E",
        "transformer": "espnet.nets.chainer_backend.e2e_asr_transformer:E2E",
    },
}

def dynamic_import_asr(module, backend):
    """Import ASR models dynamically.

    Args:
        module (str): module_name:class_name or alias in `predefined_asr`
        backend (str): NN backend. e.g., pytorch, chainer

    Returns:
        type: ASR class

    """
    model_class = dynamic_import(module, predefined_asr.get(backend, dict()))
    assert issubclass(
        model_class, ASRInterface
    ), f"{module} does not implement ASRInterface"
    return model_class
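`dynamic_import` resolves either an alias from the `predefined_asr` table or a raw `module:class` path string. A standalone sketch of that resolution on top of `importlib` (assumed behavior, not ESPnet's exact implementation):

```python
import importlib

def dynamic_import(import_path, alias=None):
    """Resolve a 'module:object' string, optionally through an alias table."""
    alias = alias or {}
    import_path = alias.get(import_path, import_path)
    if ':' not in import_path:
        raise ValueError(f"{import_path} is not a 'module:object' path or a known alias")
    module_name, obj_name = import_path.split(':')
    module = importlib.import_module(module_name)
    return getattr(module, obj_name)

from collections import OrderedDict
print(dynamic_import('collections:OrderedDict') is OrderedDict)          # True
print(dynamic_import('od', alias={'od': 'collections:OrderedDict'}) is OrderedDict)  # True
```

This is why `dynamic_import_asr` can accept both `"rnn"` (an alias keyed by backend) and a fully qualified `"my_pkg.my_module:E2E"` string from the command line.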
9c7cc6985912193094f9488cab3579ebbac8b042 | 2,847 | py | Python | ise-hotspot-access-code.py | dagolovach/ise-hotspot-access-code | 83d2cadaa2a15c292f80d6a33048dd7c31c7f741 | [
"MIT"
] | null | null | null | ise-hotspot-access-code.py | dagolovach/ise-hotspot-access-code | 83d2cadaa2a15c292f80d6a33048dd7c31c7f741 | [
"MIT"
] | null | null | null | ise-hotspot-access-code.py | dagolovach/ise-hotspot-access-code | 83d2cadaa2a15c292f80d6a33048dd7c31c7f741 | [
"MIT"
] | null | null | null |
# -----------------------------------------------------------
# Generate a new access-code for the ISE Hotspot portal,
# push it to the ISE using API and send an email with
# new access-code
#
# requires the portal-id
#
# (C) 2019 Dmitry Golovach
# email dmitry.golovach@outlook.com
# -----------------------------------------------------------
import requests
import json
import random
import string
from requests.auth import HTTPBasicAuth
import smtplib

def send_email(access_code):
    """
    Send an email with the newly generated access code.
    Configuration could be different if another SMTP server is in use;
    the current setup is for a Gmail account.
    Parameters to change:
        YOUR_EMAIL@GMAIL.COM - email to use as from address
        YOUR_EMAIL@GMAIL.COM - email to use as to address
        YOUR_GMAIL_APP_PASSWORD - password for gmail account
    """
    fromaddr = "YOUR_EMAIL@GMAIL.COM"
    toaddr = "YOUR_EMAIL@GMAIL.COM"
    server = smtplib.SMTP('smtp.gmail.com', 587)
    server.ehlo()
    server.starttls()
    server.ehlo()
    server.login(fromaddr, "YOUR_GMAIL_APP_PASSWORD")
    SUBJECT = "New Guest Password"
    TEXT = "New Guest Password: " + str(access_code)
    message = 'Subject: {}\n\n{}'.format(SUBJECT, TEXT)
    server.sendmail(fromaddr, toaddr, message)
    server.quit()
    return

def randomStringDigits(stringLength=6):
    """Generate a random string of letters and digits."""
    lettersAndDigits = string.ascii_letters + string.digits
    return ''.join(random.choice(lettersAndDigits) for i in range(stringLength))

def main():
    """
    Generate a new access code, update the ISE Hotspot portal configuration
    via the ERS API, and send an email with the new code.
    Parameters to change:
        ISE_PAN_IP - ISE PAN IP address
        HOTSPOT-PORTAL-ID - ISE Hotspot Portal ID
        HOTSPOT-PORTAL-NAME - ISE Hotspot Portal NAME
        ISE_USERNAME - ISE API username
        ISE_PASSWORD - ISE API password
    """
    url = 'https://<ISE_PAN_IP>:9060/ers/config/hotspotportal/HOTSPOT-PORTAL-ID'
    access_code = randomStringDigits(6)
    data = {'HotspotPortal':
            {'id': 'HOTSPOT-PORTAL-ID',
             'name': 'HOTSPOT-PORTAL-NAME',
             'settings':
                 {"aupSettings":
                  {"includeAup": 'true',
                   "requireAccessCode": 'true',
                   "accessCode": access_code,
                   "requireScrolling": 'false'
                   }
                  }
             }
            }
    headers = {"Content-Type": "application/json"}
    response = requests.put(url, data=json.dumps(data), headers=headers, verify=False, auth=HTTPBasicAuth('ISE_USERNAME', 'ISE_PASSWORD'))
    send_email(access_code)
    return

if __name__ == "__main__":
    main()
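One note on `randomStringDigits`: the `random` module is fine for a demo, but for a security-sensitive access code the standard library's `secrets` module is the cryptographically sound choice. A sketch of the same helper on top of `secrets` (a suggested alternative, not the original script's code):

```python
import secrets
import string

def random_access_code(length: int = 6) -> str:
    """Generate an unpredictable alphanumeric access code."""
    alphabet = string.ascii_letters + string.digits
    return ''.join(secrets.choice(alphabet) for _ in range(length))

code = random_access_code(6)
print(code, len(code))
```

`secrets.choice` draws from the OS CSPRNG, so the generated codes are not reproducible from the PRNG state the way `random.choice` output is.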
9c7e116b42dd5ab781f175abebd600d7ce721133 | 183 | py | Python | day1/__main__.py | adampirani/advent-of-code-2020-python | 8e2bfaef83ff509b6198af43787435dc41bc4b60 | [
"Unlicense"
] | null | null | null | day1/__main__.py | adampirani/advent-of-code-2020-python | 8e2bfaef83ff509b6198af43787435dc41bc4b60 | [
"Unlicense"
] | null | null | null | day1/__main__.py | adampirani/advent-of-code-2020-python | 8e2bfaef83ff509b6198af43787435dc41bc4b60 | [
"Unlicense"
] | null | null | null |
from day1.src.products import findTripleProduct
EXPENSES = open('day1/input.txt', 'r')
expenses_array = EXPENSES.read().splitlines()
print(findTripleProduct(2020, expenses_array))
| 22.875 | 47 | 0.781421 | 22 | 183 | 6.409091 | 0.727273 | 0.184397 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.035928 | 0.087432 | 183 | 7 | 48 | 26.142857 | 0.808383 | 0 | 0 | 0 | 0 | 0 | 0.082418 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0.25 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
92c4e0f6cdef6f5ce30d9ce3f99c8a3d0be3ea44 | 2,114 | py | Python | app/lib/twitter_api/rates.py | MichaelCurrin/twitterverse | 9629f848377e4346be833db70f11c593cc0d7b6c | [
"MIT"
] | 10 | 2019-03-22T07:07:41.000Z | 2022-01-26T00:57:45.000Z | app/lib/twitter_api/rates.py | MichaelCurrin/twitterverse | 9629f848377e4346be833db70f11c593cc0d7b6c | [
"MIT"
] | 70 | 2017-07-12T19:49:38.000Z | 2020-09-02T10:03:28.000Z | app/lib/twitter_api/rates.py | MichaelCurrin/twitterverse | 9629f848377e4346be833db70f11c593cc0d7b6c | [
"MIT"
] | 2 | 2017-06-30T07:13:39.000Z | 2020-12-04T00:39:12.000Z |
"""
Handle Twitter API rate limit error.
The newer version of tweepy accepts the following arguments on the tweepy.API
object.
- wait_on_rate_limit: Whether to wait until the next rate limit window is
  reached before continuing. Default is False.
- wait_on_rate_limit_notify: Whether the API prints a notification when the
  rate limit is hit. Default is False. See the tweepy binder.py script.
See authentication.py for setting those on the tweepy.API object, so that the
catching of errors with limitHandled below is not needed.
This script is based on tutorial in the documentation. It needs
to be improved but shows where a hook could be used instead of the standard
waiting. e.g. to log the warning to a different location, or process
data. See asyncio library's sleep and return of control, as alternative to
time.sleep.
"""
import time
import tweepy

def limitHandled(cursor):
    """
    Function to handle Twitter API rate limiting when cursoring through items
    (note that this does not work with the Streaming API).

    Since cursors raise RateLimitErrors in their next() method, handling them
    can be done by wrapping the cursor in an iterator, such that an error is
    never raised outside the cursor. Alternatively, if not using a cursor, set
    wait_on_rate_limit to True on the tweepy.API object.

    See tweepy docs and link:
        https://stackoverflow.com/questions/21308762/avoid-twitter-api-limitation-with-tweepy

    TODO: Sleeping for 15 minutes is not efficient, as when you exceed the
    limit for the 15 minute window, you could be a few seconds from reaching
    the next window. Rather get the reset time and wait until the current time
    reaches it. See tweepy's binder.py script which does this.

    :param cursor: tweepy Cursor items list. Example usage:
        >>> for x in limitHandled(tweepy.Cursor(api.followers).items()):
        ...     print(x)
    :return: cursor.next() in a generator expression.
    """
    while True:
        try:
            yield next(cursor)
        except tweepy.RateLimitError as e:
            print("Sleeping 15 min. {0}".format(str(e)))
            time.sleep(15 * 60)
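The wrap-the-cursor pattern generalizes to any iterator whose `__next__` can raise a retryable error. A library-agnostic sketch with an injectable sleep function, so the 15-minute wait is testable without tweepy or real delays (names here are mine):

```python
import time

class RateLimitError(Exception):
    """Stand-in for tweepy.RateLimitError (hypothetical, for the sketch)."""

def limit_handled(cursor, wait_seconds=15 * 60, sleep=time.sleep):
    """Yield items from cursor, sleeping through rate-limit errors."""
    while True:
        try:
            yield next(cursor)
        except RateLimitError:
            sleep(wait_seconds)
        except StopIteration:
            return

class FlakyCursor:
    """Fake cursor: raises RateLimitError once, then yields 1 and 2."""
    def __init__(self):
        self.calls = 0
    def __next__(self):
        self.calls += 1
        if self.calls == 1:
            raise RateLimitError()
        if self.calls <= 3:
            return self.calls - 1
        raise StopIteration

waits = []
print(list(limit_handled(FlakyCursor(), sleep=waits.append)))  # [1, 2]
print(waits)  # [900]
```

Injecting `sleep` is the design choice worth copying: the original's hard-coded `time.sleep(15 * 60)` makes the generator effectively untestable.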
92d1cfb69330a0cd6ef4c0a089be195ea3e220bb | 2,921 | py | Python | a_storage/codec/_codec_data.py | praefrontalis/Anfisa-Annotations | b4127c68e3696b75b2972f6759437034cc56f8e3 | [
"Apache-2.0"
] | null | null | null | a_storage/codec/_codec_data.py | praefrontalis/Anfisa-Annotations | b4127c68e3696b75b2972f6759437034cc56f8e3 | [
"Apache-2.0"
] | 3 | 2022-03-28T13:44:24.000Z | 2022-03-28T13:53:57.000Z | a_storage/codec/_codec_data.py | praefrontalis/Anfisa-Annotations | b4127c68e3696b75b2972f6759437034cc56f8e3 | [
"Apache-2.0"
] | 3 | 2019-02-18T17:05:06.000Z | 2022-03-22T19:42:38.000Z |
import abc

#===============================================
class _CodecData:
    sCreateFunc = None

    @classmethod
    def create(cls, master, parent, schema_instr, default_name="?"):
        return cls.sCreateFunc(master, parent, schema_instr, default_name)

    def __init__(self, master, parent, schema_instr, default_name="?"):
        self.mMaster = master
        self.mParent = parent
        self.mSchemaInstr = schema_instr
        self.mSchemaDescr = dict()
        self.mOnDuty = False
        self._getProperty("tp")
        name = self._getProperty("name", default_name)
        if default_name != "?":
            assert name == default_name
        self.mLabel = schema_instr.get("label")
        if self.mLabel is not None:
            master.setCodecByLabel(self._getProperty("label"), self)

    def _getProperty(self, name, default_value=None):
        if name in self.mSchemaDescr:
            return self.mSchemaDescr[name]
        if name in self.mSchemaInstr:
            self.mSchemaDescr[name] = self.mSchemaInstr[name]
        else:
            assert default_value is not None, (
                "Property %s is required for codec %s"
                % (name, self.getPath()))
            self.mSchemaDescr[name] = default_value
        return self.mSchemaDescr[name]

    def _updateProperty(self, key, val):
        self.mSchemaDescr[key] = val

    def _checkNameUsage(self, used_names):
        name = self.getName()
        assert name, "Empty name for codec %s" % self.getPath()
        assert name not in used_names, (
            "Duplicate name for codec %s" % self.getPath())
        used_names.add(name)

    def _onDuty(self):
        assert not self.mOnDuty
        unused = set(self.mSchemaInstr.keys()) - set(self.mSchemaDescr.keys())
        assert not self.mMaster.isWriteMode() or len(unused) == 0, (
            "Lost option(s) for codec %s: %s"
            % (self.getPath(), ", ".join(sorted(unused))))
        self.mOnDuty = True

    def getSchemaDescr(self):
        return self.mSchemaDescr

    def getMaster(self):
        return self.mMaster

    def getParent(self):
        return self.mParent

    def getName(self):
        return self._getProperty("name")

    def getPath(self):
        name = self.getName()
        if name:
            path_frag = "/" + name
        else:
            path_frag = ""
        if self.mParent is None:
            return path_frag
        return self.mParent.getPath() + path_frag

    def isAggregate(self):
        return False

    @abc.abstractmethod
    def isAtomic(self):
        return None

    @abc.abstractmethod
    def getType(self):
        return None

    @abc.abstractmethod
    def updateWStat(self):
        return None

    @abc.abstractmethod
    def encode(self, value, encode_env):
        return None

    @abc.abstractmethod
    def decode(self, repr_obj, decode_env):
        return None
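`_getProperty` implements a small "pull from the instruction, record in the description, fall back to a required default" cache: every property that is actually read gets copied into `mSchemaDescr`, which is what `_onDuty` later compares against to detect unused options. The core logic in isolation (a sketch detached from the codec class):

```python
def get_property(schema_instr, schema_descr, name, default_value=None):
    """Resolve name from the cache, then the instruction, then the default."""
    if name in schema_descr:
        return schema_descr[name]
    if name in schema_instr:
        schema_descr[name] = schema_instr[name]
    else:
        assert default_value is not None, f"Property {name} is required"
        schema_descr[name] = default_value
    return schema_descr[name]

instr, descr = {"tp": "num"}, {}
print(get_property(instr, descr, "tp"))         # num  (from the instruction)
print(get_property(instr, descr, "name", "x"))  # x    (from the default)
print(descr)                                    # {'tp': 'num', 'name': 'x'}
```

Because reads are recorded, `set(instr) - set(descr)` after setup is exactly the set of options the caller supplied but the codec never consumed — the check `_onDuty` asserts on.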
92de3bea15f957a46824d66786241607cfdf26c6 | 632 | py | Python | Tokenization/TokenizationTest02.py | SunWooBang/NLP-Practice | 0047abcad88e83fc7944cd97ecfae78f243ee636 | [
"MIT"
] | 1 | 2020-05-02T14:33:48.000Z | 2020-05-02T14:33:48.000Z | Tokenization/TokenizationTest02.py | SunWooBang/NLP-Practice | 0047abcad88e83fc7944cd97ecfae78f243ee636 | [
"MIT"
] | null | null | null | Tokenization/TokenizationTest02.py | SunWooBang/NLP-Practice | 0047abcad88e83fc7944cd97ecfae78f243ee636 | [
"MIT"
] | null | null | null |
# Sentence tokenization with NLTK
from nltk.tokenize import sent_tokenize

text = "His barber kept his word. But keeping such a huge secret to himself was driving him crazy. Finally, the barber went up a mountain and almost to the edge of a cliff. He dug a hole in the midst of some reeds. He looked about, to make sure no one was near."
print(sent_tokenize(text))

# A sentence containing many periods
text = "I am actively looking for Ph.D. students. and you are a Ph.D student."
print(sent_tokenize(text))

# Korean sentence tokenization with kss
import kss

text = '나는 관대하다. 관 씨 집안 3대 독자다. 내 이름은 대하다. 그래서 나는 관대하다'
print(kss.split_sentences(text))
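The point of `sent_tokenize` is that it handles abbreviations like "Ph.D." which a naive period split gets wrong; kss does the analogous job for Korean, where sentence ends are signaled by verb endings as much as by punctuation. The naive failure mode is easy to see without either library:

```python
import re

text = "I am actively looking for Ph.D. students. and you are a Ph.D student."

# Naive split: every '. ' boundary is treated as a sentence break.
naive = [s for s in re.split(r'\.\s+', text) if s]
print(naive)       # 'Ph.D' gets cut apart from 'students'
print(len(naive))  # 3 pieces instead of 2 sentences
```

A proper tokenizer uses a trained or rule-based model of abbreviations (NLTK's default is the Punkt model) to avoid exactly this over-splitting.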
92df50cf0543254340fc9b9b7faed234d5c2f44e | 290 | py | Python | tools/remove_all.py | CodyKochmann/stock_info | 8de591b5a589aefebccab3650cb5dfea987ab4c8 | [
"MIT"
] | null | null | null | tools/remove_all.py | CodyKochmann/stock_info | 8de591b5a589aefebccab3650cb5dfea987ab4c8 | [
"MIT"
] | null | null | null | tools/remove_all.py | CodyKochmann/stock_info | 8de591b5a589aefebccab3650cb5dfea987ab4c8 | [
"MIT"
] | null | null | null |
def remove_all(input_string, to_be_removed):
    ''' Remove all instances of a substring from a string. '''
    while to_be_removed in input_string:
        input_string = ''.join(input_string.split(to_be_removed))
    return input_string


if __name__ == '__main__':
    print(remove_all('hello world', 'l'))
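For a single fixed substring, this split/join trick is equivalent to the built-in `str.replace` with an empty replacement string, and `split` already removes every occurrence in one pass, so the loop body runs at most once:

```python
def remove_all(input_string, to_be_removed):
    """Split/join equivalent of input_string.replace(to_be_removed, '')."""
    return ''.join(input_string.split(to_be_removed))

sample = 'hello world'
print(remove_all(sample, 'l'))                              # heo word
print(remove_all(sample, 'l') == sample.replace('l', ''))   # True
```

`str.replace` is the idiomatic choice in production code; the split/join form is mainly useful as an illustration of how `split` consumes its separator.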
92e6f07046bf7fb3c454e2fdbb9563c84e20bd66 | 1,404 | py | Python | ticket/forms.py | aessex24/rindr | 2a546b27ce1c72728bb7a63e60653929ca592cfe | [
"MIT"
] | 1 | 2021-12-21T16:24:18.000Z | 2021-12-21T16:24:18.000Z | ticket/forms.py | aessex24/rindr | 2a546b27ce1c72728bb7a63e60653929ca592cfe | [
"MIT"
] | 1 | 2021-12-10T01:05:55.000Z | 2021-12-10T01:05:55.000Z | ticket/forms.py | aessex24/rindr | 2a546b27ce1c72728bb7a63e60653929ca592cfe | [
"MIT"
] | 1 | 2021-12-10T00:48:33.000Z | 2021-12-10T00:48:33.000Z |
from django import forms
from django.forms import ModelForm, DateInput, Textarea, TextInput, Form, CheckboxInput
from .models import Ticket
from datetime import datetime


class DateInput(forms.DateInput):
    input_type = 'date'


class TicketForm(forms.ModelForm):
    class Meta:
        model = Ticket
        fields = ['cause', 'type', 'opened', 'responded', 'affirmer', 'notes', 'reference', 'contributors', 'fix', 'team', 'difficulty', 'system', 'regression', 'regression_url']
        widgets = {
            'notes': TextInput,
        }

    opened = forms.DateTimeField(widget=TextInput(attrs={"type": "datetime-local", "step": 1, "value": datetime.now().isoformat('T').split(".")[0]}))
    responded = forms.DateTimeField(widget=TextInput(attrs={"type": "datetime-local", "step": 1, "value": datetime.now().isoformat('T').split(".")[0]}))
    closed = forms.DateTimeField(widget=TextInput(attrs={"type": "datetime-local", "step": 1, "value": datetime.now().isoformat('T').split(".")[0]}))
    affirmer = forms.CharField(widget=TextInput(attrs={}))
    team = forms.CharField(widget=TextInput(attrs={}))
    notes = forms.CharField(widget=TextInput(attrs={}))
    reference = forms.URLField(widget=TextInput(attrs={}))
    contributors = forms.CharField(widget=TextInput(attrs={}))
    regression = forms.BooleanField(widget=CheckboxInput(attrs={}))
    regression_url = forms.CharField(widget=TextInput(attrs={}))
92f2c9aa9daf4480549681b07dc4aaee1bb9e024 | 3,002 | py | Python | django_project/settings.py | shivam0501/CC-PROJECT | 74797b648c8b9d45ac063669320d97ccaafec902 | [
"Unlicense"
] | null | null | null | django_project/settings.py | shivam0501/CC-PROJECT | 74797b648c8b9d45ac063669320d97ccaafec902 | [
"Unlicense"
] | null | null | null | django_project/settings.py | shivam0501/CC-PROJECT | 74797b648c8b9d45ac063669320d97ccaafec902 | [
"Unlicense"
] | null | null | null | import os
# Build paths inside the project like this: os.path.join(BASE_DIR, ...)
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

# It is okay to share the secret key this time, because this application is not used in production.
# If you want to deploy this application, then you will need to reset this secret key.
SECRET_KEY = 'exhlfdat&vfum(-24*c2uf4wwddcp$o$9pv98=e6p^gl(-eoj'

# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = False

ALLOWED_HOSTS = ['personal-blogging-app.herokuapp.com', '127.0.0.1']

# Application definition
INSTALLED_APPS = [
    'blog.apps.BlogConfig',
    'users.apps.UsersConfig',
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
]

MIDDLEWARE = [
    'django.middleware.security.SecurityMiddleware',
    'django.contrib.sessions.middleware.SessionMiddleware',
    'django.middleware.common.CommonMiddleware',
    'django.middleware.csrf.CsrfViewMiddleware',
    'django.contrib.auth.middleware.AuthenticationMiddleware',
    'django.contrib.messages.middleware.MessageMiddleware',
    'django.middleware.clickjacking.XFrameOptionsMiddleware',
    'whitenoise.middleware.WhiteNoiseMiddleware',
]

ROOT_URLCONF = 'django_project.urls'

TEMPLATES = [
    {
        'BACKEND': 'django.template.backends.django.DjangoTemplates',
        'DIRS': ['blog/templates/'],
        'APP_DIRS': True,
        'OPTIONS': {
            'context_processors': [
                'django.template.context_processors.debug',
                'django.template.context_processors.request',
                'django.contrib.auth.context_processors.auth',
                'django.contrib.messages.context_processors.messages',
            ],
        },
    },
]

WSGI_APPLICATION = 'django_project.wsgi.application'

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    }
}

AUTH_PASSWORD_VALIDATORS = [
    {'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator'},
    {'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator'},
    {'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator'},
    {'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator'},
]

LANGUAGE_CODE = 'en-gb'
TIME_ZONE = 'Europe/Berlin'
USE_I18N = True  # setting the default language for users, if their locale isn't known
USE_L10N = True  # enabling date/time to be displayed in a local format
USE_TZ = True  # enabling timezone support

STATIC_URL = '/static/'
STATIC_ROOT = os.path.join(BASE_DIR, 'static')
MEDIA_URL = '/media/'
MEDIA_ROOT = os.path.join(BASE_DIR, 'media')

LOGIN_REDIRECT_URL = 'blog-home'
LOGIN_URL = 'login'

# Defining a view for the CSRF error
CSRF_FAILURE_VIEW = 'blog.views.csrf_failure'
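The nested `dirname` calls in `BASE_DIR` climb two directory levels up from the settings file; a quick stand-alone check with a hypothetical path (the `/srv/app` location is made up for illustration):

```python
import os.path

# Hypothetical absolute location of settings.py inside a project checkout.
settings_path = "/srv/app/django_project/settings.py"

# abspath() is a no-op on an already-absolute path; each dirname() strips
# one trailing component, landing on the project root.
base_dir = os.path.dirname(os.path.dirname(os.path.abspath(settings_path)))
print(base_dir)
# /srv/app
```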
# netmiko/keymile/keymile_ssh.py (AAm-kun/netmiko, MIT)

import time
from netmiko.no_enable import NoEnable
from netmiko.no_config import NoConfig
from netmiko.cisco.cisco_ios import CiscoIosBase


class KeymileSSH(NoEnable, NoConfig, CiscoIosBase):
    def __init__(self, **kwargs):
        kwargs.setdefault("default_enter", "\r\n")
        return super().__init__(**kwargs)

    def session_preparation(self):
        """Prepare the session after the connection has been established."""
        self._test_channel_read(pattern=r">")
        self.set_base_prompt()
        time.sleep(0.3 * self.global_delay_factor)
        self.clear_buffer()

    def disable_paging(self, *args, **kwargs):
        """Keymile does not use paging."""
        return ""

    def strip_prompt(self, a_string):
        """Remove the appended empty line and prompt from output."""
        self._write_session_log(a_string)
        a_string = a_string[:-1]
        return super().strip_prompt(a_string=a_string)

    def set_base_prompt(self, pri_prompt_terminator=">", **kwargs):
        """Set prompt termination to >."""
        return super().set_base_prompt(pri_prompt_terminator=pri_prompt_terminator)
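`strip_prompt` drops the final character before delegating to the base class; a minimal stand-alone illustration of that first step (the device output here is hypothetical, and no netmiko is required):

```python
# Hypothetical raw output: a command response followed by the ">" prompt.
raw = "show version output\n>"

# Drop the trailing prompt character, as KeymileSSH.strip_prompt does
# before handing the remainder to the base-class prompt stripping.
trimmed = raw[:-1]
print(repr(trimmed))
# 'show version output\n'
```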
# 02/requests.02.py (study-machine-learning/dongheon.shin, MIT)

import requests
res = requests.get("http://api.aoikujira.com/time/get.php")
text = res.text
print(text)
binary = res.content  # renamed from "bin", which shadows the built-in bin()
print(binary)
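`res.text` is a decoded `str` while `res.content` is the raw `bytes` of the response body; the relationship can be shown offline with a hypothetical payload, no HTTP request needed:

```python
# Hypothetical response body as raw bytes (what .content would hold).
content = "JST time: 12時34分".encode("utf-8")

# .text is the body decoded with the declared or detected charset.
text = content.decode("utf-8")

print(type(content).__name__, type(text).__name__)
# bytes str
```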
# wolphin/exceptions.py (locationlabs/wolphin, Apache-2.0)

class WolphinException(Exception):
"""
Base class for wolphin related exceptions
"""
def __init__(self, message=None):
"""
WolphinException constructor
:param message: error message for the exception
"""
self.message = message
def __str__(self):
return self.message
class NoRunningInstances(WolphinException):
"""
Raised when a project has no running instances.
"""
pass
class EC2InstanceLimitExceeded(WolphinException):
"""
Raised when ec2 instance limit is exceeded.
"""
pass
class InvalidWolphinConfiguration(WolphinException):
"""
Raised when an invalid wolphin configuration is encountered.
"""
pass
class SSHTimeoutError(WolphinException):
"""
Raised when all of a project's instances could not be made ssh-ready.
"""
pass
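Because every specific error subclasses `WolphinException`, callers can catch the whole family with one `except` clause. A self-contained sketch (re-declaring a minimal copy of the hierarchy so it runs on its own):

```python
class WolphinException(Exception):
    def __init__(self, message=None):
        self.message = message

    def __str__(self):
        return self.message


class NoRunningInstances(WolphinException):
    pass


try:
    raise NoRunningInstances("project 'demo' has no running instances")
except WolphinException as exc:  # catches any wolphin error
    print(exc)
# project 'demo' has no running instances
```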
# requests/packages/__init__.py (kustodian/google-cloud-sdk, Apache-2.0)

'''
Debian and other distributions "unbundle" requests' vendored dependencies, and
rewrite all imports to use the global versions of ``urllib3`` and ``chardet``.
The problem with this is that not only requests itself imports those
dependencies, but third-party code outside of the distros' control too.
In reaction to these problems, the distro maintainers replaced
``requests.packages`` with a magical "stub module" that imports the correct
modules. The implementations were varying in quality and all had severe
problems. For example, a symlink (or hardlink) that links the correct modules
into place introduces problems regarding object identity, since you now have
two modules in `sys.modules` with the same API, but different identities::
    requests.packages.urllib3 is not urllib3
With version ``2.5.2``, requests started to maintain its own stub, so that
distro-specific breakage would be reduced to a minimum, even though the whole
issue is not requests' fault in the first place. See
https://github.com/kennethreitz/requests/pull/2375 for the corresponding pull
request.
'''
from __future__ import absolute_import

import sys

try:
    from . import urllib3
except ImportError:
    import urllib3

try:
    from . import chardet
except ImportError:
    import chardet

try:
    from . import idna
except ImportError:
    import idna

for package in ('urllib3', 'idna', 'chardet'):
    # This traversal is apparently necessary such that the identities are
    # preserved (requests.packages.urllib3.* is urllib3.*)
    for mod in list(sys.modules):
        if mod == package or mod.startswith(package + '.'):
            sys.modules['requests.packages.' + mod] = sys.modules[mod]
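The aliasing at the end works purely through `sys.modules`: both keys point at the same module object, so `is` checks pass. A toy demonstration with a synthetic module (the names here are made up for illustration):

```python
import sys
import types

# Create a stand-in module and register it under its "real" name.
toy = types.ModuleType("toy_urllib3")
sys.modules["toy_urllib3"] = toy

# Alias it under a bundled-style dotted name, exactly like the stub does.
sys.modules["requests_stub.packages.toy_urllib3"] = sys.modules["toy_urllib3"]

# Both names resolve to the very same module object.
print(sys.modules["requests_stub.packages.toy_urllib3"] is toy)
# True
```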
# MyWatchList/models.py (fgl-foundation/MovieDB, MIT)

import datetime
from django.db import models
from django.db.models import Q  # needed by MessageManager.get_dialog below
from taggit.managers import TaggableManager
from django.conf import settings
from django.contrib.contenttypes.models import ContentType
from django.contrib.contenttypes.fields import GenericForeignKey
from django.contrib.auth.models import User
from django.urls import reverse


class MovieManager(models.Manager):
    def get_series(self):
        return self.filter(series=True, active=True)

    def get_films(self):
        return self.filter(series=False, active=True)

    def get_seriesByUser(self, user):
        return self.get_series().filter(watchlist__user=user, active=True).distinct()

    def get_filmsByUser(self, user):
        return self.get_films().filter(watchlist__user=user, active=True)
class Movie(models.Model):
    active = models.BooleanField(default=True)
    name = models.TextField()
    originalname = models.TextField()
    length = models.PositiveSmallIntegerField(default=0)
    year = models.PositiveSmallIntegerField()
    release_date = models.DateField(default=datetime.date.today)
    release_dateRU = models.DateField(default=datetime.date.today)
    UScert = models.CharField(null=True, max_length=10, blank=True)
    RUcert = models.CharField(null=True, max_length=10, blank=True)
    tmdbid = models.CharField(max_length=100)
    imdbid = models.CharField(null=True, blank=True, max_length=100)
    kinopoiskid = models.PositiveIntegerField(null=True, blank=True)
    disctiption = models.TextField(null=True, blank=True)  # sic: kept to match the existing schema
    rating = models.FloatField(default=0, editable=False)
    tags = TaggableManager()
    series = models.BooleanField(default=False)
    img = models.ImageField(upload_to='Posters', default="default.png")

    manager = MovieManager()
    objects = models.Manager()

    def __str__(self):
        return self.name

    class Meta:
        ordering = ('name',)

    def get_absolute_url(self):
        if self.series:
            return reverse("serial", kwargs={"id": self.id})
        else:
            return reverse("film", kwargs={"id": self.id})

    def get_seasons(self):
        if self.series:
            return self.season.all()


class StatusList(models.Model):
    name = models.TextField()
    color = models.TextField()

    def __str__(self):
        return self.name
class Season(models.Model):
    movie = models.ForeignKey(Movie, null=True, related_name="season", on_delete=models.CASCADE)
    status = models.ForeignKey(StatusList, default=1, on_delete=models.SET_DEFAULT)
    name = models.TextField()
    episodecount = models.IntegerField()
    position = models.PositiveSmallIntegerField(default=0)
    img = models.ImageField(upload_to='Posters', default="default.png")
    rating = models.FloatField(default=0, editable=False)
    tmdbid = models.CharField(max_length=100, null=True, blank=True)
    disctiption = models.TextField(default="Нет данных")  # default means "No data"

    def get_date(self):
        result = None
        try:
            result = self.serieslist.all().order_by('date').first().date
        except AttributeError:  # first() returned None: the season has no episodes
            pass
        return result

    class Meta:
        ordering = ('movie__name', 'position', 'name')

    def __str__(self):
        return str(self.name + " " + self.movie.name)

    def get_absolute_url(self):
        return reverse("season", kwargs={"id": self.id})


class SeriesList(models.Model):
    season = models.ForeignKey(Season, related_name="serieslist", on_delete=models.CASCADE)
    name = models.TextField()
    date = models.DateField()
    disctiption = models.TextField(default="Нет данных", null=True, blank=True)
class Profile(models.Model):
    user = models.OneToOneField(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
    date_of_birth = models.DateField(blank=True, null=True)
    photo = models.ImageField(upload_to='users', default="users/default.png", blank=True)
    # Choice labels are Russian for male ('Муж') and female ('Жен').
    sex = models.CharField(max_length=1, choices=[('M', 'Муж'), ('F', 'Жен')], default='1')
    photobg = models.ImageField(upload_to='users/bg', null=True, blank=True)

    def __str__(self):
        return 'Profile for user {}'.format(self.user.username)

    @property
    def name(self):
        return " ".join((self.user.first_name, self.user.last_name))

    @property
    def is_f(self):
        return self.sex == "F"

    @property
    def age(self):
        from dateutil.relativedelta import relativedelta
        today = datetime.datetime.today()
        delta = relativedelta(today, self.date_of_birth)
        return delta.years

    @property
    def countNoties(self):
        return Notifications.objects.filter(profile__user=self.user).count()

    @property
    def img(self):
        return self.photo

    def get_absolute_url(self):
        return "/profile/%s/" % self.user.username
class MessageManager(models.Manager):
    def get_dialog(self, sender, accepter):
        return self.filter(
            Q(FromUser=accepter, ToUser=sender) | Q(FromUser=sender, ToUser=accepter)
        ).order_by("-sended")


class Messages(models.Model):
    FromUser = models.ForeignKey(Profile, on_delete=models.CASCADE, related_name='From')
    ToUser = models.ForeignKey(Profile, on_delete=models.CASCADE, related_name='To')
    sended = models.DateTimeField(auto_now_add=True)
    message = models.TextField()

    objects = MessageManager()
class Notifications(models.Model):
    profile = models.ForeignKey(Profile, on_delete=models.CASCADE)
    sended = models.DateTimeField(auto_now_add=True)
    message = models.TextField()
    item_ct = models.ForeignKey(ContentType, blank=True, null=True, related_name='item_notifi',
                                on_delete=models.CASCADE)
    item_id = models.PositiveIntegerField(null=True, blank=True, db_index=True)
    item = GenericForeignKey('item_ct', 'item_id')


class Feed(models.Model):
    profile = models.ForeignKey(Profile, on_delete=models.CASCADE)
    item_ct = models.ForeignKey(ContentType, blank=True, null=True, related_name='item_obj',
                                on_delete=models.CASCADE)
    item_id = models.PositiveIntegerField(null=True, blank=True)
    item = GenericForeignKey('item_ct', 'item_id')
    created = models.DateTimeField(auto_now_add=True)
    feed_type = models.CharField(max_length=20, null=True)
    verb = models.CharField(max_length=255, null=True)

    class Meta:
        ordering = ('-created',)

    def __str__(self):
        return '{} {}'.format(self.profile, self.verb)


class Follower(models.Model):
    follow_from = models.ForeignKey(Profile, related_name='rel_from_set', on_delete=models.CASCADE)
    follow_to = models.ForeignKey(Profile, related_name='rel_to_set', on_delete=models.CASCADE)
    created = models.DateTimeField(auto_now_add=True, db_index=True)

    class Meta:
        ordering = ('-created',)

    def __str__(self):
        return '{} follows to {}'.format(self.follow_from, self.follow_to)


# Attach the self-referential M2M through Follower once both classes exist.
Profile.add_to_class(
    'following',
    models.ManyToManyField('self', through=Follower, related_name='followers', symmetrical=False))
class CommentManager(models.Manager):
    def get_comments(self, item):
        return self.filter(content_type=ContentType.objects.get_for_model(item), object_id=item.id)


class CommentModel(models.Model):
    content_type = models.ForeignKey(ContentType, on_delete=models.CASCADE)
    object_id = models.PositiveIntegerField()
    item = GenericForeignKey('content_type', 'object_id')

    comments = CommentManager()
    objects = models.Manager()

    user = models.ForeignKey(User, on_delete=models.SET_DEFAULT, default=0)
    text = models.TextField()
    created = models.DateTimeField(auto_now_add=True)
    updated = models.DateTimeField(auto_now=True)
    spoiler = models.BooleanField(default=False)
    active = models.BooleanField(default=True)

    def get_absolute_url(self):
        return self.item.get_absolute_url()

    def __str__(self):
        return " - ".join((self.item.name, self.user.profile.name))


class ReplyModel(models.Model):
    item = models.ForeignKey(CommentModel, on_delete=models.CASCADE)

    comments = CommentManager()
    objects = models.Manager()

    user = models.ForeignKey(User, on_delete=models.SET_DEFAULT, default=0)
    text = models.TextField()
    created = models.DateTimeField(auto_now_add=True)
    updated = models.DateTimeField(auto_now=True)
    spoiler = models.BooleanField(default=False)
    active = models.BooleanField(default=True)
class WatchListManager(models.Manager):
    def get_series(self):
        return self.filter(movie__series=True)

    def get_seasons(self, wl):
        return self.filter(movie=wl.movie)

    def get_films(self):
        return self.filter(movie__series=False)


class WatchList(models.Model):
    user = models.ForeignKey(User, on_delete=models.CASCADE)
    movie = models.ForeignKey(Movie, on_delete=models.CASCADE)
    season = models.ForeignKey(Season, on_delete=models.CASCADE, null=True)
    userrate = models.PositiveSmallIntegerField(default=0)
    userstatus = models.PositiveSmallIntegerField(default=1)
    rewatch = models.PositiveIntegerField(default=0)
    updated = models.DateTimeField(auto_now=True)
    userepisode = models.PositiveIntegerField(default=0, null=True)

    manager = WatchListManager()
    objects = models.Manager()

    @property
    def get_status(self):
        from MyWatchList.views.list.userstatus import UserStatusDict
        return UserStatusDict.get(self.userstatus)

    @property
    def get_statusTag(self):
        from MyWatchList.views.list.userstatus import UserTagsStatusDict
        return UserTagsStatusDict.get(self.userstatus)


class UserList(models.Model):
    name = models.CharField(max_length=50)
    user = models.ForeignKey(User, on_delete=models.CASCADE)

    def __str__(self):
        return self.name


class UserListRecord(models.Model):
    header = models.ForeignKey(UserList, on_delete=models.CASCADE)
    movie = models.ForeignKey(Movie, on_delete=models.CASCADE)

    class Meta:
        unique_together = ('header', 'movie')
# packages/pyright-internal/src/tests/samples/genericTypes27.py (ashb/pyright, MIT)

# This sample tests that a generic type alias can use
# a compatible constrained TypeVar.
from typing import Generic, TypeVar
D = TypeVar("D", bool, int, float, object)
E = TypeVar("E", bool, int, float, object)
class Gen(Generic[D]):
    pass


GenAlias = Gen[E]
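Constrained TypeVars like `D` restrict which types the generic may be parameterized with, and the constraints are introspectable at runtime. A small stand-alone check, not part of the pyright sample itself:

```python
from typing import Generic, TypeVar

D = TypeVar("D", bool, int, float, object)


class Gen(Generic[D]):
    pass


# The declared constraints are recorded on the TypeVar itself.
print([t.__name__ for t in D.__constraints__])
# ['bool', 'int', 'float', 'object']
```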
# config/logger/file_logger.py (phirasit/TestcaseGenerator, MIT)

from config.logger import Logger
class FileLogger(Logger):
    def __init__(self, file_name):
        self.fp = open(file_name, 'a')

    def write(self, msg):
        self.fp.write(msg + "\n")
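A quick usage sketch with a temporary file. This is a stand-alone copy without the `Logger` base class, and a `close` method is added here (an assumption, not in the original) so the append buffer is flushed:

```python
import os
import tempfile


class FileLogger:
    def __init__(self, file_name):
        self.fp = open(file_name, 'a')

    def write(self, msg):
        self.fp.write(msg + "\n")

    def close(self):
        self.fp.close()


path = os.path.join(tempfile.mkdtemp(), "run.log")
log = FileLogger(path)
log.write("testcase 1 generated")
log.close()

with open(path) as fh:
    print(fh.read())
# testcase 1 generated
```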
# models/schemas.py (stylepatrick/crypto_price_tracker, MIT)

from datetime import datetime
from pydantic import BaseModel


class CoinBase(BaseModel):
    name: str
    price: float


class Coin(CoinBase):
    id: int
    time_created: datetime

    class Config:
        orm_mode = True
# lib/coginvasion/friends/FriendRequest.py (theclashingfritz/Cog-Invasion-Online-Dump, Apache-2.0)

# uncompyle6 version 3.2.4
# Python bytecode 2.7 (62211)
# Decompiled from: Python 2.7.15 (v2.7.15:ca079a3ea3, Apr 30 2018, 16:30:26) [MSC v.1500 64 bit (AMD64)]
# Embedded file name: lib.coginvasion.friends.FriendRequest
from direct.directnotify.DirectNotifyGlobal import directNotify
from direct.gui.DirectGui import DirectFrame, OnscreenText, DirectButton
from lib.coginvasion.toon import ToonDNA
class FriendRequest(DirectFrame):
    notify = directNotify.newCategory('FriendRequest')

    def __init__(self, name, dnaStrand):
        DirectFrame.__init__(self)
        dna = ToonDNA.ToonDNA()
        dna.setDNAStrand(dnaStrand)
134a475e8554f33fe233e9ce74e14203ca891951 | 276 | py | Python | testapp/models.py | mani571/apiview_1 | 21918d14c0dca08475b733e1f6756c1f4d3fb654 | [
"BSD-3-Clause"
] | null | null | null | testapp/models.py | mani571/apiview_1 | 21918d14c0dca08475b733e1f6756c1f4d3fb654 | [
"BSD-3-Clause"
] | null | null | null | testapp/models.py | mani571/apiview_1 | 21918d14c0dca08475b733e1f6756c1f4d3fb654 | [
"BSD-3-Clause"
] | null | null | null | from django.db import models
# Create your models here.


class details(models.Model):
    name = models.CharField(max_length=20)
    position = models.CharField(max_length=10)
    # Note: phone numbers are usually stored as CharField, since an
    # IntegerField cannot keep leading zeros or formatting.
    mobile = models.IntegerField()
    address = models.TextField(max_length=50)

    class Meta:
        db_table = "DETAILS"
# tests/test_conv_pad.py (f-sky/DeepV2D, BSD-3-Clause)

import os
os.environ['CUDA_VISIBLE_DEVICES'] = ''

import numpy as np
import matplotlib.pyplot as plt
import sys
import warnings

warnings.filterwarnings("ignore", message=r"Passing", category=FutureWarning)

import tensorflow as tf
from tensorflow.contrib import slim

np.random.seed(0)
config = tf.ConfigProto(
    device_count={'GPU': 0}
)
weight = np.ones((3, 3))[:, :, None, None]
# weight = np.load('../MonoTrack/tmp/conv0w.npy').transpose(2, 3, 1, 0)
# bias = np.load('../MonoTrack/tmp/conv0b.npy')

with tf.Session(config=config) as sess:
    x = tf.constant(np.ones((1, 96, 96, 1)))
    x = tf.pad(x, tf.constant([[0, 0], [0, 1], [0, 1], [0, 0]]), constant_values=0)
    y = slim.conv2d(x, 1, [3, 3], stride=2, padding='VALID', activation_fn=None,
                    weights_initializer=tf.constant_initializer(weight),
                    biases_initializer=tf.constant_initializer(0))
    sess.run(tf.global_variables_initializer())
    res = sess.run(y)
    print(res[0, :, :, 0].shape)

# todo: kernel_size=3, stride=2
# 5.21374078
# manual zero-pad: 3.0595844
# random weight
# 1->1
# SAME 13.21971972
# manual pad + valid 7.41130873
# all-ones weight
# SAME 25
# manual pad + valid 16
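The pad-then-VALID trick above reproduces SAME's output size for this even-sized input; the arithmetic can be checked directly, stand-alone and without TensorFlow:

```python
import math


def same_out(n, stride):
    # SAME padding: output size depends only on input size and stride.
    return math.ceil(n / stride)


def valid_out(n, kernel, stride):
    # VALID padding: no implicit padding.
    return (n - kernel) // stride + 1


n, k, s = 96, 3, 2
print(same_out(n, s))          # 48
print(valid_out(n + 1, k, s))  # pad one row/col on the bottom/right, then VALID: 48
```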
# Hackerrank/Python/07 - Collections/03 - collections.namedtuple().py (Next-Gen-UI/Code-Dynamics, MIT)

# ========================
# Information
# ========================
# Direct Link: https://www.hackerrank.com/challenges/py-collections-namedtuple/problem
# Difficulty: Easy
# Max Score: 20
# Language: Python
# ========================
# Solution
# ========================
from collections import namedtuple
N, STUDENT = int(input()), namedtuple('Student', input())
print("{:.2f}".format(sum([int(STUDENT(*input().split()).MARKS) for _ in range(N)]) / N))
# miniDocker/tests/server/test_standalone.py (WhiteVermouth/miniDocker-server, MIT)

from miniDocker.server.standalone import *
def test_list_containers():
    print(list_containers())


def test_stop_container(name):
    res = stop_container(name)
    if res["status"] == "success":
        print("success")


def test_start_container(name):
    res = start_container(name)
    if res["status"] == "success":
        print("success")


def test_remove_container(name):
    res = remove_container(name)
    if res["status"] == "success":
        print("success")


def test_get_logs(name):
    res = get_logs(name)
    if res["status"] == "success":
        print(res["logs"])


def test_pause(name):
    res = switch_container_pause_status(name)
    if res["status"] == "success":
        print("success")


def test_get_stats(name):
    res = get_stats(name)
    if res["status"] == "success":
        print(res["stats"])


if __name__ == '__main__':
    test_get_stats("eager_mestorf")
| 19.695652 | 45 | 0.636865 | 113 | 906 | 4.814159 | 0.238938 | 0.090074 | 0.099265 | 0.165441 | 0.472426 | 0.472426 | 0.472426 | 0.362132 | 0.362132 | 0.362132 | 0 | 0 | 0.211921 | 906 | 45 | 46 | 20.133333 | 0.761905 | 0 | 0 | 0.344828 | 0 | 0 | 0.15011 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.241379 | false | 0 | 0.034483 | 0 | 0.275862 | 0.241379 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
136c8c0f4ea658593706de40449b3b4311b93149 | 81 | py | Python | food_data.py | Siddhant8/repo-template | da97e5482a2cc75caf8241c29e8b46e2ae92c136 | [
"Unlicense"
] | null | null | null | food_data.py | Siddhant8/repo-template | da97e5482a2cc75caf8241c29e8b46e2ae92c136 | [
"Unlicense"
] | null | null | null | food_data.py | Siddhant8/repo-template | da97e5482a2cc75caf8241c29e8b46e2ae92c136 | [
"Unlicense"
] | 1 | 2022-01-12T20:22:12.000Z | 2022-01-12T20:22:12.000Z | #food_item
data = {"apple": 1, "banana": 5, "milk": 10, "bread": 3, "yogurt": 16} | 40.5 | 70 | 0.567901 | 13 | 81 | 3.461538 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.101449 | 0.148148 | 81 | 2 | 70 | 40.5 | 0.550725 | 0.111111 | 0 | 0 | 0 | 0 | 0.361111 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
1377c3d87e85c2a7dc8097098fae6877ba054194 | 442 | py | Python | litebot/core/minecraft/rpc.py | rybot666/LiteBot | 9598c021a59ee7983a1515c54a6cbc43bfcb5eb9 | [
"MIT"
] | 22 | 2020-10-18T22:36:51.000Z | 2022-03-27T07:49:25.000Z | litebot/core/minecraft/rpc.py | rybot666/LiteBot | 9598c021a59ee7983a1515c54a6cbc43bfcb5eb9 | [
"MIT"
] | 8 | 2021-07-14T06:46:47.000Z | 2021-08-17T06:09:52.000Z | litebot/core/minecraft/rpc.py | rybot666/LiteBot | 9598c021a59ee7983a1515c54a6cbc43bfcb5eb9 | [
"MIT"
] | 7 | 2021-05-04T16:56:19.000Z | 2021-10-12T05:44:31.000Z | from typing import Callable
def rpc(*, name) -> Callable:
"""Decorate a coroutine as an RPC method that can be executed by the server.
Args:
name: The name of the RPC method
Returns:
A decorator that will mark the coroutine as an RPC method
"""
def decorator(func) -> Callable:
func.__rpc_handler__ = True
func.__name__ = name or func.__name__
return func
return decorator | 23.263158 | 80 | 0.647059 | 60 | 442 | 4.55 | 0.516667 | 0.098901 | 0.095238 | 0.117216 | 0.161172 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.291855 | 442 | 19 | 81 | 23.263158 | 0.872204 | 0.427602 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0.142857 | 0 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
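A short usage sketch of the decorator (the decorator body is restated from above so this runs standalone; the `list_players` handler is a hypothetical example, not part of the library):

```python
from typing import Callable

def rpc(*, name) -> Callable:
    # Restated from above so the sketch is self-contained.
    def decorator(func) -> Callable:
        func.__rpc_handler__ = True
        func.__name__ = name or func.__name__
        return func
    return decorator

@rpc(name="list_players")
async def handler():
    # A hypothetical RPC method; the server would discover it
    # via the __rpc_handler__ marker set by the decorator.
    return ["alice", "bob"]

print(handler.__rpc_handler__, handler.__name__)  # True list_players
```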
137e51bb8e083eb532b29bf2df3c27fbf208253b | 802 | py | Python | demos/oneliners.py | dendisuhubdy/mpyc | 07c22fd5ae09fe310b6172847b99e009753d88fe | [
"MIT"
] | null | null | null | demos/oneliners.py | dendisuhubdy/mpyc | 07c22fd5ae09fe310b6172847b99e009753d88fe | [
"MIT"
] | null | null | null | demos/oneliners.py | dendisuhubdy/mpyc | 07c22fd5ae09fe310b6172847b99e009753d88fe | [
"MIT"
] | null | null | null | """Couple of MPyC oneliners.
Run with m parties to compute:
- m = sum_{i=0}^{m-1} 1 = sum(1 for i in range(m))
- m**2 = sum_{i=0}^{m-1} 2i+1 = sum(2*i+1 for i in range(m))
- 2**m = prod_{i=0}^{m-1} 2 = prod(2 for i in range(m))
- m! = prod_{i=0}^{m-1} i+1 = prod(i+1 for i in range(m))
Bit lengths of secure integers ensure each result fits for any m, 1<=m<=256.
"""
from mpyc.runtime import mpc
mpc.run(mpc.start())
print('m =', mpc.run(mpc.output(mpc.sum(mpc.input(mpc.SecInt(9)(1))))))
print('m**2 =', mpc.run(mpc.output(mpc.sum(mpc.input(mpc.SecInt(17)(2*mpc.pid+1))))))
print('2**m =', mpc.run(mpc.output(mpc.prod(mpc.input(mpc.SecInt(257)(2))))))
print('m! =', mpc.run(mpc.output(mpc.prod(mpc.input(mpc.SecInt(1685)(mpc.pid+1))))))
mpc.run(mpc.shutdown())
| 38.190476 | 86 | 0.599751 | 164 | 802 | 2.908537 | 0.27439 | 0.075472 | 0.113208 | 0.033543 | 0.530398 | 0.501048 | 0.408805 | 0.327044 | 0.327044 | 0.327044 | 0 | 0.060651 | 0.157107 | 802 | 20 | 87 | 40.1 | 0.64497 | 0.488778 | 0 | 0 | 0 | 0 | 0.059553 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.142857 | 0 | 0.142857 | 0.571429 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
1385d773d28d9a3083aa4902868748572d388f20 | 283 | py | Python | 1_beginner/chapter1/practice/style.py | code4tomorrow/Python | 035b6f5d8fd635a16caaff78bcd3f582663dadc3 | [
"MIT"
] | 4 | 2021-03-01T00:32:45.000Z | 2021-05-21T22:01:52.000Z | 1_beginner/chapter1/practice/style.py | code4tomorrow/Python | 035b6f5d8fd635a16caaff78bcd3f582663dadc3 | [
"MIT"
] | 29 | 2020-09-12T22:56:04.000Z | 2021-09-25T17:08:42.000Z | 1_beginner/chapter1/practice/style.py | code4tomorrow/Python | 035b6f5d8fd635a16caaff78bcd3f582663dadc3 | [
"MIT"
] | 7 | 2021-02-25T01:50:55.000Z | 2022-02-28T00:00:42.000Z | # Style
# Uncomment the following code, then
# fix the style in this file so that it runs properly
# and there are comments explaining the program
"""
print("Hello World!")
print("This is a Python program")
age =
input("Enter your age: ")
print("Your age is " + age)
"""
| 17.6875 | 53 | 0.674912 | 43 | 283 | 4.44186 | 0.72093 | 0.073298 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.215548 | 283 | 15 | 54 | 18.866667 | 0.86036 | 0.925795 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
13975248eb2650630d0279166ec3f069b1018cfc | 574 | py | Python | paquetes/ej4-utils/utils/string/validacion.py | gabvillacis/modulos_y_paquetes | 67895f81d266961486de06b75d8a1e0c86714e15 | [
"MIT"
] | null | null | null | paquetes/ej4-utils/utils/string/validacion.py | gabvillacis/modulos_y_paquetes | 67895f81d266961486de06b75d8a1e0c86714e15 | [
"MIT"
] | null | null | null | paquetes/ej4-utils/utils/string/validacion.py | gabvillacis/modulos_y_paquetes | 67895f81d266961486de06b75d8a1e0c86714e15 | [
"MIT"
] | null | null | null | # string/validacion.py
""" Validar que una cadena de texto {@param str} cumpla un mínimo de caracteres {@param min_length}
Ejemplo: validate_min_length("Hola mundo", 3) -> True
"""
def validate_min_length(str, min_length):
if str is None:
return False
return len(str)>=min_length
""" Validar que una cadena de texto {@param str} cumpla un máximo de caracteres {@param max_length}
Ejemplo: validate_max_length("Hola", 4) -> True
"""
def validate_max_length(str, max_length):
if str is None:
return False
return len(str)<=max_length | 30.210526 | 99 | 0.698606 | 86 | 574 | 4.5 | 0.383721 | 0.116279 | 0.067183 | 0.098191 | 0.423773 | 0.423773 | 0.423773 | 0.423773 | 0.423773 | 0.423773 | 0 | 0.004329 | 0.195122 | 574 | 19 | 100 | 30.210526 | 0.833333 | 0.297909 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
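A quick exercise of both validators. The bodies are restated from above; the only change in this sketch is renaming the parameter from `str` to `text` to avoid shadowing the builtin (a purely illustrative tweak, not a change to the module's API):

```python
def validate_min_length(text, min_length):
    if text is None:
        return False
    return len(text) >= min_length

def validate_max_length(text, max_length):
    if text is None:
        return False
    return len(text) <= max_length

# The examples given in the docstrings:
print(validate_min_length("Hola mundo", 3))  # True
print(validate_max_length("Hola", 4))        # True
print(validate_min_length(None, 1))          # False
```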
13a9765d89465250b161ca4cfa248ac332463f1c | 194 | py | Python | evie/routes/base.py | Netroxen/evie.cms | 9c8247d222cc13a9db1ab9bbcc6c41f499c6d785 | [
"MIT"
] | 1 | 2021-12-16T19:50:23.000Z | 2021-12-16T19:50:23.000Z | evie/routes/base.py | Netroxen/evie.cms | 9c8247d222cc13a9db1ab9bbcc6c41f499c6d785 | [
"MIT"
] | null | null | null | evie/routes/base.py | Netroxen/evie.cms | 9c8247d222cc13a9db1ab9bbcc6c41f499c6d785 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from quart import Blueprint, render_template
bp = Blueprint('base', __name__)
@bp.route('/')
async def index():
return await render_template('base/base.html.j2')
| 17.636364 | 53 | 0.680412 | 26 | 194 | 4.846154 | 0.769231 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012121 | 0.149485 | 194 | 10 | 54 | 19.4 | 0.751515 | 0.108247 | 0 | 0 | 0 | 0 | 0.128655 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.4 | 0.4 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
13ac08ab43bda0d53932ce328f7f6c571547a7f2 | 3,088 | py | Python | tests/test_config.py | bertosantamaria/redbeat | 3bd01210353126db6f37388b2b171ba40817457f | [
"Apache-2.0"
] | null | null | null | tests/test_config.py | bertosantamaria/redbeat | 3bd01210353126db6f37388b2b171ba40817457f | [
"Apache-2.0"
] | null | null | null | tests/test_config.py | bertosantamaria/redbeat | 3bd01210353126db6f37388b2b171ba40817457f | [
"Apache-2.0"
] | null | null | null | import mock
import pytest
from redbeat.schedulers import RedBeatConfig, CELERY_4_OR_GREATER
from basecase import AppCase
class test_RedBeatConfig(AppCase):
def setup(self):
self.conf = RedBeatConfig(self.app)
def test_app(self):
self.assertEqual(self.app, self.conf.app)
def test_lock_timeout(self):
        # the config only has the lock_timeout if it was overridden
# via the REDBEAT_LOCK_TIMEOUT, see test_scheduler.py for real test
self.assertEqual(self.conf.lock_timeout, None)
def test_key_prefix_default(self):
self.assertEqual(self.conf.key_prefix, 'redbeat:')
def test_lock_key_default(self):
self.assertTrue("REDBEAT_LOCK_KEY" not in self.app.conf.keys())
self.assertTrue("redbeat_lock_key" not in self.app.conf.keys())
self.conf = RedBeatConfig(self.app)
self.assertEqual(self.conf.lock_key, 'redbeat::lock')
def test_disable_lock_key_4(self):
self.app.conf.redbeat_lock_key = None
self.assertTrue("redbeat_lock_key" in self.app.conf.keys())
self.conf = RedBeatConfig(self.app)
self.assertEqual(self.conf.lock_key, None)
def test_disable_lock_key_3(self):
self.app.conf.REDBEAT_LOCK_KEY = None
self.assertTrue("REDBEAT_LOCK_KEY" in self.app.conf.keys())
self.conf = RedBeatConfig(self.app)
self.assertEqual(self.conf.lock_key, None)
def test_other_keys(self):
self.assertEqual(self.conf.schedule_key, self.conf.key_prefix + ':schedule')
self.assertEqual(self.conf.statics_key, self.conf.key_prefix + ':statics')
self.assertEqual(self.conf.lock_key, self.conf.key_prefix + ':lock')
@pytest.mark.skipif(not CELERY_4_OR_GREATER, reason="requires Celery >= 4.x")
def test_key_prefix_override_4(self):
self.app.conf.redbeat_key_prefix = 'test-prefix:'
self.conf = RedBeatConfig(self.app)
self.assertEqual(self.conf.key_prefix, 'test-prefix:')
@pytest.mark.skipif(CELERY_4_OR_GREATER, reason="requires Celery < 4.x")
def test_key_prefix_override_3(self):
self.app.conf.REDBEAT_KEY_PREFIX = 'test-prefix:'
self.conf = RedBeatConfig(self.app)
self.assertEqual(self.conf.key_prefix, 'test-prefix:')
def test_schedule(self):
schedule = {'foo': 'bar'}
self.conf.schedule = schedule
self.assertEqual(self.conf.schedule, schedule)
@pytest.mark.skipif(CELERY_4_OR_GREATER, reason="requires Celery < 4.x")
@mock.patch('warnings.warn')
def test_key_has_value_or_3(self, warn_mock):
broker_url = self.conf.key_has_value_or('BROKER_URL')
self.assertFalse(warn_mock.called)
self.assertEqual(broker_url, self.app.conf.BROKER_URL)
@pytest.mark.skipif(not CELERY_4_OR_GREATER, reason="requires Celery >= 4.x")
@mock.patch('warnings.warn')
def test_key_has_value_or_4(self, warn_mock):
broker_url = self.conf.key_has_value_or('BROKER_URL')
self.assertTrue(warn_mock.called)
self.assertEqual(broker_url, self.app.conf.broker_url)
| 40.103896 | 84 | 0.703044 | 438 | 3,088 | 4.716895 | 0.150685 | 0.092933 | 0.110358 | 0.122459 | 0.759923 | 0.636012 | 0.604066 | 0.604066 | 0.604066 | 0.604066 | 0 | 0.005934 | 0.181347 | 3,088 | 76 | 85 | 40.631579 | 0.811313 | 0.039508 | 0 | 0.310345 | 0 | 0 | 0.098886 | 0 | 0 | 0 | 0 | 0 | 0.344828 | 1 | 0.224138 | false | 0 | 0.068966 | 0 | 0.310345 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
13aea7e94915333c53d9469a34aaf5d2a028276b | 1,472 | py | Python | equipment/armor.py | jyurkiw/py_cyberpunk_2020_rest_api | f423761622c11b6f91556560a309697144a166a5 | [
"MIT"
] | null | null | null | equipment/armor.py | jyurkiw/py_cyberpunk_2020_rest_api | f423761622c11b6f91556560a309697144a166a5 | [
"MIT"
] | null | null | null | equipment/armor.py | jyurkiw/py_cyberpunk_2020_rest_api | f423761622c11b6f91556560a309697144a166a5 | [
"MIT"
] | null | null | null | from util import db
from util import getFilteredQuery
from flask import jsonify
from flask_restful import Resource
from flask_restful import reqparse
from urllib.parse import unquote
_armorCollection = db.armor
class ArmorListApi(Resource):
def get(self):
return jsonify(getFilteredQuery(_armorCollection, {}))
class ArmorHelmetApi(Resource):
def get(self):
return jsonify(getFilteredQuery(_armorCollection, {"head": True}))
class ArmorJacketApi(Resource):
def get(self):
return jsonify(getFilteredQuery(_armorCollection, {"torso": True}))
class ArmorPantsApi(Resource):
def get(self):
return jsonify(getFilteredQuery(_armorCollection, {"legs": True}))
class ArmorByFilterApi(Resource):
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("head", type=bool, default=None)
parser.add_argument("torso", type=bool, default=None)
parser.add_argument("arms", type=bool, default=None)
parser.add_argument("legs", type=bool, default=None)
parser.add_argument("soft", type=bool, default=None)
parser.add_argument("half_vs_edged", type=bool, default=None)
parser.add_argument("encumbrance_value", type=int, default=None)
parser.add_argument("random_count", type=int, default=None)
args = parser.parse_args()
return jsonify(getFilteredQuery(_armorCollection, args))
| 32 | 76 | 0.695652 | 162 | 1,472 | 6.191358 | 0.302469 | 0.071785 | 0.135593 | 0.139581 | 0.490528 | 0.462612 | 0.462612 | 0.247258 | 0 | 0 | 0 | 0 | 0.199049 | 1,472 | 45 | 77 | 32.711111 | 0.850721 | 0 | 0 | 0.125 | 0 | 0 | 0.053259 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.15625 | false | 0 | 0.1875 | 0.125 | 0.65625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
13c39075d06908e724a0ebf5da80931cbcff3aad | 522 | py | Python | src/Day5_RemoveDuplicatesFromSortedListII.py | ruarfff/leetcode-jan-2021 | 9436c0d6b82e83c0b21a498c998fa9e41d443d3c | [
"MIT"
] | null | null | null | src/Day5_RemoveDuplicatesFromSortedListII.py | ruarfff/leetcode-jan-2021 | 9436c0d6b82e83c0b21a498c998fa9e41d443d3c | [
"MIT"
] | null | null | null | src/Day5_RemoveDuplicatesFromSortedListII.py | ruarfff/leetcode-jan-2021 | 9436c0d6b82e83c0b21a498c998fa9e41d443d3c | [
"MIT"
] | null | null | null | from typing import List
class Solution:
def findKthPositive(self, arr: List[int], k: int) -> int:
missing_numbers = 0
next_expected = 1
for x in arr:
while x != next_expected:
missing_numbers += 1
if missing_numbers == k:
return next_expected
next_expected += 1
next_expected += 1
if len(arr) > 0 and missing_numbers < k:
return arr[-1] + (k - missing_numbers)
return k | 26.1 | 61 | 0.521073 | 61 | 522 | 4.295082 | 0.42623 | 0.267176 | 0.148855 | 0.160305 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.022508 | 0.404215 | 522 | 20 | 62 | 26.1 | 0.819936 | 0 | 0 | 0.133333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0.066667 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
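A usage sketch on a classic input (the class is restated from above so the example runs standalone; the test values are illustrative):

```python
from typing import List

class Solution:
    # Restated from above for a self-contained example.
    def findKthPositive(self, arr: List[int], k: int) -> int:
        missing_numbers = 0
        next_expected = 1
        for x in arr:
            while x != next_expected:
                missing_numbers += 1
                if missing_numbers == k:
                    return next_expected
                next_expected += 1
            next_expected += 1
        if len(arr) > 0 and missing_numbers < k:
            return arr[-1] + (k - missing_numbers)
        return k

# Missing positives from [2, 3, 4, 7, 11] are 1, 5, 6, 8, 9, ...
print(Solution().findKthPositive([2, 3, 4, 7, 11], 5))  # 9
```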
13c97007189dae04abd901ee398494a9dabc587f | 328 | py | Python | score/masking_warning_color.py | SuperShinyEyes/Score-Tool | 907dd9e695c9950c6f168480e591a239cdf6a826 | [
"MIT"
] | 2 | 2020-09-25T20:35:58.000Z | 2020-10-10T17:24:15.000Z | score/masking_warning_color.py | SuperShinyEyes/Score-Tool | 907dd9e695c9950c6f168480e591a239cdf6a826 | [
"MIT"
] | 1 | 2020-10-10T17:57:02.000Z | 2020-10-28T15:13:49.000Z | score/masking_warning_color.py | SuperShinyEyes/Score-Tool | 907dd9e695c9950c6f168480e591a239cdf6a826 | [
"MIT"
] | 1 | 2020-09-25T20:36:06.000Z | 2020-09-25T20:36:06.000Z | def color(val):
if val < 60:
r = 0
g = 255
b = 0
if val >= 60:
r = ((val - 60) / 20) * 255
g = 255
b = 120
if val >= 80:
r = 255
g = 255 - (((val - 80) / 20) * 255)
b = 120 - (((val - 80) / 20) * 120)
return 'rgb({},{},{})'.format(r, g, b) | 21.866667 | 43 | 0.332317 | 47 | 328 | 2.319149 | 0.340426 | 0.137615 | 0.12844 | 0.146789 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.271676 | 0.472561 | 328 | 15 | 44 | 21.866667 | 0.358382 | 0 | 0 | 0.142857 | 0 | 0 | 0.039514 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.071429 | false | 0 | 0 | 0 | 0.142857 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
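The three thresholds above deliberately overlap — each later `if` overrides the earlier assignments, so only the last matching block wins. A few sample points, with the function restated for a runnable sketch:

```python
def color(val):
    # Restated from above; later `if` blocks intentionally override earlier ones.
    if val < 60:
        r, g, b = 0, 255, 0
    if val >= 60:
        r = ((val - 60) / 20) * 255
        g = 255
        b = 120
    if val >= 80:
        r = 255
        g = 255 - (((val - 80) / 20) * 255)
        b = 120 - (((val - 80) / 20) * 120)
    return 'rgb({},{},{})'.format(r, g, b)

print(color(30))   # rgb(0,255,0)
print(color(70))   # rgb(127.5,255,120)
print(color(100))  # rgb(255,0.0,0.0)
```

The ramp goes green below 60, shifts toward yellow between 60 and 80, then toward red above 80.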
13cd2425aac94174a04857f1f2c2ba4ecc65c8a4 | 1,167 | py | Python | ietf/ipr/feeds.py | wpjesus/codematch | eee7405259cce9239ea0545a2a1300ee1accfe94 | [
"BSD-3-Clause-No-Nuclear-License-2014",
"BSD-3-Clause"
] | 1 | 2015-09-02T19:53:12.000Z | 2015-09-02T19:53:12.000Z | ietf/ipr/feeds.py | wpjesus/codematch | eee7405259cce9239ea0545a2a1300ee1accfe94 | [
"BSD-3-Clause-No-Nuclear-License-2014",
"BSD-3-Clause"
] | null | null | null | ietf/ipr/feeds.py | wpjesus/codematch | eee7405259cce9239ea0545a2a1300ee1accfe94 | [
"BSD-3-Clause-No-Nuclear-License-2014",
"BSD-3-Clause"
] | null | null | null | # Copyright The IETF Trust 2007, All Rights Reserved
from django.contrib.syndication.views import Feed
from django.utils.feedgenerator import Atom1Feed
from django.core.urlresolvers import reverse_lazy
from django.utils.safestring import mark_safe
from ietf.ipr.models import IprDisclosureBase
class LatestIprDisclosuresFeed(Feed):
feed_type = Atom1Feed
title = "IPR Disclosures to the IETF"
link = reverse_lazy('ipr_showlist')
description = "Updates on new IPR Disclosures made to the IETF."
language = "en"
feed_url = "/feed/ipr/"
def items(self):
return IprDisclosureBase.objects.filter(state__in=('posted','removed')).order_by('-time')[:30]
def item_title(self, item):
return mark_safe(item.title)
def item_description(self, item):
return unicode(item.title)
def item_pubdate(self, item):
return item.time
def item_author_name(self, item):
if item.by:
return item.by.name
else:
return None
def item_author_email(self, item):
if item.by:
return item.by.email_address()
else:
return None
| 28.463415 | 102 | 0.676093 | 149 | 1,167 | 5.174497 | 0.456376 | 0.045396 | 0.054475 | 0.041505 | 0.072633 | 0.072633 | 0.072633 | 0.072633 | 0 | 0 | 0 | 0.008989 | 0.237361 | 1,167 | 40 | 103 | 29.175 | 0.857303 | 0.042845 | 0 | 0.2 | 0 | 0 | 0.104933 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.166667 | 0.133333 | 0.866667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
13ceaecd9da662e33949472d0b4e77af71d3dbb9 | 1,750 | py | Python | sdk/python/pulumi_azure_native/resources/v20210401/_enums.py | sebtelko/pulumi-azure-native | 711ec021b5c73da05611c56c8a35adb0ce3244e4 | [
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_azure_native/resources/v20210401/_enums.py | sebtelko/pulumi-azure-native | 711ec021b5c73da05611c56c8a35adb0ce3244e4 | [
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_azure_native/resources/v20210401/_enums.py | sebtelko/pulumi-azure-native | 711ec021b5c73da05611c56c8a35adb0ce3244e4 | [
"Apache-2.0"
] | null | null | null | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi SDK Generator. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
from enum import Enum
__all__ = [
'DeploymentMode',
'ExpressionEvaluationOptionsScopeType',
'ExtendedLocationType',
'OnErrorDeploymentType',
'ResourceIdentityType',
]
class DeploymentMode(str, Enum):
"""
The mode that is used to deploy resources. This value can be either Incremental or Complete. In Incremental mode, resources are deployed without deleting existing resources that are not included in the template. In Complete mode, resources are deployed and existing resources in the resource group that are not included in the template are deleted. Be careful when using Complete mode as you may unintentionally delete resources.
"""
INCREMENTAL = "Incremental"
COMPLETE = "Complete"
class ExpressionEvaluationOptionsScopeType(str, Enum):
"""
The scope to be used for evaluation of parameters, variables and functions in a nested template.
"""
NOT_SPECIFIED = "NotSpecified"
OUTER = "Outer"
INNER = "Inner"
class ExtendedLocationType(str, Enum):
"""
The extended location type.
"""
EDGE_ZONE = "EdgeZone"
class OnErrorDeploymentType(str, Enum):
"""
The deployment on error behavior type. Possible values are LastSuccessful and SpecificDeployment.
"""
LAST_SUCCESSFUL = "LastSuccessful"
SPECIFIC_DEPLOYMENT = "SpecificDeployment"
class ResourceIdentityType(str, Enum):
"""
The identity type.
"""
SYSTEM_ASSIGNED = "SystemAssigned"
USER_ASSIGNED = "UserAssigned"
SYSTEM_ASSIGNED_USER_ASSIGNED = "SystemAssigned, UserAssigned"
NONE = "None"
| 31.25 | 433 | 0.721143 | 194 | 1,750 | 6.438144 | 0.541237 | 0.028022 | 0.040032 | 0.038431 | 0.04964 | 0.04964 | 0.04964 | 0 | 0 | 0 | 0 | 0.000714 | 0.199429 | 1,750 | 55 | 434 | 31.818182 | 0.890792 | 0.476571 | 0 | 0 | 1 | 0 | 0.298329 | 0.068019 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.04 | 0 | 0.72 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
13d439b8f20fea937e4d002fd0aa50aada03d697 | 741 | py | Python | pyxel/music.py | sacredhotdog/pyxel | 08da48dbd1ac53c06cf8a383f28d66fd89f78f4a | [
"MIT"
] | 1 | 2019-10-18T01:54:10.000Z | 2019-10-18T01:54:10.000Z | pyxel/music.py | Shiozaki-s21/pyxel | 8938d656fccf1c87d56fa4187e9d6b242b132c75 | [
"MIT"
] | null | null | null | pyxel/music.py | Shiozaki-s21/pyxel | 8938d656fccf1c87d56fa4187e9d6b242b132c75 | [
"MIT"
] | null | null | null | class Music:
def __init__(self):
self._ch0 = []
self._ch1 = []
self._ch2 = []
self._ch3 = []
@property
def ch0(self):
return self._ch0
@property
def ch1(self):
return self._ch1
@property
def ch2(self):
return self._ch2
@property
def ch3(self):
return self._ch3
def set(self, ch0, ch1, ch2, ch3):
self.set_ch0(ch0)
self.set_ch1(ch1)
self.set_ch2(ch2)
self.set_ch3(ch3)
def set_ch0(self, data):
self._ch0[:] = data
def set_ch1(self, data):
self._ch1[:] = data
def set_ch2(self, data):
self._ch2[:] = data
def set_ch3(self, data):
self._ch3[:] = data
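The setters above use slice assignment (`self._ch0[:] = data`) rather than rebinding the attribute. A single-channel sketch (the `Channel` class is an illustration, not part of pyxel) shows why that matters — existing references keep seeing the new data:

```python
class Channel:
    """Single-channel sketch of the pattern used by Music above."""
    def __init__(self):
        self._notes = []

    @property
    def notes(self):
        return self._notes

    def set(self, data):
        # Slice assignment mutates the existing list instead of rebinding,
        # so any reference handed out earlier keeps seeing the new data.
        self._notes[:] = data

ch = Channel()
ref = ch.notes        # grab a reference before updating
ch.set([60, 62, 64])
print(ref)            # [60, 62, 64] -- same list object, updated in place
```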
| 18.073171 | 38 | 0.516869 | 96 | 741 | 3.739583 | 0.135417 | 0.083565 | 0.155989 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.066946 | 0.354926 | 741 | 40 | 39 | 18.525 | 0.6841 | 0 | 0 | 0.129032 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.322581 | false | 0 | 0 | 0.129032 | 0.483871 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
13d720fa3dfdb5b76c1c20dba0fcde0412524d07 | 13,769 | py | Python | Proj029Pipelines/pipeline_proj029_replication.py | CGATOxford/proj029 | f0a8ea63b4f086e673aa3bf8b7d3b9749261b525 | [
"BSD-3-Clause"
] | 3 | 2016-04-04T22:54:14.000Z | 2017-04-01T09:37:54.000Z | Proj029Pipelines/pipeline_proj029_replication.py | CGATOxford/proj029 | f0a8ea63b4f086e673aa3bf8b7d3b9749261b525 | [
"BSD-3-Clause"
] | null | null | null | Proj029Pipelines/pipeline_proj029_replication.py | CGATOxford/proj029 | f0a8ea63b4f086e673aa3bf8b7d3b9749261b525 | [
"BSD-3-Clause"
] | null | null | null | """
=======================================
Compare original and new RNA-seq data
at Day14 vs. Day0
=======================================
:Author: Nick Ilott
:Release: $Id$
:Date: |today|
:Tags: Python
"""
# load modules
from ruffus import *
import CGAT.Experiment as E
import logging as L
import CGAT.Database as Database
import CGAT.CSV as CSV
import sys
import os
import re
import shutil
import itertools
import math
import glob
import time
import gzip
import collections
import random
import numpy as np
import sqlite3
import CGAT.GTF as GTF
import CGAT.IOTools as IOTools
import CGAT.IndexedFasta as IndexedFasta
from rpy2.robjects import r as R
import rpy2.robjects as ro
import rpy2.robjects.vectors as rovectors
from rpy2.rinterface import RRuntimeError
import CGATPipelines.PipelineMetagenomeCommunities as PipelineMetagenomeCommunities
import pandas
###################################################
###################################################
###################################################
# Pipeline configuration
###################################################
# load options from the config file
import CGATPipelines.Pipeline as P
P.getParameters(
["pipeline.ini"])
PARAMS = P.PARAMS
###################################################################
# connecting to database
###################################################################
def connect():
    '''connect to the pipeline database.

    Returns a sqlite3 connection handle on PARAMS["database"];
    no helper databases are attached here.
    '''
dbh = sqlite3.connect(PARAMS["database"])
return dbh
###################################################
###################################################
###################################################
@follows(mkdir("pca.dir"))
@jobs_limit(1, "R")
@transform([os.path.join(PARAMS.get("communitiesdir"), "genes.dir/gene_counts.norm.matrix"),
os.path.join(PARAMS.get("communitiesdir"), "counts.dir/genus.diamond.aggregated.counts.norm.matrix")],
regex("(\S+)/(\S+).matrix"),
r"pca.dir/\2.loadings.tsv")
def buildPCALoadings(infile, outfile):
'''
run PCA and heatmap the loadings
'''
outname_plot = P.snip(outfile, ".loadings.tsv") + ".pca.pdf"
R('''dat <- read.csv("%s", header = T, stringsAsFactors = F, sep = "\t")''' % infile)
# just get day14 and day0
R('''remove <- c("day3", "day6", "day28")''')
R('''for (day in remove){; dat <- dat[, grep(day, colnames(dat), invert=T)]}''')
R('''rownames(dat) <- dat$taxa''')
R('''dat <- dat[, 1:ncol(dat)-1]''')
R('''pc <- prcomp(t(dat))''')
R('''conds <- unlist(strsplit(colnames(dat), ".R[0-9]"))[seq(1, ncol(dat)*2, 2)]''')
R('''conds <- unlist(strsplit(conds, ".", fixed = T))[seq(2, length(conds)*2, 2)]''')
# plot the principle components
R('''library(ggplot2)''')
R('''pcs <- data.frame(pc$x)''')
R('''pcs$cond <- conds''')
# get variance explained
R('''imps <- c(summary(pc)$importance[2], summary(pc)$importance[5])''')
R('''p <- ggplot(pcs, aes(x = PC1, y = PC2, colour = cond, size = 3)) + geom_point()''')
R('''p2 <- p + xlab(imps[1]) + ylab(imps[2])''')
R('''p3 <- p2 + scale_colour_manual(values = c("slateGrey", "red"))''')
R('''ggsave("%s")''' % outname_plot)
# get the loadings
R('''loads <- data.frame(pc$rotation)''')
R('''loads$taxa <- rownames(loads)''')
# write out data
    R('''write.table(loads, file = "%s", sep = "\t", row.names = F, quote = F)''' % outfile)
P.touch(outfile)
#########################################
#########################################
#########################################
@transform([os.path.join(PARAMS.get("communitiesdir"), "genes.dir/gene_counts.norm.matrix"),
os.path.join(PARAMS.get("communitiesdir"), "counts.dir/genus.diamond.aggregated.counts.norm.matrix")],
regex("(\S+)/(\S+).matrix"),
r"pca.dir/\2.sig")
def testDistSignificance(infile, outfile):
'''
test whether the colitic samples
cluster significantly
'''
PipelineMetagenomeCommunities.testDistSignificance(infile,
outfile)
#########################################
#########################################
#########################################
@follows(mkdir("correlation.dir"))
@merge(["original_gene_counts.diff.tsv",
"replication_gene_counts.diff.tsv"],
"correlation.dir/gene_abundance_scatter.png")
def scatterplotAbundanceEstimates(infiles, outfile):
'''
scatterplot abundance estimates for NOGs
'''
R('''dat.orig <- read.csv("%s", header=T, stringsAsFactors=F, sep="\t")''' % infiles[0])
R('''dat.orig <- dat.orig[dat.orig$group2 == "WT" & dat.orig$group1 == "HhaIL10R",]''')
R('''dat.rep <- read.csv("%s", header=T, stringsAsFactors=F, sep="\t")''' % infiles[1])
R('''dat.rep <- dat.rep[dat.rep$group1 == "day0" & dat.rep$group2 == "day14",]''')
R('''rownames(dat.orig) <- dat.orig$taxa''')
R('''dat.orig <- dat.orig[dat.rep$taxa,]''')
R('''png("%s")''' % outfile)
R('''plot(dat.orig$AveExpr, dat.rep$AveExpr, pch=16, col="slateGrey")''')
R('''abline(0,1)''')
R('''text(x=3, y=15, labels=c(paste("r =", round(cor(dat.orig$AveExpr, dat.rep$AveExpr),2), sep=" ")))''')
R["dev.off"]()
#########################################
#########################################
#########################################
@follows(mkdir("correlation.dir"))
@merge(["original_gene_counts.diff.tsv",
"replication_gene_counts.diff.tsv"],
"correlation.dir/gene_fold_changes_scatter.png")
def scatterplotFoldChanges(infiles, outfile):
'''
scatterplot abundance estimates for NOGs
'''
R('''dat.orig <- read.csv("%s", header=T, stringsAsFactors=F, sep="\t")''' % infiles[0])
R('''dat.orig <- dat.orig[dat.orig$group2 == "WT" & dat.orig$group1 == "HhaIL10R",]''')
R('''dat.rep <- read.csv("%s", header=T, stringsAsFactors=F, sep="\t")''' % infiles[1])
R('''dat.rep <- dat.rep[dat.rep$group1 == "day0" & dat.rep$group2 == "day14",]''')
R('''rownames(dat.orig) <- dat.orig$taxa''')
R('''dat.orig <- dat.orig[dat.rep$taxa,]''')
R('''png("%s")''' % outfile)
R('''plot(dat.orig$logFC, -1*dat.rep$logFC, pch=16)''')
R('''text(x=-4, y=5, labels=c(paste("r =", round(cor(dat.orig$logFC, -1*dat.rep$logFC),2), sep=" ")))''')
R["dev.off"]()
#########################################
#########################################
#########################################
@follows(mkdir("correlation.dir"))
@merge(["original_gene_counts.diff.tsv",
"replication_gene_counts.diff.tsv"],
"correlation.dir/gene_diff_overlap.tsv")
def buildGeneDifferentialExpressionOverlap(infiles, outfile):
'''
scatterplot abundance estimates for NOGs
'''
R('''dat.orig <- read.csv("%s", header=T, stringsAsFactors=F, sep="\t")''' % infiles[0])
R('''dat.orig <- dat.orig[dat.orig$group2 == "WT" & dat.orig$group1 == "HhaIL10R",]''')
R('''dat.rep <- read.csv("%s", header=T, stringsAsFactors=F, sep="\t")''' % infiles[1])
R('''dat.rep <- dat.rep[dat.rep$group1 == "day0" & dat.rep$group2 == "day14",]''')
R('''rownames(dat.orig) <- dat.orig$taxa''')
R('''dat.orig <- dat.orig[dat.rep$taxa,]''')
R('''diff.orig <- dat.orig$taxa[dat.orig$adj.P.Val < 0.05]''')
R('''diff.rep <- dat.rep$taxa[dat.rep$adj.P.Val < 0.05]''')
R('''overlap <- intersect(diff.orig, diff.rep)''')
R('''write.table(overlap, file="%s", sep="\t")''' % outfile)
R('''norig <- length(diff.orig)''')
R('''nrep <- length(diff.rep)''')
R('''noverlap <- length(overlap)''')
# significance testing
R('''x <- length(intersect(dat.orig$taxa, dat.rep$taxa))''')
R('''m <- nrep''')
R('''n <- x - nrep''')
R('''k <- norig''')
R('''print(1-phyper(x,m,n,k))''')
R('''write.table(data.frame(c(norig, nrep,noverlap)), file="correlation.dir/noverlap.tsv")''')
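The overlap significance above is R's `1 - phyper(x, m, n, k)`, i.e. the upper tail of a hypergeometric distribution. A pure-Python equivalent under the same parameterisation (m successes, n failures, k draws) — a sketch for reference, not part of the pipeline:

```python
from math import comb

def phyper_upper(q, m, n, k):
    """P(X > q) for a hypergeometric draw of k items from m successes and
    n failures -- the tail R computes as 1 - phyper(q, m, n, k)."""
    total = comb(m + n, k)
    return sum(comb(m, i) * comb(n, k - i)
               for i in range(q + 1, min(m, k) + 1)) / total

# 5 white + 5 black, draw 5: P(more than 2 white) is exactly 0.5 by symmetry.
print(phyper_upper(2, 5, 5, 5))  # 0.5
```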
#########################################
#########################################
#########################################
@follows(mkdir("wolinella_weisella.dir"))
@transform("original_genus.diamond.aggregated.counts.norm.matrix",
           regex(r"(\S+).norm.matrix"),
           r"wolinella_weisella.dir/\1.wolinella.pdf")
def plotOriginalWolinella(infile, outfile):
    '''
    plot the abundance of Wolinella in the original dataset;
    one of the genera that replicated
    '''
    R('''library(reshape)''')
    R('''library(ggplot2)''')
    R('''dat <- read.csv("%s",
                         header=T,
                         stringsAsFactors=F,
                         sep="\t")''' % infile)
    R('''dat <- melt(dat)''')
    # derive the condition from each column name by stripping
    # the replicate suffix (e.g. ".R1")
    R('''conds <- unlist(strsplit(as.character(dat$variable), "[.]R[0-9]"))''')
    R('''conds <- conds[seq(1, length(conds), 2)]''')
    R('''dat$cond <- conds''')
    R('''dat <- dat[dat$taxa == "Wolinella",]''')
    R('''plot1 <- ggplot(dat, aes(x=factor(cond, levels=c("stool.WT", "stool.aIL10R", "stool.Hh", "stool.HhaIL10R")),
                         y=value, group=cond, colour=cond))''')
    R('''plot2 <- plot1 + geom_boxplot() + geom_jitter(size=3)''')
    R('''plot3 <- plot2 + scale_colour_manual(values=c("blue", "darkgreen", "red", "grey")) + ylim(c(0,3))''')
    R('''ggsave("%s", plot=plot3)''' % outfile)
#########################################
#########################################
#########################################
@follows(mkdir("wolinella_weisella.dir"))
@transform("replication_genus.diamond.aggregated.counts.norm.matrix",
           regex(r"(\S+).norm.matrix"),
           r"wolinella_weisella.dir/\1.wolinella.pdf")
def plotReplicationWolinella(infile, outfile):
    '''
    plot the abundance of Wolinella in the replication dataset;
    one of the genera that replicated
    '''
    R('''library(reshape)''')
    R('''library(ggplot2)''')
    R('''dat <- read.csv("%s",
                         header=T,
                         stringsAsFactors=F,
                         sep="\t")''' % infile)
    # keep only the day0 and day14 columns
    R('''remove <- c("day3", "day6", "day28")''')
    R('''for (day in remove){dat <- dat[, grep(day, colnames(dat), invert=T)]}''')
    R('''dat <- melt(dat)''')
    # derive the condition from each column name by stripping
    # the replicate suffix (e.g. ".R1")
    R('''conds <- unlist(strsplit(as.character(dat$variable), "[.]R[0-9]"))''')
    R('''conds <- conds[seq(1, length(conds), 2)]''')
    R('''dat$cond <- conds''')
    R('''dat <- dat[dat$taxa == "Wolinella",]''')
    R('''plot1 <- ggplot(dat, aes(x=factor(cond, levels=c("stool.day0", "stool.day14")),
                         y=value, group=cond, colour=cond))''')
    R('''plot2 <- plot1 + geom_boxplot() + geom_jitter(size=3)''')
    R('''plot3 <- plot2 + scale_colour_manual(values=c("grey", "red")) + ylim(c(0,3))''')
    R('''ggsave("%s", plot=plot3)''' % outfile)
#########################################
#########################################
#########################################
@follows(mkdir("wolinella_weisella.dir"))
@transform("original_genus.diamond.aggregated.counts.norm.matrix",
           regex(r"(\S+).norm.matrix"),
           r"wolinella_weisella.dir/\1.weissella.pdf")
def plotOriginalWeissella(infile, outfile):
    '''
    plot the abundance of Weissella in the original dataset;
    one of the genera that replicated
    '''
    R('''library(reshape)''')
    R('''library(ggplot2)''')
    R('''dat <- read.csv("%s",
                         header=T,
                         stringsAsFactors=F,
                         sep="\t")''' % infile)
    R('''dat <- melt(dat)''')
    # derive the condition from each column name by stripping
    # the replicate suffix (e.g. ".R1")
    R('''conds <- unlist(strsplit(as.character(dat$variable), "[.]R[0-9]"))''')
    R('''conds <- conds[seq(1, length(conds), 2)]''')
    R('''dat$cond <- conds''')
    R('''dat <- dat[dat$taxa == "Weissella",]''')
    R('''plot1 <- ggplot(dat, aes(x=factor(cond, levels=c("stool.WT", "stool.aIL10R", "stool.Hh", "stool.HhaIL10R")),
                         y=value, group=cond, colour=cond))''')
    R('''plot2 <- plot1 + geom_boxplot() + geom_jitter(size=3)''')
    R('''plot3 <- plot2 + scale_colour_manual(values=c("blue", "darkgreen", "red", "grey")) + ylim(c(0,4))''')
    R('''ggsave("%s", plot=plot3)''' % outfile)
#########################################
#########################################
#########################################
@follows(mkdir("wolinella_weisella.dir"))
@transform("replication_genus.diamond.aggregated.counts.norm.matrix",
           regex(r"(\S+).norm.matrix"),
           r"wolinella_weisella.dir/\1.weissella.pdf")
def plotReplicationWeissella(infile, outfile):
    '''
    plot the abundance of Weissella in the replication dataset;
    one of the genera that replicated
    '''
    R('''library(reshape)''')
    R('''library(ggplot2)''')
    R('''dat <- read.csv("%s",
                         header=T,
                         stringsAsFactors=F,
                         sep="\t")''' % infile)
    # keep only the day0 and day14 columns
    R('''remove <- c("day3", "day6", "day28")''')
    R('''for (day in remove){dat <- dat[, grep(day, colnames(dat), invert=T)]}''')
    R('''dat <- melt(dat)''')
    # derive the condition from each column name by stripping
    # the replicate suffix (e.g. ".R1")
    R('''conds <- unlist(strsplit(as.character(dat$variable), "[.]R[0-9]"))''')
    R('''conds <- conds[seq(1, length(conds), 2)]''')
    R('''dat$cond <- conds''')
    R('''dat <- dat[dat$taxa == "Weissella",]''')
    R('''plot1 <- ggplot(dat, aes(x=factor(cond, levels=c("stool.day0", "stool.day14")),
                         y=value, group=cond, colour=cond))''')
    R('''plot2 <- plot1 + geom_boxplot() + geom_jitter(size=3)''')
    R('''plot3 <- plot2 + scale_colour_manual(values=c("grey", "red")) + ylim(c(0,4))''')
    R('''ggsave("%s", plot=plot3)''' % outfile)
@follows(plotOriginalWeissella,
         plotOriginalWolinella,
         plotReplicationWeissella,
         plotReplicationWolinella)
def plotWolinellaWeissella():
    pass


if __name__ == "__main__":
    sys.exit(P.main(sys.argv))