hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
67e4a190f4b21b618d8a69e714cec31032c3687f | 8,111 | py | Python | layers/util/mapping_functions.py | meder411/spherical-package | 73d51a25da5891d12e4c04d8ad2e6f1854ffa121 | [
"BSD-3-Clause"
] | 8 | 2020-06-13T19:49:06.000Z | 2022-02-24T07:16:02.000Z | layers/util/mapping_functions.py | meder411/spherical-package | 73d51a25da5891d12e4c04d8ad2e6f1854ffa121 | [
"BSD-3-Clause"
] | 4 | 2020-07-03T08:44:13.000Z | 2021-09-17T12:18:57.000Z | layers/util/mapping_functions.py | meder411/spherical-package | 73d51a25da5891d12e4c04d8ad2e6f1854ffa121 | [
"BSD-3-Clause"
] | 3 | 2020-06-10T23:30:20.000Z | 2020-12-29T13:50:01.000Z | import torch
import math
from .grids import *
from .conversions import *
# =============================================================================
# Equirectangular mapping functions
# =============================================================================
#
# Note that there is no concept of padding for spherical images because there
# are no image boundaries.
#
def equirectangular_kernel(shape, kernel_size, dilation=1):
"""
Returns a kernel sampling grid whose angular spacing matches the angular resolution computed from the provided equirectangular image shape
shape: (H, W)
kernel_size: (kh, kw)
"""
# For convenience
kh, kw = kernel_size
# Get equirectangular grid resolution
res_lon, res_lat = get_equirectangular_grid_resolution(shape)
# Build the kernel according to the angular resolution of the equirectangular image
dlon = torch.zeros(kernel_size)
dlat = torch.zeros(kernel_size)
for i in range(kh):
cur_i = i - (kh // 2)
for j in range(kw):
cur_j = j - (kw // 2)
dlon[i, j] = cur_j * dilation * res_lon
# Sign is flipped because +Y is down in image coordinates
dlat[i, j] = cur_i * dilation * -res_lat
# Returns the kernel differentials as kh x kw
return dlon, dlat
def grid_projection_map(shape, kernel_size, stride=1, dilation=1):
# For convenience
H, W = shape
kh, kw = kernel_size
# Get lat/lon mesh grid and resolution
lon, lat = spherical_meshgrid(shape)
# Get the kernel differentials
dlon, dlat = equirectangular_kernel(shape, kernel_size, dilation)
# Equalize views
lat = lat.view(H, W, 1)
lon = lon.view(H, W, 1)
dlon = dlon.view(1, 1, kh * kw)
dlat = dlat.view(1, 1, kh * kw)
# Compute the "projection"
map_lat = lat + dlat
map_lon = lon + dlon
# Convert the spherical coordinates to pixel coordinates
# H x W x KH*KW x 2
map_pixels = convert_spherical_to_image(
torch.stack((map_lon, map_lat), -1), shape)
# Adjust the stride of the map accordingly
map_pixels = map_pixels[::stride, ::stride, ...].contiguous()
# Return the pixel sampling map
# H x W x KH*KW x 2
return map_pixels
def inverse_gnomonic_projection_map(shape, kernel_size, stride=1, dilation=1):
# For convenience
H, W = shape
kh, kw = kernel_size
# Get lat/lon mesh grid and resolution
lon, lat = spherical_meshgrid(shape)
# Get the kernel differentials
dlon, dlat = equirectangular_kernel(shape, kernel_size, dilation)
# Equalize views
lat = lat.view(H, W, 1)
lon = lon.view(H, W, 1)
dlon = dlon.view(1, 1, kh * kw)
dlat = dlat.view(1, 1, kh * kw)
# Compute the inverse gnomonic projection of each tangent grid (the kernel) back onto the sphere at each pixel of the equirectangular image.
rho = (dlon**2 + dlat**2).sqrt()
nu = rho.atan()
map_lat = (nu.cos() * lat.sin() + dlat * nu.sin() * lat.cos() / rho).asin()
map_lon = lon + torch.atan2(
dlon * nu.sin(),
rho * lat.cos() * nu.cos() - dlat * lat.sin() * nu.sin())
# Handle the (0,0) case
map_lat[..., [kh * kw // 2]] = lat
map_lon[..., [kh * kw // 2]] = lon
# Compensate for longitudinal wrap around
map_lon = ((map_lon + math.pi) % (2 * math.pi)) - math.pi
# Convert the spherical coordinates to pixel coordinates
# H x W x KH*KW x 2
map_pixels = convert_spherical_to_image(
torch.stack((map_lon, map_lat), -1), shape)
# Adjust the stride of the map accordingly
map_pixels = map_pixels[::stride, ::stride, ...].contiguous()
# Return the pixel sampling map
# H x W x KH*KW x 2
return map_pixels
def inverse_equirectangular_projection_map(shape,
kernel_size,
stride=1,
dilation=1):
# For convenience
H, W = shape
kh, kw = kernel_size
# Get lat/lon mesh grid and resolution
lon, lat = spherical_meshgrid(shape)
# Get the kernel differentials
dlon, dlat = equirectangular_kernel(shape, kernel_size, dilation)
# Equalize views
lat = lat.view(H, W, 1)
lon = lon.view(H, W, 1)
dlon = dlon.view(1, 1, kh * kw)
dlat = dlat.view(1, 1, kh * kw)
# Compute the inverse equirectangular projection of each tangent grid (the kernel) back onto the sphere at each pixel of the equirectangular image.
map_lat = lat + dlat
map_lon = lon + dlon / map_lat.cos()
# Compensate for longitudinal wrap around
map_lon = ((map_lon + math.pi) % (2 * math.pi)) - math.pi
# Convert the spherical coordinates to pixel coordinates
# H x W x KH*KW x 2
map_pixels = convert_spherical_to_image(
torch.stack((map_lon, map_lat), -1), shape)
# Adjust the stride of the map accordingly
map_pixels = map_pixels[::stride, ::stride, ...].contiguous()
# Return the pixel sampling map
# H x W x KH*KW x 2
return map_pixels
# =============================================================================
# Cube map mapping functions
# =============================================================================
def cube_kernel(cube_dim, kernel_size, dilation=1):
"""
Returns a kernel sampling grid whose spacing matches the per-pixel resolution computed from the provided cube map face dimension
cube_dim: length of side of square face of cube map
kernel_size: (kh, kw)
"""
# For convenience
kh, kw = kernel_size
cube_res = 1 / cube_dim
# Build the kernel according to the angular resolution of the cube face
dx = torch.zeros(kernel_size)
dy = torch.zeros(kernel_size)
for i in range(kh):
cur_i = i - (kh // 2)
for j in range(kw):
cur_j = j - (kw // 2)
dx[i, j] = cur_j * dilation * cube_res
# Sign is flipped because +Y is down in image coordinates
dy[i, j] = cur_i * dilation * -cube_res
# Returns the kernel differentials as kh x kw
return dx, dy
def inverse_cube_face_projection_map(cube_dim,
kernel_size,
stride=1,
dilation=1,
polar=False):
"""
Creates a sampling map which models each face of the cube as a gnomonic projection (equatorial aspect) of the sphere, warping the kernel according to the inverse gnomonic projection for the face.
"""
# For convenience
kh, kw = kernel_size
# Get a meshgrid of a cube face in terms of spherical coordinates
face_lon, face_lat = cube_face_spherical_meshgrid(cube_dim, polar)
# Get the kernel differentials
dx, dy = cube_kernel(cube_dim, kernel_size, dilation)
# Equalize views
face_lat = face_lat.view(cube_dim, cube_dim, 1)
face_lon = face_lon.view(cube_dim, cube_dim, 1)
dx = dx.view(1, 1, kh * kw)
dy = dy.view(1, 1, kh * kw)
# Compute the inverse gnomonic projection of each tangent grid (the kernel) back onto the sphere at each pixel of the cube face
rho = (dx**2 + dy**2).sqrt()
nu = rho.atan()
map_lat = (nu.cos() * face_lat.sin() +
dy * nu.sin() * face_lat.cos() / rho).asin()
map_lon = face_lon + torch.atan2(
dx * nu.sin(),
rho * face_lat.cos() * nu.cos() - dy * face_lat.sin() * nu.sin())
# Handle the (0,0) case
map_lat[..., [kh * kw // 2]] = face_lat
map_lon[..., [kh * kw // 2]] = face_lon
# Create the sample map in terms of spherical coordinates
map_face = torch.stack((map_lon, map_lat), -1)
# Convert the cube coordinates on the sphere to pixels in the cube map
map_pixels = convert_spherical_to_cube_face(map_face, cube_dim)
# Adjust the stride of the map accordingly
map_pixels = map_pixels[::stride, ::stride, ...].contiguous()
# Return the pixel sampling map
# cube_dim x cube_dim x KH*KW x 2
return map_pixels | 33.378601 | 198 | 0.601159 | 1,139 | 8,111 | 4.152766 | 0.119403 | 0.022833 | 0.010148 | 0.013531 | 0.74186 | 0.702326 | 0.652431 | 0.619239 | 0.603594 | 0.575264 | 0 | 0.011106 | 0.267291 | 8,111 | 243 | 199 | 33.378601 | 0.784789 | 0.392183 | 0 | 0.541284 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055046 | false | 0 | 0.036697 | 0 | 0.146789 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
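The inverse gnomonic projection used in `inverse_gnomonic_projection_map` above can be sketched for a single tangent-plane offset in plain Python (stdlib `math` instead of torch; an illustrative reimplementation, not part of the dataset row):

```python
import math

def inverse_gnomonic(lat, lon, dx, dy):
    """Map a tangent-plane offset (dx, dy) at (lat, lon) back onto the sphere."""
    rho = math.hypot(dx, dy)
    if rho == 0.0:
        # the (0, 0) kernel-center case, handled explicitly in the torch code
        return lat, lon
    nu = math.atan(rho)
    new_lat = math.asin(math.cos(nu) * math.sin(lat)
                        + dy * math.sin(nu) * math.cos(lat) / rho)
    new_lon = lon + math.atan2(
        dx * math.sin(nu),
        rho * math.cos(lat) * math.cos(nu) - dy * math.sin(lat) * math.sin(nu))
    return new_lat, new_lon

# Near the equator a small tangent offset stays close to the plain angular shift
lat1, lon1 = inverse_gnomonic(0.0, 0.0, 0.01, 0.02)
```

For small offsets the result is within about 1e-4 rad of the naive equirectangular shift, which is why the two sampling maps agree near the kernel center.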
67e5a6a6c74d4339ea14061f1806e706d149cac0 | 6,026 | py | Python | Modules/ego_planner/ego-planner-swarm/src/uav_simulator/Utils/multi_map_server/src/multi_map_server/msg/_VerticalOccupancyGridList.py | 473867143/Prometheus | df1e1b0d861490223ac8b94d8cc4796537172292 | [
"BSD-3-Clause"
] | 1,217 | 2020-07-02T13:15:18.000Z | 2022-03-31T06:17:44.000Z | Modules/ego_planner/ego-planner-swarm/src/uav_simulator/Utils/multi_map_server/src/multi_map_server/msg/_VerticalOccupancyGridList.py | 473867143/Prometheus | df1e1b0d861490223ac8b94d8cc4796537172292 | [
"BSD-3-Clause"
] | 167 | 2020-07-12T15:35:43.000Z | 2022-03-31T11:57:40.000Z | Modules/ego_planner/ego-planner-swarm/src/uav_simulator/Utils/multi_map_server/src/multi_map_server/msg/_VerticalOccupancyGridList.py | 473867143/Prometheus | df1e1b0d861490223ac8b94d8cc4796537172292 | [
"BSD-3-Clause"
] | 270 | 2020-07-02T13:28:00.000Z | 2022-03-28T05:43:08.000Z | """autogenerated by genpy from multi_map_server/VerticalOccupancyGridList.msg. Do not edit."""
import sys
python3 = True if sys.hexversion > 0x03000000 else False
import genpy
import struct
class VerticalOccupancyGridList(genpy.Message):
_md5sum = "7ef85cc95b82747f51eb01a16bd7c795"
_type = "multi_map_server/VerticalOccupancyGridList"
_has_header = False #flag to mark the presence of a Header object
_full_text = """float32 x
float32 y
int32[] upper
int32[] lower
int32[] mass
"""
__slots__ = ['x','y','upper','lower','mass']
_slot_types = ['float32','float32','int32[]','int32[]','int32[]']
def __init__(self, *args, **kwds):
"""
Constructor. Any message fields that are implicitly/explicitly
set to None will be assigned a default value. The recommended
use is keyword arguments, as this is more robust to future message
changes. You cannot mix in-order arguments and keyword arguments.
The available fields are:
x,y,upper,lower,mass
:param args: complete set of field values, in .msg order
:param kwds: use keyword arguments corresponding to message field names
to set specific fields.
"""
if args or kwds:
super(VerticalOccupancyGridList, self).__init__(*args, **kwds)
#message fields cannot be None, assign default values for those that are
if self.x is None:
self.x = 0.
if self.y is None:
self.y = 0.
if self.upper is None:
self.upper = []
if self.lower is None:
self.lower = []
if self.mass is None:
self.mass = []
else:
self.x = 0.
self.y = 0.
self.upper = []
self.lower = []
self.mass = []
def _get_types(self):
"""
internal API method
"""
return self._slot_types
def serialize(self, buff):
"""
serialize message into buffer
:param buff: buffer, ``StringIO``
"""
try:
_x = self
buff.write(_struct_2f.pack(_x.x, _x.y))
length = len(self.upper)
buff.write(_struct_I.pack(length))
pattern = '<%si'%length
buff.write(struct.pack(pattern, *self.upper))
length = len(self.lower)
buff.write(_struct_I.pack(length))
pattern = '<%si'%length
buff.write(struct.pack(pattern, *self.lower))
length = len(self.mass)
buff.write(_struct_I.pack(length))
pattern = '<%si'%length
buff.write(struct.pack(pattern, *self.mass))
except struct.error as se: self._check_types(struct.error("%s: '%s' when writing '%s'" % (type(se), str(se), str(_x))))
except TypeError as te: self._check_types(ValueError("%s: '%s' when writing '%s'" % (type(te), str(te), str(_x))))
def deserialize(self, str):
"""
unpack serialized message in str into this message instance
:param str: byte array of serialized message, ``str``
"""
try:
end = 0
_x = self
start = end
end += 8
(_x.x, _x.y,) = _struct_2f.unpack(str[start:end])
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
pattern = '<%si'%length
start = end
end += struct.calcsize(pattern)
self.upper = struct.unpack(pattern, str[start:end])
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
pattern = '<%si'%length
start = end
end += struct.calcsize(pattern)
self.lower = struct.unpack(pattern, str[start:end])
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
pattern = '<%si'%length
start = end
end += struct.calcsize(pattern)
self.mass = struct.unpack(pattern, str[start:end])
return self
except struct.error as e:
raise genpy.DeserializationError(e) #most likely buffer underfill
def serialize_numpy(self, buff, numpy):
"""
serialize message with numpy array types into buffer
:param buff: buffer, ``StringIO``
:param numpy: numpy python module
"""
try:
_x = self
buff.write(_struct_2f.pack(_x.x, _x.y))
length = len(self.upper)
buff.write(_struct_I.pack(length))
pattern = '<%si'%length
buff.write(self.upper.tostring())
length = len(self.lower)
buff.write(_struct_I.pack(length))
pattern = '<%si'%length
buff.write(self.lower.tostring())
length = len(self.mass)
buff.write(_struct_I.pack(length))
pattern = '<%si'%length
buff.write(self.mass.tostring())
except struct.error as se: self._check_types(struct.error("%s: '%s' when writing '%s'" % (type(se), str(se), str(_x))))
except TypeError as te: self._check_types(ValueError("%s: '%s' when writing '%s'" % (type(te), str(te), str(_x))))
def deserialize_numpy(self, str, numpy):
"""
unpack serialized message in str into this message instance using numpy for array types
:param str: byte array of serialized message, ``str``
:param numpy: numpy python module
"""
try:
end = 0
_x = self
start = end
end += 8
(_x.x, _x.y,) = _struct_2f.unpack(str[start:end])
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
pattern = '<%si'%length
start = end
end += struct.calcsize(pattern)
self.upper = numpy.frombuffer(str[start:end], dtype=numpy.int32, count=length)
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
pattern = '<%si'%length
start = end
end += struct.calcsize(pattern)
self.lower = numpy.frombuffer(str[start:end], dtype=numpy.int32, count=length)
start = end
end += 4
(length,) = _struct_I.unpack(str[start:end])
pattern = '<%si'%length
start = end
end += struct.calcsize(pattern)
self.mass = numpy.frombuffer(str[start:end], dtype=numpy.int32, count=length)
return self
except struct.error as e:
raise genpy.DeserializationError(e) #most likely buffer underfill
_struct_I = genpy.struct_I
_struct_2f = struct.Struct("<2f")
| 32.397849 | 123 | 0.623963 | 801 | 6,026 | 4.580524 | 0.193508 | 0.061052 | 0.041973 | 0.037067 | 0.629872 | 0.62115 | 0.580267 | 0.580267 | 0.559008 | 0.531207 | 0 | 0.016856 | 0.241952 | 6,026 | 185 | 124 | 32.572973 | 0.78634 | 0.201792 | 0 | 0.671533 | 1 | 0 | 0.073684 | 0.015897 | 0 | 0 | 0.002148 | 0 | 0 | 1 | 0.043796 | false | 0 | 0.021898 | 0 | 0.138686 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
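The serialize/deserialize pair above length-prefixes each int32 array with a little-endian uint32. A compact stdlib-only sketch of that wire format (the helper names here are made up, not genpy API):

```python
import struct

def encode(x, y, upper, lower, mass):
    # <2f header, then each array as a <I length prefix plus <Ni payload
    buf = struct.pack("<2f", x, y)
    for arr in (upper, lower, mass):
        buf += struct.pack("<I", len(arr))
        buf += struct.pack("<%si" % len(arr), *arr)
    return buf

def decode(data):
    x, y = struct.unpack_from("<2f", data, 0)
    offset = 8
    arrays = []
    for _ in range(3):
        (n,) = struct.unpack_from("<I", data, offset)
        offset += 4
        arrays.append(list(struct.unpack_from("<%si" % n, data, offset)))
        offset += 4 * n
    return x, y, arrays[0], arrays[1], arrays[2]
```

Round-tripping a message through these two helpers reproduces the layout the generated `serialize`/`deserialize` methods walk byte-by-byte.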
67e7dfe8a3a11d78c472c0f64358e33daa1e6979 | 1,696 | py | Python | listener.py | chrismarget/ios-icmp-channel | b2a09f1c345816f525a3f7aed6a562631b0fc7e6 | [
"Apache-2.0"
] | 1 | 2018-01-30T01:53:20.000Z | 2018-01-30T01:53:20.000Z | listener.py | chrismarget/ios-icmp-channel | b2a09f1c345816f525a3f7aed6a562631b0fc7e6 | [
"Apache-2.0"
] | null | null | null | listener.py | chrismarget/ios-icmp-channel | b2a09f1c345816f525a3f7aed6a562631b0fc7e6 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
import sys
class message(object):
def add(self, idx, b):
self.message[idx] = b
if (b == '\x04') and self.is_complete():
self.print_message()
def get_eom_idx(self):
for i in sorted(self.message.keys()):
if self.message[i] == '\x04':
return i
return False
def is_complete(self):
eom_idx = self.get_eom_idx()
if not eom_idx:
return False
received = self.message.keys()
for i in range(0,eom_idx):
if not (i in received):
return False
return True
def print_message(self):
print self.sender + "\t" + self.get_message()
def get_message(self):
out = ''
eom_idx = self.get_eom_idx()
for i in range(0,eom_idx):
out+=self.message[i]
return out
def __init__(self, sender, idx, b):
self.sender = sender
self.message = {}
self.add(idx, b)
def open_icmp_sniffer():
import socket, sys
import struct
try:
s = socket.socket(socket.AF_INET, socket.SOCK_RAW,
socket.IPPROTO_ICMP)
except socket.error, msg:
print 'Socket create failed: '+str(msg[0])+' Message ' + msg[1]
sys.exit()
s.setsockopt(socket.IPPROTO_IP, socket.IP_HDRINCL, 1)
s.bind(('', 0))
return s
s = open_icmp_sniffer()
messages = {}
while True:
p = s.recvfrom(65565)
sender = p[1][0]
sequence = ord(p[0][-2])
payload = p[0][-1]
if sender not in messages.keys():
messages[sender] = message(sender, sequence, payload)
else:
messages[sender].add(sequence, payload)
| 27.803279 | 71 | 0.56191 | 229 | 1,696 | 4.030568 | 0.31441 | 0.052004 | 0.029252 | 0.028169 | 0.080173 | 0.080173 | 0.039003 | 0 | 0 | 0 | 0 | 0.017995 | 0.31191 | 1,696 | 60 | 72 | 28.266667 | 0.772922 | 0.011792 | 0 | 0.127273 | 0 | 0 | 0.024478 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.054545 | null | null | 0.072727 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
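The listener reassembles per-sender messages from out-of-order ICMP sequence numbers, terminated by an EOT (0x04) byte. The core reassembly logic, sketched in Python 3 (the file itself is Python 2):

```python
EOT = "\x04"

class Reassembler:
    """Collect (index, char) chunks; a message is complete once every index
    before the first EOT marker has arrived."""
    def __init__(self):
        self.chunks = {}

    def add(self, idx, ch):
        self.chunks[idx] = ch

    def get(self):
        eot_positions = [i for i, c in self.chunks.items() if c == EOT]
        if not eot_positions:
            return None
        end = min(eot_positions)
        if all(i in self.chunks for i in range(end)):
            return "".join(self.chunks[i] for i in range(end))
        return None

r = Reassembler()
for idx, ch in [(2, "c"), (0, "a"), (3, EOT), (1, "b")]:  # arrives out of order
    r.add(idx, ch)
```

Taking `min()` over the EOT positions also handles an EOT at index 0 (an empty message), a case the original's truthiness test on `get_eom_idx()` would misreport.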
67f15b64983f5eafc8f2961a8adfe37568e44cb9 | 2,051 | py | Python | tests/test_keepalived2.py | khosrow/lvsm | 516ee1422f736d016ccc198e54f5f019102504a6 | [
"MIT"
] | 15 | 2015-03-18T21:45:24.000Z | 2021-02-22T09:41:30.000Z | tests/test_keepalived2.py | khosrow/lvsm | 516ee1422f736d016ccc198e54f5f019102504a6 | [
"MIT"
] | 12 | 2016-01-15T19:32:36.000Z | 2016-10-27T14:21:14.000Z | tests/test_keepalived2.py | khosrow/lvsm | 516ee1422f736d016ccc198e54f5f019102504a6 | [
"MIT"
] | 8 | 2015-03-20T00:24:56.000Z | 2021-11-19T06:21:19.000Z | import unittest
import os
import sys
import StringIO
path = os.path.abspath(os.path.dirname(__file__))
sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), '../lvsm')))
from lvsm.modules import keepalived
class Keepalived(unittest.TestCase):
"""Tests for the functionality of the keepalived module"""
def setUp(self):
args = {'keepalived-mib': 'KEEPALIVED-MIB',
'snmp_community': 'private',
'snmp_host': 'localhost',
'snmp_user': '',
'snmp_password': '',
'cache_dir': path + '/cache'
}
self.director = keepalived.Keepalived(path + '/scripts/ipvsadm3',
path + '/etc/keepalived.conf',
restart_cmd='',
nodes='',
args=args)
def test_show(self):
self.maxDiff = None
# Testing show on non-standard ports
expected_result = ['',
'Layer 4 Load balancing',
'======================',
'TCP 192.0.2.2:8888 rr ',
' -> 192.0.2.200:8888 Masq 1 0 0 ',
' -> 192.0.2.201:8888 Masq 1 0 0 ',
'',
'UDP 192.0.2.2:domain rr ',
' -> 192.0.2.202:domain Masq 1 0 0 ',
' -> 192.0.2.203:domain Masq 1 0 0 ',
'',
'']
self.assertEqual(self.director.show(numeric=False, color=False), expected_result)
if __name__ == "__main__":
unittest.main()
| 42.729167 | 113 | 0.376889 | 171 | 2,051 | 4.374269 | 0.473684 | 0.032086 | 0.040107 | 0.037433 | 0.128342 | 0.032086 | 0.032086 | 0 | 0 | 0 | 0 | 0.070577 | 0.509508 | 2,051 | 47 | 114 | 43.638298 | 0.672962 | 0.042906 | 0 | 0.052632 | 0 | 0 | 0.323965 | 0.011242 | 0 | 0 | 0 | 0 | 0.026316 | 1 | 0.052632 | false | 0.026316 | 0.131579 | 0 | 0.210526 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
67f2b9d79410dba976d86159718de46c71935384 | 1,416 | py | Python | faeAuditor/auditGroupResults/urlsCSV.py | opena11y/fae-auditor | ea9099b37b77ddc30092b0cdd962647c92b143a7 | [
"Apache-2.0"
] | 2 | 2018-02-28T19:03:28.000Z | 2021-09-30T13:40:23.000Z | faeAuditor/auditGroupResults/urlsCSV.py | opena11y/fae-auditor | ea9099b37b77ddc30092b0cdd962647c92b143a7 | [
"Apache-2.0"
] | 6 | 2020-02-11T21:53:58.000Z | 2022-02-10T07:57:58.000Z | faeAuditor/auditGroupResults/urlsCSV.py | opena11y/fae-auditor | ea9099b37b77ddc30092b0cdd962647c92b143a7 | [
"Apache-2.0"
] | 1 | 2019-12-05T06:05:20.000Z | 2019-12-05T06:05:20.000Z | """
Copyright 2014-2018 University of Illinois
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
file: auditResults/urlsCSV.py
Author: Jon Gunderson
"""
# reports/urls.py
from __future__ import absolute_import
from django.conf.urls import url
from .viewsCSV import GroupResultsViewCSV
from .viewsCSV import GroupResultsAuditGroupViewCSV
from .viewsCSV import GroupRuleGroupResultsViewCSV
urlpatterns = [
url(r'^all/(?P<result_slug>[\w-]+)/(?P<rule_grouping>[\w-]+)/$',
GroupResultsViewCSV,
name='group_results_csv'),
url(r'^all/(?P<result_slug>[\w-]+)/(?P<rule_grouping>[\w-]+)/g/(?P<audit_group_slug>[\w-]+)/$',
GroupResultsAuditGroupViewCSV,
name='group_results_audit_group_csv'),
# Rule grouping result views
url(r'^some/(?P<result_slug>[\w-]+)/(?P<rule_grouping>[\w-]+)/rg/(?P<rule_group_slug>[\w-]+)/$',
GroupRuleGroupResultsViewCSV,
name='group_rule_group_results_csv')
]
| 29.5 | 100 | 0.735169 | 194 | 1,416 | 5.237113 | 0.520619 | 0.059055 | 0.05315 | 0.035433 | 0.090551 | 0.090551 | 0.090551 | 0.090551 | 0.064961 | 0.064961 | 0 | 0.009868 | 0.141243 | 1,416 | 47 | 101 | 30.12766 | 0.825658 | 0.469633 | 0 | 0 | 0 | 0.125 | 0.412162 | 0.389189 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.3125 | 0 | 0.3125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
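The named groups in the `url()` regexes above can be exercised with the stdlib `re` module alone (the slug values below are made up for illustration):

```python
import re

# Same pattern shape as the 'group_results_csv' route above
pattern = re.compile(r'^all/(?P<result_slug>[\w-]+)/(?P<rule_grouping>[\w-]+)/$')
match = pattern.match('all/page-results/rule-categories/')
groups = match.groupdict()  # the captured kwargs Django would pass to the view
```

Django passes exactly these named-group matches as keyword arguments to the view callables (`GroupResultsViewCSV` and friends).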
67f2fda918bbde7a4b1b415f81dab3ffab386200 | 876 | py | Python | randomizer.py | shane1027/PollDaddySlurp | 6cc17156f38427379d095277681dbe1a68baa49d | [
"MIT"
] | null | null | null | randomizer.py | shane1027/PollDaddySlurp | 6cc17156f38427379d095277681dbe1a68baa49d | [
"MIT"
] | null | null | null | randomizer.py | shane1027/PollDaddySlurp | 6cc17156f38427379d095277681dbe1a68baa49d | [
"MIT"
] | 1 | 2019-10-10T15:19:33.000Z | 2019-10-10T15:19:33.000Z | #!/usr/bin/env python2.7
import time
from http_request_randomizer.requests.proxy.requestProxy import RequestProxy
if __name__ == '__main__':
start = time.time()
req_proxy = RequestProxy()
print "Initialization took: {0} sec".format((time.time() - start))
print "Size : ", len(req_proxy.get_proxy_list())
print " ALL = ", req_proxy.get_proxy_list()
test_url = 'http://ipv4.icanhazip.com'
while True:
start = time.time()
request = req_proxy.generate_proxied_request(test_url)
print "Proxied Request Took: {0} sec => Status: {1}".format((time.time() - start), request.__str__())
if request is not None:
print "\t Response: ip={0}".format(u''.join(request.text).encode('utf-8'))
print "Proxy List Size: ", len(req_proxy.get_proxy_list())
print"-> Going to sleep.."
time.sleep(1)
| 35.04 | 109 | 0.643836 | 117 | 876 | 4.57265 | 0.487179 | 0.074766 | 0.061682 | 0.08972 | 0.157009 | 0.119626 | 0.119626 | 0.119626 | 0 | 0 | 0 | 0.012931 | 0.205479 | 876 | 24 | 110 | 36.5 | 0.755747 | 0.026256 | 0 | 0.111111 | 0 | 0 | 0.210094 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.111111 | null | null | 0.388889 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
67f38cc9e41435b2a8a8c22aa5a456b1d76fb88e | 555 | py | Python | examples/nni_data_augmentation/basenet/data.py | petuum/tuun | 8eec472dbf0e5e695449b0fa2d98985469fd5b30 | [
"Apache-2.0"
] | 33 | 2020-08-30T16:22:35.000Z | 2022-02-26T13:48:32.000Z | examples/nni_data_augmentation/basenet/data.py | petuum/tuun | 8eec472dbf0e5e695449b0fa2d98985469fd5b30 | [
"Apache-2.0"
] | 2 | 2021-01-18T19:46:43.000Z | 2021-03-24T09:59:14.000Z | examples/nni_data_augmentation/basenet/data.py | petuum/tuun | 8eec472dbf0e5e695449b0fa2d98985469fd5b30 | [
"Apache-2.0"
] | 2 | 2020-08-25T17:02:15.000Z | 2021-04-21T16:40:44.000Z | #!/usr/bin/env python
"""
data.py
"""
import itertools
def loopy_wrapper(gen):
while True:
for x in gen:
yield x
class ZipDataloader:
def __init__(self, dataloaders):
self.dataloaders = dataloaders
self._len = len(dataloaders[0])
def __len__(self):
return self._len
def __iter__(self):
counter = 0
iters = [loopy_wrapper(d) for d in self.dataloaders]
while counter < len(self):
yield tuple(zip(*[next(it) for it in iters]))
counter += 1
| 18.5 | 60 | 0.578378 | 69 | 555 | 4.42029 | 0.507246 | 0.147541 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007895 | 0.315315 | 555 | 29 | 61 | 19.137931 | 0.794737 | 0.05045 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.235294 | false | 0 | 0.058824 | 0.058824 | 0.411765 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
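`ZipDataloader` above cycles the shorter loaders so that iteration length follows the first one. A stdlib-only sketch of that behavior (simplified: it skips the batch transpose the class performs with `zip(*...)`):

```python
def loopy(seq):
    # endlessly cycle a finite iterable, like loopy_wrapper above
    while True:
        yield from seq

def zip_loaders(loaders):
    """Yield one tuple per element of the first loader, cycling the others."""
    its = [loopy(d) for d in loaders]
    for _ in range(len(loaders[0])):
        yield tuple(next(it) for it in its)

# Two-element loader wraps around while the three-element one runs once
batches = list(zip_loaders([["a", "b", "c"], [10, 20]]))
```

This mirrors why `__len__` returns `len(dataloaders[0])`: the first loader sets the epoch length and every other loader is treated as infinite.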
67facec68d3d68647d57845cc972fe7ead4b3012 | 793 | py | Python | lnbits/extensions/usermanager/models.py | blackcoffeexbt/lnbits-legend | a9f2877af77ea56d1900e2b5bc1c21b9b7ac2f64 | [
"MIT"
] | 76 | 2021-11-02T22:19:59.000Z | 2022-03-30T18:01:33.000Z | lnbits/extensions/usermanager/models.py | blackcoffeexbt/lnbits-legend | a9f2877af77ea56d1900e2b5bc1c21b9b7ac2f64 | [
"MIT"
] | 100 | 2021-11-04T16:33:28.000Z | 2022-03-30T15:03:52.000Z | lnbits/extensions/usermanager/models.py | blackcoffeexbt/lnbits-legend | a9f2877af77ea56d1900e2b5bc1c21b9b7ac2f64 | [
"MIT"
] | 57 | 2021-11-08T06:43:59.000Z | 2022-03-31T08:53:16.000Z | from sqlite3 import Row
from fastapi.param_functions import Query
from pydantic import BaseModel
from typing import Optional
class CreateUserData(BaseModel):
user_name: str = Query(...)
wallet_name: str = Query(...)
admin_id: str = Query(...)
email: str = Query("")
password: str = Query("")
class CreateUserWallet(BaseModel):
user_id: str = Query(...)
wallet_name: str = Query(...)
admin_id: str = Query(...)
class Users(BaseModel):
id: str
name: str
admin: str
email: Optional[str] = None
password: Optional[str] = None
class Wallets(BaseModel):
id: str
admin: str
name: str
user: str
adminkey: str
inkey: str
@classmethod
def from_row(cls, row: Row) -> "Wallets":
return cls(**dict(row))
| 19.341463 | 45 | 0.630517 | 98 | 793 | 5.020408 | 0.336735 | 0.130081 | 0.073171 | 0.073171 | 0.166667 | 0.166667 | 0.166667 | 0.166667 | 0.166667 | 0.166667 | 0 | 0.001667 | 0.24338 | 793 | 40 | 46 | 19.825 | 0.818333 | 0 | 0 | 0.333333 | 0 | 0 | 0.008827 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.033333 | false | 0.066667 | 0.133333 | 0.033333 | 0.966667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
67ff40cfd4c8a6b2e69d26c388ef6020f73b4c94 | 2,151 | py | Python | river/migrations/0012_auto_20191113_1550.py | xuziheng1002/django-river | 7c7f23aa4790e451019c3e2b4d29f35852de17e6 | [
"BSD-3-Clause"
] | null | null | null | river/migrations/0012_auto_20191113_1550.py | xuziheng1002/django-river | 7c7f23aa4790e451019c3e2b4d29f35852de17e6 | [
"BSD-3-Clause"
] | null | null | null | river/migrations/0012_auto_20191113_1550.py | xuziheng1002/django-river | 7c7f23aa4790e451019c3e2b4d29f35852de17e6 | [
"BSD-3-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by Django 1.11.25 on 2019-11-13 21:50
from __future__ import unicode_literals
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
    dependencies = [
        ('river', '0011_auto_20191110_1411'),
    ]

    operations = [
        migrations.AlterField(
            model_name='onapprovedhook',
            name='transition_approval',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='on_approved_hooks', to='river.TransitionApproval', verbose_name='Transition Approval'),
        ),
        migrations.AlterField(
            model_name='onapprovedhook',
            name='transition_approval_meta',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='on_approved_hooks', to='river.TransitionApprovalMeta', verbose_name='Transition Approval Meta'),
        ),
        migrations.AlterField(
            model_name='ontransithook',
            name='transition',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='on_transit_hooks', to='river.Transition', verbose_name='Transition'),
        ),
        migrations.AlterField(
            model_name='ontransithook',
            name='transition_meta',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='on_transit_hooks', to='river.TransitionMeta', verbose_name='Transition Meta'),
        ),
        migrations.AlterField(
            model_name='workflow',
            name='content_type',
            field=models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to='contenttypes.ContentType', verbose_name='Content Type'),
        ),
        migrations.AlterField(
            model_name='workflow',
            name='initial_state',
            field=models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='workflow_this_set_as_initial_state', to='river.State', verbose_name='Initial State'),
        ),
    ]
| 45.765957 | 191 | 0.664807 | 228 | 2,151 | 6.052632 | 0.29386 | 0.046377 | 0.071014 | 0.111594 | 0.617391 | 0.611594 | 0.552174 | 0.471014 | 0.376812 | 0.376812 | 0 | 0.020214 | 0.218038 | 2,151 | 46 | 192 | 46.76087 | 0.800238 | 0.032078 | 0 | 0.45 | 1 | 0 | 0.243867 | 0.075517 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.075 | 0 | 0.15 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
db031f4543bacf2c603d4a3ccb452d553dc3e0d6 | 486 | py | Python | user/migrations/0004_auto_20200813_1948.py | VladimirZubavlenko/ikaf42-app | 240e012675e4347370289554f34d9c60c8b6f35d | [
"MIT"
] | null | null | null | user/migrations/0004_auto_20200813_1948.py | VladimirZubavlenko/ikaf42-app | 240e012675e4347370289554f34d9c60c8b6f35d | [
"MIT"
] | null | null | null | user/migrations/0004_auto_20200813_1948.py | VladimirZubavlenko/ikaf42-app | 240e012675e4347370289554f34d9c60c8b6f35d | [
"MIT"
] | null | null | null | # Generated by Django 3.0.5 on 2020-08-13 19:48
from django.db import migrations, models
class Migration(migrations.Migration):
    dependencies = [
        ('user', '0003_auto_20200813_1943'),
    ]

    operations = [
        migrations.AlterField(
            model_name='user',
            name='emailConfirmToken',
            field=models.TextField(default='-CBGbHSkumN38RqAx2UPSak73vs1Tklm2_-xoY1V', max_length=30, verbose_name='Токен подтверждения почты'),
        ),
    ]
| 25.578947 | 144 | 0.652263 | 50 | 486 | 6.2 | 0.84 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.110512 | 0.236626 | 486 | 18 | 145 | 27 | 0.725067 | 0.092593 | 0 | 0 | 1 | 0 | 0.257403 | 0.143508 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
db063dcff6ca568e771df05b7ae7f650c6cd2aea | 4,270 | py | Python | interpreter.py | bendmorris/beaver | 4db3e1690145dee89d30144f3632396313218214 | [
"MIT"
] | 2 | 2018-10-06T08:35:41.000Z | 2019-04-03T21:15:02.000Z | interpreter.py | bendmorris/beaver | 4db3e1690145dee89d30144f3632396313218214 | [
"MIT"
] | null | null | null | interpreter.py | bendmorris/beaver | 4db3e1690145dee89d30144f3632396313218214 | [
"MIT"
] | null | null | null | import argparse
import os
import sys
from lib.graph import Graph
from lib.types import BeaverException, Uri
from lib.command import OutCommand
import sys
reload(sys)
sys.setdefaultencoding('utf8')
from __init__ import __version__
arg_parser = argparse.ArgumentParser()
arg_parser.add_argument('--version', help='print version and exit', action='version', version=__version__)
arg_parser.add_argument('-t', '--test', help='run unit tests and exit', action='store_true')
arg_parser.add_argument('file', nargs='*', help='file to be interpreted')
arg_parser.add_argument('-i', '--interactive', help='enter interactive mode after interpreting file', action='store_true')
arg_parser.add_argument('-e', '--eval', help='string to be evaluated')
arg_parser.add_argument('-v', '--verbose', help='print each triple statement as evaluated', action='store_true')
arg_parser.add_argument('-d', '--draw', help='output an image of the resulting graph to the given image file; image type is inferred from file extension')
arg_parser.add_argument('-o', '--out', help='serialize the resulting graph to the given output file (using Turtle)', nargs='?', const=True, default=None)
args = arg_parser.parse_args()
#print args.__dict__
if args.test:
    import tests
    tests.run_tests(verbose=args.verbose)
    sys.exit()

if not sys.stdin.isatty():
    # read and evaluate piped input
    if args.eval is None: args.eval = ''
    args.eval = sys.stdin.read() + args.eval
interactive = (not args.file and not args.eval) or (args.interactive and sys.stdin.isatty())
def run():
    if interactive: print '''Beaver %s''' % __version__
    graph = Graph(verbose=args.verbose)

    for input_file in args.file:
        try:
            graph.parse(filename=input_file)
        except KeyboardInterrupt:
            print "KeyboardInterrupt"
            sys.exit()
        except Exception as e:
            print e
            sys.exit()

    if args.eval:
        try:
            graph.parse(text=args.eval)
        except KeyboardInterrupt:
            print "KeyboardInterrupt"
            sys.exit()
        except Exception as e:
            print e
            sys.exit()

    if interactive:
        import readline
        exit = False
        while not exit:
            graph.verbose = args.verbose
            try:
                next_line = raw_input('>> ').strip()
                if not next_line: continue
                if next_line[0] == '-' and next_line.split(' ')[0] in arg_parser._option_string_actions:
                    command = next_line.split(' ')[0]
                    action = arg_parser._option_string_actions[command].dest
                    if len(next_line.split(' ')) > 1:
                        arg = ' '.join(next_line.split(' ')[1:])
                        try: arg = eval(arg)
                        except: pass
                    else:
                        arg = not getattr(args, action)
                    try:
                        setattr(args, action, arg)
                    except:
                        print 'Illegal argument: %s %s' % (command, arg)
                elif next_line in ('exit', 'quit'):
                    exit = True
                else:
                    stmts = graph.parse(text=next_line)
                    if stmts == 0:
                        raise BeaverException('Failed to parse line: %s' % next_line)
            except EOFError:
                print
                exit = True
            except KeyboardInterrupt:
                print
                continue
            except Exception as e:
                print e
                continue

    if args.out:
        if args.out is True:
            filename = None
        else:
            filename = args.out
            # wrap the filename in <...> unless it is already wrapped
            if not (filename.startswith('<') and filename.endswith('>')):
                filename = '<%s>' % os.path.abspath(filename)
            filename = Uri(filename)
        graph.execute(OutCommand(filename))

    if args.draw:
        graph.draw(args.draw)

if __name__ == '__main__': run()
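The interactive loop above maps a typed option string like `--verbose` back to its argparse `Action` through the parser's `_option_string_actions` dict, then toggles the corresponding attribute on the parsed namespace via `Action.dest`. A minimal Python 3 sketch of that lookup (note `_option_string_actions` is a private argparse attribute, not part of the documented API, so this mirrors the interpreter's approach rather than a recommended one):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('-v', '--verbose', action='store_true',
                    help='print each triple statement as evaluated')

# _option_string_actions maps every option string ('-v' and '--verbose')
# to the same Action object; .dest names the attribute on the namespace.
action = parser._option_string_actions['--verbose']
args = parser.parse_args([])  # verbose defaults to False

# Toggle the flag the same way the interactive loop does for bare options.
setattr(args, action.dest, not getattr(args, action.dest))
```

After the toggle, `args.verbose` is `True`, which is how `-v` typed at the `>>` prompt flips verbosity mid-session.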
| 32.846154 | 154 | 0.544028 | 468 | 4,270 | 4.809829 | 0.284188 | 0.047979 | 0.042648 | 0.07108 | 0.187916 | 0.187916 | 0.122168 | 0.075522 | 0.075522 | 0.075522 | 0 | 0.002536 | 0.35363 | 4,270 | 129 | 155 | 33.100775 | 0.813043 | 0.011475 | 0 | 0.319588 | 0 | 0 | 0.138421 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.010309 | 0.103093 | null | null | 0.113402 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
db0693e026c74e759573c7252d4aff5ef90ae5ad | 242 | py | Python | euler/28.py | DevStarSJ/algorithmExercise | 66b42c54cdd594ff3f229613fd83446f8c1f9153 | [
"MIT"
] | null | null | null | euler/28.py | DevStarSJ/algorithmExercise | 66b42c54cdd594ff3f229613fd83446f8c1f9153 | [
"MIT"
] | null | null | null | euler/28.py | DevStarSJ/algorithmExercise | 66b42c54cdd594ff3f229613fd83446f8c1f9153 | [
"MIT"
] | null | null | null | def get_cross_sum(n):
    start = 1
    total = 1
    for i in range(1, n):
        step = i * 2
        start = start + step
        total += start * 4 + step * 6
        start = start + step * 3
    return total
print(get_cross_sum(501)) | 18.615385 | 37 | 0.516529 | 37 | 242 | 3.27027 | 0.540541 | 0.132231 | 0.181818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.066667 | 0.380165 | 242 | 13 | 38 | 18.615385 | 0.74 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0 | 0 | 0.2 | 0.1 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
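The recurrence in `get_cross_sum` walks the spiral ring by ring: ring `i` has corner spacing `step = 2*i`, its first corner is the previous `start` plus `step`, and its four corners sum to `4*start + 6*step`. It can be sanity-checked against small spirals, restating the function so the snippet is self-contained:

```python
def get_cross_sum(n):
    # Diagonal sum of a (2n-1) x (2n-1) number spiral (Project Euler 28).
    start = 1   # current corner value
    total = 1   # the centre cell
    for i in range(1, n):
        step = i * 2                    # corner spacing on ring i
        start += step                   # first corner of ring i
        total += start * 4 + step * 6   # the ring's four corners
        start += step * 3               # advance to the ring's last corner
    return total

print(get_cross_sum(3))  # 5x5 spiral: 1+3+5+7+9+13+17+21+25 = 101
```

For the 3x3 spiral the corners are 3, 5, 7 and 9, giving `get_cross_sum(2) == 25`, which matches a direct count; `get_cross_sum(501)` then answers the 1001x1001 case the script prints.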
db086691881d363f79126af6b8d208d584242b29 | 114,519 | py | Python | cisco-ios-xe/ydk/models/cisco_ios_xe/MPLS_LDP_STD_MIB.py | Maikor/ydk-py | b86c4a7c570ae3b2c5557d098420446df5de4929 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | cisco-ios-xe/ydk/models/cisco_ios_xe/MPLS_LDP_STD_MIB.py | Maikor/ydk-py | b86c4a7c570ae3b2c5557d098420446df5de4929 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | cisco-ios-xe/ydk/models/cisco_ios_xe/MPLS_LDP_STD_MIB.py | Maikor/ydk-py | b86c4a7c570ae3b2c5557d098420446df5de4929 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | """ MPLS_LDP_STD_MIB
Copyright (C) The Internet Society (2004). The
initial version of this MIB module was published
in RFC 3815. For full legal notices see the RFC
itself or see\:
http\://www.ietf.org/copyrights/ianamib.html
This MIB contains managed object definitions for the
'Multiprotocol Label Switching, Label Distribution
Protocol, LDP' document.
"""
from collections import OrderedDict
from ydk.types import Entity, EntityPath, Identity, Enum, YType, YLeaf, YLeafList, YList, LeafDataList, Bits, Empty, Decimal64
from ydk.filters import YFilter
from ydk.errors import YError, YModelError
from ydk.errors.error_handler import handle_type_error as _handle_type_error
class MPLSLDPSTDMIB(Entity):
"""
.. attribute:: mplsldplsrobjects
**type**\: :py:class:`MplsLdpLsrObjects <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpLsrObjects>`
.. attribute:: mplsldpentityobjects
**type**\: :py:class:`MplsLdpEntityObjects <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpEntityObjects>`
.. attribute:: mplsldpsessionobjects
**type**\: :py:class:`MplsLdpSessionObjects <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpSessionObjects>`
.. attribute:: mplsfecobjects
**type**\: :py:class:`MplsFecObjects <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsFecObjects>`
.. attribute:: mplsldpentitytable
This table contains information about the MPLS Label Distribution Protocol Entities which exist on this Label Switching Router (LSR) or Label Edge Router (LER)
**type**\: :py:class:`MplsLdpEntityTable <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpEntityTable>`
.. attribute:: mplsldppeertable
Information about LDP peers known by Entities in the mplsLdpEntityTable. The information in this table is based on information from the Entity\-Peer interaction during session initialization but is not appropriate for the mplsLdpSessionTable, because objects in this table may or may not be used in session establishment
**type**\: :py:class:`MplsLdpPeerTable <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpPeerTable>`
.. attribute:: mplsldphelloadjacencytable
A table of Hello Adjacencies for Sessions
**type**\: :py:class:`MplsLdpHelloAdjacencyTable <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpHelloAdjacencyTable>`
.. attribute:: mplsinsegmentldplsptable
A table of LDP LSP's which map to the mplsInSegmentTable in the MPLS\-LSR\-STD\-MIB module
**type**\: :py:class:`MplsInSegmentLdpLspTable <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsInSegmentLdpLspTable>`
.. attribute:: mplsoutsegmentldplsptable
A table of LDP LSP's which map to the mplsOutSegmentTable in the MPLS\-LSR\-STD\-MIB
**type**\: :py:class:`MplsOutSegmentLdpLspTable <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsOutSegmentLdpLspTable>`
.. attribute:: mplsfectable
This table represents the FEC (Forwarding Equivalence Class) Information associated with an LSP
**type**\: :py:class:`MplsFecTable <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsFecTable>`
.. attribute:: mplsldplspfectable
A table which shows the relationship between LDP LSPs and FECs. Each row represents a single LDP LSP to FEC association
**type**\: :py:class:`MplsLdpLspFecTable <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpLspFecTable>`
.. attribute:: mplsldpsessionpeeraddrtable
This table 'extends' the mplsLdpSessionTable. This table is used to store Label Address Information from Label Address Messages received by this LSR from Peers. This table is read\-only and should be updated when Label Withdraw Address Messages are received, i.e., Rows should be deleted as appropriate. NOTE\: since more than one address may be contained in a Label Address Message, this table 'sparse augments', the mplsLdpSessionTable's information
**type**\: :py:class:`MplsLdpSessionPeerAddrTable <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpSessionPeerAddrTable>`
"""
_prefix = 'MPLS-LDP-STD-MIB'
_revision = '2004-06-03'
def __init__(self):
super(MPLSLDPSTDMIB, self).__init__()
self._top_entity = None
self.yang_name = "MPLS-LDP-STD-MIB"
self.yang_parent_name = "MPLS-LDP-STD-MIB"
self.is_top_level_class = True
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("mplsLdpLsrObjects", ("mplsldplsrobjects", MPLSLDPSTDMIB.MplsLdpLsrObjects)), ("mplsLdpEntityObjects", ("mplsldpentityobjects", MPLSLDPSTDMIB.MplsLdpEntityObjects)), ("mplsLdpSessionObjects", ("mplsldpsessionobjects", MPLSLDPSTDMIB.MplsLdpSessionObjects)), ("mplsFecObjects", ("mplsfecobjects", MPLSLDPSTDMIB.MplsFecObjects)), ("mplsLdpEntityTable", ("mplsldpentitytable", MPLSLDPSTDMIB.MplsLdpEntityTable)), ("mplsLdpPeerTable", ("mplsldppeertable", MPLSLDPSTDMIB.MplsLdpPeerTable)), ("mplsLdpHelloAdjacencyTable", ("mplsldphelloadjacencytable", MPLSLDPSTDMIB.MplsLdpHelloAdjacencyTable)), ("mplsInSegmentLdpLspTable", ("mplsinsegmentldplsptable", MPLSLDPSTDMIB.MplsInSegmentLdpLspTable)), ("mplsOutSegmentLdpLspTable", ("mplsoutsegmentldplsptable", MPLSLDPSTDMIB.MplsOutSegmentLdpLspTable)), ("mplsFecTable", ("mplsfectable", MPLSLDPSTDMIB.MplsFecTable)), ("mplsLdpLspFecTable", ("mplsldplspfectable", MPLSLDPSTDMIB.MplsLdpLspFecTable)), ("mplsLdpSessionPeerAddrTable", ("mplsldpsessionpeeraddrtable", MPLSLDPSTDMIB.MplsLdpSessionPeerAddrTable))])
self._leafs = OrderedDict()
self.mplsldplsrobjects = MPLSLDPSTDMIB.MplsLdpLsrObjects()
self.mplsldplsrobjects.parent = self
self._children_name_map["mplsldplsrobjects"] = "mplsLdpLsrObjects"
self.mplsldpentityobjects = MPLSLDPSTDMIB.MplsLdpEntityObjects()
self.mplsldpentityobjects.parent = self
self._children_name_map["mplsldpentityobjects"] = "mplsLdpEntityObjects"
self.mplsldpsessionobjects = MPLSLDPSTDMIB.MplsLdpSessionObjects()
self.mplsldpsessionobjects.parent = self
self._children_name_map["mplsldpsessionobjects"] = "mplsLdpSessionObjects"
self.mplsfecobjects = MPLSLDPSTDMIB.MplsFecObjects()
self.mplsfecobjects.parent = self
self._children_name_map["mplsfecobjects"] = "mplsFecObjects"
self.mplsldpentitytable = MPLSLDPSTDMIB.MplsLdpEntityTable()
self.mplsldpentitytable.parent = self
self._children_name_map["mplsldpentitytable"] = "mplsLdpEntityTable"
self.mplsldppeertable = MPLSLDPSTDMIB.MplsLdpPeerTable()
self.mplsldppeertable.parent = self
self._children_name_map["mplsldppeertable"] = "mplsLdpPeerTable"
self.mplsldphelloadjacencytable = MPLSLDPSTDMIB.MplsLdpHelloAdjacencyTable()
self.mplsldphelloadjacencytable.parent = self
self._children_name_map["mplsldphelloadjacencytable"] = "mplsLdpHelloAdjacencyTable"
self.mplsinsegmentldplsptable = MPLSLDPSTDMIB.MplsInSegmentLdpLspTable()
self.mplsinsegmentldplsptable.parent = self
self._children_name_map["mplsinsegmentldplsptable"] = "mplsInSegmentLdpLspTable"
self.mplsoutsegmentldplsptable = MPLSLDPSTDMIB.MplsOutSegmentLdpLspTable()
self.mplsoutsegmentldplsptable.parent = self
self._children_name_map["mplsoutsegmentldplsptable"] = "mplsOutSegmentLdpLspTable"
self.mplsfectable = MPLSLDPSTDMIB.MplsFecTable()
self.mplsfectable.parent = self
self._children_name_map["mplsfectable"] = "mplsFecTable"
self.mplsldplspfectable = MPLSLDPSTDMIB.MplsLdpLspFecTable()
self.mplsldplspfectable.parent = self
self._children_name_map["mplsldplspfectable"] = "mplsLdpLspFecTable"
self.mplsldpsessionpeeraddrtable = MPLSLDPSTDMIB.MplsLdpSessionPeerAddrTable()
self.mplsldpsessionpeeraddrtable.parent = self
self._children_name_map["mplsldpsessionpeeraddrtable"] = "mplsLdpSessionPeerAddrTable"
self._segment_path = lambda: "MPLS-LDP-STD-MIB:MPLS-LDP-STD-MIB"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(MPLSLDPSTDMIB, [], name, value)
class MplsLdpLsrObjects(Entity):
"""
.. attribute:: mplsldplsrid
The Label Switching Router's Identifier
**type**\: str
**length:** 4
.. attribute:: mplsldplsrloopdetectioncapable
An indication of whether this Label Switching Router supports loop detection. none(1) \-\- Loop Detection is not supported on this LSR. other(2) \-\- Loop Detection is supported but by a method other than those listed below. hopCount(3) \-\- Loop Detection is supported by Hop Count only. pathVector(4) \-\- Loop Detection is supported by Path Vector only. hopCountAndPathVector(5) \-\- Loop Detection is supported by both Hop Count And Path Vector. Since Loop Detection is determined during Session Initialization, an individual session may not be running with loop detection. This object simply gives an indication of whether or not the LSR has the ability to support Loop Detection and which types
**type**\: :py:class:`MplsLdpLsrLoopDetectionCapable <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpLsrObjects.MplsLdpLsrLoopDetectionCapable>`
"""
_prefix = 'MPLS-LDP-STD-MIB'
_revision = '2004-06-03'
def __init__(self):
super(MPLSLDPSTDMIB.MplsLdpLsrObjects, self).__init__()
self.yang_name = "mplsLdpLsrObjects"
self.yang_parent_name = "MPLS-LDP-STD-MIB"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('mplsldplsrid', (YLeaf(YType.str, 'mplsLdpLsrId'), ['str'])),
('mplsldplsrloopdetectioncapable', (YLeaf(YType.enumeration, 'mplsLdpLsrLoopDetectionCapable'), [('ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB', 'MPLSLDPSTDMIB', 'MplsLdpLsrObjects.MplsLdpLsrLoopDetectionCapable')])),
])
self.mplsldplsrid = None
self.mplsldplsrloopdetectioncapable = None
self._segment_path = lambda: "mplsLdpLsrObjects"
self._absolute_path = lambda: "MPLS-LDP-STD-MIB:MPLS-LDP-STD-MIB/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(MPLSLDPSTDMIB.MplsLdpLsrObjects, [u'mplsldplsrid', u'mplsldplsrloopdetectioncapable'], name, value)
class MplsLdpLsrLoopDetectionCapable(Enum):
"""
MplsLdpLsrLoopDetectionCapable (Enum Class)
An indication of whether this
Label Switching Router supports
loop detection.
none(1) \-\- Loop Detection is not supported
on this LSR.
other(2) \-\- Loop Detection is supported but
by a method other than those
listed below.
hopCount(3) \-\- Loop Detection is supported by
Hop Count only.
pathVector(4) \-\- Loop Detection is supported by
Path Vector only.
hopCountAndPathVector(5) \-\- Loop Detection is
supported by both Hop Count
And Path Vector.
Since Loop Detection is determined during
Session Initialization, an individual session
may not be running with loop detection. This
object simply gives an indication of whether or not the
LSR has the ability to support Loop Detection and
which types.
.. data:: none = 1
.. data:: other = 2
.. data:: hopCount = 3
.. data:: pathVector = 4
.. data:: hopCountAndPathVector = 5
"""
none = Enum.YLeaf(1, "none")
other = Enum.YLeaf(2, "other")
hopCount = Enum.YLeaf(3, "hopCount")
pathVector = Enum.YLeaf(4, "pathVector")
hopCountAndPathVector = Enum.YLeaf(5, "hopCountAndPathVector")
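The generated Enum class above boils down to a plain value-to-name mapping. An illustrative restatement with the standard-library `enum` module (names and numeric values taken from the MIB definition above; the real class wraps each member in ydk's `Enum.YLeaf`, which this sketch does not reproduce):

```python
from enum import Enum


# Value/name pairs of mplsLdpLsrLoopDetectionCapable from the MIB.
class LoopDetectionCapable(Enum):
    none = 1
    other = 2
    hopCount = 3
    pathVector = 4
    hopCountAndPathVector = 5


# Converting a received integer back to its symbolic name, as an SNMP
# manager would when decoding this object.
cap = LoopDetectionCapable(5)
```

Here `LoopDetectionCapable(5)` resolves to the `hopCountAndPathVector` member, matching the MIB's statement that value 5 means both Hop Count and Path Vector detection are supported.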
class MplsLdpEntityObjects(Entity):
"""
.. attribute:: mplsldpentitylastchange
The value of sysUpTime at the time of the most recent addition or deletion of an entry to/from the mplsLdpEntityTable/mplsLdpEntityStatsTable, or the most recent change in value of any objects in the mplsLdpEntityTable. If no such changes have occurred since the last re\-initialization of the local management subsystem, then this object contains a zero value
**type**\: int
**range:** 0..4294967295
.. attribute:: mplsldpentityindexnext
This object contains an appropriate value to be used for mplsLdpEntityIndex when creating entries in the mplsLdpEntityTable. The value 0 indicates that no unassigned entries are available
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'MPLS-LDP-STD-MIB'
_revision = '2004-06-03'
def __init__(self):
super(MPLSLDPSTDMIB.MplsLdpEntityObjects, self).__init__()
self.yang_name = "mplsLdpEntityObjects"
self.yang_parent_name = "MPLS-LDP-STD-MIB"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('mplsldpentitylastchange', (YLeaf(YType.uint32, 'mplsLdpEntityLastChange'), ['int'])),
('mplsldpentityindexnext', (YLeaf(YType.uint32, 'mplsLdpEntityIndexNext'), ['int'])),
])
self.mplsldpentitylastchange = None
self.mplsldpentityindexnext = None
self._segment_path = lambda: "mplsLdpEntityObjects"
self._absolute_path = lambda: "MPLS-LDP-STD-MIB:MPLS-LDP-STD-MIB/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(MPLSLDPSTDMIB.MplsLdpEntityObjects, [u'mplsldpentitylastchange', u'mplsldpentityindexnext'], name, value)
class MplsLdpSessionObjects(Entity):
"""
.. attribute:: mplsldppeerlastchange
The value of sysUpTime at the time of the most recent addition or deletion to/from the mplsLdpPeerTable/mplsLdpSessionTable
**type**\: int
**range:** 0..4294967295
.. attribute:: mplsldplspfeclastchange
The value of sysUpTime at the time of the most recent addition/deletion of an entry to/from the mplsLdpLspFecTable or the most recent change in values to any objects in the mplsLdpLspFecTable. If no such changes have occurred since the last re\-initialization of the local management subsystem, then this object contains a zero value
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'MPLS-LDP-STD-MIB'
_revision = '2004-06-03'
def __init__(self):
super(MPLSLDPSTDMIB.MplsLdpSessionObjects, self).__init__()
self.yang_name = "mplsLdpSessionObjects"
self.yang_parent_name = "MPLS-LDP-STD-MIB"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('mplsldppeerlastchange', (YLeaf(YType.uint32, 'mplsLdpPeerLastChange'), ['int'])),
('mplsldplspfeclastchange', (YLeaf(YType.uint32, 'mplsLdpLspFecLastChange'), ['int'])),
])
self.mplsldppeerlastchange = None
self.mplsldplspfeclastchange = None
self._segment_path = lambda: "mplsLdpSessionObjects"
self._absolute_path = lambda: "MPLS-LDP-STD-MIB:MPLS-LDP-STD-MIB/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(MPLSLDPSTDMIB.MplsLdpSessionObjects, [u'mplsldppeerlastchange', u'mplsldplspfeclastchange'], name, value)
class MplsFecObjects(Entity):
"""
.. attribute:: mplsfeclastchange
The value of sysUpTime at the time of the most recent addition/deletion of an entry to/from the mplsLdpFectTable or the most recent change in values to any objects in the mplsLdpFecTable. If no such changes have occurred since the last re\-initialization of the local management subsystem, then this object contains a zero value
**type**\: int
**range:** 0..4294967295
.. attribute:: mplsfecindexnext
This object contains an appropriate value to be used for mplsFecIndex when creating entries in the mplsFecTable. The value 0 indicates that no unassigned entries are available
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'MPLS-LDP-STD-MIB'
_revision = '2004-06-03'
def __init__(self):
super(MPLSLDPSTDMIB.MplsFecObjects, self).__init__()
self.yang_name = "mplsFecObjects"
self.yang_parent_name = "MPLS-LDP-STD-MIB"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('mplsfeclastchange', (YLeaf(YType.uint32, 'mplsFecLastChange'), ['int'])),
('mplsfecindexnext', (YLeaf(YType.uint32, 'mplsFecIndexNext'), ['int'])),
])
self.mplsfeclastchange = None
self.mplsfecindexnext = None
self._segment_path = lambda: "mplsFecObjects"
self._absolute_path = lambda: "MPLS-LDP-STD-MIB:MPLS-LDP-STD-MIB/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(MPLSLDPSTDMIB.MplsFecObjects, [u'mplsfeclastchange', u'mplsfecindexnext'], name, value)
class MplsLdpEntityTable(Entity):
"""
This table contains information about the
MPLS Label Distribution Protocol Entities which
exist on this Label Switching Router (LSR)
or Label Edge Router (LER).
.. attribute:: mplsldpentityentry
An entry in this table represents an LDP entity. An entry can be created by a network administrator or by an SNMP agent as instructed by LDP
**type**\: list of :py:class:`MplsLdpEntityEntry <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpEntityTable.MplsLdpEntityEntry>`
"""
_prefix = 'MPLS-LDP-STD-MIB'
_revision = '2004-06-03'
def __init__(self):
super(MPLSLDPSTDMIB.MplsLdpEntityTable, self).__init__()
self.yang_name = "mplsLdpEntityTable"
self.yang_parent_name = "MPLS-LDP-STD-MIB"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("mplsLdpEntityEntry", ("mplsldpentityentry", MPLSLDPSTDMIB.MplsLdpEntityTable.MplsLdpEntityEntry))])
self._leafs = OrderedDict()
self.mplsldpentityentry = YList(self)
self._segment_path = lambda: "mplsLdpEntityTable"
self._absolute_path = lambda: "MPLS-LDP-STD-MIB:MPLS-LDP-STD-MIB/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(MPLSLDPSTDMIB.MplsLdpEntityTable, [], name, value)
class MplsLdpEntityEntry(Entity):
"""
An entry in this table represents an LDP entity.
An entry can be created by a network administrator
or by an SNMP agent as instructed by LDP.
.. attribute:: mplsldpentityldpid (key)
The LDP identifier
**type**\: str
.. attribute:: mplsldpentityindex (key)
This index is used as a secondary index to uniquely identify this row. Before creating a row in this table, the 'mplsLdpEntityIndexNext' object should be retrieved. That value should be used for the value of this index when creating a row in this table. NOTE\: if a value of zero (0) is retrieved, that indicates that no rows can be created in this table at this time. A secondary index (this object) is meaningful to some, but not all, LDP implementations. For example, an LDP implementation which uses PPP would use this index to differentiate PPP sub\-links. Another way to use this index is to give this the value of ifIndex. However, this is dependent on the implementation
**type**\: int
**range:** 1..4294967295
.. attribute:: mplsldpentityprotocolversion
The version number of the LDP protocol which will be used in the session initialization message. Section 3.5.3 in the LDP Specification specifies that the version of the LDP protocol is negotiated during session establishment. The value of this object represents the value that is sent in the initialization message
**type**\: int
**range:** 1..65535
.. attribute:: mplsldpentityadminstatus
The administrative status of this LDP Entity. If this object is changed from 'enable' to 'disable' and this entity has already attempted to establish contact with a Peer, then all contact with that Peer is lost and all information from that Peer needs to be removed from the MIB. (This implies that the network management subsystem should clean up any related entry in the mplsLdpPeerTable. This further implies that a 'tear\-down' for that session is issued and the session and all information related to that session cease to exist). At this point the operator is able to change values which are related to this entity. When the admin status is set back to 'enable', then this Entity will attempt to establish a new session with the Peer
**type**\: :py:class:`MplsLdpEntityAdminStatus <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpEntityTable.MplsLdpEntityEntry.MplsLdpEntityAdminStatus>`
.. attribute:: mplsldpentityoperstatus
The operational status of this LDP Entity. The value of unknown(1) indicates that the operational status cannot be determined at this time. The value of unknown should be a transient condition before changing to enabled(2) or disabled(3)
**type**\: :py:class:`MplsLdpEntityOperStatus <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpEntityTable.MplsLdpEntityEntry.MplsLdpEntityOperStatus>`
.. attribute:: mplsldpentitytcpport
The TCP Port for LDP. The default value is the well\-known value of this port
**type**\: int
**range:** 0..65535
.. attribute:: mplsldpentityudpdscport
The UDP Discovery Port for LDP. The default value is the well\-known value for this port
**type**\: int
**range:** 0..65535
.. attribute:: mplsldpentitymaxpdulength
The maximum PDU Length that is sent in the Common Session Parameters of an Initialization Message. According to the LDP Specification [RFC3036] a value of 255 or less specifies the default maximum length of 4096 octets, this is why the value of this object starts at 256. The operator should explicitly choose the default value (i.e., 4096), or some other value. The receiving LSR MUST calculate the maximum PDU length for the session by using the smaller of its and its peer's proposals for Max PDU Length
**type**\: int
**range:** 256..65535
**units**\: octets
.. attribute:: mplsldpentitykeepaliveholdtimer
The 16\-bit integer value which is the proposed keep alive hold timer for this LDP Entity
**type**\: int
**range:** 1..65535
**units**\: seconds
.. attribute:: mplsldpentityhelloholdtimer
The 16\-bit integer value which is the proposed Hello hold timer for this LDP Entity. The Hello Hold time in seconds. An LSR maintains a record of Hellos received from potential peers. This object represents the Hold Time in the Common Hello Parameters TLV of the Hello Message. A value of 0 is a default value and should be interpreted in conjunction with the mplsLdpEntityTargetPeer object. If the value of this object is 0\: if the value of the mplsLdpEntityTargetPeer object is false(2), then this specifies that the Hold Time's actual default value is 15 seconds (i.e., the default Hold time for Link Hellos is 15 seconds). Otherwise if the value of the mplsLdpEntityTargetPeer object is true(1), then this specifies that the Hold Time's actual default value is 45 seconds (i.e., the default Hold time for Targeted Hellos is 45 seconds). A value of 65535 means infinite (i.e., wait forever). All other values represent the amount of time in seconds to wait for a Hello Message. Setting the hold time to a value smaller than 15 is not recommended, although not forbidden according to RFC3036
**type**\: int
**range:** 0..65535
**units**\: seconds
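The default\-value rules above can be expressed as a small sketch; ``effective_hello_hold_time`` is a hypothetical helper (not part of this module), and ``None`` stands in for the 65535 'wait forever' case:

```python
def effective_hello_hold_time(configured, targeted_peer):
    """Interpret mplsLdpEntityHelloHoldTimer (sketch).

    0 selects the default hold time: 15 seconds for Link Hellos,
    45 seconds for Targeted Hellos. 65535 means wait forever
    (modelled here as None); any other value is taken literally.
    """
    if configured == 0:
        return 45 if targeted_peer else 15
    if configured == 65535:
        return None  # infinite: wait forever for a Hello Message
    return configured
```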
.. attribute:: mplsldpentityinitsessionthreshold
When attempting to establish a session with a given Peer, the given LDP Entity should send out the SNMP notification, 'mplsLdpInitSessionThresholdExceeded', when the number of Session Initialization messages sent exceeds this threshold. The notification is used to notify an operator when this Entity and its Peer are possibly engaged in an endless sequence of messages as each NAKs the other's Initialization messages with Error Notification messages. Setting this threshold which triggers the notification is one way to notify the operator. The notification should be generated each time this threshold is exceeded and for every subsequent Initialization message which is NAK'd with an Error Notification message after this threshold is exceeded. A value of 0 (zero) for this object indicates that the threshold is infinity, thus the SNMP notification will never be generated
**type**\: int
**range:** 0..100
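The threshold semantics above (a value of 0 means infinity, so the notification never fires) can be sketched as follows; ``should_notify`` is a hypothetical helper for illustration, not part of this module:

```python
def should_notify(init_messages_sent, threshold):
    """Whether mplsLdpInitSessionThresholdExceeded should fire (sketch).

    A threshold of 0 means infinity, so the notification is never
    generated; otherwise it fires once the number of Session
    Initialization messages sent exceeds the threshold.
    """
    if threshold == 0:
        return False
    return init_messages_sent > threshold
```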
.. attribute:: mplsldpentitylabeldistmethod
For any given LDP session, the method of label distribution must be specified
**type**\: :py:class:`MplsLabelDistributionMethod <ydk.models.cisco_ios_xe.MPLS_TC_STD_MIB.MplsLabelDistributionMethod>`
.. attribute:: mplsldpentitylabelretentionmode
The LDP Entity can be configured to use either conservative or liberal label retention mode. If the value of this object is conservative(1) then advertised label mappings are retained only if they will be used to forward packets, i.e., if the label came from a valid next hop. If the value of this object is liberal(2) then all advertised label mappings are retained whether they are from a valid next hop or not
**type**\: :py:class:`MplsRetentionMode <ydk.models.cisco_ios_xe.MPLS_TC_STD_MIB.MplsRetentionMode>`
.. attribute:: mplsldpentitypathvectorlimit
If the value of this object is 0 (zero) then Loop Detection for Path Vectors is disabled. Otherwise, if this object has a value greater than zero, then Loop Detection for Path Vectors is enabled, and the Path Vector Limit is this value. Also, the value of the object 'mplsLdpLsrLoopDetectionCapable' must be set to either 'pathVector(4)' or 'hopCountAndPathVector(5)' if this object has a value greater than 0 (zero); otherwise it is ignored
**type**\: int
**range:** 0..255
.. attribute:: mplsldpentityhopcountlimit
If the value of this object is 0 (zero), then Loop Detection using Hop Counters is disabled. If the value of this object is greater than 0 (zero) then Loop Detection using Hop Counters is enabled, and this object specifies this Entity's maximum allowable value for the Hop Count. Also, the value of the object mplsLdpLsrLoopDetectionCapable must be set to either 'hopCount(3)' or 'hopCountAndPathVector(5)' if this object has a value greater than 0 (zero), otherwise it is ignored
**type**\: int
**range:** 0..255
.. attribute:: mplsldpentitytransportaddrkind
This specifies whether the loopback or interface address is to be used as the transport address in the transport address TLV of the hello message. If the value is interface(1), then the IP address of the interface from which hello messages are sent is used as the transport address in the hello message. Otherwise, if the value is loopback(2), then the IP address of the loopback interface is used as the transport address in the hello message
**type**\: :py:class:`MplsLdpEntityTransportAddrKind <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpEntityTable.MplsLdpEntityEntry.MplsLdpEntityTransportAddrKind>`
.. attribute:: mplsldpentitytargetpeer
If this LDP entity uses a targeted peer, then set this object to true
**type**\: bool
.. attribute:: mplsldpentitytargetpeeraddrtype
The type of the internetwork layer address used for the Extended Discovery. This object indicates how the value of mplsLdpEntityTargetPeerAddr is to be interpreted
**type**\: :py:class:`InetAddressType <ydk.models.cisco_ios_xe.INET_ADDRESS_MIB.InetAddressType>`
.. attribute:: mplsldpentitytargetpeeraddr
The value of the internetwork layer address used for the Extended Discovery. The value of mplsLdpEntityTargetPeerAddrType specifies how this address is to be interpreted
**type**\: str
**length:** 0..255
.. attribute:: mplsldpentitylabeltype
Specifies the optional parameters for the LDP Initialization Message. If the value is generic(1) then no optional parameters will be sent in the LDP Initialization message associated with this Entity. If the value is atmParameters(2) then a row must be created in the mplsLdpEntityAtmTable, which corresponds to this entry. If the value is frameRelayParameters(3) then a row must be created in the mplsLdpEntityFrameRelayTable, which corresponds to this entry
**type**\: :py:class:`MplsLdpLabelType <ydk.models.cisco_ios_xe.MPLS_TC_STD_MIB.MplsLdpLabelType>`
.. attribute:: mplsldpentitydiscontinuitytime
The value of sysUpTime on the most recent occasion at which any one or more of this entity's counters suffered a discontinuity. The relevant counters are the specific instances associated with this entity of any Counter32 object contained in the 'mplsLdpEntityStatsTable'. If no such discontinuities have occurred since the last re\-initialization of the local management subsystem, then this object contains a zero value
**type**\: int
**range:** 0..4294967295
.. attribute:: mplsldpentitystoragetype
The storage type for this conceptual row. Conceptual rows having the value 'permanent(4)' need not allow write\-access to any columnar objects in the row
**type**\: :py:class:`StorageType <ydk.models.cisco_ios_xe.SNMPv2_TC.StorageType>`
.. attribute:: mplsldpentityrowstatus
The status of this conceptual row. All writable objects in this row may be modified at any time; however, as described in detail in the section entitled 'Changing Values After Session Establishment', and again in the DESCRIPTION clause of the mplsLdpEntityAdminStatus object, if a session has been initiated with a Peer, changing objects in this table will wreak havoc with the session and interrupt traffic. The recommended procedure is to set the mplsLdpEntityAdminStatus to down, thereby explicitly causing the session to be torn down; then change the objects in this entry; then set the mplsLdpEntityAdminStatus to enable, which allows a new session to be initiated
**type**\: :py:class:`RowStatus <ydk.models.cisco_ios_xe.SNMPv2_TC.RowStatus>`
.. attribute:: mplsldpentitystatssessionattempts
A count of the Session Initialization messages which were sent or received by this LDP Entity and were NAK'd. In other words, this counter counts the number of session initializations that failed. Discontinuities in the value of this counter can occur at re\-initialization of the management system, and at other times as indicated by the value of mplsLdpEntityDiscontinuityTime
**type**\: int
**range:** 0..4294967295
.. attribute:: mplsldpentitystatssessionrejectednohelloerrors
A count of the Session Rejected/No Hello Error Notification Messages sent or received by this LDP Entity. Discontinuities in the value of this counter can occur at re\-initialization of the management system, and at other times as indicated by the value of mplsLdpEntityDiscontinuityTime
**type**\: int
**range:** 0..4294967295
.. attribute:: mplsldpentitystatssessionrejectedaderrors
A count of the Session Rejected/Parameters Advertisement Mode Error Notification Messages sent or received by this LDP Entity. Discontinuities in the value of this counter can occur at re\-initialization of the management system, and at other times as indicated by the value of mplsLdpEntityDiscontinuityTime
**type**\: int
**range:** 0..4294967295
.. attribute:: mplsldpentitystatssessionrejectedmaxpduerrors
A count of the Session Rejected/Parameters Max Pdu Length Error Notification Messages sent or received by this LDP Entity. Discontinuities in the value of this counter can occur at re\-initialization of the management system, and at other times as indicated by the value of mplsLdpEntityDiscontinuityTime
**type**\: int
**range:** 0..4294967295
.. attribute:: mplsldpentitystatssessionrejectedlrerrors
A count of the Session Rejected/Parameters Label Range Notification Messages sent or received by this LDP Entity. Discontinuities in the value of this counter can occur at re\-initialization of the management system, and at other times as indicated by the value of mplsLdpEntityDiscontinuityTime
**type**\: int
**range:** 0..4294967295
.. attribute:: mplsldpentitystatsbadldpidentifiererrors
This object counts the number of Bad LDP Identifier Fatal Errors detected by the session(s) (past and present) associated with this LDP Entity. Discontinuities in the value of this counter can occur at re\-initialization of the management system, and at other times as indicated by the value of mplsLdpEntityDiscontinuityTime
**type**\: int
**range:** 0..4294967295
.. attribute:: mplsldpentitystatsbadpdulengtherrors
This object counts the number of Bad PDU Length Fatal Errors detected by the session(s) (past and present) associated with this LDP Entity. Discontinuities in the value of this counter can occur at re\-initialization of the management system, and at other times as indicated by the value of mplsLdpEntityDiscontinuityTime
**type**\: int
**range:** 0..4294967295
.. attribute:: mplsldpentitystatsbadmessagelengtherrors
This object counts the number of Bad Message Length Fatal Errors detected by the session(s) (past and present) associated with this LDP Entity. Discontinuities in the value of this counter can occur at re\-initialization of the management system, and at other times as indicated by the value of mplsLdpEntityDiscontinuityTime
**type**\: int
**range:** 0..4294967295
.. attribute:: mplsldpentitystatsbadtlvlengtherrors
This object counts the number of Bad TLV Length Fatal Errors detected by the session(s) (past and present) associated with this LDP Entity. Discontinuities in the value of this counter can occur at re\-initialization of the management system, and at other times as indicated by the value of mplsLdpEntityDiscontinuityTime
**type**\: int
**range:** 0..4294967295
.. attribute:: mplsldpentitystatsmalformedtlvvalueerrors
This object counts the number of Malformed TLV Value Fatal Errors detected by the session(s) (past and present) associated with this LDP Entity. Discontinuities in the value of this counter can occur at re\-initialization of the management system, and at other times as indicated by the value of mplsLdpEntityDiscontinuityTime
**type**\: int
**range:** 0..4294967295
.. attribute:: mplsldpentitystatskeepalivetimerexperrors
This object counts the number of Session Keep Alive Timer Expired Errors detected by the session(s) (past and present) associated with this LDP Entity. Discontinuities in the value of this counter can occur at re\-initialization of the management system, and at other times as indicated by the value of mplsLdpEntityDiscontinuityTime
**type**\: int
**range:** 0..4294967295
.. attribute:: mplsldpentitystatsshutdownreceivednotifications
This object counts the number of Shutdown Notifications received related to session(s) (past and present) associated with this LDP Entity. Discontinuities in the value of this counter can occur at re\-initialization of the management system, and at other times as indicated by the value of mplsLdpEntityDiscontinuityTime
**type**\: int
**range:** 0..4294967295
.. attribute:: mplsldpentitystatsshutdownsentnotifications
This object counts the number of Shutdown Notifications sent related to session(s) (past and present) associated with this LDP Entity. Discontinuities in the value of this counter can occur at re\-initialization of the management system, and at other times as indicated by the value of mplsLdpEntityDiscontinuityTime
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'MPLS-LDP-STD-MIB'
_revision = '2004-06-03'
def __init__(self):
super(MPLSLDPSTDMIB.MplsLdpEntityTable.MplsLdpEntityEntry, self).__init__()
self.yang_name = "mplsLdpEntityEntry"
self.yang_parent_name = "mplsLdpEntityTable"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['mplsldpentityldpid','mplsldpentityindex']
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('mplsldpentityldpid', (YLeaf(YType.str, 'mplsLdpEntityLdpId'), ['str'])),
('mplsldpentityindex', (YLeaf(YType.uint32, 'mplsLdpEntityIndex'), ['int'])),
('mplsldpentityprotocolversion', (YLeaf(YType.uint32, 'mplsLdpEntityProtocolVersion'), ['int'])),
('mplsldpentityadminstatus', (YLeaf(YType.enumeration, 'mplsLdpEntityAdminStatus'), [('ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB', 'MPLSLDPSTDMIB', 'MplsLdpEntityTable.MplsLdpEntityEntry.MplsLdpEntityAdminStatus')])),
('mplsldpentityoperstatus', (YLeaf(YType.enumeration, 'mplsLdpEntityOperStatus'), [('ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB', 'MPLSLDPSTDMIB', 'MplsLdpEntityTable.MplsLdpEntityEntry.MplsLdpEntityOperStatus')])),
('mplsldpentitytcpport', (YLeaf(YType.uint16, 'mplsLdpEntityTcpPort'), ['int'])),
('mplsldpentityudpdscport', (YLeaf(YType.uint16, 'mplsLdpEntityUdpDscPort'), ['int'])),
('mplsldpentitymaxpdulength', (YLeaf(YType.uint32, 'mplsLdpEntityMaxPduLength'), ['int'])),
('mplsldpentitykeepaliveholdtimer', (YLeaf(YType.uint32, 'mplsLdpEntityKeepAliveHoldTimer'), ['int'])),
('mplsldpentityhelloholdtimer', (YLeaf(YType.uint32, 'mplsLdpEntityHelloHoldTimer'), ['int'])),
('mplsldpentityinitsessionthreshold', (YLeaf(YType.int32, 'mplsLdpEntityInitSessionThreshold'), ['int'])),
('mplsldpentitylabeldistmethod', (YLeaf(YType.enumeration, 'mplsLdpEntityLabelDistMethod'), [('ydk.models.cisco_ios_xe.MPLS_TC_STD_MIB', 'MplsLabelDistributionMethod', '')])),
('mplsldpentitylabelretentionmode', (YLeaf(YType.enumeration, 'mplsLdpEntityLabelRetentionMode'), [('ydk.models.cisco_ios_xe.MPLS_TC_STD_MIB', 'MplsRetentionMode', '')])),
('mplsldpentitypathvectorlimit', (YLeaf(YType.int32, 'mplsLdpEntityPathVectorLimit'), ['int'])),
('mplsldpentityhopcountlimit', (YLeaf(YType.int32, 'mplsLdpEntityHopCountLimit'), ['int'])),
('mplsldpentitytransportaddrkind', (YLeaf(YType.enumeration, 'mplsLdpEntityTransportAddrKind'), [('ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB', 'MPLSLDPSTDMIB', 'MplsLdpEntityTable.MplsLdpEntityEntry.MplsLdpEntityTransportAddrKind')])),
('mplsldpentitytargetpeer', (YLeaf(YType.boolean, 'mplsLdpEntityTargetPeer'), ['bool'])),
('mplsldpentitytargetpeeraddrtype', (YLeaf(YType.enumeration, 'mplsLdpEntityTargetPeerAddrType'), [('ydk.models.cisco_ios_xe.INET_ADDRESS_MIB', 'InetAddressType', '')])),
('mplsldpentitytargetpeeraddr', (YLeaf(YType.str, 'mplsLdpEntityTargetPeerAddr'), ['str'])),
('mplsldpentitylabeltype', (YLeaf(YType.enumeration, 'mplsLdpEntityLabelType'), [('ydk.models.cisco_ios_xe.MPLS_TC_STD_MIB', 'MplsLdpLabelType', '')])),
('mplsldpentitydiscontinuitytime', (YLeaf(YType.uint32, 'mplsLdpEntityDiscontinuityTime'), ['int'])),
('mplsldpentitystoragetype', (YLeaf(YType.enumeration, 'mplsLdpEntityStorageType'), [('ydk.models.cisco_ios_xe.SNMPv2_TC', 'StorageType', '')])),
('mplsldpentityrowstatus', (YLeaf(YType.enumeration, 'mplsLdpEntityRowStatus'), [('ydk.models.cisco_ios_xe.SNMPv2_TC', 'RowStatus', '')])),
('mplsldpentitystatssessionattempts', (YLeaf(YType.uint32, 'mplsLdpEntityStatsSessionAttempts'), ['int'])),
('mplsldpentitystatssessionrejectednohelloerrors', (YLeaf(YType.uint32, 'mplsLdpEntityStatsSessionRejectedNoHelloErrors'), ['int'])),
('mplsldpentitystatssessionrejectedaderrors', (YLeaf(YType.uint32, 'mplsLdpEntityStatsSessionRejectedAdErrors'), ['int'])),
('mplsldpentitystatssessionrejectedmaxpduerrors', (YLeaf(YType.uint32, 'mplsLdpEntityStatsSessionRejectedMaxPduErrors'), ['int'])),
('mplsldpentitystatssessionrejectedlrerrors', (YLeaf(YType.uint32, 'mplsLdpEntityStatsSessionRejectedLRErrors'), ['int'])),
('mplsldpentitystatsbadldpidentifiererrors', (YLeaf(YType.uint32, 'mplsLdpEntityStatsBadLdpIdentifierErrors'), ['int'])),
('mplsldpentitystatsbadpdulengtherrors', (YLeaf(YType.uint32, 'mplsLdpEntityStatsBadPduLengthErrors'), ['int'])),
('mplsldpentitystatsbadmessagelengtherrors', (YLeaf(YType.uint32, 'mplsLdpEntityStatsBadMessageLengthErrors'), ['int'])),
('mplsldpentitystatsbadtlvlengtherrors', (YLeaf(YType.uint32, 'mplsLdpEntityStatsBadTlvLengthErrors'), ['int'])),
('mplsldpentitystatsmalformedtlvvalueerrors', (YLeaf(YType.uint32, 'mplsLdpEntityStatsMalformedTlvValueErrors'), ['int'])),
('mplsldpentitystatskeepalivetimerexperrors', (YLeaf(YType.uint32, 'mplsLdpEntityStatsKeepAliveTimerExpErrors'), ['int'])),
('mplsldpentitystatsshutdownreceivednotifications', (YLeaf(YType.uint32, 'mplsLdpEntityStatsShutdownReceivedNotifications'), ['int'])),
('mplsldpentitystatsshutdownsentnotifications', (YLeaf(YType.uint32, 'mplsLdpEntityStatsShutdownSentNotifications'), ['int'])),
])
self.mplsldpentityldpid = None
self.mplsldpentityindex = None
self.mplsldpentityprotocolversion = None
self.mplsldpentityadminstatus = None
self.mplsldpentityoperstatus = None
self.mplsldpentitytcpport = None
self.mplsldpentityudpdscport = None
self.mplsldpentitymaxpdulength = None
self.mplsldpentitykeepaliveholdtimer = None
self.mplsldpentityhelloholdtimer = None
self.mplsldpentityinitsessionthreshold = None
self.mplsldpentitylabeldistmethod = None
self.mplsldpentitylabelretentionmode = None
self.mplsldpentitypathvectorlimit = None
self.mplsldpentityhopcountlimit = None
self.mplsldpentitytransportaddrkind = None
self.mplsldpentitytargetpeer = None
self.mplsldpentitytargetpeeraddrtype = None
self.mplsldpentitytargetpeeraddr = None
self.mplsldpentitylabeltype = None
self.mplsldpentitydiscontinuitytime = None
self.mplsldpentitystoragetype = None
self.mplsldpentityrowstatus = None
self.mplsldpentitystatssessionattempts = None
self.mplsldpentitystatssessionrejectednohelloerrors = None
self.mplsldpentitystatssessionrejectedaderrors = None
self.mplsldpentitystatssessionrejectedmaxpduerrors = None
self.mplsldpentitystatssessionrejectedlrerrors = None
self.mplsldpentitystatsbadldpidentifiererrors = None
self.mplsldpentitystatsbadpdulengtherrors = None
self.mplsldpentitystatsbadmessagelengtherrors = None
self.mplsldpentitystatsbadtlvlengtherrors = None
self.mplsldpentitystatsmalformedtlvvalueerrors = None
self.mplsldpentitystatskeepalivetimerexperrors = None
self.mplsldpentitystatsshutdownreceivednotifications = None
self.mplsldpentitystatsshutdownsentnotifications = None
self._segment_path = lambda: "mplsLdpEntityEntry" + "[mplsLdpEntityLdpId='" + str(self.mplsldpentityldpid) + "']" + "[mplsLdpEntityIndex='" + str(self.mplsldpentityindex) + "']"
self._absolute_path = lambda: "MPLS-LDP-STD-MIB:MPLS-LDP-STD-MIB/mplsLdpEntityTable/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(MPLSLDPSTDMIB.MplsLdpEntityTable.MplsLdpEntityEntry, [u'mplsldpentityldpid', u'mplsldpentityindex', u'mplsldpentityprotocolversion', u'mplsldpentityadminstatus', u'mplsldpentityoperstatus', u'mplsldpentitytcpport', u'mplsldpentityudpdscport', u'mplsldpentitymaxpdulength', u'mplsldpentitykeepaliveholdtimer', u'mplsldpentityhelloholdtimer', u'mplsldpentityinitsessionthreshold', u'mplsldpentitylabeldistmethod', u'mplsldpentitylabelretentionmode', u'mplsldpentitypathvectorlimit', u'mplsldpentityhopcountlimit', u'mplsldpentitytransportaddrkind', u'mplsldpentitytargetpeer', u'mplsldpentitytargetpeeraddrtype', u'mplsldpentitytargetpeeraddr', u'mplsldpentitylabeltype', u'mplsldpentitydiscontinuitytime', u'mplsldpentitystoragetype', u'mplsldpentityrowstatus', u'mplsldpentitystatssessionattempts', u'mplsldpentitystatssessionrejectednohelloerrors', u'mplsldpentitystatssessionrejectedaderrors', u'mplsldpentitystatssessionrejectedmaxpduerrors', u'mplsldpentitystatssessionrejectedlrerrors', u'mplsldpentitystatsbadldpidentifiererrors', u'mplsldpentitystatsbadpdulengtherrors', u'mplsldpentitystatsbadmessagelengtherrors', u'mplsldpentitystatsbadtlvlengtherrors', u'mplsldpentitystatsmalformedtlvvalueerrors', u'mplsldpentitystatskeepalivetimerexperrors', u'mplsldpentitystatsshutdownreceivednotifications', u'mplsldpentitystatsshutdownsentnotifications'], name, value)
class MplsLdpEntityAdminStatus(Enum):
"""
MplsLdpEntityAdminStatus (Enum Class)
The administrative status of this LDP Entity.
If this object is changed from 'enable' to 'disable'
and this entity has already attempted to establish
contact with a Peer, then all contact with that
Peer is lost and all information from that Peer
needs to be removed from the MIB. (This implies
that the network management subsystem should clean
up any related entry in the mplsLdpPeerTable. This
further implies that a 'tear\-down' for that session
is issued and the session and all information related
to that session cease to exist).
At this point the operator is able to change values
which are related to this entity.
When the admin status is set back to 'enable', then
this Entity will attempt to establish a new session
with the Peer.
.. data:: enable = 1
.. data:: disable = 2
"""
enable = Enum.YLeaf(1, "enable")
disable = Enum.YLeaf(2, "disable")
class MplsLdpEntityOperStatus(Enum):
"""
MplsLdpEntityOperStatus (Enum Class)
The operational status of this LDP Entity.
The value of unknown(1) indicates that the
operational status cannot be determined at
this time. The value of unknown should be
a transient condition before changing
to enabled(2) or disabled(3).
.. data:: unknown = 1
.. data:: enabled = 2
.. data:: disabled = 3
"""
unknown = Enum.YLeaf(1, "unknown")
enabled = Enum.YLeaf(2, "enabled")
disabled = Enum.YLeaf(3, "disabled")
class MplsLdpEntityTransportAddrKind(Enum):
"""
MplsLdpEntityTransportAddrKind (Enum Class)
This specifies whether the loopback or interface
address is to be used as the transport address
in the transport address TLV of the
hello message.
If the value is interface(1), then the IP
address of the interface from which hello
messages are sent is used as the transport
address in the hello message.
Otherwise, if the value is loopback(2), then the IP
address of the loopback interface is used as the
transport address in the hello message.
.. data:: interface = 1
.. data:: loopback = 2
"""
interface = Enum.YLeaf(1, "interface")
loopback = Enum.YLeaf(2, "loopback")
class MplsLdpPeerTable(Entity):
"""
Information about LDP peers known by Entities in
the mplsLdpEntityTable. The information in this table
is based on information from the Entity\-Peer interaction
during session initialization but is not appropriate
for the mplsLdpSessionTable, because objects in this
table may or may not be used in session establishment.
.. attribute:: mplsldppeerentry
Information about a single Peer which is related to a Session. This table is augmented by the mplsLdpSessionTable
**type**\: list of :py:class:`MplsLdpPeerEntry <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpPeerTable.MplsLdpPeerEntry>`
"""
_prefix = 'MPLS-LDP-STD-MIB'
_revision = '2004-06-03'
def __init__(self):
super(MPLSLDPSTDMIB.MplsLdpPeerTable, self).__init__()
self.yang_name = "mplsLdpPeerTable"
self.yang_parent_name = "MPLS-LDP-STD-MIB"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("mplsLdpPeerEntry", ("mplsldppeerentry", MPLSLDPSTDMIB.MplsLdpPeerTable.MplsLdpPeerEntry))])
self._leafs = OrderedDict()
self.mplsldppeerentry = YList(self)
self._segment_path = lambda: "mplsLdpPeerTable"
self._absolute_path = lambda: "MPLS-LDP-STD-MIB:MPLS-LDP-STD-MIB/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(MPLSLDPSTDMIB.MplsLdpPeerTable, [], name, value)
class MplsLdpPeerEntry(Entity):
"""
Information about a single Peer which is related
to a Session. This table is augmented by
the mplsLdpSessionTable.
.. attribute:: mplsldpentityldpid (key)
**type**\: str
**refers to**\: :py:class:`mplsldpentityldpid <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpEntityTable.MplsLdpEntityEntry>`
.. attribute:: mplsldpentityindex (key)
**type**\: int
**range:** 1..4294967295
**refers to**\: :py:class:`mplsldpentityindex <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpEntityTable.MplsLdpEntityEntry>`
.. attribute:: mplsldppeerldpid (key)
The LDP identifier of this LDP Peer
**type**\: str
.. attribute:: mplsldppeerlabeldistmethod
For any given LDP session, the method of label distribution must be specified
**type**\: :py:class:`MplsLabelDistributionMethod <ydk.models.cisco_ios_xe.MPLS_TC_STD_MIB.MplsLabelDistributionMethod>`
.. attribute:: mplsldppeerpathvectorlimit
If the value of this object is 0 (zero) then Loop Detection for Path Vectors for this Peer is disabled. Otherwise, if this object has a value greater than zero, then Loop Detection for Path Vectors for this Peer is enabled and the Path Vector Limit is this value
**type**\: int
**range:** 0..255
.. attribute:: mplsldppeertransportaddrtype
The type of the Internet address for the mplsLdpPeerTransportAddr object. The LDP specification describes this as being either an IPv4 Transport Address or an IPv6 Transport Address, which is used in opening the LDP session's TCP connection, or, if the optional TLV is not present, the IPv4/IPv6 source address of the UDP packet carrying the Hellos. This object specifies how the value of the mplsLdpPeerTransportAddr object should be interpreted
**type**\: :py:class:`InetAddressType <ydk.models.cisco_ios_xe.INET_ADDRESS_MIB.InetAddressType>`
.. attribute:: mplsldppeertransportaddr
The Internet address advertised by the peer in the Hello Message or the Hello source address. The type of this address is specified by the value of the mplsLdpPeerTransportAddrType object
**type**\: str
**length:** 0..255
.. attribute:: mplsldpsessionstatelastchange
The value of sysUpTime at the time this Session entered its current state as denoted by the mplsLdpSessionState object
**type**\: int
**range:** 0..4294967295
.. attribute:: mplsldpsessionstate
The current state of the session. All of the states 1 to 5 are based on the state machine for session negotiation behavior
**type**\: :py:class:`MplsLdpSessionState <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpPeerTable.MplsLdpPeerEntry.MplsLdpSessionState>`
.. attribute:: mplsldpsessionrole
During session establishment the LSR/LER takes either the active role or the passive role based on address comparisons. This object indicates whether this LSR/LER was behaving in an active role or passive role during this session's establishment. The value of unknown(1), indicates that the role is not able to be determined at the present time
**type**\: :py:class:`MplsLdpSessionRole <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpPeerTable.MplsLdpPeerEntry.MplsLdpSessionRole>`
.. attribute:: mplsldpsessionprotocolversion
The version of the LDP Protocol which this session is using. This is the version of the LDP protocol which has been negotiated during session initialization
**type**\: int
**range:** 1..65535
.. attribute:: mplsldpsessionkeepaliveholdtimerem
The keep alive hold time remaining for this session
**type**\: int
**range:** 0..2147483647
.. attribute:: mplsldpsessionkeepalivetime
The negotiated KeepAlive Time which represents the amount of seconds between keep alive messages. The mplsLdpEntityKeepAliveHoldTimer related to this Session is the value that was proposed as the KeepAlive Time for this session. This value is negotiated during session initialization between the entity's proposed value (i.e., the value configured in mplsLdpEntityKeepAliveHoldTimer) and the peer's proposed KeepAlive Hold Timer value. This value is the smaller of the two proposed values
**type**\: int
**range:** 1..65535
**units**\: seconds
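The negotiation described above reduces to taking the smaller of the two proposed hold timer values; ``negotiated_keepalive_time`` is a hypothetical illustration, not part of this module:

```python
def negotiated_keepalive_time(entity_proposed, peer_proposed):
    """mplsLdpSessionKeepAliveTime (sketch): the session uses the
    smaller of the entity's and the peer's proposed KeepAlive
    Hold Timer values."""
    return min(entity_proposed, peer_proposed)
```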
.. attribute:: mplsldpsessionmaxpdulength
The value of maximum allowable length for LDP PDUs for this session. This value may have been negotiated during the Session Initialization. This object is related to the mplsLdpEntityMaxPduLength object. The mplsLdpEntityMaxPduLength object specifies the requested LDP PDU length, and this object reflects the negotiated LDP PDU length between the Entity and the Peer
**type**\: int
**range:** 1..65535
**units**\: octets
.. attribute:: mplsldpsessiondiscontinuitytime
The value of sysUpTime on the most recent occasion at which any one or more of this session's counters suffered a discontinuity. The relevant counters are the specific instances associated with this session of any Counter32 object contained in the mplsLdpSessionStatsTable. The initial value of this object is the value of sysUpTime when the entry was created in this table. Also, a command generator can distinguish when a session between a given Entity and Peer goes away and a new session is established. This value would change and thus indicate to the command generator that this is a different session
**type**\: int
**range:** 0..4294967295
.. attribute:: mplsldpsessionstatsunknownmestypeerrors
This object counts the number of Unknown Message Type Errors detected by this LSR/LER during this session. Discontinuities in the value of this counter can occur at re\-initialization of the management system, and at other times as indicated by the value of mplsLdpSessionDiscontinuityTime
**type**\: int
**range:** 0..4294967295
.. attribute:: mplsldpsessionstatsunknowntlverrors
This object counts the number of Unknown TLV Errors detected by this LSR/LER during this session. Discontinuities in the value of this counter can occur at re\-initialization of the management system, and at other times as indicated by the value of mplsLdpSessionDiscontinuityTime
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'MPLS-LDP-STD-MIB'
_revision = '2004-06-03'
def __init__(self):
super(MPLSLDPSTDMIB.MplsLdpPeerTable.MplsLdpPeerEntry, self).__init__()
self.yang_name = "mplsLdpPeerEntry"
self.yang_parent_name = "mplsLdpPeerTable"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['mplsldpentityldpid','mplsldpentityindex','mplsldppeerldpid']
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('mplsldpentityldpid', (YLeaf(YType.str, 'mplsLdpEntityLdpId'), ['str'])),
('mplsldpentityindex', (YLeaf(YType.str, 'mplsLdpEntityIndex'), ['int'])),
('mplsldppeerldpid', (YLeaf(YType.str, 'mplsLdpPeerLdpId'), ['str'])),
('mplsldppeerlabeldistmethod', (YLeaf(YType.enumeration, 'mplsLdpPeerLabelDistMethod'), [('ydk.models.cisco_ios_xe.MPLS_TC_STD_MIB', 'MplsLabelDistributionMethod', '')])),
('mplsldppeerpathvectorlimit', (YLeaf(YType.int32, 'mplsLdpPeerPathVectorLimit'), ['int'])),
('mplsldppeertransportaddrtype', (YLeaf(YType.enumeration, 'mplsLdpPeerTransportAddrType'), [('ydk.models.cisco_ios_xe.INET_ADDRESS_MIB', 'InetAddressType', '')])),
('mplsldppeertransportaddr', (YLeaf(YType.str, 'mplsLdpPeerTransportAddr'), ['str'])),
('mplsldpsessionstatelastchange', (YLeaf(YType.uint32, 'mplsLdpSessionStateLastChange'), ['int'])),
('mplsldpsessionstate', (YLeaf(YType.enumeration, 'mplsLdpSessionState'), [('ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB', 'MPLSLDPSTDMIB', 'MplsLdpPeerTable.MplsLdpPeerEntry.MplsLdpSessionState')])),
('mplsldpsessionrole', (YLeaf(YType.enumeration, 'mplsLdpSessionRole'), [('ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB', 'MPLSLDPSTDMIB', 'MplsLdpPeerTable.MplsLdpPeerEntry.MplsLdpSessionRole')])),
('mplsldpsessionprotocolversion', (YLeaf(YType.uint32, 'mplsLdpSessionProtocolVersion'), ['int'])),
('mplsldpsessionkeepaliveholdtimerem', (YLeaf(YType.int32, 'mplsLdpSessionKeepAliveHoldTimeRem'), ['int'])),
('mplsldpsessionkeepalivetime', (YLeaf(YType.uint32, 'mplsLdpSessionKeepAliveTime'), ['int'])),
('mplsldpsessionmaxpdulength', (YLeaf(YType.uint32, 'mplsLdpSessionMaxPduLength'), ['int'])),
('mplsldpsessiondiscontinuitytime', (YLeaf(YType.uint32, 'mplsLdpSessionDiscontinuityTime'), ['int'])),
('mplsldpsessionstatsunknownmestypeerrors', (YLeaf(YType.uint32, 'mplsLdpSessionStatsUnknownMesTypeErrors'), ['int'])),
('mplsldpsessionstatsunknowntlverrors', (YLeaf(YType.uint32, 'mplsLdpSessionStatsUnknownTlvErrors'), ['int'])),
])
self.mplsldpentityldpid = None
self.mplsldpentityindex = None
self.mplsldppeerldpid = None
self.mplsldppeerlabeldistmethod = None
self.mplsldppeerpathvectorlimit = None
self.mplsldppeertransportaddrtype = None
self.mplsldppeertransportaddr = None
self.mplsldpsessionstatelastchange = None
self.mplsldpsessionstate = None
self.mplsldpsessionrole = None
self.mplsldpsessionprotocolversion = None
self.mplsldpsessionkeepaliveholdtimerem = None
self.mplsldpsessionkeepalivetime = None
self.mplsldpsessionmaxpdulength = None
self.mplsldpsessiondiscontinuitytime = None
self.mplsldpsessionstatsunknownmestypeerrors = None
self.mplsldpsessionstatsunknowntlverrors = None
self._segment_path = lambda: "mplsLdpPeerEntry" + "[mplsLdpEntityLdpId='" + str(self.mplsldpentityldpid) + "']" + "[mplsLdpEntityIndex='" + str(self.mplsldpentityindex) + "']" + "[mplsLdpPeerLdpId='" + str(self.mplsldppeerldpid) + "']"
self._absolute_path = lambda: "MPLS-LDP-STD-MIB:MPLS-LDP-STD-MIB/mplsLdpPeerTable/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(MPLSLDPSTDMIB.MplsLdpPeerTable.MplsLdpPeerEntry, [u'mplsldpentityldpid', u'mplsldpentityindex', u'mplsldppeerldpid', u'mplsldppeerlabeldistmethod', u'mplsldppeerpathvectorlimit', u'mplsldppeertransportaddrtype', u'mplsldppeertransportaddr', u'mplsldpsessionstatelastchange', u'mplsldpsessionstate', u'mplsldpsessionrole', u'mplsldpsessionprotocolversion', u'mplsldpsessionkeepaliveholdtimerem', u'mplsldpsessionkeepalivetime', u'mplsldpsessionmaxpdulength', u'mplsldpsessiondiscontinuitytime', u'mplsldpsessionstatsunknownmestypeerrors', u'mplsldpsessionstatsunknowntlverrors'], name, value)
class MplsLdpSessionRole(Enum):
"""
MplsLdpSessionRole (Enum Class)
During session establishment the LSR/LER takes either
the active role or the passive role based on address
comparisons. This object indicates whether this LSR/LER
was behaving in an active role or passive role during
this session's establishment.
The value of unknown(1) indicates that the role cannot
be determined at the present time.
.. data:: unknown = 1
.. data:: active = 2
.. data:: passive = 3
"""
unknown = Enum.YLeaf(1, "unknown")
active = Enum.YLeaf(2, "active")
passive = Enum.YLeaf(3, "passive")
class MplsLdpSessionState(Enum):
"""
MplsLdpSessionState (Enum Class)
The current state of the session. All of the
states 1 to 5 are based on the state machine
for session negotiation behavior.
.. data:: nonexistent = 1
.. data:: initialized = 2
.. data:: openrec = 3
.. data:: opensent = 4
.. data:: operational = 5
"""
nonexistent = Enum.YLeaf(1, "nonexistent")
initialized = Enum.YLeaf(2, "initialized")
openrec = Enum.YLeaf(3, "openrec")
opensent = Enum.YLeaf(4, "opensent")
operational = Enum.YLeaf(5, "operational")
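The five values of `MplsLdpSessionState` above mirror the LDP session-negotiation state machine. A minimal standalone sketch of the legal transitions (illustrative only, assuming the standard LDP negotiation flow; not part of this generated module):

```python
# Hypothetical sketch of the session-negotiation state machine that the
# MplsLdpSessionState enum (states 1..5) reports on. Not part of YDK.
TRANSITIONS = {
    "nonexistent": {"initialized"},                 # transport connection established
    "initialized": {"opensent", "openrec", "nonexistent"},
    "opensent":    {"openrec", "nonexistent"},      # Init sent, awaiting peer's Init
    "openrec":     {"operational", "nonexistent"},  # Init accepted, awaiting KeepAlive
    "operational": {"nonexistent"},                 # any fatal error tears the session down
}

def can_transition(current, new):
    """Return True if a session may move directly from `current` to `new`."""
    return new in TRANSITIONS.get(current, set())
```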
class MplsLdpHelloAdjacencyTable(Entity):
"""
A table of Hello Adjacencies for Sessions.
.. attribute:: mplsldphelloadjacencyentry
Each row represents a single LDP Hello Adjacency. An LDP Session can have one or more Hello Adjacencies
**type**\: list of :py:class:`MplsLdpHelloAdjacencyEntry <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpHelloAdjacencyTable.MplsLdpHelloAdjacencyEntry>`
"""
_prefix = 'MPLS-LDP-STD-MIB'
_revision = '2004-06-03'
def __init__(self):
super(MPLSLDPSTDMIB.MplsLdpHelloAdjacencyTable, self).__init__()
self.yang_name = "mplsLdpHelloAdjacencyTable"
self.yang_parent_name = "MPLS-LDP-STD-MIB"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("mplsLdpHelloAdjacencyEntry", ("mplsldphelloadjacencyentry", MPLSLDPSTDMIB.MplsLdpHelloAdjacencyTable.MplsLdpHelloAdjacencyEntry))])
self._leafs = OrderedDict()
self.mplsldphelloadjacencyentry = YList(self)
self._segment_path = lambda: "mplsLdpHelloAdjacencyTable"
self._absolute_path = lambda: "MPLS-LDP-STD-MIB:MPLS-LDP-STD-MIB/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(MPLSLDPSTDMIB.MplsLdpHelloAdjacencyTable, [], name, value)
class MplsLdpHelloAdjacencyEntry(Entity):
"""
Each row represents a single LDP Hello Adjacency.
An LDP Session can have one or more Hello
Adjacencies.
.. attribute:: mplsldpentityldpid (key)
**type**\: str
**refers to**\: :py:class:`mplsldpentityldpid <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpEntityTable.MplsLdpEntityEntry>`
.. attribute:: mplsldpentityindex (key)
**type**\: int
**range:** 1..4294967295
**refers to**\: :py:class:`mplsldpentityindex <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpEntityTable.MplsLdpEntityEntry>`
.. attribute:: mplsldppeerldpid (key)
**type**\: str
**refers to**\: :py:class:`mplsldppeerldpid <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpPeerTable.MplsLdpPeerEntry>`
.. attribute:: mplsldphelloadjacencyindex (key)
An identifier for this specific adjacency
**type**\: int
**range:** 1..4294967295
.. attribute:: mplsldphelloadjacencyholdtimerem
If the value of this object is 65535, this means that the hold time is infinite (i.e., wait forever). Otherwise, the time remaining for this Hello Adjacency to receive its next Hello Message. This interval will change when the 'next' Hello Message which corresponds to this Hello Adjacency is received unless it is infinite
**type**\: int
**range:** 0..2147483647
**units**\: seconds
.. attribute:: mplsldphelloadjacencyholdtime
The Hello hold time which is negotiated between the Entity and the Peer. The entity associated with this Hello Adjacency issues a proposed Hello Hold Time value in the mplsLdpEntityHelloHoldTimer object. The peer also proposes a value and this object represents the negotiated value. A value of 0 means the default, which is 15 seconds for Link Hellos and 45 seconds for Targeted Hellos. A value of 65535 indicates an infinite hold time
**type**\: int
**range:** 0..65535
.. attribute:: mplsldphelloadjacencytype
This adjacency is the result of a 'link' hello if the value of this object is link(1). Otherwise, it is a result of a 'targeted' hello, targeted(2)
**type**\: :py:class:`MplsLdpHelloAdjacencyType <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpHelloAdjacencyTable.MplsLdpHelloAdjacencyEntry.MplsLdpHelloAdjacencyType>`
"""
_prefix = 'MPLS-LDP-STD-MIB'
_revision = '2004-06-03'
def __init__(self):
super(MPLSLDPSTDMIB.MplsLdpHelloAdjacencyTable.MplsLdpHelloAdjacencyEntry, self).__init__()
self.yang_name = "mplsLdpHelloAdjacencyEntry"
self.yang_parent_name = "mplsLdpHelloAdjacencyTable"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['mplsldpentityldpid','mplsldpentityindex','mplsldppeerldpid','mplsldphelloadjacencyindex']
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('mplsldpentityldpid', (YLeaf(YType.str, 'mplsLdpEntityLdpId'), ['str'])),
('mplsldpentityindex', (YLeaf(YType.str, 'mplsLdpEntityIndex'), ['int'])),
('mplsldppeerldpid', (YLeaf(YType.str, 'mplsLdpPeerLdpId'), ['str'])),
('mplsldphelloadjacencyindex', (YLeaf(YType.uint32, 'mplsLdpHelloAdjacencyIndex'), ['int'])),
('mplsldphelloadjacencyholdtimerem', (YLeaf(YType.int32, 'mplsLdpHelloAdjacencyHoldTimeRem'), ['int'])),
('mplsldphelloadjacencyholdtime', (YLeaf(YType.uint32, 'mplsLdpHelloAdjacencyHoldTime'), ['int'])),
('mplsldphelloadjacencytype', (YLeaf(YType.enumeration, 'mplsLdpHelloAdjacencyType'), [('ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB', 'MPLSLDPSTDMIB', 'MplsLdpHelloAdjacencyTable.MplsLdpHelloAdjacencyEntry.MplsLdpHelloAdjacencyType')])),
])
self.mplsldpentityldpid = None
self.mplsldpentityindex = None
self.mplsldppeerldpid = None
self.mplsldphelloadjacencyindex = None
self.mplsldphelloadjacencyholdtimerem = None
self.mplsldphelloadjacencyholdtime = None
self.mplsldphelloadjacencytype = None
self._segment_path = lambda: "mplsLdpHelloAdjacencyEntry" + "[mplsLdpEntityLdpId='" + str(self.mplsldpentityldpid) + "']" + "[mplsLdpEntityIndex='" + str(self.mplsldpentityindex) + "']" + "[mplsLdpPeerLdpId='" + str(self.mplsldppeerldpid) + "']" + "[mplsLdpHelloAdjacencyIndex='" + str(self.mplsldphelloadjacencyindex) + "']"
self._absolute_path = lambda: "MPLS-LDP-STD-MIB:MPLS-LDP-STD-MIB/mplsLdpHelloAdjacencyTable/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(MPLSLDPSTDMIB.MplsLdpHelloAdjacencyTable.MplsLdpHelloAdjacencyEntry, [u'mplsldpentityldpid', u'mplsldpentityindex', u'mplsldppeerldpid', u'mplsldphelloadjacencyindex', u'mplsldphelloadjacencyholdtimerem', u'mplsldphelloadjacencyholdtime', u'mplsldphelloadjacencytype'], name, value)
class MplsLdpHelloAdjacencyType(Enum):
"""
MplsLdpHelloAdjacencyType (Enum Class)
This adjacency is the result of a 'link'
hello if the value of this object is link(1).
Otherwise, it is a result of a 'targeted'
hello, targeted(2).
.. data:: link = 1
.. data:: targeted = 2
"""
link = Enum.YLeaf(1, "link")
targeted = Enum.YLeaf(2, "targeted")
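The `mplsLdpHelloAdjacencyHoldTime` description above states that 0 means the default (15 seconds for Link Hellos, 45 seconds for Targeted Hellos) and 65535 means an infinite hold time. A small hypothetical helper resolving the effective value (not a YDK API):

```python
INFINITE = 65535  # per the DESCRIPTION clause: 65535 indicates an infinite hold time

def effective_hello_hold_time(value, adjacency_type):
    """Resolve mplsLdpHelloAdjacencyHoldTime to seconds, or None for infinite.
    `adjacency_type` is 'link' or 'targeted', as in MplsLdpHelloAdjacencyType."""
    if value == INFINITE:
        return None            # wait forever
    if value == 0:             # default: 15 s for link hellos, 45 s for targeted
        return 15 if adjacency_type == "link" else 45
    return value
```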
class MplsInSegmentLdpLspTable(Entity):
"""
A table of LDP LSPs which
map to the mplsInSegmentTable in the
MPLS\-LSR\-STD\-MIB module.
.. attribute:: mplsinsegmentldplspentry
An entry in this table represents information on a single LDP LSP which is represented by a session's index triple (mplsLdpEntityLdpId, mplsLdpEntityIndex, mplsLdpPeerLdpId) AND the index for the mplsInSegmentTable (mplsInSegmentLdpLspLabelIndex) from the MPLS\-LSR\-STD\-MIB. The information contained in a row is read\-only
**type**\: list of :py:class:`MplsInSegmentLdpLspEntry <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsInSegmentLdpLspTable.MplsInSegmentLdpLspEntry>`
"""
_prefix = 'MPLS-LDP-STD-MIB'
_revision = '2004-06-03'
def __init__(self):
super(MPLSLDPSTDMIB.MplsInSegmentLdpLspTable, self).__init__()
self.yang_name = "mplsInSegmentLdpLspTable"
self.yang_parent_name = "MPLS-LDP-STD-MIB"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("mplsInSegmentLdpLspEntry", ("mplsinsegmentldplspentry", MPLSLDPSTDMIB.MplsInSegmentLdpLspTable.MplsInSegmentLdpLspEntry))])
self._leafs = OrderedDict()
self.mplsinsegmentldplspentry = YList(self)
self._segment_path = lambda: "mplsInSegmentLdpLspTable"
self._absolute_path = lambda: "MPLS-LDP-STD-MIB:MPLS-LDP-STD-MIB/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(MPLSLDPSTDMIB.MplsInSegmentLdpLspTable, [], name, value)
class MplsInSegmentLdpLspEntry(Entity):
"""
An entry in this table represents information
on a single LDP LSP which is represented by
a session's index triple (mplsLdpEntityLdpId,
mplsLdpEntityIndex, mplsLdpPeerLdpId) AND the
index for the mplsInSegmentTable
(mplsInSegmentLdpLspLabelIndex) from the
MPLS\-LSR\-STD\-MIB.
The information contained in a row is read\-only.
.. attribute:: mplsldpentityldpid (key)
**type**\: str
**refers to**\: :py:class:`mplsldpentityldpid <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpEntityTable.MplsLdpEntityEntry>`
.. attribute:: mplsldpentityindex (key)
**type**\: int
**range:** 1..4294967295
**refers to**\: :py:class:`mplsldpentityindex <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpEntityTable.MplsLdpEntityEntry>`
.. attribute:: mplsldppeerldpid (key)
**type**\: str
**refers to**\: :py:class:`mplsldppeerldpid <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpPeerTable.MplsLdpPeerEntry>`
.. attribute:: mplsinsegmentldplspindex (key)
This contains the same value as the mplsInSegmentIndex in the MPLS\-LSR\-STD\-MIB's mplsInSegmentTable
**type**\: str
**length:** 1..24
.. attribute:: mplsinsegmentldplsplabeltype
The Layer 2 Label Type
**type**\: :py:class:`MplsLdpLabelType <ydk.models.cisco_ios_xe.MPLS_TC_STD_MIB.MplsLdpLabelType>`
.. attribute:: mplsinsegmentldplsptype
The type of LSP connection
**type**\: :py:class:`MplsLspType <ydk.models.cisco_ios_xe.MPLS_TC_STD_MIB.MplsLspType>`
"""
_prefix = 'MPLS-LDP-STD-MIB'
_revision = '2004-06-03'
def __init__(self):
super(MPLSLDPSTDMIB.MplsInSegmentLdpLspTable.MplsInSegmentLdpLspEntry, self).__init__()
self.yang_name = "mplsInSegmentLdpLspEntry"
self.yang_parent_name = "mplsInSegmentLdpLspTable"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['mplsldpentityldpid','mplsldpentityindex','mplsldppeerldpid','mplsinsegmentldplspindex']
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('mplsldpentityldpid', (YLeaf(YType.str, 'mplsLdpEntityLdpId'), ['str'])),
('mplsldpentityindex', (YLeaf(YType.str, 'mplsLdpEntityIndex'), ['int'])),
('mplsldppeerldpid', (YLeaf(YType.str, 'mplsLdpPeerLdpId'), ['str'])),
('mplsinsegmentldplspindex', (YLeaf(YType.str, 'mplsInSegmentLdpLspIndex'), ['str'])),
('mplsinsegmentldplsplabeltype', (YLeaf(YType.enumeration, 'mplsInSegmentLdpLspLabelType'), [('ydk.models.cisco_ios_xe.MPLS_TC_STD_MIB', 'MplsLdpLabelType', '')])),
('mplsinsegmentldplsptype', (YLeaf(YType.enumeration, 'mplsInSegmentLdpLspType'), [('ydk.models.cisco_ios_xe.MPLS_TC_STD_MIB', 'MplsLspType', '')])),
])
self.mplsldpentityldpid = None
self.mplsldpentityindex = None
self.mplsldppeerldpid = None
self.mplsinsegmentldplspindex = None
self.mplsinsegmentldplsplabeltype = None
self.mplsinsegmentldplsptype = None
self._segment_path = lambda: "mplsInSegmentLdpLspEntry" + "[mplsLdpEntityLdpId='" + str(self.mplsldpentityldpid) + "']" + "[mplsLdpEntityIndex='" + str(self.mplsldpentityindex) + "']" + "[mplsLdpPeerLdpId='" + str(self.mplsldppeerldpid) + "']" + "[mplsInSegmentLdpLspIndex='" + str(self.mplsinsegmentldplspindex) + "']"
self._absolute_path = lambda: "MPLS-LDP-STD-MIB:MPLS-LDP-STD-MIB/mplsInSegmentLdpLspTable/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(MPLSLDPSTDMIB.MplsInSegmentLdpLspTable.MplsInSegmentLdpLspEntry, [u'mplsldpentityldpid', u'mplsldpentityindex', u'mplsldppeerldpid', u'mplsinsegmentldplspindex', u'mplsinsegmentldplsplabeltype', u'mplsinsegmentldplsptype'], name, value)
class MplsOutSegmentLdpLspTable(Entity):
"""
A table of LDP LSPs which
map to the mplsOutSegmentTable in the
MPLS\-LSR\-STD\-MIB.
.. attribute:: mplsoutsegmentldplspentry
An entry in this table represents information on a single LDP LSP which is represented by a session's index triple (mplsLdpEntityLdpId, mplsLdpEntityIndex, mplsLdpPeerLdpId) AND the index (mplsOutSegmentLdpLspIndex) for the mplsOutSegmentTable. The information contained in a row is read\-only
**type**\: list of :py:class:`MplsOutSegmentLdpLspEntry <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsOutSegmentLdpLspTable.MplsOutSegmentLdpLspEntry>`
"""
_prefix = 'MPLS-LDP-STD-MIB'
_revision = '2004-06-03'
def __init__(self):
super(MPLSLDPSTDMIB.MplsOutSegmentLdpLspTable, self).__init__()
self.yang_name = "mplsOutSegmentLdpLspTable"
self.yang_parent_name = "MPLS-LDP-STD-MIB"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("mplsOutSegmentLdpLspEntry", ("mplsoutsegmentldplspentry", MPLSLDPSTDMIB.MplsOutSegmentLdpLspTable.MplsOutSegmentLdpLspEntry))])
self._leafs = OrderedDict()
self.mplsoutsegmentldplspentry = YList(self)
self._segment_path = lambda: "mplsOutSegmentLdpLspTable"
self._absolute_path = lambda: "MPLS-LDP-STD-MIB:MPLS-LDP-STD-MIB/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(MPLSLDPSTDMIB.MplsOutSegmentLdpLspTable, [], name, value)
class MplsOutSegmentLdpLspEntry(Entity):
"""
An entry in this table represents information
on a single LDP LSP which is represented by
a session's index triple (mplsLdpEntityLdpId,
mplsLdpEntityIndex, mplsLdpPeerLdpId) AND the
index (mplsOutSegmentLdpLspIndex)
for the mplsOutSegmentTable.
The information contained in a row is read\-only.
.. attribute:: mplsldpentityldpid (key)
**type**\: str
**refers to**\: :py:class:`mplsldpentityldpid <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpEntityTable.MplsLdpEntityEntry>`
.. attribute:: mplsldpentityindex (key)
**type**\: int
**range:** 1..4294967295
**refers to**\: :py:class:`mplsldpentityindex <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpEntityTable.MplsLdpEntityEntry>`
.. attribute:: mplsldppeerldpid (key)
**type**\: str
**refers to**\: :py:class:`mplsldppeerldpid <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpPeerTable.MplsLdpPeerEntry>`
.. attribute:: mplsoutsegmentldplspindex (key)
This contains the same value as the mplsOutSegmentIndex in the MPLS\-LSR\-STD\-MIB's mplsOutSegmentTable
**type**\: str
**length:** 1..24
.. attribute:: mplsoutsegmentldplsplabeltype
The Layer 2 Label Type
**type**\: :py:class:`MplsLdpLabelType <ydk.models.cisco_ios_xe.MPLS_TC_STD_MIB.MplsLdpLabelType>`
.. attribute:: mplsoutsegmentldplsptype
The type of LSP connection
**type**\: :py:class:`MplsLspType <ydk.models.cisco_ios_xe.MPLS_TC_STD_MIB.MplsLspType>`
"""
_prefix = 'MPLS-LDP-STD-MIB'
_revision = '2004-06-03'
def __init__(self):
super(MPLSLDPSTDMIB.MplsOutSegmentLdpLspTable.MplsOutSegmentLdpLspEntry, self).__init__()
self.yang_name = "mplsOutSegmentLdpLspEntry"
self.yang_parent_name = "mplsOutSegmentLdpLspTable"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['mplsldpentityldpid','mplsldpentityindex','mplsldppeerldpid','mplsoutsegmentldplspindex']
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('mplsldpentityldpid', (YLeaf(YType.str, 'mplsLdpEntityLdpId'), ['str'])),
('mplsldpentityindex', (YLeaf(YType.str, 'mplsLdpEntityIndex'), ['int'])),
('mplsldppeerldpid', (YLeaf(YType.str, 'mplsLdpPeerLdpId'), ['str'])),
('mplsoutsegmentldplspindex', (YLeaf(YType.str, 'mplsOutSegmentLdpLspIndex'), ['str'])),
('mplsoutsegmentldplsplabeltype', (YLeaf(YType.enumeration, 'mplsOutSegmentLdpLspLabelType'), [('ydk.models.cisco_ios_xe.MPLS_TC_STD_MIB', 'MplsLdpLabelType', '')])),
('mplsoutsegmentldplsptype', (YLeaf(YType.enumeration, 'mplsOutSegmentLdpLspType'), [('ydk.models.cisco_ios_xe.MPLS_TC_STD_MIB', 'MplsLspType', '')])),
])
self.mplsldpentityldpid = None
self.mplsldpentityindex = None
self.mplsldppeerldpid = None
self.mplsoutsegmentldplspindex = None
self.mplsoutsegmentldplsplabeltype = None
self.mplsoutsegmentldplsptype = None
self._segment_path = lambda: "mplsOutSegmentLdpLspEntry" + "[mplsLdpEntityLdpId='" + str(self.mplsldpentityldpid) + "']" + "[mplsLdpEntityIndex='" + str(self.mplsldpentityindex) + "']" + "[mplsLdpPeerLdpId='" + str(self.mplsldppeerldpid) + "']" + "[mplsOutSegmentLdpLspIndex='" + str(self.mplsoutsegmentldplspindex) + "']"
self._absolute_path = lambda: "MPLS-LDP-STD-MIB:MPLS-LDP-STD-MIB/mplsOutSegmentLdpLspTable/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(MPLSLDPSTDMIB.MplsOutSegmentLdpLspTable.MplsOutSegmentLdpLspEntry, [u'mplsldpentityldpid', u'mplsldpentityindex', u'mplsldppeerldpid', u'mplsoutsegmentldplspindex', u'mplsoutsegmentldplsplabeltype', u'mplsoutsegmentldplsptype'], name, value)
class MplsFecTable(Entity):
"""
This table represents the FEC
(Forwarding Equivalence Class)
information associated with an LSP.
.. attribute:: mplsfecentry
Each row represents a single FEC Element
**type**\: list of :py:class:`MplsFecEntry <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsFecTable.MplsFecEntry>`
"""
_prefix = 'MPLS-LDP-STD-MIB'
_revision = '2004-06-03'
def __init__(self):
super(MPLSLDPSTDMIB.MplsFecTable, self).__init__()
self.yang_name = "mplsFecTable"
self.yang_parent_name = "MPLS-LDP-STD-MIB"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("mplsFecEntry", ("mplsfecentry", MPLSLDPSTDMIB.MplsFecTable.MplsFecEntry))])
self._leafs = OrderedDict()
self.mplsfecentry = YList(self)
self._segment_path = lambda: "mplsFecTable"
self._absolute_path = lambda: "MPLS-LDP-STD-MIB:MPLS-LDP-STD-MIB/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(MPLSLDPSTDMIB.MplsFecTable, [], name, value)
class MplsFecEntry(Entity):
"""
Each row represents a single FEC Element.
.. attribute:: mplsfecindex (key)
The index which uniquely identifies this entry
**type**\: int
**range:** 1..4294967295
.. attribute:: mplsfectype
The type of the FEC. If the value of this object is 'prefix(1)' then the FEC type described by this row is an address prefix. If the value of this object is 'hostAddress(2)' then the FEC type described by this row is a host address
**type**\: :py:class:`MplsFecType <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsFecTable.MplsFecEntry.MplsFecType>`
.. attribute:: mplsfecaddrprefixlength
If the value of the 'mplsFecType' is 'hostAddress(2)' then this object is undefined. If the value of 'mplsFecType' is 'prefix(1)' then the value of this object is the length in bits of the address prefix represented by 'mplsFecAddr', or zero. If the value of this object is zero, this indicates that the prefix matches all addresses. In this case the address prefix MUST also be zero (i.e., 'mplsFecAddr' should have the value of zero.)
**type**\: int
**range:** 0..2040
.. attribute:: mplsfecaddrtype
The value of this object is the type of the Internet address. The value of this object decides how the value of the mplsFecAddr object is interpreted
**type**\: :py:class:`InetAddressType <ydk.models.cisco_ios_xe.INET_ADDRESS_MIB.InetAddressType>`
.. attribute:: mplsfecaddr
The value of this object is interpreted based on the value of the 'mplsFecAddrType' object. This address is then further interpreted as being used with the address prefix, or as the host address. This further interpretation is indicated by the 'mplsFecType' object. In other words, the FEC element is populated according to the Prefix FEC Element value encoding, or the Host Address FEC Element encoding
**type**\: str
**length:** 0..255
.. attribute:: mplsfecstoragetype
The storage type for this conceptual row. Conceptual rows having the value 'permanent(4)' need not allow write\-access to any columnar objects in the row
**type**\: :py:class:`StorageType <ydk.models.cisco_ios_xe.SNMPv2_TC.StorageType>`
.. attribute:: mplsfecrowstatus
The status of this conceptual row. If the value of this object is 'active(1)', then none of the writable objects of this entry can be modified, except to set this object to 'destroy(6)'. NOTE\: if this row is being referenced by any entry in the mplsLdpLspFecTable, then a request to destroy this row, will result in an inconsistentValue error
**type**\: :py:class:`RowStatus <ydk.models.cisco_ios_xe.SNMPv2_TC.RowStatus>`
"""
_prefix = 'MPLS-LDP-STD-MIB'
_revision = '2004-06-03'
def __init__(self):
super(MPLSLDPSTDMIB.MplsFecTable.MplsFecEntry, self).__init__()
self.yang_name = "mplsFecEntry"
self.yang_parent_name = "mplsFecTable"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['mplsfecindex']
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('mplsfecindex', (YLeaf(YType.uint32, 'mplsFecIndex'), ['int'])),
('mplsfectype', (YLeaf(YType.enumeration, 'mplsFecType'), [('ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB', 'MPLSLDPSTDMIB', 'MplsFecTable.MplsFecEntry.MplsFecType')])),
('mplsfecaddrprefixlength', (YLeaf(YType.uint32, 'mplsFecAddrPrefixLength'), ['int'])),
('mplsfecaddrtype', (YLeaf(YType.enumeration, 'mplsFecAddrType'), [('ydk.models.cisco_ios_xe.INET_ADDRESS_MIB', 'InetAddressType', '')])),
('mplsfecaddr', (YLeaf(YType.str, 'mplsFecAddr'), ['str'])),
('mplsfecstoragetype', (YLeaf(YType.enumeration, 'mplsFecStorageType'), [('ydk.models.cisco_ios_xe.SNMPv2_TC', 'StorageType', '')])),
('mplsfecrowstatus', (YLeaf(YType.enumeration, 'mplsFecRowStatus'), [('ydk.models.cisco_ios_xe.SNMPv2_TC', 'RowStatus', '')])),
])
self.mplsfecindex = None
self.mplsfectype = None
self.mplsfecaddrprefixlength = None
self.mplsfecaddrtype = None
self.mplsfecaddr = None
self.mplsfecstoragetype = None
self.mplsfecrowstatus = None
self._segment_path = lambda: "mplsFecEntry" + "[mplsFecIndex='" + str(self.mplsfecindex) + "']"
self._absolute_path = lambda: "MPLS-LDP-STD-MIB:MPLS-LDP-STD-MIB/mplsFecTable/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(MPLSLDPSTDMIB.MplsFecTable.MplsFecEntry, [u'mplsfecindex', u'mplsfectype', u'mplsfecaddrprefixlength', u'mplsfecaddrtype', u'mplsfecaddr', u'mplsfecstoragetype', u'mplsfecrowstatus'], name, value)
class MplsFecType(Enum):
"""
MplsFecType (Enum Class)
The type of the FEC. If the value of this object
is 'prefix(1)' then the FEC type described by this
row is an address prefix.
If the value of this object is 'hostAddress(2)' then
the FEC type described by this row is a host address.
.. data:: prefix = 1
.. data:: hostAddress = 2
"""
prefix = Enum.YLeaf(1, "prefix")
hostAddress = Enum.YLeaf(2, "hostAddress")
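Per the `mplsFecType` and `mplsFecAddrPrefixLength` descriptions above, a prefix(1) FEC with prefix length zero matches all addresses, while a hostAddress(2) FEC matches a single exact address. A self-contained sketch of that matching using the standard `ipaddress` module (hypothetical helper, not part of YDK):

```python
import ipaddress

def fec_matches(fec_type, fec_addr, prefix_length, candidate):
    """Check whether `candidate` falls within the FEC element described by a row.
    `fec_type` is 'prefix' or 'hostAddress', mirroring MplsFecType above."""
    candidate = ipaddress.ip_address(candidate)
    if fec_type == "hostAddress":
        return candidate == ipaddress.ip_address(fec_addr)
    # prefix(1): a prefix length of zero matches all addresses
    network = ipaddress.ip_network("%s/%d" % (fec_addr, prefix_length), strict=False)
    return candidate in network
```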
class MplsLdpLspFecTable(Entity):
"""
A table which shows the relationship between
LDP LSPs and FECs. Each row represents
a single LDP LSP to FEC association.
.. attribute:: mplsldplspfecentry
An entry represents an LDP LSP to FEC association
**type**\: list of :py:class:`MplsLdpLspFecEntry <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpLspFecTable.MplsLdpLspFecEntry>`
"""
_prefix = 'MPLS-LDP-STD-MIB'
_revision = '2004-06-03'
def __init__(self):
super(MPLSLDPSTDMIB.MplsLdpLspFecTable, self).__init__()
self.yang_name = "mplsLdpLspFecTable"
self.yang_parent_name = "MPLS-LDP-STD-MIB"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("mplsLdpLspFecEntry", ("mplsldplspfecentry", MPLSLDPSTDMIB.MplsLdpLspFecTable.MplsLdpLspFecEntry))])
self._leafs = OrderedDict()
self.mplsldplspfecentry = YList(self)
self._segment_path = lambda: "mplsLdpLspFecTable"
self._absolute_path = lambda: "MPLS-LDP-STD-MIB:MPLS-LDP-STD-MIB/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(MPLSLDPSTDMIB.MplsLdpLspFecTable, [], name, value)
class MplsLdpLspFecEntry(Entity):
"""
An entry represents an LDP LSP
to FEC association.
.. attribute:: mplsldpentityldpid (key)
**type**\: str
**refers to**\: :py:class:`mplsldpentityldpid <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpEntityTable.MplsLdpEntityEntry>`
.. attribute:: mplsldpentityindex (key)
**type**\: int
**range:** 1..4294967295
**refers to**\: :py:class:`mplsldpentityindex <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpEntityTable.MplsLdpEntityEntry>`
.. attribute:: mplsldppeerldpid (key)
**type**\: str
**refers to**\: :py:class:`mplsldppeerldpid <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpPeerTable.MplsLdpPeerEntry>`
.. attribute:: mplsldplspfecsegment (key)
If the value is inSegment(1), then this indicates that the following index, mplsLdpLspFecSegmentIndex, contains the same value as the mplsInSegmentLdpLspIndex. Otherwise, if the value of this object is outSegment(2), then this indicates that the following index, mplsLdpLspFecSegmentIndex, contains the same value as the mplsOutSegmentLdpLspIndex
**type**\: :py:class:`MplsLdpLspFecSegment <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpLspFecTable.MplsLdpLspFecEntry.MplsLdpLspFecSegment>`
.. attribute:: mplsldplspfecsegmentindex (key)
This index is interpreted by using the value of the mplsLdpLspFecSegment. If the mplsLdpLspFecSegment is inSegment(1), then this index has the same value as mplsInSegmentLdpLspIndex. If the mplsLdpLspFecSegment is outSegment(2), then this index has the same value as mplsOutSegmentLdpLspIndex
**type**\: str
**length:** 1..24
.. attribute:: mplsldplspfecindex (key)
This index identifies the FEC entry in the mplsFecTable associated with this session. In other words, the value of this index is the same as the value of the mplsFecIndex that denotes the FEC associated with this Session
**type**\: int
**range:** 1..4294967295
.. attribute:: mplsldplspfecstoragetype
The storage type for this conceptual row. Conceptual rows having the value 'permanent(4)' need not allow write\-access to any columnar objects in the row
**type**\: :py:class:`StorageType <ydk.models.cisco_ios_xe.SNMPv2_TC.StorageType>`
.. attribute:: mplsldplspfecrowstatus
The status of this conceptual row. If the value of this object is 'active(1)', then none of the writable objects of this entry can be modified. The Agent should delete this row when the session ceases to exist. If an operator wants to associate the session with a different FEC, the recommended procedure (as described in detail in the section entitled 'Changing Values After Session Establishment', and again in the DESCRIPTION clause of the mplsLdpEntityAdminStatus object) is to set the mplsLdpEntityAdminStatus to down, thereby explicitly causing a session to be torn down. This will also cause this entry to be deleted. Then, set the mplsLdpEntityAdminStatus to enable, which enables a new session to be initiated. Once the session is initiated, an entry may be added to this table to associate the new session with a FEC
**type**\: :py:class:`RowStatus <ydk.models.cisco_ios_xe.SNMPv2_TC.RowStatus>`
"""
_prefix = 'MPLS-LDP-STD-MIB'
_revision = '2004-06-03'
def __init__(self):
super(MPLSLDPSTDMIB.MplsLdpLspFecTable.MplsLdpLspFecEntry, self).__init__()
self.yang_name = "mplsLdpLspFecEntry"
self.yang_parent_name = "mplsLdpLspFecTable"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['mplsldpentityldpid','mplsldpentityindex','mplsldppeerldpid','mplsldplspfecsegment','mplsldplspfecsegmentindex','mplsldplspfecindex']
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('mplsldpentityldpid', (YLeaf(YType.str, 'mplsLdpEntityLdpId'), ['str'])),
('mplsldpentityindex', (YLeaf(YType.str, 'mplsLdpEntityIndex'), ['int'])),
('mplsldppeerldpid', (YLeaf(YType.str, 'mplsLdpPeerLdpId'), ['str'])),
                    ('mplsldplspfecsegment', (YLeaf(YType.enumeration, 'mplsLdpLspFecSegment'), [('ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB', 'MPLSLDPSTDMIB', 'MplsLdpLspFecTable.MplsLdpLspFecEntry.MplsLdpLspFecSegment')])),
                    ('mplsldplspfecsegmentindex', (YLeaf(YType.str, 'mplsLdpLspFecSegmentIndex'), ['str'])),
                    ('mplsldplspfecindex', (YLeaf(YType.uint32, 'mplsLdpLspFecIndex'), ['int'])),
                    ('mplsldplspfecstoragetype', (YLeaf(YType.enumeration, 'mplsLdpLspFecStorageType'), [('ydk.models.cisco_ios_xe.SNMPv2_TC', 'StorageType', '')])),
                    ('mplsldplspfecrowstatus', (YLeaf(YType.enumeration, 'mplsLdpLspFecRowStatus'), [('ydk.models.cisco_ios_xe.SNMPv2_TC', 'RowStatus', '')])),
                ])
                self.mplsldpentityldpid = None
                self.mplsldpentityindex = None
                self.mplsldppeerldpid = None
                self.mplsldplspfecsegment = None
                self.mplsldplspfecsegmentindex = None
                self.mplsldplspfecindex = None
                self.mplsldplspfecstoragetype = None
                self.mplsldplspfecrowstatus = None
                self._segment_path = lambda: "mplsLdpLspFecEntry" + "[mplsLdpEntityLdpId='" + str(self.mplsldpentityldpid) + "']" + "[mplsLdpEntityIndex='" + str(self.mplsldpentityindex) + "']" + "[mplsLdpPeerLdpId='" + str(self.mplsldppeerldpid) + "']" + "[mplsLdpLspFecSegment='" + str(self.mplsldplspfecsegment) + "']" + "[mplsLdpLspFecSegmentIndex='" + str(self.mplsldplspfecsegmentindex) + "']" + "[mplsLdpLspFecIndex='" + str(self.mplsldplspfecindex) + "']"
                self._absolute_path = lambda: "MPLS-LDP-STD-MIB:MPLS-LDP-STD-MIB/mplsLdpLspFecTable/%s" % self._segment_path()
                self._is_frozen = True

            def __setattr__(self, name, value):
                self._perform_setattr(MPLSLDPSTDMIB.MplsLdpLspFecTable.MplsLdpLspFecEntry, [u'mplsldpentityldpid', u'mplsldpentityindex', u'mplsldppeerldpid', u'mplsldplspfecsegment', u'mplsldplspfecsegmentindex', u'mplsldplspfecindex', u'mplsldplspfecstoragetype', u'mplsldplspfecrowstatus'], name, value)

            class MplsLdpLspFecSegment(Enum):
                """
                MplsLdpLspFecSegment (Enum Class)

                If the value is inSegment(1), then this
                indicates that the following index,
                mplsLdpLspFecSegmentIndex, contains the same
                value as the mplsInSegmentLdpLspIndex.

                Otherwise, if the value of this object is
                outSegment(2), then this
                indicates that the following index,
                mplsLdpLspFecSegmentIndex, contains the same
                value as the mplsOutSegmentLdpLspIndex.

                .. data:: inSegment = 1

                .. data:: outSegment = 2
                """

                inSegment = Enum.YLeaf(1, "inSegment")
                outSegment = Enum.YLeaf(2, "outSegment")

    class MplsLdpSessionPeerAddrTable(Entity):
        """
        This table 'extends' the mplsLdpSessionTable.
        This table is used to store Label Address Information
        from Label Address Messages received by this LSR from
        Peers. This table is read\-only and should be updated
        when Label Withdraw Address Messages are received, i.e.,
        Rows should be deleted as appropriate.

        NOTE\: since more than one address may be contained
        in a Label Address Message, this table 'sparse augments',
        the mplsLdpSessionTable's information.

        .. attribute:: mplsldpsessionpeeraddrentry

            An entry in this table represents information on a session's single next hop address which was advertised in an Address Message from the LDP peer. The information contained in a row is read\-only

            **type**\: list of :py:class:`MplsLdpSessionPeerAddrEntry <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpSessionPeerAddrTable.MplsLdpSessionPeerAddrEntry>`
        """

        _prefix = 'MPLS-LDP-STD-MIB'
        _revision = '2004-06-03'

        def __init__(self):
            super(MPLSLDPSTDMIB.MplsLdpSessionPeerAddrTable, self).__init__()

            self.yang_name = "mplsLdpSessionPeerAddrTable"
            self.yang_parent_name = "MPLS-LDP-STD-MIB"
            self.is_top_level_class = False
            self.has_list_ancestor = False
            self.ylist_key_names = []
            self._child_classes = OrderedDict([("mplsLdpSessionPeerAddrEntry", ("mplsldpsessionpeeraddrentry", MPLSLDPSTDMIB.MplsLdpSessionPeerAddrTable.MplsLdpSessionPeerAddrEntry))])
            self._leafs = OrderedDict()

            self.mplsldpsessionpeeraddrentry = YList(self)
            self._segment_path = lambda: "mplsLdpSessionPeerAddrTable"
            self._absolute_path = lambda: "MPLS-LDP-STD-MIB:MPLS-LDP-STD-MIB/%s" % self._segment_path()
            self._is_frozen = True

        def __setattr__(self, name, value):
            self._perform_setattr(MPLSLDPSTDMIB.MplsLdpSessionPeerAddrTable, [], name, value)

        class MplsLdpSessionPeerAddrEntry(Entity):
            """
            An entry in this table represents information on
            a session's single next hop address which was
            advertised in an Address Message from the LDP peer.
            The information contained in a row is read\-only.

            .. attribute:: mplsldpentityldpid  (key)

                **type**\: str

                **refers to**\: :py:class:`mplsldpentityldpid <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpEntityTable.MplsLdpEntityEntry>`

            .. attribute:: mplsldpentityindex  (key)

                **type**\: int

                **range:** 1..4294967295

                **refers to**\: :py:class:`mplsldpentityindex <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpEntityTable.MplsLdpEntityEntry>`

            .. attribute:: mplsldppeerldpid  (key)

                **type**\: str

                **refers to**\: :py:class:`mplsldppeerldpid <ydk.models.cisco_ios_xe.MPLS_LDP_STD_MIB.MPLSLDPSTDMIB.MplsLdpPeerTable.MplsLdpPeerEntry>`

            .. attribute:: mplsldpsessionpeeraddrindex  (key)

                An index which uniquely identifies this entry within a given session

                **type**\: int

                **range:** 1..4294967295

            .. attribute:: mplsldpsessionpeernexthopaddrtype

                The internetwork layer address type of this Next Hop Address as specified in the Label Address Message associated with this Session. The value of this object indicates how to interpret the value of mplsLdpSessionPeerNextHopAddr

                **type**\: :py:class:`InetAddressType <ydk.models.cisco_ios_xe.INET_ADDRESS_MIB.InetAddressType>`

            .. attribute:: mplsldpsessionpeernexthopaddr

                The next hop address. The type of this address is specified by the value of the mplsLdpSessionPeerNextHopAddrType

                **type**\: str

                **length:** 0..255
            """

            _prefix = 'MPLS-LDP-STD-MIB'
            _revision = '2004-06-03'

            def __init__(self):
                super(MPLSLDPSTDMIB.MplsLdpSessionPeerAddrTable.MplsLdpSessionPeerAddrEntry, self).__init__()

                self.yang_name = "mplsLdpSessionPeerAddrEntry"
                self.yang_parent_name = "mplsLdpSessionPeerAddrTable"
                self.is_top_level_class = False
                self.has_list_ancestor = False
                self.ylist_key_names = ['mplsldpentityldpid','mplsldpentityindex','mplsldppeerldpid','mplsldpsessionpeeraddrindex']
                self._child_classes = OrderedDict([])
                self._leafs = OrderedDict([
                    ('mplsldpentityldpid', (YLeaf(YType.str, 'mplsLdpEntityLdpId'), ['str'])),
                    ('mplsldpentityindex', (YLeaf(YType.str, 'mplsLdpEntityIndex'), ['int'])),
                    ('mplsldppeerldpid', (YLeaf(YType.str, 'mplsLdpPeerLdpId'), ['str'])),
                    ('mplsldpsessionpeeraddrindex', (YLeaf(YType.uint32, 'mplsLdpSessionPeerAddrIndex'), ['int'])),
                    ('mplsldpsessionpeernexthopaddrtype', (YLeaf(YType.enumeration, 'mplsLdpSessionPeerNextHopAddrType'), [('ydk.models.cisco_ios_xe.INET_ADDRESS_MIB', 'InetAddressType', '')])),
                    ('mplsldpsessionpeernexthopaddr', (YLeaf(YType.str, 'mplsLdpSessionPeerNextHopAddr'), ['str'])),
                ])
                self.mplsldpentityldpid = None
                self.mplsldpentityindex = None
                self.mplsldppeerldpid = None
                self.mplsldpsessionpeeraddrindex = None
                self.mplsldpsessionpeernexthopaddrtype = None
                self.mplsldpsessionpeernexthopaddr = None
                self._segment_path = lambda: "mplsLdpSessionPeerAddrEntry" + "[mplsLdpEntityLdpId='" + str(self.mplsldpentityldpid) + "']" + "[mplsLdpEntityIndex='" + str(self.mplsldpentityindex) + "']" + "[mplsLdpPeerLdpId='" + str(self.mplsldppeerldpid) + "']" + "[mplsLdpSessionPeerAddrIndex='" + str(self.mplsldpsessionpeeraddrindex) + "']"
                self._absolute_path = lambda: "MPLS-LDP-STD-MIB:MPLS-LDP-STD-MIB/mplsLdpSessionPeerAddrTable/%s" % self._segment_path()
                self._is_frozen = True

            def __setattr__(self, name, value):
                self._perform_setattr(MPLSLDPSTDMIB.MplsLdpSessionPeerAddrTable.MplsLdpSessionPeerAddrEntry, [u'mplsldpentityldpid', u'mplsldpentityindex', u'mplsldppeerldpid', u'mplsldpsessionpeeraddrindex', u'mplsldpsessionpeernexthopaddrtype', u'mplsldpsessionpeernexthopaddr'], name, value)

    def clone_ptr(self):
        self._top_entity = MPLSLDPSTDMIB()
        return self._top_entity
| 55.323188 | 1,407 | 0.642164 | 11,042 | 114,519 | 6.547002 | 0.067832 | 0.01303 | 0.018398 | 0.023917 | 0.567269 | 0.545316 | 0.522617 | 0.503472 | 0.49242 | 0.485752 | 0 | 0.011888 | 0.272828 | 114,519 | 2,069 | 1,408 | 55.349928 | 0.856223 | 0.458614 | 0 | 0.410334 | 0 | 0 | 0.295938 | 0.202035 | 0.010638 | 0 | 0 | 0 | 0 | 1 | 0.06535 | false | 0.00152 | 0.007599 | 0 | 0.1231 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
db098feaf4cd2fcd3ec50721e2eaf014b9b9cc97 | 892 | py | Python | atlas/foundations_contrib/src/foundations_contrib/helpers/shell.py | DeepLearnI/atlas | 8aca652d7e647b4e88530b93e265b536de7055ed | [
"Apache-2.0"
] | 296 | 2020-03-16T19:55:00.000Z | 2022-01-10T19:46:05.000Z | atlas/foundations_contrib/src/foundations_contrib/helpers/shell.py | DeepLearnI/atlas | 8aca652d7e647b4e88530b93e265b536de7055ed | [
"Apache-2.0"
] | 57 | 2020-03-17T11:15:57.000Z | 2021-07-10T14:42:27.000Z | atlas/foundations_contrib/src/foundations_contrib/helpers/shell.py | DeepLearnI/atlas | 8aca652d7e647b4e88530b93e265b536de7055ed | [
"Apache-2.0"
] | 38 | 2020-03-17T21:06:05.000Z | 2022-02-08T03:19:34.000Z |
def find_bash():
    import os

    if os.name == 'nt':
        return _find_windows_bash()
    return '/bin/bash'


def _find_windows_bash():
    winreg = _winreg_module()
    import csv
    StringIO = _get_string_io()
    from os.path import dirname

    sub_key = 'Directory\\shell\\git_shell\\command'
    value = winreg.QueryValue(winreg.HKEY_CLASSES_ROOT, sub_key)
    with StringIO(value) as file:
        reader = csv.reader(file, delimiter=' ', quotechar='"')
        git_bash_location = list(reader)[0][0]

    git_bash_directory = git_bash_location.split('\\git-bash.exe')[0]
    bash_location = git_bash_directory + '\\bin\\bash.exe'
    return bash_location


def _get_string_io():
    try:
        from StringIO import StringIO
    except ImportError:
        from io import StringIO
    return StringIO


def _winreg_module():
    import winreg
    return winreg
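A minimal, self-contained sketch of the parsing step used by `_find_windows_bash` above: the registry value is a quoted command line, and `csv.reader` with a space delimiter and `"` quote character splits it into its quoted tokens. The sample registry value here is hypothetical (the real one comes from `winreg.QueryValue`, which is Windows-only).

```python
import csv
import io


def extract_bash_location(registry_value):
    # csv.reader treats each double-quoted chunk as one field,
    # so the first field is the full path to git-bash.exe.
    with io.StringIO(registry_value) as handle:
        reader = csv.reader(handle, delimiter=' ', quotechar='"')
        git_bash_location = list(reader)[0][0]
    git_bash_directory = git_bash_location.split('\\git-bash.exe')[0]
    return git_bash_directory + '\\bin\\bash.exe'


# Hypothetical registry value for illustration only.
sample = '"C:\\Program Files\\Git\\git-bash.exe" "--cd=%v."'
print(extract_bash_location(sample))  # C:\Program Files\Git\bin\bash.exe
```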
| 25.485714 | 73 | 0.659193 | 114 | 892 | 4.877193 | 0.403509 | 0.06295 | 0.053957 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004412 | 0.237668 | 892 | 34 | 74 | 26.235294 | 0.813235 | 0 | 0 | 0 | 0 | 0 | 0.087542 | 0.040404 | 0 | 0 | 0 | 0 | 0 | 1 | 0.148148 | false | 0 | 0.259259 | 0 | 0.592593 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
db0ab3da5d70c76acedaa4a8af65bab398892ba2 | 9,104 | py | Python | app/models/user.py | tonyngophd/dronest | f0976c31cbbf6fb032851bd42ac566bb381608f0 | [
"MIT"
] | 13 | 2021-02-03T13:26:59.000Z | 2021-03-24T19:34:19.000Z | app/models/user.py | suasllc/dronest | f0976c31cbbf6fb032851bd42ac566bb381608f0 | [
"MIT"
] | null | null | null | app/models/user.py | suasllc/dronest | f0976c31cbbf6fb032851bd42ac566bb381608f0 | [
"MIT"
] | 1 | 2021-06-07T17:56:58.000Z | 2021-06-07T17:56:58.000Z | from .db import db
from .userfollower import UserFollower
from werkzeug.security import generate_password_hash, check_password_hash
from flask_login import UserMixin
from sqlalchemy import Table, Column, Integer, ForeignKey, or_
from .directmessage import DirectMessage
from .userequipment import UserEquipment
from .equipment import Equipment
from .message import Message
from .messagereceiver import MessageReceiver
from sqlalchemy.orm import validates


class User(db.Model, UserMixin):
    __tablename__ = 'Users'

    id = db.Column(db.Integer, primary_key=True)
    username = db.Column(db.String(40), nullable=False, unique=True)
    name = db.Column(db.String(100), nullable=True)
    email = db.Column(db.String(255), nullable=False, unique=True)
    hashed_password = db.Column(db.String(255), nullable=False)
    bio = db.Column(db.Text, nullable=True)
    websiteUrl = db.Column(db.Text, nullable=False, default="www.google.com")
    userType = db.Column(db.Integer, nullable=True, default=0)
    profilePicUrl = db.Column(db.Text, nullable=True)
    createdAt = db.Column(db.DateTime(timezone=True), server_default=db.func.now())  # func.sysdate())
    updatedAt = db.Column(db.DateTime(timezone=True), server_default=db.func.now(), server_onupdate=db.func.now())

    ownPosts = db.relationship('Post', foreign_keys='Post.userId')
    ownComments = db.relationship('Comment', foreign_keys='Comment.userId')
    taggedInPosts = db.relationship('Post', secondary='taggedusers')
    likedPosts = db.relationship('Post', secondary='likedposts')
    savedPosts = db.relationship('Post', secondary='savedposts')
    sentMessages = db.relationship('DirectMessage', foreign_keys='DirectMessage.senderId')
    receivedMessages = db.relationship('DirectMessage', foreign_keys='DirectMessage.receiverId')
    likedComments = db.relationship('Comment', secondary='commentlikes')
    taggedInComments = db.relationship('Comment', secondary='commenttaggedusers')

    followers = []  # db.relationship('User', secondary='userfollowers', foreign_keys='UserFollower.followerId')
    following = []  # db.relationship('User', secondary='userfollowers', foreign_keys='UserFollower.userId')
    allMessages = []
    # equipmentList = []
    equipmentList = db.relationship('Equipment', secondary="UserEquipments")

    # @validates('username', 'email')
    # def convert_lower(self, key, value):
    #     return value.lower()

    @property
    def password(self):
        return self.hashed_password

    @password.setter
    def password(self, password):
        self.hashed_password = generate_password_hash(password)

    def check_password(self, password):
        return check_password_hash(self.password, password)

    def get_followers(self):
        ufs = UserFollower.query.filter(UserFollower.userId == self.id).all()
        self.followers = [uf.follower for uf in ufs]

    def get_following(self):
        ufs = UserFollower.query.filter(UserFollower.followerId == self.id).all()
        self.following = [uf.person for uf in ufs]

    def get_messages(self):
        msgs = DirectMessage.query\
            .filter(or_(DirectMessage.senderId == self.id,
                        DirectMessage.receiverId == self.id)).order_by(DirectMessage.id).all()
        self.allMessages = msgs

    def get_conversations(self):
        convos = MessageReceiver.query\
            .filter(or_(MessageReceiver.senderId == self.id,
                        MessageReceiver.receiverId == self.id)).order_by(MessageReceiver.id).all()
        uniqueConvos = []
        if len(convos):
            messageIdSet = set()
            for convo in convos:
                if convo.senderId != self.id:
                    uniqueConvos.append(convo)
                else:
                    if convo.messageId not in messageIdSet:
                        uniqueConvos.append(convo)
                        messageIdSet.add(convo.messageId)
        self.allMessages = uniqueConvos

    def get_last_conversation(self):
        convo = MessageReceiver.query\
            .filter(or_(MessageReceiver.senderId == self.id,
                        MessageReceiver.receiverId == self.id)).order_by(-MessageReceiver.id).first()
        self.allMessages = [convo]

    def to_dict(self):
        return {
            "id": self.id,
            "name": self.name,
            "username": self.username,
            "email": self.email,
            "bio": self.bio,
            "websiteUrl": self.websiteUrl,
            "profilePicUrl": self.profilePicUrl,
        }

    def to_dict_with_posts_and_follows(self):
        self.get_followers()
        self.get_following()
        return {
            "id": self.id,
            "name": self.name,
            "username": self.username,
            "email": self.email,
            "bio": self.bio,
            "websiteUrl": self.websiteUrl,
            "profilePicUrl": self.profilePicUrl,
            "followers": [user.to_dict() for user in self.followers],
            "following": [user.to_dict() for user in self.following],
            "ownPosts": [post.to_dict() for post in self.ownPosts],
            "equipmentList": [equipment.to_dict() for equipment in self.equipmentList],
        }

    def to_dict_with_posts(self):
        return {
            "id": self.id,
            "name": self.name,
            "username": self.username,
            "email": self.email,
            "bio": self.bio,
            "websiteUrl": self.websiteUrl,
            "profilePicUrl": self.profilePicUrl,
            "ownPosts": [post.to_dict() for post in self.ownPosts],
        }

    def to_dict_with_posts_fast(self):
        user_as_dict_basic = {
            "id": self.id,
            "name": self.name,
            "username": self.username,
            "email": self.email,
            "bio": self.bio,
            "websiteUrl": self.websiteUrl,
            "profilePicUrl": self.profilePicUrl,
        }
        user_as_dict = user_as_dict_basic.copy()
        user_as_dict["ownPosts"] = [post.to_dict_fast_own_user(user_as_dict_basic) for post in self.ownPosts]
        return user_as_dict
        # "ownPosts": [post.to_dict_fast() for post in self.ownPosts],

    def to_dict_feed(self):
        self.get_following()
        return {
            "followingIds": [int(follow.id) for follow in self.following]
        }

    def to_dict_for_mentions(self):
        return {
            "id": self.id,
            "displayName": self.name,
            "name": self.username,
            "profilePicUrl": self.profilePicUrl,
        }

    def to_dict_no_posts(self):
        # no posts, so if a post has this user there are no infinite circular references
        return {
            "id": self.id,
            "username": self.username,
            "email": self.email,
            "bio": self.bio,
            "websiteUrl": self.websiteUrl,
            "profilePicUrl": self.profilePicUrl,
        }

    def to_dict_for_self(self):
        self.get_followers()
        self.get_following()
        # self.get_messages()
        self.get_conversations()
        return {
            "id": self.id,
            "username": self.username,
            "name": self.name,
            "email": self.email,
            "bio": self.bio,
            "websiteUrl": self.websiteUrl,
            "profilePicUrl": self.profilePicUrl,
            "userType": self.userType,
            "ownPosts": [post.to_dict() for post in self.ownPosts],
            "likedPosts": [post.to_dict() for post in self.likedPosts],
            "savedPosts": [post.to_dict() for post in self.savedPosts],
            "taggedInPosts": [post.to_dict() for post in self.taggedInPosts],
            "messages": [m.to_dict() for m in self.allMessages],  # [sentMsg.to_dict() for sentMsg in self.sentMessages] + [recvdMsg.to_dict() for recvdMsg in self.receivedMessages],
            "followers": [user.to_dict() for user in self.followers],
            "following": [user.to_dict() for user in self.following],
            "likedComments": [comment.to_dict() for comment in self.likedComments],
            "taggedInComments": [comment.to_dict() for comment in self.taggedInComments],
            "equipmentList": [equipment.to_dict() for equipment in self.equipmentList],
        }

    def to_dict_as_generic_profile(self):
        '''
        compared to "for_self" this does not include:
          - messages
        and more later
        '''
        self.get_followers()
        self.get_following()
        return {
            "id": self.id,
            "username": self.username,
            "name": self.name,
            "email": self.email,
            "bio": self.bio,
            "websiteUrl": self.websiteUrl,
            "profilePicUrl": self.profilePicUrl,
            "ownPosts": [post.to_dict() for post in self.ownPosts],
            "likedPosts": [post.to_dict() for post in self.likedPosts],
            "savedPosts": [post.to_dict() for post in self.savedPosts],
            "taggedInPosts": [post.to_dict() for post in self.taggedInPosts],
            "followers": [user.to_dict() for user in self.followers],
            "following": [user.to_dict() for user in self.following],
            "likedComments": [comment.to_dict() for comment in self.likedComments],
            "taggedInComments": [comment.to_dict() for comment in self.taggedInComments],
            "equipmentList": [equipment.to_dict() for equipment in self.equipmentList],
        }


'''
mapper(
    User, t_users,
    properties={
        'followers': relation(
            User,
            secondary=t_follows,
            primaryjoin=(t_follows.c.followee_id==t_users.c.id),
            secondaryjoin=(t_follows.c.follower_id==t_users.c.id),
        ),
        'followees': relation(
            User,
            secondary=t_follows,
            primaryjoin=(t_follows.c.follower_id==t_users.c.id),
            secondaryjoin=(t_follows.c.followee_id==t_users.c.id),
        ),
    },
)
'''
| 35.286822 | 174 | 0.672781 | 1,077 | 9,104 | 5.556175 | 0.164345 | 0.037099 | 0.042112 | 0.02607 | 0.557821 | 0.539271 | 0.487634 | 0.467914 | 0.429646 | 0.386698 | 0 | 0.001637 | 0.19464 | 9,104 | 257 | 175 | 35.424125 | 0.814512 | 0.071397 | 0 | 0.479167 | 1 | 0 | 0.109102 | 0.005815 | 0 | 0 | 0 | 0 | 0 | 1 | 0.088542 | false | 0.046875 | 0.057292 | 0.03125 | 0.338542 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
db0eabb87d8f110b34f799008d45115ae3494a8a | 470 | py | Python | tests/test_toolbar.py | WilliamMayor/django-mail-panel | 2c41f808a645d5d7bad90510f44e53d29981cf22 | [
"Apache-2.0"
] | null | null | null | tests/test_toolbar.py | WilliamMayor/django-mail-panel | 2c41f808a645d5d7bad90510f44e53d29981cf22 | [
"Apache-2.0"
] | null | null | null | tests/test_toolbar.py | WilliamMayor/django-mail-panel | 2c41f808a645d5d7bad90510f44e53d29981cf22 | [
"Apache-2.0"
] | null | null | null | from .context import *
import unittest

from mail_panel.panels import MailToolbarPanel


class ToolbarSuite(unittest.TestCase):
    def test_panel(self):
        """
        General 'does it run' test.
        """
        p = MailToolbarPanel(None)
        assert(p.toolbar is None)


def suite():
    suite = unittest.TestSuite()
    suite.addTest(unittest.makeSuite(ToolbarSuite))
    return suite


if __name__ == "__main__":
    unittest.TextTestRunner().run(suite())
| 21.363636 | 51 | 0.668085 | 51 | 470 | 5.960784 | 0.627451 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.223404 | 470 | 21 | 52 | 22.380952 | 0.832877 | 0.057447 | 0 | 0 | 0 | 0 | 0.019093 | 0 | 0 | 0 | 0 | 0 | 0.076923 | 1 | 0.153846 | false | 0 | 0.230769 | 0 | 0.538462 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
db116d889b8b1d94133fabaa9ee920a870375f4b | 839 | py | Python | pangram.py | ZorbaTheStrange/pangram | f9fda95f119d328224f21f19690122e36be34482 | [
"MIT"
] | null | null | null | pangram.py | ZorbaTheStrange/pangram | f9fda95f119d328224f21f19690122e36be34482 | [
"MIT"
] | null | null | null | pangram.py | ZorbaTheStrange/pangram | f9fda95f119d328224f21f19690122e36be34482 | [
"MIT"
] | null | null | null | #! /usr/bin/python3
'''
pangram.py - this program recognizes pangrams.

by zorba
'''
import sys


def pangram_check(sentence_or_word):
    '''
    checks the user input to see if it is a pangram.
    '''
    letters = set('abcdefghijklmnopqrstuvwxyz')
    if sentence_or_word.lower() == 'done':
        sys.exit()  # the user asked to quit
    for letter in sentence_or_word.lower():
        if letter in letters:
            letters.remove(letter)
    if len(letters) == 0:
        print('\nThe sentence or word is a pangram!')
    else:
        print('\nThis sentence or word is not a pangram.')


def main():
    '''
    main
    '''
    sentence_or_word = input('\nPlease enter a sentence or a word to check to see if it is a pangram: \nIf done, please type Done: ')
    pangram_check(sentence_or_word)


if __name__ == '__main__':
    sys.exit(main())
| 19.511628 | 131 | 0.623361 | 115 | 839 | 4.373913 | 0.469565 | 0.159046 | 0.194831 | 0.087475 | 0.178926 | 0.075547 | 0.075547 | 0 | 0 | 0 | 0 | 0.003247 | 0.265793 | 839 | 42 | 132 | 19.97619 | 0.813312 | 0.153754 | 0 | 0 | 0 | 0.058824 | 0.322388 | 0.038806 | 0 | 0 | 0 | 0 | 0 | 1 | 0.117647 | false | 0 | 0.058824 | 0 | 0.176471 | 0.117647 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
db13d0f32b95cfef64253a43f004918a6c18619d | 232 | py | Python | Chapter-4 Sequence/Dictionary.py | jaiswalIT02/pythonprograms | bc94e52121202b04c3e9112d9786f93ed6707f7a | [
"MIT"
] | null | null | null | Chapter-4 Sequence/Dictionary.py | jaiswalIT02/pythonprograms | bc94e52121202b04c3e9112d9786f93ed6707f7a | [
"MIT"
] | null | null | null | Chapter-4 Sequence/Dictionary.py | jaiswalIT02/pythonprograms | bc94e52121202b04c3e9112d9786f93ed6707f7a | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on Sat Oct 10 15:31:57 2020

@author: Tarun Jaiswal
"""
dictone = {
    "bookname": "Recursion Sutras",
    "subject": "Recursion",
    "author": "Champak Roy"
}

dicttwo = dict(dictone)
print(dicttwo)
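`dict(dictone)` builds a new dictionary with the same key/value pairs, so mutating the copy leaves the original untouched (a shallow copy). A quick self-contained check:

```python
dictone = {
    "bookname": "Recursion Sutras",
    "subject": "Recursion",
    "author": "Champak Roy"
}
dicttwo = dict(dictone)

# Mutating the copy does not affect the original dictionary.
dicttwo["subject"] = "Iteration"

print(dictone["subject"])  # Recursion
print(dicttwo["subject"])  # Iteration
```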
db1fd3d38056cafb0f7ff39c5a005804f923571f | 5,310 | py | Python | GoogleCloud/backend.py | ryanjsfx2424/HowToNFTs | f4cff7ad676d272815bd936eb142556f92540a32 | [
"MIT"
] | null | null | null | GoogleCloud/backend.py | ryanjsfx2424/HowToNFTs | f4cff7ad676d272815bd936eb142556f92540a32 | [
"MIT"
] | null | null | null | GoogleCloud/backend.py | ryanjsfx2424/HowToNFTs | f4cff7ad676d272815bd936eb142556f92540a32 | [
"MIT"
] | null | null | null | ## backend.py
"""
The purpose of this script is to continuously monitor the blockchain to
1) determine if a holder aquires or loses an NFT:
2) if they do, generate a new image/movie for the tokens they hold,
3) upload the new image/movie to the hosting service
4) update the metadata file
Repeat :)
(The above ordering matters!)
"""
## use python3!!!
import os
import io
import json
from web3 import Web3
## PARAMETERS
DEPLOYER_ADDRESS = "0x01656d41e041b50fc7c1eb270f7d891021937436"
INFURA_URL = "https://rinkeby.infura.io/v3/37de3193ccf345fe810932c3d0f103d8"
EXT_IMG = ".mp4"
EXT_METADATA = ".json"
ADDRESS = "0xb552E0dDd94EA72DBc089619115c81529cd8CA70" # address for deployed smart contract
## web3 stuff
w3 = Web3(Web3.HTTPProvider(INFURA_URL))
with open("../contract/abi_v020.json", "r") as fid:
rl = "".join(fid.readlines())
abi = json.loads(rl)
# end with open
## goal is to update token URI based on how many are held
## by that owner (but deployer doesn't count!)
contract = w3.eth.contract(address=ADDRESS, abi=abi)
totalSupply = contract.functions.totalSupply().call()
print("total supply: ", totalSupply)
for ii in range(totalSupply):
token = contract.functions.tokenByIndex(ii).call()
owner = contract.functions.ownerOf(token).call()
tokenList = contract.functions.walletOfOwner(owner).call()
## string comparison fails for some mysterious reason
if int(owner,16) == int(DEPLOYER_ADDRESS,16):
tokenList = [ii+1]
# end if
print("token: ", token)
print("owner: ", owner)
print("tokenList: ", tokenList)
newTokenName = str(token)
for jj in range(len(tokenList)):
if tokenList[jj] != token:
newTokenName += "_" + str(tokenList[jj])
# end if
# end for jj
print("newTokenName: ", newTokenName)
## first, check if metadata on hosting service has newTokenName.
## if so, we're good! If not, update it!
old_foos = []
metadata_correct = False
os.system("gsutil ls gs://how-to-nfts-metadata/foo" + str(token) + ".txt"
+ " > foo_file0.txt")
os.system("gsutil ls gs://how-to-nfts-metadata/foo" + str(token) + "_*.txt"
+ " > foo_file1.txt")
for jj in range(2):
with open("foo_file" + str(jj) + ".txt", "r") as fid:
for line in fid:
old_foos.append(line)
if "foo" + newTokenName + ".txt" in line:
metadata_correct = True
# end if
# end for
# end with
os.system("rm foo_file" + str(jj) + ".txt")
# end for jj
print("old_foos: ", old_foos)
if metadata_correct:
print("metadata correct (supposedly) so skipping")
continue
# end if
if len(old_foos) > 1:
print("error! only expected one old foo file.")
raise
# end if
old_foo = old_foos[0][:-1] # strip trailing newline character
old_foo = old_foo.split("metadata/")[1]
print("old_foo: ", old_foo)
## evidently metadata is not correct...
## first, we generate a new movie (if needed) and rsync with
## the GCP bucket.
## then, we'll update the metadata file, remove the old foo
## file and touch a new one
## then we'll rsync the metadata folder with the bucket.
target = "../nftmp4s/HowToKarate" + str(token) + ".mp4"
destination = "../nftmp4s/HowToKarate" + newTokenName + ".mp4"
if not os.path.exists(destination):
os.system("cp " + target + " " + destination)
for jj in range(len(tokenList)):
if tokenList[jj] != token:
print("destination: ", destination)
print("tokenList[jj]: ", tokenList[jj])
os.system('ffmpeg -y -i ' + destination + ' -i nftmp4s/HowToKarate' + str(tokenList[jj]) + '.mp4' + \
' -filter_complex "[0:v] [1:v]' + \
' concat=n=2:v=1 [v]"' + \
' -map "[v]" ' + "concat.mp4")
os.system("mv concat.mp4 " + destination)
# end if
# end for jj
## note, can rsync in parallel via rsync -m...
os.system("gsutil rsync ../nftmp4s/ gs://how-to-nfts-data/")
# end if
## next, we'll update the metadata file, remove the old foo
## file and touch a new one
## then we'll rsync the metadata folder with the bucket.
os.system("cp ../metadata/" + str(token) + ".json temp.json")
with open("../metadata/" + str(token) + ".json", "w") as fid_write:
with open("temp.json", "r") as fid_read:
for line in fid_read:
if '"image":' in line:
line = line.split("HowToKarate")[0] + "HowToKarate" + \
str(newTokenName) + '.mp4",\n'
# end i
fid_write.write(line)
# end for line
# end with open write
# end with open read
os.system("rm temp.json")
os.system("touch ../metadata/foo" + str(newTokenName) + ".txt")
os.system("rm ../metadata/" + old_foo)
## last, we need to update the _metadata file and then rsync.
with open("../metadata/_metadata.json", "w") as fid_write:
fid_write.write("{\n")
for jj in range(1,25):
with open("../metadata/" + str(jj) + ".json", "r") as fid_read:
for line in fid_read:
if "}" in line and len(line) == 2 and jj != 24:
line = "},\n"
# end if
fid_write.write(line)
# end for
# end with open
fid_write.write("}")
# end with open
os.system("gsutil rsync -d ../metadata/ gs://how-to-nfts-metadata/")
# end for ii
## end test.py
| 32.378049 | 109 | 0.628625 | 742 | 5,310 | 4.443396 | 0.284367 | 0.029117 | 0.016682 | 0.025478 | 0.184107 | 0.15226 | 0.138308 | 0.138308 | 0.138308 | 0.138308 | 0 | 0.031128 | 0.225612 | 5,310 | 163 | 110 | 32.576687 | 0.770671 | 0.26403 | 0 | 0.090909 | 1 | 0 | 0.263103 | 0.074316 | 0 | 0 | 0.021904 | 0 | 0 | 1 | 0 | false | 0 | 0.045455 | 0 | 0.045455 | 0.125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
db29c33820407f3d84d5b4458a06a85d146e75c7 | 1,130 | py | Python | core/analyser.py | hryu/cpu_usage_analyser | bc870c4fd3be873033a7f43612c1a0379d5bf419 | [
"MIT"
] | null | null | null | core/analyser.py | hryu/cpu_usage_analyser | bc870c4fd3be873033a7f43612c1a0379d5bf419 | [
"MIT"
] | null | null | null | core/analyser.py | hryu/cpu_usage_analyser | bc870c4fd3be873033a7f43612c1a0379d5bf419 | [
"MIT"
] | null | null | null | class Analyser:
    def __init__(self, callbacks, notifiers, state):
        self.cbs = callbacks
        self.state = state
        self.notifiers = notifiers

    def on_begin_analyse(self, timestamp):
        pass

    def on_end_analyse(self, timestamp):
        pass

    def analyse(self, event):
        event_name = event.name
        # for 'perf' tool
        split_event_name = event.name.split(':')
        if len(split_event_name) > 1:
            event_name = split_event_name[1].strip()

        if event_name in self.cbs:
            self.cbs[event_name](event)
        elif (event_name.startswith('sys_enter') or
              event_name.startswith('syscall_entry_')) and \
                'syscall_entry' in self.cbs:
            self.cbs['syscall_entry'](event)
        elif (event_name.startswith('sys_exit') or
              event_name.startswith('syscall_exit_')) and \
                'syscall_exit' in self.cbs:
            self.cbs['syscall_exit'](event)

    def notify(self, notification_id, **kwargs):
        if notification_id in self.notifiers:
            self.notifiers[notification_id](**kwargs)
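A minimal, self-contained sketch of how `Analyser.analyse` dispatches: event names like `perf:sched_switch` are split on `:` and the bare name is looked up in the callbacks dict. The `Event` class here is a hypothetical stand-in for whatever event objects the tracing backend produces.

```python
class Event:
    def __init__(self, name):
        self.name = name


seen = []
callbacks = {'sched_switch': lambda ev: seen.append(ev.name)}


def dispatch(event, cbs):
    # Same name-splitting logic as Analyser.analyse (without the
    # syscall_entry/syscall_exit fallback branches).
    event_name = event.name
    split_event_name = event.name.split(':')
    if len(split_event_name) > 1:
        event_name = split_event_name[1].strip()
    if event_name in cbs:
        cbs[event_name](event)


dispatch(Event('perf:sched_switch'), callbacks)   # matched after stripping the provider prefix
dispatch(Event('sched_switch'), callbacks)        # matched directly
dispatch(Event('irq_handler_entry'), callbacks)   # no callback registered, ignored
print(seen)  # ['perf:sched_switch', 'sched_switch']
```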
| 32.285714 | 60 | 0.604425 | 135 | 1,130 | 4.8 | 0.288889 | 0.180556 | 0.117284 | 0.060185 | 0.361111 | 0.166667 | 0 | 0 | 0 | 0 | 0 | 0.002488 | 0.288496 | 1,130 | 34 | 61 | 33.235294 | 0.803483 | 0.013274 | 0 | 0.074074 | 0 | 0 | 0.085355 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.185185 | false | 0.074074 | 0 | 0 | 0.222222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
db2d0faef6bb46b40a8c415250b0a2a6b57926d0 | 3,841 | py | Python | sugarpidisplay/sugarpiconfig/views.py | szpaku80/SugarPiDisplay | 793c288afaad1b1b6921b0d29ee0e6a537e42384 | [
"MIT"
] | 1 | 2022-02-12T20:39:20.000Z | 2022-02-12T20:39:20.000Z | sugarpidisplay/sugarpiconfig/views.py | szpaku80/SugarPiDisplay | 793c288afaad1b1b6921b0d29ee0e6a537e42384 | [
"MIT"
] | null | null | null | sugarpidisplay/sugarpiconfig/views.py | szpaku80/SugarPiDisplay | 793c288afaad1b1b6921b0d29ee0e6a537e42384 | [
"MIT"
] | null | null | null | """
Routes and views for the flask application.
"""
import os
import json
from flask import Flask, redirect, request, render_template, flash
from pathlib import Path
from flask_wtf import FlaskForm
from wtforms import StringField,SelectField,PasswordField,BooleanField
from wtforms.validators import InputRequired,ValidationError
from . import app
source_dexcom = 'dexcom'
source_nightscout = 'nightscout'
LOG_FILENAME="sugarpidisplay.log"
folder_name = '.sugarpidisplay'
config_file = 'config.json'
pi_sugar_path = os.path.join(str(Path.home()), folder_name)
Path(pi_sugar_path).mkdir(exist_ok=True)
def dexcom_field_check(form, field):
if (form.data_source.data == source_dexcom):
if (not field.data):
raise ValidationError('Field cannot be empty')
def nightscout_field_check(form, field):
if (form.data_source.data == source_nightscout):
if (not field.data):
raise ValidationError('Field cannot be empty')
class MyForm(FlaskForm):
class Meta:
csrf = False
data_source = SelectField(
'Data Source',
choices=[(source_dexcom, 'Dexcom'), (source_nightscout, 'Nightscout')]
)
use_animation = BooleanField('Use Animation')
dexcom_user = StringField('Dexcom UserName', validators=[dexcom_field_check])
dexcom_pass = PasswordField('Dexcom Password', validators=[dexcom_field_check])
ns_url = StringField('Nightscout URL', validators=[nightscout_field_check])
ns_token = StringField('Nightscout Access Token', validators=[nightscout_field_check])
@app.route('/hello')
def hello_world():
return 'Hello, World!'
@app.route('/success')
def success():
return 'Your device is configured. Now cycle the power and it will use the new settings'
@app.route('/', methods=('GET', 'POST'))
def setup():
form = MyForm()
if request.method == 'POST':
if form.validate() == False:
flash('Fields are missing.')
return render_template('setup.html', form=form)
else:
handle_submit(form)
return redirect('/success')
#if form.is_submitted():
loadData(form)
return render_template('setup.html', form=form)
def handle_submit(form):
config = { 'data_source': form.data_source.data }
config['use_animation'] = form.use_animation.data
if (form.data_source.data == source_dexcom):
config['dexcom_username'] = form.dexcom_user.data
config['dexcom_password'] = form.dexcom_pass.data
else:
config['nightscout_url'] = form.ns_url.data
config['nightscout_access_token'] = form.ns_token.data
    with open(os.path.join(pi_sugar_path, config_file), "w") as f:
        json.dump(config, f, indent=4)
def loadData(form):
config_full_path = os.path.join(pi_sugar_path, config_file)
if (not Path(config_full_path).exists()):
return
    try:
        with open(config_full_path, "r") as f:
            config = json.load(f)
        if 'data_source' in config:
            form.data_source.data = config['data_source']
        if config.get('data_source') == source_dexcom:
            if 'dexcom_username' in config:
                form.dexcom_user.data = config['dexcom_username']
            if 'dexcom_password' in config:
                form.dexcom_pass.data = config['dexcom_password']
        if config.get('data_source') == source_nightscout:
            if 'nightscout_url' in config:
                form.ns_url.data = config['nightscout_url']
            if 'nightscout_access_token' in config:
                form.ns_token.data = config['nightscout_access_token']
        form.use_animation.data = config.get('use_animation', False)
    except (OSError, ValueError):
        pass
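The dexcom/nightscout validators above are the same "required only for the selected source" rule written twice; a dependency-free sketch of a parameterised version (names here are illustrative, not part of sugarpidisplay):

```python
class ValidationError(Exception):
    """Stands in for wtforms.validators.ValidationError."""

def required_for_source(source_name):
    """Build a validator that rejects empty fields only while the form's
    data_source equals source_name."""
    def check(form, value):
        if form.get("data_source") == source_name and not value:
            raise ValidationError("Field cannot be empty")
    return check

dexcom_field_check = required_for_source("dexcom")

# An empty Dexcom field is fine while Nightscout is selected ...
dexcom_field_check({"data_source": "nightscout"}, "")

# ... but rejected once Dexcom is the active data source.
try:
    dexcom_field_check({"data_source": "dexcom"}, "")
    rejected = False
except ValidationError:
    rejected = True
print(rejected)  # True
```

With wtforms itself, the same factory could replace both hand-written validators.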
| 35.564815 | 93 | 0.664931 | 471 | 3,841 | 5.208068 | 0.26327 | 0.06115 | 0.028536 | 0.03669 | 0.293518 | 0.254382 | 0.15002 | 0.104362 | 0.079087 | 0.079087 | 0 | 0.000332 | 0.215309 | 3,841 | 107 | 94 | 35.897196 | 0.813537 | 0.039573 | 0 | 0.136364 | 0 | 0 | 0.179125 | 0.018755 | 0 | 0 | 0 | 0 | 0 | 1 | 0.079545 | false | 0.068182 | 0.090909 | 0.022727 | 0.329545 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
db2d5607d06728d0c91675bdab230c329ed3e400 | 2,001 | py | Python | progressao_aritmeticav3.py | eduardobaltazarmarfim/PythonC | 8e44b4f191582c73cca6df98120ab142145c4ba1 | [
"MIT"
] | null | null | null | progressao_aritmeticav3.py | eduardobaltazarmarfim/PythonC | 8e44b4f191582c73cca6df98120ab142145c4ba1 | [
"MIT"
] | null | null | null | progressao_aritmeticav3.py | eduardobaltazarmarfim/PythonC | 8e44b4f191582c73cca6df98120ab142145c4ba1 | [
"MIT"
] | null | null | null | def retorno():
    resp = input('Do you want to run the program again? [y/n] ')
    if resp in ('Y', 'y'):
        verificar()
    else:
        print('Process finished successfully!')
def cabecalho(titulo):
    print('-' * 30)
    print(' ' * 9 + titulo + ' ' * 15)
    print('-' * 30)

def mensagem_erro():
    print('Invalid input!')
def verificar():
    try:
        cabecalho('PA Progression')
        num = int(input('Enter the first term: '))
        numPA = int(input('Enter the common difference: '))
    except ValueError:
        mensagem_erro()
        retorno()
    else:
        contagem = 0
        # the first pass always shows ten terms, then the user picks how many more
        loop = 10
        while loop > 0:
            cont = 1
            while cont <= loop:
                if cont >= loop:
                    print('{} -> PAUSE\n'.format(num), end='')
                else:
                    print('{} -> '.format(num), end='')
                cont += 1
                num += numPA
                contagem += 1
            loop = int(input('How many more terms do you want to show? '))
        print('Progression finished with {} terms shown'.format(contagem))
        retorno()
verificar() | 17.4 | 91 | 0.410795 | 183 | 2,001 | 4.480874 | 0.344262 | 0.039024 | 0.058537 | 0.041463 | 0.446341 | 0.446341 | 0.446341 | 0.446341 | 0.446341 | 0.446341 | 0 | 0.024209 | 0.463268 | 2,001 | 115 | 92 | 17.4 | 0.739292 | 0 | 0 | 0.606557 | 0 | 0 | 0.186813 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.065574 | false | 0.065574 | 0 | 0 | 0.065574 | 0.180328 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
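The program above builds the progression by repeated addition; the closed form a_n = a1 + (n - 1) * r yields the same terms directly. A small standalone sketch:

```python
def pa_term(a1, r, n):
    """nth term (1-based) of an arithmetic progression: a1 + (n - 1) * r."""
    return a1 + (n - 1) * r

def pa_terms(a1, r, count):
    """First `count` terms of the progression as a list."""
    return [pa_term(a1, r, n) for n in range(1, count + 1)]

print(pa_terms(5, 3, 10))  # [5, 8, 11, 14, 17, 20, 23, 26, 29, 32]
```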
db308acc7784941bed9244b19f0ab77519bcb972 | 512 | py | Python | unfollow_parfum.py | AntonPukhonin/InstaPy | 0c480474ec39e174fa4256b48bc25bc4ecf7b6aa | [
"MIT"
] | null | null | null | unfollow_parfum.py | AntonPukhonin/InstaPy | 0c480474ec39e174fa4256b48bc25bc4ecf7b6aa | [
"MIT"
] | null | null | null | unfollow_parfum.py | AntonPukhonin/InstaPy | 0c480474ec39e174fa4256b48bc25bc4ecf7b6aa | [
"MIT"
] | null | null | null | import os

from instapy import InstaPy

# Credentials come from the environment instead of being hard-coded in the script.
insta_username = os.environ.get('INSTA_USERNAME', 'tonparfums')
insta_password = os.environ['INSTA_PASSWORD']

session = None
try:
    session = InstaPy(username=insta_username,
                      password=insta_password,
                      headless_browser=True,
                      multi_logs=True)
    session.login()
    session.unfollow_users(amount=200, onlyInstapyFollowed=True,
                           onlyInstapyMethod='FIFO', unfollow_after=6 * 24 * 60 * 60)
finally:
    if session is not None:
        session.end()
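The try/finally shape above only cleans up safely if the session variable is guaranteed to exist when finally runs; a dependency-free sketch of that guard pattern (the Session class is a stand-in, not part of InstaPy):

```python
events = []

class Session:
    def __init__(self, ok=True):
        if not ok:
            raise RuntimeError("login service unreachable")
        events.append("opened")

    def end(self):
        events.append("closed")

def run(ok):
    session = None          # defined before try, so `finally` can test it
    try:
        session = Session(ok=ok)
    finally:
        if session is not None:
            session.end()   # only clean up what was actually created

run(True)                   # opens and closes normally
try:
    run(False)              # constructor fails; no dangling .end() call
except RuntimeError:
    pass
```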
| 24.380952 | 122 | 0.667969 | 51 | 512 | 6.509804 | 0.607843 | 0.11747 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.03599 | 0.240234 | 512 | 20 | 123 | 25.6 | 0.817481 | 0.119141 | 0 | 0 | 0 | 0 | 0.053571 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.166667 | 0.083333 | 0 | 0.083333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
db3b169862361f20c4e85e1f3babf59d22b794c5 | 10,622 | py | Python | src/lib/GL/glutbindings/glutbind.py | kokizzu/v8cgi | eafd3bd7a5dd1d60e2f1483701a52e7ac0ae0eba | [
"BSD-3-Clause"
] | 4 | 2016-01-31T08:49:35.000Z | 2021-07-12T17:31:42.000Z | src/lib/GL/glutbindings/glutbind.py | kokizzu/v8cgi | eafd3bd7a5dd1d60e2f1483701a52e7ac0ae0eba | [
"BSD-3-Clause"
] | null | null | null | src/lib/GL/glutbindings/glutbind.py | kokizzu/v8cgi | eafd3bd7a5dd1d60e2f1483701a52e7ac0ae0eba | [
"BSD-3-Clause"
] | 1 | 2021-06-03T22:51:17.000Z | 2021-06-03T22:51:17.000Z | import sys
import re
PATH_GLUT = 'glut.h'
FILE_GLUT = 'glutbind.cpp'
TEMPLATES = ['glutInit', 'glutTimerFunc']
def main():
"""
Still some things have to be hand-made, like
changing argv pargc values in the glutInit method definition
Also change the TimerFunc method with some magic.
"""
make_glut()
def make_glut():
constants = []
functions = []
void_stars = []
constant = re.compile(".+define[\s]+GLUT_([^\s]+).*")
function = re.compile("[\s]*extern[\s]+([^\s]+)[\s]+APIENTRY[\s]+glut([A-Za-z0-9]+)\((.*)\);")
text_out = []
fin = open(PATH_GLUT, 'r')
for l in fin:
mat = re.match(constant, l)
if mat and not mat.group(1) in constants:
name = mat.group(1)
constants.append(name)
text_out.append(make_constant("GLUT", name))
if name.find("STROKE") != -1 or name.find("BITMAP") != -1:
void_stars.append(name)
#print "GLUT_" + mat.group(1) + "\n"
else:
mat = re.match(function, l)
if mat:
prefix = "glut"
return_val = mat.group(1)
name = mat.group(2)
params = mat.group(3)
functions.append(name)
#if has template then take the template code
if (prefix + name) in TEMPLATES:
t = open(prefix + name + '.template', 'r')
text_out.append(t.read())
t.close()
else:
has_lambda, count, params_list = get_param_list(params)
if has_lambda is True and count == 1:
text_out.append(make_function_with_callback(prefix, name, params_list, return_val))
else:
text_out.append(make_function(prefix, name, params_list, count, return_val))
#print return_val + " " + name + " " + params
fin.close()
fout = open(FILE_GLUT, 'w')
fout.write("""
#include "glutbind.h"
int* pargc_;
char** argv_;
map<const char*, void*> font_;
Persistent<Context> GlutFactory::glut_persistent_context;
""" + '\n'.join(text_out) + make_main_glut_function(constants, functions, void_stars))
fout.close()
def make_main_glut_function(constants, functions, void_stars):
text_out_begin = """
Handle<ObjectTemplate> GlutFactory::createGlut(int* pargc, char** argv) {
pargc_ = pargc;
argv_ = argv;
HandleScope handle_scope;
Handle<ObjectTemplate> Glut = ObjectTemplate::New();
Glut->SetInternalFieldCount(1);
"""
text_out_end = """
// Again, return the result through the current handle scope.
return handle_scope.Close(Glut);
}
"""
fnt = [bind_font(name) for name in void_stars]
cts = [bind_accessor("Glut", name) for name in constants]
fts = [bind_function("Glut", name) for name in functions]
return text_out_begin + '\n'.join(fnt) + '\n'.join(cts) + '\n'.join(fts) + text_out_end
def make_constant(prefix, name):
if name.find("BITMAP") != -1 or name.find("STROKE") != -1:
return_val = "return String::New(\""+ name +"\");\n"
else:
return_val = "return Uint32::New(GLUT_"+ name +");"
text_out = """
Handle<Value> GetGLUT_%%(Local<String> property,
const AccessorInfo &info) {
##
}
"""
return multiple_replace({
'%%': name,
'##': return_val
}, text_out)
def make_function(prefix, name, params_list, count, return_val):
text_out = """
Handle<Value> GLUT<name>Callback(const Arguments& args) {
//if less that nbr of formal parameters then do nothing
if (args.Length() < <len_params>) return v8::Undefined();
//define handle scope
HandleScope scope;
//get arguments
<args>
//make call
<call>
return v8::Undefined();
}
"""
return multiple_replace({
'<name>': name,
'<len_params>': str(count),
'<args>': make_args(params_list, count),
'<call>': make_call(prefix + name, params_list, count)
}, text_out)
def make_function_with_callback(prefix, name, params_list, return_val):
text_out = """
Persistent<Function> persistent<name>;
<prototype> {
//define handle scope
HandleScope scope;
Handle<Value> valueArr[<nformalparams>];
<formalparamassignment>
TryCatch try_catch;
Handle<Value> result = persistent<name>->Call(GlutFactory::glut_persistent_context->Global(), <nformalparams>, valueArr);
if (result.IsEmpty()) {
String::Utf8Value error(try_catch.Exception());
fprintf(stderr, "Exception in <name>: %s\\n", *error);
}
}
Handle<Value> GLUT<name>Callback(const Arguments& args) {
//if less that nbr of formal parameters then do nothing
if (args.Length() < 1 || !args[0]->IsFunction()) return v8::Undefined();
//get arguments
//delete previous assigned function
persistent<name>.Dispose();
Handle<Function> value0 = Handle<Function>::Cast(args[0]);
persistent<name> = Persistent<Function>::New(value0);
//make call
glut<name>((<signature>) func<name>);
return v8::Undefined();
}
"""
nformalparams, prototype = make_prototype(name, params_list[0])
signature = params_list[0].replace('func', '')
formalparamassignment = formal_param_assignment(signature)
return multiple_replace({
'<name>': name,
'<nformalparams>': str(nformalparams),
'<prototype>': prototype,
'<formalparamassignment>': formalparamassignment,
'<signature>': signature
}, text_out)
def make_prototype(name, signature):
    print('prev ' + signature)
    signature = signature.replace('(*func)', 'func' + name)
    ht = signature.split('(')
    hd, tail = ht[0], ht[1].replace(')', '')
    ans = [get_type(''.join(val), False) + ' arg' + str(i) for i, val in enumerate(tail.split(',')) if val.find('void') == -1]
    print('end ' + hd + ' ( ' + ','.join(ans) + ')')
    return len(ans), hd + ' ( ' + ','.join(ans) + ')'
def formal_param_assignment(signature):
print "signature"
print signature
pat = re.compile('[\s]*[a-zA-Z0-9\*]+[\s]*\(\*[\s]*\)\((.*)\)')
pars = re.match(pat, signature)
if pars:
pars = pars.group(1).split(',')
ans = []
for i, val in enumerate(pars):
if val.find('int') != -1 or val.find('unsigned char') != -1:
ans.append(" valueArr[" + str(i) + "] = Integer::New(arg" + str(i) + ");")
elif val.find('float') != -1 or val.find('double') != -1:
ans.append(" valueArr[" + str(i) + "] = Number::New(arg" + str(i) + ");")
elif val.find('char*') != -1:
ans.append(" valueArr[" + str(i) + "] = String::New(arg" + str(i) + ");")
return '\n'.join(ans)
else:
return ''
def get_param_list(params):
params_list = []
params_aux = params.split(',')
passed = False
for par in params_aux:
if passed and params_list[-1].count('(') != params_list[-1].count(')'):
params_list[-1] += ',' + par
else:
params_list.append(par)
passed = True
aux = len(params_list)
if aux == 1 and params_list[0].find('func') == -1 and len(params_list[0].strip().split(' ')) == 1:
nb = 0
else:
nb = aux
return ' '.join(params_list).find('func') != -1, nb, params_list
def make_args(params_list, count):
ans = []
for i in range(count):
el = params_list[i]
type = get_type(el)
#is function
if type.find('(*') != -1:
ans.append(" Handle<Function> value" + str(i) + " = Handle<Function>::Cast(args[" + str(i) + "]);\n void* arg" + str(i) + " = *value" + str(i) + ";\n")
#print "function " + type
#is string
elif type.find('char*') != -1:
ans.append(" String::Utf8Value value"+ str(i) +"(args["+ str(i) +"]);\n char* arg" + str(i) + " = *value"+ str(i) +";\n")
#print "string " + type
#is void*
elif type.find('void*') != -1:
ans.append(" String::Utf8Value value"+ str(i) +"(args["+ str(i) +"]);\n char* key" + str(i) + " = *value"+ str(i) +";\n void* arg" + str(i) + " = font_[key"+ str(i) +"];\n")
#print "void " + type
#is array
elif type.find('*') != -1:
ans.append(" Handle<Array> arg" + str(i) + " = Array::Cast(args[" + str(i) + "]);\n")
#print "array " + type
#is unsigned integer
elif type.find('unsigned int') != -1:
ans.append(" unsigned int arg" + str(i) + " = args["+ str(i) +"]->Uint32Value();\n")
#print "unsigned int " + type
#is integer
elif type.find('int') != -1 or type.find('enum') != -1:
ans.append(" int arg" + str(i) + " = args["+ str(i) +"]->IntegerValue();\n")
#print "integer " + type
#is double, float
elif type.find('double') != -1 or type.find('float') != -1:
ans.append(" double arg" + str(i) + " = args["+ str(i) +"]->NumberValue();\n")
#print "double " + type
else:
            print("don't know what this is")
            print(type)
return ''.join(ans)
def make_call(name, params_list, nb):
return name + "(" + ", ".join([get_type(params_list[i]) + "arg" + str(i) for i in range(nb)]) + ");"
def bind_accessor(prefix, name):
return " " + prefix + "->SetAccessor(String::NewSymbol(\"" + name + "\"), GetGLUT_" + name + ");\n"
def bind_function(prefix, name):
return " " + prefix + "->Set(String::NewSymbol(\"" + name + "\"), FunctionTemplate::New(GLUT" + name + "Callback));\n"
def bind_font(name):
return " font_[\""+ name +"\"] = GLUT_" + name + ";\n"
def get_type(t, parens=True):
if t.find('(*') != -1 or t.find('func') != -1:
ans = t.replace('func', '')
else:
ans = ' '.join(t.strip().split(' ')[:-1]) + '*' * (t.strip().split(' ')[-1].count('*'))
return '(' + ans + ')' if parens else ans
def multiple_replace(mapping, text):
    """ Replace in 'text' all occurrences of any key in the given
    dictionary by its corresponding value. Returns the new string."""
    # Create a regular expression from the dictionary keys
    regex = re.compile("(%s)" % "|".join(map(re.escape, mapping.keys())))
    # For each match, look-up corresponding value in dictionary
    return regex.sub(lambda mo: mapping[mo.string[mo.start():mo.end()]], text)
main()
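Template expansion in this generator leans on the one-pass replacement helper; a standalone copy shows the substitution in a single regex pass (the template string is illustrative):

```python
import re

def multiple_replace(mapping, text):
    """Replace every key of `mapping` found in `text` in one regex pass."""
    regex = re.compile("(%s)" % "|".join(map(re.escape, mapping.keys())))
    return regex.sub(lambda mo: mapping[mo.group(0)], text)

template = "Handle<Value> GLUT<name>Callback(const Arguments& args) // <len_params> args"
filled = multiple_replace({"<name>": "Init", "<len_params>": "2"}, template)
print(filled)
```

Because all keys are alternatives in one compiled pattern, a substituted value can never itself be rewritten by a later key, unlike chained `str.replace` calls.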
| 33.828025 | 188 | 0.553568 | 1,287 | 10,622 | 4.464646 | 0.189588 | 0.020884 | 0.014619 | 0.017403 | 0.223112 | 0.160808 | 0.144448 | 0.10999 | 0.085973 | 0.069962 | 0 | 0.008604 | 0.266899 | 10,622 | 313 | 189 | 33.936102 | 0.729292 | 0.046978 | 0 | 0.182222 | 0 | 0.013333 | 0.320265 | 0.08392 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.013333 | 0.008889 | null | null | 0.031111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e1d379ffe45c72193de30757e4bad02874d4385a | 2,687 | py | Python | iMessSpam.py | fabiopigi/iMessageSpam | 4d1984f5286f5cf0229d414470a4dc60e5ba12d2 | [
"MIT"
] | null | null | null | iMessSpam.py | fabiopigi/iMessageSpam | 4d1984f5286f5cf0229d414470a4dc60e5ba12d2 | [
"MIT"
] | null | null | null | iMessSpam.py | fabiopigi/iMessageSpam | 4d1984f5286f5cf0229d414470a4dc60e5ba12d2 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
#import some dope
import sys
import os
import re
import time
from random import randrange
from itertools import repeat
numbers = {
'adam' :"+41111111111",
'bob' :"+41222222222",
'chris' :"+41333333333",
'dave' :"+41444444444",
}
print "Gespeicherte Empfänger: "
for name in numbers:
print "%10s - %s"%(name,numbers[name])
number = ""
while number == "":
    numberID = raw_input("\nEnter recipient: ")
if numberID in numbers:
number = numbers[numberID]
pause = int(raw_input("\nInterval in seconds: "))
print """
Verfügbare Optionen:
[1] Zeitansagen im Format 'Es ist 17:34:22'
[2] Zufällige 'Chuck Norris' Jokes
[3] Satz für Satz aus einem Buch (Twilight)
[4] Fifty Shades of HEX
[5] Fröhliches Flaggen raten
"""
option = int(raw_input("Option auswählen: "))
if option == 1:
    anzahl = int(raw_input("\nNumber of messages: "))
start = 0
elif option == 2:
    anzahl = int(raw_input("\nNumber of messages: "))
    start = 0
    replaceName = raw_input("\nReplace 'Chuck Norris' with a name: ")
    if replaceName == "":
        replaceName = "Chuck Norris"
elif option == 3:
p = open('content/twilight.txt')
book = p.read()
pat = re.compile(r'([A-Z][^\.!?]*[\.!?])', re.M)
sentences = pat.findall(book)
    anzahl = int(raw_input("\nNumber of messages: "))
    start = int(raw_input("\nStart at sentence number: ")) - 1
anzahl = anzahl + (start)
elif option == 4:
anzahl = 50
start = 0
elif option == 5:
anzahl = 50
start = 0
import Countries
else:
anzahl = 0
start = 0
print "\n\nSenden beginnt...\n\n"
#tunay bei 207
for i in range(start,anzahl,1):
if option == 1:
cmdCode = "date +'%H:%M:%S'"
message = "Es ist jetzt " + os.popen(cmdCode).read()
elif option == 2:
curlCode = "curl 'http://api.icndb.com/jokes/random' -s | sed -e 's/.*joke\\\": \\\"//' -e 's/\\\", \\\".*//' -e 's/Chuck Norris/" + replaceName + "/g' -e 's/"/\"/g'"
message = os.popen(curlCode).read()
elif option == 3:
message = sentences[i]
elif option == 4:
message = "#%s" % "".join(list(repeat(hex(randrange(16, 255))[2:],3))).upper()
elif option == 5:
flags = os.listdir("content/flags")
country = Countries.iso[flags[randrange(1,len(flags))][:2]]
message = "Dies ist die Flagge von '%s'."%(country["Name"])
filePath = os.path.abspath("content/flags/%s.png"%country["ISO"])
osaCode = "osascript sendImage.scpt \"%s\" \"%s\""%(number,filePath)
osaReturn = os.popen(osaCode).read()
    print(message)
    message = message.replace('"', r'\"')
    osaCode = "osascript sendText.scpt \"%s\" \"%s\"" % (number, message)
    print("%3d > %s" % ((i + 1), message))
    osaReturn = os.popen(osaCode).read()
time.sleep(pause)
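Option 4 above repeats one random byte three times, so every "colour" is actually a shade of grey (the joke behind the name); the range starts at 16 so the hex encoding is always two digits. A standalone sketch of that construction:

```python
from random import randrange

def random_grey_hex():
    """One byte in 16..254 (always two hex digits), repeated three times."""
    byte = "%02X" % randrange(16, 255)
    return "#" + byte * 3

print(random_grey_hex())
```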
| 23.163793 | 175 | 0.628582 | 367 | 2,687 | 4.580381 | 0.433243 | 0.038073 | 0.039262 | 0.030339 | 0.1047 | 0.072576 | 0.072576 | 0.04878 | 0 | 0 | 0 | 0.042438 | 0.175661 | 2,687 | 115 | 176 | 23.365217 | 0.716479 | 0.018608 | 0 | 0.268293 | 0 | 0.02439 | 0.329658 | 0.007985 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.085366 | null | null | 0.073171 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e1dd3f3740e16e48cf7fbe8dce94d776bef908fd | 1,139 | py | Python | tests/encoding-utils/test_big_endian_integer.py | carver/ethereum-utils | 7ec2495b25107776cb4e0e4a79af8a8c64f622c4 | [
"MIT"
] | null | null | null | tests/encoding-utils/test_big_endian_integer.py | carver/ethereum-utils | 7ec2495b25107776cb4e0e4a79af8a8c64f622c4 | [
"MIT"
] | null | null | null | tests/encoding-utils/test_big_endian_integer.py | carver/ethereum-utils | 7ec2495b25107776cb4e0e4a79af8a8c64f622c4 | [
"MIT"
] | null | null | null | from __future__ import unicode_literals
import pytest
from hypothesis import (
strategies as st,
given,
)
from eth_utils.encoding import (
int_to_big_endian,
big_endian_to_int,
)
@pytest.mark.parametrize(
'as_int,as_big_endian',
(
(0, b'\x00'),
(1, b'\x01'),
(7, b'\x07'),
(8, b'\x08'),
(9, b'\x09'),
(256, b'\x01\x00'),
(2**256 - 1, b'\xff' * 32),
),
)
def test_big_endian_conversions(as_int, as_big_endian):
as_int_result = big_endian_to_int(as_big_endian)
assert as_int_result == as_int
as_big_endian_result = int_to_big_endian(as_int)
assert as_big_endian_result == as_big_endian
@given(value=st.integers(min_value=0, max_value=2**256 - 1))
def test_big_endian_round_trip_from_int(value):
result = big_endian_to_int(int_to_big_endian(value))
assert result == value
@given(
value=st.binary(min_size=1, max_size=32).map(
lambda v: v.lstrip(b'\x00') or b'\x00'
)
)
def test_big_endian_round_trip_from_big_endian(value):
result = int_to_big_endian(big_endian_to_int(value))
assert result == value
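The conversions under test correspond to Python's built-in int.to_bytes/int.from_bytes; a quick stdlib-only cross-check of the same cases (minimal stand-ins, not the eth_utils implementation):

```python
def int_to_big_endian(value):
    """Big-endian bytes with no leading zeros (at least one byte)."""
    return value.to_bytes((value.bit_length() + 7) // 8 or 1, "big")

def big_endian_to_int(data):
    return int.from_bytes(data, "big")

print(int_to_big_endian(256))                          # b'\x01\x00'
print(big_endian_to_int(b"\xff" * 32) == 2**256 - 1)   # True
```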
| 22.78 | 60 | 0.665496 | 181 | 1,139 | 3.790055 | 0.292818 | 0.236152 | 0.09621 | 0.081633 | 0.332362 | 0.166181 | 0.166181 | 0.081633 | 0 | 0 | 0 | 0.046512 | 0.207199 | 1,139 | 49 | 61 | 23.244898 | 0.713178 | 0 | 0 | 0.051282 | 0 | 0 | 0.052678 | 0 | 0 | 0 | 0 | 0 | 0.102564 | 1 | 0.076923 | false | 0 | 0.102564 | 0 | 0.179487 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e1dfa37abe08ed294d2a701673731176a4e461e5 | 3,500 | py | Python | jamf/setconfig.py | pythoninthegrass/python-jamf | f71a44f4565fc2824ce6daf536359d563ab75ea3 | [
"MIT"
] | 25 | 2020-11-02T18:16:22.000Z | 2022-03-07T04:36:14.000Z | jamf/setconfig.py | pythoninthegrass/python-jamf | f71a44f4565fc2824ce6daf536359d563ab75ea3 | [
"MIT"
] | 17 | 2020-12-22T19:24:05.000Z | 2022-03-02T22:39:04.000Z | jamf/setconfig.py | pythoninthegrass/python-jamf | f71a44f4565fc2824ce6daf536359d563ab75ea3 | [
"MIT"
] | 12 | 2020-10-28T19:03:29.000Z | 2022-03-01T08:29:52.000Z | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Jamf Config
"""
__author__ = "Sam Forester"
__email__ = "sam.forester@utah.edu"
__copyright__ = "Copyright (c) 2020 University of Utah, Marriott Library"
__license__ = "MIT"
__version__ = "1.0.4"
import argparse
import getpass
import jamf
import logging
import platform
import pprint
import sys
from os import path
class Parser:
def __init__(self):
        myplatform = platform.system()
        if myplatform == "Darwin":
            default_pref = jamf.config.MACOS_PREFS_TILDA
        elif myplatform == "Linux":
            default_pref = jamf.config.LINUX_PREFS_TILDA
        else:
            # fall back to the Linux location on other platforms instead of
            # failing later with an undefined default_pref
            default_pref = jamf.config.LINUX_PREFS_TILDA
self.parser = argparse.ArgumentParser()
self.parser.add_argument(
"-H", "--hostname", help="specify hostname (default: prompt)"
)
self.parser.add_argument(
"-u", "--user", help="specify username (default: prompt)"
)
self.parser.add_argument(
"-p", "--passwd", help="specify password (default: prompt)"
)
self.parser.add_argument(
"-C",
"--config",
dest="path",
metavar="PATH",
default=default_pref,
help=f"specify config file (default {default_pref})",
)
self.parser.add_argument(
"-P",
"--print",
action="store_true",
help="print existing config profile (except password!)",
)
self.parser.add_argument(
"-t",
"--test",
action="store_true",
help="Connect to the Jamf server using the config file",
)
def parse(self, argv):
"""
:param argv: list of arguments to parse
:returns: argparse.NameSpace object
"""
return self.parser.parse_args(argv)
def setconfig(argv):
logger = logging.getLogger(__name__)
args = Parser().parse(argv)
logger.debug(f"args: {args!r}")
if args.path:
config_path = args.path
else:
        myplatform = platform.system()
        if myplatform == "Darwin":
            default_pref = jamf.config.MACOS_PREFS_TILDA
        elif myplatform == "Linux":
            default_pref = jamf.config.LINUX_PREFS_TILDA
        else:
            # fall back to the Linux location on other platforms
            default_pref = jamf.config.LINUX_PREFS_TILDA
        config_path = default_pref
if config_path[0] == "~":
config_path = path.expanduser(config_path)
if args.test:
api = jamf.API(config_path=config_path)
pprint.pprint(api.get("accounts"))
elif args.print:
conf = jamf.config.Config(prompt=False, explain=True, config_path=config_path)
print(conf.hostname)
print(conf.username)
if conf.password:
print("Password is set")
else:
print("Password is not set")
else:
if args.hostname:
hostname = args.hostname
else:
hostname = jamf.config.prompt_hostname()
if args.user:
user = args.user
else:
user = input("username: ")
if args.passwd:
passwd = args.passwd
else:
passwd = getpass.getpass()
conf = jamf.config.Config(
hostname=hostname, username=user, password=passwd, prompt=False
)
conf.save(config_path=config_path)
def main():
fmt = "%(asctime)s: %(levelname)8s: %(name)s - %(funcName)s(): %(message)s"
logging.basicConfig(level=logging.INFO, format=fmt)
setconfig(sys.argv[1:])
if __name__ == "__main__":
main()
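The Parser class above is a thin wrapper that hands argv to argparse and returns the namespace; a dependency-free sketch of the same shape (flag names copied from the file, the default path is illustrative):

```python
import argparse

class Parser:
    def __init__(self, default_pref="~/.config/jamf.plist"):
        self.parser = argparse.ArgumentParser()
        self.parser.add_argument("-H", "--hostname", help="specify hostname")
        self.parser.add_argument("-C", "--config", dest="path", metavar="PATH",
                                 default=default_pref)
        self.parser.add_argument("-P", "--print", action="store_true")

    def parse(self, argv):
        # argv excludes the program name, e.g. sys.argv[1:]
        return self.parser.parse_args(argv)

args = Parser().parse(["-H", "jamf.example.org", "-P"])
print(args.hostname, args.path, args.print)
```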
| 28.225806 | 86 | 0.575429 | 382 | 3,500 | 5.08377 | 0.332461 | 0.056643 | 0.040165 | 0.064882 | 0.201339 | 0.189495 | 0.136972 | 0.136972 | 0.136972 | 0.136972 | 0 | 0.004926 | 0.304 | 3,500 | 123 | 87 | 28.455285 | 0.792282 | 0.04 | 0 | 0.237624 | 0 | 0.009901 | 0.176807 | 0.006325 | 0 | 0 | 0 | 0 | 0 | 1 | 0.039604 | false | 0.09901 | 0.079208 | 0 | 0.138614 | 0.089109 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
e1e00ce354ffc24242ad31b4a0c1c5120baf617a | 979 | py | Python | src/menuResponse/migrations/0001_initial.py | miguelaav/dev | 5ade9d0b393f48c9cc3b160b6ede4a03c29addea | [
"bzip2-1.0.6"
] | null | null | null | src/menuResponse/migrations/0001_initial.py | miguelaav/dev | 5ade9d0b393f48c9cc3b160b6ede4a03c29addea | [
"bzip2-1.0.6"
] | 6 | 2020-06-05T20:02:33.000Z | 2022-03-11T23:43:11.000Z | src/menuResponse/migrations/0001_initial.py | miguelaav/dev | 5ade9d0b393f48c9cc3b160b6ede4a03c29addea | [
"bzip2-1.0.6"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by Django 1.11.20 on 2019-03-12 17:41
from __future__ import unicode_literals
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
initial = True
dependencies = [
('menuCreate', '0001_initial'),
('menu', '0002_remove_menu_slug'),
]
operations = [
migrations.CreateModel(
name='MenuResponseModel',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('comments', models.CharField(max_length=200)),
('date', models.DateField(auto_now_add=True)),
('MenuID', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='menuCreate.MenuCreateModel')),
('option', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='menu.Menu')),
],
),
]
| 32.633333 | 124 | 0.622063 | 105 | 979 | 5.638095 | 0.619048 | 0.054054 | 0.070946 | 0.111486 | 0.185811 | 0.185811 | 0.185811 | 0.185811 | 0.185811 | 0.185811 | 0 | 0.038978 | 0.240041 | 979 | 29 | 125 | 33.758621 | 0.75672 | 0.07048 | 0 | 0 | 1 | 0 | 0.140022 | 0.051819 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e1e2d0a67c83cc0cf6dbbc60b3dc2efff897636e | 10,889 | py | Python | datacube/drivers/s3/storage/s3aio/s3aio.py | Zac-HD/datacube-core | ebc2025b6fb9d22fb406cdf5f79eba6d144c57e3 | [
"Apache-2.0"
] | 2 | 2018-12-02T11:33:50.000Z | 2021-04-24T11:42:42.000Z | datacube/drivers/s3/storage/s3aio/s3aio.py | Zac-HD/datacube-core | ebc2025b6fb9d22fb406cdf5f79eba6d144c57e3 | [
"Apache-2.0"
] | 103 | 2018-03-21T15:00:05.000Z | 2020-06-04T05:40:25.000Z | datacube/drivers/s3/storage/s3aio/s3aio.py | roarmstrong/datacube-core | 5e38638dabd9e5112e92b503fae6a83c8dcc4902 | [
"Apache-2.0"
] | null | null | null | """
S3AIO Class
Array access to a single S3 object
"""
from __future__ import absolute_import
import SharedArray as sa
import zstd
from itertools import repeat, product
import numpy as np
from pathos.multiprocessing import ProcessingPool
from six.moves import zip
try:
from StringIO import StringIO
except ImportError:
from io import StringIO
from .s3io import S3IO, generate_array_name
class S3AIO(object):
def __init__(self, enable_compression=True, enable_s3=True, file_path=None, num_workers=30):
"""Initialise the S3 array IO interface.
:param bool enable_s3: Flag to store objects in s3 or disk.
True: store in S3
False: store on disk (for testing purposes)
:param str file_path: The root directory for the emulated s3 buckets when enable_se is set to False.
:param int num_workers: The number of workers for parallel IO.
"""
self.s3io = S3IO(enable_s3, file_path, num_workers)
self.pool = ProcessingPool(num_workers)
self.enable_compression = enable_compression
def to_1d(self, index, shape):
"""Converts nD index to 1D index.
:param tuple index: N-D Index to be converted.
:param tuple shape: Shape to be used for conversion.
:return: Returns the 1D index.
"""
return np.ravel_multi_index(index, shape)
def to_nd(self, index, shape):
"""Converts 1D index to nD index.
:param tuple index: 1D Index to be converted.
:param tuple shape: Shape to be used for conversion.
:return: Returns the ND index.
"""
return np.unravel_index(index, shape)
def get_point(self, index_point, shape, dtype, s3_bucket, s3_key):
"""Gets a point in the nd array stored in S3.
Only works if compression is off.
:param tuple index_point: Index of the point to be retrieved.
:param tuple shape: Shape of the stored data.
:param numpy.dtype: dtype of the stored data.
:param str s3_bucket: S3 bucket name
:param str s3_key: S3 key name
:return: Returns the point data.
"""
item_size = np.dtype(dtype).itemsize
idx = self.to_1d(index_point, shape) * item_size
if self.enable_compression:
b = self.s3io.get_bytes(s3_bucket, s3_key)
cctx = zstd.ZstdDecompressor()
b = cctx.decompress(b)[idx:idx + item_size]
else:
b = self.s3io.get_byte_range(s3_bucket, s3_key, idx, idx + item_size)
a = np.frombuffer(b, dtype=dtype, count=-1, offset=0)
return a
def cdims(self, slices, shape):
return [sl.start == 0 and sl.stop == sh and (sl.step is None or sl.step == 1)
for sl, sh in zip(slices, shape)]
def get_slice(self, array_slice, shape, dtype, s3_bucket, s3_key): # pylint: disable=too-many-locals
"""Gets a slice of the nd array stored in S3.
Only works if compression is off.
:param tuple array_slice: tuple of slices to retrieve.
:param tuple shape: Shape of the stored data.
:param numpy.dtype: dtype of the stored data.
:param str s3_bucket: S3 bucket name
:param str s3_key: S3 key name
:return: Returns the data slice.
"""
# convert array_slice into into sub-slices of maximum contiguous blocks
# Todo:
# - parallelise reads and writes
# - option 1. get memory rows in parallel and merge
# - option 2. smarter byte range subsets depending on:
# - data size
# - data contiguity
if self.enable_compression:
return self.get_slice_by_bbox(array_slice, shape, dtype, s3_bucket, s3_key)
# truncate array_slice to shape
# array_slice = [slice(max(0, s.start) - min(sh, s.stop)) for s, sh in zip(array_sliced, shape)]
array_slice = [slice(max(0, s.start), min(sh, s.stop)) for s, sh in zip(array_slice, shape)]
cdim = self.cdims(array_slice, shape)
try:
end = cdim[::-1].index(False) + 1
except ValueError:
end = len(shape)
start = len(shape) - end
outer = array_slice[:-end]
outer_ranges = [range(s.start, s.stop) for s in outer]
outer_cells = list(product(*outer_ranges))
blocks = list(zip(outer_cells, repeat(array_slice[start:])))
item_size = np.dtype(dtype).itemsize
results = []
for cell, sub_range in blocks:
# print(cell, sub_range)
s3_start = (np.ravel_multi_index(cell + tuple([s.start for s in sub_range]), shape)) * item_size
s3_end = (np.ravel_multi_index(cell + tuple([s.stop - 1 for s in sub_range]), shape) + 1) * item_size
# print(s3_start, s3_end)
data = self.s3io.get_byte_range(s3_bucket, s3_key, s3_start, s3_end)
results.append((cell, sub_range, data))
result = np.empty([s.stop - s.start for s in array_slice], dtype=dtype)
offset = [s.start for s in array_slice]
for cell, sub_range, data in results:
t = [slice(x.start - o, x.stop - o) if isinstance(x, slice) else x - o for x, o in
zip(cell + tuple(sub_range), offset)]
if data.dtype != dtype:
data = np.frombuffer(data, dtype=dtype, count=-1, offset=0)
result[t] = data.reshape([s.stop - s.start for s in sub_range])
return result
def get_slice_mp(self, array_slice, shape, dtype, s3_bucket, s3_key): # pylint: disable=too-many-locals
"""Gets a slice of the nd array stored in S3 in parallel.
Only works if compression is off.
:param tuple array_slice: tuple of slices to retrieve.
:param tuple shape: Shape of the stored data.
:param numpy.dtype: dtype of the stored data.
:param str s3_bucket: S3 bucket name
:param str s3_key: S3 key name
:return: Returns the data slice.
"""
# pylint: disable=too-many-locals
def work_get_slice(block, array_name, offset, s3_bucket, s3_key, shape, dtype):
result = sa.attach(array_name)
cell, sub_range = block
item_size = np.dtype(dtype).itemsize
s3_start = (np.ravel_multi_index(cell + tuple([s.start for s in sub_range]), shape)) * item_size
s3_end = (np.ravel_multi_index(cell + tuple([s.stop - 1 for s in sub_range]), shape) + 1) * item_size
data = self.s3io.get_byte_range(s3_bucket, s3_key, s3_start, s3_end)
t = [slice(x.start - o, x.stop - o) if isinstance(x, slice) else x - o for x, o in
zip(cell + tuple(sub_range), offset)]
if data.dtype != dtype:
data = np.frombuffer(data, dtype=dtype, count=-1, offset=0)
# data = data.reshape([s.stop - s.start for s in sub_range])
result[t] = data.reshape([s.stop - s.start for s in sub_range])
if self.enable_compression:
return self.get_slice_by_bbox(array_slice, shape, dtype, s3_bucket, s3_key)
cdim = self.cdims(array_slice, shape)
try:
end = cdim[::-1].index(False) + 1
except ValueError:
end = len(shape)
start = len(shape) - end
outer = array_slice[:-end]
outer_ranges = [range(s.start, s.stop) for s in outer]
outer_cells = list(product(*outer_ranges))
blocks = list(zip(outer_cells, repeat(array_slice[start:])))
offset = [s.start for s in array_slice]
array_name = generate_array_name('S3AIO')
sa.create(array_name, shape=[s.stop - s.start for s in array_slice], dtype=dtype)
shared_array = sa.attach(array_name)
self.pool.map(work_get_slice, blocks, repeat(array_name), repeat(offset), repeat(s3_bucket),
repeat(s3_key), repeat(shape), repeat(dtype))
sa.delete(array_name)
return shared_array
def get_slice_by_bbox(self, array_slice, shape, dtype, s3_bucket, s3_key): # pylint: disable=too-many-locals
"""Gets a slice of the nd array stored in S3 by bounding box.
:param tuple array_slice: tuple of slices to retrieve.
:param tuple shape: Shape of the stored data.
        :param numpy.dtype dtype: dtype of the stored data.
:param str s3_bucket: S3 bucket name
:param str s3_key: S3 key name
:return: Returns the data slice.
"""
# Todo:
# - parallelise reads and writes
# - option 1. use get_byte_range_mp
# - option 2. smarter byte range subsets depending on:
# - data size
# - data contiguity
item_size = np.dtype(dtype).itemsize
s3_begin = (np.ravel_multi_index(tuple([s.start for s in array_slice]), shape)) * item_size
s3_end = (np.ravel_multi_index(tuple([s.stop - 1 for s in array_slice]), shape) + 1) * item_size
# if s3_end-s3_begin <= 5*1024*1024:
# d = self.s3io.get_byte_range(s3_bucket, s3_key, s3_begin, s3_end)
# else:
# d = self.s3io.get_byte_range_mp(s3_bucket, s3_key, s3_begin, s3_end, 5*1024*1024)
d = self.s3io.get_bytes(s3_bucket, s3_key)
if self.enable_compression:
cctx = zstd.ZstdDecompressor()
d = cctx.decompress(d)
d = np.frombuffer(d, dtype=np.uint8, count=-1, offset=0)
d = d[s3_begin:s3_end]
cdim = self.cdims(array_slice, shape)
try:
end = cdim[::-1].index(False) + 1
except ValueError:
end = len(shape)
start = len(shape) - end
outer = array_slice[:-end]
outer_ranges = [range(s.start, s.stop) for s in outer]
outer_cells = list(product(*outer_ranges))
blocks = list(zip(outer_cells, repeat(array_slice[start:])))
item_size = np.dtype(dtype).itemsize
results = []
for cell, sub_range in blocks:
s3_start = (np.ravel_multi_index(cell + tuple([s.start for s in sub_range]), shape)) * item_size
s3_end = (np.ravel_multi_index(cell + tuple([s.stop - 1 for s in sub_range]), shape) + 1) * item_size
data = d[s3_start - s3_begin:s3_end - s3_begin]
results.append((cell, sub_range, data))
result = np.empty([s.stop - s.start for s in array_slice], dtype=dtype)
offset = [s.start for s in array_slice]
for cell, sub_range, data in results:
t = [slice(x.start - o, x.stop - o) if isinstance(x, slice) else x - o for x, o in
zip(cell + tuple(sub_range), offset)]
if data.dtype != dtype:
data = np.frombuffer(data, dtype=dtype, count=-1, offset=0)
result[t] = data.reshape([s.stop - s.start for s in sub_range])
return result
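The byte-range arithmetic above (`np.ravel_multi_index` times the item size) is the core of these methods: a contiguous run of trailing-dimension elements maps to one contiguous byte range in the flat S3 object. A minimal pure-Python sketch of that mapping (illustrative names, not part of the S3AIO API):

```python
def flat_index(index, shape):
    """Row-major (C-order) flat index; mirrors np.ravel_multi_index."""
    flat = 0
    for i, dim in zip(index, shape):
        flat = flat * dim + i
    return flat

def byte_range(start_index, stop_index, shape, item_size):
    """Byte range [begin, end) covering start_index..stop_index inclusive."""
    begin = flat_index(start_index, shape) * item_size
    end = (flat_index(stop_index, shape) + 1) * item_size
    return begin, end

# A (4, 5) float64 array: row 2, columns 1..3 occupy one contiguous range.
begin, end = byte_range((2, 1), (2, 3), (4, 5), 8)
# flat index of (2, 1) is 11 and of (2, 3) is 13 -> bytes [88, 112)
```

Each `(cell, sub_range)` block above issues exactly one such ranged GET, so fewer, longer contiguous runs mean fewer S3 requests.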
# ==== make_snapshot.py (trquinn/ICgen, MIT) ====
# -*- coding: utf-8 -*-
"""
Created on Fri Mar 21 15:11:31 2014
@author: ibackus
"""
__version__ = "$Revision: 1 $"
# $Source$
import pynbody
SimArray = pynbody.array.SimArray
import numpy as np
import gc
import os
import isaac
import calc_velocity
import ICgen_utils
import ICglobal_settings
global_settings = ICglobal_settings.global_settings
def snapshot_gen(ICobj):
"""
Generates a tipsy snapshot from the initial conditions object ICobj.
Returns snapshot, param
snapshot: tipsy snapshot
param: dictionary containing info for a .param file
"""
print 'Generating snapshot...'
# Constants
G = SimArray(1.0,'G')
# ------------------------------------
# Load in things from ICobj
# ------------------------------------
print 'Accessing data from ICs'
settings = ICobj.settings
# filenames
snapshotName = settings.filenames.snapshotName
paramName = settings.filenames.paramName
# particle positions
r = ICobj.pos.r
xyz = ICobj.pos.xyz
# Number of particles
nParticles = ICobj.pos.nParticles
# molecular mass
m = settings.physical.m
# star mass
m_star = settings.physical.M.copy()
# disk mass
m_disk = ICobj.sigma.m_disk.copy()
m_disk = isaac.match_units(m_disk, m_star)[0]
# mass of the gas particles
m_particles = m_disk / float(nParticles)
# re-scale the particles (allows making of lo-mass disk)
m_particles *= settings.snapshot.mScale
# -------------------------------------------------
# Assign output
# -------------------------------------------------
print 'Assigning data to snapshot'
# Get units all set up
m_unit = m_star.units
pos_unit = r.units
if xyz.units != r.units:
xyz.convert_units(pos_unit)
# time units are sqrt(L^3/GM)
t_unit = np.sqrt((pos_unit**3)*np.power((G*m_unit), -1)).units
# velocity units are L/t
v_unit = (pos_unit/t_unit).ratio('km s**-1')
# Make it a unit
v_unit = pynbody.units.Unit('{0} km s**-1'.format(v_unit))
# Other settings
metals = settings.snapshot.metals
star_metals = metals
# -------------------------------------------------
# Initialize snapshot
# -------------------------------------------------
# Note that empty pos, vel, and mass arrays are created in the snapshot
snapshot = pynbody.new(star=1,gas=nParticles)
snapshot['vel'].units = v_unit
snapshot['eps'] = 0.01*SimArray(np.ones(nParticles+1, dtype=np.float32), pos_unit)
snapshot['metals'] = SimArray(np.zeros(nParticles+1, dtype=np.float32))
snapshot['rho'] = SimArray(np.zeros(nParticles+1, dtype=np.float32))
snapshot.gas['pos'] = xyz
snapshot.gas['temp'] = ICobj.T(r)
snapshot.gas['mass'] = m_particles
snapshot.gas['metals'] = metals
snapshot.star['pos'] = SimArray([[ 0., 0., 0.]],pos_unit)
snapshot.star['vel'] = SimArray([[ 0., 0., 0.]], v_unit)
snapshot.star['mass'] = m_star
snapshot.star['metals'] = SimArray(star_metals)
# Estimate the star's softening length as the closest particle distance
snapshot.star['eps'] = r.min()
# Make param file
param = isaac.make_param(snapshot, snapshotName)
param['dMeanMolWeight'] = m
gc.collect()
# -------------------------------------------------
# CALCULATE VELOCITY USING calc_velocity.py. This also estimates the
# gravitational softening length eps
# -------------------------------------------------
print 'Calculating circular velocity'
preset = settings.changa_run.preset
max_particles = global_settings['misc']['max_particles']
calc_velocity.v_xy(snapshot, param, changa_preset=preset, max_particles=max_particles)
gc.collect()
# -------------------------------------------------
# Estimate time step for changa to use
# -------------------------------------------------
# Save param file
isaac.configsave(param, paramName, 'param')
# Save snapshot
snapshot.write(filename=snapshotName, fmt=pynbody.tipsy.TipsySnap)
# est dDelta
dDelta = ICgen_utils.est_time_step(paramName, preset)
param['dDelta'] = dDelta
# -------------------------------------------------
# Create director file
# -------------------------------------------------
# largest radius to plot
r_director = float(0.9 * r.max())
# Maximum surface density
sigma_min = float(ICobj.sigma(r_director))
# surface density at largest radius
sigma_max = float(ICobj.sigma.input_dict['sigma'].max())
# Create director dict
director = isaac.make_director(sigma_min, sigma_max, r_director, filename=param['achOutName'])
## Save .director file
#isaac.configsave(director, directorName, 'director')
# -------------------------------------------------
# Wrap up
# -------------------------------------------------
print 'Wrapping up'
    # Now set the star particle's tform to a negative number. This allows
    # UW ChaNGa to treat it as a sink particle.
snapshot.star['tform'] = -1.0
# Update params
r_sink = isaac.strip_units(r.min())
param['dSinkBoundOrbitRadius'] = r_sink
param['dSinkRadius'] = r_sink
param['dSinkMassMin'] = 0.9 * isaac.strip_units(m_star)
param['bDoSinks'] = 1
    return snapshot, param, director
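The unit bookkeeping above sets the time unit to sqrt(L^3 / (G M)). As a sanity check (a standalone sketch using standard SI constants, not code from this module), a 1 au, 1 solar-mass system gives one year divided by 2*pi:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
au = 1.496e11      # astronomical unit, m
m_sun = 1.989e30   # solar mass, kg

t_unit = math.sqrt(au**3 / (G * m_sun))  # seconds
# ~5.0e6 s, i.e. one year / (2*pi), as expected for these natural units
```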
# ==== backend/apps/api/system/v1/serializers/groups.py (offurface/smsta, MIT) ====
from rest_framework import serializers
from ... import models
class DepartmentSerializers(serializers.ModelSerializer):
"""
    Department serializer.
"""
class Meta:
model = models.Department
fields = ["short_name", "full_name"]
class StudentSerializers(serializers.ModelSerializer):
"""
    Student serializer.
"""
class Meta:
model = models.Student
fields = ["pk", "name", "surname", "patronymic", "gender"]
class AcademicGroupsDetailSerializers(serializers.ModelSerializer):
"""
    Academic group serializer, with nested department and students.
"""
department = DepartmentSerializers()
students = StudentSerializers(many=True, read_only=True)
class Meta:
model = models.AcademicGroup
fields = [
"pk",
"start_date",
"department",
"name",
"students",
"course",
]
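Because `department` and `students` are declared as nested serializers, a serialized academic group is a nested JSON object rather than a list of primary keys. A hypothetical payload (field values invented for illustration; the field names come from the Meta classes above):

```python
# Hypothetical example of the shape AcademicGroupsDetailSerializers produces;
# not executed by Django, just the expected structure.
example_payload = {
    "pk": 1,
    "start_date": "2021-09-01",
    "department": {"short_name": "CS", "full_name": "Computer Science"},
    "name": "CS-101",
    "students": [
        {"pk": 7, "name": "Ivan", "surname": "Petrov", "patronymic": "", "gender": "M"},
    ],
    "course": 1,
}
```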
# ==== run.py (kbeyer/RPi-LED-SpectrumAnalyzer, MIT) ====
""" Main entry point for running the demo. """
# Standard library
import time
import sys
# Third party library
import alsaaudio as aa
# Local library
from char import show_text
from hs_logo import draw_logo
from leds import ColumnedLEDStrip
from music import calculate_levels, read_musicfile_in_chunks, calculate_column_frequency
from shairplay import initialize_shairplay, shutdown_shairplay, RaopCallbacks
COLUMNS = 12
GAP_LEDS = 0
TOTAL_LEDS = 100
SKIP_LEDS = 4
SAMPLE_RATE = 44100
NUM_CHANNELS = 2
FORMAT = aa.PCM_FORMAT_S16_LE
PERIOD_SIZE = 2048
frequency_limits = calculate_column_frequency(200, 10000, COLUMNS)
def analyze_airplay_input(led_strip):
from os.path import join
lib_path = join(sys.prefix, 'lib')
initialize_shairplay(lib_path, get_shairplay_callback_class(led_strip))
while True:
try:
pass
except KeyboardInterrupt:
shutdown_shairplay()
break
def analyze_audio_file(led_strip, path):
for chunk, sample_rate in read_musicfile_in_chunks(path, play_audio=True):
data = calculate_levels(chunk, sample_rate, frequency_limits)
led_strip.display_data(data)
def analyze_line_in(led_strip, hacker_school=True):
    input = get_audio_input()  # open the ALSA capture device for reading
    start_time = time.time()
while True:
if hacker_school and time.time() - start_time > 60 * 2:
            hacker_school_display(led_strip)
start_time = time.time()
size, chunk = input.read()
if size > 0:
L = (len(chunk)/2 * 2)
chunk = chunk[:L]
data = calculate_levels(chunk, SAMPLE_RATE, frequency_limits)
led_strip.display_data(data[::-1])
def get_audio_input():
input = aa.PCM(aa.PCM_CAPTURE, aa.PCM_NONBLOCK)
input.setchannels(NUM_CHANNELS)
input.setformat(aa.PCM_FORMAT_S16_BE)
input.setrate(SAMPLE_RATE)
input.setperiodsize(PERIOD_SIZE)
return input
def get_led_strip():
led = ColumnedLEDStrip(
leds=TOTAL_LEDS, columns=COLUMNS, gap_leds=GAP_LEDS, skip_leds=SKIP_LEDS
)
led.all_off()
return led
def get_shairplay_callback_class(led_strip):
class SampleCallbacks(RaopCallbacks):
def audio_init(self, bits, channels, samplerate):
print "Initializing", bits, channels, samplerate
self.bits = bits
self.channels = channels
self.samplerate = samplerate
min_frequency = 500
max_frequency = samplerate / 30 * 10 # Abusing integer division
self.frequency_limits = calculate_column_frequency(
min_frequency, max_frequency, COLUMNS
)
self.buffer = ''
def audio_process(self, session, buffer):
data = calculate_levels(buffer, self.samplerate, self.frequency_limits, self.channels, self.bits)
led_strip.display_data(data[::-1])
def audio_destroy(self, session):
print "Destroying"
def audio_set_volume(self, session, volume):
print "Set volume to", volume
def audio_set_metadata(self, session, metadata):
print "Got", len(metadata), "bytes of metadata"
def audio_set_coverart(self, session, coverart):
print "Got", len(coverart), "bytes of coverart"
return SampleCallbacks
def hacker_school_display(led_strip):
draw_logo(led_strip)
time.sleep(1)
show_text(led_strip, 'NEVER GRADUATE!', x_offset=3, y_offset=1, sleep=0.5)
if __name__ == '__main__':
from textwrap import dedent
input_types = ('local', 'linein', 'airplay')
usage = dedent("""\
Usage: %s <input-type> [additional arguments]
input-type: should be one of %s
To play a local file, you can pass the path to the file as an additional
argument.
""") % (sys.argv[0], input_types)
if len(sys.argv) == 1:
print usage
sys.exit(1)
input_type = sys.argv[1]
led_strip = get_led_strip()
if input_type == 'local':
path = sys.argv[2] if len(sys.argv) > 2 else 'sample.mp3'
analyze_audio_file(led_strip, path)
elif input_type == 'airplay':
analyze_airplay_input(led_strip)
elif input_type == 'linein':
analyze_line_in(led_strip)
else:
print usage
sys.exit(1)
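`calculate_column_frequency(200, 10000, COLUMNS)` above maps the analyzed frequency range onto the 12 LED columns. One plausible sketch, assuming logarithmic band spacing so each octave gets similar resolution (this is an assumption, not the actual `music` module implementation):

```python
import math

def log_bands(min_f, max_f, columns):
    # Evenly spaced edges in log-frequency, one (low, high) pair per column.
    step = (math.log(max_f) - math.log(min_f)) / columns
    edges = [math.exp(math.log(min_f) + i * step) for i in range(columns + 1)]
    return list(zip(edges[:-1], edges[1:]))

bands = log_bands(200, 10000, 12)
```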
# ==== tests/profiling/test_scheduler.py (uniq10/dd-trace-py, Apache-2.0 / BSD-3-Clause) ====
# -*- encoding: utf-8 -*-
from ddtrace.profiling import event
from ddtrace.profiling import exporter
from ddtrace.profiling import recorder
from ddtrace.profiling import scheduler
class _FailExporter(exporter.Exporter):
@staticmethod
def export(events):
raise Exception("BOO!")
def test_exporter_failure():
r = recorder.Recorder()
exp = _FailExporter()
s = scheduler.Scheduler(r, [exp])
r.push_events([event.Event()] * 10)
s.flush()
def test_thread_name():
r = recorder.Recorder()
exp = exporter.NullExporter()
s = scheduler.Scheduler(r, [exp])
s.start()
assert s._worker.name == "ddtrace.profiling.scheduler:Scheduler"
s.stop()
# ==== logistic-regression/code.py (kalpeshsnaik09/ga-learner-dsmp-repo, MIT) ====
# --------------
# import the libraries
import numpy as np
import pandas as pd
import seaborn as sns
from sklearn.model_selection import train_test_split
import warnings
warnings.filterwarnings('ignore')
# Code starts here
df=pd.read_csv(path)
print(df.head())
X=df.drop(columns='insuranceclaim')
y=df['insuranceclaim']
X_train,X_test,y_train,y_test=train_test_split(X,y,test_size=0.2,random_state=6)
# Code ends here
# --------------
import matplotlib.pyplot as plt
# Code starts here
plt.boxplot(X_train['bmi'])
plt.show()
q_value=X_train['bmi'].quantile(0.95)
print(y_train.value_counts())
# Code ends here
# --------------
import seaborn as sns
# Code starts here
relation=X_train.corr()
print(relation)
sns.pairplot(X_train)
plt.show()
# Code ends here
# --------------
import seaborn as sns
import matplotlib.pyplot as plt
# Code starts here
cols=['children','sex','region','smoker']
fig,axes=plt.subplots(2,2)
for i in range(2):
for j in range(2):
col=cols[i*2+j]
sns.countplot(X_train[col],hue=y_train,ax=axes[i,j])
# Code ends here
# --------------
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
# parameters for grid search
parameters = {'C':[0.1,0.5,1,5]}
# Code starts here
lr=LogisticRegression(random_state=9)
grid=GridSearchCV(estimator=lr,param_grid=parameters)
grid.fit(X_train,y_train)
y_pred=grid.predict(X_test)
accuracy=accuracy_score(y_test,y_pred)
print(accuracy)
# Code ends here
# --------------
from sklearn.metrics import roc_auc_score
from sklearn import metrics
# Code starts here
score=roc_auc_score(y_test,y_pred)
y_pred_proba=grid.predict_proba(X_test)[:,1]
fpr,tpr,_=metrics.roc_curve(y_test,y_pred)
roc_auc=roc_auc_score(y_test,y_pred_proba)
plt.plot(fpr,tpr,label="Logistic model, auc="+str(roc_auc))
# Code ends here
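For reference, the `roc_auc_score` used above equals the probability that a randomly chosen positive example is scored higher than a randomly chosen negative one (ties count half). A dependency-free sketch of that rank statistic (not sklearn's implementation):

```python
def auc(y_true, y_score):
    """AUC as the pairwise win rate of positive over negative scores."""
    pos = [s for t, s in zip(y_true, y_score) if t == 1]
    neg = [s for t, s in zip(y_true, y_score) if t == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

score = auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8])
# -> 0.75: three of the four positive/negative pairs are ranked correctly
```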
# ==== shuttl/tests/test_views/test_organization.py (shuttl-io/shuttl-cms, MIT) ====
import json
from shuttl import app
from shuttl.tests import testbase
from shuttl.Models.Reseller import Reseller
from shuttl.Models.organization import Organization, OrganizationDoesNotExistException
class OrganizationViewTest(testbase.BaseTest):
def _setUp(self):
pass
def test_index(self):
# rv = self.app.get('/')
# assert 'Shuttl' in rv.data.decode('utf-8')
pass
def test_login(self):
rv = self.login('test')
pass
def login(self, organization):
return self.app.post('/login', data=dict(
organization=organization
), follow_redirects=True)
def test_creation(self):
results = self.app.post("/organization/", data = dict(name="testOrg"))
self.assertEqual(results.status_code, 201)
results2 = self.app.post("/organization/", data = dict(name="testOrg"))
self.assertEqual(results2.status_code, 409)
results = json.loads(results.data.decode())
expected = {
'reseller': {
'directory': '',
'name': 'shuttl',
'_url': 'shuttl.com',
'subdir': '',
'id': 1,
'admins': [],
'organizations': [],
'_price': 10.0
},
'id': 1,
'name': 'testOrg',
'websites': [],
'users': []
}
self.assertEqual(len(Organization.query.all()), 1)
self.assertEqual(len(list(self.reseller.organizations.all())), 1)
self.assertEqual(results, expected)
pass
def test_getAll(self):
self.app.post("/organization/", data = dict(name="testOrg"))
self.app.post("/organization/", data = dict(name="testOrg2"))
self.app.post("/organization/", data = dict(name="testOrg3"))
expected = [
Organization.query.filter(Organization.name == "testOrg").first().serialize(),
Organization.query.filter(Organization.name == "testOrg2").first().serialize(),
Organization.query.filter(Organization.name == "testOrg3").first().serialize()
]
results = self.app.get("/organization/")
results = json.loads(results.data.decode())
self.assertEqual(len(results), 3)
self.assertEqual(expected, results)
pass
def test_get(self):
results = self.app.post("/organization/", data = dict(name="testOrg"))
results_dict = json.loads(results.data.decode())
id = results_dict["id"]
results = self.app.get("/organization/{0}".format(id))
self.assertEqual(results.status_code, 200)
results = json.loads(results.data.decode())
self.assertEqual(results_dict, results)
results = self.app.get("/organization/1234")
self.assertEqual(results.status_code, 404)
pass
def test_patch(self):
results = self.app.post("/organization/", data = dict(name="testOrg"))
reseller = Reseller.Create(name="test3", _url="shuttl2.com")
results_dict = json.loads(results.data.decode())
results = self.app.patch("/organization/{0}".format(results_dict["id"]), data=dict(name="testOrg4"))
self.assertEqual(results.status_code, 200)
self.assertRaises(OrganizationDoesNotExistException, Organization.Get, name="testOrg", vendor=self.reseller)
results = json.loads(results.data.decode())
self.assertEqual(results["name"], "testOrg4")
org = Organization.Get(name="testOrg4", vendor=self.reseller)
self.assertEqual(org.serialize(), results)
results = self.app.patch("/organization/{0}".format(results_dict["id"]), data=dict(vendor=reseller.id))
self.assertRaises(OrganizationDoesNotExistException, Organization.Get, name="testOrg4", vendor=self.reseller)
results = json.loads(results.data.decode())
org = Organization.Get(name="testOrg4", vendor=reseller)
self.assertEqual(org.serialize(), results)
self.assertEqual(len(list(self.reseller.organizations.all())), 0)
self.assertEqual(len(list(reseller.organizations.all())), 1)
results = self.app.patch("/organization/1234", data=dict(vendor=reseller.id))
self.assertEqual(results.status_code, 404)
results = self.app.patch("/organization/", data=dict(vendor=reseller.id))
self.assertEqual(results.status_code, 405)
pass
def test_delete(self):
results = self.app.post("/organization/", data = dict(name="testOrg"))
results2 = self.app.post("/organization/", data = dict(name="testOrg2"))
results = json.loads(results.data.decode())
res3 = self.app.delete("/organization/{0}".format(results["id"]))
self.assertEqual(res3.status_code, 200)
self.assertEqual(len(list(self.reseller.organizations.all())), 1)
res3 = self.app.delete("/organization/")
self.assertEqual(res3.status_code, 405)
res3 = self.app.delete("/organization/1234")
self.assertEqual(res3.status_code, 404)
pass
# ==== emissary/controllers/load.py (LukeB42/Emissary, MIT) ====
# This file contains functions designed for
# loading cron tables and storing new feeds.
from emissary import db
from sqlalchemy import and_
from emissary.controllers.utils import spaceparse
from emissary.controllers.cron import parse_timings
from emissary.models import APIKey, Feed, FeedGroup
def create_feed(log, db, key, group, feed):
"""
Takes a key object, a group name and a dictionary
describing a feed ({name:,url:,schedule:,active:})
and reliably attaches a newly created feed to the key
and group.
"""
    if not isinstance(feed, dict):
log('Unexpected type when creating feed for API key "%s"' % key.name)
return
for i in ['name', 'schedule', 'active', 'url']:
if not i in feed.keys():
log('%s: Error creating feed. Missing "%s" field from feed definition.' % (key.name, i))
return
f = Feed.query.filter(and_(Feed.key == key, Feed.name == feed['name'])).first()
fg = FeedGroup.query.filter(and_(FeedGroup.key == key, FeedGroup.name == group)).first()
if f:
if f.group:
log('%s: Error creating feed "%s" in group "%s", feed already exists in group "%s".' % \
(key.name, feed['name'], group, f.group.name))
return
elif fg:
log('%s: %s: Adding feed "%s"' % (key.name, fg.name, f.name))
fg.append(f)
db.session.add(fg)
db.session.add(f)
db.session.commit()
return
if not fg:
log('%s: Creating feed group %s.' % (key.name, group))
fg = FeedGroup(name=group)
key.feedgroups.append(fg)
try:
parse_timings(feed['schedule'])
except Exception, e:
log('%s: %s: Error creating "%s": %s' % \
(key.name, fg.name, feed['name'], e.message))
log('%s: %s: Creating feed "%s"' % (key.name, fg.name, feed['name']))
f = Feed(
name=feed['name'],
url=feed['url'],
active=feed['active'],
schedule=feed['schedule']
)
fg.feeds.append(f)
key.feeds.append(f)
db.session.add(key)
db.session.add(fg)
db.session.add(f)
db.session.commit()
def parse_crontab(filename):
"""
Get a file descriptor on filename and
create feeds and groups for API keys therein.
"""
def log(message):
print message
# read filename into a string named crontab
try:
fd = open(filename, "r")
except OSError:
print "Error opening %s" % filename
raise SystemExit
crontab = fd.read()
fd.close()
# keep a resident api key on hand
key = None
for i, line in enumerate(crontab.split('\n')):
# Set the APIKey we're working with when we find a line starting
# with apikey:
if line.startswith("apikey:"):
if ' ' in line:
key_str = line.split()[1]
key = APIKey.query.filter(APIKey.key == key_str).first()
if not key:
print 'Malformed or unknown API key at line %i in %s: %s' % (i+1, filename, line)
raise SystemExit
else:
print 'Using API key "%s".' % key.name
if line.startswith("http"):
feed = {'active': True}
# Grab the URL and set the string to the remainder
feed['url'] = line.split().pop(0)
line = ' '.join(line.split()[1:])
# Grab names and groups
names = spaceparse(line)
if not names:
print "Error parsing feed or group name at line %i in %s: %s" % (i+1, filename, line)
continue
feed['name'], group = names[:2]
# The schedule should be the last five items
schedule = line.split()[-5:]
try:
parse_timings(schedule)
except Exception, e:
print "Error parsing schedule at line %i in %s: %s" % (i+1, filename, e.message)
continue
feed['schedule'] = ' '.join(schedule)
create_feed(log, db, key, group, feed)
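A quick illustration of the per-line slicing done in `parse_crontab`: URL first, the five cron schedule fields last, quoted names in between (a hypothetical feed line; `spaceparse` handles the quoted names in the real code, the slicing below only shows the URL/schedule split):

```python
# Hypothetical crontab line for illustration.
line = 'http://example.com/rss "Feed Name" "Group" */15 * * * *'

tokens = line.split()
url = tokens[0]            # leading URL
schedule = tokens[-5:]     # trailing five cron fields
names_part = " ".join(tokens[1:-5])  # quoted feed/group names in the middle
```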
# ==== src/utils/tools.py (Xuenew/2c, Apache-2.0) ====
#!/usr/bin/env python
"""
Created by howie.hu at 2021/4/7.
Description: utility functions
Changelog: all notable changes to this file will be documented
"""
import hashlib
def md5_encryption(string: str) -> str:
"""
    MD5-hash the given string.

    :param string: string to hash
    :return: hexadecimal MD5 digest
"""
m = hashlib.md5()
m.update(string.encode("utf-8"))
return m.hexdigest()
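Usage example for `md5_encryption` (the function body is repeated here so the snippet is self-contained); the digest shown is the standard MD5 of "hello":

```python
import hashlib

def md5_encryption(string: str) -> str:
    # Same behavior as the function above: UTF-8 encode, MD5, hex digest.
    m = hashlib.md5()
    m.update(string.encode("utf-8"))
    return m.hexdigest()

digest = md5_encryption("hello")
# -> "5d41402abc4b2a76b9719d911017c592"
```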
c028333a0436a3c88c477f3244b6bd0fca21d64d | 1,600 | py | Python | rest_api/views.py | vikash98k/django-rest-api | 51c83a5d9c65f03b4b790ac965cd2222c6326752 | [
"MIT"
] | 1 | 2021-11-15T03:29:24.000Z | 2021-11-15T03:29:24.000Z | rest_api/views.py | vikash98k/django-rest-api | 51c83a5d9c65f03b4b790ac965cd2222c6326752 | [
"MIT"
] | null | null | null | rest_api/views.py | vikash98k/django-rest-api | 51c83a5d9c65f03b4b790ac965cd2222c6326752 | [
"MIT"
] | null | null | null | from rest_framework import generics
from .permissions import IsOwner
from .serializers import BucketlistSerializer, UserSerializer
from .models import Bucketlist
from django.contrib.auth.models import User
from rest_framework.permissions import IsAuthenticated
from rest_framework.authentication import SessionAuthentication
class CreateView(generics.ListCreateAPIView):
"""This class handles the GET and POSt requests of our rest api."""
queryset = Bucketlist.objects.all()
serializer_class = BucketlistSerializer
permission_classes = [IsAuthenticated]
authentication_classes = [SessionAuthentication]
def perform_create(self, serializer):
"""Save the post data when creating a new bucketlist."""
serializer.save(owner=self.request.user)
class DetailsView(generics.RetrieveUpdateDestroyAPIView):
"""This class handles GET, PUT, PATCH and DELETE requests."""
queryset = Bucketlist.objects.all()
serializer_class = BucketlistSerializer
permission_classes = [IsAuthenticated]
authentication_classes = [SessionAuthentication]
class UserView(generics.ListAPIView):
"""View to list the user queryset."""
queryset = User.objects.all()
serializer_class = UserSerializer
permission_classes = [IsAuthenticated]
authentication_classes = [SessionAuthentication]
class UserDetailsView(generics.RetrieveAPIView):
"""View to retrieve a user instance."""
queryset = User.objects.all()
serializer_class = UserSerializer
permission_classes = [IsAuthenticated]
authentication_classes = [SessionAuthentication]
# ==== file: oci/core/models/create_ip_sec_tunnel_bgp_session_details.py (repo: revnav/sandbox, license: Apache-2.0) ====
# coding: utf-8
# Copyright (c) 2016, 2020, Oracle and/or its affiliates. All rights reserved.
# This software is dual-licensed to you under the Universal Permissive License (UPL) 1.0 as shown at https://oss.oracle.com/licenses/upl or Apache License 2.0 as shown at http://www.apache.org/licenses/LICENSE-2.0. You may choose either license.
from oci.util import formatted_flat_dict, NONE_SENTINEL, value_allowed_none_or_none_sentinel # noqa: F401
from oci.decorators import init_model_state_from_kwargs
@init_model_state_from_kwargs
class CreateIPSecTunnelBgpSessionDetails(object):
"""
CreateIPSecTunnelBgpSessionDetails model.
"""
def __init__(self, **kwargs):
"""
Initializes a new CreateIPSecTunnelBgpSessionDetails object with values from keyword arguments.
The following keyword arguments are supported (corresponding to the getters/setters of this class):
:param oracle_interface_ip:
The value to assign to the oracle_interface_ip property of this CreateIPSecTunnelBgpSessionDetails.
:type oracle_interface_ip: str
:param customer_interface_ip:
The value to assign to the customer_interface_ip property of this CreateIPSecTunnelBgpSessionDetails.
:type customer_interface_ip: str
:param customer_bgp_asn:
The value to assign to the customer_bgp_asn property of this CreateIPSecTunnelBgpSessionDetails.
:type customer_bgp_asn: str
"""
self.swagger_types = {
'oracle_interface_ip': 'str',
'customer_interface_ip': 'str',
'customer_bgp_asn': 'str'
}
self.attribute_map = {
'oracle_interface_ip': 'oracleInterfaceIp',
'customer_interface_ip': 'customerInterfaceIp',
'customer_bgp_asn': 'customerBgpAsn'
}
self._oracle_interface_ip = None
self._customer_interface_ip = None
self._customer_bgp_asn = None
@property
def oracle_interface_ip(self):
"""
Gets the oracle_interface_ip of this CreateIPSecTunnelBgpSessionDetails.
The IP address for the Oracle end of the inside tunnel interface.
If the tunnel's `routing` attribute is set to `BGP`
(see :class:`IPSecConnectionTunnel`), this IP address
is required and used for the tunnel's BGP session.
If `routing` is instead set to `STATIC`, this IP address is optional. You can set this IP
address to troubleshoot or monitor the tunnel.
The value must be a /30 or /31.
Example: `10.0.0.4/31`
:return: The oracle_interface_ip of this CreateIPSecTunnelBgpSessionDetails.
:rtype: str
"""
return self._oracle_interface_ip
@oracle_interface_ip.setter
def oracle_interface_ip(self, oracle_interface_ip):
"""
Sets the oracle_interface_ip of this CreateIPSecTunnelBgpSessionDetails.
The IP address for the Oracle end of the inside tunnel interface.
If the tunnel's `routing` attribute is set to `BGP`
(see :class:`IPSecConnectionTunnel`), this IP address
is required and used for the tunnel's BGP session.
If `routing` is instead set to `STATIC`, this IP address is optional. You can set this IP
address to troubleshoot or monitor the tunnel.
The value must be a /30 or /31.
Example: `10.0.0.4/31`
:param oracle_interface_ip: The oracle_interface_ip of this CreateIPSecTunnelBgpSessionDetails.
:type: str
"""
self._oracle_interface_ip = oracle_interface_ip
@property
def customer_interface_ip(self):
"""
Gets the customer_interface_ip of this CreateIPSecTunnelBgpSessionDetails.
The IP address for the CPE end of the inside tunnel interface.
If the tunnel's `routing` attribute is set to `BGP`
(see :class:`IPSecConnectionTunnel`), this IP address
is required and used for the tunnel's BGP session.
If `routing` is instead set to `STATIC`, this IP address is optional. You can set this IP
address to troubleshoot or monitor the tunnel.
The value must be a /30 or /31.
Example: `10.0.0.5/31`
:return: The customer_interface_ip of this CreateIPSecTunnelBgpSessionDetails.
:rtype: str
"""
return self._customer_interface_ip
@customer_interface_ip.setter
def customer_interface_ip(self, customer_interface_ip):
"""
Sets the customer_interface_ip of this CreateIPSecTunnelBgpSessionDetails.
The IP address for the CPE end of the inside tunnel interface.
If the tunnel's `routing` attribute is set to `BGP`
(see :class:`IPSecConnectionTunnel`), this IP address
is required and used for the tunnel's BGP session.
If `routing` is instead set to `STATIC`, this IP address is optional. You can set this IP
address to troubleshoot or monitor the tunnel.
The value must be a /30 or /31.
Example: `10.0.0.5/31`
:param customer_interface_ip: The customer_interface_ip of this CreateIPSecTunnelBgpSessionDetails.
:type: str
"""
self._customer_interface_ip = customer_interface_ip
@property
def customer_bgp_asn(self):
"""
Gets the customer_bgp_asn of this CreateIPSecTunnelBgpSessionDetails.
If the tunnel's `routing` attribute is set to `BGP`
(see :class:`IPSecConnectionTunnel`), this ASN
is required and used for the tunnel's BGP session. This is the ASN of the network on the
CPE end of the BGP session. Can be a 2-byte or 4-byte ASN. Uses \"asplain\" format.
If the tunnel's `routing` attribute is set to `STATIC`, the `customerBgpAsn` must be null.
Example: `12345` (2-byte) or `1587232876` (4-byte)
:return: The customer_bgp_asn of this CreateIPSecTunnelBgpSessionDetails.
:rtype: str
"""
return self._customer_bgp_asn
@customer_bgp_asn.setter
def customer_bgp_asn(self, customer_bgp_asn):
"""
Sets the customer_bgp_asn of this CreateIPSecTunnelBgpSessionDetails.
If the tunnel's `routing` attribute is set to `BGP`
(see :class:`IPSecConnectionTunnel`), this ASN
is required and used for the tunnel's BGP session. This is the ASN of the network on the
CPE end of the BGP session. Can be a 2-byte or 4-byte ASN. Uses \"asplain\" format.
If the tunnel's `routing` attribute is set to `STATIC`, the `customerBgpAsn` must be null.
Example: `12345` (2-byte) or `1587232876` (4-byte)
:param customer_bgp_asn: The customer_bgp_asn of this CreateIPSecTunnelBgpSessionDetails.
:type: str
"""
self._customer_bgp_asn = customer_bgp_asn
def __repr__(self):
return formatted_flat_dict(self)
def __eq__(self, other):
if other is None:
return False
return self.__dict__ == other.__dict__
def __ne__(self, other):
return not self == other
# ==== file: server_django/prikmeter/views.py (repo: ttencate/smartmetertap, license: BSD-3-Clause) ====
from django.contrib import auth, messages
from django.shortcuts import redirect, render
from django.views.decorators.http import require_POST, require_safe
@require_safe
def index(request):
context = {}
return render(request, 'prikmeter/index.html', context)
@require_POST
def login(request):
email = request.POST['email']
password = request.POST['password']
user = auth.authenticate(request, email=email, password=password)
if user:
auth.login(request, user)
else:
messages.error(request, 'Invalid username or password.')
return redirect(request.POST['next'] or 'prikmeter:index')
@require_POST
def logout(request):
auth.logout(request)
return redirect(request.POST['next'] or 'prikmeter:index')
# ==== file: app/src/main/Python/Translate.py (repo: tangcan1600/XuMiJie, license: MIT) ====
import time, sys, os, hashlib, json, re
import requests, random, js2py
import execjs  # needed by Baidu.get_sign_ctx below; missing from the original imports
import urllib.request
import urllib.parse
# Check whether the string contains Chinese characters
def language(strs):
for char in strs:
if (u'\u4e00' <= char and char <= u'\u9fff'):
return 'zh', 'en' # contains Chinese
return 'en', 'zh' # no Chinese
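The direction detector above is easy to verify in isolation (a minimal sketch reproducing the same logic with proper indentation):

```python
def language(strs):
    # zh -> en if any CJK unified ideograph is present, otherwise en -> zh
    for char in strs:
        if u'\u4e00' <= char <= u'\u9fff':
            return 'zh', 'en'
    return 'en', 'zh'

print(language("你好"))   # -> ('zh', 'en')
print(language("hello"))  # -> ('en', 'zh')
```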
class Baidu():
def __init__(self):
self.url = 'https://fanyi.baidu.com/v2transapi?from=zh&to=en'
self.header = {
'content-type': 'application/x-www-form-urlencoded;charset=UTF-8',
'origin': 'https://fanyi.baidu.com',
'referer': 'https://fanyi.baidu.com/?aldtype=16047',
'user-agent': "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.121 Safari/537.36",
'x-requested-with': 'XMLHttpRequest',
'cookie': 'BIDUPSID=D3290C65C03AEF0E98D97B8641DFFB15; PSTM=1570785944; REALTIME_TRANS_SWITCH=1; FANYI_WORD_SWITCH=1; HISTORY_SWITCH=1; SOUND_SPD_SWITCH=1; SOUND_PREFER_SWITCH=1; BAIDUID=0CC6F13854E81A68D3C564D36E7C8A03:FG=1; APPGUIDE_8_2_2=1; BDORZ=B490B5EBF6F3CD402E515D22BCDA1598; BDSFRCVID=wt_OJeC626EDLgju-c_JbHce7gSxbKcTH6aoxbIy4_AgXmAxrp74EG0PJf8g0Ku-dWitogKKBmOTHg-F_2uxOjjg8UtVJeC6EG0Ptf8g0M5; H_BDCLCKID_SF=JJkO_D_atKvjDbTnMITHh-F-5fIX5-RLf5TuLPOF5lOTJh0RbtOkjnQD-UL82bT2fRcQ0tJLb4DaStJbLjbke6cbDa_fJ5Fs-I5O0R4854QqqR5R5bOq-PvHhxoJqbbJX2OZ0l8KtDQpshRTMR_V-p4p-472K6bML5baabOmWIQHDPnPyJuMBU_sWMcChnjjJbn4KKJxWJLWeIJo5Dcf3PF3hUJiBMjLBan7056IXKohJh7FM4tW3J0ZyxomtfQxtNRJ0DnjtpChbRO4-TF-D5jXeMK; delPer=0; PSINO=2; H_PS_PSSID=1435_21104_18560_26350; Hm_lvt_64ecd82404c51e03dc91cb9e8c025574=1580216234,1580216243,1580458514,1580458537; Hm_lpvt_64ecd82404c51e03dc91cb9e8c025574=1580458539; __yjsv5_shitong=1.0_7_ed303110bee0e644d4985049ba8a5cd1f28d_300_1580458537306_120.10.109.208_66a3b40c; yjs_js_security_passport=630340c0505f771135167fa6df3e5215699dcf0b_1580458538_js; to_lang_often=%5B%7B%22value%22%3A%22zh%22%2C%22text%22%3A%22%u4E2D%u6587%22%7D%2C%7B%22value%22%3A%22en%22%2C%22text%22%3A%22%u82F1%u8BED%22%7D%5D; from_lang_often=%5B%7B%22value%22%3A%22vie%22%2C%22text%22%3A%22%u8D8A%u5357%u8BED%22%7D%2C%7B%22value%22%3A%22en%22%2C%22text%22%3A%22%u82F1%u8BED%22%7D%2C%7B%22value%22%3A%22zh%22%2C%22text%22%3A%22%u4E2D%u6587%22%7D%5D'
}
self.data = None
def get_sign_ctx(self):
ctx = execjs.compile(
r"""
function n(r, o) {
for (var t = 0; t < o.length - 2; t += 3) {
var a = o.charAt(t + 2);
a = a >= "a" ? a.charCodeAt(0) - 87 : Number(a),
a = "+" === o.charAt(t + 1) ? r >>> a : r << a,
r = "+" === o.charAt(t) ? r + a & 4294967295 : r ^ a
}
return r
}
function e(r) {
var o = r.match(/[\uD800-\uDBFF][\uDC00-\uDFFF]/g);
if (null === o) {
var t = r.length;
t > 30 && (r = "" + r.substr(0, 10) + r.substr(Math.floor(t / 2) - 5, 10) + r.substr(-10, 10))
} else {
for (var e = r.split(/[\uD800-\uDBFF][\uDC00-\uDFFF]/), C = 0, h = e.length, f = []; h > C; C++)
"" !== e[C] && f.push.apply(f, a(e[C].split(""))),
C !== h - 1 && f.push(o[C]);
var g = f.length;
g > 30 && (r = f.slice(0, 10).join("") + f.slice(Math.floor(g / 2) - 5, Math.floor(g / 2) + 5).join("") + f.slice(-10).join(""))
}
var u = void 0
, l = "" + String.fromCharCode(103) + String.fromCharCode(116) + String.fromCharCode(107);
u =' """ + str(self.get_gtk()) + r""" ';
for (var d = u.split("."), m = Number(d[0]) || 0, s = Number(d[1]) || 0, S = [], c = 0, v = 0; v < r.length; v++) {
var A = r.charCodeAt(v);
128 > A ? S[c++] = A : (2048 > A ? S[c++] = A >> 6 | 192 : (55296 === (64512 & A) && v + 1 < r.length && 56320 === (64512 & r.charCodeAt(v + 1)) ? (A = 65536 + ((1023 & A) << 10) + (1023 & r.charCodeAt(++v)),
S[c++] = A >> 18 | 240,
S[c++] = A >> 12 & 63 | 128) : S[c++] = A >> 12 | 224,
S[c++] = A >> 6 & 63 | 128),
S[c++] = 63 & A | 128)
}
for (var p = m, F = "" + String.fromCharCode(43) + String.fromCharCode(45) + String.fromCharCode(97) + ("" + String.fromCharCode(94) + String.fromCharCode(43) + String.fromCharCode(54)), D = "" + String.fromCharCode(43) + String.fromCharCode(45) + String.fromCharCode(51) + ("" + String.fromCharCode(94) + String.fromCharCode(43) + String.fromCharCode(98)) + ("" + String.fromCharCode(43) + String.fromCharCode(45) + String.fromCharCode(102)), b = 0; b < S.length; b++)
p += S[b],
p = n(p, F);
return p = n(p, D),
p ^= s,
0 > p && (p = (2147483647 & p) + 2147483648),
p %= 1e6,
p.toString() + "." + (p ^ m)
}
"""
)
return ctx
def get_sign(self, text):
ctx = self.get_sign_ctx()
sign = ctx.call("e", text)
# print(sign)
return sign
def get_token(self):
s = requests.session()
url = 'https://fanyi.baidu.com/'
html = requests.get(url, headers=self.header)
html = html.text
# print(html)
raw_tk_str = str(re.search('token:.*,', html))
token = raw_tk_str.split('\'')[1]
# print(token)
return token
def get_cookie(self):
import urllib.request
import http.cookiejar
cookie = http.cookiejar.CookieJar()
handler = urllib.request.HTTPCookieProcessor(cookie)
opener = urllib.request.build_opener(handler)
response = opener.open('https://fanyi.baidu.com/?aldtype=16047#zh/en/aa%E9%80%9F%E5%BA%A6')
# print(response)
for item in cookie:
print('%s = %s' % (item.name, item.value))
def get_gtk(self):
url = 'https://fanyi.baidu.com/'
html = requests.get(url)
html = html.text
raw_gtk_str = str(re.search('window.gtk = .*;', html))
gtk = raw_gtk_str.split('\'')[1]
# print('gtk '+gtk)
return gtk
def get_data(self, text, from_lan, to_lan):
data = {}
data['from'] = from_lan
data['to'] = to_lan
data['query'] = text
data['simple_means_flag'] = 3
data['transtype'] = 'realtime'
data['sign'] = self.get_sign(text)
data['token'] = self.get_token()
return data
def translate(self, text, from_lan, to_lan):
try:
self.data = self.get_data(text, from_lan, to_lan)
response = requests.post(self.url, headers=self.header, data=self.data)
# print('Baidu translation result:', response.json()['trans_result']['data'][0]['dst'])
return response.json()['trans_result']['data'][0]['dst']
except Exception:
return 'The program hit a small problem; translation failed'
class Bing():
def __init__(self):
self.url = "http://api.microsofttranslator.com/v2/ajax.svc/TranslateArray2?"
def translate(self, content, from_lan, to_lan):
try:
data = {}
data['from'] = '"' + from_lan + '"'
data['to'] = '"' + to_lan + '"'
data['texts'] = '["'
data['texts'] += content
data['texts'] += '"]'
data['options'] = "{}"
data['oncomplete'] = 'onComplete_3'
data['onerror'] = 'onError_3'
data['_'] = '1430745999189'
data = urllib.parse.urlencode(data).encode('utf-8')
strUrl = self.url + data.decode() + "&appId=%223DAEE5B978BA031557E739EE1E2A68CB1FAD5909%22"
response = urllib.request.urlopen(strUrl)
str_data = response.read().decode('utf-8')
# print(str_data)
tmp, str_data = str_data.split('"TranslatedText":')
translate_data = str_data[1:str_data.find('"', 1)]
# print('Bing translation result:', translate_data)
return translate_data
except Exception:
return 'The program hit a small problem; translation failed'
class Ciba():
def __init__(self, word, lan, tolan):
self.word = word
self.url = 'http://fy.iciba.com/ajax.php?a=fy'
self.headers = {
'User-Agent': 'Mozilla/5.0 (Windows NT 6.1; Win64; x64) '
'AppleWebKit/537.36 (KHTML, like Gecko) Chrome/76.0.3809.132 Safari/537.36'
}
# Build the POST request payload
self.post_data = {
'f': lan,
't': tolan,
'w': self.word
}
# Send the request
def request_post(self):
res = requests.post(url=self.url, headers=self.headers, data=self.post_data)
# print(res.content.decode())
return res.content.decode()
# Parse the response data
@staticmethod
def parse_data(data):
dict_data = json.loads(data)
if 'out' in dict_data['content']:
return dict_data['content']['out']
elif 'word_mean' in dict_data['content']:
return dict_data['content']['word_mean']
def translate(self):
data = self.request_post()
try:
# print('Ciba translation result:', self.parse_data(data))
return self.parse_data(data)
except Exception:
return 'The program hit a small problem; translation failed'
class Youdao():
def translate(self, content, lan, tolan):
try:
# Work around Youdao's anti-scraping signature check
u = 'fanyideskweb'
d = content
url = 'http://fanyi.youdao.com/translate?smartresult=dict&smartresult=rule'
f = str(int(time.time() * 1000) + random.randint(1, 10)) # millisecond timestamp (the "salt")
c = 'rY0D^0\'nM0}g5Mm1z%1G4'
sign = hashlib.md5((u + d + f + c).encode('utf-8')).hexdigest() # md5 signature derived from the payload
head = {}
head[
'User-Agent'] = 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/65.0.3325.181 Safari/537.36'
data = {}
# data: Youdao translation form fields
data['i'] = content
data['from'] = lan # 'AUTO'
data['to'] = tolan # 'AUTO'
data['smartresult'] = 'dict'
data['client'] = 'fanyideskweb'
data['salt'] = f # salt and sign are the anti-scraping fields; salt is the timestamp and changes per request
data['sign'] = sign # md5 of u + d + f + c
data['ts'] = '1551506287219'
data['bv'] = '97ba7c7fb78632ae9b11dcf6be726aee'
data['doctype'] = 'json'
data['version'] = '2.1'
data['keyfrom'] = 'fanyi.web'
data['action'] = 'FY_BY_REALTIME'
data['typoResult'] = 'False'
data = urllib.parse.urlencode(data).encode('utf-8')
request = urllib.request.Request(url=url, data=data, headers=head, method='POST')
response = urllib.request.urlopen(request)
line = json.load(response) # parse the JSON response
text = ''
for x in line['translateResult']:
text += x[0]['tgt']
# print('Youdao translation result:', text)
return text
except Exception:
return 'The program hit a small problem; translation failed'
class Youdao1():
def get_data(self, e, lan, tolan):
'''
Build the request form data.
:param e: the text to translate
:return: dict of form data for the POST request
'''
sjc = time.time()
ts = str(int(sjc * 1000))
salt = ts + str(int(random.random() * 10))
con = "fanyideskweb" + e + salt + "97_3(jkMYg@T[KZQmqjTK"
sign = hashlib.md5(con.encode(encoding='UTF-8')).hexdigest()
# 'from': 'AUTO',
# 'to': 'AUTO',
data = {
'i': e,
'from': lan,
'to': tolan,
'smartresult': 'dict',
'client': 'fanyideskweb',
'salt': salt,
'sign': sign,
'ts': ts,
'bv': '97ba7c7fb78632ae9b11dcf6be726aee',
'doctype': 'json',
'version': '2.1',
'keyfrom': 'fanyi.web',
'action': 'FY_BY_REALTlME',
'typoResult': 'False'
}
return data
def get_para(self, e, lan, tolan):
'''
Build the request parameters.
:param e: input string
:return: (form data, headers)
'''
header = {
'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/\
537.36 (KHTML, like Gecko) Chrome/71.0.3578.98 Safari/537.36',
'Cookie': 'OUTFOX_SEARCH_USER_ID=-1154806696@10.168.8.76; \
OUTFOX_SEARCH_USER_ID_NCOO=1227534676.2988937; \
JSESSIONID=aaa7LDLdy4Wbh9ECJb_Vw; ___rl__test__cookies=1563334957868',
'Referer': 'http://fanyi.youdao.com/'
}
return self.get_data(e, lan, tolan), header
def search(self, res):
'''
Extract the translated text from the response body.
:param res: raw response text
:return: the first matched translation
'''
import re
model = '"tgt":"(.*?)"'
rep = re.findall(model, res, re.S)
rep = rep[0]
return rep
def translate(self, content, lan, tolan):
try:
url = 'http://fanyi.youdao.com/translate_o?smartresult=dict&smartresult=rule'
data = self.get_para(content, lan, tolan)[0]
header = self.get_para(content, lan, tolan)[1]
response = requests.post(url, data=data, headers=header).text
result = self.search(response)
return result
except Exception:
return 'The program hit a small problem; translation failed'
class Google():
def translate(self, word, from_lan, to_lan):
headers = {
'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.77 Safari/537.36',
}
url = 'https://translate.google.cn/translate_a/single?client=t&sl=auto&tl={}&hl=zh-CN&dt=at&dt=bd&dt=ex&dt=ld&dt=md&dt=qca&dt=rw&dt=rm&dt=ss&dt=t&tk={}&q={}'
if len(word) > 4891:
raise RuntimeError('The length of word should be less than 4891...')
target_language = to_lan
res = requests.get(url.format(target_language, self.getTk(word), word), headers=headers)
# print(res.json()[0][0][0])
return res.json()[0][0][0]
def getTk(self, word):
evaljs = js2py.EvalJs()
js_code = self.gg_js_code
evaljs.execute(js_code)
tk = evaljs.TL(word)
return tk
def isChinese(self, word):
for w in word:
if '\u4e00' <= w <= '\u9fa5':
return True
return False
gg_js_code = '''
function TL(a) {
var k = "";
var b = 406644;
var b1 = 3293161072;
var jd = ".";
var $b = "+-a^+6";
var Zb = "+-3^+b+-f";
for (var e = [], f = 0, g = 0; g < a.length; g++) {
var m = a.charCodeAt(g);
128 > m ? e[f++] = m : (2048 > m ? e[f++] = m >> 6 | 192 : (55296 == (m & 64512) && g + 1 < a.length && 56320 == (a.charCodeAt(g + 1) & 64512) ? (m = 65536 + ((m & 1023) << 10) + (a.charCodeAt(++g) & 1023),
e[f++] = m >> 18 | 240,
e[f++] = m >> 12 & 63 | 128) : e[f++] = m >> 12 | 224,
e[f++] = m >> 6 & 63 | 128),
e[f++] = m & 63 | 128)
}
a = b;
for (f = 0; f < e.length; f++) a += e[f],
a = RL(a, $b);
a = RL(a, Zb);
a ^= b1 || 0;
0 > a && (a = (a & 2147483647) + 2147483648);
a %= 1E6;
return a.toString() + jd + (a ^ b)
};
function RL(a, b) {
var t = "a";
var Yb = "+";
for (var c = 0; c < b.length - 2; c += 3) {
var d = b.charAt(c + 2),
d = d >= t ? d.charCodeAt(0) - 87 : Number(d),
d = b.charAt(c + 1) == Yb ? a >>> d: a << d;
a = b.charAt(c) == Yb ? a + d & 4294967295 : a ^ d
}
return a
}
'''
class Tencent():
def __init__(self):
self.api_url = 'https://fanyi.qq.com/api/translate'
self.headers = {
'Cookie': 'fy_guid=605ead81-f210-47eb-bd80-ac6ae5e7a2d8; '
'qtv=ed286a053ae88763; '
'qtk=wfMmjh3k/7Sr2xVNg/LtITgPRlnvGWBzP9a4FN0dn9PE7L5jDYiYJnW03MJLRUGHEFNCRhTfrp/V+wUj0dun1KkKNUUmS86A/wGVf6ydzhwboelTOs0hfHuF0ndtSoX+N3486tUMlm62VU4i856mqw==; ',
'Host': 'fanyi.qq.com',
'Origin': 'https://fanyi.qq.com',
'Referer': 'https://fanyi.qq.com/',
'User-Agent': 'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, '
'like Gecko) Chrome/73.0.3683.86 Safari/537.36', }
self.fromlang = 'auto'
self.text = ''
self.tolang = 'en' # default target language: English
self.sessionUuid = str(int(time.time() * 1000))
self.fy_guid, self.qtv, self.qtk = self.get_qtv_qtk()
self.headers['Cookie'] = self.headers['Cookie'].replace(
'605ead81-f210-47eb-bd80-ac6ae5e7a2d8', self.fy_guid)
self.headers['Cookie'] = self.headers['Cookie'].replace(
'ed286a053ae88763', self.qtv)
self.headers['Cookie'] = self.headers['Cookie'].replace(
'wfMmjh3k/7Sr2xVNg/LtITgPRlnvGWBzP9a4FN0dn9PE7L5jDYiYJnW03MJLRUGHEFNCRhTfrp/V+wUj0dun1KkKNUUmS86A/wGVf6ydzhwboelTOs0hfHuF0ndtSoX+N3486tUMlm62VU4i856mqw==',
self.qtk)
def get_filter(self, text):
if isinstance(text, list):
text = ''.join(text)
text = str(text)
text = text.strip()
filter_list = [
'\r', '\n', '\t', '\u3000', '\xa0', '\u2002',
'<br>', '<br/>', ' ', ' ', ' ', '>>', '"',
'展开全部', ' '
]
for fl in filter_list:
text = text.replace(fl, '')
return text
def get_qtv_qtk(self):
api_url = 'https://fanyi.qq.com/'
headers = {
'User-Agent': 'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, '
'like Gecko) Chrome/73.0.3683.86 Safari/537.36', }
res = requests.get(api_url, headers=headers)
data = res.text
fy_guid = res.cookies.get('fy_guid')
reg = re.compile(r'var qtv = "(.*?)"')
qtv = reg.search(data).group(1)
reg = re.compile(r'var qtk = "(.*?)"')
qtk = reg.search(data).group(1)
return fy_guid, qtv, qtk
def getHtml(self, url, headers, data):
try:
html = requests.post(url=url, data=data, headers=headers)
# print(html.text)
datas = html.json()['translate']['records']
if html is not None and datas is not None:
# join the translated records into a single string
trans_result = ''.join([data['targetText'] for data in datas])
return trans_result
except Exception:
return None
def translate(self, text):
data = {
'source': self.fromlang,
'target': self.tolang,
'sourceText': text,
'qtv': self.qtv,
'qtk': self.qtk,
'sessionUuid': self.sessionUuid, }
try:
result = self.getHtml(self.api_url, self.headers, data)
# print('Tencent translation result:', result)
return result
except Exception:
return 'The program hit a small problem; translation failed'
class SanLiuLing():
def translate(self, content, lan, tolan):
try:
eng = "0";
if lan == 'en' and tolan == 'zh':
eng = "0"
elif lan == 'zh' and tolan == 'en':
eng = "1"
else:
return None
url = 'https://fanyi.so.com/index/search'
query_string = {"eng": eng, "validate": "", "ignore_trans": "0", "query": content}
headers = {
"user-agent": "Mozilla/5.0 (iPhone; CPU iPhone OS 11_0 like Mac OS X) AppleWebKit/604.1.38 (KHTML, like Gecko) Version/11.0 Mobile/15A372 Safari/604.1"}
response = requests.post(url=url, data=query_string, headers=headers)
response.encoding = 'utf-8'
dict_ret = json.loads(response.text)
print(dict_ret['data']['fanyi'])
return dict_ret['data']['fanyi']
except Exception:
return 'The program hit a small problem; translation failed'
languageMapCode = {
'检测语言': 'auto',
'阿尔巴尼亚语': 'sq',
'阿拉伯语': 'ar',
'阿姆哈拉语': 'am',
'阿塞拜疆语': 'az',
'爱尔兰语': 'ga',
'爱沙尼亚语': 'et',
'巴斯克语': 'eu',
'白俄罗斯语': 'be',
'保加利亚语': 'bg',
'冰岛语': 'is',
'波兰语': 'pl',
'波斯尼亚语': 'bs',
'波斯语': 'fa',
'布尔语(南非荷兰语)': 'af',
'丹麦语': 'da',
'德语': 'de',
'俄语': 'ru',
'法语': 'fr',
'菲律宾语': 'tl',
'芬兰语': 'fi',
'弗里西语': 'fy',
'高棉语': 'km',
'格鲁吉亚语': 'ka',
'古吉拉特语': 'gu',
'哈萨克语': 'kk',
'海地克里奥尔语': 'ht',
'韩语': 'ko',
'豪萨语': 'ha',
'荷兰语': 'nl',
'吉尔吉斯语': 'ky',
'加利西亚语': 'gl',
'加泰罗尼亚语': 'ca',
'捷克语': 'cs',
'卡纳达语': 'kn',
'科西嘉语': 'co',
'克罗地亚语': 'hr',
'库尔德语': 'ku',
'拉丁语': 'la',
'拉脱维亚语': 'lv',
'老挝语': 'lo',
'立陶宛语': 'lt',
'卢森堡语': 'lb',
'罗马尼亚语': 'ro',
'马尔加什语': 'mg',
'马耳他语': 'mt',
'马拉地语': 'mr',
'马拉雅拉姆语': 'ml',
'马来语': 'ms',
'马其顿语': 'mk',
'毛利语': 'mi',
'蒙古语': 'mn',
'孟加拉语': 'bn',
'缅甸语': 'my',
'苗语': 'hmn',
'南非科萨语': 'xh',
'南非祖鲁语': 'zu',
'尼泊尔语': 'ne',
'挪威语': 'no',
'旁遮普语': 'pa',
'葡萄牙语': 'pt',
'普什图语': 'ps',
'齐切瓦语': 'ny',
'日语': 'ja',
'瑞典语': 'sv',
'萨摩亚语': 'sm',
'塞尔维亚语': 'sr',
'塞索托语': 'st',
'僧伽罗语': 'si',
'世界语': 'eo',
'斯洛伐克语': 'sk',
'斯洛文尼亚语': 'sl',
'斯瓦希里语': 'sw',
'苏格兰盖尔语': 'gd',
'宿务语': 'ceb',
'索马里语': 'so',
'塔吉克语': 'tg',
'泰卢固语': 'te',
'泰米尔语': 'ta',
'泰语': 'th',
'土耳其语': 'tr',
'威尔士语': 'cy',
'乌尔都语': 'ur',
'乌克兰语': 'uk',
'乌兹别克语': 'uz',
'西班牙语': 'es',
'希伯来语': 'iw',
'希腊语': 'el',
'夏威夷语': 'haw',
'信德语': 'sd',
'匈牙利语': 'hu',
'修纳语': 'sn',
'亚美尼亚语': 'hy',
'伊博语': 'ig',
'意大利语': 'it',
'意第绪语': 'yi',
'印地语': 'hi',
'印尼巽他语': 'su',
'印尼语': 'id',
'印尼爪哇语': 'jw',
'英语': 'en',
'约鲁巴语': 'yo',
'越南语': 'vi',
'中文': 'zh-CN',
'中文(繁体)': 'zh-TW'
}
""" 获取语言代码 """
def get_language_code(language):
if language in languageMapCode:
return languageMapCode[language]
return ''
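The lookup behaves like a plain dict get with an empty-string fallback; a self-contained sketch (using a two-entry subset of the full table above):

```python
languageMapCode = {'英语': 'en', '日语': 'ja'}  # subset of the full mapping above

def get_language_code(language):
    # return the language code, or '' for unknown names
    if language in languageMapCode:
        return languageMapCode[language]
    return ''

print(get_language_code('英语'))     # -> en
print(get_language_code('Klingon'))  # -> '' (unknown name)
```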
def translate(api, content):
lan, tolan = language(content)
if not content:
return
results = "kong"
if api == 'baidu':
results = Baidu().translate(content, lan, tolan)
elif api == 'youdao':
results = Youdao1().translate(content, lan, tolan)
elif api == 'google':
results = Google().translate(content, lan, tolan)
elif api == 'Ciba':
ciba = Ciba(content, lan, tolan)
results = ciba.translate()
elif api == 'bing':
results = Bing().translate(content, lan, tolan)
elif api == 'tencent':
results = Tencent().translate(content)
elif api == '360':
results = SanLiuLing().translate(content, lan, tolan)
return results
# ==== file: jina/executors/evaluators/rank/recall.py (repo: yk/jina, license: Apache-2.0) ====
from typing import Sequence, Any, Optional
from . import BaseRankingEvaluator
class RecallEvaluator(BaseRankingEvaluator):
"""A :class:`RecallEvaluator` evaluates the Precision of the search.
It computes how many of the first given `eval_at` groundtruth are found in the matches
"""
metric = 'Recall@N'
def __init__(self,
eval_at: Optional[int] = None,
*args, **kwargs):
""""
:param eval_at: the point at which evaluation is computed, if None give, will consider all the input to evaluate
"""
super().__init__(*args, **kwargs)
self.eval_at = eval_at
def evaluate(self, actual: Sequence[Any], desired: Sequence[Any], *args, **kwargs) -> float:
""""
:param actual: the matched document identifiers from the request as matched by jina indexers and rankers
:param desired: the expected documents matches ids sorted as they are expected
:return the evaluation metric value for the request document
"""
if self.eval_at == 0:
return 0.0
actual_at_k = actual[:self.eval_at] if self.eval_at else actual
        ret = len(set(actual_at_k).intersection(set(desired)))
        # guard against an empty groundtruth list
        return ret / len(desired) if desired else 0.0
| 38.363636 | 120 | 0.648499 | 162 | 1,266 | 4.944444 | 0.5 | 0.059925 | 0.062422 | 0.029963 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003215 | 0.263033 | 1,266 | 32 | 121 | 39.5625 | 0.855305 | 0.406003 | 0 | 0 | 0 | 0 | 0.011869 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.133333 | false | 0 | 0.133333 | 0 | 0.533333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
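The recall@k logic in `RecallEvaluator.evaluate` can be exercised standalone; `recall_at_k` below is a hypothetical helper mirroring that computation outside the jina class hierarchy:

```python
from typing import Any, Optional, Sequence

def recall_at_k(actual: Sequence[Any], desired: Sequence[Any],
                eval_at: Optional[int] = None) -> float:
    """Fraction of `desired` ids found within the first `eval_at` matches."""
    if eval_at == 0 or not desired:
        return 0.0
    # truncate the matches to the top-k before intersecting with groundtruth
    actual_at_k = actual[:eval_at] if eval_at else actual
    return len(set(actual_at_k).intersection(desired)) / len(desired)

print(recall_at_k(['d1', 'd2', 'd3', 'd4'], ['d2', 'd4'], eval_at=2))  # 0.5
```

Only one of the two groundtruth ids (`d2`) appears in the top-2 matches, hence recall@2 is 0.5.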
c02ddc618f6444651370434e959eed89c5b43ed2 | 2,881 | py | Python | plugins/commands.py | Kalpesh0/Project01 | 42383a3aa4a3f17ab69dd01357bfbb0740ba965b | [
"MIT"
] | null | null | null | plugins/commands.py | Kalpesh0/Project01 | 42383a3aa4a3f17ab69dd01357bfbb0740ba965b | [
"MIT"
] | null | null | null | plugins/commands.py | Kalpesh0/Project01 | 42383a3aa4a3f17ab69dd01357bfbb0740ba965b | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
# @REQUEST_M0viz
from pyrogram import Client, filters
from pyrogram.types import InlineKeyboardMarkup, InlineKeyboardButton
from script import script
@Client.on_message(filters.command(["start"]) & filters.private)
async def start(client, message):
try:
await message.reply_text(
text=script.START_MSG.format(message.from_user.mention),
disable_web_page_preview=True,
reply_markup=InlineKeyboardMarkup(
[
[
InlineKeyboardButton("HELP", callback_data="help_data"),
InlineKeyboardButton("ABOUT", callback_data="about_data"),
],
[
InlineKeyboardButton(
"🖤JOIN SUPPORT GROUP🖤", url="https://t.me/REQUEST_M0viz")
]
]
),
reply_to_message_id=message.message_id
)
    except Exception:
pass
@Client.on_message(filters.command(["help"]) & filters.private)
async def help(client, message):
try:
await message.reply_text(
text=script.HELP_MSG,
disable_web_page_preview=True,
reply_markup=InlineKeyboardMarkup(
[
[
InlineKeyboardButton("BACK", callback_data="start_data"),
InlineKeyboardButton("ABOUT", callback_data="about_data"),
],
[
InlineKeyboardButton(
"💕DONATE US 💕", url="https://t.me/Harshsoni_08")
]
]
),
reply_to_message_id=message.message_id
)
    except Exception:
pass
@Client.on_message(filters.command(["about"]) & filters.private)
async def about(client, message):
try:
await message.reply_text(
text=script.ABOUT_MSG,
disable_web_page_preview=True,
reply_markup=InlineKeyboardMarkup(
[
[
InlineKeyboardButton("BACK", callback_data="help_data"),
InlineKeyboardButton("START", callback_data="start_data"),
],
[
InlineKeyboardButton(
"SHARE OUR GROUP 🖤🤙", url="http://t.me/share/url?url=Hey%20There%E2%9D%A4%EF%B8%8F%2C%0A%20%0A%20I%20Found%20A%20Really%20Awesome%20Group%20%20For%20Searching%20Movies%20Hope%20You%20will%20Join%20This%20Group%20Too😁😁👍%E2%9D%A4%EF%B8%8F%E2%9D%A4%EF%B8%8F%E2%9D%A4%EF%B8%8F%0A%20%0A%20Group%20Sharing%20Username%20Link%20%3A-%20%40REQUEST_M0viz")
]
]
),
reply_to_message_id=message.message_id
)
    except Exception:
pass
| 36.468354 | 373 | 0.532107 | 277 | 2,881 | 5.386282 | 0.350181 | 0.048257 | 0.016086 | 0.021448 | 0.608579 | 0.52882 | 0.520777 | 0.520777 | 0.426944 | 0.281501 | 0 | 0.047283 | 0.361333 | 2,881 | 78 | 374 | 36.935897 | 0.758696 | 0.020132 | 0 | 0.463768 | 0 | 0.014493 | 0.182979 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.043478 | 0.043478 | 0 | 0.043478 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c02e20b826436ffe36a314fe9b02b8f0b79df9d5 | 3,583 | py | Python | gpjax/utils.py | thomaspinder/GPJax | 929fcb88d13d15bb10e1175491dbc3e79622325a | [
"Apache-2.0"
] | 44 | 2020-12-03T14:07:39.000Z | 2022-03-14T17:45:34.000Z | gpjax/utils.py | thomaspinder/GPJax | 929fcb88d13d15bb10e1175491dbc3e79622325a | [
"Apache-2.0"
] | 28 | 2020-12-05T08:54:45.000Z | 2022-03-01T09:56:50.000Z | gpjax/utils.py | thomaspinder/GPJax | 929fcb88d13d15bb10e1175491dbc3e79622325a | [
"Apache-2.0"
] | 7 | 2021-02-05T12:37:57.000Z | 2022-03-13T13:00:20.000Z | from copy import deepcopy
from typing import Tuple
import jax.numpy as jnp
from jax.scipy.linalg import cho_factor, cho_solve
from multipledispatch import dispatch
from .types import Array
def I(n: int) -> Array:
"""
Compute an n x n identity matrix.
    :param n: The size of the matrix.
:return: An n x n identity matrix.
"""
return jnp.eye(n)
def concat_dictionaries(a: dict, b: dict) -> dict:
"""
Append one dictionary below another. If duplicate keys exist, then the key-value pair of the second supplied
dictionary will be used.
"""
return {**a, **b}
def merge_dictionaries(base_dict: dict, in_dict: dict) -> dict:
"""
    This will return a complete dictionary based on the keys of the first dictionary. If the same key exists in the
    second dictionary, then the key-value pair from the first dictionary will be overwritten. The purpose of this is that
the base_dict will be a complete dictionary of values such that an incomplete second dictionary can be used to
update specific key-value pairs.
:param base_dict: Complete dictionary of key-value pairs.
    :param in_dict: Subset of key-value pairs such that values from this dictionary will take precedence.
:return: A merged single dictionary.
"""
for k, v in base_dict.items():
if k in in_dict.keys():
base_dict[k] = in_dict[k]
return base_dict
def sort_dictionary(base_dict: dict) -> dict:
"""
Sort a dictionary based on the dictionary's key values.
:param base_dict: The unsorted dictionary.
:return: A dictionary sorted alphabetically on the dictionary's keys.
"""
return dict(sorted(base_dict.items()))
@dispatch(jnp.DeviceArray)
def standardise(x: jnp.DeviceArray) -> Tuple[jnp.DeviceArray, jnp.DeviceArray, jnp.DeviceArray]:
"""
Standardise a given matrix such that values are distributed according to a unit normal random variable. This is
primarily designed for standardising a training dataset.
:param x: A matrix of unstandardised values
:return: A matrix of standardised values
"""
xmean = jnp.mean(x, axis=0)
xstd = jnp.std(x, axis=0)
return (x - xmean) / xstd, xmean, xstd
@dispatch(jnp.DeviceArray, jnp.DeviceArray, jnp.DeviceArray)
def standardise(
x: jnp.DeviceArray, xmean: jnp.DeviceArray, xstd: jnp.DeviceArray
) -> jnp.DeviceArray:
"""
Standardise a given matrix with respect to a given mean and standard deviation. This is primarily designed for
standardising a test set of data with respect to the training data.
:param x: A matrix of unstandardised values
:param xmean: A precomputed mean vector
:param xstd: A precomputed standard deviation vector
:return: A matrix of standardised values
"""
return (x - xmean) / xstd
def unstandardise(
x: jnp.DeviceArray, xmean: jnp.DeviceArray, xstd: jnp.DeviceArray
) -> jnp.DeviceArray:
"""
Unstandardise a given matrix with respect to a previously computed mean and standard deviation. This is designed
for remapping a matrix back onto its original scale.
:param x: A standardised matrix.
:param xmean: A mean vector.
:param xstd: A standard deviation vector.
:return: A matrix of unstandardised values.
"""
return (x * xstd) + xmean
def as_constant(parameter_set: dict, params: list) -> Tuple[dict, dict]:
base_params = deepcopy(parameter_set)
sparams = {}
for param in params:
sparams[param] = base_params[param]
del base_params[param]
return base_params, sparams
| 33.485981 | 117 | 0.706391 | 513 | 3,583 | 4.88499 | 0.274854 | 0.089385 | 0.040702 | 0.067039 | 0.332003 | 0.300878 | 0.21668 | 0.082203 | 0.052674 | 0.052674 | 0 | 0.000712 | 0.215741 | 3,583 | 106 | 118 | 33.801887 | 0.891103 | 0.532236 | 0 | 0.105263 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.210526 | false | 0 | 0.157895 | 0 | 0.578947 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
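The precedence rules of `concat_dictionaries` and `merge_dictionaries` above differ in one respect: the former admits new keys from the second dict, while the latter only updates keys already present in the base. A plain-dict sketch (standalone re-implementations, for illustration only):

```python
def concat_dictionaries(a: dict, b: dict) -> dict:
    # duplicate keys: value from `b` wins; new keys in `b` are admitted
    return {**a, **b}

def merge_dictionaries(base_dict: dict, in_dict: dict) -> dict:
    # duplicate keys: value from `in_dict` wins; keys absent from
    # `base_dict` are ignored entirely
    for k in base_dict:
        if k in in_dict:
            base_dict[k] = in_dict[k]
    return base_dict

base = {'lengthscale': 1.0, 'variance': 2.0}
update = {'variance': 0.5, 'noise': 0.1}
print(concat_dictionaries(dict(base), update))
# {'lengthscale': 1.0, 'variance': 0.5, 'noise': 0.1}
print(merge_dictionaries(dict(base), update))
# {'lengthscale': 1.0, 'variance': 0.5}
```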
c032d43f5d12902206b5df36fccb87158ca21d3e | 466 | py | Python | setup.py | Kamuish/StarSearch | 63e5f6ee544ab1d48ae5b0d8e9067cedccc40d1e | [
"MIT"
] | null | null | null | setup.py | Kamuish/StarSearch | 63e5f6ee544ab1d48ae5b0d8e9067cedccc40d1e | [
"MIT"
] | null | null | null | setup.py | Kamuish/StarSearch | 63e5f6ee544ab1d48ae5b0d8e9067cedccc40d1e | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
from setuptools import setup
setup(name='starsearch',
version='0.3',
description='Package to dig into the ESO archives',
author='João Camacho',
author_email='joao.camacho@astro.up.pt',
license='MIT',
url='https://github.com/jdavidrcamacho/starsearch',
packages=['starsearch'],
install_requires=[
'numpy',
'astroquery',
"astropy",
],
)
| 24.526316 | 57 | 0.592275 | 50 | 466 | 5.48 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011429 | 0.248927 | 466 | 18 | 58 | 25.888889 | 0.771429 | 0.092275 | 0 | 0 | 0 | 0 | 0.389549 | 0.057007 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.066667 | 0 | 0.066667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c03e4bfd7eee3d8023944a7e3e5535ae1233ba11 | 1,341 | py | Python | build.py | jmetzz/coffee-chatbot | da7e76d9532c8e5e38a47a19ffed1f1e27601766 | [
"MIT"
] | null | null | null | build.py | jmetzz/coffee-chatbot | da7e76d9532c8e5e38a47a19ffed1f1e27601766 | [
"MIT"
] | null | null | null | build.py | jmetzz/coffee-chatbot | da7e76d9532c8e5e38a47a19ffed1f1e27601766 | [
"MIT"
] | null | null | null | from pybuilder.core import use_plugin, init
use_plugin("python.core")
use_plugin("python.unittest")
use_plugin("python.install_dependencies")
use_plugin("python.flake8")
use_plugin("python.coverage")
name = "ActionServerPybuilder"
default_task = ['install_dependencies', 'analyze', 'publish']
@init
def set_properties(project):
project.build_depends_on('tblib')
project.build_depends_on('mockito')
project.build_depends_on('parameterized')
project.build_depends_on('responses')
@init
def initialize_flake8_plugin(project):
project.build_depends_on("flake8")
project.set_property('unittest_module_glob', 'test_*')
project.set_property("flake8_verbose_output", True)
project.set_property("flake8_break_build", True)
project.set_property("flake8_max_line_length", 120)
project.set_property("flake8_exclude_patterns", None)
project.set_property("flake8_include_test_sources", False)
project.set_property("flake8_include_scripts", False)
@init
def initialize_coverage_plugin(project):
project.set_property('coverage_break_build', False)
project.set_property('coverage_threshold_warn', 80)
# for now, code coverage does not break the build
# as we do Python, a scripted language, you have to aim for 100% coverage!
project.set_property("coverage_exceptions", ['endpoint'])
| 35.289474 | 78 | 0.774049 | 172 | 1,341 | 5.709302 | 0.418605 | 0.101833 | 0.183299 | 0.14664 | 0.177189 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014286 | 0.112603 | 1,341 | 37 | 79 | 36.243243 | 0.810924 | 0.089485 | 0 | 0.103448 | 0 | 0 | 0.332512 | 0.152709 | 0 | 0 | 0 | 0 | 0 | 1 | 0.103448 | false | 0 | 0.034483 | 0 | 0.137931 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c04331c5a5c72cc4fd22977bf1a531a2facdca4e | 445 | py | Python | Cleaning.py | TharindraParanagama/MovieClassification | 2cdee9a2aaf1f55d0a59b20181e69c524c4d5895 | [
"MIT"
] | null | null | null | Cleaning.py | TharindraParanagama/MovieClassification | 2cdee9a2aaf1f55d0a59b20181e69c524c4d5895 | [
"MIT"
] | null | null | null | Cleaning.py | TharindraParanagama/MovieClassification | 2cdee9a2aaf1f55d0a59b20181e69c524c4d5895 | [
"MIT"
] | null | null | null | import csv
input = open('MovieI.csv', 'r')
output = open('MovieO.csv', 'w')
writer = csv.writer(output)
for row in csv.reader(input):
    # keep only rows whose first five fields are all non-empty,
    # writing each kept row exactly once
    if any(row[i] == '' for i in range(5)):
        continue
    writer.writerow(row)
input.close()
output.close() | 21.190476 | 33 | 0.483146 | 56 | 445 | 3.839286 | 0.5 | 0.167442 | 0.223256 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016611 | 0.323596 | 445 | 21 | 34 | 21.190476 | 0.697674 | 0 | 0 | 0.263158 | 0 | 0 | 0.053812 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.052632 | 0 | 0.052632 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
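The keep/drop rule in `Cleaning.py` (discard any row with an empty value in its first five fields) can be exercised in memory with `io.StringIO`, without touching the real `MovieI.csv`:

```python
import csv
import io

raw = "t1,g1,d1,a1,p1\nt2,,d2,a2,p2\nt3,g3,d3,a3,p3\n"
kept = []
for row in csv.reader(io.StringIO(raw)):
    # drop the row if any of the first five fields is empty
    if any(row[i] == '' for i in range(5)):
        continue
    kept.append(row)

print(kept)  # [['t1', 'g1', 'd1', 'a1', 'p1'], ['t3', 'g3', 'd3', 'a3', 'p3']]
```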
c046c72c4e753549e8ec891d9f48179094bc06ed | 775 | py | Python | manage.py | BeyondLam/Flask_Blog_Python3 | 274c932e9ea28bb6c83335e408a2cd9f1cf4fcb6 | [
"Apache-2.0"
] | 2 | 2019-10-25T16:35:41.000Z | 2019-10-26T10:54:00.000Z | manage.py | BeyondLam/Flask_Blog_Python3 | 274c932e9ea28bb6c83335e408a2cd9f1cf4fcb6 | [
"Apache-2.0"
] | null | null | null | manage.py | BeyondLam/Flask_Blog_Python3 | 274c932e9ea28bb6c83335e408a2cd9f1cf4fcb6 | [
"Apache-2.0"
] | null | null | null | from app import create_app, db
from flask_script import Manager
from flask_migrate import Migrate, MigrateCommand
app = create_app("develop")
manager = Manager(app)
Migrate(app, db)
manager.add_command("db", MigrateCommand)
# Initialize the admin account data and register a manager command
@manager.command
def create_admin():
from app.models import Admin
from config_message.constant import ADMIN_USERNAME, ADMIN_PASSWORD, ADMIN_AVATAR_URL, ADMIN_POWER
try:
admin_new = Admin(username=ADMIN_USERNAME, password=ADMIN_PASSWORD, avatar=ADMIN_AVATAR_URL,
power=ADMIN_POWER)
db.session.add(admin_new)
db.session.commit()
print("初始化成功")
except:
print("初始化失败")
db.session.rollback()
if __name__ == '__main__':
manager.run() | 27.678571 | 101 | 0.707097 | 96 | 775 | 5.427083 | 0.40625 | 0.074856 | 0.069098 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.20129 | 775 | 28 | 102 | 27.678571 | 0.84168 | 0.028387 | 0 | 0 | 0 | 0 | 0.035904 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.045455 | false | 0.090909 | 0.227273 | 0 | 0.272727 | 0.090909 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
c047ab7812a83340a4a3ccb035cf5db37d2b6b67 | 2,954 | py | Python | qiling/qiling/cc/intel.py | mrTavas/owasp-fstm-auto | 6e9ff36e46d885701c7419db3eca15f12063a7f3 | [
"CC0-1.0"
] | 2 | 2021-05-05T12:03:01.000Z | 2021-06-04T14:27:15.000Z | qiling/qiling/cc/intel.py | mrTavas/owasp-fstm-auto | 6e9ff36e46d885701c7419db3eca15f12063a7f3 | [
"CC0-1.0"
] | null | null | null | qiling/qiling/cc/intel.py | mrTavas/owasp-fstm-auto | 6e9ff36e46d885701c7419db3eca15f12063a7f3 | [
"CC0-1.0"
] | 2 | 2021-05-05T12:03:09.000Z | 2021-06-04T14:27:21.000Z | #!/usr/bin/env python3
#
# Cross Platform and Multi Architecture Advanced Binary Emulation Framework
from unicorn.x86_const import (
UC_X86_REG_AX, UC_X86_REG_EAX, UC_X86_REG_RAX, UC_X86_REG_RCX,
UC_X86_REG_RDI, UC_X86_REG_RDX, UC_X86_REG_RSI, UC_X86_REG_R8,
UC_X86_REG_R9, UC_X86_REG_R10
)
from qiling import Qiling
from . import QlCommonBaseCC
class QlIntelBaseCC(QlCommonBaseCC):
"""Calling convention base class for Intel-based systems.
Supports arguments passing over registers and stack.
"""
def __init__(self, ql: Qiling):
retreg = {
16: UC_X86_REG_AX,
32: UC_X86_REG_EAX,
64: UC_X86_REG_RAX
}[ql.archbit]
super().__init__(ql, retreg)
def unwind(self, nslots: int) -> int:
# no cleanup; just pop out the return address
return self.ql.arch.stack_pop()
class QlIntel64(QlIntelBaseCC):
"""Calling convention base class for Intel-based 64-bit systems.
"""
@staticmethod
def getNumSlots(argbits: int) -> int:
return max(argbits, 64) // 64
class QlIntel32(QlIntelBaseCC):
"""Calling convention base class for Intel-based 32-bit systems.
"""
@staticmethod
def getNumSlots(argbits: int) -> int:
return max(argbits, 32) // 32
def getRawParam(self, slot: int, nbits: int = None) -> int:
__super_getparam = super().getRawParam
if nbits == 64:
lo = __super_getparam(slot)
hi = __super_getparam(slot + 1)
val = (hi << 32) | lo
else:
val = __super_getparam(slot, nbits)
return val
class amd64(QlIntel64):
"""Default calling convention for POSIX (x86-64).
First 6 arguments are passed in regs, the rest are passed on the stack.
"""
_argregs = (UC_X86_REG_RDI, UC_X86_REG_RSI, UC_X86_REG_RDX, UC_X86_REG_R10, UC_X86_REG_R8, UC_X86_REG_R9) + (None, ) * 10
class ms64(QlIntel64):
"""Default calling convention for Windows and UEFI (x86-64).
First 4 arguments are passed in regs, the rest are passed on the stack.
Each stack frame starts with a shadow space in size of 4 items, corresponding
to the first arguments passed in regs.
"""
_argregs = (UC_X86_REG_RCX, UC_X86_REG_RDX, UC_X86_REG_R8, UC_X86_REG_R9) + (None, ) * 12
_shadow = 4
class macosx64(QlIntel64):
"""Default calling convention for Mac OS (x86-64).
First 6 arguments are passed in regs, the rest are passed on the stack.
"""
_argregs = (UC_X86_REG_RDI, UC_X86_REG_RSI, UC_X86_REG_RDX, UC_X86_REG_RCX, UC_X86_REG_R8, UC_X86_REG_R9) + (None, ) * 10
class cdecl(QlIntel32):
"""Calling convention used by all operating systems (x86).
All arguments are passed on the stack.
    The caller is responsible for unwinding the stack.
"""
_argregs = (None, ) * 16
class stdcall(QlIntel32):
"""Calling convention used by all operating systems (x86).
All arguments are passed on the stack.
    The callee is responsible for unwinding the stack.
"""
_argregs = (None, ) * 16
def unwind(self, nslots: int) -> int:
retaddr = super().unwind(nslots)
self.ql.reg.arch_sp += (nslots * self._asize)
return retaddr
| 26.854545 | 122 | 0.728842 | 464 | 2,954 | 4.387931 | 0.286638 | 0.071218 | 0.113949 | 0.034381 | 0.556974 | 0.503929 | 0.479371 | 0.426326 | 0.365422 | 0.312377 | 0 | 0.060025 | 0.170955 | 2,954 | 109 | 123 | 27.100917 | 0.771335 | 0.393703 | 0 | 0.163265 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.122449 | false | 0 | 0.061224 | 0.061224 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
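The 32-bit slot arithmetic in `QlIntel32` can be sketched in isolation: a 64-bit argument occupies two 32-bit stack slots, and `getRawParam` recombines them as `(hi << 32) | lo` (standalone helpers, hypothetical names):

```python
def get_num_slots_32(argbits: int) -> int:
    # each stack slot is 32 bits wide; arguments narrower than a slot
    # still consume one full slot
    return max(argbits, 32) // 32

def combine_slots_32(lo: int, hi: int) -> int:
    # reassemble a 64-bit value from its low and high 32-bit stack slots
    return (hi << 32) | lo

print(get_num_slots_32(64))                           # 2
print(hex(combine_slots_32(0x89abcdef, 0x01234567)))  # 0x123456789abcdef
```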
c04d90069f191974d0ed369a9c73406bd54fa0cc | 2,114 | py | Python | xblock/test/test_json_conversion.py | edly-io/XBlock | 60d01a32e5bfe1b543f598cbc56ba3f4d736129d | [
"Apache-2.0"
] | null | null | null | xblock/test/test_json_conversion.py | edly-io/XBlock | 60d01a32e5bfe1b543f598cbc56ba3f4d736129d | [
"Apache-2.0"
] | null | null | null | xblock/test/test_json_conversion.py | edly-io/XBlock | 60d01a32e5bfe1b543f598cbc56ba3f4d736129d | [
"Apache-2.0"
] | null | null | null | """
Tests asserting that ModelTypes convert to and from json when working
with ModelDatas
"""
# Allow inspection of private class members
# pylint: disable=protected-access
from mock import Mock
from xblock.core import XBlock
from xblock.fields import Field, Scope, ScopeIds
from xblock.field_data import DictFieldData
from xblock.test.tools import TestRuntime
class TestJSONConversionField(Field):
"""Field for testing json conversion"""
__test__ = False
def from_json(self, value):
assert value['$type'] == 'set'
return set(value['$vals'])
def to_json(self, value):
return {
'$type': 'set',
'$vals': sorted(value)
}
class TestBlock(XBlock):
"""XBlock for testing json conversion"""
__test__ = False
field_a = TestJSONConversionField(scope=Scope.content)
field_b = TestJSONConversionField(scope=Scope.content)
class TestModel(DictFieldData):
"""ModelData for testing json conversion"""
__test__ = False
def default(self, block, name):
return {'$type': 'set', '$vals': [0, 1]}
class TestJsonConversion:
"""
Verify that all ModelType operations correctly convert
the json that comes out of the ModelData to python objects
"""
def setup_method(self):
"""
Setup for each test method in this class.
"""
field_data = TestModel({
'field_a': {'$type': 'set', '$vals': [1, 2, 3]}
})
runtime = TestRuntime(services={'field-data': field_data})
self.block = TestBlock(runtime, scope_ids=Mock(spec=ScopeIds)) # pylint: disable=attribute-defined-outside-init
def test_get(self):
# Test field with a value
assert isinstance(self.block.field_a, set)
# Test ModelData default
assert isinstance(self.block.field_b, set)
def test_set(self):
self.block.field_b = set([5, 6, 5])
self.block.save()
assert isinstance(self.block.field_b, set)
assert {'$type': 'set', '$vals': [5, 6]} == \
self.block._field_data.get(self.block, 'field_b')
| 28.186667 | 120 | 0.64333 | 256 | 2,114 | 5.191406 | 0.359375 | 0.060948 | 0.063205 | 0.045147 | 0.16629 | 0.130173 | 0.105342 | 0 | 0 | 0 | 0 | 0.006215 | 0.238884 | 2,114 | 74 | 121 | 28.567568 | 0.819764 | 0.245033 | 0 | 0.128205 | 0 | 0 | 0.058284 | 0 | 0 | 0 | 0 | 0 | 0.128205 | 1 | 0.153846 | false | 0 | 0.128205 | 0.051282 | 0.589744 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
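The `$type`/`$vals` round-trip tested above can be checked without the XBlock machinery; these are standalone re-implementations of the field's two converters:

```python
def set_to_json(value):
    # mirrors TestJSONConversionField.to_json: sets serialize to a
    # tagged dict with sorted values for determinism
    return {'$type': 'set', '$vals': sorted(value)}

def set_from_json(value):
    # mirrors TestJSONConversionField.from_json
    assert value['$type'] == 'set'
    return set(value['$vals'])

print(set_to_json({3, 1, 2}))  # {'$type': 'set', '$vals': [1, 2, 3]}
print(set_from_json({'$type': 'set', '$vals': [5, 6]}) == {5, 6})  # True
```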
c04f13b9a712c28cf890f8bd241f887d6602c688 | 42,844 | py | Python | modisco/coordproducers.py | Bluedragon137/tfmodisco | d7c56b21e1bb58b07695ef3035f173b7d1a039e6 | [
"MIT"
] | null | null | null | modisco/coordproducers.py | Bluedragon137/tfmodisco | d7c56b21e1bb58b07695ef3035f173b7d1a039e6 | [
"MIT"
] | null | null | null | modisco/coordproducers.py | Bluedragon137/tfmodisco | d7c56b21e1bb58b07695ef3035f173b7d1a039e6 | [
"MIT"
] | null | null | null | from __future__ import division, print_function, absolute_import
from .core import SeqletCoordinates
from modisco import util
import numpy as np
from collections import defaultdict, Counter, OrderedDict
import itertools
import sys
import time
from .value_provider import (
AbstractValTransformer, AbsPercentileValTransformer,
SignedPercentileValTransformer, PrecisionValTransformer)
import scipy
from sklearn.isotonic import IsotonicRegression
SUBSAMPLE_CAP = 1000000
#The only parts of TransformAndThresholdResults that are used in
# TfModiscoWorkflow are the transformed_pos/neg_thresholds and the
# val_transformer (used in metaclustering with multiple tasks)
#TransformAndThresholdResults are also used to be
# able to replicate the same procedure used for identifying coordinates as
# when TfMoDisco was first run; the information needed in that case would
# be specific to the type of Coordproducer used
class AbstractTransformAndThresholdResults(object):
def __init__(self, transformed_neg_threshold, transformed_pos_threshold,
val_transformer):
self.transformed_neg_threshold = transformed_neg_threshold
self.transformed_pos_threshold = transformed_pos_threshold
self.val_transformer = val_transformer
@classmethod
def from_hdf5(cls, grp):
if "class" not in grp.attrs:
the_class = FWACTransformAndThresholdResults
else:
the_class = eval(grp.attrs["class"])
if (the_class.__name__ != cls.__name__):
return the_class.from_hdf5(grp)
class BasicTransformAndThresholdResults(AbstractTransformAndThresholdResults):
def save_hdf5(self, grp):
grp.attrs["class"] = type(self).__name__
grp.attrs["transformed_neg_threshold"] = self.transformed_neg_threshold
grp.attrs["transformed_pos_threshold"] = self.transformed_pos_threshold
self.val_transformer.save_hdf5(grp.create_group("val_transformer"))
@classmethod
def load_basic_attrs_from_hdf5(cls, grp):
transformed_neg_threshold = grp.attrs['transformed_neg_threshold']
transformed_pos_threshold = grp.attrs['transformed_pos_threshold']
val_transformer = AbstractValTransformer.from_hdf5(
grp["val_transformer"])
return (transformed_neg_threshold, transformed_pos_threshold,
val_transformer)
@classmethod
def from_hdf5(cls, grp):
the_class = eval(grp.attrs["class"])
(transformed_neg_threshold,
transformed_pos_threshold,
val_transformer) = cls.load_basic_attrs_from_hdf5(grp)
return cls(transformed_neg_threshold=transformed_neg_threshold,
transformed_pos_threshold=transformed_pos_threshold,
val_transformer=val_transformer)
#FWAC = FixedWindowAroundChunks; this TransformAndThresholdResults object
# is specific to the type of info needed in that case.
class FWACTransformAndThresholdResults(
BasicTransformAndThresholdResults):
def __init__(self, neg_threshold,
transformed_neg_threshold,
pos_threshold,
transformed_pos_threshold,
val_transformer):
#both 'transformed_neg_threshold' and 'transformed_pos_threshold'
# should be positive, i.e. they should be relative to the
# transformed distribution used to set the threshold, e.g. a
# cdf value
self.neg_threshold = neg_threshold
self.pos_threshold = pos_threshold
super(FWACTransformAndThresholdResults, self).__init__(
transformed_neg_threshold=transformed_neg_threshold,
transformed_pos_threshold=transformed_pos_threshold,
val_transformer=val_transformer)
def save_hdf5(self, grp):
super(FWACTransformAndThresholdResults, self).save_hdf5(grp)
grp.attrs["neg_threshold"] = self.neg_threshold
grp.attrs["pos_threshold"] = self.pos_threshold
@classmethod
def from_hdf5(cls, grp):
(transformed_neg_threshold, transformed_pos_threshold,
val_transformer) = cls.load_basic_attrs_from_hdf5(grp)
neg_threshold = grp.attrs['neg_threshold']
pos_threshold = grp.attrs['pos_threshold']
return cls(neg_threshold=neg_threshold,
transformed_neg_threshold=transformed_neg_threshold,
pos_threshold=pos_threshold,
transformed_pos_threshold=transformed_pos_threshold,
val_transformer=val_transformer)
class AbstractCoordProducer(object):
def __call__(self):
raise NotImplementedError()
@classmethod
def from_hdf5(cls, grp):
the_class = eval(grp.attrs["class"])
return the_class.from_hdf5(grp)
class SeqletCoordsFWAP(SeqletCoordinates):
"""
Coordinates for the FixedWindowAroundChunks CoordProducer
"""
def __init__(self, example_idx, start, end, score, other_info={}):
self.score = score
self.other_info = other_info
super(SeqletCoordsFWAP, self).__init__(
example_idx=example_idx,
start=start, end=end,
is_revcomp=False)
class CoordProducerResults(object):
def __init__(self, coords, tnt_results):
self.coords = coords
self.tnt_results = tnt_results
@classmethod
def from_hdf5(cls, grp):
coord_strings = util.load_string_list(dset_name="coords",
grp=grp)
coords = [SeqletCoordinates.from_string(x) for x in coord_strings]
tnt_results = AbstractTransformAndThresholdResults.from_hdf5(
grp["tnt_results"])
return CoordProducerResults(coords=coords,
tnt_results=tnt_results)
def save_hdf5(self, grp):
util.save_string_list(
string_list=[str(x) for x in self.coords],
dset_name="coords",
grp=grp)
self.tnt_results.save_hdf5(
grp=grp.create_group("tnt_results"))
def get_simple_window_sum_function(window_size):
def window_sum_function(arrs):
to_return = []
for arr in arrs:
cumsum = np.cumsum(arr)
cumsum = np.array([0]+list(cumsum))
to_return.append(cumsum[window_size:]-cumsum[:-window_size])
return to_return
return window_sum_function
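`get_simple_window_sum_function` computes every window sum from a single cumulative-sum pass in O(n); a stdlib-only sketch of the same trick:

```python
from itertools import accumulate

def window_sums(arr, window_size):
    # cumsum[i] holds sum(arr[:i]); each window sum is then a
    # difference of two cumulative sums
    cumsum = [0] + list(accumulate(arr))
    return [cumsum[i + window_size] - cumsum[i]
            for i in range(len(arr) - window_size + 1)]

print(window_sums([1, 2, 3, 4], 2))  # [3, 5, 7]
```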
class GenerateNullDist(object):
def __call__(self, score_track):
raise NotImplementedError()
class TakeSign(GenerateNullDist):
@classmethod
def from_hdf5(cls, grp):
raise NotImplementedError()
def save_hdf(cls, grp):
raise NotImplementedError()
def __call__(self, score_track):
null_tracks = [np.sign(x) for x in score_track]
return null_tracks
class TakeAbs(GenerateNullDist):
@classmethod
def from_hdf5(cls, grp):
raise NotImplementedError()
def save_hdf(cls, grp):
raise NotImplementedError()
def __call__(self, score_track):
null_tracks = [np.abs(x) for x in score_track]
return null_tracks
class LaplaceNullDist(GenerateNullDist):
def __init__(self, num_to_samp, verbose=True,
percentiles_to_use=[5*(x+1) for x in range(19)],
random_seed=1234):
self.num_to_samp = num_to_samp
self.verbose = verbose
self.percentiles_to_use = np.array(percentiles_to_use)
self.random_seed = random_seed
self.rng = np.random.RandomState()
@classmethod
def from_hdf5(cls, grp):
num_to_samp = grp.attrs["num_to_samp"]
verbose = grp.attrs["verbose"]
percentiles_to_use = np.array(grp["percentiles_to_use"][:])
        return cls(num_to_samp=num_to_samp, verbose=verbose,
                   percentiles_to_use=percentiles_to_use)
def save_hdf5(self, grp):
grp.attrs["class"] = type(self).__name__
grp.attrs["num_to_samp"] = self.num_to_samp
grp.attrs["verbose"] = self.verbose
grp.create_dataset('percentiles_to_use',
data=self.percentiles_to_use)
def __call__(self, score_track, window_size, original_summed_score_track):
#original_summed_score_track is supplied to avoid recomputing it
if (original_summed_score_track is None):
window_sum_function = get_simple_window_sum_function(window_size)
original_summed_score_track = window_sum_function(arrs=score_track)
values = np.concatenate(original_summed_score_track, axis=0)
# first estimate mu, using two level histogram to get to 1e-6
hist1, bin_edges1 = np.histogram(values, bins=1000)
peak1 = np.argmax(hist1)
l_edge = bin_edges1[peak1]
r_edge = bin_edges1[peak1+1]
top_values = values[ (l_edge < values) & (values < r_edge) ]
hist2, bin_edges2 = np.histogram(top_values, bins=1000)
peak2 = np.argmax(hist2)
l_edge = bin_edges2[peak2]
r_edge = bin_edges2[peak2+1]
mu = (l_edge + r_edge) / 2
if (self.verbose):
print("peak(mu)=", mu)
pos_values = [x for x in values if x >= mu]
neg_values = [x for x in values if x <= mu]
#for an exponential distribution:
# cdf = 1 - exp(-lambda*x)
# exp(-lambda*x) = 1-cdf
# -lambda*x = log(1-cdf)
# lambda = -log(1-cdf)/x
# x = -log(1-cdf)/lambda
#Take the most aggressive lambda over all percentiles
pos_laplace_lambda = np.max(
-np.log(1-(self.percentiles_to_use/100.0))/
(np.percentile(a=pos_values, q=self.percentiles_to_use)-mu))
neg_laplace_lambda = np.max(
-np.log(1-(self.percentiles_to_use/100.0))/
(np.abs(np.percentile(a=neg_values,
q=100-self.percentiles_to_use)-mu)))
self.rng.seed(self.random_seed)
prob_pos = float(len(pos_values))/(len(pos_values)+len(neg_values))
sampled_vals = []
for i in range(self.num_to_samp):
sign = 1 if (self.rng.uniform() < prob_pos) else -1
if (sign == 1):
sampled_cdf = self.rng.uniform()
val = -np.log(1-sampled_cdf)/pos_laplace_lambda + mu
else:
sampled_cdf = self.rng.uniform()
val = mu + np.log(1-sampled_cdf)/neg_laplace_lambda
sampled_vals.append(val)
return np.array(sampled_vals)
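`LaplaceNullDist` draws null values by inverting the exponential CDF noted in the comments (`x = -log(1-cdf)/lambda`); a minimal sketch of that sampling step for a one-sided tail (hypothetical helper name, no mixture over signs):

```python
import math
import random

def sample_exponential_tail(lam, rng):
    # invert cdf = 1 - exp(-lam*x)  =>  x = -log(1 - cdf)/lam
    return -math.log(1 - rng.random()) / lam

rng = random.Random(0)
samples = [sample_exponential_tail(2.0, rng) for _ in range(100000)]
print(sum(samples) / len(samples))  # close to the exponential mean 1/lam = 0.5
```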
class FlipSignNullDist(GenerateNullDist):
def __init__(self, num_seq_to_samp, shuffle_pos=False,
seed=1234, num_breaks=100,
lower_null_percentile=20,
upper_null_percentile=80):
self.num_seq_to_samp = num_seq_to_samp
self.shuffle_pos = shuffle_pos
self.seed = seed
self.rng = np.random.RandomState()
self.num_breaks = num_breaks
self.lower_null_percentile = lower_null_percentile
self.upper_null_percentile = upper_null_percentile
@classmethod
def from_hdf5(cls, grp):
raise NotImplementedError()
def save_hdf(cls, grp):
raise NotImplementedError()
def __call__(self, score_track, windowsize, original_summed_score_track):
#summed_score_track is supplied to avoid recomputing it
window_sum_function = get_simple_window_sum_function(windowsize)
        if (original_summed_score_track is None):
original_summed_score_track = window_sum_function(arrs=score_track)
all_orig_summed_scores = np.concatenate(
original_summed_score_track, axis=0)
pos_threshold = np.percentile(a=all_orig_summed_scores,
q=self.upper_null_percentile)
neg_threshold = np.percentile(a=all_orig_summed_scores,
q=self.lower_null_percentile)
        #retain only the portions of the tracks that are under the
        # thresholds
        retained_track_portions = []
        num_pos_vals = 0
        num_neg_vals = 0
        for (single_score_track, single_summed_score_track)\
                in zip(score_track, original_summed_score_track):
            window_passing_track = [
                (1.0 if (x > neg_threshold and x < pos_threshold) else 0)
                for x in single_summed_score_track]
            padded_window_passing_track = [0.0]*int(windowsize-1)
            padded_window_passing_track.extend(window_passing_track)
            padded_window_passing_track.extend([0.0]*int(windowsize-1))
            pos_in_passing_window = window_sum_function(
                [padded_window_passing_track])[0]
            assert len(single_score_track)==len(pos_in_passing_window)
            single_retained_track = []
            for (val, pos_passing) in zip(single_score_track,
                                          pos_in_passing_window):
                if (pos_passing > 0):
                    single_retained_track.append(val)
                    num_pos_vals += (1 if val > 0 else 0)
                    num_neg_vals += (1 if val < 0 else 0)
            retained_track_portions.append(single_retained_track)
        print("Fraction of positions retained:",
              sum(len(x) for x in retained_track_portions)/
              sum(len(x) for x in score_track))
        prob_pos = num_pos_vals/float(num_pos_vals + num_neg_vals)
        self.rng.seed(self.seed)
        null_tracks = []
        for i in range(self.num_seq_to_samp):
            random_track = retained_track_portions[
                int(self.rng.randint(0,len(retained_track_portions)))]
            track_with_sign_flips = np.array([
                abs(x)*(1 if self.rng.uniform() < prob_pos else -1)
                for x in random_track])
            if (self.shuffle_pos):
                self.rng.shuffle(track_with_sign_flips)
            null_tracks.append(track_with_sign_flips)
        return np.concatenate(window_sum_function(null_tracks), axis=0)
def get_null_vals(null_track, score_track, window_size,
                  original_summed_score_track):
    if (hasattr(null_track, '__call__')):
        null_vals = null_track(
            score_track=score_track,
            windowsize=window_size,
            original_summed_score_track=original_summed_score_track)
    else:
        window_sum_function = get_simple_window_sum_function(window_size)
        null_summed_score_track = window_sum_function(arrs=null_track)
        null_vals = list(np.concatenate(null_summed_score_track, axis=0))
    return null_vals
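`get_simple_window_sum_function` is defined elsewhere in this module; for orientation, a minimal stand-in with the same calling convention (an assumption — the real implementation may differ, e.g. use cumulative sums for speed) could look like:

```python
import numpy as np

def get_simple_window_sum_function(window_size):
    # Returns a function mapping each 1-D array in `arrs` to its
    # sliding-window sums (output length = len(arr) - window_size + 1).
    def window_sum_function(arrs):
        return [np.convolve(arr, np.ones(window_size), mode='valid')
                for arr in arrs]
    return window_sum_function

sums = get_simple_window_sum_function(2)(arrs=[np.array([1.0, 2.0, 3.0])])
print(sums[0])  # -> [3. 5.]
```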
def subsample_if_large(arr):
    if (len(arr) > SUBSAMPLE_CAP):
        print("Subsampling!")
        sys.stdout.flush()
        arr = np.random.RandomState(1234).choice(a=arr, size=SUBSAMPLE_CAP,
                                                 replace=False)
    return arr
def irval_to_probpos(irval, frac_neg):
    #n(x):= pdf of null dist (negatives)
    #p(x):= pdf of positive distribution
    #f_p:= fraction of positives
    #f_n:= fraction of negatives = 1-f_p
    #o(x):= pdf of observed distribution = n(x)f_n + p(x)f_p
    #The isotonic regression produces a(x) = o(x)/[o(x) + n(x)]
    # o(x)/[o(x) + n(x)] = [n(x)f_n + p(x)f_p]/[n(x)(1+f_n) + p(x)f_p]
    # a(x)[n(x)(1+f_n) + p(x)f_p] = n(x)f_n + p(x)f_p
    # a(x)n(x)(1+f_n) - n(x)f_n = p(x)f_p - a(x)p(x)f_p
    # n(x)[a(x)(1+f_n) - f_n] = p(x)f_p[1 - a(x)]
    # [a(x)/f_n + (a(x)-1)]/[1-a(x)] = (p(x)f_p)/(n(x)f_n) = r(x)
    #p_pos = 1 / (1 + 1/r(x))
    # = [a(x)/f_n + (a(x)-1)]/[a(x)/f_n + (a(x)-1) + (1-a(x))]
    # = [a(x)/f_n + a(x)-1]/[a(x)/f_n]
    # = [a(x) + f_n(a(x)-1)]/a(x)
    # = 1 + f_n(a(x)-1)/a(x)
    # = 1 + f_n(1 - 1/a(x))
    #If solving for p_pos=0, we have f_n = -1/(1 - 1/a(x))
    #As f_n --> 100%, p_pos --> 2 - 1/a(x); this assumes max(a(x)) = 0.5
    return np.minimum(np.maximum(1 + frac_neg*(
        1 - (1/np.maximum(irval,1e-7))), 0.0), 1.0)
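A standalone numeric check of the closed form derived above (the formula is re-stated here so the snippet runs on its own):

```python
import numpy as np

def probpos(irval, frac_neg):
    # Same formula as irval_to_probpos above: p_pos = 1 + f_n*(1 - 1/a(x)),
    # clipped to [0, 1].
    return np.minimum(np.maximum(
        1 + frac_neg*(1 - (1/np.maximum(irval, 1e-7))), 0.0), 1.0)

# In the all-null limit (frac_neg=1), p_pos = 2 - 1/a(x):
vals = probpos(np.array([0.5, 0.75, 1.0]), frac_neg=1.0)
print(vals)  # approximately [0, 2/3, 1]
```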
class SavableIsotonicRegression(object):

    def __init__(self, origvals, nullvals, increasing, min_frac_neg=0.95):
        self.origvals = origvals
        self.nullvals = nullvals
        self.increasing = increasing
        self.min_frac_neg = min_frac_neg
        self.ir = IsotonicRegression(out_of_bounds='clip',
            increasing=increasing).fit(
                X=np.concatenate([self.origvals, self.nullvals], axis=0),
                y=([1.0 for x in self.origvals]
                   + [0.0 for x in self.nullvals]),
                sample_weight=([1.0 for x in self.origvals]
                               + [float(len(self.origvals))/len(self.nullvals)
                                  for x in self.nullvals]))
        #Infer frac_neg based on the minimum value of the ir probs
        #See derivation in irval_to_probpos function
        min_prec_x = self.ir.X_min_ if self.increasing else self.ir.X_max_
        min_precision = self.ir.transform([min_prec_x])[0]
        implied_frac_neg = -1/(1 - (1/max(min_precision, 1e-7)))
        print("For increasing =", increasing, ", the minimum IR precision was",
              min_precision, "occurring at", min_prec_x,
              "implying a frac_neg of", implied_frac_neg)
        if (implied_frac_neg > 1.0 or implied_frac_neg < self.min_frac_neg):
            implied_frac_neg = max(min(1.0, implied_frac_neg),
                                   self.min_frac_neg)
            print("To be conservative, adjusted frac neg is", implied_frac_neg)
        self.implied_frac_neg = implied_frac_neg

    def transform(self, vals):
        return irval_to_probpos(self.ir.transform(vals),
                                frac_neg=self.implied_frac_neg)

    def save_hdf5(self, grp):
        grp.attrs['increasing'] = self.increasing
        grp.attrs['min_frac_neg'] = self.min_frac_neg
        grp.create_dataset('origvals', data=self.origvals)
        grp.create_dataset('nullvals', data=self.nullvals)

    @classmethod
    def from_hdf5(cls, grp):
        increasing = grp.attrs['increasing']
        min_frac_neg = grp.attrs['min_frac_neg']
        origvals = np.array(grp['origvals'])
        nullvals = np.array(grp['nullvals'])
        return cls(origvals=origvals, nullvals=nullvals,
                   increasing=increasing, min_frac_neg=min_frac_neg)
def get_isotonic_regression_classifier(orig_vals, null_vals):
    orig_vals = subsample_if_large(orig_vals)
    null_vals = subsample_if_large(null_vals)
    pos_orig_vals = np.array(sorted([x for x in orig_vals if x >= 0]))
    neg_orig_vals = np.array(sorted([x for x in orig_vals if x < 0],
                                    key=lambda x: abs(x)))
    pos_null_vals = [x for x in null_vals if x >= 0]
    neg_null_vals = [x for x in null_vals if x < 0]
    pos_ir = SavableIsotonicRegression(origvals=pos_orig_vals,
                nullvals=pos_null_vals, increasing=True)
    if (len(neg_orig_vals) > 0):
        neg_ir = SavableIsotonicRegression(origvals=neg_orig_vals,
                    nullvals=neg_null_vals, increasing=False)
    else:
        neg_ir = None
    return pos_ir, neg_ir, orig_vals, null_vals
#sliding in this case would be a list of values
class VariableWindowAroundChunks(AbstractCoordProducer):
    count = 0

    def __init__(self, sliding, flank, suppress, target_fdr,
                 min_passing_windows_frac, max_passing_windows_frac,
                 separate_pos_neg_thresholds,
                 max_seqlets_total,
                 progress_update=5000,
                 verbose=True, plot_save_dir="figures"):
        self.sliding = sliding
        self.flank = flank
        self.suppress = suppress
        self.target_fdr = target_fdr
        assert max_passing_windows_frac >= min_passing_windows_frac
        self.min_passing_windows_frac = min_passing_windows_frac
        self.max_passing_windows_frac = max_passing_windows_frac
        self.separate_pos_neg_thresholds = separate_pos_neg_thresholds
        self.max_seqlets_total = max_seqlets_total
        self.progress_update = progress_update
        self.verbose = verbose
        self.plot_save_dir = plot_save_dir
    @classmethod
    def from_hdf5(cls, grp):
        sliding = np.array(grp["sliding"]).astype("int")
        flank = grp.attrs["flank"]
        suppress = grp.attrs["suppress"]
        target_fdr = grp.attrs["target_fdr"]
        min_passing_windows_frac = grp.attrs["min_passing_windows_frac"]
        max_passing_windows_frac = grp.attrs["max_passing_windows_frac"]
        separate_pos_neg_thresholds = grp.attrs["separate_pos_neg_thresholds"]
        if ("max_seqlets_total" in grp.attrs):
            max_seqlets_total = grp.attrs["max_seqlets_total"]
        else:
            max_seqlets_total = None
        progress_update = grp.attrs["progress_update"]
        verbose = grp.attrs["verbose"]
        return cls(sliding=sliding, flank=flank, suppress=suppress,
                   target_fdr=target_fdr,
                   min_passing_windows_frac=min_passing_windows_frac,
                   max_passing_windows_frac=max_passing_windows_frac,
                   separate_pos_neg_thresholds=separate_pos_neg_thresholds,
                   max_seqlets_total=max_seqlets_total,
                   progress_update=progress_update, verbose=verbose)
    def save_hdf5(self, grp):
        grp.attrs["class"] = type(self).__name__
        grp.create_dataset("sliding", data=np.array(self.sliding))
        grp.attrs["flank"] = self.flank
        grp.attrs["suppress"] = self.suppress
        grp.attrs["target_fdr"] = self.target_fdr
        grp.attrs["min_passing_windows_frac"] = self.min_passing_windows_frac
        grp.attrs["max_passing_windows_frac"] = self.max_passing_windows_frac
        grp.attrs["separate_pos_neg_thresholds"] =\
            self.separate_pos_neg_thresholds
        if (self.max_seqlets_total is not None):
            grp.attrs["max_seqlets_total"] = self.max_seqlets_total
        grp.attrs["progress_update"] = self.progress_update
        grp.attrs["verbose"] = self.verbose
    def fit_pos_and_neg_irs(self, score_track, null_track):
        pos_irs = []
        neg_irs = []
        for sliding_window_size in self.sliding:
            window_sum_function = get_simple_window_sum_function(
                sliding_window_size)
            print("Fitting on window size", sliding_window_size)
            if (hasattr(null_track, '__call__')):
                null_vals = null_track(
                    score_track=score_track,
                    windowsize=sliding_window_size,
                    original_summed_score_track=None)
            else:
                null_summed_score_track = window_sum_function(arrs=null_track)
                null_vals = np.concatenate(null_summed_score_track, axis=0)
            print("Computing window sums")
            sys.stdout.flush()
            window_sums_rows = window_sum_function(arrs=score_track)
            print("Done computing window sums")
            sys.stdout.flush()
            pos_ir, neg_ir, subsampled_orig_vals, subsampled_null_vals =\
                get_isotonic_regression_classifier(
                    orig_vals=np.concatenate(window_sums_rows, axis=0),
                    null_vals=null_vals)
            make_nulldist_figure(orig_vals=subsampled_orig_vals,
                                 null_vals=subsampled_null_vals,
                                 pos_ir=pos_ir, neg_ir=neg_ir,
                                 pos_threshold=None,
                                 neg_threshold=None)
            util.show_or_savefig(plot_save_dir=self.plot_save_dir,
                                 filename="scoredist_window"
                                 +str(sliding_window_size)+"_"
                                 +str(VariableWindowAroundChunks.count)+".png")
            pos_irs.append(pos_ir)
            neg_irs.append(neg_ir)
        return pos_irs, neg_irs
    def __call__(self, score_track, null_track, tnt_results=None):
        if (tnt_results is None):
            pos_irs, neg_irs = self.fit_pos_and_neg_irs(
                score_track=score_track,
                null_track=null_track)
            precision_transformer = PrecisionValTransformer(
                sliding_window_sizes=self.sliding,
                pos_irs=pos_irs,
                neg_irs=neg_irs)
            (precisiontransformed_score_track,
             precisiontransformed_bestwindowsizeidxs) =\
                precision_transformer.transform_score_track(
                    score_track=score_track)
            subsampled_prec_vals = subsample_if_large(
                np.concatenate(precisiontransformed_score_track, axis=0))
            from matplotlib import pyplot as plt
            plt.plot(sorted(subsampled_prec_vals),
                     (np.arange(len(subsampled_prec_vals))/
                      len(subsampled_prec_vals)))
            plt.xlabel("Transformed IR precision value")
            plt.ylabel("CDF")
            util.show_or_savefig(plot_save_dir=self.plot_save_dir,
                                 filename="final_prec_vals_cdf_dist"
                                 +str(VariableWindowAroundChunks.count)+".png")
            #Pick a threshold according to the precisiontransformed score track
            pos_threshold = (1-self.target_fdr)
            neg_threshold = -(1-self.target_fdr)
            pos_threshold, neg_threshold =\
                refine_thresholds_based_on_frac_passing(
                    vals=subsampled_prec_vals,
                    pos_threshold=pos_threshold,
                    neg_threshold=neg_threshold,
                    min_passing_windows_frac=self.min_passing_windows_frac,
                    max_passing_windows_frac=self.max_passing_windows_frac,
                    separate_pos_neg_thresholds=
                        self.separate_pos_neg_thresholds,
                    verbose=self.verbose)
            tnt_results = BasicTransformAndThresholdResults(
                transformed_neg_threshold=neg_threshold,
                transformed_pos_threshold=pos_threshold,
                val_transformer=precision_transformer)
        else:
            precision_transformer = tnt_results.val_transformer
            (precisiontransformed_score_track,
             precisiontransformed_bestwindowsizeidxs) =\
                precision_transformer.transform_score_track(
                    score_track=score_track)
        #Need to remove padding because identify_coords is assumed to
        # operate on a scoretrack that has already been processed with
        # a sliding window of window_size (and assumes that partial windows
        # were not included)
        left_padding_to_remove = int((max(self.sliding)-1)/2)
        right_padding_to_remove = (max(self.sliding)-1)-left_padding_to_remove
        coords = identify_coords(
            score_track=[x[left_padding_to_remove:-right_padding_to_remove]
                         for x in precisiontransformed_score_track],
            pos_threshold=tnt_results.transformed_pos_threshold,
            neg_threshold=tnt_results.transformed_neg_threshold,
            window_size=max(self.sliding),
            flank=self.flank,
            suppress=self.suppress,
            max_seqlets_total=self.max_seqlets_total,
            verbose=self.verbose,
            other_info_tracks={'best_window_idx':
                [x[left_padding_to_remove:-right_padding_to_remove] for x in
                 precisiontransformed_bestwindowsizeidxs]})
        VariableWindowAroundChunks.count += 1
        return CoordProducerResults(
            coords=coords,
            tnt_results=tnt_results)
#identify_coords is expecting something that has already been processed
# with sliding windows of size window_size
def identify_coords(score_track, pos_threshold, neg_threshold,
                    window_size, flank, suppress,
                    max_seqlets_total, verbose, other_info_tracks={}):
    for other_info_track in other_info_tracks.values():
        assert all([x.shape==y.shape for x,y
                    in zip(other_info_track,score_track)])
    #cp_score_track = 'copy' of the score track, which can be modified as
    # coordinates are identified
    cp_score_track = [np.array(x) for x in score_track]
    #if a position is less than the threshold, set it to -np.inf
    #Note that the threshold comparisons need to be >= and not just > for
    # cases where there are lots of ties at the high end (e.g. with an IR
    # transformation that gives a lot of values that have a precision of 1.0)
    cp_score_track = [
        np.array([np.abs(y) if (y >= pos_threshold
                                or y <= neg_threshold)
                  else -np.inf for y in x])
        for x in cp_score_track]
    coords = []
    for example_idx,single_score_track in enumerate(cp_score_track):
        #set the stuff near the flanks to -np.inf so that we
        # don't pick it up during argmax
        single_score_track[0:flank] = -np.inf
        single_score_track[len(single_score_track)-(flank):
                           len(single_score_track)] = -np.inf
        while True:
            argmax = np.argmax(single_score_track,axis=0)
            max_val = single_score_track[argmax]
            #bail if we have exhausted everything that passed the threshold
            # and was not suppressed
            if (max_val == -np.inf):
                break
            #need to be able to expand without going off the edge
            if ((argmax >= flank) and
                (argmax < (len(single_score_track)-flank))):
                coord = SeqletCoordsFWAP(
                    example_idx=example_idx,
                    start=argmax-flank,
                    end=argmax+window_size+flank,
                    score=score_track[example_idx][argmax],
                    other_info=dict([
                        (track_name, track[example_idx][argmax])
                        for (track_name, track)
                        in other_info_tracks.items()]))
                assert (coord.score >= pos_threshold
                        or coord.score <= neg_threshold)
                coords.append(coord)
            else:
                assert False,\
                    ("This shouldn't happen because I set stuff near the "
                     "border to -np.inf early on")
            #suppress the chunks within +- suppress
            left_supp_idx = int(max(np.floor(argmax+0.5-suppress),0))
            right_supp_idx = int(min(np.ceil(argmax+0.5+suppress),
                                     len(single_score_track)))
            single_score_track[left_supp_idx:right_supp_idx] = -np.inf
    if (verbose):
        print("Got "+str(len(coords))+" coords")
        sys.stdout.flush()
    if ((max_seqlets_total is not None) and
        len(coords) > max_seqlets_total):
        if (verbose):
            print("Limiting to top "+str(max_seqlets_total))
            sys.stdout.flush()
        coords = sorted(coords, key=lambda x: -np.abs(x.score))\
                 [:max_seqlets_total]
    return coords
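The greedy argmax-and-suppress loop in identify_coords can be illustrated on a toy 1-D track. This is a standalone sketch of just the peak-picking logic, not the function above (no flank handling, no coord objects):

```python
import numpy as np

def greedy_peaks(track, threshold, suppress):
    # Repeatedly take the argmax among positions above threshold,
    # then mask out +/- suppress positions around each picked peak.
    track = np.array(track, dtype=float)
    track[track < threshold] = -np.inf
    peaks = []
    while True:
        i = int(np.argmax(track))
        if track[i] == -np.inf:
            break  # everything passing the threshold has been suppressed
        peaks.append(i)
        track[max(i - suppress, 0):i + suppress + 1] = -np.inf
    return peaks

print(greedy_peaks([0, 5, 1, 0, 4, 0, 3], threshold=2, suppress=1))  # -> [1, 4, 6]
```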
def refine_thresholds_based_on_frac_passing(
        vals, pos_threshold, neg_threshold,
        min_passing_windows_frac, max_passing_windows_frac,
        separate_pos_neg_thresholds, verbose):
    frac_passing_windows = (
        sum(vals >= pos_threshold)
        + sum(vals <= neg_threshold))/float(len(vals))
    if (verbose):
        print("Thresholds from null dist were",
              neg_threshold, "and", pos_threshold,
              "with frac passing", frac_passing_windows)
    pos_vals = [x for x in vals if x >= 0]
    neg_vals = [x for x in vals if x < 0]
    #deal with the edge case of an empty list
    pos_vals = [0] if len(pos_vals)==0 else pos_vals
    neg_vals = [0] if len(neg_vals)==0 else neg_vals
    #adjust the thresholds if they fall outside the min/max
    # passing windows frac
    if (frac_passing_windows < min_passing_windows_frac):
        if (verbose):
            print("Passing windows frac was",
                  frac_passing_windows, ", which is below",
                  min_passing_windows_frac, "; adjusting")
        if (separate_pos_neg_thresholds):
            pos_threshold = np.percentile(
                a=pos_vals,
                q=100*(1-min_passing_windows_frac))
            neg_threshold = np.percentile(
                a=neg_vals,
                q=100*(min_passing_windows_frac))
        else:
            pos_threshold = np.percentile(
                a=np.abs(vals),
                q=100*(1-min_passing_windows_frac))
            neg_threshold = -pos_threshold
    if (frac_passing_windows > max_passing_windows_frac):
        if (verbose):
            print("Passing windows frac was",
                  frac_passing_windows, ", which is above",
                  max_passing_windows_frac, "; adjusting")
        if (separate_pos_neg_thresholds):
            pos_threshold = np.percentile(
                a=pos_vals,
                q=100*(1-max_passing_windows_frac))
            neg_threshold = np.percentile(
                a=neg_vals,
                q=100*(max_passing_windows_frac))
        else:
            pos_threshold = np.percentile(
                a=np.abs(vals),
                q=100*(1-max_passing_windows_frac))
            neg_threshold = -pos_threshold
    if (verbose):
        print("New thresholds are", pos_threshold, "and", neg_threshold)
    return pos_threshold, neg_threshold
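The percentile-based adjustment above can be sanity-checked in isolation. A standalone sketch with made-up scores, forcing at least 10% of windows to pass on the positive side:

```python
import numpy as np

vals = np.arange(100)  # toy scores 0..99
min_passing_windows_frac = 0.10
# Same expression as the non-separate branch above:
pos_threshold = np.percentile(a=np.abs(vals),
                              q=100*(1 - min_passing_windows_frac))
print(pos_threshold)  # -> 89.1, so exactly the top 10 values (10%) pass
```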
def make_nulldist_figure(orig_vals, null_vals, pos_ir, neg_ir,
                         pos_threshold, neg_threshold):
    from matplotlib import pyplot as plt
    fig, ax1 = plt.subplots()
    orig_vals = np.array(sorted(orig_vals))
    ax1.hist(orig_vals, bins=100, density=True, alpha=0.5)
    ax1.hist(null_vals, bins=100, density=True, alpha=0.5)
    ax1.set_ylabel("Probability density\n(blue=foreground, orange=null)")
    ax1.set_xlabel("Total importance in window")
    precisions = pos_ir.transform(orig_vals)
    if (neg_ir is not None):
        precisions = np.maximum(precisions, neg_ir.transform(orig_vals))
    ax2 = ax1.twinx()
    ax2.plot(orig_vals, precisions)
    if (pos_threshold is not None):
        ax2.plot([pos_threshold, pos_threshold], [0.0, 1.0], color="red")
    if (neg_threshold is not None):
        ax2.plot([neg_threshold, neg_threshold], [0.0, 1.0], color="red")
    ax2.set_ylabel("Estimated foreground precision")
    ax2.set_ylim(0.0, 1.02)
class FixedWindowAroundChunks(AbstractCoordProducer):
    count = 0

    def __init__(self, sliding,
                 flank,
                 suppress, #flanks to suppress
                 target_fdr,
                 min_passing_windows_frac,
                 max_passing_windows_frac,
                 separate_pos_neg_thresholds=False,
                 max_seqlets_total=None,
                 progress_update=5000,
                 verbose=True,
                 plot_save_dir="figures"):
        self.sliding = sliding
        self.flank = flank
        self.suppress = suppress
        self.target_fdr = target_fdr
        assert max_passing_windows_frac >= min_passing_windows_frac
        self.min_passing_windows_frac = min_passing_windows_frac
        self.max_passing_windows_frac = max_passing_windows_frac
        self.separate_pos_neg_thresholds = separate_pos_neg_thresholds
        self.max_seqlets_total = max_seqlets_total
        self.progress_update = progress_update
        self.verbose = verbose
        self.plot_save_dir = plot_save_dir

    @classmethod
    def from_hdf5(cls, grp):
        sliding = grp.attrs["sliding"]
        flank = grp.attrs["flank"]
        suppress = grp.attrs["suppress"]
        target_fdr = grp.attrs["target_fdr"]
        min_passing_windows_frac = grp.attrs["min_passing_windows_frac"]
        max_passing_windows_frac = grp.attrs["max_passing_windows_frac"]
        separate_pos_neg_thresholds = grp.attrs["separate_pos_neg_thresholds"]
        if ("max_seqlets_total" in grp.attrs):
            max_seqlets_total = grp.attrs["max_seqlets_total"]
        else:
            max_seqlets_total = None
        progress_update = grp.attrs["progress_update"]
        verbose = grp.attrs["verbose"]
        return cls(sliding=sliding, flank=flank, suppress=suppress,
                   target_fdr=target_fdr,
                   min_passing_windows_frac=min_passing_windows_frac,
                   max_passing_windows_frac=max_passing_windows_frac,
                   separate_pos_neg_thresholds=separate_pos_neg_thresholds,
                   max_seqlets_total=max_seqlets_total,
                   progress_update=progress_update, verbose=verbose)

    def save_hdf5(self, grp):
        grp.attrs["class"] = type(self).__name__
        grp.attrs["sliding"] = self.sliding
        grp.attrs["flank"] = self.flank
        grp.attrs["suppress"] = self.suppress
        grp.attrs["target_fdr"] = self.target_fdr
        grp.attrs["min_passing_windows_frac"] = self.min_passing_windows_frac
        grp.attrs["max_passing_windows_frac"] = self.max_passing_windows_frac
        grp.attrs["separate_pos_neg_thresholds"] =\
            self.separate_pos_neg_thresholds
        if (self.max_seqlets_total is not None):
            grp.attrs["max_seqlets_total"] = self.max_seqlets_total
        grp.attrs["progress_update"] = self.progress_update
        grp.attrs["verbose"] = self.verbose
    def __call__(self, score_track, null_track, tnt_results=None):
        # score_track can be a list of 1-D arrays
        assert all([len(x.shape)==1 for x in score_track])
        window_sum_function = get_simple_window_sum_function(self.sliding)
        if (self.verbose):
            print("Computing windowed sums on original")
            sys.stdout.flush()
        original_summed_score_track = window_sum_function(arrs=score_track)
        #Determine the window thresholds
        if (tnt_results is None):
            if (self.verbose):
                print("Generating null dist")
                sys.stdout.flush()
            null_vals = get_null_vals(
                null_track=null_track,
                score_track=score_track,
                window_size=self.sliding,
                original_summed_score_track=original_summed_score_track)
            if (self.verbose):
                print("Computing threshold")
                sys.stdout.flush()
            orig_vals = list(
                np.concatenate(original_summed_score_track, axis=0))
            #Note that the vals returned below may have been subsampled
            pos_ir, neg_ir, subsampled_orig_vals, subsampled_null_vals =\
                get_isotonic_regression_classifier(
                    orig_vals=orig_vals,
                    null_vals=null_vals)
            subsampled_pos_orig_vals = (
                np.array(sorted([x for x in subsampled_orig_vals
                                 if x >= 0])))
            subsampled_neg_orig_vals = (
                np.array(sorted([x for x in subsampled_orig_vals if x < 0],
                                key=lambda x: abs(x))))
            subsampled_pos_val_precisions =\
                pos_ir.transform(subsampled_pos_orig_vals)
            if (len(subsampled_neg_orig_vals) > 0):
                subsampled_neg_val_precisions =\
                    neg_ir.transform(subsampled_neg_orig_vals)
            pos_threshold = ([x[1] for x in
                              zip(subsampled_pos_val_precisions,
                                  subsampled_pos_orig_vals) if x[0]
                              >= (1-self.target_fdr)]
                             +[subsampled_pos_orig_vals[-1]])[0]
            if (len(subsampled_neg_orig_vals) > 0):
                neg_threshold = ([x[1] for x in
                                  zip(subsampled_neg_val_precisions,
                                      subsampled_neg_orig_vals) if x[0]
                                  >= (1-self.target_fdr)]
                                 +[subsampled_neg_orig_vals[-1]])[0]
            else:
                neg_threshold = -np.inf
            pos_threshold, neg_threshold =\
                refine_thresholds_based_on_frac_passing(
                    vals=subsampled_orig_vals,
                    pos_threshold=pos_threshold,
                    neg_threshold=neg_threshold,
                    min_passing_windows_frac=self.min_passing_windows_frac,
                    max_passing_windows_frac=self.max_passing_windows_frac,
                    separate_pos_neg_thresholds=
                        self.separate_pos_neg_thresholds,
                    verbose=self.verbose)
            if (self.separate_pos_neg_thresholds):
                val_transformer = SignedPercentileValTransformer(
                    distribution=orig_vals)
            else:
                val_transformer = AbsPercentileValTransformer(
                    distribution=orig_vals)
            if (self.verbose):
                print("Final raw thresholds are",
                      neg_threshold, "and", pos_threshold)
                print("Final transformed thresholds are",
                      val_transformer(neg_threshold), "and",
                      val_transformer(pos_threshold))
            make_nulldist_figure(orig_vals=subsampled_orig_vals,
                                 null_vals=subsampled_null_vals,
                                 pos_ir=pos_ir, neg_ir=neg_ir,
                                 pos_threshold=pos_threshold,
                                 neg_threshold=neg_threshold)
            util.show_or_savefig(plot_save_dir=self.plot_save_dir,
                                 filename="scoredist_"
                                 +str(FixedWindowAroundChunks.count)+".png")
            FixedWindowAroundChunks.count += 1
            tnt_results = FWACTransformAndThresholdResults(
                neg_threshold=neg_threshold,
                transformed_neg_threshold=val_transformer(neg_threshold),
                pos_threshold=pos_threshold,
                transformed_pos_threshold=val_transformer(pos_threshold),
                val_transformer=val_transformer)
        coords = identify_coords(
            score_track=original_summed_score_track,
            pos_threshold=tnt_results.pos_threshold,
            neg_threshold=tnt_results.neg_threshold,
            window_size=self.sliding,
            flank=self.flank,
            suppress=self.suppress,
            max_seqlets_total=self.max_seqlets_total,
            verbose=self.verbose)
        return CoordProducerResults(
            coords=coords,
            tnt_results=tnt_results)
#
# Copyright (c) 2019 One Identity
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to
# deal in the Software without restriction, including without limitation the
# rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
# sell copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
# IN THE SOFTWARE.
#
import pytest
from textwrap import dedent
from ..plugin import Plugin
from safeguard.sessions.plugin_impl.test_utils.plugin import assert_plugin_hook_result
def test_cyberark_integration_getting_password(cy_config, cy_account, cy_asset, cy_account_password, connection_parameters):
    plugin = Plugin(cy_config)
    result = plugin.get_password_list(
        **connection_parameters(server_uname=cy_account, server_ip=cy_asset)
    )
    assert_plugin_hook_result(result, {"passwords": [cy_account_password]})


def test_cyberark_integration_getting_password_for_wrong_user(cy_config, cy_wrong_account, cy_asset, connection_parameters):
    plugin = Plugin(cy_config)
    result = plugin.get_password_list(
        **connection_parameters(server_uname=cy_wrong_account, server_ip=cy_asset)
    )
    assert_plugin_hook_result(result, {"passwords": []})


def test_cyberark_integration_getting_private_key(cy_config, cy_account_with_key, cy_asset, cy_account_private_key, connection_parameters):
    plugin = Plugin(cy_config)
    result = plugin.get_private_key_list(
        **connection_parameters(server_uname=cy_account_with_key, server_ip=cy_asset)
    )
    assert_plugin_hook_result(result, {"private_keys": [("ssh-rsa", cy_account_private_key)]})


def test_cyberark_integration_getting_private_key_for_wrong_account(cy_config, cy_wrong_account, cy_asset, connection_parameters):
    plugin = Plugin(cy_config)
    result = plugin.get_private_key_list(
        **connection_parameters(server_uname=cy_wrong_account, server_ip=cy_asset)
    )
    assert_plugin_hook_result(result, {"private_keys": []})


def test_v10_user_logon(cy_config, cy_account, cy_asset, cy_account_password, connection_parameters):
    config = cy_config + "\nauthentication_method=cyberark"
    plugin = Plugin(config)
    result = plugin.get_password_list(
        **connection_parameters(server_uname=cy_account, server_ip=cy_asset)
    )
    assert_plugin_hook_result(result, {"passwords": [cy_account_password]})


@pytest.mark.skip(reason="I don't know how this was tested before, cannot see settings on our CArk")
def test_v10_ldap_logon(
    cy_address,
    cy_ldap_username,
    cy_ldap_password,
    cy_account,
    cy_asset,
    cy_account_password,
    connection_parameters
):
    config = dedent(
        """
        [cyberark]
        address={}
        use_credential=explicit
        username={}
        password={}
        authentication_method=ldap
        """.format(
            cy_address, cy_ldap_username, cy_ldap_password
        )
    )
    plugin = Plugin(config)
    result = plugin.get_password_list(
        **connection_parameters(server_uname=cy_account, server_ip=cy_asset)
    )
    assert_plugin_hook_result(result, {"passwords": [cy_account_password]})
def firstDuplicate(a):
    number_frequencies, number_indices, duplicate_index = {}, {}, {}
    # Iterate through the list and increment the frequency count
    # if the number is not yet in the dict. Also note the index
    # associated with the value.
    for i in range(len(a)):
        if a[i] not in number_frequencies:
            number_frequencies[a[i]] = 1
            number_indices[a[i]] = i
        elif a[i] in number_frequencies:
            if number_frequencies[a[i]] < 2:
                number_frequencies[a[i]] += 1
                number_indices[a[i]] = i
    for number in number_frequencies:
        if number_frequencies[number] == 2:
            duplicate_index[number] = number_indices[number]
    if not duplicate_index:
        return -1
    else:
        minimal_index_key = min(duplicate_index, key=duplicate_index.get)
        return minimal_index_key
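The same answer — the value whose second occurrence appears earliest, or -1 — falls out of a single pass with a set, which makes a handy cross-check (a standalone sketch, not part of the original file):

```python
def first_duplicate_set(a):
    # Scan left to right; the first value already seen is the one whose
    # second occurrence has the smallest index.
    seen = set()
    for x in a:
        if x in seen:
            return x
        seen.add(x)
    return -1

print(first_duplicate_set([2, 1, 3, 5, 3, 2]))  # -> 3 (3 repeats at index 4, before 2 at index 5)
print(first_duplicate_set([1, 2, 3]))           # -> -1 (no duplicates)
```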
#!/usr/bin/env python
# dots.py
# @TODO:
# - fix the diffing
# - use rsync across hosts or something fancy
import argparse, difflib, functools, re, shutil, subprocess, sys, time, os
from pprint import pprint
__description__ = "Manage your dotfiles."
ls = lambda path: os.listdir(path)
ls_abs = lambda path: [os.path.join(path, x) for x in os.listdir(path)]
ln = lambda src, dst: os.symlink(src, dst)
unlink = lambda src: os.unlink(src)
def rm(path):
    try:
        if os.path.isdir(path):
            os.rmdir(path)
        else:
            os.remove(path)
    except OSError as e:
        print(e)
def diff(fromfile, tofile):
    if os.path.exists(tofile):
        fromdate = time.ctime(os.stat(fromfile).st_mtime)
        todate = time.ctime(os.stat(tofile).st_mtime)
        fromlines = open(fromfile, 'U').readlines()
        tolines = open(tofile, 'U').readlines()
        diff = difflib.unified_diff(fromlines, tolines,
                                    fromfile, tofile, fromdate, todate)
        return diff
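The diff() helper is a thin wrapper over difflib.unified_diff; a self-contained illustration of the same call, using temporary files with made-up contents:

```python
import difflib, os, shutil, tempfile, time

d = tempfile.mkdtemp()
try:
    a, b = os.path.join(d, "a.txt"), os.path.join(d, "b.txt")
    with open(a, "w") as f:
        f.write("alias ls='ls --color'\n")
    with open(b, "w") as f:
        f.write("alias ls='ls -G'\n")
    # Same argument order as diff() above: lines, names, then mtimes.
    lines = list(difflib.unified_diff(
        open(a).readlines(), open(b).readlines(), a, b,
        time.ctime(os.stat(a).st_mtime), time.ctime(os.stat(b).st_mtime)))
finally:
    shutil.rmtree(d)
print(lines[2])  # -> "@@ -1 +1 @@" (the hunk header)
```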
def parse_args(argv):
    DEFAULTS = {
        'source_dir': os.path.join(os.getenv("HOME"), "dotfiles"),
        'dest_dir': os.path.join(os.getenv("HOME"))
    }
    ap = argparse.ArgumentParser(prog='dots.py',
                                 description=__description__)
    ap.add_argument("-d", "--dest-dir", default=DEFAULTS['dest_dir'],
                    help="Directory to process source files to.")
    ap.add_argument("-e", "--exclude", help="Regex of files to exclude")
    ap.add_argument("-f", "--force", help="Force the operation to continue.")
    ap.add_argument("-i", "--interactive", default=False, action='store_true',
                    help="Run interactively.")
    ap.add_argument("-l", "--list-commands", default=False, action='store_true',
                    help="List the possible commands.")
    ap.add_argument("-n", "--dry-run", default=False, action='store_true',
                    help="Dry run.")
    ap.add_argument("--no-dot",
                    help="Comma-separated list of files to not append a '.' to")
    ap.add_argument("-s", "--source-dir", default=DEFAULTS['source_dir'],
                    help="Directory to process source files from.")
    ap.add_argument("-v", "--version", default=False, action='store_true',
                    help="Print the version.")
    ap.add_argument("-V", "--verbose", default=False, action='store_true',
                    help="Verbose mode.")
    ap.add_argument("commands", nargs="*", help="Command to run.")
    args = ap.parse_args(argv)
    return args
def ask(msg):
    inp = raw_input(msg + " [Y]/n?").lower()
    while inp not in ('y', 'n'):
        inp = raw_input(msg + " [Y]/n?").lower()
    if inp == 'y':
        return True
    else:
        return False
#
# dotfiles command API
#
class Dotfiles(object):

    def __init__(self, opts):
        self.options = {
            "exclude": [
                r'^\.'
            ],
            "exclude_list": [
                'README.md',
                'LICENSE',
                'dots.py',
            ],
            "nodot_list": ['bin']
        }
        for key in opts.keys():
            if opts.get(key, None):
                self.options[key] = opts.get(key)
        # Look for the source and destination directories
        self.src = self.options.get('source_dir', None)
        self.dst = self.options.get('dest_dir', None)
        if not (os.path.isdir(self.src) and os.path.isdir(self.dst)):
            raise Exception("BAD PATH: <Source: %s> <Dest: %s>" %
                            (self.src, self.dst))
        # Process files to which not to add a '.'
        # (argparse stores --no-dot under the 'no_dot' key)
        _nodot = self.options.get('no_dot', None)
        if _nodot:
            if "," in _nodot:
                self.options['nodot_list'].extend(_nodot.split(','))
            else:
                self.options['nodot_list'].append(_nodot)
        # Process regexes for excluding files; a single pattern may arrive
        # as a plain string from argparse.
        _re_excludes = self.options.get('exclude', None)
        if _re_excludes:
            if isinstance(_re_excludes, basestring):
                _re_excludes = [_re_excludes]
            for _re_exclude in _re_excludes:
                _re_exclude = _re_exclude.decode('string_escape')
                re_exclude = re.compile(_re_exclude)
                self.options['exclude_list'].extend(
                    filter(lambda x: re_exclude.match(x), ls(self.src)))
        # Pre-process (cache) these files for quick access
        self.source_files = self._exclude(ls(self.src))()
        self.dest_files = ls(self.dst)
        self.verbose = self.options.get('verbose')
        self.interactive = self.options.get('interactive')
        self.dry_run = self.options.get('dry_run')
        if self.options.get('verbose', False):
            pprint(self.options)

    @property
    def commands(self):
        return filter(lambda method: method.startswith('cmd_'), dir(self))

    def _nodots(self, l):
        def map_func(x):
            return x.lstrip('.') if x in self.options['nodot_list'] else x
        return functools.partial(map, map_func, l)

    def _exclude(self, l):
        def filter_func(x):
            return x not in [os.path.basename(a) for a in self.options['exclude_list']]
        return functools.partial(filter, filter_func, l)

    def _execute(self, cmd, func):
        if self.dry_run:
            self.verbose = True
            func = None  # a dry run reports the command without executing it
        if self.verbose:
            print("# Execute: %s" % cmd)
        if func:
            func()

    def run(self, command):
        cmd = 'cmd_' + command
        if hasattr(self, cmd):
            func = getattr(self, cmd)
            if callable(func):
                try:
                    self._execute(cmd, func)
                except Exception as e:
                    print(e)
    #
    # Commands API
    #
    def cmd_init(self):
        """ Task to initialize dotfiles in your $HOME for the first time. """
        print(">> Initing ...")
        commands = ['update', 'diff', 'link']
        for cmd in commands:
            self.run(cmd)

    def cmd_diff(self):
        """ Show the differences between $DOTFILES and $HOME. """
        print(">> Diffing ...")
        for from_file in self.source_files:
            fromfile = os.path.join(self.options.get('source_dir'), from_file)
            if not os.path.isdir(fromfile):
                tofile = os.path.join(self.options.get('dest_dir'), "." + from_file)
                sys.stdout.writelines(diff(fromfile, tofile))

    def cmd_link(self):
        """ Link files in $DOTFILES to corresponding files in $HOME. """
        print(">> Linking ...")
        for from_file in self.source_files:
            fromfile = os.path.join(self.options.get('source_dir'), from_file)
            tofile = os.path.join(self.options.get('dest_dir'), "." + from_file)
            nodeExists = os.path.lexists(tofile)
            if nodeExists:
                print("\nFile %s exists already!" % tofile)
                if self.options.get('force', False):
                    if self.interactive:
                        if ask("Link %s->%s" % (fromfile, tofile)):
                            if self.verbose:
                                print("rm(%s)" % (tofile))
                                print("ln(%s, %s)" % (fromfile, tofile))
                            # Remove whatever is in the way, then (re)create
                            # the link.
                            if os.path.islink(tofile):
                                unlink(tofile)
                            elif os.path.isdir(tofile):
                                shutil.rmtree(tofile)
                            else:
                                rm(tofile)
                            ln(fromfile, tofile)
                    else:
                        if self.verbose:
                            print("rm(%s)" % (tofile))
                            print("ln(%s, %s)" % (fromfile, tofile))
                        rm(tofile)
                        ln(fromfile, tofile)
            else:
                if self.interactive:
                    if ask("Link %s->%s" % (fromfile, tofile)):
                        if self.verbose:
                            print("ln(%s, %s)" % (fromfile, tofile))
                        ln(fromfile, tofile)
                else:
                    if self.verbose:
                        print("ln(%s, %s)" % (fromfile, tofile))
                    ln(fromfile, tofile)

    def cmd_clean(self):
        """ Clean the dotfiles in $HOME. """
        print(">> Cleaning ...")
        for from_file in self.source_files:
            tofile = os.path.join(self.options.get('dest_dir'), "." + from_file)
            if os.path.lexists(tofile):
                if self.verbose:
                    print("rm(%s)" % tofile)
                rm(tofile)

    def cmd_update(self):
        """ Update dotfiles and dependencies in $HOME with latest
        in the repo(s). """
        print(">> Updating ...")
        cmd = "cd %s; git pull" % self.options.get('source_dir')
        output = subprocess.check_output(
            cmd,
            stderr=subprocess.STDOUT,
            shell=True)
        print(output)

    def cmd_status(self):
        """ Status of $DOTFILES. """
        print(">> Status: ")
        cmd = "cd %s; git status" % self.options.get('source_dir')
        output = subprocess.check_output(
            cmd,
            stderr=subprocess.STDOUT,
            shell=True)
        print(output)


def run(commands, opts):
    df = Dotfiles(opts)
    if not opts['commands'] or opts['list_commands']:
        for cmd in df.commands:
            print(cmd.split("_")[1] + ":")
            docstring = getattr(getattr(df, cmd), '__doc__')
            print(docstring)
    else:
        for command in commands:
            df.run(command)


#
# main
#
def main(argv=None):
    args = parse_args(argv)
    run(args.commands, vars(args))
    return 0  # success


if __name__ == '__main__':
    status = main()
    sys.exit(status)
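The naming rule the script applies when linking (prepend a `.` unless the entry is on the no-dot list, e.g. `bin`) can be sketched as a standalone helper; `dest_path` and its defaults are illustrative, not part of the script itself:

```python
import os


def dest_path(dest_dir, name, nodot_list=('bin',)):
    """Map a source file name to its destination path, dotting it
    unless it appears on the no-dot list."""
    prefix = '' if name in nodot_list else '.'
    return os.path.join(dest_dir, prefix + name)
```

So `bashrc` in the source directory would be linked to `~/.bashrc`, while `bin` keeps its name.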

# --- plot_log_population.py (catskillsresearch/openasr20, Apache-2.0) ---
import matplotlib.pylab as plt

def plot_log_population(population, _title, _xlabel, _ylabel, _bins):
    plt.hist(population, bins=_bins)
    plt.xlabel(_xlabel)
    plt.ylabel(_ylabel)
    plt.title(_title)
    plt.yscale('log')
    plt.show()
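A minimal usage sketch with synthetic data; the log y-scale only changes the rendering, so the bin counts underneath are exactly what `numpy.histogram` reports:

```python
import numpy as np

# Synthetic, heavily skewed population (made up for the example).
population = [1, 1, 2, 2, 2, 3, 10]

# Same binning matplotlib's hist() would use for bins=3.
counts, edges = np.histogram(population, bins=3)
```

Calling `plot_log_population(population, 'sizes', 'value', 'count', 3)` would draw these counts with a logarithmic y-axis.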

# --- src/stargazer/stargazer.py (magazino/stargazer, MIT) ---
"""
Driver class for Hagisonic Stargazer, with no ROS dependencies.
"""
from serial import Serial
from collections import deque
import re
import yaml
import time
import logging
import rospy
import numpy as np
from threading import Thread, Event
from tf import transformations
# STX: char that represents the start of a properly formed message
STX = '~'
# ETX: char that represents the end of a properly formed message
ETX = '`'
# DELIM: char that splits data
DELIM = '|'
# CMD: char that indicates command
CMD = '#'
# RESPONSE: char that indicates a response to a command
RESPONSE = '!'
# RESULT: char that indicates that the message contains result data
RESULT = '^'
# NOTIFY: char that indicates a notification message of some kind
NOTIFY = '*'
class StarGazer(object):

    def __init__(self, device, marker_map, callback_global=None,
                 callback_local=None, callback_raw=None,
                 callback_raw_reponse=None):
        """
        Connect to a Hagisonic StarGazer device and receive poses.

        device: The device location for the serial connection.

        marker_map: dictionary of marker transforms, formatted:
                    {marker_id: (4,4) matrix}

        callback_global: will be called whenever a new pose is received from
                         the Stargazer, with an (n,4,4) matrix of poses of the
                         location of the Stargazer in the global frame.
                         These are computed from marker_map.

        callback_local: will be called whenever a new pose is received from
                        the Stargazer, with a dict: {marker_id: [xyz, angle]}
        """
        self.device = device
        self.marker_map = marker_map
        self.connection = None
        # chunk_size: how many characters to read from the serial bus in
        # between checking the buffer for the STX/ETX characters
        self._chunk_size = 80
        self._callback_global = callback_global
        self._callback_local = callback_local
        self._callback_raw = callback_raw
        self._callback_raw_reponse = callback_raw_reponse
        self._stopped = Event()
        self._thread = None

    def __enter__(self):
        self.connect()
        return self

    def __exit__(self, type, value, traceback):
        if self.is_connected:
            self.disconnect()

    @property
    def is_connected(self):
        """
        Returns whether the driver is currently connected to a serial port.
        """
        return self.connection is not None

    def connect(self):
        """
        Connect to the StarGazer over the specified RS-232 port.
        """
        if self.is_connected:
            self.disconnect()
        self.connection = Serial(port=self.device, baudrate=115200, timeout=1.0)
        if self.connection is None:
            return False
        return True

    def disconnect(self):
        """
        Disconnects from the StarGazer and closes the RS-232 port.
        """
        if self.is_connected:
            self.connection.close()
            self.connection = None
        if self.connection is None:
            return True
        return False

    @property
    def is_streaming(self):
        """
        Returns whether the driver is currently streaming pose data.
        """
        return self._thread is not None

    def start_streaming(self):
        """
        Begin streaming pose data from the StarGazer.
        """
        assert self.is_connected and not self.is_streaming
        success = self._send_command('CalcStart')
        if success:
            # Keep the Thread object itself; Thread.start() returns None,
            # so assigning its result would leave is_streaming False.
            self._thread = Thread(target=self._read, args=())
            self._thread.start()
        return success

    def stop_streaming(self):
        """
        Stop streaming pose data from the StarGazer.
        """
        assert self.is_connected
        if self.is_streaming:
            self._stopped.set()
            self._thread.join()
            self._thread = None
        success = self._send_command('CalcStop')
        return success

    def reset_parameters(self):
        """
        Reset the StarGazer configuration to its defaults.
        """
        assert self.is_connected and not self.is_streaming
        success = self._send_command('Reset')
        return success

    def set_parameter(self, name, value):
        """
        Set a StarGazer configuration parameter.

        This function can only be called while the StarGazer is
        connected, but not streaming.

        Arguments
        ---------
        name:  string name of the parameter to set
        value: string value of the parameter to set

        Example
        -------
        set_parameter('MarkType', 'HLD1L')
        """
        assert self.is_connected and not self.is_streaming
        success = self._send_command(name, value)
        return success

    def get_parameter(self, name):
        pass

    def _send_command(self, *args):
        """
        Send a command to the StarGazer.

        Arguments
        ---------
        command: string, or list. If string of single command, send just that.
                 If list, reformat to add the delimiter character.

        Example
        -------
        _send_command('CalcStop')
        _send_command('MarkType', 'HLD1L')
        """
        success = True
        delimited = DELIM.join(str(i) for i in args)
        if 'SetEnd' in delimited:
            delimited = 'SetEnd'
        command_str = STX + CMD + delimited + ETX
        rospy.loginfo('Sending command to StarGazer: %s', command_str)
        # The StarGazer requires a 50 ms delay between each byte.
        for ch in command_str:
            self.connection.write(ch)
            time.sleep(0.05)
        response_expected = STX + RESPONSE + delimited + ETX
        success = self._read_response(response_expected)
        if success and ('SetEnd' in response_expected):
            response_expected = STX + RESPONSE + 'ParameterUpdate' + ETX
            time.sleep(1.0)
            success = self._read_response(response_expected)
            if success:
                rospy.loginfo('Parameters update successful')
        return success

    def _read_response(self, response_expected):
        success = True
        try:
            response_actual = self.connection.read(len(response_expected))
        except Exception as e:
            rospy.logwarn(str(e))
            success = False
            return success
        # Scan for more incoming characters until we get a read timeout.
        # (This is useful if there is still some incoming data from previous
        # commands in intermediate serial buffers.)
        while response_actual[-len(response_expected):] != response_expected:
            c = None
            try:
                c = self.connection.read()
            except Exception as e:
                rospy.logwarn(str(e))
                success = False
                return success
            if c:
                # Add new characters to the response string.
                response_actual += c
            else:
                rospy.logwarn('Received invalid response "%s"; expected "%s"' %
                              (response_actual, response_expected))
                success = False
                break
        print(response_actual)
        if self._callback_raw_reponse:
            self._callback_raw_reponse(response_actual)
        return success

    def _read(self):
        """
        Read from the serial connection to the StarGazer, process buffer,
        then execute callbacks.
        """
        # Compute a regular expression that returns the last valid
        # message in a StarGazer stream.
        msg_pattern = ('.*' + STX + '(?P<type>.)(?P<payload>.+)' + ETX +
                       '(?P<remainder>.*)$')
        msg_matcher = re.compile(msg_pattern)
        # Compute a regular expression that converts a StarGazer message
        # into a list of tuples containing parsed groups.
        delimiter = '\\' + DELIM
        number = r'[\d\+\-\.]'
        tag_pattern = (r'(?P<id>\d+)' + delimiter +
                       r'(?P<yaw>' + number + '+)' + delimiter +
                       r'(?P<x>' + number + '+)' + delimiter +
                       r'(?P<y>' + number + '+)' + delimiter +
                       r'(?P<z>' + number + '+)')
        tag_matcher = re.compile(tag_pattern)

        def process_buffer(message_buffer):
            """
            Looks at current message_buffer string for STX and ETX chars.
            Proper behavior is to process the string found between STX/ETX
            for poses and remove everything in the buffer up to the last
            observed ETX.

            Valid readings:
                ~^148|-175.91|+98.74|+7.10|182.39`
                ~^248|-176.67|+98.38|+8.39|181.91|370|-178.41|-37.05|+8.97|179.51`
            No valid readings:
                ~*DeadZone`
            """
            # Look for a matching message, return the buffer if none are found.
            message = msg_matcher.match(message_buffer)
            if not message:
                return message_buffer
            if message.group('type') == RESULT:
                markers = tag_matcher.finditer(message.group('payload'))
                local_poses = {}
                raw_poses = []
                for marker in markers:
                    # Parse pose information for this marker.
                    _id = marker.group('id')
                    yaw = -np.radians(float(marker.group('yaw')))
                    x = 0.01 * float(marker.group('x'))
                    y = 0.01 * float(marker.group('y'))
                    # Note: this axis is negated.
                    z = 0.0  # -0.01 * float(marker.group('z'))
                    raw_pose = [_id, x, y, 0, -yaw]
                    raw_poses.append(raw_pose)
                    # Convert the pose to a transform and store it by ID.
                    marker_to_stargazer = fourdof_to_matrix((x, y, z), yaw)
                    local_poses[_id] = np.linalg.inv(marker_to_stargazer)
                if self._callback_raw:
                    self._callback_raw(raw_poses)
                if self._callback_local:
                    self._callback_local(local_poses)
                if self._callback_global:
                    global_poses, unknown_ids = local_to_global(self.marker_map,
                                                                local_poses)
                    self._callback_global(global_poses, unknown_ids)
            elif message.group('type') == NOTIFY:
                # TODO: Report deadzone messages in here!
                pass
            else:
                pass
            # Return the rest of the message buffer.
            return message.group('remainder')

        rospy.loginfo('Entering read loop.')
        message_buffer = ''
        while not self._stopped.is_set() and self.connection:
            try:
                message_buffer += self.connection.read(self._chunk_size)
                message_buffer = process_buffer(message_buffer)
            except Exception as e:
                rospy.logwarn('Error processing current buffer: %s (content: "%s")',
                              str(e), message_buffer)
                message_buffer = ''
                break  # For debugging purposes.
        rospy.loginfo('Exited read loop.')

    def close(self):
        self._stopped.set()
        self._send_command('CalcStop')
        self.connection.close()

def local_to_global(marker_map, local_poses):
    """
    Transform local marker coordinates to map coordinates.
    """
    global_poses = dict()
    unknown_ids = set()
    for _id, pose in local_poses.iteritems():
        if _id in marker_map:
            marker_to_map = marker_map[_id]
            local_to_marker = np.linalg.inv(pose)
            local_to_map = np.dot(marker_to_map, local_to_marker)
            global_poses[_id] = local_to_map
        else:
            unknown_ids.add(_id)
    return global_poses, unknown_ids


def fourdof_to_matrix(translation, yaw):
    """
    Convert from a Cartesian translation and yaw to a homogeneous transform.
    """
    T = transformations.rotation_matrix(yaw, [0, 0, 1])
    T[0:3, 3] = translation
    return T


def _callback_dummy(data):
    return


def _callback_print(data):
    print(data)
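The marker-to-map math above can be checked without ROS. This sketch re-derives `fourdof_to_matrix` with plain numpy (the `fourdof` helper, the toy marker map, and the coordinates are all made up for the example) and runs the same invert-and-compose chain as `local_to_global`:

```python
import numpy as np


def fourdof(translation, yaw):
    # Homogeneous transform: rotate about +z by yaw, then translate.
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0, translation[0]],
                     [s,  c, 0.0, translation[1]],
                     [0.0, 0.0, 1.0, translation[2]],
                     [0.0, 0.0, 0.0, 1.0]])


# Toy map: marker '148' sits at the map origin, unrotated.
marker_map = {'148': np.eye(4)}

# Pretend the sensor saw marker '148' one metre ahead, with no rotation.
marker_to_stargazer = fourdof((1.0, 0.0, 0.0), 0.0)
local_pose = np.linalg.inv(marker_to_stargazer)   # as stored in local_poses

# Same chain as local_to_global():
local_to_marker = np.linalg.inv(local_pose)
global_pose = np.dot(marker_map['148'], local_to_marker)
```

With the marker at the origin, the sensor's global pose is just the marker observation itself: one metre along x, identity rotation.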

# --- amaranth/vendor/xilinx_spartan_3_6.py (psumesh/nmigen, BSD-2-Clause) ---
import warnings
from .xilinx import XilinxPlatform
__all__ = ["XilinxSpartan3APlatform", "XilinxSpartan6Platform"]
XilinxSpartan3APlatform = XilinxPlatform
XilinxSpartan6Platform = XilinxPlatform
# TODO(amaranth-0.4): remove
warnings.warn("instead of amaranth.vendor.xilinx_spartan_3_6.XilinxSpartan3APlatform and "
              ".XilinxSpartan6Platform, use amaranth.vendor.xilinx.XilinxPlatform",
              DeprecationWarning, stacklevel=2)

# --- setup.py (mark-dawn/stytra, MIT) ---
from distutils.core import setup
from setuptools import find_packages

setup(
    name="stytra",
    version="0.1",
    author="Vilim Stih, Luigi Petrucco @portugueslab",
    author_email="vilim@neuro.mpg.de",
    license="MIT",
    packages=find_packages(),
    install_requires=[
        "pyqtgraph>=0.10.0",
        "numpy",
        "numba",
        "matplotlib",
        "pandas",
        "qdarkstyle",
        "qimage2ndarray",
        "deepdish",
        "param",
        "pims",
        "GitPython",
        "pymongo",
        "colorspacious",
        "arrayqueues",
    ],
    classifiers=[
        "Development Status :: 3 - Alpha",
        "Intended Audience :: Science/Research",
        # Pick your license as you wish (should match "license" above)
        "License :: OSI Approved :: MIT License",
        "Programming Language :: Python :: 3.5",
        "Programming Language :: Python :: 3.6",
    ],
    keywords="tracking processing",
    description="A modular package to control stimulation and track behaviour in zebrafish experiments.",
    project_urls={
        "Source": "https://github.com/portugueslab/stytra",
        "Tracker": "https://github.com/portugueslab/stytra/issues",
    },
)

# --- MainUi.py (james646-hs/Fgo_teamup, MIT) ---
# -*- coding: utf-8 -*-
# Form implementation generated from reading ui file 'MainUi.ui'
#
# Created by: PyQt5 UI code generator 5.13.0
#
# WARNING! All changes made in this file will be lost!
from PyQt5 import QtCore, QtGui, QtWidgets
class Ui_MainWindow(object):
    def setupUi(self, MainWindow):
        MainWindow.setObjectName("MainWindow")
        MainWindow.setEnabled(True)
        MainWindow.resize(1070, 837)
        sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Minimum)
        sizePolicy.setHorizontalStretch(0)
        sizePolicy.setVerticalStretch(0)
        sizePolicy.setHeightForWidth(MainWindow.sizePolicy().hasHeightForWidth())
        MainWindow.setSizePolicy(sizePolicy)
        MainWindow.setMinimumSize(QtCore.QSize(0, 0))
        MainWindow.setMaximumSize(QtCore.QSize(16777215, 16777215))
        self.centralwidget = QtWidgets.QWidget(MainWindow)
        self.centralwidget.setObjectName("centralwidget")
        self.verticalLayout_2 = QtWidgets.QVBoxLayout(self.centralwidget)
        self.verticalLayout_2.setContentsMargins(5, 10, 5, 5)
        self.verticalLayout_2.setSpacing(5)
        self.verticalLayout_2.setObjectName("verticalLayout_2")
        self.groupBox = QtWidgets.QGroupBox(self.centralwidget)
        self.groupBox.setTitle("")
        self.groupBox.setObjectName("groupBox")
        self.verticalLayout_3 = QtWidgets.QVBoxLayout(self.groupBox)
        self.verticalLayout_3.setContentsMargins(5, 5, 5, 5)
        self.verticalLayout_3.setSpacing(5)
        self.verticalLayout_3.setObjectName("verticalLayout_3")
        self.gridLayout_3 = QtWidgets.QGridLayout()
        self.gridLayout_3.setSizeConstraint(QtWidgets.QLayout.SetDefaultConstraint)
        self.gridLayout_3.setObjectName("gridLayout_3")
        self.label_costume_state_4 = QtWidgets.QLabel(self.groupBox)
        sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Preferred)
        sizePolicy.setHorizontalStretch(0)
        sizePolicy.setVerticalStretch(0)
        sizePolicy.setHeightForWidth(self.label_costume_state_4.sizePolicy().hasHeightForWidth())
        self.label_costume_state_4.setSizePolicy(sizePolicy)
        self.label_costume_state_4.setMaximumSize(QtCore.QSize(16777215, 28))
        self.label_costume_state_4.setObjectName("label_costume_state_4")
        self.gridLayout_3.addWidget(self.label_costume_state_4, 4, 4, 1, 1)
        self.label_servant_state_2 = QtWidgets.QLabel(self.groupBox)
        sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Preferred)
        sizePolicy.setHorizontalStretch(0)
        sizePolicy.setVerticalStretch(0)
        sizePolicy.setHeightForWidth(self.label_servant_state_2.sizePolicy().hasHeightForWidth())
        self.label_servant_state_2.setSizePolicy(sizePolicy)
        self.label_servant_state_2.setMinimumSize(QtCore.QSize(110, 65))
        self.label_servant_state_2.setMaximumSize(QtCore.QSize(16777215, 65))
        self.label_servant_state_2.setObjectName("label_servant_state_2")
        self.gridLayout_3.addWidget(self.label_servant_state_2, 2, 1, 1, 1)
        self.line_7 = QtWidgets.QFrame(self.groupBox)
        self.line_7.setFrameShape(QtWidgets.QFrame.VLine)
        self.line_7.setFrameShadow(QtWidgets.QFrame.Sunken)
        self.line_7.setObjectName("line_7")
        self.gridLayout_3.addWidget(self.line_7, 0, 7, 1, 1)
        self.label_costume_state_1 = QtWidgets.QLabel(self.groupBox)
        sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Preferred)
        sizePolicy.setHorizontalStretch(0)
        sizePolicy.setVerticalStretch(0)
        sizePolicy.setHeightForWidth(self.label_costume_state_1.sizePolicy().hasHeightForWidth())
        self.label_costume_state_1.setSizePolicy(sizePolicy)
        self.label_costume_state_1.setMaximumSize(QtCore.QSize(16777212, 28))
        self.label_costume_state_1.setObjectName("label_costume_state_1")
        self.gridLayout_3.addWidget(self.label_costume_state_1, 4, 0, 1, 1)
        self.box_skill_confirm = QtWidgets.QCheckBox(self.groupBox)
        self.box_skill_confirm.setObjectName("box_skill_confirm")
        self.gridLayout_3.addWidget(self.box_skill_confirm, 4, 8, 1, 1)
        self.horizontalLayout_29 = QtWidgets.QHBoxLayout()
        self.horizontalLayout_29.setObjectName("horizontalLayout_29")
        self.btn_select_servant_5 = QtWidgets.QPushButton(self.groupBox)
        sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Fixed)
        sizePolicy.setHorizontalStretch(0)
        sizePolicy.setVerticalStretch(0)
        sizePolicy.setHeightForWidth(self.btn_select_servant_5.sizePolicy().hasHeightForWidth())
        self.btn_select_servant_5.setSizePolicy(sizePolicy)
        self.btn_select_servant_5.setMinimumSize(QtCore.QSize(92, 100))
        self.btn_select_servant_5.setMaximumSize(QtCore.QSize(92, 100))
        self.btn_select_servant_5.setText("")
        self.btn_select_servant_5.setIconSize(QtCore.QSize(92, 100))
        self.btn_select_servant_5.setObjectName("btn_select_servant_5")
        self.horizontalLayout_29.addWidget(self.btn_select_servant_5)
        self.gridLayout_3.addLayout(self.horizontalLayout_29, 0, 5, 1, 1)
        self.horizontalLayout_3 = QtWidgets.QHBoxLayout()
        self.horizontalLayout_3.setObjectName("horizontalLayout_3")
        self.label = QtWidgets.QLabel(self.groupBox)
        sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Preferred)
        sizePolicy.setHorizontalStretch(0)
        sizePolicy.setVerticalStretch(0)
        sizePolicy.setHeightForWidth(self.label.sizePolicy().hasHeightForWidth())
        self.label.setSizePolicy(sizePolicy)
        self.label.setObjectName("label")
        self.horizontalLayout_3.addWidget(self.label)
        self.spinbox_required_prob = QtWidgets.QSpinBox(self.groupBox)
        self.spinbox_required_prob.setMaximum(100)
        self.spinbox_required_prob.setProperty("value", 100)
        self.spinbox_required_prob.setObjectName("spinbox_required_prob")
        self.horizontalLayout_3.addWidget(self.spinbox_required_prob)
        self.gridLayout_3.addLayout(self.horizontalLayout_3, 3, 8, 1, 1)
        self.horizontalLayout_24 = QtWidgets.QHBoxLayout()
        self.horizontalLayout_24.setObjectName("horizontalLayout_24")
        self.btn_select_servant_1 = QtWidgets.QPushButton(self.groupBox)
        sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Fixed)
        sizePolicy.setHorizontalStretch(0)
        sizePolicy.setVerticalStretch(0)
        sizePolicy.setHeightForWidth(self.btn_select_servant_1.sizePolicy().hasHeightForWidth())
        self.btn_select_servant_1.setSizePolicy(sizePolicy)
        self.btn_select_servant_1.setMinimumSize(QtCore.QSize(92, 100))
        self.btn_select_servant_1.setMaximumSize(QtCore.QSize(92, 100))
        self.btn_select_servant_1.setText("")
        self.btn_select_servant_1.setIconSize(QtCore.QSize(92, 100))
        self.btn_select_servant_1.setObjectName("btn_select_servant_1")
        self.horizontalLayout_24.addWidget(self.btn_select_servant_1)
        self.gridLayout_3.addLayout(self.horizontalLayout_24, 0, 0, 1, 1)
        self.label_costume_state_5 = QtWidgets.QLabel(self.groupBox)
        sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Preferred)
        sizePolicy.setHorizontalStretch(0)
        sizePolicy.setVerticalStretch(0)
        sizePolicy.setHeightForWidth(self.label_costume_state_5.sizePolicy().hasHeightForWidth())
        self.label_costume_state_5.setSizePolicy(sizePolicy)
        self.label_costume_state_5.setMaximumSize(QtCore.QSize(16777215, 28))
        self.label_costume_state_5.setObjectName("label_costume_state_5")
        self.gridLayout_3.addWidget(self.label_costume_state_5, 4, 5, 1, 1)
        self.label_costume_state_6 = QtWidgets.QLabel(self.groupBox)
        sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Preferred)
        sizePolicy.setHorizontalStretch(0)
        sizePolicy.setVerticalStretch(0)
        sizePolicy.setHeightForWidth(self.label_costume_state_6.sizePolicy().hasHeightForWidth())
        self.label_costume_state_6.setSizePolicy(sizePolicy)
        self.label_costume_state_6.setMaximumSize(QtCore.QSize(16777215, 28))
        self.label_costume_state_6.setObjectName("label_costume_state_6")
        self.gridLayout_3.addWidget(self.label_costume_state_6, 4, 6, 1, 1)
        self.horizontalLayout_26 = QtWidgets.QHBoxLayout()
        self.horizontalLayout_26.setObjectName("horizontalLayout_26")
        self.btn_select_servant_2 = QtWidgets.QPushButton(self.groupBox)
        sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Fixed)
        sizePolicy.setHorizontalStretch(0)
        sizePolicy.setVerticalStretch(0)
        sizePolicy.setHeightForWidth(self.btn_select_servant_2.sizePolicy().hasHeightForWidth())
        self.btn_select_servant_2.setSizePolicy(sizePolicy)
        self.btn_select_servant_2.setMinimumSize(QtCore.QSize(92, 100))
        self.btn_select_servant_2.setMaximumSize(QtCore.QSize(92, 100))
        self.btn_select_servant_2.setText("")
        self.btn_select_servant_2.setIconSize(QtCore.QSize(92, 100))
        self.btn_select_servant_2.setObjectName("btn_select_servant_2")
        self.horizontalLayout_26.addWidget(self.btn_select_servant_2)
        self.gridLayout_3.addLayout(self.horizontalLayout_26, 0, 1, 1, 1)
        self.horizontalLayout_23 = QtWidgets.QHBoxLayout()
        self.horizontalLayout_23.setObjectName("horizontalLayout_23")
        self.btn_select_master = QtWidgets.QPushButton(self.groupBox)
        self.btn_select_master.setMinimumSize(QtCore.QSize(92, 100))
        self.btn_select_master.setMaximumSize(QtCore.QSize(92, 100))
        self.btn_select_master.setText("")
        self.btn_select_master.setIconSize(QtCore.QSize(100, 100))
        self.btn_select_master.setObjectName("btn_select_master")
        self.horizontalLayout_23.addWidget(self.btn_select_master)
        self.gridLayout_3.addLayout(self.horizontalLayout_23, 0, 8, 1, 1)
        self.label_servant_state_1 = QtWidgets.QLabel(self.groupBox)
        sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Preferred)
        sizePolicy.setHorizontalStretch(0)
        sizePolicy.setVerticalStretch(0)
        sizePolicy.setHeightForWidth(self.label_servant_state_1.sizePolicy().hasHeightForWidth())
        self.label_servant_state_1.setSizePolicy(sizePolicy)
        self.label_servant_state_1.setMinimumSize(QtCore.QSize(110, 65))
        self.label_servant_state_1.setMaximumSize(QtCore.QSize(16777215, 65))
        self.label_servant_state_1.setObjectName("label_servant_state_1")
        self.gridLayout_3.addWidget(self.label_servant_state_1, 2, 0, 1, 1)
        self.label_master_state = QtWidgets.QLabel(self.groupBox)
        sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Preferred)
        sizePolicy.setHorizontalStretch(0)
        sizePolicy.setVerticalStretch(0)
        sizePolicy.setHeightForWidth(self.label_master_state.sizePolicy().hasHeightForWidth())
        self.label_master_state.setSizePolicy(sizePolicy)
        self.label_master_state.setMinimumSize(QtCore.QSize(0, 0))
        self.label_master_state.setMaximumSize(QtCore.QSize(16777215, 28))
        self.label_master_state.setObjectName("label_master_state")
        self.gridLayout_3.addWidget(self.label_master_state, 2, 8, 1, 1)
        self.horizontalLayout_38 = QtWidgets.QHBoxLayout()
        self.horizontalLayout_38.setObjectName("horizontalLayout_38")
        self.btn_select_costume_3 = QtWidgets.QPushButton(self.groupBox)
        self.btn_select_costume_3.setMinimumSize(QtCore.QSize(100, 45))
        self.btn_select_costume_3.setMaximumSize(QtCore.QSize(100, 45))
        self.btn_select_costume_3.setText("")
        self.btn_select_costume_3.setIconSize(QtCore.QSize(100, 150))
        self.btn_select_costume_3.setObjectName("btn_select_costume_3")
        self.horizontalLayout_38.addWidget(self.btn_select_costume_3)
        self.gridLayout_3.addLayout(self.horizontalLayout_38, 3, 2, 1, 1)
        self.label_costume_state_2 = QtWidgets.QLabel(self.groupBox)
        sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Preferred)
        sizePolicy.setHorizontalStretch(0)
        sizePolicy.setVerticalStretch(0)
        sizePolicy.setHeightForWidth(self.label_costume_state_2.sizePolicy().hasHeightForWidth())
        self.label_costume_state_2.setSizePolicy(sizePolicy)
        self.label_costume_state_2.setMaximumSize(QtCore.QSize(16777215, 28))
        self.label_costume_state_2.setObjectName("label_costume_state_2")
        self.gridLayout_3.addWidget(self.label_costume_state_2, 4, 1, 1, 1)
        self.horizontalLayout_28 = QtWidgets.QHBoxLayout()
        self.horizontalLayout_28.setObjectName("horizontalLayout_28")
        self.btn_select_servant_4 = QtWidgets.QPushButton(self.groupBox)
        sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Fixed)
        sizePolicy.setHorizontalStretch(0)
        sizePolicy.setVerticalStretch(0)
        sizePolicy.setHeightForWidth(self.btn_select_servant_4.sizePolicy().hasHeightForWidth())
        self.btn_select_servant_4.setSizePolicy(sizePolicy)
        self.btn_select_servant_4.setMinimumSize(QtCore.QSize(92, 100))
        self.btn_select_servant_4.setMaximumSize(QtCore.QSize(92, 100))
        self.btn_select_servant_4.setText("")
        self.btn_select_servant_4.setIconSize(QtCore.QSize(92, 100))
        self.btn_select_servant_4.setObjectName("btn_select_servant_4")
        self.horizontalLayout_28.addWidget(self.btn_select_servant_4)
        self.gridLayout_3.addLayout(self.horizontalLayout_28, 0, 4, 1, 1)
self.horizontalLayout_36 = QtWidgets.QHBoxLayout()
self.horizontalLayout_36.setObjectName("horizontalLayout_36")
self.btn_select_costume_2 = QtWidgets.QPushButton(self.groupBox)
self.btn_select_costume_2.setMinimumSize(QtCore.QSize(100, 45))
self.btn_select_costume_2.setMaximumSize(QtCore.QSize(100, 45))
self.btn_select_costume_2.setText("")
self.btn_select_costume_2.setIconSize(QtCore.QSize(100, 150))
self.btn_select_costume_2.setObjectName("btn_select_costume_2")
self.horizontalLayout_36.addWidget(self.btn_select_costume_2)
self.gridLayout_3.addLayout(self.horizontalLayout_36, 3, 1, 1, 1)
self.horizontalLayout_46 = QtWidgets.QHBoxLayout()
self.horizontalLayout_46.setObjectName("horizontalLayout_46")
self.btn_select_costume_1 = QtWidgets.QPushButton(self.groupBox)
sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.btn_select_costume_1.sizePolicy().hasHeightForWidth())
self.btn_select_costume_1.setSizePolicy(sizePolicy)
self.btn_select_costume_1.setMinimumSize(QtCore.QSize(100, 45))
self.btn_select_costume_1.setMaximumSize(QtCore.QSize(100, 45))
self.btn_select_costume_1.setText("")
self.btn_select_costume_1.setIconSize(QtCore.QSize(100, 150))
self.btn_select_costume_1.setObjectName("btn_select_costume_1")
self.horizontalLayout_46.addWidget(self.btn_select_costume_1)
self.gridLayout_3.addLayout(self.horizontalLayout_46, 3, 0, 1, 1)
self.label_servant_state_3 = QtWidgets.QLabel(self.groupBox)
sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Preferred)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.label_servant_state_3.sizePolicy().hasHeightForWidth())
self.label_servant_state_3.setSizePolicy(sizePolicy)
self.label_servant_state_3.setMinimumSize(QtCore.QSize(110, 65))
self.label_servant_state_3.setMaximumSize(QtCore.QSize(16777215, 65))
self.label_servant_state_3.setObjectName("label_servant_state_3")
self.gridLayout_3.addWidget(self.label_servant_state_3, 2, 2, 1, 1)
self.label_servant_state_5 = QtWidgets.QLabel(self.groupBox)
sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Preferred)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.label_servant_state_5.sizePolicy().hasHeightForWidth())
self.label_servant_state_5.setSizePolicy(sizePolicy)
self.label_servant_state_5.setMinimumSize(QtCore.QSize(110, 65))
self.label_servant_state_5.setMaximumSize(QtCore.QSize(16777215, 65))
self.label_servant_state_5.setObjectName("label_servant_state_5")
self.gridLayout_3.addWidget(self.label_servant_state_5, 2, 5, 1, 1)
self.horizontalLayout_44 = QtWidgets.QHBoxLayout()
self.horizontalLayout_44.setObjectName("horizontalLayout_44")
self.btn_select_costume_6 = QtWidgets.QPushButton(self.groupBox)
self.btn_select_costume_6.setMinimumSize(QtCore.QSize(100, 45))
self.btn_select_costume_6.setMaximumSize(QtCore.QSize(100, 45))
self.btn_select_costume_6.setText("")
self.btn_select_costume_6.setIconSize(QtCore.QSize(100, 150))
self.btn_select_costume_6.setObjectName("btn_select_costume_6")
self.horizontalLayout_44.addWidget(self.btn_select_costume_6)
self.gridLayout_3.addLayout(self.horizontalLayout_44, 3, 6, 1, 1)
self.horizontalLayout_27 = QtWidgets.QHBoxLayout()
self.horizontalLayout_27.setObjectName("horizontalLayout_27")
self.btn_select_servant_3 = QtWidgets.QPushButton(self.groupBox)
sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.btn_select_servant_3.sizePolicy().hasHeightForWidth())
self.btn_select_servant_3.setSizePolicy(sizePolicy)
self.btn_select_servant_3.setMinimumSize(QtCore.QSize(92, 100))
self.btn_select_servant_3.setMaximumSize(QtCore.QSize(92, 100))
self.btn_select_servant_3.setText("")
self.btn_select_servant_3.setIconSize(QtCore.QSize(92, 100))
self.btn_select_servant_3.setObjectName("btn_select_servant_3")
self.horizontalLayout_27.addWidget(self.btn_select_servant_3)
self.gridLayout_3.addLayout(self.horizontalLayout_27, 0, 2, 1, 1)
self.label_costume_state_3 = QtWidgets.QLabel(self.groupBox)
sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Preferred)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.label_costume_state_3.sizePolicy().hasHeightForWidth())
self.label_costume_state_3.setSizePolicy(sizePolicy)
self.label_costume_state_3.setMaximumSize(QtCore.QSize(16777215, 28))
self.label_costume_state_3.setObjectName("label_costume_state_3")
self.gridLayout_3.addWidget(self.label_costume_state_3, 4, 2, 1, 1)
self.label_servant_state_4 = QtWidgets.QLabel(self.groupBox)
sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Preferred)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.label_servant_state_4.sizePolicy().hasHeightForWidth())
self.label_servant_state_4.setSizePolicy(sizePolicy)
self.label_servant_state_4.setMinimumSize(QtCore.QSize(110, 65))
self.label_servant_state_4.setMaximumSize(QtCore.QSize(16777215, 65))
self.label_servant_state_4.setObjectName("label_servant_state_4")
self.gridLayout_3.addWidget(self.label_servant_state_4, 2, 4, 1, 1)
self.line_8 = QtWidgets.QFrame(self.groupBox)
self.line_8.setFrameShape(QtWidgets.QFrame.VLine)
self.line_8.setFrameShadow(QtWidgets.QFrame.Sunken)
self.line_8.setObjectName("line_8")
self.gridLayout_3.addWidget(self.line_8, 3, 7, 1, 1)
self.horizontalLayout_40 = QtWidgets.QHBoxLayout()
self.horizontalLayout_40.setObjectName("horizontalLayout_40")
self.btn_select_costume_4 = QtWidgets.QPushButton(self.groupBox)
self.btn_select_costume_4.setMinimumSize(QtCore.QSize(100, 45))
self.btn_select_costume_4.setMaximumSize(QtCore.QSize(100, 45))
self.btn_select_costume_4.setText("")
self.btn_select_costume_4.setIconSize(QtCore.QSize(100, 150))
self.btn_select_costume_4.setObjectName("btn_select_costume_4")
self.horizontalLayout_40.addWidget(self.btn_select_costume_4)
self.gridLayout_3.addLayout(self.horizontalLayout_40, 3, 4, 1, 1)
self.label_servant_state_6 = QtWidgets.QLabel(self.groupBox)
sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Preferred)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.label_servant_state_6.sizePolicy().hasHeightForWidth())
self.label_servant_state_6.setSizePolicy(sizePolicy)
self.label_servant_state_6.setMinimumSize(QtCore.QSize(110, 65))
self.label_servant_state_6.setMaximumSize(QtCore.QSize(16777215, 65))
self.label_servant_state_6.setObjectName("label_servant_state_6")
self.gridLayout_3.addWidget(self.label_servant_state_6, 2, 6, 1, 1)
self.line_3 = QtWidgets.QFrame(self.groupBox)
self.line_3.setFrameShape(QtWidgets.QFrame.VLine)
self.line_3.setFrameShadow(QtWidgets.QFrame.Sunken)
self.line_3.setObjectName("line_3")
self.gridLayout_3.addWidget(self.line_3, 0, 3, 1, 1)
self.horizontalLayout_42 = QtWidgets.QHBoxLayout()
self.horizontalLayout_42.setObjectName("horizontalLayout_42")
self.btn_select_costume_5 = QtWidgets.QPushButton(self.groupBox)
self.btn_select_costume_5.setMinimumSize(QtCore.QSize(100, 45))
self.btn_select_costume_5.setMaximumSize(QtCore.QSize(100, 45))
self.btn_select_costume_5.setText("")
self.btn_select_costume_5.setIconSize(QtCore.QSize(100, 150))
self.btn_select_costume_5.setObjectName("btn_select_costume_5")
self.horizontalLayout_42.addWidget(self.btn_select_costume_5)
self.gridLayout_3.addLayout(self.horizontalLayout_42, 3, 5, 1, 1)
self.horizontalLayout_30 = QtWidgets.QHBoxLayout()
self.horizontalLayout_30.setObjectName("horizontalLayout_30")
self.btn_select_servant_6 = QtWidgets.QPushButton(self.groupBox)
sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.btn_select_servant_6.sizePolicy().hasHeightForWidth())
self.btn_select_servant_6.setSizePolicy(sizePolicy)
self.btn_select_servant_6.setMinimumSize(QtCore.QSize(92, 100))
self.btn_select_servant_6.setMaximumSize(QtCore.QSize(92, 100))
self.btn_select_servant_6.setText("")
self.btn_select_servant_6.setIconSize(QtCore.QSize(92, 100))
self.btn_select_servant_6.setObjectName("btn_select_servant_6")
self.horizontalLayout_30.addWidget(self.btn_select_servant_6)
self.gridLayout_3.addLayout(self.horizontalLayout_30, 0, 6, 1, 1)
self.line_4 = QtWidgets.QFrame(self.groupBox)
self.line_4.setFrameShape(QtWidgets.QFrame.VLine)
self.line_4.setFrameShadow(QtWidgets.QFrame.Sunken)
self.line_4.setObjectName("line_4")
self.gridLayout_3.addWidget(self.line_4, 3, 3, 1, 1)
self.verticalLayout_3.addLayout(self.gridLayout_3)
self.horizontalLayout_4 = QtWidgets.QHBoxLayout()
self.horizontalLayout_4.setObjectName("horizontalLayout_4")
self.btn_set_progress = QtWidgets.QPushButton(self.groupBox)
self.btn_set_progress.setObjectName("btn_set_progress")
self.horizontalLayout_4.addWidget(self.btn_set_progress)
self.btn_choose_level = QtWidgets.QPushButton(self.groupBox)
self.btn_choose_level.setObjectName("btn_choose_level")
self.horizontalLayout_4.addWidget(self.btn_choose_level)
self.btn_confirm_team = QtWidgets.QPushButton(self.groupBox)
self.btn_confirm_team.setObjectName("btn_confirm_team")
self.horizontalLayout_4.addWidget(self.btn_confirm_team)
self.btn_change_team = QtWidgets.QPushButton(self.groupBox)
self.btn_change_team.setEnabled(False)
self.btn_change_team.setObjectName("btn_change_team")
self.horizontalLayout_4.addWidget(self.btn_change_team)
self.btn_round_reset = QtWidgets.QPushButton(self.groupBox)
self.btn_round_reset.setEnabled(False)
self.btn_round_reset.setObjectName("btn_round_reset")
self.horizontalLayout_4.addWidget(self.btn_round_reset)
self.verticalLayout_3.addLayout(self.horizontalLayout_4)
self.verticalLayout_2.addWidget(self.groupBox)
self.horizontalLayout_15 = QtWidgets.QHBoxLayout()
self.horizontalLayout_15.setObjectName("horizontalLayout_15")
self.groupBox_2 = QtWidgets.QGroupBox(self.centralwidget)
self.groupBox_2.setTitle("")
self.groupBox_2.setObjectName("groupBox_2")
self.verticalLayout_6 = QtWidgets.QVBoxLayout(self.groupBox_2)
self.verticalLayout_6.setContentsMargins(5, 5, 5, 5)
self.verticalLayout_6.setSpacing(5)
self.verticalLayout_6.setObjectName("verticalLayout_6")
self.verticalLayout_4 = QtWidgets.QVBoxLayout()
self.verticalLayout_4.setObjectName("verticalLayout_4")
self.gridLayout_7 = QtWidgets.QGridLayout()
self.gridLayout_7.setObjectName("gridLayout_7")
self.round1_enemy3_class = QtWidgets.QLabel(self.groupBox_2)
self.round1_enemy3_class.setMinimumSize(QtCore.QSize(150, 0))
self.round1_enemy3_class.setMaximumSize(QtCore.QSize(150, 16777215))
self.round1_enemy3_class.setText("")
self.round1_enemy3_class.setObjectName("round1_enemy3_class")
self.gridLayout_7.addWidget(self.round1_enemy3_class, 1, 0, 1, 1)
self.round3_enemy1_class = QtWidgets.QLabel(self.groupBox_2)
self.round3_enemy1_class.setMinimumSize(QtCore.QSize(150, 0))
self.round3_enemy1_class.setMaximumSize(QtCore.QSize(150, 16777215))
self.round3_enemy1_class.setText("")
self.round3_enemy1_class.setObjectName("round3_enemy1_class")
self.gridLayout_7.addWidget(self.round3_enemy1_class, 7, 2, 1, 1)
self.horizontalLayout_10 = QtWidgets.QHBoxLayout()
self.horizontalLayout_10.setObjectName("horizontalLayout_10")
self.round2_enemy1_pic = QtWidgets.QPushButton(self.groupBox_2)
self.round2_enemy1_pic.setEnabled(False)
self.round2_enemy1_pic.setMinimumSize(QtCore.QSize(64, 64))
self.round2_enemy1_pic.setMaximumSize(QtCore.QSize(64, 64))
self.round2_enemy1_pic.setText("")
self.round2_enemy1_pic.setIconSize(QtCore.QSize(64, 64))
self.round2_enemy1_pic.setObjectName("round2_enemy1_pic")
self.horizontalLayout_10.addWidget(self.round2_enemy1_pic)
self.gridLayout_7.addLayout(self.horizontalLayout_10, 3, 2, 1, 1)
self.round3_enemy1_health = QtWidgets.QLabel(self.groupBox_2)
self.round3_enemy1_health.setMinimumSize(QtCore.QSize(150, 0))
self.round3_enemy1_health.setMaximumSize(QtCore.QSize(150, 16777215))
self.round3_enemy1_health.setText("")
self.round3_enemy1_health.setObjectName("round3_enemy1_health")
self.gridLayout_7.addWidget(self.round3_enemy1_health, 8, 2, 1, 1)
self.horizontalLayout_20 = QtWidgets.QHBoxLayout()
self.horizontalLayout_20.setObjectName("horizontalLayout_20")
self.round3_enemy2_pic = QtWidgets.QPushButton(self.groupBox_2)
self.round3_enemy2_pic.setEnabled(False)
self.round3_enemy2_pic.setMinimumSize(QtCore.QSize(64, 64))
self.round3_enemy2_pic.setMaximumSize(QtCore.QSize(64, 64))
self.round3_enemy2_pic.setText("")
self.round3_enemy2_pic.setIconSize(QtCore.QSize(64, 64))
self.round3_enemy2_pic.setObjectName("round3_enemy2_pic")
self.horizontalLayout_20.addWidget(self.round3_enemy2_pic)
self.gridLayout_7.addLayout(self.horizontalLayout_20, 6, 1, 1, 1)
self.round2_enemy3_class = QtWidgets.QLabel(self.groupBox_2)
self.round2_enemy3_class.setMinimumSize(QtCore.QSize(150, 0))
self.round2_enemy3_class.setMaximumSize(QtCore.QSize(150, 16777215))
self.round2_enemy3_class.setText("")
self.round2_enemy3_class.setObjectName("round2_enemy3_class")
self.gridLayout_7.addWidget(self.round2_enemy3_class, 4, 0, 1, 1)
self.round2_enemy2_class = QtWidgets.QLabel(self.groupBox_2)
self.round2_enemy2_class.setMinimumSize(QtCore.QSize(150, 0))
self.round2_enemy2_class.setMaximumSize(QtCore.QSize(150, 16777215))
self.round2_enemy2_class.setText("")
self.round2_enemy2_class.setObjectName("round2_enemy2_class")
self.gridLayout_7.addWidget(self.round2_enemy2_class, 4, 1, 1, 1)
self.horizontalLayout_6 = QtWidgets.QHBoxLayout()
self.horizontalLayout_6.setObjectName("horizontalLayout_6")
self.round2_enemy3_pic = QtWidgets.QPushButton(self.groupBox_2)
self.round2_enemy3_pic.setEnabled(False)
self.round2_enemy3_pic.setMinimumSize(QtCore.QSize(64, 64))
self.round2_enemy3_pic.setMaximumSize(QtCore.QSize(64, 64))
self.round2_enemy3_pic.setText("")
self.round2_enemy3_pic.setIconSize(QtCore.QSize(64, 64))
self.round2_enemy3_pic.setObjectName("round2_enemy3_pic")
self.horizontalLayout_6.addWidget(self.round2_enemy3_pic)
self.gridLayout_7.addLayout(self.horizontalLayout_6, 3, 0, 1, 1)
self.horizontalLayout_7 = QtWidgets.QHBoxLayout()
self.horizontalLayout_7.setObjectName("horizontalLayout_7")
self.round2_enemy2_pic = QtWidgets.QPushButton(self.groupBox_2)
self.round2_enemy2_pic.setEnabled(False)
self.round2_enemy2_pic.setMinimumSize(QtCore.QSize(64, 64))
self.round2_enemy2_pic.setMaximumSize(QtCore.QSize(64, 64))
self.round2_enemy2_pic.setText("")
self.round2_enemy2_pic.setIconSize(QtCore.QSize(64, 64))
self.round2_enemy2_pic.setObjectName("round2_enemy2_pic")
self.horizontalLayout_7.addWidget(self.round2_enemy2_pic)
self.gridLayout_7.addLayout(self.horizontalLayout_7, 3, 1, 1, 1)
self.horizontalLayout_21 = QtWidgets.QHBoxLayout()
self.horizontalLayout_21.setObjectName("horizontalLayout_21")
self.round3_enemy3_pic = QtWidgets.QPushButton(self.groupBox_2)
self.round3_enemy3_pic.setEnabled(False)
self.round3_enemy3_pic.setMinimumSize(QtCore.QSize(64, 64))
self.round3_enemy3_pic.setMaximumSize(QtCore.QSize(64, 64))
self.round3_enemy3_pic.setText("")
self.round3_enemy3_pic.setIconSize(QtCore.QSize(64, 64))
self.round3_enemy3_pic.setObjectName("round3_enemy3_pic")
self.horizontalLayout_21.addWidget(self.round3_enemy3_pic)
self.gridLayout_7.addLayout(self.horizontalLayout_21, 6, 0, 1, 1)
self.horizontalLayout_2 = QtWidgets.QHBoxLayout()
self.horizontalLayout_2.setObjectName("horizontalLayout_2")
self.round1_enemy2_pic = QtWidgets.QPushButton(self.groupBox_2)
self.round1_enemy2_pic.setEnabled(False)
self.round1_enemy2_pic.setMinimumSize(QtCore.QSize(64, 64))
self.round1_enemy2_pic.setMaximumSize(QtCore.QSize(64, 64))
self.round1_enemy2_pic.setText("")
self.round1_enemy2_pic.setIconSize(QtCore.QSize(64, 64))
self.round1_enemy2_pic.setObjectName("round1_enemy2_pic")
self.horizontalLayout_2.addWidget(self.round1_enemy2_pic)
self.gridLayout_7.addLayout(self.horizontalLayout_2, 0, 1, 1, 1)
self.round3_enemy3_class = QtWidgets.QLabel(self.groupBox_2)
self.round3_enemy3_class.setMinimumSize(QtCore.QSize(150, 0))
self.round3_enemy3_class.setMaximumSize(QtCore.QSize(150, 28))
self.round3_enemy3_class.setText("")
self.round3_enemy3_class.setObjectName("round3_enemy3_class")
self.gridLayout_7.addWidget(self.round3_enemy3_class, 7, 0, 1, 1)
self.round1_enemy2_class = QtWidgets.QLabel(self.groupBox_2)
self.round1_enemy2_class.setMinimumSize(QtCore.QSize(150, 0))
self.round1_enemy2_class.setMaximumSize(QtCore.QSize(150, 28))
self.round1_enemy2_class.setText("")
self.round1_enemy2_class.setObjectName("round1_enemy2_class")
self.gridLayout_7.addWidget(self.round1_enemy2_class, 1, 1, 1, 1)
self.round3_enemy3_health = QtWidgets.QLabel(self.groupBox_2)
self.round3_enemy3_health.setMinimumSize(QtCore.QSize(150, 0))
self.round3_enemy3_health.setMaximumSize(QtCore.QSize(150, 16777215))
self.round3_enemy3_health.setText("")
self.round3_enemy3_health.setObjectName("round3_enemy3_health")
self.gridLayout_7.addWidget(self.round3_enemy3_health, 8, 0, 1, 1)
self.round1_enemy3_health = QtWidgets.QLabel(self.groupBox_2)
self.round1_enemy3_health.setMinimumSize(QtCore.QSize(150, 0))
self.round1_enemy3_health.setMaximumSize(QtCore.QSize(150, 16777215))
self.round1_enemy3_health.setText("")
self.round1_enemy3_health.setObjectName("round1_enemy3_health")
self.gridLayout_7.addWidget(self.round1_enemy3_health, 2, 0, 1, 1)
self.horizontalLayout = QtWidgets.QHBoxLayout()
self.horizontalLayout.setObjectName("horizontalLayout")
self.round1_enemy1_pic = QtWidgets.QPushButton(self.groupBox_2)
self.round1_enemy1_pic.setEnabled(False)
self.round1_enemy1_pic.setMinimumSize(QtCore.QSize(64, 64))
self.round1_enemy1_pic.setMaximumSize(QtCore.QSize(64, 64))
self.round1_enemy1_pic.setText("")
self.round1_enemy1_pic.setIconSize(QtCore.QSize(64, 64))
self.round1_enemy1_pic.setObjectName("round1_enemy1_pic")
self.horizontalLayout.addWidget(self.round1_enemy1_pic)
self.gridLayout_7.addLayout(self.horizontalLayout, 0, 2, 1, 1)
self.round2_enemy3_health = QtWidgets.QLabel(self.groupBox_2)
self.round2_enemy3_health.setMinimumSize(QtCore.QSize(150, 0))
self.round2_enemy3_health.setMaximumSize(QtCore.QSize(150, 16777215))
self.round2_enemy3_health.setText("")
self.round2_enemy3_health.setObjectName("round2_enemy3_health")
self.gridLayout_7.addWidget(self.round2_enemy3_health, 5, 0, 1, 1)
self.round2_enemy2_health = QtWidgets.QLabel(self.groupBox_2)
self.round2_enemy2_health.setMinimumSize(QtCore.QSize(150, 0))
self.round2_enemy2_health.setMaximumSize(QtCore.QSize(150, 16777215))
self.round2_enemy2_health.setText("")
self.round2_enemy2_health.setObjectName("round2_enemy2_health")
self.gridLayout_7.addWidget(self.round2_enemy2_health, 5, 1, 1, 1)
self.round3_enemy2_health = QtWidgets.QLabel(self.groupBox_2)
self.round3_enemy2_health.setMinimumSize(QtCore.QSize(150, 0))
self.round3_enemy2_health.setMaximumSize(QtCore.QSize(150, 16777215))
self.round3_enemy2_health.setText("")
self.round3_enemy2_health.setObjectName("round3_enemy2_health")
self.gridLayout_7.addWidget(self.round3_enemy2_health, 8, 1, 1, 1)
self.round1_enemy2_health = QtWidgets.QLabel(self.groupBox_2)
self.round1_enemy2_health.setMinimumSize(QtCore.QSize(150, 0))
self.round1_enemy2_health.setMaximumSize(QtCore.QSize(150, 16777215))
self.round1_enemy2_health.setText("")
self.round1_enemy2_health.setObjectName("round1_enemy2_health")
self.gridLayout_7.addWidget(self.round1_enemy2_health, 2, 1, 1, 1)
self.round2_enemy1_class = QtWidgets.QLabel(self.groupBox_2)
self.round2_enemy1_class.setMinimumSize(QtCore.QSize(150, 0))
self.round2_enemy1_class.setMaximumSize(QtCore.QSize(150, 16777215))
self.round2_enemy1_class.setText("")
self.round2_enemy1_class.setObjectName("round2_enemy1_class")
self.gridLayout_7.addWidget(self.round2_enemy1_class, 4, 2, 1, 1)
self.round1_enemy1_class = QtWidgets.QLabel(self.groupBox_2)
self.round1_enemy1_class.setMinimumSize(QtCore.QSize(150, 0))
self.round1_enemy1_class.setMaximumSize(QtCore.QSize(150, 28))
self.round1_enemy1_class.setText("")
self.round1_enemy1_class.setObjectName("round1_enemy1_class")
self.gridLayout_7.addWidget(self.round1_enemy1_class, 1, 2, 1, 1)
self.round1_enemy1_health = QtWidgets.QLabel(self.groupBox_2)
self.round1_enemy1_health.setMinimumSize(QtCore.QSize(150, 0))
self.round1_enemy1_health.setMaximumSize(QtCore.QSize(150, 16777215))
self.round1_enemy1_health.setText("")
self.round1_enemy1_health.setObjectName("round1_enemy1_health")
self.gridLayout_7.addWidget(self.round1_enemy1_health, 2, 2, 1, 1)
self.round2_enemy1_health = QtWidgets.QLabel(self.groupBox_2)
self.round2_enemy1_health.setMinimumSize(QtCore.QSize(150, 0))
self.round2_enemy1_health.setMaximumSize(QtCore.QSize(150, 16777215))
self.round2_enemy1_health.setText("")
self.round2_enemy1_health.setObjectName("round2_enemy1_health")
self.gridLayout_7.addWidget(self.round2_enemy1_health, 5, 2, 1, 1)
self.horizontalLayout_12 = QtWidgets.QHBoxLayout()
self.horizontalLayout_12.setObjectName("horizontalLayout_12")
self.round3_enemy1_pic = QtWidgets.QPushButton(self.groupBox_2)
self.round3_enemy1_pic.setEnabled(False)
self.round3_enemy1_pic.setMinimumSize(QtCore.QSize(64, 64))
self.round3_enemy1_pic.setMaximumSize(QtCore.QSize(64, 64))
self.round3_enemy1_pic.setText("")
self.round3_enemy1_pic.setIconSize(QtCore.QSize(64, 64))
self.round3_enemy1_pic.setObjectName("round3_enemy1_pic")
self.horizontalLayout_12.addWidget(self.round3_enemy1_pic)
self.gridLayout_7.addLayout(self.horizontalLayout_12, 6, 2, 1, 1)
self.horizontalLayout_5 = QtWidgets.QHBoxLayout()
self.horizontalLayout_5.setObjectName("horizontalLayout_5")
self.round1_enemy3_pic = QtWidgets.QPushButton(self.groupBox_2)
self.round1_enemy3_pic.setEnabled(False)
self.round1_enemy3_pic.setMinimumSize(QtCore.QSize(64, 64))
self.round1_enemy3_pic.setMaximumSize(QtCore.QSize(64, 64))
self.round1_enemy3_pic.setText("")
self.round1_enemy3_pic.setIconSize(QtCore.QSize(64, 64))
self.round1_enemy3_pic.setObjectName("round1_enemy3_pic")
self.horizontalLayout_5.addWidget(self.round1_enemy3_pic)
self.gridLayout_7.addLayout(self.horizontalLayout_5, 0, 0, 1, 1)
self.round3_enemy2_class = QtWidgets.QLabel(self.groupBox_2)
self.round3_enemy2_class.setMinimumSize(QtCore.QSize(150, 0))
self.round3_enemy2_class.setMaximumSize(QtCore.QSize(150, 28))
self.round3_enemy2_class.setText("")
self.round3_enemy2_class.setObjectName("round3_enemy2_class")
self.gridLayout_7.addWidget(self.round3_enemy2_class, 7, 1, 1, 1)
self.verticalLayout_4.addLayout(self.gridLayout_7)
self.verticalLayout_6.addLayout(self.verticalLayout_4)
self.horizontalLayout_15.addWidget(self.groupBox_2)
self.groupBox_3 = QtWidgets.QGroupBox(self.centralwidget)
sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Minimum)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.groupBox_3.sizePolicy().hasHeightForWidth())
self.groupBox_3.setSizePolicy(sizePolicy)
self.groupBox_3.setTitle("")
self.groupBox_3.setObjectName("groupBox_3")
self.verticalLayout_7 = QtWidgets.QVBoxLayout(self.groupBox_3)
self.verticalLayout_7.setContentsMargins(5, 5, 5, 5)
self.verticalLayout_7.setSpacing(5)
self.verticalLayout_7.setObjectName("verticalLayout_7")
self.gridLayout_2 = QtWidgets.QGridLayout()
self.gridLayout_2.setObjectName("gridLayout_2")
self.verticalLayout_5 = QtWidgets.QVBoxLayout()
self.verticalLayout_5.setObjectName("verticalLayout_5")
self.round1_label_random = QtWidgets.QLabel(self.groupBox_3)
self.round1_label_random.setEnabled(False)
self.round1_label_random.setMaximumSize(QtCore.QSize(100, 16777215))
self.round1_label_random.setObjectName("round1_label_random")
self.verticalLayout_5.addWidget(self.round1_label_random)
self.round1_bar_random = QtWidgets.QSlider(self.groupBox_3)
self.round1_bar_random.setEnabled(False)
self.round1_bar_random.setMaximumSize(QtCore.QSize(100, 16777215))
self.round1_bar_random.setMinimum(90)
self.round1_bar_random.setMaximum(110)
self.round1_bar_random.setProperty("value", 90)
self.round1_bar_random.setOrientation(QtCore.Qt.Horizontal)
self.round1_bar_random.setObjectName("round1_bar_random")
self.verticalLayout_5.addWidget(self.round1_bar_random)
self.gridLayout_2.addLayout(self.verticalLayout_5, 1, 8, 1, 1)
self.round1_servant2_np = QtWidgets.QPushButton(self.groupBox_3)
self.round1_servant2_np.setEnabled(False)
self.round1_servant2_np.setObjectName("round1_servant2_np")
self.gridLayout_2.addWidget(self.round1_servant2_np, 4, 5, 1, 1)
self.round1_servant3_np = QtWidgets.QPushButton(self.groupBox_3)
self.round1_servant3_np.setEnabled(False)
self.round1_servant3_np.setObjectName("round1_servant3_np")
self.gridLayout_2.addWidget(self.round1_servant3_np, 4, 6, 1, 1)
self.horizontalLayout_16 = QtWidgets.QHBoxLayout()
self.horizontalLayout_16.setObjectName("horizontalLayout_16")
self.round1_servant2_pic = QtWidgets.QPushButton(self.groupBox_3)
self.round1_servant2_pic.setEnabled(False)
self.round1_servant2_pic.setMinimumSize(QtCore.QSize(64, 70))
self.round1_servant2_pic.setMaximumSize(QtCore.QSize(64, 70))
self.round1_servant2_pic.setText("")
self.round1_servant2_pic.setIconSize(QtCore.QSize(64, 70))
self.round1_servant2_pic.setObjectName("round1_servant2_pic")
self.horizontalLayout_16.addWidget(self.round1_servant2_pic)
self.gridLayout_2.addLayout(self.horizontalLayout_16, 1, 5, 1, 1)
self.horizontalLayout_9 = QtWidgets.QHBoxLayout()
self.horizontalLayout_9.setObjectName("horizontalLayout_9")
self.round1_servant1_pic = QtWidgets.QPushButton(self.groupBox_3)
self.round1_servant1_pic.setEnabled(False)
sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Minimum, QtWidgets.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.round1_servant1_pic.sizePolicy().hasHeightForWidth())
self.round1_servant1_pic.setSizePolicy(sizePolicy)
self.round1_servant1_pic.setMinimumSize(QtCore.QSize(64, 70))
self.round1_servant1_pic.setMaximumSize(QtCore.QSize(64, 70))
self.round1_servant1_pic.setText("")
self.round1_servant1_pic.setIconSize(QtCore.QSize(64, 70))
self.round1_servant1_pic.setObjectName("round1_servant1_pic")
self.horizontalLayout_9.addWidget(self.round1_servant1_pic)
self.gridLayout_2.addLayout(self.horizontalLayout_9, 1, 4, 1, 1)
self.horizontalLayout_19 = QtWidgets.QHBoxLayout()
self.horizontalLayout_19.setObjectName("horizontalLayout_19")
self.round1_master_pic = QtWidgets.QPushButton(self.groupBox_3)
self.round1_master_pic.setEnabled(False)
self.round1_master_pic.setMinimumSize(QtCore.QSize(64, 70))
self.round1_master_pic.setMaximumSize(QtCore.QSize(64, 70))
self.round1_master_pic.setText("")
self.round1_master_pic.setIconSize(QtCore.QSize(64, 64))
self.round1_master_pic.setObjectName("round1_master_pic")
self.horizontalLayout_19.addWidget(self.round1_master_pic)
self.gridLayout_2.addLayout(self.horizontalLayout_19, 1, 7, 1, 1)
self.horizontalLayout_11 = QtWidgets.QHBoxLayout()
self.horizontalLayout_11.setObjectName("horizontalLayout_11")
self.round1_servant1_skill1 = QtWidgets.QPushButton(self.groupBox_3)
self.round1_servant1_skill1.setEnabled(False)
self.round1_servant1_skill1.setMinimumSize(QtCore.QSize(30, 30))
self.round1_servant1_skill1.setMaximumSize(QtCore.QSize(30, 30))
self.round1_servant1_skill1.setText("")
self.round1_servant1_skill1.setIconSize(QtCore.QSize(30, 30))
self.round1_servant1_skill1.setObjectName("round1_servant1_skill1")
self.horizontalLayout_11.addWidget(self.round1_servant1_skill1)
self.round1_servant1_skill2 = QtWidgets.QPushButton(self.groupBox_3)
self.round1_servant1_skill2.setEnabled(False)
self.round1_servant1_skill2.setMinimumSize(QtCore.QSize(30, 30))
self.round1_servant1_skill2.setMaximumSize(QtCore.QSize(30, 30))
self.round1_servant1_skill2.setText("")
self.round1_servant1_skill2.setIconSize(QtCore.QSize(30, 30))
self.round1_servant1_skill2.setObjectName("round1_servant1_skill2")
self.horizontalLayout_11.addWidget(self.round1_servant1_skill2)
self.round1_servant1_skill3 = QtWidgets.QPushButton(self.groupBox_3)
self.round1_servant1_skill3.setEnabled(False)
self.round1_servant1_skill3.setMinimumSize(QtCore.QSize(30, 30))
self.round1_servant1_skill3.setMaximumSize(QtCore.QSize(30, 30))
self.round1_servant1_skill3.setText("")
self.round1_servant1_skill3.setIconSize(QtCore.QSize(30, 30))
self.round1_servant1_skill3.setObjectName("round1_servant1_skill3")
self.horizontalLayout_11.addWidget(self.round1_servant1_skill3)
self.gridLayout_2.addLayout(self.horizontalLayout_11, 3, 4, 1, 1)
self.horizontalLayout_17 = QtWidgets.QHBoxLayout()
self.horizontalLayout_17.setObjectName("horizontalLayout_17")
self.round1_servant3_pic = QtWidgets.QPushButton(self.groupBox_3)
self.round1_servant3_pic.setEnabled(False)
self.round1_servant3_pic.setMinimumSize(QtCore.QSize(64, 70))
self.round1_servant3_pic.setMaximumSize(QtCore.QSize(64, 70))
self.round1_servant3_pic.setText("")
self.round1_servant3_pic.setIconSize(QtCore.QSize(64, 70))
self.round1_servant3_pic.setObjectName("round1_servant3_pic")
self.horizontalLayout_17.addWidget(self.round1_servant3_pic)
self.gridLayout_2.addLayout(self.horizontalLayout_17, 1, 6, 1, 1)
self.btn_round1_next = QtWidgets.QPushButton(self.groupBox_3)
self.btn_round1_next.setEnabled(False)
self.btn_round1_next.setMinimumSize(QtCore.QSize(0, 30))
self.btn_round1_next.setMaximumSize(QtCore.QSize(16777215, 30))
self.btn_round1_next.setObjectName("btn_round1_next")
self.gridLayout_2.addWidget(self.btn_round1_next, 3, 8, 1, 1)
self.round1_servant1_np = QtWidgets.QPushButton(self.groupBox_3)
self.round1_servant1_np.setEnabled(False)
self.round1_servant1_np.setObjectName("round1_servant1_np")
self.gridLayout_2.addWidget(self.round1_servant1_np, 4, 4, 1, 1)
self.horizontalLayout_14 = QtWidgets.QHBoxLayout()
self.horizontalLayout_14.setObjectName("horizontalLayout_14")
self.round1_servant3_skill1 = QtWidgets.QPushButton(self.groupBox_3)
self.round1_servant3_skill1.setEnabled(False)
self.round1_servant3_skill1.setMinimumSize(QtCore.QSize(30, 30))
self.round1_servant3_skill1.setMaximumSize(QtCore.QSize(30, 30))
self.round1_servant3_skill1.setText("")
self.round1_servant3_skill1.setIconSize(QtCore.QSize(30, 30))
self.round1_servant3_skill1.setObjectName("round1_servant3_skill1")
self.horizontalLayout_14.addWidget(self.round1_servant3_skill1)
self.round1_servant3_skill2 = QtWidgets.QPushButton(self.groupBox_3)
self.round1_servant3_skill2.setEnabled(False)
self.round1_servant3_skill2.setMinimumSize(QtCore.QSize(30, 30))
self.round1_servant3_skill2.setMaximumSize(QtCore.QSize(30, 30))
self.round1_servant3_skill2.setText("")
self.round1_servant3_skill2.setIconSize(QtCore.QSize(30, 30))
self.round1_servant3_skill2.setObjectName("round1_servant3_skill2")
self.horizontalLayout_14.addWidget(self.round1_servant3_skill2)
self.round1_servant3_skill3 = QtWidgets.QPushButton(self.groupBox_3)
self.round1_servant3_skill3.setEnabled(False)
self.round1_servant3_skill3.setMinimumSize(QtCore.QSize(30, 30))
self.round1_servant3_skill3.setMaximumSize(QtCore.QSize(30, 30))
self.round1_servant3_skill3.setText("")
self.round1_servant3_skill3.setIconSize(QtCore.QSize(30, 30))
self.round1_servant3_skill3.setObjectName("round1_servant3_skill3")
self.horizontalLayout_14.addWidget(self.round1_servant3_skill3)
self.gridLayout_2.addLayout(self.horizontalLayout_14, 3, 6, 1, 1)
self.horizontalLayout_18 = QtWidgets.QHBoxLayout()
self.horizontalLayout_18.setObjectName("horizontalLayout_18")
self.round1_master_skill1 = QtWidgets.QPushButton(self.groupBox_3)
self.round1_master_skill1.setEnabled(False)
self.round1_master_skill1.setMinimumSize(QtCore.QSize(30, 30))
self.round1_master_skill1.setMaximumSize(QtCore.QSize(30, 30))
self.round1_master_skill1.setText("")
self.round1_master_skill1.setIconSize(QtCore.QSize(30, 30))
self.round1_master_skill1.setObjectName("round1_master_skill1")
self.horizontalLayout_18.addWidget(self.round1_master_skill1)
self.round1_master_skill2 = QtWidgets.QPushButton(self.groupBox_3)
self.round1_master_skill2.setEnabled(False)
self.round1_master_skill2.setMinimumSize(QtCore.QSize(30, 30))
self.round1_master_skill2.setMaximumSize(QtCore.QSize(30, 30))
self.round1_master_skill2.setText("")
self.round1_master_skill2.setIconSize(QtCore.QSize(30, 30))
self.round1_master_skill2.setObjectName("round1_master_skill2")
self.horizontalLayout_18.addWidget(self.round1_master_skill2)
self.round1_master_skill3 = QtWidgets.QPushButton(self.groupBox_3)
self.round1_master_skill3.setEnabled(False)
self.round1_master_skill3.setMinimumSize(QtCore.QSize(30, 30))
self.round1_master_skill3.setMaximumSize(QtCore.QSize(30, 30))
self.round1_master_skill3.setText("")
self.round1_master_skill3.setIconSize(QtCore.QSize(30, 30))
self.round1_master_skill3.setObjectName("round1_master_skill3")
self.horizontalLayout_18.addWidget(self.round1_master_skill3)
self.gridLayout_2.addLayout(self.horizontalLayout_18, 3, 7, 1, 1)
self.horizontalLayout_13 = QtWidgets.QHBoxLayout()
self.horizontalLayout_13.setObjectName("horizontalLayout_13")
self.round1_servant2_skill1 = QtWidgets.QPushButton(self.groupBox_3)
self.round1_servant2_skill1.setEnabled(False)
self.round1_servant2_skill1.setMinimumSize(QtCore.QSize(30, 30))
self.round1_servant2_skill1.setMaximumSize(QtCore.QSize(30, 30))
self.round1_servant2_skill1.setText("")
self.round1_servant2_skill1.setIconSize(QtCore.QSize(30, 30))
self.round1_servant2_skill1.setObjectName("round1_servant2_skill1")
self.horizontalLayout_13.addWidget(self.round1_servant2_skill1)
self.round1_servant2_skill2 = QtWidgets.QPushButton(self.groupBox_3)
self.round1_servant2_skill2.setEnabled(False)
self.round1_servant2_skill2.setMinimumSize(QtCore.QSize(30, 30))
self.round1_servant2_skill2.setMaximumSize(QtCore.QSize(30, 30))
self.round1_servant2_skill2.setText("")
self.round1_servant2_skill2.setIconSize(QtCore.QSize(30, 30))
self.round1_servant2_skill2.setObjectName("round1_servant2_skill2")
self.horizontalLayout_13.addWidget(self.round1_servant2_skill2)
self.round1_servant2_skill3 = QtWidgets.QPushButton(self.groupBox_3)
self.round1_servant2_skill3.setEnabled(False)
self.round1_servant2_skill3.setMinimumSize(QtCore.QSize(30, 30))
self.round1_servant2_skill3.setMaximumSize(QtCore.QSize(30, 30))
self.round1_servant2_skill3.setText("")
self.round1_servant2_skill3.setIconSize(QtCore.QSize(30, 30))
self.round1_servant2_skill3.setObjectName("round1_servant2_skill3")
self.horizontalLayout_13.addWidget(self.round1_servant2_skill3)
self.gridLayout_2.addLayout(self.horizontalLayout_13, 3, 5, 1, 1)
self.verticalLayout_7.addLayout(self.gridLayout_2)
self.gridLayout_4 = QtWidgets.QGridLayout()
self.gridLayout_4.setObjectName("gridLayout_4")
self.round2_servant3_np = QtWidgets.QPushButton(self.groupBox_3)
self.round2_servant3_np.setEnabled(False)
self.round2_servant3_np.setObjectName("round2_servant3_np")
self.gridLayout_4.addWidget(self.round2_servant3_np, 3, 5, 1, 1)
self.horizontalLayout_181 = QtWidgets.QHBoxLayout()
self.horizontalLayout_181.setObjectName("horizontalLayout_181")
self.round2_master_skill1 = QtWidgets.QPushButton(self.groupBox_3)
self.round2_master_skill1.setEnabled(False)
self.round2_master_skill1.setMinimumSize(QtCore.QSize(30, 30))
self.round2_master_skill1.setMaximumSize(QtCore.QSize(30, 30))
self.round2_master_skill1.setText("")
self.round2_master_skill1.setIconSize(QtCore.QSize(30, 30))
self.round2_master_skill1.setObjectName("round2_master_skill1")
self.horizontalLayout_181.addWidget(self.round2_master_skill1)
self.round2_master_skill2 = QtWidgets.QPushButton(self.groupBox_3)
self.round2_master_skill2.setEnabled(False)
self.round2_master_skill2.setMinimumSize(QtCore.QSize(30, 30))
self.round2_master_skill2.setMaximumSize(QtCore.QSize(30, 30))
self.round2_master_skill2.setText("")
self.round2_master_skill2.setIconSize(QtCore.QSize(30, 30))
self.round2_master_skill2.setObjectName("round2_master_skill2")
self.horizontalLayout_181.addWidget(self.round2_master_skill2)
self.round2_master_skill3 = QtWidgets.QPushButton(self.groupBox_3)
self.round2_master_skill3.setEnabled(False)
self.round2_master_skill3.setMinimumSize(QtCore.QSize(30, 30))
self.round2_master_skill3.setMaximumSize(QtCore.QSize(30, 30))
self.round2_master_skill3.setText("")
self.round2_master_skill3.setIconSize(QtCore.QSize(30, 30))
self.round2_master_skill3.setObjectName("round2_master_skill3")
self.horizontalLayout_181.addWidget(self.round2_master_skill3)
self.gridLayout_4.addLayout(self.horizontalLayout_181, 1, 6, 1, 1)
self.round2_servant1_np = QtWidgets.QPushButton(self.groupBox_3)
self.round2_servant1_np.setEnabled(False)
self.round2_servant1_np.setObjectName("round2_servant1_np")
self.gridLayout_4.addWidget(self.round2_servant1_np, 3, 3, 1, 1)
self.horizontalLayout_171 = QtWidgets.QHBoxLayout()
self.horizontalLayout_171.setObjectName("horizontalLayout_171")
self.round2_servant3_pic = QtWidgets.QPushButton(self.groupBox_3)
self.round2_servant3_pic.setEnabled(False)
self.round2_servant3_pic.setMinimumSize(QtCore.QSize(64, 70))
self.round2_servant3_pic.setMaximumSize(QtCore.QSize(64, 70))
self.round2_servant3_pic.setText("")
self.round2_servant3_pic.setIconSize(QtCore.QSize(64, 70))
self.round2_servant3_pic.setObjectName("round2_servant3_pic")
self.horizontalLayout_171.addWidget(self.round2_servant3_pic)
self.gridLayout_4.addLayout(self.horizontalLayout_171, 0, 5, 1, 1)
self.horizontalLayout_161 = QtWidgets.QHBoxLayout()
self.horizontalLayout_161.setObjectName("horizontalLayout_161")
self.round2_servant2_pic = QtWidgets.QPushButton(self.groupBox_3)
self.round2_servant2_pic.setEnabled(False)
self.round2_servant2_pic.setMinimumSize(QtCore.QSize(64, 70))
self.round2_servant2_pic.setMaximumSize(QtCore.QSize(64, 70))
self.round2_servant2_pic.setText("")
self.round2_servant2_pic.setIconSize(QtCore.QSize(64, 70))
self.round2_servant2_pic.setObjectName("round2_servant2_pic")
self.horizontalLayout_161.addWidget(self.round2_servant2_pic)
self.gridLayout_4.addLayout(self.horizontalLayout_161, 0, 4, 1, 1)
self.horizontalLayout_131 = QtWidgets.QHBoxLayout()
self.horizontalLayout_131.setObjectName("horizontalLayout_131")
self.round2_servant2_skill1 = QtWidgets.QPushButton(self.groupBox_3)
self.round2_servant2_skill1.setEnabled(False)
self.round2_servant2_skill1.setMinimumSize(QtCore.QSize(30, 30))
self.round2_servant2_skill1.setMaximumSize(QtCore.QSize(30, 30))
self.round2_servant2_skill1.setText("")
self.round2_servant2_skill1.setIconSize(QtCore.QSize(30, 30))
self.round2_servant2_skill1.setObjectName("round2_servant2_skill1")
self.horizontalLayout_131.addWidget(self.round2_servant2_skill1)
self.round2_servant2_skill2 = QtWidgets.QPushButton(self.groupBox_3)
self.round2_servant2_skill2.setEnabled(False)
self.round2_servant2_skill2.setMinimumSize(QtCore.QSize(30, 30))
self.round2_servant2_skill2.setMaximumSize(QtCore.QSize(30, 30))
self.round2_servant2_skill2.setText("")
self.round2_servant2_skill2.setIconSize(QtCore.QSize(30, 30))
self.round2_servant2_skill2.setObjectName("round2_servant2_skill2")
self.horizontalLayout_131.addWidget(self.round2_servant2_skill2)
self.round2_servant2_skill3 = QtWidgets.QPushButton(self.groupBox_3)
self.round2_servant2_skill3.setEnabled(False)
self.round2_servant2_skill3.setMinimumSize(QtCore.QSize(30, 30))
self.round2_servant2_skill3.setMaximumSize(QtCore.QSize(30, 30))
self.round2_servant2_skill3.setText("")
self.round2_servant2_skill3.setIconSize(QtCore.QSize(30, 30))
self.round2_servant2_skill3.setObjectName("round2_servant2_skill3")
self.horizontalLayout_131.addWidget(self.round2_servant2_skill3)
self.gridLayout_4.addLayout(self.horizontalLayout_131, 1, 4, 1, 1)
self.horizontalLayout_141 = QtWidgets.QHBoxLayout()
self.horizontalLayout_141.setObjectName("horizontalLayout_141")
self.round2_servant3_skill1 = QtWidgets.QPushButton(self.groupBox_3)
self.round2_servant3_skill1.setEnabled(False)
self.round2_servant3_skill1.setMinimumSize(QtCore.QSize(30, 30))
self.round2_servant3_skill1.setMaximumSize(QtCore.QSize(30, 30))
self.round2_servant3_skill1.setText("")
self.round2_servant3_skill1.setIconSize(QtCore.QSize(30, 30))
self.round2_servant3_skill1.setObjectName("round2_servant3_skill1")
self.horizontalLayout_141.addWidget(self.round2_servant3_skill1)
self.round2_servant3_skill2 = QtWidgets.QPushButton(self.groupBox_3)
self.round2_servant3_skill2.setEnabled(False)
self.round2_servant3_skill2.setMinimumSize(QtCore.QSize(30, 30))
self.round2_servant3_skill2.setMaximumSize(QtCore.QSize(30, 30))
self.round2_servant3_skill2.setText("")
self.round2_servant3_skill2.setIconSize(QtCore.QSize(30, 30))
self.round2_servant3_skill2.setObjectName("round2_servant3_skill2")
self.horizontalLayout_141.addWidget(self.round2_servant3_skill2)
self.round2_servant3_skill3 = QtWidgets.QPushButton(self.groupBox_3)
self.round2_servant3_skill3.setEnabled(False)
self.round2_servant3_skill3.setMinimumSize(QtCore.QSize(30, 30))
self.round2_servant3_skill3.setMaximumSize(QtCore.QSize(30, 30))
self.round2_servant3_skill3.setText("")
self.round2_servant3_skill3.setIconSize(QtCore.QSize(30, 30))
self.round2_servant3_skill3.setObjectName("round2_servant3_skill3")
self.horizontalLayout_141.addWidget(self.round2_servant3_skill3)
self.gridLayout_4.addLayout(self.horizontalLayout_141, 1, 5, 1, 1)
self.btn_round2_next = QtWidgets.QPushButton(self.groupBox_3)
self.btn_round2_next.setEnabled(False)
self.btn_round2_next.setMinimumSize(QtCore.QSize(0, 30))
self.btn_round2_next.setMaximumSize(QtCore.QSize(16777215, 16777215))
self.btn_round2_next.setObjectName("btn_round2_next")
self.gridLayout_4.addWidget(self.btn_round2_next, 1, 7, 1, 1)
self.horizontalLayout_191 = QtWidgets.QHBoxLayout()
self.horizontalLayout_191.setObjectName("horizontalLayout_191")
self.round2_master_pic = QtWidgets.QPushButton(self.groupBox_3)
self.round2_master_pic.setEnabled(False)
self.round2_master_pic.setMinimumSize(QtCore.QSize(64, 70))
self.round2_master_pic.setMaximumSize(QtCore.QSize(64, 70))
self.round2_master_pic.setText("")
self.round2_master_pic.setIconSize(QtCore.QSize(64, 64))
self.round2_master_pic.setObjectName("round2_master_pic")
self.horizontalLayout_191.addWidget(self.round2_master_pic)
self.gridLayout_4.addLayout(self.horizontalLayout_191, 0, 6, 1, 1)
self.round2_servant2_np = QtWidgets.QPushButton(self.groupBox_3)
self.round2_servant2_np.setEnabled(False)
self.round2_servant2_np.setObjectName("round2_servant2_np")
self.gridLayout_4.addWidget(self.round2_servant2_np, 3, 4, 1, 1)
self.horizontalLayout_91 = QtWidgets.QHBoxLayout()
self.horizontalLayout_91.setObjectName("horizontalLayout_91")
self.round2_servant1_pic = QtWidgets.QPushButton(self.groupBox_3)
self.round2_servant1_pic.setEnabled(False)
self.round2_servant1_pic.setMinimumSize(QtCore.QSize(64, 70))
self.round2_servant1_pic.setMaximumSize(QtCore.QSize(64, 70))
self.round2_servant1_pic.setText("")
self.round2_servant1_pic.setIconSize(QtCore.QSize(64, 70))
self.round2_servant1_pic.setObjectName("round2_servant1_pic")
self.horizontalLayout_91.addWidget(self.round2_servant1_pic)
self.gridLayout_4.addLayout(self.horizontalLayout_91, 0, 3, 1, 1)
self.verticalLayout_12 = QtWidgets.QVBoxLayout()
self.verticalLayout_12.setObjectName("verticalLayout_12")
self.round2_label_random = QtWidgets.QLabel(self.groupBox_3)
self.round2_label_random.setEnabled(False)
self.round2_label_random.setObjectName("round2_label_random")
self.verticalLayout_12.addWidget(self.round2_label_random)
self.round2_bar_random = QtWidgets.QSlider(self.groupBox_3)
self.round2_bar_random.setEnabled(False)
self.round2_bar_random.setMaximumSize(QtCore.QSize(100, 16777215))
self.round2_bar_random.setMinimum(90)
self.round2_bar_random.setMaximum(110)
self.round2_bar_random.setProperty("value", 90)
self.round2_bar_random.setOrientation(QtCore.Qt.Horizontal)
self.round2_bar_random.setObjectName("round2_bar_random")
self.verticalLayout_12.addWidget(self.round2_bar_random)
self.gridLayout_4.addLayout(self.verticalLayout_12, 0, 7, 1, 1)
self.horizontalLayout_111 = QtWidgets.QHBoxLayout()
self.horizontalLayout_111.setObjectName("horizontalLayout_111")
self.round2_servant1_skill1 = QtWidgets.QPushButton(self.groupBox_3)
self.round2_servant1_skill1.setEnabled(False)
self.round2_servant1_skill1.setMinimumSize(QtCore.QSize(30, 30))
self.round2_servant1_skill1.setMaximumSize(QtCore.QSize(30, 30))
self.round2_servant1_skill1.setText("")
self.round2_servant1_skill1.setIconSize(QtCore.QSize(30, 30))
self.round2_servant1_skill1.setObjectName("round2_servant1_skill1")
self.horizontalLayout_111.addWidget(self.round2_servant1_skill1)
self.round2_servant1_skill2 = QtWidgets.QPushButton(self.groupBox_3)
self.round2_servant1_skill2.setEnabled(False)
self.round2_servant1_skill2.setMinimumSize(QtCore.QSize(30, 30))
self.round2_servant1_skill2.setMaximumSize(QtCore.QSize(30, 30))
self.round2_servant1_skill2.setText("")
self.round2_servant1_skill2.setIconSize(QtCore.QSize(30, 30))
self.round2_servant1_skill2.setObjectName("round2_servant1_skill2")
self.horizontalLayout_111.addWidget(self.round2_servant1_skill2)
self.round2_servant1_skill3 = QtWidgets.QPushButton(self.groupBox_3)
self.round2_servant1_skill3.setEnabled(False)
self.round2_servant1_skill3.setMinimumSize(QtCore.QSize(30, 30))
self.round2_servant1_skill3.setMaximumSize(QtCore.QSize(30, 30))
self.round2_servant1_skill3.setText("")
self.round2_servant1_skill3.setIconSize(QtCore.QSize(30, 30))
self.round2_servant1_skill3.setObjectName("round2_servant1_skill3")
self.horizontalLayout_111.addWidget(self.round2_servant1_skill3)
self.gridLayout_4.addLayout(self.horizontalLayout_111, 1, 3, 1, 1)
self.verticalLayout_7.addLayout(self.gridLayout_4)
self.gridLayout_5 = QtWidgets.QGridLayout()
self.gridLayout_5.setObjectName("gridLayout_5")
self.round3_servant3_np = QtWidgets.QPushButton(self.groupBox_3)
self.round3_servant3_np.setEnabled(False)
self.round3_servant3_np.setObjectName("round3_servant3_np")
self.gridLayout_5.addWidget(self.round3_servant3_np, 3, 6, 1, 1)
self.horizontalLayout_192 = QtWidgets.QHBoxLayout()
self.horizontalLayout_192.setObjectName("horizontalLayout_192")
self.round3_master_pic = QtWidgets.QPushButton(self.groupBox_3)
self.round3_master_pic.setEnabled(False)
self.round3_master_pic.setMinimumSize(QtCore.QSize(64, 70))
self.round3_master_pic.setMaximumSize(QtCore.QSize(64, 70))
self.round3_master_pic.setText("")
self.round3_master_pic.setIconSize(QtCore.QSize(64, 64))
self.round3_master_pic.setObjectName("round3_master_pic")
self.horizontalLayout_192.addWidget(self.round3_master_pic)
self.gridLayout_5.addLayout(self.horizontalLayout_192, 0, 7, 1, 1)
self.horizontalLayout_92 = QtWidgets.QHBoxLayout()
self.horizontalLayout_92.setObjectName("horizontalLayout_92")
self.round3_servant1_pic = QtWidgets.QPushButton(self.groupBox_3)
self.round3_servant1_pic.setEnabled(False)
self.round3_servant1_pic.setMinimumSize(QtCore.QSize(64, 70))
self.round3_servant1_pic.setMaximumSize(QtCore.QSize(64, 70))
self.round3_servant1_pic.setText("")
self.round3_servant1_pic.setIconSize(QtCore.QSize(64, 70))
self.round3_servant1_pic.setObjectName("round3_servant1_pic")
self.horizontalLayout_92.addWidget(self.round3_servant1_pic)
self.gridLayout_5.addLayout(self.horizontalLayout_92, 0, 3, 1, 1)
self.round3_servant1_np = QtWidgets.QPushButton(self.groupBox_3)
self.round3_servant1_np.setEnabled(False)
self.round3_servant1_np.setObjectName("round3_servant1_np")
self.gridLayout_5.addWidget(self.round3_servant1_np, 3, 3, 1, 1)
self.round3_servant2_np = QtWidgets.QPushButton(self.groupBox_3)
self.round3_servant2_np.setEnabled(False)
self.round3_servant2_np.setObjectName("round3_servant2_np")
self.gridLayout_5.addWidget(self.round3_servant2_np, 3, 5, 1, 1)
self.horizontalLayout_172 = QtWidgets.QHBoxLayout()
self.horizontalLayout_172.setObjectName("horizontalLayout_172")
self.round3_servant3_pic = QtWidgets.QPushButton(self.groupBox_3)
self.round3_servant3_pic.setEnabled(False)
self.round3_servant3_pic.setMinimumSize(QtCore.QSize(64, 70))
self.round3_servant3_pic.setMaximumSize(QtCore.QSize(64, 70))
self.round3_servant3_pic.setText("")
self.round3_servant3_pic.setIconSize(QtCore.QSize(64, 70))
self.round3_servant3_pic.setObjectName("round3_servant3_pic")
self.horizontalLayout_172.addWidget(self.round3_servant3_pic)
self.gridLayout_5.addLayout(self.horizontalLayout_172, 0, 6, 1, 1)
self.horizontalLayout_162 = QtWidgets.QHBoxLayout()
self.horizontalLayout_162.setObjectName("horizontalLayout_162")
self.round3_servant2_pic = QtWidgets.QPushButton(self.groupBox_3)
self.round3_servant2_pic.setEnabled(False)
self.round3_servant2_pic.setMinimumSize(QtCore.QSize(64, 70))
self.round3_servant2_pic.setMaximumSize(QtCore.QSize(64, 70))
self.round3_servant2_pic.setText("")
self.round3_servant2_pic.setIconSize(QtCore.QSize(64, 70))
self.round3_servant2_pic.setObjectName("round3_servant2_pic")
self.horizontalLayout_162.addWidget(self.round3_servant2_pic)
self.gridLayout_5.addLayout(self.horizontalLayout_162, 0, 5, 1, 1)
self.btn_output_strategy = QtWidgets.QPushButton(self.groupBox_3)
self.btn_output_strategy.setEnabled(False)
self.btn_output_strategy.setObjectName("btn_output_strategy")
self.gridLayout_5.addWidget(self.btn_output_strategy, 2, 9, 1, 1)
self.horizontalLayout_132 = QtWidgets.QHBoxLayout()
self.horizontalLayout_132.setObjectName("horizontalLayout_132")
self.round3_servant2_skill1 = QtWidgets.QPushButton(self.groupBox_3)
self.round3_servant2_skill1.setEnabled(False)
self.round3_servant2_skill1.setMinimumSize(QtCore.QSize(30, 30))
self.round3_servant2_skill1.setMaximumSize(QtCore.QSize(30, 30))
self.round3_servant2_skill1.setText("")
self.round3_servant2_skill1.setIconSize(QtCore.QSize(30, 30))
self.round3_servant2_skill1.setObjectName("round3_servant2_skill1")
self.horizontalLayout_132.addWidget(self.round3_servant2_skill1)
self.round3_servant2_skill2 = QtWidgets.QPushButton(self.groupBox_3)
self.round3_servant2_skill2.setEnabled(False)
self.round3_servant2_skill2.setMinimumSize(QtCore.QSize(30, 30))
self.round3_servant2_skill2.setMaximumSize(QtCore.QSize(30, 30))
self.round3_servant2_skill2.setText("")
self.round3_servant2_skill2.setIconSize(QtCore.QSize(30, 30))
self.round3_servant2_skill2.setObjectName("round3_servant2_skill2")
self.horizontalLayout_132.addWidget(self.round3_servant2_skill2)
self.round3_servant2_skill3 = QtWidgets.QPushButton(self.groupBox_3)
self.round3_servant2_skill3.setEnabled(False)
self.round3_servant2_skill3.setMinimumSize(QtCore.QSize(30, 30))
self.round3_servant2_skill3.setMaximumSize(QtCore.QSize(30, 30))
self.round3_servant2_skill3.setText("")
self.round3_servant2_skill3.setIconSize(QtCore.QSize(30, 30))
self.round3_servant2_skill3.setObjectName("round3_servant2_skill3")
self.horizontalLayout_132.addWidget(self.round3_servant2_skill3)
self.gridLayout_5.addLayout(self.horizontalLayout_132, 2, 5, 1, 1)
self.verticalLayout_13 = QtWidgets.QVBoxLayout()
self.verticalLayout_13.setObjectName("verticalLayout_13")
self.round3_label_random = QtWidgets.QLabel(self.groupBox_3)
self.round3_label_random.setEnabled(False)
self.round3_label_random.setObjectName("round3_label_random")
self.verticalLayout_13.addWidget(self.round3_label_random)
self.round3_bar_random = QtWidgets.QSlider(self.groupBox_3)
self.round3_bar_random.setEnabled(False)
self.round3_bar_random.setMaximumSize(QtCore.QSize(100, 16777215))
self.round3_bar_random.setMinimum(90)
self.round3_bar_random.setMaximum(110)
self.round3_bar_random.setProperty("value", 90)
self.round3_bar_random.setOrientation(QtCore.Qt.Horizontal)
self.round3_bar_random.setObjectName("round3_bar_random")
self.verticalLayout_13.addWidget(self.round3_bar_random)
self.gridLayout_5.addLayout(self.verticalLayout_13, 0, 9, 1, 1)
self.horizontalLayout_182 = QtWidgets.QHBoxLayout()
self.horizontalLayout_182.setObjectName("horizontalLayout_182")
self.round3_master_skill1 = QtWidgets.QPushButton(self.groupBox_3)
self.round3_master_skill1.setEnabled(False)
self.round3_master_skill1.setMinimumSize(QtCore.QSize(30, 30))
self.round3_master_skill1.setMaximumSize(QtCore.QSize(30, 30))
self.round3_master_skill1.setText("")
self.round3_master_skill1.setIconSize(QtCore.QSize(30, 30))
self.round3_master_skill1.setObjectName("round3_master_skill1")
self.horizontalLayout_182.addWidget(self.round3_master_skill1)
self.round3_master_skill2 = QtWidgets.QPushButton(self.groupBox_3)
self.round3_master_skill2.setEnabled(False)
self.round3_master_skill2.setMinimumSize(QtCore.QSize(30, 30))
self.round3_master_skill2.setMaximumSize(QtCore.QSize(30, 30))
self.round3_master_skill2.setText("")
self.round3_master_skill2.setIconSize(QtCore.QSize(30, 30))
self.round3_master_skill2.setObjectName("round3_master_skill2")
self.horizontalLayout_182.addWidget(self.round3_master_skill2)
self.round3_master_skill3 = QtWidgets.QPushButton(self.groupBox_3)
self.round3_master_skill3.setEnabled(False)
self.round3_master_skill3.setMinimumSize(QtCore.QSize(30, 30))
self.round3_master_skill3.setMaximumSize(QtCore.QSize(30, 30))
self.round3_master_skill3.setText("")
self.round3_master_skill3.setIconSize(QtCore.QSize(30, 30))
self.round3_master_skill3.setObjectName("round3_master_skill3")
self.horizontalLayout_182.addWidget(self.round3_master_skill3)
self.gridLayout_5.addLayout(self.horizontalLayout_182, 2, 7, 1, 1)
self.horizontalLayout_142 = QtWidgets.QHBoxLayout()
self.horizontalLayout_142.setObjectName("horizontalLayout_142")
self.round3_servant3_skill1 = QtWidgets.QPushButton(self.groupBox_3)
self.round3_servant3_skill1.setEnabled(False)
self.round3_servant3_skill1.setMinimumSize(QtCore.QSize(30, 30))
self.round3_servant3_skill1.setMaximumSize(QtCore.QSize(30, 30))
self.round3_servant3_skill1.setText("")
self.round3_servant3_skill1.setIconSize(QtCore.QSize(30, 30))
self.round3_servant3_skill1.setObjectName("round3_servant3_skill1")
self.horizontalLayout_142.addWidget(self.round3_servant3_skill1)
self.round3_servant3_skill2 = QtWidgets.QPushButton(self.groupBox_3)
self.round3_servant3_skill2.setEnabled(False)
self.round3_servant3_skill2.setMinimumSize(QtCore.QSize(30, 30))
self.round3_servant3_skill2.setMaximumSize(QtCore.QSize(30, 30))
self.round3_servant3_skill2.setText("")
self.round3_servant3_skill2.setIconSize(QtCore.QSize(30, 30))
self.round3_servant3_skill2.setObjectName("round3_servant3_skill2")
self.horizontalLayout_142.addWidget(self.round3_servant3_skill2)
self.round3_servant3_skill3 = QtWidgets.QPushButton(self.groupBox_3)
self.round3_servant3_skill3.setEnabled(False)
self.round3_servant3_skill3.setMinimumSize(QtCore.QSize(30, 30))
self.round3_servant3_skill3.setMaximumSize(QtCore.QSize(30, 30))
self.round3_servant3_skill3.setText("")
self.round3_servant3_skill3.setIconSize(QtCore.QSize(30, 30))
self.round3_servant3_skill3.setObjectName("round3_servant3_skill3")
self.horizontalLayout_142.addWidget(self.round3_servant3_skill3)
self.gridLayout_5.addLayout(self.horizontalLayout_142, 2, 6, 1, 1)
self.horizontalLayout_112 = QtWidgets.QHBoxLayout()
self.horizontalLayout_112.setObjectName("horizontalLayout_112")
self.round3_servant1_skill1 = QtWidgets.QPushButton(self.groupBox_3)
self.round3_servant1_skill1.setEnabled(False)
self.round3_servant1_skill1.setMinimumSize(QtCore.QSize(30, 30))
self.round3_servant1_skill1.setMaximumSize(QtCore.QSize(30, 30))
self.round3_servant1_skill1.setText("")
self.round3_servant1_skill1.setIconSize(QtCore.QSize(30, 30))
self.round3_servant1_skill1.setObjectName("round3_servant1_skill1")
self.horizontalLayout_112.addWidget(self.round3_servant1_skill1)
self.round3_servant1_skill2 = QtWidgets.QPushButton(self.groupBox_3)
self.round3_servant1_skill2.setEnabled(False)
self.round3_servant1_skill2.setMinimumSize(QtCore.QSize(30, 30))
self.round3_servant1_skill2.setMaximumSize(QtCore.QSize(30, 30))
self.round3_servant1_skill2.setText("")
self.round3_servant1_skill2.setIconSize(QtCore.QSize(30, 30))
self.round3_servant1_skill2.setObjectName("round3_servant1_skill2")
self.horizontalLayout_112.addWidget(self.round3_servant1_skill2)
self.round3_servant1_skill3 = QtWidgets.QPushButton(self.groupBox_3)
self.round3_servant1_skill3.setEnabled(False)
self.round3_servant1_skill3.setMinimumSize(QtCore.QSize(30, 30))
self.round3_servant1_skill3.setMaximumSize(QtCore.QSize(30, 30))
self.round3_servant1_skill3.setText("")
self.round3_servant1_skill3.setIconSize(QtCore.QSize(30, 30))
self.round3_servant1_skill3.setObjectName("round3_servant1_skill3")
self.horizontalLayout_112.addWidget(self.round3_servant1_skill3)
self.gridLayout_5.addLayout(self.horizontalLayout_112, 2, 3, 1, 1)
self.verticalLayout_7.addLayout(self.gridLayout_5)
self.horizontalLayout_15.addWidget(self.groupBox_3)
self.verticalLayout_2.addLayout(self.horizontalLayout_15)
MainWindow.setCentralWidget(self.centralwidget)
self.menubar = QtWidgets.QMenuBar(MainWindow)
self.menubar.setGeometry(QtCore.QRect(0, 0, 1070, 26))
self.menubar.setObjectName("menubar")
self.menu = QtWidgets.QMenu(self.menubar)
self.menu.setObjectName("menu")
MainWindow.setMenuBar(self.menubar)
self.statusbar = QtWidgets.QStatusBar(MainWindow)
self.statusbar.setObjectName("statusbar")
MainWindow.setStatusBar(self.statusbar)
self.action_update = QtWidgets.QAction(MainWindow)
self.action_update.setObjectName("action_update")
self.action_mooncell = QtWidgets.QAction(MainWindow)
self.action_mooncell.setObjectName("action_mooncell")
self.action_support = QtWidgets.QAction(MainWindow)
self.action_support.setObjectName("action_support")
self.action_kazemai = QtWidgets.QAction(MainWindow)
self.action_kazemai.setObjectName("action_kazemai")
self.action_about = QtWidgets.QAction(MainWindow)
self.action_about.setObjectName("action_about")
self.menu.addAction(self.action_update)
self.menu.addAction(self.action_support)
self.menu.addAction(self.action_about)
self.menu.addSeparator()
self.menu.addAction(self.action_mooncell)
self.menu.addAction(self.action_kazemai)
self.menubar.addAction(self.menu.menuAction())
self.retranslateUi(MainWindow)
QtCore.QMetaObject.connectSlotsByName(MainWindow)
def retranslateUi(self, MainWindow):
_translate = QtCore.QCoreApplication.translate
MainWindow.setWindowTitle(_translate("MainWindow", "FGO周回组队器"))
self.label_costume_state_4.setText(_translate("MainWindow", "等级: "))
self.label_servant_state_2.setText(_translate("MainWindow", "技能: \n"
"宝具: \n"
"等级: \n"
"芙芙:"))
self.label_costume_state_1.setText(_translate("MainWindow", "等级: "))
self.box_skill_confirm.setText(_translate("MainWindow", "技能提示"))
self.label.setText(_translate("MainWindow", "概率阈值:"))
self.label_costume_state_5.setText(_translate("MainWindow", "等级: "))
self.label_costume_state_6.setText(_translate("MainWindow", "等级: "))
self.label_servant_state_1.setText(_translate("MainWindow", "技能: \n"
"宝具: \n"
"等级: \n"
"芙芙:"))
self.label_master_state.setText(_translate("MainWindow", "等级:"))
self.label_costume_state_2.setText(_translate("MainWindow", "等级: "))
self.label_servant_state_3.setText(_translate("MainWindow", "技能: \n"
"宝具: \n"
"等级: \n"
"芙芙:"))
self.label_servant_state_5.setText(_translate("MainWindow", "技能: \n"
"宝具: \n"
"等级: \n"
"芙芙:"))
self.label_costume_state_3.setText(_translate("MainWindow", "等级: "))
self.label_servant_state_4.setText(_translate("MainWindow", "技能: \n"
"宝具: \n"
"等级: \n"
"芙芙:"))
self.label_servant_state_6.setText(_translate("MainWindow", "技能: \n"
"宝具: \n"
"等级: \n"
"芙芙:"))
self.btn_set_progress.setText(_translate("MainWindow", "选择进度"))
self.btn_choose_level.setText(_translate("MainWindow", "设置副本"))
self.btn_confirm_team.setText(_translate("MainWindow", "确 认"))
self.btn_change_team.setText(_translate("MainWindow", "修 改"))
self.btn_round_reset.setText(_translate("MainWindow", "撤 销"))
self.round1_label_random.setText(_translate("MainWindow", "随机数: 0.9"))
self.round1_servant2_np.setText(_translate("MainWindow", "宝具: 0%"))
self.round1_servant3_np.setText(_translate("MainWindow", "宝具: 0%"))
self.btn_round1_next.setText(_translate("MainWindow", "下一回合"))
self.round1_servant1_np.setText(_translate("MainWindow", "宝具: 0%"))
self.round2_servant3_np.setText(_translate("MainWindow", "宝具: 0%"))
self.round2_servant1_np.setText(_translate("MainWindow", "宝具: 0%"))
self.btn_round2_next.setText(_translate("MainWindow", "下一回合"))
self.round2_servant2_np.setText(_translate("MainWindow", "宝具: 0%"))
self.round2_label_random.setText(_translate("MainWindow", "随机数: 0.9"))
self.round3_servant3_np.setText(_translate("MainWindow", "宝具: 0%"))
self.round3_servant1_np.setText(_translate("MainWindow", "宝具: 0%"))
self.round3_servant2_np.setText(_translate("MainWindow", "宝具: 0%"))
self.btn_output_strategy.setText(_translate("MainWindow", "输出操作"))
self.round3_label_random.setText(_translate("MainWindow", "随机数: 0.9"))
self.menu.setTitle(_translate("MainWindow", "选 项"))
self.action_update.setText(_translate("MainWindow", "数据库更新"))
self.action_mooncell.setText(_translate("MainWindow", "Mooncell"))
self.action_support.setText(_translate("MainWindow", "软件更新"))
self.action_kazemai.setText(_translate("MainWindow", "茹西教王的理想乡"))
self.action_about.setText(_translate("MainWindow", "关于软件"))
# models/FlagAttachment.py (RootTheBox project, Apache-2.0 license)
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on Nov 24, 2014
@author: moloch
Copyright 2014 Root the Box
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
import os
from uuid import uuid4
from sqlalchemy import Column, ForeignKey
from sqlalchemy.types import Unicode, String, Integer
from models.BaseModels import DatabaseObject
from libs.StringCoding import encode, decode
from builtins import str
from tornado.options import options
class FlagAttachment(DatabaseObject):
"""
These are files that the administrator wants to
distribute alongside a flag.
"""
uuid = Column(String(36), unique=True, nullable=False, default=lambda: str(uuid4()))
flag_id = Column(Integer, ForeignKey("flag.id"), nullable=False)
_file_name = Column(Unicode(64), nullable=False)
@property
def file_name(self):
return self._file_name
@file_name.setter
def file_name(self, value):
fname = value.replace("\n", "").replace("\r", "")
self._file_name = str(os.path.basename(fname))[:64]
@property
def data(self):
with open(options.flag_attachment_dir + "/" + self.uuid, "rb") as fp:
return decode(fp.read(), "base64")
@data.setter
def data(self, value):
if self.uuid is None:
self.uuid = str(uuid4())
self.byte_size = len(value)
with open(options.flag_attachment_dir + "/" + self.uuid, "wb") as fp:
fp.write(str(encode(value, "base64")).encode())
def delete_data(self):
""" Remove the file from the file system, if it exists """
fpath = options.flag_attachment_dir + "/" + self.uuid
if os.path.exists(fpath) and os.path.isfile(fpath):
os.unlink(fpath)
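The `data` property pair above base64-encodes bytes on write and decodes on read, keyed by a UUID filename on disk. A standalone sketch of the same round trip, using the stdlib `base64` module and a temp directory in place of `libs.StringCoding` and `options.flag_attachment_dir` (the helper names here are illustrative, not the model's API):

```python
import base64
import os
import tempfile
from uuid import uuid4

# Stand-ins for options.flag_attachment_dir and the model's uuid column
attachment_dir = tempfile.mkdtemp()
file_uuid = str(uuid4())

def write_data(value):
    """Mirror of the data setter: base64-encode bytes before writing."""
    with open(os.path.join(attachment_dir, file_uuid), "wb") as fp:
        fp.write(base64.b64encode(value))

def read_data():
    """Mirror of the data getter: decode the stored base64 back to bytes."""
    with open(os.path.join(attachment_dir, file_uuid), "rb") as fp:
        return base64.b64decode(fp.read())

write_data(b"attachment payload")
print(read_data())  # b'attachment payload'
```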
# File: Cryptography/Exp-1-Shamirs-Secret-Sharing/main.py (repo: LuminolT/Cryptographic, license: MIT)
import numpy as np
import matplotlib.pyplot as plt
from shamir import *
from binascii import hexlify
# img = plt.imread('cat.png')
# plt.imshow(img)
# plt.show()
s = 'TEST_STRING'.encode()
print("Original secret:", hexlify(s))
l = Shamir.split(3, 5, '12345'.encode())
for idx, item in l:
print("Share {}: {}".format(str(idx), hexlify(item)))
shares = l[1:4]
secret = Shamir.combine(shares)
print(f'Secret is : {secret.decode()}')

# File: towhee/engine/pipeline.py (repo: jeffoverflow/towhee, license: Apache-2.0)
# Copyright 2021 Zilliz. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from typing import Callable

from towhee.engine.graph_context import GraphContext
from towhee.dag.graph_repr import GraphRepr
from towhee.dataframe.dataframe import DFIterator
class Pipeline:
"""
The runtime pipeline context
"""
    def __init__(self, engine: "Engine", graph_repr: GraphRepr, parallelism: int = 1) -> None:
"""
Args:
engine: the local engine to drive the Pipeline
graph_repr: the graph representation
parallelism: how many rows of inputs to be processed concurrently
"""
self._engine = engine
self._graph_repr = graph_repr
self._parallelism = parallelism
def build(self):
"""
Create GraphContexts and set up input iterators.
"""
raise NotImplementedError
def run(self, inputs: list) -> DFIterator:
"""
The Pipeline's main loop
        Args:
inputs: the input data, organized as a list of DataFrame, feeding
to the Pipeline.
"""
# while we still have pipeline inputs:
# input = inputs.next()
# for g in graph contexts:
# if g.is_idle:
# g.start_op.inputs = input
# break
# if all graphs contexts are busy:
# wait for notification from _notify_run_loop
raise NotImplementedError
    def on_start(self, handler: "Callable"):
"""
Set a custom handler that called before the execution of the graph.
"""
self._on_start_handler = handler
raise NotImplementedError
    def on_finish(self, handler: "Callable"):
"""
Set a custom handler that called after the execution of the graph.
"""
self._on_finish_handler = handler
raise NotImplementedError
def _organize_outputs(self, graph_ctx: GraphContext):
"""
on_finish handler passing to GraphContext. The handler will organize the
GraphContext's output into Pipeline's outputs.
"""
raise NotImplementedError
def _notify_run_loop(self, graph_ctx: GraphContext):
"""
on_finish handler passing to GraphContext. The handler will notify the run loop
that a GraphContext is in idle state.
"""
raise NotImplementedError
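The commented-out pseudocode inside `run` describes an idle-dispatch loop. A toy, synchronous sketch of that loop, with a hypothetical stand-in for `GraphContext` (this is not towhee's real API, just the dispatch idea):

```python
class ToyGraphContext:
    """Hypothetical stand-in for GraphContext: doubles each input it is fed."""
    def __init__(self):
        self.is_idle = True
        self.outputs = []

    def feed(self, item):
        self.is_idle = False           # busy while "processing"
        self.outputs.append(item * 2)  # pretend work
        self.is_idle = True            # synchronous toy, so idle again at once

def toy_run(inputs, contexts):
    # while we still have inputs: hand each one to the first idle context
    for item in inputs:
        for g in contexts:
            if g.is_idle:
                g.feed(item)
                break
    return [out for g in contexts for out in g.outputs]

print(toy_run([1, 2, 3], [ToyGraphContext(), ToyGraphContext()]))  # [2, 4, 6]
```

In the real, asynchronous version a context stays busy for a while, so the "all contexts busy" branch in the pseudocode has to block until `_notify_run_loop` fires.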
# File: kasaya/core/backend/redisstore.py (repo: AYAtechnologies/Kasaya-esb, license: BSD-2-Clause)
__author__ = 'wektor'
from generic import GenericBackend
import redis
class RedisBackend(GenericBackend):
def __init__(self):
pool = redis.ConnectionPool(host='localhost', port=6379, db=0)
self.store = redis.Redis(connection_pool=pool)
def get_typecode(self, value):
typecode = str(type(value)).split("'")[1]
return typecode
def set(self, key, value):
data = {}
data["type"] = self.get_typecode(value)
data["data"] = value
self.store.hmset(key, data)
# def update(self, key, value):
def get(self, key):
data = self.store.hgetall(key)
        print(data)
try:
if data["type"] != "str":
return eval(data["data"])
else:
return data["data"]
except KeyError:
return {}
def delete(self, key):
        self.store.delete(key)
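The type-tagged storage scheme above (stringify the value, record its typecode, `eval` non-strings back on read) can be exercised without a Redis server by backing it with a plain dict. A sketch with a hypothetical `DictBackend` (same logic, no Redis dependency):

```python
class DictBackend:
    """Same typecode round-trip as RedisBackend, backed by a plain dict."""
    def __init__(self):
        self.store = {}

    def get_typecode(self, value):
        # str(type([1])) == "<class 'list'>" -> "list"
        return str(type(value)).split("'")[1]

    def set(self, key, value):
        # Redis hashes hold strings, so stringify the value like hmset would
        self.store[key] = {"type": self.get_typecode(value), "data": str(value)}

    def get(self, key):
        try:
            data = self.store[key]
            if data["type"] != "str":
                return eval(data["data"])  # restore non-string types, as above
            return data["data"]
        except KeyError:
            return {}

backend = DictBackend()
backend.set("numbers", [1, 2, 3])
print(backend.get("numbers"))  # [1, 2, 3]
```

As in the original, `eval` on stored data is only safe when the store's contents are trusted.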
# File: tests/test_kobo.py (repo: Donearm/kobuddy, license: MIT)
from datetime import datetime
from pathlib import Path
import pytz
import kobuddy
def get_test_db():
# db = Path(__file__).absolute().parent.parent / 'KoboShelfes' / 'KoboReader.sqlite.0'
db = Path(__file__).absolute().parent / 'data' / 'kobo_notes' / 'input' / 'KoboReader.sqlite'
return db
# a bit meh, but ok for now
kobuddy.set_databases(get_test_db())
from kobuddy import _iter_events_aux, get_events, get_books_with_highlights, _iter_highlights
def test_events():
for e in _iter_events_aux():
print(e)
def test_hls():
for h in _iter_highlights():
print(h)
def test_get_all():
events = get_events()
assert len(events) > 50
for d in events:
print(d)
def test_books_with_highlights():
pages = get_books_with_highlights()
g = pages[0]
assert 'Essentialism' in g.book
hls = g.highlights
assert len(hls) == 273
[b] = [h for h in hls if h.eid == '520b7b13-dbef-4402-9a81-0f4e0c4978de']
# TODO wonder if there might be any useful info? StartContainerPath, EndContainerPath
assert b.kind == 'bookmark'
# TODO move to a more specific test?
# TODO assert sorted by date or smth?
assert hls[0].kind == 'highlight'
# TODO assert highlights got no annotation? not sure if it's even necessary to distinguish..
[ann] = [h for h in hls if h.annotation is not None and len(h.annotation) > 0]
assert ann.eid == 'eb264817-9a06-42fd-92ff-7bd38cd9ca79'
assert ann.kind == 'annotation'
assert ann.text == 'He does this by finding which machine has the biggest queue of materials waiting behind it and finds a way to increase its efficiency.'
assert ann.annotation == 'Bottleneck'
assert ann.dt == datetime(year=2017, month=8, day=12, hour=3, minute=49, second=13, microsecond=0, tzinfo=pytz.utc)
assert ann.book.author == 'Greg McKeown'
assert len(pages) == 7
def test_history():
kobuddy.print_progress()
def test_annotations():
kobuddy.print_annotations()
def test_books():
kobuddy.print_books()
# File: tanks/views.py (repo: BArdelean/djangostuff)
from django.shortcuts import render
from .models import Tank
from django.db import models
from django.http import HttpResponse
from django.views import View
# Create your views here.
# The view for the created model Tank
def tank_view(request):
queryset = Tank.objects.all()
context = {
'object': queryset
}
return render(request, "tankbattle.html", context)
def tank_1(request, pk):
queryset = Tank.objects.get(pk=1)
context = {
'object': queryset
}
return render(request, 'tankbattle.html', context)
def tank_2(request, pk):
queryset = Tank.objects.get(pk=2)
context = {
'object': queryset
}
return render(request, 'tankbattle.html', context)
def tank_3(request, pk):
queryset = Tank.objects.get(pk=3)
context = {
'object': queryset
}
return render(request, 'tankbattle.html', context)
def tank_4(request, pk):
queryset = Tank.objects.get(pk=4)
context = {
'object': queryset
}
return render(request, 'tankbattle.html', context)
# File: 403-Frog-Jump/solution.py (repo: Tanych/CodeTracking, license: MIT)
class Solution(object):
def dfs(self,stones,graph,curpos,lastjump):
if curpos==stones[-1]:
return True
        # the next jump can only be lastjump-1, lastjump, or lastjump+1,
        # and only forward moves count, so skip staying at the same position
rstart=max(curpos+lastjump-1,curpos+1)
rend=min(curpos+lastjump+1,stones[-1])+1
        for nextpos in range(rstart, rend):
if nextpos in graph and self.dfs(stones,graph,nextpos,nextpos-curpos):
return True
return False
def canCross(self, stones):
"""
:type stones: List[int]
:rtype: bool
"""
if not stones:
return True
if stones[1]!=1:
return False
graph={val:idx for idx,val in enumerate(stones)}
return self.dfs(stones,graph,1,1)
| 32.96 | 82 | 0.566748 | 108 | 824 | 4.324074 | 0.453704 | 0.070664 | 0.06424 | 0.077088 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018182 | 0.332524 | 824 | 25 | 83 | 32.96 | 0.830909 | 0.150485 | 0 | 0.294118 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.117647 | false | 0 | 0 | 0 | 0.529412 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
# File: runOtakuBot.py (repo: Eagleheardt/otakuBot, license: Unlicense)
import sqlite3
from sqlite3 import Error
import os
import time
import datetime
import re
import random
import schedule
import cryptography
from apscheduler.schedulers.background import BackgroundScheduler
from slackclient import SlackClient
from cryptography.fernet import Fernet
conn = sqlite3.connect('/home/ubuntu/otakuBot/data/anime.db')
serverCursor = conn.cursor()
keyFile = open('/home/ubuntu/otakuBot/data/otakubot_token.key', 'rb')
key = keyFile.read()
keyFile.close()
f = Fernet(key)
encryptedTokenFile = open('/home/ubuntu/otakuBot/data/otakubot_token.encrypted', 'rb')
encryptedToken = encryptedTokenFile.read()
decryptedToken = f.decrypt(encryptedToken)
SLACK_BOT_TOKEN = decryptedToken.decode()
# instantiate Slack client
slack_client = SlackClient(SLACK_BOT_TOKEN)
# starterbot's user ID in Slack: value is assigned after the bot starts up
otakuBotID = None
# constants
RTM_READ_DELAY = 0.5 # 0.5 second delay in reading events
def stdOut(s):
curDate = datetime.datetime.today().strftime('%Y-%m-%d')
curTime = datetime.datetime.now().strftime('%H:%M:%S')
logFile = open((("/home/ubuntu/logs/{0}.log").format(curDate)),"a")
logFile.write(("{0}: {1}\n").format(curTime,s))
logFile.close()
return
def logIt():
	# Same log file and line format as stdOut, so just reuse it
	stdOut("Otaku 15 minute check in!")
	return
schedule.every(15).minutes.do(logIt)
def SQLReturn(aConn,sqlCmd):
reportCur = aConn.cursor()
reportCur.execute(sqlCmd)
SQLResults = reportCur.fetchall()
reportCur.close()
return SQLResults
def insertQuote (aUser,theQuote):
	# Parameterized queries avoid breaking on embedded quotes (and SQL injection)
	newCur = conn.cursor()
	newCur.execute("INSERT INTO Quotes (User, Words) VALUES (?, ?);", (aUser, theQuote))
	newCur.close()
	conn.commit()
	return

def insertAniMusic (aUser,theLink):
	newCur = conn.cursor()
	newCur.execute("INSERT INTO Music (Category, User, Link) VALUES ('Anime', ?, ?);", (aUser, theLink))
	newCur.close()
	conn.commit()
	return

def insertEngMusic (aUser,theLink):
	newCur = conn.cursor()
	newCur.execute("INSERT INTO Music (Category, User, Link) VALUES ('English', ?, ?);", (aUser, theLink))
	newCur.close()
	conn.commit()
	return

def insertIcon (aUser,theLink):
	newCur = conn.cursor()
	newCur.execute("INSERT INTO Music (Category, User, Link) VALUES ('Iconic', ?, ?);", (aUser, theLink))
	newCur.close()
	conn.commit()
	return

def deleteQuote (quoteID):
	newCur = conn.cursor()
	newCur.execute("DELETE FROM Quotes WHERE ID = ?;", (quoteID,))
	newCur.close()
	conn.commit()
	return
def getQuote(aConn):
	sqlCmd = "SELECT Words FROM Quotes;"
	results = SQLReturn(aConn,sqlCmd)
	# fetchall() returns 1-tuples; unpack so Slack gets a plain string
	allQuotes = [row[0] for row in results]
	return (random.choice(allQuotes))

def getAniMusic(aConn):
	sqlCmd = "SELECT Link FROM Music WHERE Category = 'Anime';"
	results = SQLReturn(aConn,sqlCmd)
	allQuotes = [row[0] for row in results]
	return (random.choice(allQuotes))

def getEngMusic(aConn):
	sqlCmd = "SELECT Link FROM Music WHERE Category = 'English';"
	results = SQLReturn(aConn,sqlCmd)
	allQuotes = [row[0] for row in results]
	return (random.choice(allQuotes))

def getIconic(aConn):
	sqlCmd = "SELECT Link FROM Music WHERE Category = 'Iconic';"
	results = SQLReturn(aConn,sqlCmd)
	allQuotes = [row[0] for row in results]
	return (random.choice(allQuotes))
def getAllQuotes(aConn):
sqlCmd = "SELECT ID, Words FROM Quotes;"
results = SQLReturn(aConn,sqlCmd)
allQuotes = []
for quote in results:
allQuotes.append(quote)
newStr = "All the Quotes\n"
for item in allQuotes:
i = 1
for place in item:
if i == 1:
newStr += "ID: " + str(place) + "\n"
if i == 2:
newStr += "Words: " + str(place) + "\n\n"
i += 1
return newStr
def EODReportRange (date1, date2): # Gets a range summary of the VM number and status reported
cmd = (("""
SELECT
ServerNumber as [Server]
, ServerStatus as [Status]
, count(ServerStatus) as [Amount]
FROM
Status
WHERE
date(TimeStamp) BETWEEN '{0}' AND '{1}'
AND ServerNumber IN('1','2','3','4','17')
GROUP BY
ServerNumber
,ServerStatus
""").format(date1, date2))
results = SQLReturn(conn,cmd)
newStr = "Report for: " + date1 + " to " + date2 + "\n"
for row in results:
i = 1
for item in row:
if i == 1:
newStr += "VM" + str(item) + " - "
if i == 2:
newStr += "Status: " + str(item) + " - "
if i == 3:
if item != 1:
newStr += "Reported: " + str(item) + " times"
else:
newStr += "Reported: " + str(item) + " time"
i += 1
newStr += "\n"
return newStr
def parseSlackInput(aText):
	if aText and len(aText) > 0:
		item = aText[0]
		if 'text' in item:
			msg = item['text'].strip(' ')
			chn = item['channel']
			usr = item['user']
			stp = item['ts']
			return [str(msg),str(chn),str(usr),str(stp)]
	# No events, or an event without a text payload: always return four
	# items so the caller's unpacking never raises
	return [None,None,None,None]
def inChannelResponse(channel,response):
slack_client.api_call(
"chat.postMessage",
channel=channel,
text=response,
as_user=True
)
return
def threadedResponse(channel,response,stamp):
slack_client.api_call(
"chat.postMessage",
channel=channel,
text=response,
thread_ts=stamp,
as_user=True
)
return
def directResponse(someUser,text):
slack_client.api_call(
"chat.postMessage",
channel=someUser,
text=text,
as_user=True
)
return
def parseQuote(someMsg):
starter,theQuote = someMsg.split(' ', 1)
return theQuote
def handle_command(command, channel, aUser, tStamp):
"""
Executes bot command if the command is known
"""
#command = command.lower()
response = None
# This is where you start to implement more commands!
if command.lower().startswith("!help"):
response = """I'm Otaku Bot!
I don't do a lot yet. But watch out! I'm just getting started!
!addquote[SPACE][A quote of your choice!] - I will remember your quote!
!quote - I will reply with a random quote!
!addAniMusic[SPACE][Link to a Japanese anime song] - I will remember your music!
!addEngMusic[SPACE][Link to an English anime song] - I will remember your music!
!addIconic[SPACE][Link to an iconic anime moment] - I will remember your moment!
!animusic - I will reply with a Japanese anime song from memory!
!engmusic - I will reply with an English anime song from memory!
!iconic - I will show you an iconic anime moment!
"""
inChannelResponse(channel,response)
return
if command.lower().startswith("!addquote"):
newQuote = str(command[10:])
insertQuote(aUser,newQuote)
threadedResponse(channel,"I'll try to remember: " + newQuote ,tStamp)
stdOut("Quote Added: " + newQuote)
return
if command.lower().startswith("!quote"):
aQuote = getQuote(conn)
inChannelResponse(channel,aQuote)
return
if command.lower().startswith("!animusic"):
aQuote = getAniMusic(conn)
inChannelResponse(channel,aQuote)
return
if command.lower().startswith("!engmusic"):
aQuote = getEngMusic(conn)
inChannelResponse(channel,aQuote)
return
if command.lower().startswith("!iconic"):
aQuote = getIconic(conn)
inChannelResponse(channel,aQuote)
return
if command.lower().startswith("!onepunch"):
inChannelResponse(channel,"https://www.youtube.com/watch?v=_TUTJ0klnKk")
return
if command.lower().startswith("!addanimusic"):
newQuote = str(command[13:])
insertAniMusic(aUser,newQuote)
threadedResponse(channel,"I'll add this to the Anime music section: " + newQuote ,tStamp)
stdOut("Anime Music Added: " + newQuote)
return
if command.lower().startswith("!addengmusic"):
newQuote = str(command[13:])
insertEngMusic(aUser,newQuote)
threadedResponse(channel,"I'll add this to the English music section: " + newQuote ,tStamp)
stdOut("English Music Added: " + newQuote)
return
if command.lower().startswith("!addiconic"):
newQuote = str(command[11:])
insertIcon(aUser,newQuote)
threadedResponse(channel,"I'll add this to the Iconic moments section: " + newQuote ,tStamp)
stdOut("Iconic Moment Added: " + newQuote)
return
if command.lower().startswith("!delquote"):
if aUser == "UC176R92M":
num = command[10:]
deleteQuote(num)
inChannelResponse(channel,"You have removed a quote.")
else:
inChannelResponse(channel,"You don't have permission to do that!")
return
if command.lower().startswith("!getquotes"):
if aUser == "UC176R92M":
inChannelResponse(channel,getAllQuotes(conn))
else:
inChannelResponse(channel,"You don't have permission to do that!")
return
	if command.startswith("!test"):
response = (("""Text:{0}
Channel:{1}
TS:{2}
User:{3}
""").format(command,channel,tStamp,aUser))
inChannelResponse(channel,response)
return
return
# Sends the response back to the channel
if __name__ == "__main__":
if slack_client.rtm_connect(with_team_state=False):
stdOut("Otaku Bot connected and running!")
# Read bot's user ID by calling Web API method `auth.test`
otakuBotID = slack_client.api_call("auth.test")["user_id"]
while True:
try:
command, channel,usr,stp = parseSlackInput(slack_client.rtm_read())
if command:
handle_command(command, channel,usr,stp)
except:
pass
schedule.run_pending()
time.sleep(RTM_READ_DELAY)
else:
stdOut("Connection failed. Exception traceback printed above.")
# File: posts/models.py (repo: dnetochaves/blog, license: MIT)
from django.db import models
from categorias.models import Categoria
from django.contrib.auth.models import User
from django.utils import timezone
# Create your models here.
class Post(models.Model):
titulo_post = models.CharField(max_length=50, verbose_name='Titulo')
autor_post = models.ForeignKey(User, on_delete=models.DO_NOTHING, verbose_name='Autor')
data_post = models.DateTimeField(default=timezone.now, verbose_name='Data')
conteudo_post = models.TextField(verbose_name='Conteudo')
exerto_post = models.TextField(verbose_name='Exerto')
categoria_post = models.ForeignKey(
Categoria, on_delete=models.DO_NOTHING, blank=True, null=True, verbose_name='Categoria')
imagem_post = models.ImageField(upload_to='post_img', blank=True, null=True, verbose_name='Imagem')
publicacao_post = models.BooleanField(default=False, verbose_name='Publicado')
def __str__(self):
return self.titulo_post
# File: algs15_priority_queue/circular_queue.py (repo: zhubaiyuan/learning-algorithms, license: MIT)
"""
A fixed-capacity queue implemented as circular queue.
Queue can become full.
* enqueue is O(1)
* dequeue is O(1)
"""
class Queue:
"""
Implementation of a Queue using a circular buffer.
"""
def __init__(self, size):
self.size = size
self.storage = [None] * size
self.first = 0
self.last = 0
self.N = 0
def is_empty(self):
"""
Determine if queue is empty.
"""
return self.N == 0
def is_full(self):
"""
Determine if queue is full.
"""
return self.N == self.size
def enqueue(self, item):
"""
Enqueue new item to end of queue.
"""
if self.is_full():
raise RuntimeError('Queue is full')
self.storage[self.last] = item
self.N += 1
self.last = (self.last + 1) % self.size
def dequeue(self):
"""
Remove and return first item from queue.
"""
if self.is_empty():
raise RuntimeError('Queue is empty')
val = self.storage[self.first]
self.N -= 1
self.first = (self.first + 1) % self.size
return val
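A short exercise of the wrap-around that makes the buffer circular, using a condensed copy of the class above (repeated so the snippet is self-contained):

```python
# Condensed copy of the Queue above, to exercise the circular wrap-around
class Queue:
    def __init__(self, size):
        self.size = size
        self.storage = [None] * size
        self.first = self.last = self.N = 0

    def is_empty(self):
        return self.N == 0

    def is_full(self):
        return self.N == self.size

    def enqueue(self, item):
        if self.is_full():
            raise RuntimeError('Queue is full')
        self.storage[self.last] = item
        self.N += 1
        self.last = (self.last + 1) % self.size

    def dequeue(self):
        if self.is_empty():
            raise RuntimeError('Queue is empty')
        val = self.storage[self.first]
        self.N -= 1
        self.first = (self.first + 1) % self.size
        return val

q = Queue(3)
for x in (1, 2, 3):
    q.enqueue(x)
print(q.is_full())    # True
print(q.dequeue())    # 1
q.enqueue(4)          # reuses the slot freed by the dequeue (index wraps to 0)
print([q.dequeue() for _ in range(3)])  # [2, 3, 4]
```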
# File: data_statistics/variable_statistics.py (repo: go-jugo/ml_event_prediction_trainer, license: Apache-2.0)
import dask.dataframe as dd
def describe_dataframe(df):
error_cols = [col for col in df.columns if 'errorCode' in col]
df = df.categorize(columns=error_cols)
statistics = df.describe(include='all').compute()
statistics = statistics.T
    statistics.to_excel('../statistics/variables_statistics.xlsx')

# File: code/com/caicongyang/python/study/base/pandas_sql.py (repo: caicongyang/python-study, license: Apache-2.0)
#!/usr/bin/python
# -*- coding: UTF-8 -*-
import numpy as np
import pandas as pd
'''
How to work with pandas in a SQL-like way
'''
from pandas import DataFrame, Series
from pandasql import sqldf, load_meat, load_births
df1 = DataFrame({'name': ['jack', 'tony', 'pony'], 'data1': range(3)})
print(df1)
sql = "select * from df1 where name = 'jack'"
pysqldf = lambda sql: sqldf(sql, globals())
print(pysqldf(sql))
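For comparison, the same `WHERE` filter in plain pandas boolean indexing, without the pandasql dependency:

```python
import pandas as pd

df1 = pd.DataFrame({'name': ['jack', 'tony', 'pony'], 'data1': range(3)})
# Equivalent of: select * from df1 where name = 'jack'
result = df1[df1['name'] == 'jack']
print(result.to_dict('records'))  # [{'name': 'jack', 'data1': 0}]
```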
# File: flytekit/models/qubole.py (repo: slai/flytekit, license: Apache-2.0)
from flyteidl.plugins import qubole_pb2 as _qubole
from flytekit.models import common as _common
class HiveQuery(_common.FlyteIdlEntity):
def __init__(self, query, timeout_sec, retry_count):
"""
Initializes a new HiveQuery.
:param Text query: The query string.
        :param int timeout_sec: Timeout for the query, in seconds.
        :param int retry_count: Number of times to retry the query on failure.
"""
self._query = query
self._timeout_sec = timeout_sec
self._retry_count = retry_count
@property
def query(self):
"""
The query string.
:rtype: str
"""
return self._query
@property
def timeout_sec(self):
"""
:rtype: int
"""
return self._timeout_sec
@property
def retry_count(self):
"""
:rtype: int
"""
return self._retry_count
def to_flyte_idl(self):
"""
:rtype: _qubole.HiveQuery
"""
return _qubole.HiveQuery(query=self.query, timeout_sec=self.timeout_sec, retryCount=self.retry_count)
@classmethod
def from_flyte_idl(cls, pb2_object):
"""
:param _qubole.HiveQuery pb2_object:
        :rtype: HiveQuery
"""
return cls(
query=pb2_object.query,
timeout_sec=pb2_object.timeout_sec,
retry_count=pb2_object.retryCount,
)
class HiveQueryCollection(_common.FlyteIdlEntity):
def __init__(self, queries):
"""
Initializes a new HiveQueryCollection.
:param list[HiveQuery] queries: Queries to execute.
"""
self._queries = queries
@property
def queries(self):
"""
:rtype: list[HiveQuery]
"""
return self._queries
def to_flyte_idl(self):
"""
:rtype: _qubole.HiveQueryCollection
"""
return _qubole.HiveQueryCollection(
queries=[query.to_flyte_idl() for query in self.queries] if self.queries else None
)
@classmethod
def from_flyte_idl(cls, pb2_object):
"""
        :param _qubole.HiveQueryCollection pb2_object:
:rtype: HiveQueryCollection
"""
return cls(queries=[HiveQuery.from_flyte_idl(query) for query in pb2_object.queries])
class QuboleHiveJob(_common.FlyteIdlEntity):
def __init__(self, query, cluster_label, tags, query_collection=None):
"""
Initializes a HiveJob.
:param HiveQuery query: Single query to execute
:param Text cluster_label: The qubole cluster label to execute the query on
:param list[Text] tags: User tags for the queries
:param HiveQueryCollection query_collection: Deprecated Queries to execute.
"""
self._query = query
self._cluster_label = cluster_label
self._tags = tags
self._query_collection = query_collection
@property
def query_collection(self):
"""
The queries to be executed
:rtype: HiveQueryCollection
"""
return self._query_collection
@property
def query(self):
"""
The query to be executed
:rtype: HiveQuery
"""
return self._query
@property
def cluster_label(self):
"""
The cluster label where the query should be executed
:rtype: Text
"""
return self._cluster_label
@property
def tags(self):
"""
User tags for the queries
:rtype: list[Text]
"""
return self._tags
def to_flyte_idl(self):
"""
:rtype: _qubole.QuboleHiveJob
"""
return _qubole.QuboleHiveJob(
query_collection=self._query_collection.to_flyte_idl() if self._query_collection else None,
query=self._query.to_flyte_idl() if self._query else None,
cluster_label=self._cluster_label,
tags=self._tags,
)
@classmethod
def from_flyte_idl(cls, p):
"""
:param _qubole.QuboleHiveJob p:
:rtype: QuboleHiveJob
"""
return cls(
query_collection=HiveQueryCollection.from_flyte_idl(p.query_collection)
if p.HasField("query_collection")
else None,
query=HiveQuery.from_flyte_idl(p.query) if p.HasField("query") else None,
cluster_label=p.cluster_label,
tags=p.tags,
)
| 26.355422 | 109 | 0.597257 | 465 | 4,375 | 5.35914 | 0.139785 | 0.04695 | 0.024077 | 0.032504 | 0.306581 | 0.167335 | 0.08748 | 0.053772 | 0.053772 | 0.053772 | 0 | 0.003007 | 0.315886 | 4,375 | 165 | 110 | 26.515152 | 0.829602 | 0.236343 | 0 | 0.328767 | 0 | 0 | 0.007447 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.232877 | false | 0 | 0.027397 | 0 | 0.493151 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
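The classes above all follow the same value-object pattern: private fields exposed as read-only properties plus a `to_flyte_idl`/`from_flyte_idl` round trip. A self-contained sketch of that pattern, using a plain dict in place of the protobuf message so it runs without `flyteidl` installed (the class name is invented for illustration):

```python
class HiveQuerySketch:
    """Minimal stand-in for the HiveQuery value-object pattern above."""

    def __init__(self, query, timeout_sec, retry_count):
        self._query = query
        self._timeout_sec = timeout_sec
        self._retry_count = retry_count

    def to_idl(self):
        # The real code builds _qubole.HiveQuery(...); a dict stands in here.
        # Note the camelCase "retryCount", mirroring the proto field name used above.
        return {"query": self._query,
                "timeout_sec": self._timeout_sec,
                "retryCount": self._retry_count}

    @classmethod
    def from_idl(cls, msg):
        return cls(msg["query"], msg["timeout_sec"], msg["retryCount"])


q = HiveQuerySketch("SELECT 1", timeout_sec=60, retry_count=3)
roundtrip = HiveQuerySketch.from_idl(q.to_idl())
```

The round trip is what lets these entities move between the Python SDK and the wire format without losing fields.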
22529ca1da0ceee85ccd01a18946c0340e79ffbb | 330 | py | Python | cycleshare/migrations/0013_remove_cycle_toprofile.py | vasundhara7/College-EWallet | 0a4c32bc08218650635a04fb9a9e28446fd4f3e1 | [
"Apache-2.0"
] | 2 | 2019-07-28T00:34:09.000Z | 2020-06-18T11:58:03.000Z | cycleshare/migrations/0013_remove_cycle_toprofile.py | vasundhara7/College-EWallet | 0a4c32bc08218650635a04fb9a9e28446fd4f3e1 | [
"Apache-2.0"
] | null | null | null | cycleshare/migrations/0013_remove_cycle_toprofile.py | vasundhara7/College-EWallet | 0a4c32bc08218650635a04fb9a9e28446fd4f3e1 | [
"Apache-2.0"
] | null | null | null | # Generated by Django 2.0.9 on 2018-12-12 08:18
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('cycleshare', '0012_cycle_toprofile'),
]
operations = [
migrations.RemoveField(
model_name='cycle',
name='toprofile',
),
]
| 18.333333 | 47 | 0.593939 | 34 | 330 | 5.676471 | 0.764706 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.081545 | 0.293939 | 330 | 17 | 48 | 19.411765 | 0.746781 | 0.136364 | 0 | 0 | 1 | 0 | 0.155477 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.090909 | 0 | 0.363636 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
2262f6ba6d8c2278a63ea0e571aa7725d2647bf8 | 11,843 | py | Python | plugin/taskmage2/project/projects.py | willjp/vim-taskmage | adcf809ccf1768753eca4dadaf6279b34e8d5699 | [
"BSD-2-Clause"
] | 1 | 2017-11-28T14:12:03.000Z | 2017-11-28T14:12:03.000Z | plugin/taskmage2/project/projects.py | willjp/vim-taskmage | adcf809ccf1768753eca4dadaf6279b34e8d5699 | [
"BSD-2-Clause"
] | 16 | 2017-08-13T18:01:26.000Z | 2020-11-17T04:55:43.000Z | plugin/taskmage2/project/projects.py | willjp/vim-taskmage | adcf809ccf1768753eca4dadaf6279b34e8d5699 | [
"BSD-2-Clause"
] | null | null | null | import os
import shutil
import tempfile
from taskmage2.utils import filesystem, functional
from taskmage2.asttree import asttree, renderers
from taskmage2.parser import iostream, parsers
from taskmage2.project import taskfiles
class Project(object):
def __init__(self, root='.'):
""" Constructor.
Args:
            root (str, optional): ``(ex: None, '/src/project/subdir/file.mtask', '/src/project', '/src/project/.taskmage' )``
Path to your projectroot, or a file/directory within
your taskmage project root.
.. code-block:: python
'/src/project'
'/src/project/subdir/file.mtask'
'/src/project/.taskmage'
"""
self._root = None
if root:
self.load(root)
def __repr__(self):
"""
Returns:
str: ``<Project(path/to/project) at 0x7ff6b33106a0>``
"""
if self.root:
relpath = os.path.relpath(self.root)
else:
relpath = 'None'
repr_ = '<Project({}) at {}>'.format(relpath, hex(id(self)))
return repr_
def __hash__(self):
hashstr = '<taskmage2.project.projects.Project({})>'.format(str(self.root))
return hash(hashstr)
@classmethod
def from_path(cls, filepath):
""" Instantiates a new Project, loaded using `filepath`.
Args:
            filepath (str): ``(ex: '/src/project/subdir/file.mtask', '/src/project', '/src/project/.taskmage' )``
Path to your projectroot, or a file/directory within
your taskmage project root.
.. code-block:: python
'/src/project'
'/src/project/subdir/file.mtask'
'/src/project/.taskmage'
"""
project = Project(root=None)
project.load(filepath)
return project
@property
def root(self):
""" The root directory of a project. Contains ``.taskmage`` directory.
Returns:
.. code-block:: python
'/src/project'
"""
return self._root
@classmethod
def create(cls, root):
""" Create a new taksmage project in directory `rootdir` .
Args:
rootdir (str):
Path to the root of your taskmage project.
Returns:
str: project root directory
"""
root = format_rootpath(root)
if os.path.exists(root):
if not os.path.isdir(root):
raise OSError(
'unable to create taskmage project, provided '
'path exists and is not a directory. "{}"'.format(root)
)
taskmage_dir = '{}/.taskmage'.format(root)
filesystem.make_directories(taskmage_dir)
return root
@staticmethod
def find(path):
"""
Returns:
str: absolute path to taskmage project root
"""
path = filesystem.format_path(os.path.abspath(path))
# is path root
if os.path.isdir('{}/.taskmage'.format(path)):
return path
# /src/project/.taskmage
if os.path.basename(path) == '.taskmage':
return os.path.dirname(path)
# /src/project
# /src/project/sub-path
for parent_dir in filesystem.walk_parents(path):
if os.path.isdir('{}/.taskmage'.format(parent_dir)):
return parent_dir
raise RuntimeError('unable to find taskmage project from path: {}'.format(path))
def load(self, path):
""" Loads a taskmage project from a path.
Args:
path (str): ``(ex: '/src/project/subdir/file.mtask', '/src/project', '/src/project/.taskmage' )``
Path to your projectroot, or a file/directory within
your taskmage project root.
.. code-block:: python
'/src/project'
'/src/project/subdir/file.mtask'
'/src/project/.taskmage'
"""
path = os.path.abspath(path)
projectroot = self.find(path)
self._root = projectroot
def archive_completed(self, filepath=None):
""" Archives all completed task-branches.
Example:
.. code-block:: ReStructuredText
## a,b, and c will be archived
## (entire task-branch completed)
x a
x b
x c
## nothing will be archived
## (task-branch is not entirely completed)
x a
x b
* c
Args:
filepath (str, optional): ``(ex: '/src/project/file.mtask' )``
Optionally, archive completed tasks in a single target file.
"""
if filepath is not None:
self._archive_completed(filepath)
else:
# for every mtask file in the entire project...
raise NotImplementedError('todo - archive completed tasks from all mtask files')
def is_project_path(self, filepath):
""" Test if a file is within this project.
"""
if filepath.startswith('{}/'.format(self.root)):
return True
return False
def is_archived_path(self, filepath):
""" Test if file is an archived mtask file.
"""
if filepath.startswith('{}/.taskmage/'.format(self.root)):
return True
return False
def is_active_path(self, filepath):
""" Test if file is an active (non-archived) mtask file.
"""
if self.is_project_path(filepath) and not self.is_archived_path(filepath):
return True
return False
def get_archived_path(self, filepath):
""" Returns filepath to corresponding archived mtask file's (from un-archived mtask file).
"""
if not self.is_project_path(filepath):
msg = ('filepath not within current taskmage project. \n'
'project "{}"\n'
'filepath "{}\n').format(self.root, filepath)
raise RuntimeError(msg)
if self.is_archived_path(filepath):
return filepath
filepath = filesystem.format_path(filepath)
relpath = filepath[len(self.root) + 1:]
archived_path = '{}/.taskmage/{}'.format(self.root, relpath)
return archived_path
def get_active_path(self, filepath):
""" Returns filepath to corresponding un-archived mtask file (from archived mtask file).
"""
if not self.is_project_path(filepath):
raise RuntimeError(
('filepath not within current taskmage project. \n'
'project "{}"\n'
'filepath "{}\n').format(self.root, filepath)
)
if not self.is_archived_path(filepath):
return filepath
filepath = filesystem.format_path(filepath)
taskdir = '{}/.taskmage'.format(self.root)
relpath = filepath[len(taskdir) + 1:]
active_path = '{}/{}'.format(self.root, relpath)
return active_path
def get_counterpart(self, filepath):
""" Returns active-path if archived-path, or inverse.
"""
if not self.is_project_path(filepath):
raise RuntimeError(
('filepath not within current taskmage project. \n'
'project "{}"\n'
'filepath "{}\n').format(self.root, filepath)
)
if self.is_archived_path(filepath):
return self.get_active_path(filepath)
else:
return self.get_archived_path(filepath)
def filter_taskfiles(self, filters):
""" Returns a list of all taskfiles in project, filtered by provided `filters` .
Args:
filters (list):
List of functions that accepts a :py:obj:`taskmage2.project.taskfiles.TaskFile`
as an argument, and returns True (keep) or False (remove)
Returns:
Iterable:
iterable of project taskfiles (after all filters applied to them).
.. code-block:: python
[
TaskFile('/path/to/todos/file1.mtask'),
TaskFile('/path/to/todos/file2.mtask'),
TaskFile('/path/to/todos/file3.mtask'),
...
]
"""
return functional.multifilter(filters, self.iter_taskfiles())
def iter_taskfiles(self):
""" Iterates over all `*.mtask` files in project (both completed and uncompleted).
Returns:
Iterable:
iterable of all project taskfiles
.. code-block:: python
[
TaskFile('/path/to/todos/file1.mtask'),
TaskFile('/path/to/todos/file2.mtask'),
TaskFile('/path/to/todos/file3.mtask'),
...
]
"""
for (root, dirnames, filenames) in os.walk(self.root):
for filename in filenames:
if not filename.endswith('.mtask'):
continue
filepath = '{}/{}'.format(root, filename)
yield taskfiles.TaskFile(filepath)
def _archive_completed(self, filepath):
"""
Args:
filepath (str):
absolute path to a .mtask file.
"""
(active_ast, archive_ast) = self._archive_completed_as_ast(filepath)
archive_path = self.get_archived_path(filepath)
tempdir = tempfile.mkdtemp()
try:
# create tempfile objects
active_taskfile = taskfiles.TaskFile('{}/active.mtask'.format(tempdir))
archive_taskfile = taskfiles.TaskFile('{}/archive.mtask'.format(tempdir))
# write tempfiles
active_taskfile.write(active_ast)
archive_taskfile.write(archive_ast)
# (if successful) overwrite real files
active_taskfile.copyfile(filepath)
archive_taskfile.copyfile(archive_path)
finally:
# delete tempdir
if os.path.isdir(tempdir):
shutil.rmtree(tempdir)
def _archive_completed_as_ast(self, filepath):
"""
Returns:
.. code-block:: python
(
asttree.AbstractSyntaxTree(), # new active AST
asttree.AbstractSyntaxTree(), # new archive AST
)
"""
# get active AST
active_ast = self._get_mtaskfile_ast(filepath)
# get archive AST
archive_path = self.get_archived_path(filepath)
archive_ast = self._get_mtaskfile_ast(archive_path)
# perform archive
archive_ast = active_ast.archive_completed(archive_ast)
return (active_ast, archive_ast)
def _get_mtaskfile_ast(self, filepath):
if not os.path.isfile(filepath):
return asttree.AbstractSyntaxTree()
with open(filepath, 'r') as fd_src:
fd = iostream.FileDescriptor(fd_src)
AST = parsers.parse(fd, 'mtask')
return AST
def format_rootpath(path):
""" Formats a project-directory path.
Ensures path ends with `.taskmage` dir, and uses forward slashes exclusively.
Returns:
str:
a new formatted path
"""
return functional.pipeline(
path,
[
_ensure_path_ends_with_dot_taskmage,
filesystem.format_path,
]
)
def _ensure_path_ends_with_dot_taskmage(path):
if os.path.basename(path):
return path
return '{}/.taskmage'.format(path)
| 31.248021 | 125 | 0.548679 | 1,216 | 11,843 | 5.23602 | 0.177632 | 0.036124 | 0.017591 | 0.021988 | 0.354327 | 0.288048 | 0.270457 | 0.232763 | 0.223339 | 0.210774 | 0 | 0.002975 | 0.347125 | 11,843 | 378 | 126 | 31.330688 | 0.820486 | 0.349911 | 0 | 0.213836 | 0 | 0 | 0.092401 | 0.005914 | 0 | 0 | 0 | 0.015873 | 0 | 1 | 0.138365 | false | 0 | 0.044025 | 0 | 0.358491 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
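`Project.find` above locates the project root by checking the given path and then walking up through its parents until a `.taskmage` marker directory appears. A self-contained sketch of that walk-up search using `pathlib` (directory layout invented for illustration):

```python
import tempfile
from pathlib import Path

def find_root(start, marker=".taskmage"):
    """Walk `start` and its parents; return the first directory containing `marker`."""
    path = Path(start).resolve()
    for candidate in (path, *path.parents):
        if (candidate / marker).is_dir():
            return candidate
    raise RuntimeError("no project root found above {}".format(start))

with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp).resolve() / "project"
    (root / ".taskmage").mkdir(parents=True)   # the project marker
    deep = root / "sub" / "dir"
    deep.mkdir(parents=True)
    found = find_root(deep)                    # resolves to `root`
```

This is the same convention version-control tools use to find a `.git` directory from anywhere inside a working tree.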
226990cee4efe4dbfe653dc0472db81ab56d2396 | 390 | py | Python | videogame_project/videogame_app/models.py | cs-fullstack-fall-2018/django-form-post1-R3coTh3Cod3r | 3e44b737425fe347757a50f30aa5df021057bfde | [
"Apache-2.0"
] | null | null | null | videogame_project/videogame_app/models.py | cs-fullstack-fall-2018/django-form-post1-R3coTh3Cod3r | 3e44b737425fe347757a50f30aa5df021057bfde | [
"Apache-2.0"
] | null | null | null | videogame_project/videogame_app/models.py | cs-fullstack-fall-2018/django-form-post1-R3coTh3Cod3r | 3e44b737425fe347757a50f30aa5df021057bfde | [
"Apache-2.0"
] | null | null | null | from django.db import models
from django.utils import timezone
class GameIdea(models.Model):
    name = models.CharField(max_length=100)
    genre = models.CharField(max_length=100)
    currentdate = models.DateTimeField(blank=True, null=True)
    def __str__(self):
        # __str__ must return a string, not a tuple
        return "{}, {}".format(self.name, self.genre)
def displayed(self):
self.currentdate = timezone.now()
self.save() | 26 | 60 | 0.707692 | 50 | 390 | 5.4 | 0.56 | 0.074074 | 0.133333 | 0.177778 | 0.2 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018987 | 0.189744 | 390 | 15 | 61 | 26 | 0.835443 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.181818 | false | 0 | 0.181818 | 0.090909 | 0.818182 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
2271553668c1d9c135110d311fde305c56e23bd6 | 1,557 | py | Python | Tools/english_word/src/spider.py | pynickle/awesome-python-tools | e405fb8d9a1127ae7cd5bcbd6481da78f6f1fb07 | [
"BSD-2-Clause"
] | 21 | 2019-06-02T01:55:14.000Z | 2022-01-08T22:35:31.000Z | Tools/english_word/src/spider.py | code-nick-python/awesome-daily-tools | e405fb8d9a1127ae7cd5bcbd6481da78f6f1fb07 | [
"BSD-2-Clause"
] | 3 | 2019-06-02T01:55:17.000Z | 2019-06-14T12:32:06.000Z | Tools/english_word/src/spider.py | code-nick-python/awesome-daily-tools | e405fb8d9a1127ae7cd5bcbd6481da78f6f1fb07 | [
"BSD-2-Clause"
] | 16 | 2019-06-23T13:00:04.000Z | 2021-09-18T06:09:58.000Z | import requests
import re
import time
import random
import pprint
import os
headers = {"user-agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/77.0.3858.0 Safari/537.36"}
def youdict(threadName, q):
res = []
index = 0
url = q.get(timeout = 2)
index += 1
r = requests.get(url, headers = headers, timeout = 5)
html = str(r.content, encoding="utf-8").replace("\n", "").replace(" ", "").replace('<span class="yd-kw-suffix">[英语单词大全]</span>', "")
words = re.findall('<div class="caption"><h3 style="margin-top: 10px;"><a style="color:#333;" target="_blank" href="/w/.*?">(.*?)</a>[ ]?</h3><p>(.*?)</p></div>', html)
for word in words:
res.append(word)
    # index is reset at the top of this function, so it is always 1 here and
    # only the short sleep runs; the %5 check is leftover from a batched loop.
    if index % 5 == 0:
        time.sleep(3 + random.random())
    else:
        time.sleep(1 + random.random())
return res
def hujiang(threadName, q):
res = []
index = 0
url = q.get(timeout = 2)
index += 1
r = requests.get(url, headers=headers, timeout=5)
html = str(r.content, encoding="utf-8").replace("\n", "").replace(" ", "").replace('<span class="yd-kw-suffix">[英语单词大全]</span>', "")
words = re.findall('<li class="clearfix"><a href="/ciku/(.*?)/" target="_blank">.*?</a><span>(.*?)</span></li>', html)
for word in words:
res.append(word)
if index%5 == 0:
time.sleep(3 + random.random())
else:
time.sleep(1 + random.random())
return res
if __name__ == "__main__":
main()
| 32.4375 | 173 | 0.552987 | 211 | 1,557 | 4.033175 | 0.407583 | 0.042303 | 0.032902 | 0.044653 | 0.627497 | 0.627497 | 0.627497 | 0.627497 | 0.627497 | 0.627497 | 0 | 0.043624 | 0.234425 | 1,557 | 47 | 174 | 33.12766 | 0.670302 | 0 | 0 | 0.666667 | 0 | 0.076923 | 0.309272 | 0.110596 | 0 | 0 | 0 | 0 | 0 | 1 | 0.051282 | false | 0 | 0.153846 | 0 | 0.25641 | 0.025641 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
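Both crawlers parse word entries out of raw HTML with `re.findall` and two capture groups. A self-contained sketch of that extraction on a fabricated snippet (the HTML below is invented to match the shape the `hujiang()` regex expects, not copied from the site):

```python
import re

# Fabricated listing markup in the shape the hujiang() pattern matches.
html = ('<li class="clearfix"><a href="/ciku/abandon/" target="_blank">abandon'
        '</a><span>to give up</span></li>'
        '<li class="clearfix"><a href="/ciku/ability/" target="_blank">ability'
        '</a><span>skill</span></li>')

# Same pattern as in hujiang(): group 1 is the word slug, group 2 the gloss.
pattern = (r'<li class="clearfix"><a href="/ciku/(.*?)/" target="_blank">'
           r'.*?</a><span>(.*?)</span></li>')
words = re.findall(pattern, html)
```

`findall` returns one `(word, meaning)` tuple per list item, which is exactly what the scraper appends to `res`.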
227598bb20ab74c029eb76f22348999ce40f32c0 | 718 | py | Python | web_programming/fetch_github_info.py | JB1959/Python | b6ca263983933c3ecc06ed0083dd11b6faf870c8 | [
"MIT"
] | 14 | 2020-10-03T05:43:48.000Z | 2021-11-01T21:02:26.000Z | web_programming/fetch_github_info.py | JB1959/Python | b6ca263983933c3ecc06ed0083dd11b6faf870c8 | [
"MIT"
] | 3 | 2020-06-08T07:03:15.000Z | 2020-06-08T08:41:22.000Z | web_programming/fetch_github_info.py | JB1959/Python | b6ca263983933c3ecc06ed0083dd11b6faf870c8 | [
"MIT"
] | 12 | 2020-10-03T05:44:19.000Z | 2022-01-16T05:37:54.000Z | #!/usr/bin/env python3
"""
Created by sarathkaul on 14/11/19
Basic authentication using an API password is deprecated and will soon no longer work.
Visit https://developer.github.com/changes/2020-02-14-deprecating-password-auth
for more information around suggested workarounds and removal dates.
"""
import requests
_GITHUB_API = "https://api.github.com/user"
def fetch_github_info(auth_user: str, auth_pass: str) -> dict:
"""
Fetch GitHub info of a user using the requests module
"""
return requests.get(_GITHUB_API, auth=(auth_user, auth_pass)).json()
if __name__ == "__main__":
for key, value in fetch_github_info("<USER NAME>", "<PASSWORD>").items():
print(f"{key}: {value}")
| 26.592593 | 86 | 0.71727 | 105 | 718 | 4.714286 | 0.647619 | 0.066667 | 0.090909 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.024752 | 0.155989 | 718 | 26 | 87 | 27.615385 | 0.792079 | 0.481894 | 0 | 0 | 0 | 0 | 0.200573 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0.428571 | 0.142857 | 0 | 0.428571 | 0.142857 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
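The script hands the credentials to `requests` via `auth=(user, pass)`; under the hood HTTP Basic auth is just a base64-encoded `user:password` pair in the `Authorization` header. A self-contained sketch of that encoding (header construction only — no request is sent, and the credentials are made up):

```python
import base64

def basic_auth_header(user: str, password: str) -> str:
    """Build the Authorization value requests sends for auth=(user, password)."""
    token = base64.b64encode("{}:{}".format(user, password).encode("utf-8"))
    return "Basic " + token.decode("ascii")

header = basic_auth_header("octocat", "secret")
```

Because the encoding is trivially reversible, this scheme is only as safe as the transport — which is part of why GitHub deprecated password-based Basic auth in favor of tokens, as the docstring above notes.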
2277e005db07cac1472613b25b5759c8831551c6 | 7,195 | py | Python | ecart/ecart.py | micael-grilo/E-Cart | 76e86b4c7ea5bd2becda23ef8c69470c86630c5e | [
"MIT"
] | null | null | null | ecart/ecart.py | micael-grilo/E-Cart | 76e86b4c7ea5bd2becda23ef8c69470c86630c5e | [
"MIT"
] | null | null | null | ecart/ecart.py | micael-grilo/E-Cart | 76e86b4c7ea5bd2becda23ef8c69470c86630c5e | [
"MIT"
] | null | null | null | import redis
import copy
from functools import wraps
from .exception import ErrorMessage
from .decorators import raise_exception
from .serializer import Serializer
TTL = 604800
class Cart(object):
"""
Main Class for Cart, contains all functionality
"""
@raise_exception("E-Cart can't be initialized due to Error: ")
def __init__(self, user_id, redis_connection, ttl=TTL):
"""
Constructor for the class, initializes user_id and checks whether
the users' cart exists or not.
"""
self.__redis_user_hash_token = "E-CART"
self.user_id = user_id
self.user_redis_key = self.__get_user_redis_key(user_id)
self.redis_connection = redis_connection
self.ttl = ttl
self.user_cart_exists = self.cart_exists(user_id)
self.destroy = self.__del__
@raise_exception("ttl can't be set due to Error: ")
def set_ttl(self):
"""
Update the ttl of the cart
"""
return self.redis_connection.expire(self.user_redis_key, self.ttl)
@raise_exception("ttl can't be obtained due to Error: ")
def get_ttl(self):
        ttl = self.redis_connection.ttl(self.user_redis_key)
        # redis returns -2 for a missing key and -1 for a key without an
        # expiry; both are truthy, so compare explicitly.
        if ttl >= 0:
            return ttl
        else:
            raise ErrorMessage("User Cart does not exist")
    def __product_dict(self, unit_cost, quantity, extra_data_dict=None):
        """
        Returns the dictionary for a product, with the argument values.
        """
        product_dict = {
            "unit_cost": unit_cost,
            "quantity": quantity
        }
        product_dict.update(extra_data_dict or {})
        return product_dict
@raise_exception("Cart exists can't return a value due to Error: ")
def cart_exists(self, user_id):
"""
Confirm user's cart hash in Redis
"""
return self.redis_connection.exists(self.user_redis_key)
def __get_user_redis_key_prefix(self):
"""
Generate the prefix for the user's redis key.
"""
return ":".join([self.__redis_user_hash_token, "USER_ID"])
def __get_user_redis_key(self, user_id):
"""
Generates the name of the Hash used for storing User cart in Redis
"""
if user_id:
return self.__get_user_redis_key_prefix() + ":"+str(user_id)
else:
raise ErrorMessage("user_id can't be null")
@raise_exception("Redis user key can't be obtained due to Error: ")
def get_user_redis_key(self):
"""
Returns the name of the Hash used for storing User cart in Redis
"""
return self.user_redis_key
@raise_exception("Product can't be added to the User cart due to Error: ")
def add(self, product_id, unit_cost, quantity=1, **extra_data_dict):
"""
Returns True if the addition of the product of the given product_id and unit_cost with given quantity
is succesful else False.
Can also add extra details in the form of dictionary.
"""
product_dict = self.__product_dict(
unit_cost, quantity, extra_data_dict)
self.redis_connection.hset(
self.user_redis_key, product_id, Serializer.dumps(product_dict))
self.user_cart_exists = self.cart_exists(self.user_id)
self.set_ttl()
@raise_exception("Product can't be obtained due to Error: ")
def get_product(self, product_id):
"""
Returns the cart details as a Dictionary for the given product_id
"""
if self.user_cart_exists:
product_string = self.redis_connection.hget(
self.user_redis_key, product_id)
if product_string:
return Serializer.loads(product_string)
else:
return {}
else:
raise ErrorMessage("The user cart is Empty")
@raise_exception("contains can't function due to Error: ")
def contains(self, product_id):
"""
Checks whether the given product exists in the cart
"""
return self.redis_connection.hexists(self.user_redis_key, product_id)
def __get_raw_cart(self):
return self.redis_connection.hgetall(
self.user_redis_key)
@raise_exception("Cart can't be obtained due to Error: ")
def get(self):
"""
Returns all the products and their details present in the cart as a dictionary
"""
return {key: Serializer.loads(value) for key, value in self.__get_raw_cart().items()}
@raise_exception("count can't be obtained due to Error: ")
def count(self):
"""
Returns the number of types of products in the carts
"""
return self.redis_connection.hlen(self.user_redis_key)
@raise_exception("remove can't function due to Error: ")
def remove(self, product_id):
"""
Removes the product from the cart
"""
if self.user_cart_exists:
if self.redis_connection.hdel(self.user_redis_key, product_id):
self.set_ttl()
return True
else:
return False
else:
raise ErrorMessage("The user cart is Empty")
@raise_exception("Product dictionaries can't be obatined due to Error: ")
def get_product_dicts(self):
"""
Returns the list of all product details
"""
return [Serializer.loads(product_string) for product_string in self.redis_connection.hvals(self.user_redis_key)]
def __quantities(self):
return map(lambda product_dict: product_dict.get('quantity'), self.get_product_dicts())
@raise_exception("quantity can't be obtained due to Error: ")
def quantity(self):
"""
Returns the total number of units of all products in the cart
"""
        # `reduce` is not imported under Python 3; plain sum() is equivalent.
        return sum(self.__quantities())
@raise_exception("total_cost can't be obatined due to Error: ")
def total_cost(self):
"""
Returns the net total of all product cost from the cart
"""
return sum(self.__price_list())
@raise_exception("copy can't be made due to Error: ")
def copy(self, target_user_id):
"""
Copies the cart of the user to the target_user_id
"""
is_copied = self.redis_connection.hmset(
self.__get_user_redis_key(target_user_id), self.__get_raw_cart())
        target_cart = Cart(target_user_id, self.redis_connection, self.ttl)
target_cart.set_ttl()
return target_cart if is_copied else None
def __product_price(self, product_dict):
"""
Returns the product of product_quantity and its unit_cost
"""
return product_dict['quantity'] * product_dict['unit_cost']
def __price_list(self):
"""
Returns the list of product's total_cost
"""
return map(lambda product_dict: self.__product_price(product_dict), self.get_product_dicts())
def __del__(self):
"""
Deletes the user's cart
"""
self.redis_connection.delete(self.user_redis_key)
| 34.927184 | 120 | 0.624739 | 937 | 7,195 | 4.542156 | 0.154749 | 0.041353 | 0.053571 | 0.045818 | 0.331532 | 0.230028 | 0.12641 | 0.099624 | 0.074248 | 0.046053 | 0 | 0.001372 | 0.290896 | 7,195 | 205 | 121 | 35.097561 | 0.832811 | 0.173037 | 0 | 0.107143 | 0 | 0 | 0.14251 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.205357 | false | 0 | 0.053571 | 0.017857 | 0.464286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
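Cart totals in the class above reduce to per-product `quantity * unit_cost` sums over the stored product dicts. A self-contained sketch of that arithmetic on plain dicts standing in for the Redis-backed hash, so no server is needed (product IDs and prices invented):

```python
# Product dicts in the same shape Cart.__product_dict produces.
cart = {
    "sku-1": {"unit_cost": 2.50, "quantity": 4},
    "sku-2": {"unit_cost": 10.00, "quantity": 1},
}

def total_quantity(cart):
    """Total units across all products (Cart.quantity)."""
    return sum(p["quantity"] for p in cart.values())

def total_cost(cart):
    """Net cost across all products (Cart.total_cost)."""
    return sum(p["quantity"] * p["unit_cost"] for p in cart.values())
```

The real class stores these dicts serialized in a Redis hash keyed by product ID, so the sums are computed after deserializing each hash value.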
97e1339259b947d5c260266bb5a742c74a8323da | 4,644 | py | Python | squad/base/argument_parser.py | uwnlp/piqa | e18f2189c93965c94655d5cc943dcecdc2c1ea57 | [
"Apache-2.0"
] | 89 | 2018-08-25T07:59:07.000Z | 2021-05-04T06:37:27.000Z | squad/base/argument_parser.py | seominjoon/piqa | e18f2189c93965c94655d5cc943dcecdc2c1ea57 | [
"Apache-2.0"
] | 11 | 2018-09-28T17:33:27.000Z | 2019-11-27T23:34:45.000Z | squad/base/argument_parser.py | uwnlp/piqa | e18f2189c93965c94655d5cc943dcecdc2c1ea57 | [
"Apache-2.0"
] | 10 | 2018-09-19T06:48:06.000Z | 2020-04-14T20:42:06.000Z | import argparse
import os
class ArgumentParser(argparse.ArgumentParser):
def __init__(self, description='base', **kwargs):
        super(ArgumentParser, self).__init__(description=description, **kwargs)
def add_arguments(self):
home = os.path.expanduser('~')
self.add_argument('model', type=str)
self.add_argument('--mode', type=str, default='train')
self.add_argument('--iteration', type=str, default='0')
self.add_argument('--pause', type=int, default=0) # ignore this argument.
# Data (input) paths
self.add_argument('--train_path', type=str, default=os.path.join(home, 'data', 'squad', 'train-v1.1.json'),
help='location of the training data')
self.add_argument('--test_path', type=str, default=os.path.join(home, 'data', 'squad', 'dev-v1.1.json'),
help='location of the test data')
# Output paths
self.add_argument('--output_dir', type=str, default='/tmp/piqa/squad/', help='Output directory')
self.add_argument('--save_dir', type=str, default=None, help='location for saving the model')
self.add_argument('--load_dir', type=str, default=None, help='location for loading the model')
self.add_argument('--dump_dir', type=str, default=None, help='location for dumping outputs')
self.add_argument('--report_path', type=str, default=None, help='location for report')
self.add_argument('--pred_path', type=str, default=None, help='location for prediction json file during `test`')
self.add_argument('--cache_path', type=str, default=None)
self.add_argument('--question_emb_dir', type=str, default=None)
self.add_argument('--context_emb_dir', type=str, default=None)
# Training arguments
self.add_argument('--epochs', type=int, default=20)
self.add_argument('--train_steps', type=int, default=0)
self.add_argument('--eval_steps', type=int, default=1000)
self.add_argument('--eval_save_period', type=int, default=500)
self.add_argument('--report_period', type=int, default=100)
# Similarity search (faiss, pysparnn) arguments
self.add_argument('--metric', type=str, default='ip', help='ip|l2')
self.add_argument('--nlist', type=int, default=1)
self.add_argument('--nprobe', type=int, default=1)
self.add_argument('--bpv', type=int, default=None, help='bytes per vector (e.g. 8)')
self.add_argument('--num_train_mats', type=int, default=100)
# Demo arguments
self.add_argument('--port', type=int, default=8080)
# Other arguments
self.add_argument('--draft', default=False, action='store_true')
self.add_argument('--cuda', default=False, action='store_true')
self.add_argument('--preload', default=False, action='store_true')
self.add_argument('--cache', default=False, action='store_true')
self.add_argument('--archive', default=False, action='store_true')
self.add_argument('--dump_period', type=int, default=20)
self.add_argument('--emb_type', type=str, default='dense', help='dense|sparse')
self.add_argument('--metadata', default=False, action='store_true')
self.add_argument('--mem_info', default=False, action='store_true')
def parse_args(self, **kwargs):
        args = super().parse_args(**kwargs)
if args.draft:
args.batch_size = 2
args.eval_steps = 1
args.eval_save_period = 2
args.train_steps = 2
if args.save_dir is None:
args.save_dir = os.path.join(args.output_dir, 'save')
if args.load_dir is None:
args.load_dir = os.path.join(args.output_dir, 'save')
if args.dump_dir is None:
args.dump_dir = os.path.join(args.output_dir, 'dump')
if args.question_emb_dir is None:
args.question_emb_dir = os.path.join(args.output_dir, 'question_emb')
if args.context_emb_dir is None:
args.context_emb_dir = os.path.join(args.output_dir, 'context_emb')
if args.report_path is None:
args.report_path = os.path.join(args.output_dir, 'report.csv')
if args.pred_path is None:
args.pred_path = os.path.join(args.output_dir, 'pred.json')
if args.cache_path is None:
args.cache_path = os.path.join(args.output_dir, 'cache.b')
args.load_dir = os.path.abspath(args.load_dir)
args.context_emb_dir = os.path.abspath(args.context_emb_dir)
args.question_emb_dir = os.path.abspath(args.question_emb_dir)
return args

# === File: getml/loss_functions.py (srnnkls/getml-python-api, MIT) ===

# Copyright 2019 The SQLNet Company GmbH
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to
# deal in the Software without restriction, including without limitation the
# rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
# sell copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
# DEALINGS IN THE SOFTWARE.
"""
This module contains the loss functions for the getml library.
"""
# ------------------------------------------------------------------------------
class _LossFunction(object):
"""
Base class. Should not ever be directly initialized!
"""
def __init__(self):
self.thisptr = dict()
self.thisptr["type_"] = "none"
# ------------------------------------------------------------------------------
class CrossEntropyLoss(_LossFunction):
"""
Cross entropy function.
Recommended loss function for classification problems.
"""
def __init__(self):
super(CrossEntropyLoss, self).__init__()
self.thisptr["type_"] = "CrossEntropyLoss"
# ------------------------------------------------------------------------------
class SquareLoss(_LossFunction):
"""
Square loss function.
Recommended loss function for regression problems.
"""
def __init__(self):
super(SquareLoss, self).__init__()
self.thisptr["type_"] = "SquareLoss"
# ------------------------------------------------------------------------------

# === File: todo/management/urls.py (Sanguet/todo-challenge, MIT) ===

# Django
from django.urls import include, path
# Django REST Framework
from rest_framework.routers import DefaultRouter
# Views
from .views import tasks as task_views
router = DefaultRouter()
router.register(r'tasks', task_views.TaskViewSet, basename='task')
urlpatterns = [
path('', include(router.urls))
]

# === File: django_storymarket/forms.py (jacobian/django-storymarket, BSD-3-Clause) ===

import logging
import operator
import storymarket
from django import forms
from django.core.cache import cache
from django.conf import settings
from .models import SyncedObject
# Timeout for choices cached from Storymarket. 5 minutes.
CHOICE_CACHE_TIMEOUT = 600
log = logging.getLogger('django_storymarket')
class StorymarketSyncForm(forms.ModelForm):
"""
A form allowing the choice of sync options for a given model instance.
"""
class Meta:
model = SyncedObject
fields = ['org', 'category', 'tags', 'pricing', 'rights']
def __init__(self, *args, **kwargs):
super(StorymarketSyncForm, self).__init__(*args, **kwargs)
# Override some fields. Tags is left alone; the default is fine.
self.fields['org'] = forms.TypedChoiceField(label='Org',
choices=self._choices('orgs'),
coerce=int)
self.fields['category'] = forms.TypedChoiceField(label='Category',
choices=self._choices('subcategories'),
coerce=int)
self.fields['pricing'] = forms.TypedChoiceField(label='Pricing',
choices=self._choices('pricing'),
coerce=int)
self.fields['rights'] = forms.TypedChoiceField(label='Rights',
choices=self._choices('rights'),
coerce=int)
def _choices(self, manager_name):
"""
Generate a list of choices from a given storymarket manager type.
These choices are cached to save API hits, sorted, and an empty
choice is included.
"""
cache_key = 'storymarket_choice_cache:%s' % manager_name
choices = cache.get(cache_key)
if choices is None:
manager = getattr(self._api, manager_name)
try:
objs = sorted(manager.all(), key=operator.attrgetter('name'))
            except storymarket.exceptions.StorymarketError as e:
log.exception('Storymarket API call failed: %s' % e)
return [(u'', u'--- Storymarket Unavailable ---')]
# If there's only a single object, just select it -- don't offer
# an empty choice. Otherwise, offer an empty.
if len(objs) == 1:
empty_choice = []
else:
empty_choice = [(u'', u'---------')]
choices = empty_choice + [(o.id, o.name) for o in objs]
cache.set(cache_key, choices, CHOICE_CACHE_TIMEOUT)
return choices
@property
def _api(self):
return storymarket.Storymarket(settings.STORYMARKET_API_KEY)
class StorymarketOptionalSyncForm(StorymarketSyncForm):
"""
Like a StorymarketSyncForm, but with an extra boolean field indicating
whether syncing should take place or not.
"""
sync = forms.BooleanField(initial=False, required=False,
label="Upload to Storymarket")
def __init__(self, *args, **kwargs):
super(StorymarketOptionalSyncForm, self).__init__(*args, **kwargs)
# Make fields optional; we'll validate them in clean()
for field in ('org', 'category', 'tags'):
self.fields[field].required = False
def clean(self):
if self.cleaned_data['sync']:
for field in ('org', 'category', 'tags'):
if not self.cleaned_data.get(field, None):
message = self.fields[field].error_messages['required']
self._errors[field] = self.error_class([message])
del self.cleaned_data[field]
        return self.cleaned_data

# === File: examples/api-samples/inc_samples/sample33.py (groupdocs-legacy-sdk/python, Apache-2.0) ===

####<i>This sample will show how to convert several HTML documents to PDF and merge them to one document</i>
#Import of classes from libraries
import base64
import os
import shutil
import random
import time
from pyramid.renderers import render_to_response
from groupdocs.StorageApi import StorageApi
from groupdocs.AsyncApi import AsyncApi
from groupdocs.ApiClient import ApiClient
from groupdocs.GroupDocsRequestSigner import GroupDocsRequestSigner
from groupdocs.models.JobInfo import JobInfo
# Checking value on null
def IsNotNull(value):
return value is not None and len(value) > 0
####Set variables and get POST data
def sample33(request):
clientId = request.POST.get('client_id')
privateKey = request.POST.get('private_key')
firstUrl = request.POST.get('url1')
secondUrl = request.POST.get('url2')
thirdUrl = request.POST.get('url3')
basePath = request.POST.get('server_type')
message = ""
iframe = ""
    # Checking clientId and privateKey
    if not IsNotNull(clientId) or not IsNotNull(privateKey):
return render_to_response('__main__:templates/sample33.pt',
            { 'error' : 'You did not enter all parameters' })
####Create Signer, ApiClient and Storage Api objects
#Create signer object
signer = GroupDocsRequestSigner(privateKey)
#Create apiClient object
apiClient = ApiClient(signer)
#Create Storage Api object
storageApi = StorageApi(apiClient)
#Create Async api object
asyncApi = AsyncApi(apiClient)
#Set base Path
if basePath == "":
basePath = "https://api.groupdocs.com/v2.0"
storageApi.basePath = basePath
asyncApi.basePath = basePath
#Create list of URL's
urlList = [firstUrl, secondUrl, thirdUrl]
#Create empty list for uploaded files GUID's
guidList = []
for url in urlList:
try:
#Upload file
upload = storageApi.UploadWeb(clientId, url)
if upload.status == "Ok":
#Add GUID of uploaded file to list
guidList.append(upload.result.guid)
else:
raise Exception(upload.error_message)
        except Exception as e:
return render_to_response('__main__:templates/sample33.pt',
{ 'error' : str(e) })
####Make a request to the Async API using clientId
try:
#Create list of result document type
convertType = []
convertType.append("pdf")
#Create JobInfo object and set attributes
jobInfo = JobInfo()
jobInfo.actions = "convert, combine"
jobInfo.out_formats = convertType
jobInfo.status = "-1"
jobInfo.email_results = True
rand = random.randint(0, 500)
jobInfo.name = "test" + str(rand)
#Create job
createJob = asyncApi.CreateJob(clientId, jobInfo)
if createJob.status == "Ok":
for guid in guidList:
try:
#Add all uploaded files to created job
addJobDocument = asyncApi.AddJobDocument(clientId, createJob.result.job_id, guid, False)
if addJobDocument.status != "Ok":
raise Exception(addJobDocument.error_message)
                except Exception as e:
return render_to_response('__main__:templates/sample33.pt',
{ 'error' : str(e) })
#Change job status
jobInfo.status = "0"
try:
#Update job with new status
updateJob = asyncApi.UpdateJob(clientId,createJob.result.job_id, jobInfo)
if updateJob.status == "Ok":
time.sleep(5)
try:
#Get result file from job by it's ID
getJobDocument = asyncApi.GetJobDocuments(clientId, createJob.result.job_id)
if getJobDocument.status == "Ok":
fileGuid = getJobDocument.result.outputs[0].guid
#Generation of iframe URL using $pageImage->result->guid
#iframe to prodaction server
if basePath == "https://api.groupdocs.com/v2.0":
iframe = 'https://apps.groupdocs.com/document-viewer/embed/' + fileGuid
#iframe to dev server
elif basePath == "https://dev-api.groupdocs.com/v2.0":
iframe = 'https://dev-apps.groupdocs.com/document-viewer/embed/' + fileGuid
#iframe to test server
elif basePath == "https://stage-api.groupdocs.com/v2.0":
iframe = 'https://stage-apps.groupdocs.com/document-viewer/embed/' + fileGuid
elif basePath == "http://realtime-api.groupdocs.com":
iframe = 'http://realtime-apps.groupdocs.com/document-viewer/embed/' + fileGuid
iframe = signer.signUrl(iframe)
else:
raise Exception(getJobDocument.error_message)
                    except Exception as e:
return render_to_response('__main__:templates/sample33.pt',
{ 'error' : str(e) })
else:
raise Exception(updateJob.error_message)
            except Exception as e:
return render_to_response('__main__:templates/sample33.pt',
{ 'error' : str(e) })
else:
raise Exception(createJob.error_message)
    except Exception as e:
return render_to_response('__main__:templates/sample33.pt',
{ 'error' : str(e) })
#If request was successfull - set message variable for template
return render_to_response('__main__:templates/sample33.pt',
{ 'userId' : clientId,
'privateKey' : privateKey,
'url1' : firstUrl,
'url2' : secondUrl,
'url3' : thirdUrl,
'iframe' : iframe,
'message' : message },
request=request)

# === File: cdisp/core.py (felippebarbosa/cdisp, BSL-1.0) ===

#-*- coding: utf-8 -*-
"""
Dispersion calculation functions
"""
import numpy as np  # module for array manipulation (used below as `np`)
import pandas  # module for general data analysis
import os  # module for general OS manipulation
import scipy.constants  # physical constants (speed of light) for dispersion math
##
def set_transverse_mode(data_frame, order_tag, neff_tag = 'neff', complex_neff = False):
    """ Function for classification of transverse modes
    For this function to work, the frequency and polarization must be the same.
    Also the input has to be a Pandas data frame.
    """
    if not isinstance(data_frame, pandas.DataFrame):
        raise ValueError("The object MUST be a Pandas data frame")
####
No = len(data_frame) # number of modes
order_list = np.array(['%1d' % x for x in np.arange(1, No + 1)][::-1]) # list with the transversal order
neffs = np.array(data_frame[neff_tag]) # neffs of the modes
if complex_neff:
neffs = np.abs(np.array([complex(s.replace('i' , 'j ')) for s in neffs])) # for complex neff
inds = neffs.argsort(kind = 'mergesort') # neff sorting
inds2 = np.array(inds).argsort(kind = 'mergesort') # index resorting (reverse sorting)
order_list_sorted = order_list[inds2] # list with the right (sorted) transversal order
data_frame[order_tag] = order_list_sorted
return data_frame
#######
def data_classification(data_frame, wavelength_tag = 'wlength', frequency_tag = 'freq',
input_tags = ['eig', 'Ptm', 'Ppml', 'Pcore', 'Pbus'],
class_tags = ['polarization', 'ring_bus', 'transverse_mode']):
""" Function for filtering quality factor, losses and classification of polarization and transverse modes
    The input has to be a Pandas data frame.
"""
## limits setting
pml_thre = 0.5 # threshold for power in the PMLs
bus_thre = 1.0 # threshold for power in the bus waveguide relative to the ring
tm_thre = 1.0 # threshold for power in the TM mode
## tags for classification
[eigenval_tag, TM_tag, pml_tag, ring_tag, bus_tag] = input_tags
[pol_tag, ring_bus_tag, order_tag] = class_tags
## list of columns
list_col = list(data_frame.columns) # columns names
Neig = list_col.index(eigenval_tag) # index before
list_par = list_col[:Neig] # list of parameters
## create wavelength or frequency colunm
if frequency_tag not in list_col: data_frame[frequency_tag] = scipy.constants.c/data_frame[wavelength_tag]
if wavelength_tag not in list_col: data_frame[wavelength_tag] = scipy.constants.c/data_frame[frequency_tag]
## setting frequency column as the standard for internal use
if frequency_tag not in list_par:
list_par.remove(wavelength_tag)
list_par.append(frequency_tag)
## PML filtering
data_frame = data_frame[data_frame[pml_tag] < pml_thre] # Filter the light that goes to the Pml
## TE and TM modes separation
data_frame[pol_tag] = np.array(pandas.cut(np.array(data_frame[TM_tag]), [0, tm_thre, data_frame[TM_tag].max()], labels = ['TE', 'TM']))
list_tag = [pol_tag]
## waveguide and bus separation
if bus_tag in list_col:
data_frame[ring_bus_tag] = np.array(pandas.cut((np.array(data_frame[bus_tag])/np.array(data_frame[ring_tag]))**(1./4), [0, bus_thre, 1000000], labels = ['ring', 'bus']))
# data_frame[ring_bus_tag] = np.array(pandas.cut(np.array(data_frame[ring_tag]), [0, ring_thre, 100000], labels = ['','ring']))
list_tag = list_tag + [ring_bus_tag]
## transverse mode separation
list_group = list_par + list_tag # list to filter the first time
data_frame = data_frame.groupby(list_group, as_index = False).apply(set_transverse_mode, order_tag) # transverse order
return data_frame, list_group + [order_tag]
####
def find_idx_nearest_val(array, value):
'''function to find the nearest index matching to the value given'''
idx_sorted = np.argsort(array)
sorted_array = np.array(array[idx_sorted])
idx = np.searchsorted(sorted_array, value, side="left")
if idx >= len(array):
idx_nearest = idx_sorted[len(array)-1]
elif idx == 0:
idx_nearest = idx_sorted[0]
else:
if abs(value - sorted_array[idx-1]) < abs(value - sorted_array[idx]):
idx_nearest = idx_sorted[idx-1]
else:
idx_nearest = idx_sorted[idx]
return idx_nearest
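A quick standalone check of the nearest-index helper (function reproduced from above so the example is self-contained):

```python
import numpy as np

def find_idx_nearest_val(array, value):
    # reproduced from above for a standalone demo
    idx_sorted = np.argsort(array)
    sorted_array = np.array(array[idx_sorted])
    idx = np.searchsorted(sorted_array, value, side="left")
    if idx >= len(array):
        idx_nearest = idx_sorted[len(array) - 1]
    elif idx == 0:
        idx_nearest = idx_sorted[0]
    else:
        if abs(value - sorted_array[idx - 1]) < abs(value - sorted_array[idx]):
            idx_nearest = idx_sorted[idx - 1]
        else:
            idx_nearest = idx_sorted[idx]
    return idx_nearest

freqs = np.array([10.0, 30.0, 20.0, 40.0])
print(find_idx_nearest_val(freqs, 21.0))   # nearest value is 20.0 -> index 2
print(find_idx_nearest_val(freqs, 100.0))  # clamped to the largest value -> index 3
```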
###
def dispersion_calculation(data_frame, frequency_tag = 'freq', wavelength_tag = 'wlength',
neff_tag = 'neff', wlength0 = None):
""" functions for dispersion calculation """
## initial definitions
wlength = np.array(data_frame[wavelength_tag]) # wavelength
omega = 2*np.pi*np.array(data_frame[frequency_tag]) # angular frequency
beta = np.array(data_frame[neff_tag])*omega/scipy.constants.c # propagation constant
## dialing with circular waveguides
if 'r0' in data_frame.columns:
rad0 = np.array(data_frame['r0'])
beta = beta/rad0
else: rad0 = 1.0
    ## dispersion calculations
    ## NOTE: `Df` is assumed to be this module's finite-difference derivative
    ## helper (d/d_omega), defined outside this excerpt.
beta1 = Df(beta*rad0, omega)/rad0 # beta 1
beta2 = Df(beta1*rad0, omega)/rad0 # beta 2
beta3 = Df(beta2*rad0, omega)/rad0 # beta 3
beta4 = Df(beta3*rad0, omega)/rad0 # beta 4
D = -2*np.pi*scipy.constants.c/wlength*beta2 # D parameter
## set up the wlength for phase matching
    if wlength0 is None: wlength0 = wlength[int(wlength.shape[0]/2)]
    elif wlength0 < min(wlength): wlength0 = min(wlength)
    elif wlength0 > max(wlength): wlength0 = max(wlength)
omega0 = 2*np.pi*scipy.constants.c/wlength0
## phase matching calculation
idx0 = find_idx_nearest_val(omega, omega0)
Dbeta = calculate_Dbeta(beta, idx0) # propagation constant in
Dbeta2 = beta2[idx0]*(omega - omega[idx0])**2 + beta4[idx0]/12*(omega - omega[idx0])**4
norm_gain = calculate_gain(Dbeta, 1.0e4)
## outputs
n_clad, n_core = 1.0, 3.5
output_tags = ['beta', 'beta1', 'beta2', 'beta3', 'beta4', 'D', 'Dbeta', 'Dbeta_approx', 'beta_norm', 'beta_clad', 'beta_core',
'n_clad', 'n_core', 'gain', 'ng', 'fsr']
outputs = [beta, beta1, beta2, beta3, beta4, D, Dbeta, Dbeta2, beta/scipy.constants.c, n_clad*omega/scipy.constants.c, n_core*omega/scipy.constants.c,
n_clad, n_core, norm_gain, beta1*scipy.constants.c, 1/(2*np.pi*rad0*beta1)]
for m, output in enumerate(outputs):
data_frame[output_tags[m]] = output
return data_frame
###
def dispersion_analysis(data_frame, list0, frequency_tag = 'freq'):
## list of columns
list0.remove(frequency_tag)
## remove short data_frames
Lmin = 3
data_frame = data_frame.groupby(list0, as_index = False).filter(lambda x: len(x) >= Lmin)
## calculate dispersion
data_frame = data_frame.groupby(list0, as_index = False).apply(dispersion_calculation)
return data_frame
##
def calculate_Dbeta(x, idx0):
'''calculate Dbeta for a set of date with equally spaced frequencies'''
d = x.shape[0] # array dimension
Dx = np.full(d, np.nan)
idxm = max(-idx0, idx0 - d + 1) # minimum index
idxp = min(idx0 + 1, d - idx0) # maximum index
for idx in range(idxm, idxp):
xm, xp = np.roll(x, idx), np.roll(x, -idx)
Dx[idx0 + idx] = xm[idx0] + xp[idx0] - 2*x[idx0]
return Dx
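`calculate_Dbeta` returns the symmetric mismatch beta(idx0-k) + beta(idx0+k) - 2*beta(idx0), which for a purely quadratic beta grows quadratically with the offset from the pump index. A standalone check (function reproduced from above):

```python
import numpy as np

def calculate_Dbeta(x, idx0):
    # reproduced from above for a standalone demo
    d = x.shape[0]
    Dx = np.full(d, np.nan)
    idxm = max(-idx0, idx0 - d + 1)
    idxp = min(idx0 + 1, d - idx0)
    for idx in range(idxm, idxp):
        xm, xp = np.roll(x, idx), np.roll(x, -idx)
        Dx[idx0 + idx] = xm[idx0] + xp[idx0] - 2*x[idx0]
    return Dx

beta = np.array([0.0, 1.0, 4.0, 9.0, 16.0])  # beta ~ k**2 on a unit grid
print(calculate_Dbeta(beta, 2))  # -> [8. 2. 0. 2. 8.]
```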
##
def calculate_gain(Dbeta, Pn):
'''calculate the gain of the 4 wave mixing
** here Pn is normalized such as Pn = gamma*P0'''
return np.sqrt(Pn**2 - (Dbeta/2 + Pn)**2)
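The gain vanishes at zero mismatch and peaks at gain = Pn where Dbeta = -2*Pn cancels the nonlinear phase term. A standalone check (function reproduced from above):

```python
import numpy as np

def calculate_gain(Dbeta, Pn):
    # reproduced from above for a standalone demo
    return np.sqrt(Pn**2 - (Dbeta/2 + Pn)**2)

Pn = 100.0                            # normalized power, Pn = gamma * P0
print(calculate_gain(-2 * Pn, Pn))    # peak gain at Dbeta = -2*Pn -> 100.0
print(calculate_gain(0.0, Pn))        # zero mismatch -> 0.0
```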

# === File: tests/cpydiff/modules_array_deletion.py (learnforpractice/micropython-cpp, MIT) ===

"""
categories: Modules,array
description: Array deletion not implemented
cause: Unknown
workaround: Unknown
"""
import array
a = array.array('b', (1, 2, 3))
del a[1]
print(a)
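For reference, under CPython the deletion succeeds (MicroPython raises instead, which is the difference this test documents). A standalone check of the CPython behaviour:

```python
import array

# Same snippet as above: CPython supports item deletion on array.array.
a = array.array('b', (1, 2, 3))
del a[1]
print(a)  # CPython prints: array('b', [1, 3])
```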

# === File: 112_Path Sum.py (Alvin1994/leetcode-python3-, Apache-2.0) ===

# Definition for a binary tree node.
# class TreeNode:
# def __init__(self, x):
# self.val = x
# self.left = None
# self.right = None
class Solution:
def hasPathSum(self, root: 'TreeNode', sum: 'int') -> 'bool':
if not root:
return False
def helper(node,val):
if not node:
return False
val -= node.val
if node.left is None and node.right is None:
return val == 0
return helper(node.left, val) or helper(node.right, val)
return helper(root,sum)
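A standalone check of the solution above (TreeNode and Solution are reproduced, minus type hints, so the snippet runs on its own):

```python
class TreeNode:
    def __init__(self, x):
        self.val = x
        self.left = None
        self.right = None

class Solution:
    # reproduced from above for a standalone demo
    def hasPathSum(self, root, sum):
        if not root:
            return False
        def helper(node, val):
            if not node:
                return False
            val -= node.val
            if node.left is None and node.right is None:
                return val == 0
            return helper(node.left, val) or helper(node.right, val)
        return helper(root, sum)

root = TreeNode(5)
root.left, root.right = TreeNode(4), TreeNode(8)
root.left.left = TreeNode(11)
print(Solution().hasPathSum(root, 20))  # 5 + 4 + 11 -> True
print(Solution().hasPathSum(root, 13))  # 5 + 8 -> True
print(Solution().hasPathSum(root, 9))   # 4 is not a leaf, so 5 + 4 = 9 does not count -> False
```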

# === File: homeassistant/components/hue/v2/helpers.py (MrDelik/core, Apache-2.0) ===

"""Helper functions for Philips Hue v2."""
from __future__ import annotations
def normalize_hue_brightness(brightness: float | None) -> float | None:
"""Return calculated brightness values."""
if brightness is not None:
# Hue uses a range of [0, 100] to control brightness.
brightness = float((brightness / 255) * 100)
return brightness
def normalize_hue_transition(transition: float | None) -> float | None:
"""Return rounded transition values."""
if transition is not None:
# hue transition duration is in milliseconds and round them to 100ms
transition = int(round(transition, 1) * 1000)
return transition
def normalize_hue_colortemp(colortemp: int | None) -> int | None:
"""Return color temperature within Hue's ranges."""
if colortemp is not None:
# Hue only accepts a range between 153..500
colortemp = min(colortemp, 500)
colortemp = max(colortemp, 153)
return colortemp
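A standalone sanity check of the normalisation helpers above (reproduced here, minus type hints, so the snippet runs without Home Assistant):

```python
def normalize_hue_brightness(brightness):
    # reproduced from above: map [0, 255] onto Hue's [0, 100] range
    if brightness is not None:
        brightness = float((brightness / 255) * 100)
    return brightness

def normalize_hue_colortemp(colortemp):
    # reproduced from above: clamp to Hue's accepted 153..500 mired range
    if colortemp is not None:
        colortemp = min(colortemp, 500)
        colortemp = max(colortemp, 153)
    return colortemp

print(normalize_hue_brightness(255))  # -> 100.0
print(normalize_hue_colortemp(120))   # clamped to Hue's minimum -> 153
print(normalize_hue_colortemp(None))  # passthrough -> None
```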

# === File: sdk/python/pulumi_civo/get_network.py (dirien/pulumi-civo, ECL-2.0/Apache-2.0) ===

# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from . import _utilities
__all__ = [
'GetNetworkResult',
'AwaitableGetNetworkResult',
'get_network',
]
@pulumi.output_type
class GetNetworkResult:
"""
A collection of values returned by getNetwork.
"""
def __init__(__self__, default=None, id=None, label=None, name=None, region=None):
if default and not isinstance(default, bool):
raise TypeError("Expected argument 'default' to be a bool")
pulumi.set(__self__, "default", default)
if id and not isinstance(id, str):
raise TypeError("Expected argument 'id' to be a str")
pulumi.set(__self__, "id", id)
if label and not isinstance(label, str):
raise TypeError("Expected argument 'label' to be a str")
pulumi.set(__self__, "label", label)
if name and not isinstance(name, str):
raise TypeError("Expected argument 'name' to be a str")
pulumi.set(__self__, "name", name)
if region and not isinstance(region, str):
raise TypeError("Expected argument 'region' to be a str")
pulumi.set(__self__, "region", region)
@property
@pulumi.getter
def default(self) -> bool:
"""
If is the default network.
"""
return pulumi.get(self, "default")
@property
@pulumi.getter
def id(self) -> Optional[str]:
"""
A unique ID that can be used to identify and reference a Network.
"""
return pulumi.get(self, "id")
@property
@pulumi.getter
def label(self) -> Optional[str]:
"""
The label used in the configuration.
"""
return pulumi.get(self, "label")
@property
@pulumi.getter
def name(self) -> str:
"""
The name of the network.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def region(self) -> Optional[str]:
return pulumi.get(self, "region")
class AwaitableGetNetworkResult(GetNetworkResult):
# pylint: disable=using-constant-test
def __await__(self):
if False:
yield self
return GetNetworkResult(
default=self.default,
id=self.id,
label=self.label,
name=self.name,
region=self.region)
def get_network(id: Optional[str] = None,
label: Optional[str] = None,
region: Optional[str] = None,
opts: Optional[pulumi.InvokeOptions] = None) -> AwaitableGetNetworkResult:
"""
Use this data source to access information about an existing resource.
:param str id: The unique identifier of an existing Network.
:param str label: The label of an existing Network.
:param str region: The region of an existing Network.
"""
__args__ = dict()
__args__['id'] = id
__args__['label'] = label
__args__['region'] = region
if opts is None:
opts = pulumi.InvokeOptions()
if opts.version is None:
opts.version = _utilities.get_version()
__ret__ = pulumi.runtime.invoke('civo:index/getNetwork:getNetwork', __args__, opts=opts, typ=GetNetworkResult).value
return AwaitableGetNetworkResult(
default=__ret__.default,
id=__ret__.id,
label=__ret__.label,
name=__ret__.name,
region=__ret__.region)

# === File: app/kobo/forms.py (wri/django_kobo, MIT) ===

from django import forms
from .models import Connection, KoboUser, KoboData
from django.contrib.admin.widgets import FilteredSelectMultiple
from django.db.models import Q


class ConnectionForm(forms.ModelForm):
    class Meta:
        model = Connection
        exclude = []
        widgets = {
            'auth_pass': forms.PasswordInput(),
        }


class KoboUserForm(forms.ModelForm):
    class Meta:
        model = KoboUser
        exclude = []

    surveys = forms.ModelMultipleChoiceField(
        queryset=KoboData.objects.filter(Q(tags__contains=['bns']) | Q(tags__contains=['nrgt'])),
        widget=FilteredSelectMultiple('Surveys', is_stacked=False),
        label='')
# File: vega/trainer/callbacks/horovod.py (repo: zjzh/vega, license: Apache-2.0)
] | null | null | null | # -*- coding:utf-8 -*-
# Copyright (C) 2020. Huawei Technologies Co., Ltd. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Data parallel callback."""
import logging
import vega
from vega.common import ClassFactory, ClassType
from .callback import Callback
logger = logging.getLogger(__name__)
@ClassFactory.register(ClassType.CALLBACK)
class Horovod(Callback):
"""Callback that saves the evaluated Performance."""
def __init__(self):
"""Initialize ModelCheckpoint callback."""
super(Horovod, self).__init__()
self.priority = 260
def before_train(self, logs=None):
"""Be called before the training process."""
if not self.trainer.horovod:
return
if vega.is_torch_backend():
self._init_torch()
def _init_torch(self):
import torch
import horovod.torch as hvd
hvd.broadcast_parameters(self.trainer.model.state_dict(), root_rank=0)
hvd.broadcast_optimizer_state(self.trainer.optimizer, root_rank=0)
self.trainer._average_metrics = self._average_metrics
def _average_metrics(self, metrics_results):
import torch
import horovod.torch as hvd
for key, value in metrics_results.items():
tensor = torch.tensor(value)
avg_tensor = hvd.allreduce(tensor, name=key)
metrics_results[key] = avg_tensor.item()
return metrics_results
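The `_average_metrics` hook above replaces each metric value with its average across all Horovod workers via `hvd.allreduce`. The same bookkeeping can be sketched on a single process, without Horovod or PyTorch (a simplified stand-in, not the library's code):

```python
def average_metrics_locally(per_worker_results):
    """Average each metric key across a list of per-worker result dicts.

    Mimics what an averaging allreduce produces: one number per key,
    equal to the mean of every worker's value for that key.
    """
    averaged = {}
    for key in per_worker_results[0]:
        values = [worker[key] for worker in per_worker_results]
        averaged[key] = sum(values) / len(values)
    return averaged


workers = [{"acc": 0.90, "loss": 0.4}, {"acc": 0.94, "loss": 0.2}]
result = average_metrics_locally(workers)  # acc -> 0.92, loss -> 0.3
```

In the real callback the reduction is distributed, so every rank ends up holding the same averaged dict.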
# File: parse_doc.py (repo: nprapps/idp-georgia, license: Unlicense)
# -*- coding: utf-8 -*-
import logging
import re

import app_config
from bs4 import BeautifulSoup
from shortcode import process_shortcode

logging.basicConfig(format=app_config.LOG_FORMAT)
logger = logging.getLogger(__name__)
logger.setLevel(app_config.LOG_LEVEL)

end_doc_regex = re.compile(ur'^\s*[Ee][Nn][Dd]\s*$', re.UNICODE)
new_section_marker_regex = re.compile(ur'^\s*\+{50,}\s*$', re.UNICODE)
section_end_marker_regex = re.compile(ur'^\s*-{50,}\s*$', re.UNICODE)
frontmatter_marker_regex = re.compile(ur'^\s*-{3}\s*$', re.UNICODE)
extract_metadata_regex = re.compile(ur'^(.*?):(.*)$', re.UNICODE)
shortcode_regex = re.compile(ur'^\s*\[%\s*.*\s*%\]\s*$', re.UNICODE)


def is_section_marker(tag):
    """
    Checks for the beginning of a new section
    """
    text = tag.get_text()
    m = new_section_marker_regex.match(text)
    return bool(m)


def is_section_end_marker(tag):
    """
    Checks for the end of a section
    """
    text = tag.get_text()
    m = section_end_marker_regex.match(text)
    return bool(m)


def process_headline(contents):
    logger.debug('--process_headline start--')
    headline = None
    for tag in contents:
        if tag.name == "h2":
            headline = tag.get_text()
        else:
            logger.warning('unexpected tag found: Ignore %s' % tag.get_text())
    if not headline:
        logger.error('Did not find headline on post. Contents: %s' % contents)
    return headline


def process_metadata(contents):
    logger.debug('--process_metadata start--')
    metadata = {}
    for tag in contents:
        text = tag.get_text()
        m = extract_metadata_regex.match(text)
        if m:
            key = m.group(1).strip().lower()
            value = m.group(2).strip().lower()
            metadata[key] = value
        else:
            logger.error('Could not parse metadata. Text: %s' % text)
    logger.debug("metadata: %s" % metadata)
    return metadata


def process_section_contents(contents):
    """
    Process episode copy content

    In particular parse and generate HTML from shortcodes
    """
    logger.debug('--process_post_contents start--')
    parsed = []
    for tag in contents:
        text = tag.get_text()
        m = shortcode_regex.match(text)
        if m:
            parsed.append(process_shortcode(tag))
        else:
            parsed.append(unicode(tag))
    episode_contents = ''.join(parsed)
    return episode_contents


def parse_raw_sections(raw_sections):
    """
    parse raw episodes into an array of section objects
    """
    # Divide each episode into its subparts
    # - Headline
    # - FrontMatter
    # - Contents
    sections = []
    for raw_section in raw_sections:
        section = {}
        marker_counter = 0
        section_raw_headline = []
        section_raw_metadata = []
        section_raw_contents = []
        for tag in raw_section:
            text = tag.get_text()
            m = frontmatter_marker_regex.match(text)
            if m:
                marker_counter += 1
            else:
                if marker_counter == 0:
                    section_raw_headline.append(tag)
                elif marker_counter == 1:
                    section_raw_metadata.append(tag)
                else:
                    section_raw_contents.append(tag)
        section[u'headline'] = process_headline(section_raw_headline)
        metadata = process_metadata(section_raw_metadata)
        for k, v in metadata.iteritems():
            section[k] = v
        section[u'contents'] = process_section_contents(section_raw_contents)
        sections.append(section)
    return sections


def split_sections(doc):
    """
    split the raw document into an array of raw sections
    """
    logger.debug('--split_sections start--')
    raw_sections = []
    raw_episode_contents = []
    ignore_orphan_text = True
    body = doc.soup.body
    for child in body.children:
        if is_section_marker(child):
            # Detected first post; stop ignoring orphan text
            if ignore_orphan_text:
                ignore_orphan_text = False
        else:
            if ignore_orphan_text:
                continue
            elif is_section_end_marker(child):
                ignore_orphan_text = True
                raw_sections.append(raw_episode_contents)
                raw_episode_contents = []
            else:
                raw_episode_contents.append(child)
    return raw_sections


def find_section_id(sections, id):
    """
    Find the section with a given id
    """
    for idx, section in enumerate(sections):
        try:
            if section['id'] == id:
                return idx
        except KeyError:
            continue
    return None


def process_extracted_contents(inline_intro):
    """
    Remove html markup
    """
    return inline_intro['contents']


def parse(doc):
    """
    parse google doc files and extract markup
    """
    try:
        parsed_document = {}
        logger.info('-------------start------------')
        raw_sections = split_sections(doc)
        sections = parse_raw_sections(raw_sections)
        logger.info('Number of sections: %s' % len(sections))
        parsed_document['sections'] = sections
    finally:
        logger.info('-------------end------------')
    return parsed_document
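`process_metadata` above relies on `extract_metadata_regex` to split a `key: value` line on the first colon, so values may themselves contain colons (e.g. URLs). A minimal standalone sketch of that parsing step, in Python 3 syntax rather than the file's Python 2 `ur''` literals:

```python
import re

# Non-greedy group 1 stops at the FIRST colon, so the value keeps any later colons.
extract_metadata = re.compile(r'^(.*?):(.*)$', re.UNICODE)


def parse_metadata_line(text):
    """Return a (key, value) tuple, both stripped and lowercased, or None."""
    m = extract_metadata.match(text)
    if not m:
        return None
    return m.group(1).strip().lower(), m.group(2).strip().lower()


parse_metadata_line("ID: Episode-1")   # ('id', 'episode-1')
parse_metadata_line("url: http://x")   # ('url', 'http://x')
parse_metadata_line("no colon here")   # None
```

Lowercasing both key and value matches what `process_metadata` does before storing the pair in the section dict.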
# File: xenia_python_client_library/models/attachments_list.py (repo: DutchAnalytics/xenia-python-client-library, license: Apache-2.0)
# coding: utf-8
"""
Xenia Python Client Library
Python Client Library to interact with the Xenia API. # noqa: E501
The version of the OpenAPI document: v2.1
Generated by: https://openapi-generator.tech
"""
import pprint
import re # noqa: F401
import six
from xenia_python_client_library.configuration import Configuration
class AttachmentsList(object):
"""NOTE: This class is auto generated by OpenAPI Generator.
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
"""
Attributes:
openapi_types (dict): The key is attribute name
and the value is attribute type.
attribute_map (dict): The key is attribute name
and the value is json key in definition.
"""
openapi_types = {
'id': 'str',
'source_name': 'str',
'destination_name': 'str',
'mapping': 'list[AttachmentFieldsList]'
}
attribute_map = {
'id': 'id',
'source_name': 'source_name',
'destination_name': 'destination_name',
'mapping': 'mapping'
}
def __init__(self, id=None, source_name=None, destination_name=None, mapping=None, local_vars_configuration=None): # noqa: E501
"""AttachmentsList - a model defined in OpenAPI""" # noqa: E501
if local_vars_configuration is None:
local_vars_configuration = Configuration()
self.local_vars_configuration = local_vars_configuration
self._id = None
self._source_name = None
self._destination_name = None
self._mapping = None
self.discriminator = None
if id is not None:
self.id = id
if source_name is not None:
self.source_name = source_name
if destination_name is not None:
self.destination_name = destination_name
if mapping is not None:
self.mapping = mapping
@property
def id(self):
"""Gets the id of this AttachmentsList. # noqa: E501
:return: The id of this AttachmentsList. # noqa: E501
:rtype: str
"""
return self._id
@id.setter
def id(self, id):
"""Sets the id of this AttachmentsList.
:param id: The id of this AttachmentsList. # noqa: E501
:type: str
"""
self._id = id
@property
def source_name(self):
"""Gets the source_name of this AttachmentsList. # noqa: E501
:return: The source_name of this AttachmentsList. # noqa: E501
:rtype: str
"""
return self._source_name
@source_name.setter
def source_name(self, source_name):
"""Sets the source_name of this AttachmentsList.
:param source_name: The source_name of this AttachmentsList. # noqa: E501
:type: str
"""
self._source_name = source_name
@property
def destination_name(self):
"""Gets the destination_name of this AttachmentsList. # noqa: E501
:return: The destination_name of this AttachmentsList. # noqa: E501
:rtype: str
"""
return self._destination_name
@destination_name.setter
def destination_name(self, destination_name):
"""Sets the destination_name of this AttachmentsList.
:param destination_name: The destination_name of this AttachmentsList. # noqa: E501
:type: str
"""
self._destination_name = destination_name
@property
def mapping(self):
"""Gets the mapping of this AttachmentsList. # noqa: E501
:return: The mapping of this AttachmentsList. # noqa: E501
:rtype: list[AttachmentFieldsList]
"""
return self._mapping
@mapping.setter
def mapping(self, mapping):
"""Sets the mapping of this AttachmentsList.
:param mapping: The mapping of this AttachmentsList. # noqa: E501
:type: list[AttachmentFieldsList]
"""
self._mapping = mapping
def to_dict(self):
"""Returns the model properties as a dict"""
result = {}
for attr, _ in six.iteritems(self.openapi_types):
value = getattr(self, attr)
if isinstance(value, list):
result[attr] = list(map(
lambda x: x.to_dict() if hasattr(x, "to_dict") else x,
value
))
elif hasattr(value, "to_dict"):
result[attr] = value.to_dict()
elif isinstance(value, dict):
result[attr] = dict(map(
lambda item: (item[0], item[1].to_dict())
if hasattr(item[1], "to_dict") else item,
value.items()
))
else:
result[attr] = value
return result
def to_str(self):
"""Returns the string representation of the model"""
return pprint.pformat(self.to_dict())
def __repr__(self):
"""For `print` and `pprint`"""
return self.to_str()
def __eq__(self, other):
"""Returns true if both objects are equal"""
if not isinstance(other, AttachmentsList):
return False
return self.to_dict() == other.to_dict()
def __ne__(self, other):
"""Returns true if both objects are not equal"""
if not isinstance(other, AttachmentsList):
return True
return self.to_dict() != other.to_dict()
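The generated `to_dict` walks every declared attribute and recurses into any value (or list element) that exposes its own `to_dict`. The pattern can be sketched with two small hypothetical classes (`Inner`/`Outer` are illustrative only, not part of the generated client):

```python
class Inner(object):
    def __init__(self, n):
        self.n = n

    def to_dict(self):
        return {"n": self.n}


class Outer(object):
    def __init__(self, items):
        self.items = items  # list that may mix Inner objects and plain values

    def to_dict(self):
        result = {}
        for attr in ("items",):
            value = getattr(self, attr)
            if isinstance(value, list):
                # Recurse into elements that know how to serialize themselves.
                result[attr] = [x.to_dict() if hasattr(x, "to_dict") else x
                                for x in value]
            elif hasattr(value, "to_dict"):
                result[attr] = value.to_dict()
            else:
                result[attr] = value
        return result


Outer([Inner(1), 2]).to_dict()  # {'items': [{'n': 1}, 2]}
```

This duck-typed `hasattr(x, "to_dict")` check is what lets nested generated models (here `AttachmentFieldsList` inside `mapping`) serialize without the parent knowing their concrete types.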
# File: server/src/models/movie.py (repo: Rubilmax/netflux, license: MIT)
"""
Define the Movie model
"""

from . import db
from .abc import BaseModel, MetaBaseModel


class Movie(db.Model, BaseModel, metaclass=MetaBaseModel):
    """ The Movie model """

    __tablename__ = "movies"

    movie_id = db.Column(db.String(300), primary_key=True)
    title = db.Column(db.String(300))
    author = db.Column(db.String(300))
    release_year = db.Column(db.Integer)

    def __init__(self, movie_id, title, author, release_year):
        """ Create a new movie """
        self.movie_id = movie_id
        self.title = title
        self.author = author
        self.release_year = release_year
# File: testfixtures/tests/test_roundcomparison.py (repo: Alexhuszagh/XLDiscoverer, licenses: Apache-2.0, MIT)
] | null | null | null | # Copyright (c) 2014 Simplistix Ltd
# See license.txt for license details.
from decimal import Decimal
from testfixtures import RoundComparison as R, compare, ShouldRaise
from unittest import TestCase
from ..compat import PY2, PY3
class Tests(TestCase):
def test_equal_yes_rhs(self):
self.assertTrue(0.123457 == R(0.123456, 5))
def test_equal_yes_lhs(self):
self.assertTrue(R(0.123456, 5) == 0.123457)
def test_equal_no_rhs(self):
self.assertFalse(0.123453 == R(0.123456, 5))
def test_equal_no_lhs(self):
self.assertFalse(R(0.123456, 5) == 0.123453)
def test_not_equal_yes_rhs(self):
self.assertFalse(0.123457 != R(0.123456, 5))
def test_not_equal_yes_lhs(self):
self.assertFalse(R(0.123456, 5) != 0.123457)
def test_not_equal_no_rhs(self):
self.assertTrue(0.123453 != R(0.123456, 5))
def test_not_equal_no_lhs(self):
self.assertTrue(R(0.123456, 5) != 0.123453)
def test_equal_in_sequence_rhs(self):
self.assertEqual((1, 2, 0.123457),
(1, 2, R(0.123456, 5)))
def test_equal_in_sequence_lhs(self):
self.assertEqual((1, 2, R(0.123456, 5)),
(1, 2, 0.123457))
def test_not_equal_in_sequence_rhs(self):
self.assertNotEqual((1, 2, 0.1236),
(1, 2, R(0.123456, 5)))
def test_not_equal_in_sequence_lhs(self):
self.assertNotEqual((1, 2, R(0.123456, 5)),
(1, 2, 0.1236))
def test_not_numeric_rhs(self):
with ShouldRaise(TypeError):
'abc' == R(0.123456, 5)
def test_not_numeric_lhs(self):
with ShouldRaise(TypeError):
R(0.123456, 5) == 'abc'
def test_repr(self):
compare('<R:0.12346 to 5 digits>',
repr(R(0.123456, 5)))
def test_str(self):
compare('<R:0.12346 to 5 digits>',
repr(R(0.123456, 5)))
def test_str_negative(self):
if PY3:
expected = '<R:123500 to -2 digits>'
else:
expected = '<R:123500.0 to -2 digits>'
compare(expected, repr(R(123456, -2)))
TYPE_ERROR_DECIMAL = TypeError(
"Cannot compare <R:0.12346 to 5 digits> with <class 'decimal.Decimal'>"
)
def test_equal_yes_decimal_to_float_rhs(self):
with ShouldRaise(self.TYPE_ERROR_DECIMAL, unless=PY2):
self.assertTrue(Decimal("0.123457") == R(0.123456, 5))
def test_equal_yes_decimal_to_float_lhs(self):
with ShouldRaise(self.TYPE_ERROR_DECIMAL, unless=PY2):
self.assertTrue(R(0.123456, 5) == Decimal("0.123457"))
def test_equal_no_decimal_to_float_rhs(self):
with ShouldRaise(self.TYPE_ERROR_DECIMAL, unless=PY2):
self.assertFalse(Decimal("0.123453") == R(0.123456, 5))
def test_equal_no_decimal_to_float_lhs(self):
with ShouldRaise(self.TYPE_ERROR_DECIMAL, unless=PY2):
self.assertFalse(R(0.123456, 5) == Decimal("0.123453"))
TYPE_ERROR_FLOAT = TypeError(
"Cannot compare <R:0.12346 to 5 digits> with <class 'float'>"
)
def test_equal_yes_float_to_decimal_rhs(self):
with ShouldRaise(self.TYPE_ERROR_FLOAT, unless=PY2):
self.assertTrue(0.123457 == R(Decimal("0.123456"), 5))
def test_equal_yes_float_to_decimal_lhs(self):
with ShouldRaise(self.TYPE_ERROR_FLOAT, unless=PY2):
self.assertTrue(R(Decimal("0.123456"), 5) == 0.123457)
def test_equal_no_float_to_decimal_rhs(self):
with ShouldRaise(self.TYPE_ERROR_FLOAT, unless=PY2):
self.assertFalse(0.123453 == R(Decimal("0.123456"), 5))
def test_equal_no_float_to_decimal_lhs(self):
with ShouldRaise(self.TYPE_ERROR_FLOAT, unless=PY2):
self.assertFalse(R(Decimal("0.123456"), 5) == 0.123453)
def test_integer_float(self):
with ShouldRaise(TypeError, unless=PY2):
1 == R(1.000001, 5)
def test_float_integer(self):
with ShouldRaise(TypeError, unless=PY2):
R(1.000001, 5) == 1
def test_equal_yes_integer_other_rhs(self):
self.assertTrue(10 == R(11, -1))
def test_equal_yes_integer_lhs(self):
self.assertTrue(R(11, -1) == 10)
def test_equal_no_integer_rhs(self):
self.assertFalse(10 == R(16, -1))
def test_equal_no_integer_lhs(self):
self.assertFalse(R(16, -1) == 10)
def test_equal_integer_zero_precision(self):
self.assertTrue(1 == R(1, 0))
def test_equal_yes_negative_precision(self):
self.assertTrue(149.123 == R(101.123, -2))
def test_equal_no_negative_precision(self):
self.assertFalse(149.123 == R(150.001, -2))
def test_decimal_yes_rhs(self):
self.assertTrue(Decimal('0.123457') == R(Decimal('0.123456'), 5))
def test_decimal_yes_lhs(self):
self.assertTrue(R(Decimal('0.123456'), 5) == Decimal('0.123457'))
def test_decimal_no_rhs(self):
self.assertFalse(Decimal('0.123453') == R(Decimal('0.123456'), 5))
def test_decimal_no_lhs(self):
self.assertFalse(R(Decimal('0.123456'), 5) == Decimal('0.123453'))
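The tests above exercise `RoundComparison`: two numbers compare equal when both round to the same value at the given precision, including negative precisions that round to tens or hundreds. The core equality check can be sketched without `testfixtures` (a simplified stand-in for illustration, not the library's actual implementation, which also enforces type matching):

```python
class Round(object):
    """Compare equal to any number that rounds to the same value."""

    def __init__(self, value, precision):
        self.rounded = round(value, precision)
        self.precision = precision

    def __eq__(self, other):
        # Round the other operand to the same precision before comparing.
        return self.rounded == round(other, self.precision)

    def __ne__(self, other):
        return not self == other


0.123457 == Round(0.123456, 5)   # True: both round to 0.12346
0.123453 == Round(0.123456, 5)   # False: 0.12345 vs 0.12346
149.123 == Round(101.123, -2)    # True: both round to 100.0
```

Because `float.__eq__` returns `NotImplemented` for an unknown right-hand type, Python falls back to the reflected `Round.__eq__`, which is why the comparison works with `Round` on either side, just as the `_rhs`/`_lhs` test pairs verify.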
# File: config/usb_device_cdc.py (repo: newbs/usb, license: 0BSD)
] | null | null | null | """*****************************************************************************
* Copyright (C) 2019 Microchip Technology Inc. and its subsidiaries.
*
* Subject to your compliance with these terms, you may use Microchip software
* and any derivatives exclusively with Microchip products. It is your
* responsibility to comply with third party license terms applicable to your
* use of third party software (including open source software) that may
* accompany Microchip software.
*
* THIS SOFTWARE IS SUPPLIED BY MICROCHIP "AS IS". NO WARRANTIES, WHETHER
* EXPRESS, IMPLIED OR STATUTORY, APPLY TO THIS SOFTWARE, INCLUDING ANY IMPLIED
* WARRANTIES OF NON-INFRINGEMENT, MERCHANTABILITY, AND FITNESS FOR A
* PARTICULAR PURPOSE.
*
* IN NO EVENT WILL MICROCHIP BE LIABLE FOR ANY INDIRECT, SPECIAL, PUNITIVE,
* INCIDENTAL OR CONSEQUENTIAL LOSS, DAMAGE, COST OR EXPENSE OF ANY KIND
* WHATSOEVER RELATED TO THE SOFTWARE, HOWEVER CAUSED, EVEN IF MICROCHIP HAS
* BEEN ADVISED OF THE POSSIBILITY OR THE DAMAGES ARE FORESEEABLE. TO THE
* FULLEST EXTENT ALLOWED BY LAW, MICROCHIP'S TOTAL LIABILITY ON ALL CLAIMS IN
* ANY WAY RELATED TO THIS SOFTWARE WILL NOT EXCEED THE AMOUNT OF FEES, IF ANY,
* THAT YOU HAVE PAID DIRECTLY TO MICROCHIP FOR THIS SOFTWARE.
*****************************************************************************"""
currentQSizeRead = 1
currentQSizeWrite = 1
currentQSizeSerialStateNotification = 1
cdcInterfacesNumber = 2
cdcDescriptorSize = 58
cdcEndpointsPic32 = 2
cdcEndpointsSAM = 3
indexFunction = None
configValue = None
startInterfaceNumber = None
numberOfInterfaces = None
useIad = None
epNumberInterrupt = None
epNumberBulkOut = None
epNumberBulkIn = None
cdcEndpointNumber = None
def handleMessage(messageID, args):
    global useIad
    if (messageID == "UPDATE_CDC_IAD_ENABLE"):
        useIad.setValue(args["iadEnable"])
    return args
def onAttachmentConnected(source, target):
    global cdcInterfacesNumber
    global cdcDescriptorSize
    global configValue
    global startInterfaceNumber
    global numberOfInterfaces
    global useIad
    global epNumberInterrupt
    global epNumberBulkOut
    global epNumberBulkIn
    global cdcEndpointsPic32
    global cdcEndpointsSAM
    global currentQSizeRead
    global currentQSizeWrite
    global currentQSizeSerialStateNotification

    print ("CDC Function Driver: Attached")
    remoteComponent = target["component"]
    remoteComponentId = remoteComponent.getID()
    if (remoteComponentId == "usb_device"):
        dependencyID = source["id"]
        ownerComponent = source["component"]

        # Read number of functions from USB Device Layer
        nFunctions = Database.getSymbolValue("usb_device", "CONFIG_USB_DEVICE_FUNCTIONS_NUMBER")
        if nFunctions != None:
            #Log.writeDebugMessage ("USB Device CDC Function Driver: Attachment connected")

            # Update Number of Functions in USB Device, Increment the value by One.
            args = {"nFunction":nFunctions + 1}
            res = Database.sendMessage("usb_device", "UPDATE_FUNCTIONS_NUMBER", args)

            # If we have CDC function driver plus any function driver (no matter what Class), we enable IAD.
            if nFunctions > 0:
                args = {"nFunction":True}
                res = Database.sendMessage("usb_device", "UPDATE_IAD_ENABLE", args)
                iadEnableSymbol = ownerComponent.getSymbolByID("CONFIG_USB_DEVICE_FUNCTION_USE_IAD")
                iadEnableSymbol.clearValue()
                iadEnableSymbol.setValue(True, 1)
                isIadEnabled = Database.getSymbolValue("usb_device_cdc_0", "CONFIG_USB_DEVICE_FUNCTION_USE_IAD")
                if isIadEnabled == False:
                    args = {"iadEnable":True}
                    res = Database.sendMessage("usb_device_cdc_0", "UPDATE_CDC_IAD_ENABLE", args)
                    nCDCInstances = Database.getSymbolValue("usb_device_cdc", "CONFIG_USB_DEVICE_CDC_INSTANCES")
                    if nCDCInstances == 2:
                        configDescriptorSize = Database.getSymbolValue("usb_device", "CONFIG_USB_DEVICE_CONFIG_DESCRPTR_SIZE")
                        if configDescriptorSize != None:
                            args = {"nFunction": configDescriptorSize + 8}
                            res = Database.sendMessage("usb_device", "UPDATE_CONFIG_DESCRPTR_SIZE", args)

        configDescriptorSize = Database.getSymbolValue("usb_device", "CONFIG_USB_DEVICE_CONFIG_DESCRPTR_SIZE")
        if configDescriptorSize != None:
            iadEnableSymbol = ownerComponent.getSymbolByID("CONFIG_USB_DEVICE_FUNCTION_USE_IAD")
            if iadEnableSymbol.getValue() == True:
                descriptorSize = cdcDescriptorSize + 8
            else:
                descriptorSize = cdcDescriptorSize
            args = {"nFunction": configDescriptorSize + descriptorSize}
            res = Database.sendMessage("usb_device", "UPDATE_CONFIG_DESCRPTR_SIZE", args)

        nInterfaces = Database.getSymbolValue("usb_device", "CONFIG_USB_DEVICE_INTERFACES_NUMBER")
        if nInterfaces != None:
            args = {"nFunction": nInterfaces + cdcInterfacesNumber}
            res = Database.sendMessage("usb_device", "UPDATE_INTERFACES_NUMBER", args)
            startInterfaceNumber.setValue(nInterfaces, 1)

        nEndpoints = Database.getSymbolValue("usb_device", "CONFIG_USB_DEVICE_ENDPOINTS_NUMBER")
        if nEndpoints != None:
            epNumberInterrupt.setValue(nEndpoints + 1, 1)
            epNumberBulkOut.setValue(nEndpoints + 2, 1)
            if any(x in Variables.get("__PROCESSOR") for x in ["PIC32MZ", "PIC32MX", "PIC32MK", "SAMD21", "SAMDA1", "SAMD51", "SAME51", "SAME53", "SAME54", "SAML21", "SAML22", "SAMD11"]):
                epNumberBulkIn.setValue(nEndpoints + 2, 1)
                args = {"nFunction": nEndpoints + cdcEndpointsPic32}
                res = Database.sendMessage("usb_device", "UPDATE_ENDPOINTS_NUMBER", args)
            else:
                epNumberBulkIn.setValue(nEndpoints + 3, 1)
                args = {"nFunction": nEndpoints + cdcEndpointsSAM}
                res = Database.sendMessage("usb_device", "UPDATE_ENDPOINTS_NUMBER", args)
def onAttachmentDisconnected(source, target):
    print ("CDC Function Driver: Detached")
    global cdcInterfacesNumber
    global cdcDescriptorSize
    global configValue
    global startInterfaceNumber
    global numberOfInterfaces
    global useIad
    global epNumberInterrupt
    global epNumberBulkOut
    global epNumberBulkIn
    global cdcEndpointsPic32
    global cdcEndpointsSAM
    global cdcInstancesCount
    global currentQSizeRead
    global currentQSizeWrite
    global currentQSizeSerialStateNotification

    dependencyID = source["id"]
    ownerComponent = source["component"]
    remoteComponent = target["component"]
    remoteComponentId = remoteComponent.getID()
    if (remoteComponentId == "usb_device"):
        nFunctions = Database.getSymbolValue("usb_device", "CONFIG_USB_DEVICE_FUNCTIONS_NUMBER")
        if nFunctions != None:
            nFunctions = nFunctions - 1
            args = {"nFunction":nFunctions}
            res = Database.sendMessage("usb_device", "UPDATE_FUNCTIONS_NUMBER", args)

        endpointNumber = Database.getSymbolValue("usb_device", "CONFIG_USB_DEVICE_ENDPOINTS_NUMBER")
        if endpointNumber != None:
            if any(x in Variables.get("__PROCESSOR") for x in ["PIC32MZ"]):
                args = {"nFunction":endpointNumber - cdcEndpointsPic32 }
                res = Database.sendMessage("usb_device", "UPDATE_ENDPOINTS_NUMBER", args)
            else:
                args = {"nFunction":endpointNumber - cdcEndpointsSAM }
                res = Database.sendMessage("usb_device", "UPDATE_ENDPOINTS_NUMBER", args)

        interfaceNumber = Database.getSymbolValue("usb_device", "CONFIG_USB_DEVICE_INTERFACES_NUMBER")
        if interfaceNumber != None:
            args = {"nFunction": interfaceNumber - 2}
            res = Database.sendMessage("usb_device", "UPDATE_INTERFACES_NUMBER", args)

        nCDCInstances = Database.getSymbolValue("usb_device_cdc", "CONFIG_USB_DEVICE_CDC_INSTANCES")
        if nCDCInstances != None:
            nCDCInstances = nCDCInstances - 1
            args = {"cdcInstanceCount": nCDCInstances}
            res = Database.sendMessage("usb_device_cdc", "UPDATE_CDC_INSTANCES", args)
            if nCDCInstances == 1 and nFunctions != None and nFunctions == 1:
                args = {"iadEnable":False}
                res = Database.sendMessage("usb_device_cdc_0", "UPDATE_CDC_IAD_ENABLE", args)
                args = {"nFunction":False}
                res = Database.sendMessage("usb_device", "UPDATE_IAD_ENABLE", args)
                configDescriptorSize = Database.getSymbolValue("usb_device", "CONFIG_USB_DEVICE_CONFIG_DESCRPTR_SIZE")
                if configDescriptorSize != None:
                    args = {"nFunction": configDescriptorSize - 8}
                    res = Database.sendMessage("usb_device", "UPDATE_CONFIG_DESCRPTR_SIZE", args)

        configDescriptorSize = Database.getSymbolValue("usb_device", "CONFIG_USB_DEVICE_CONFIG_DESCRPTR_SIZE")
        if configDescriptorSize != None:
            if useIad.getValue() == True:
                descriptorSize = cdcDescriptorSize + 8
            else:
                descriptorSize = cdcDescriptorSize
            args = {"nFunction": configDescriptorSize - descriptorSize}
            res = Database.sendMessage("usb_device", "UPDATE_CONFIG_DESCRPTR_SIZE", args)
def destroyComponent(component):
    print ("CDC Function Driver: Destroyed")
# This function is called when user modifies the CDC Queue Size.
def usbDeviceCdcBufferQueueSize(usbSymbolSource, event):
    global currentQSizeRead
    global currentQSizeWrite
    global currentQSizeSerialStateNotification

    queueDepthCombined = Database.getSymbolValue("usb_device_cdc", "CONFIG_USB_DEVICE_CDC_QUEUE_DEPTH_COMBINED")
    if (event["id"] == "CONFIG_USB_DEVICE_FUNCTION_READ_Q_SIZE"):
        queueDepthCombined = queueDepthCombined - currentQSizeRead + event["value"]
        currentQSizeRead = event["value"]
    if (event["id"] == "CONFIG_USB_DEVICE_FUNCTION_WRITE_Q_SIZE"):
        queueDepthCombined = queueDepthCombined - currentQSizeWrite + event["value"]
        currentQSizeWrite = event["value"]
    if (event["id"] == "CONFIG_USB_DEVICE_FUNCTION_SERIAL_NOTIFIACATION_Q_SIZE"):
        queueDepthCombined = queueDepthCombined - currentQSizeSerialStateNotification + event["value"]
        currentQSizeSerialStateNotification = event["value"]

    # We have updated queueDepthCombined variable with current combined queue length.
    # Now send a message to USB_DEVICE_CDC_COMMON.PY to modify the Combined queue length.
    args = {"cdcQueueDepth": queueDepthCombined}
    res = Database.sendMessage("usb_device_cdc", "UPDATE_CDC_QUEUE_DEPTH_COMBINED", args)

def instantiateComponent(usbDeviceCdcComponent, index):
    global cdcDescriptorSize
    global cdcInterfacesNumber
    global configValue
    global startInterfaceNumber
    global numberOfInterfaces
    global useIad
    global currentQSizeRead
    global currentQSizeWrite
    global currentQSizeSerialStateNotification
    global epNumberInterrupt
    global epNumberBulkOut
    global epNumberBulkIn

    res = Database.activateComponents(["usb_device"])

    # Endpoint limits and defaults depend on the target processor family.
    if any(x in Variables.get("__PROCESSOR") for x in ["PIC32MZ"]):
        MaxEpNumber = 7
        BulkInDefaultEpNumber = 2
    elif any(x in Variables.get("__PROCESSOR") for x in ["PIC32MX", "PIC32MK"]):
        MaxEpNumber = 15
        BulkInDefaultEpNumber = 2
    elif any(x in Variables.get("__PROCESSOR") for x in ["SAMD21", "SAMDA1", "SAMD51", "SAME51", "SAME53", "SAME54", "SAML21", "SAML22", "SAMD11"]):
        MaxEpNumber = 7
        BulkInDefaultEpNumber = 2
    elif any(x in Variables.get("__PROCESSOR") for x in ["SAMA5D2", "SAM9X60"]):
        MaxEpNumber = 15
        BulkInDefaultEpNumber = 3
    elif any(x in Variables.get("__PROCESSOR") for x in ["SAME70", "SAMS70", "SAMV70", "SAMV71"]):
        MaxEpNumber = 9
        BulkInDefaultEpNumber = 3
    elif any(x in Variables.get("__PROCESSOR") for x in ["SAMG55"]):
        MaxEpNumber = 5
        BulkInDefaultEpNumber = 3
    # Index of this function
    indexFunction = usbDeviceCdcComponent.createIntegerSymbol("CONFIG_USB_DEVICE_FUNCTION_INDEX", None)
    indexFunction.setVisible(False)
    indexFunction.setMin(0)
    indexFunction.setMax(16)
    indexFunction.setDefaultValue(index)

    # Config name: Configuration number
    configValue = usbDeviceCdcComponent.createIntegerSymbol("CONFIG_USB_DEVICE_FUNCTION_CONFIG_VALUE", None)
    configValue.setLabel("Configuration Value")
    configValue.setVisible(False)
    configValue.setMin(1)
    configValue.setMax(16)
    configValue.setDefaultValue(1)
    configValue.setReadOnly(True)

    # Adding Start Interface number
    startInterfaceNumber = usbDeviceCdcComponent.createIntegerSymbol("CONFIG_USB_DEVICE_FUNCTION_INTERFACE_NUMBER", None)
    startInterfaceNumber.setLabel("Start Interface Number")
    helpText = '''Indicates the Interface Number of the first interface in
    the Communication Device Interface Group. This is provided here for
    indication purposes only and is automatically updated based on the
    function driver selection.'''
    startInterfaceNumber.setDescription(helpText)
    startInterfaceNumber.setVisible(True)
    startInterfaceNumber.setMin(0)
    startInterfaceNumber.setDefaultValue(0)
    startInterfaceNumber.setReadOnly(True)

    # Adding Number of Interfaces
    numberOfInterfaces = usbDeviceCdcComponent.createIntegerSymbol("CONFIG_USB_DEVICE_FUNCTION_NUMBER_OF_INTERFACES", None)
    numberOfInterfaces.setLabel("Number of Interfaces")
    helpText = '''Indicates the number of interfaces in the Communication
    Device Interface Group. This is provided here for indication purposes
    only.'''
    numberOfInterfaces.setDescription(helpText)
    numberOfInterfaces.setVisible(True)
    numberOfInterfaces.setMin(1)
    numberOfInterfaces.setMax(16)
    numberOfInterfaces.setDefaultValue(2)
    numberOfInterfaces.setReadOnly(True)

    # Use IAD
    useIad = usbDeviceCdcComponent.createBooleanSymbol("CONFIG_USB_DEVICE_FUNCTION_USE_IAD", None)
    useIad.setLabel("Use Interface Association Descriptor")
    helpText = '''Enable this option to generate an Interface Association
    Descriptor (IAD). This option should be enabled when multiple CDC
    interfaces are included in the Device. Enabling the option will update
    the Class and Subclass fields in the Device Descriptor to indicate that
    the device uses IAD.'''
    useIad.setDescription(helpText)
    useIad.setVisible(True)
    useIad.setDefaultValue(False)
    useIad.setUseSingleDynamicValue(True)
    # CDC Function driver Read Queue Size
    queueSizeRead = usbDeviceCdcComponent.createIntegerSymbol("CONFIG_USB_DEVICE_FUNCTION_READ_Q_SIZE", None)
    queueSizeRead.setLabel("CDC Read Queue Size")
    helpText = '''Configure the size of the Read Queue. This configures the
    maximum number of Read Requests that can be queued before the Function
    Driver returns a queue full response. Using a queue increases memory
    consumption but also increases throughput. The driver will queue
    requests if the transfer request is currently being processed.'''
    queueSizeRead.setDescription(helpText)
    queueSizeRead.setVisible(True)
    queueSizeRead.setMin(1)
    queueSizeRead.setMax(32767)
    queueSizeRead.setDefaultValue(1)
    currentQSizeRead = queueSizeRead.getValue()

    # CDC Function driver Write Queue Size
    queueSizeWrite = usbDeviceCdcComponent.createIntegerSymbol("CONFIG_USB_DEVICE_FUNCTION_WRITE_Q_SIZE", None)
    helpText = '''Configure the size of the Write Queue. This configures
    the maximum number of Write Requests that can be queued before the
    Function Driver returns a queue full response. Using a queue increases
    memory consumption but also increases throughput. The driver will queue
    requests if the transfer request is currently being processed.'''
    queueSizeWrite.setDescription(helpText)
    queueSizeWrite.setLabel("CDC Write Queue Size")
    queueSizeWrite.setVisible(True)
    queueSizeWrite.setMin(1)
    queueSizeWrite.setMax(32767)
    queueSizeWrite.setDefaultValue(1)
    currentQSizeWrite = queueSizeWrite.getValue()

    # CDC Function driver Serial State Notification Queue Size
    queueSizeSerialStateNotification = usbDeviceCdcComponent.createIntegerSymbol("CONFIG_USB_DEVICE_FUNCTION_SERIAL_NOTIFIACATION_Q_SIZE", None)
    queueSizeSerialStateNotification.setLabel("CDC Serial Notification Queue Size")
    helpText = '''Configure the size of the Serial State Notification
    Queue. This configures the maximum number of Serial State Notification
    Requests that can be queued before the Function Driver returns a queue
    full response. Using a queue increases memory consumption but also
    increases throughput. The driver will queue requests if the transfer
    request is currently being processed.'''
    queueSizeSerialStateNotification.setDescription(helpText)
    queueSizeSerialStateNotification.setVisible(True)
    queueSizeSerialStateNotification.setMin(1)
    queueSizeSerialStateNotification.setMax(32767)
    queueSizeSerialStateNotification.setDefaultValue(1)
    currentQSizeSerialStateNotification = queueSizeSerialStateNotification.getValue()
    # CDC Function driver Notification Endpoint Number
    epNumberInterrupt = usbDeviceCdcComponent.createIntegerSymbol("CONFIG_USB_DEVICE_FUNCTION_INT_ENDPOINT_NUMBER", None)
    helpText = '''Specify the endpoint number of the Interrupt IN Endpoint
    to be used for this instance of the CDC Interface. Refer to the Device
    Datasheet for details on available endpoints and limitations.'''
    epNumberInterrupt.setDescription(helpText)
    epNumberInterrupt.setLabel("Interrupt Endpoint Number")
    epNumberInterrupt.setVisible(True)
    epNumberInterrupt.setMin(1)
    epNumberInterrupt.setDefaultValue(1)
    epNumberInterrupt.setMax(MaxEpNumber)

    # CDC Function driver Data OUT Endpoint Number
    epNumberBulkOut = usbDeviceCdcComponent.createIntegerSymbol("CONFIG_USB_DEVICE_FUNCTION_BULK_OUT_ENDPOINT_NUMBER", None)
    helpText = '''Specify the endpoint number of the Bulk OUT Endpoint
    to be used for this instance of the CDC Interface. Refer to the Device
    Datasheet for details on available endpoints and limitations.'''
    epNumberBulkOut.setDescription(helpText)
    epNumberBulkOut.setLabel("Bulk OUT Endpoint Number")
    epNumberBulkOut.setVisible(True)
    epNumberBulkOut.setMin(1)
    epNumberBulkOut.setDefaultValue(2)
    epNumberBulkOut.setMax(MaxEpNumber)

    # CDC Function driver Data IN Endpoint Number
    epNumberBulkIn = usbDeviceCdcComponent.createIntegerSymbol("CONFIG_USB_DEVICE_FUNCTION_BULK_IN_ENDPOINT_NUMBER", None)
    helpText = '''Specify the endpoint number of the Bulk IN Endpoint
    to be used for this instance of the CDC Interface. Refer to the Device
    Datasheet for details on available endpoints and limitations.'''
    epNumberBulkIn.setDescription(helpText)
    epNumberBulkIn.setLabel("Bulk IN Endpoint Number")
    epNumberBulkIn.setVisible(True)
    epNumberBulkIn.setMin(1)
    epNumberBulkIn.setMax(MaxEpNumber)
    epNumberBulkIn.setDefaultValue(BulkInDefaultEpNumber)
    usbDeviceCdcBufPool = usbDeviceCdcComponent.createBooleanSymbol("CONFIG_USB_DEVICE_CDC_BUFFER_POOL", None)
    usbDeviceCdcBufPool.setLabel("**** Buffer Pool Update ****")
    usbDeviceCdcBufPool.setDependencies(usbDeviceCdcBufferQueueSize, ["CONFIG_USB_DEVICE_FUNCTION_READ_Q_SIZE", "CONFIG_USB_DEVICE_FUNCTION_WRITE_Q_SIZE", "CONFIG_USB_DEVICE_FUNCTION_SERIAL_NOTIFIACATION_Q_SIZE"])
    usbDeviceCdcBufPool.setVisible(False)

    ############################################################################
    #### Dependency ####
    ############################################################################
    # USB DEVICE CDC Common Dependency
    Log.writeDebugMessage("Dependency Started")
    numInstances = Database.getSymbolValue("usb_device_cdc", "CONFIG_USB_DEVICE_CDC_INSTANCES")
    if numInstances is None:
        numInstances = 0
    args = {"cdcInstanceCount": index + 1}
    res = Database.sendMessage("usb_device_cdc", "UPDATE_CDC_INSTANCES", args)
    #############################################################
    # Function Init Entry for CDC
    #############################################################
    usbDeviceCdcFunInitFile = usbDeviceCdcComponent.createFileSymbol(None, None)
    usbDeviceCdcFunInitFile.setType("STRING")
    usbDeviceCdcFunInitFile.setOutputName("usb_device.LIST_USB_DEVICE_FUNCTION_INIT_ENTRY")
    usbDeviceCdcFunInitFile.setSourcePath("templates/device/cdc/system_init_c_device_data_cdc_function_init.ftl")
    usbDeviceCdcFunInitFile.setMarkup(True)

    #############################################################
    # Function Registration table for CDC
    #############################################################
    usbDeviceCdcFunRegTableFile = usbDeviceCdcComponent.createFileSymbol(None, None)
    usbDeviceCdcFunRegTableFile.setType("STRING")
    usbDeviceCdcFunRegTableFile.setOutputName("usb_device.LIST_USB_DEVICE_FUNCTION_ENTRY")
    usbDeviceCdcFunRegTableFile.setSourcePath("templates/device/cdc/system_init_c_device_data_cdc_function.ftl")
    usbDeviceCdcFunRegTableFile.setMarkup(True)

    #############################################################
    # HS Descriptors for CDC Function
    #############################################################
    usbDeviceCdcDescriptorHsFile = usbDeviceCdcComponent.createFileSymbol(None, None)
    usbDeviceCdcDescriptorHsFile.setType("STRING")
    usbDeviceCdcDescriptorHsFile.setOutputName("usb_device.LIST_USB_DEVICE_FUNCTION_DESCRIPTOR_HS_ENTRY")
    usbDeviceCdcDescriptorHsFile.setSourcePath("templates/device/cdc/system_init_c_device_data_cdc_function_descrptr_hs.ftl")
    usbDeviceCdcDescriptorHsFile.setMarkup(True)

    #############################################################
    # FS Descriptors for CDC Function
    #############################################################
    usbDeviceCdcDescriptorFsFile = usbDeviceCdcComponent.createFileSymbol(None, None)
    usbDeviceCdcDescriptorFsFile.setType("STRING")
    usbDeviceCdcDescriptorFsFile.setOutputName("usb_device.LIST_USB_DEVICE_FUNCTION_DESCRIPTOR_FS_ENTRY")
    usbDeviceCdcDescriptorFsFile.setSourcePath("templates/device/cdc/system_init_c_device_data_cdc_function_descrptr_fs.ftl")
    usbDeviceCdcDescriptorFsFile.setMarkup(True)

    #############################################################
    # Class code Entry for CDC Function
    #############################################################
    usbDeviceCdcDescriptorClassCodeFile = usbDeviceCdcComponent.createFileSymbol(None, None)
    usbDeviceCdcDescriptorClassCodeFile.setType("STRING")
    usbDeviceCdcDescriptorClassCodeFile.setOutputName("usb_device.LIST_USB_DEVICE_DESCRIPTOR_CLASS_CODE_ENTRY")
    usbDeviceCdcDescriptorClassCodeFile.setSourcePath("templates/device/cdc/system_init_c_device_data_cdc_function_class_codes.ftl")
    usbDeviceCdcDescriptorClassCodeFile.setMarkup(True)

    ################################################
    # USB CDC Function driver Files
    ################################################
    usbDeviceCdcHeaderFile = usbDeviceCdcComponent.createFileSymbol(None, None)
    addFileName('usb_device_cdc.h', usbDeviceCdcComponent, usbDeviceCdcHeaderFile, "middleware/", "/usb/", True, None)
    usbCdcHeaderFile = usbDeviceCdcComponent.createFileSymbol(None, None)
    addFileName('usb_cdc.h', usbDeviceCdcComponent, usbCdcHeaderFile, "middleware/", "/usb/", True, None)
    usbDeviceCdcSourceFile = usbDeviceCdcComponent.createFileSymbol(None, None)
    addFileName('usb_device_cdc.c', usbDeviceCdcComponent, usbDeviceCdcSourceFile, "middleware/src/", "/usb/src", True, None)
    usbDeviceCdcAcmSourceFile = usbDeviceCdcComponent.createFileSymbol(None, None)
    addFileName('usb_device_cdc_acm.c', usbDeviceCdcComponent, usbDeviceCdcAcmSourceFile, "middleware/src/", "/usb/src", True, None)
    usbDeviceCdcLocalHeaderFile = usbDeviceCdcComponent.createFileSymbol(None, None)
    addFileName('usb_device_cdc_local.h', usbDeviceCdcComponent, usbDeviceCdcLocalHeaderFile, "middleware/src/", "/usb/src", True, None)

# all files go into src/
def addFileName(fileName, component, symbol, srcPath, destPath, enabled, callback):
    configName1 = Variables.get("__CONFIGURATION_NAME")
    symbol.setProjectPath("config/" + configName1 + destPath)
    symbol.setSourcePath(srcPath + fileName)
    symbol.setOutputName(fileName)
    symbol.setDestPath(destPath)
    if fileName[-2:] == '.h':
        symbol.setType("HEADER")
    else:
        symbol.setType("SOURCE")
    symbol.setEnabled(enabled)


# --- backend/jobapps/views.py ---
from datetime import datetime as dt
from django.utils import timezone
import uuid
from django.contrib.auth import get_user_model
from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt
from rest_framework.decorators import api_view
from company.utils import get_or_create_company
from position.utils import get_or_insert_position
from utils import utils
from utils.error_codes import ResponseCodes
from utils.generic_json_creator import create_response
from .models import JobApplication, Contact, ApplicationStatus, StatusHistory
from .models import JobApplicationNote, JobApplicationFile
from .models import Source
from alumni.serializers import AlumniSerializer
from .serializers import ApplicationStatusSerializer
from .serializers import JobApplicationNoteSerializer, JobApplicationFileSerializer
from .serializers import JobApplicationSerializer, ContactSerializer
from .serializers import SourceSerializer
from .serializers import StatusHistorySerializer
User = get_user_model()

@csrf_exempt
@api_view(["GET", "POST", "PUT", "PATCH", "DELETE"])
def job_applications(request):
    body = request.data
    if 'recaptcha_token' in body and utils.verify_recaptcha(None, body['recaptcha_token'],
                                                            'add_job') == ResponseCodes.verify_recaptcha_failed:
        return JsonResponse(create_response(data=None, success=False, error_code=ResponseCodes.verify_recaptcha_failed),
                            safe=False)
    if request.method == "GET":
        timestamp = request.GET.get('timestamp')
        if timestamp is not None:
            timestamp = int(timestamp) / 1000
            if timestamp is None:
                return JsonResponse(create_response(data=None, success=False, error_code=ResponseCodes.invalid_parameters))
            profile = request.user
            time = dt.fromtimestamp(int(timestamp))
            user_job_apps = JobApplication.objects.filter(created__gte=time)
            job_application_list = JobApplicationSerializer(instance=user_job_apps, many=True, context={
                'user': request.user}).data
            response = {'data': job_application_list, 'synching': profile.synching}
            return JsonResponse(create_response(data=response), safe=False)
        status_id = request.GET.get('status_id')
        if status_id is not None:
            user_job_apps = JobApplication.objects.filter(
                application_status__id=status_id, user__id=request.user.id, is_deleted=False).order_by('-apply_date')
        else:
            user_job_apps = JobApplication.objects.filter(
                user_id=request.user.id, is_deleted=False).order_by('-apply_date')
        job_applications_list = JobApplicationSerializer(instance=user_job_apps, many=True, context={
            'user': request.user}).data
        return JsonResponse(create_response(data=job_applications_list), safe=False)
    elif request.method == "POST":
        job_title = body['job_title']
        company = body['company']
        application_date = body['application_date']
        status = int(body['status_id'])
        source = body['source']
        jt = get_or_insert_position(job_title)
        jc = get_or_create_company(company)
        if Source.objects.filter(value__iexact=source).count() == 0:
            source = Source.objects.create(value=source)
        else:
            source = Source.objects.get(value__iexact=source)
        job_application = JobApplication(position=jt, company_object=jc, apply_date=application_date,
                                         msg_id='', app_source=source, user=request.user)
        job_application.application_status = ApplicationStatus.objects.get(pk=status)
        job_application.save()
        return JsonResponse(
            create_response(
                data=JobApplicationSerializer(instance=job_application, many=False,
                                              context={'user': request.user}).data),
            safe=False)
    elif request.method == "PUT":
        status_id = body.get('status_id')
        rejected = body.get('rejected')
        job_application_ids = []
        if 'jobapp_ids' in body:
            job_application_ids = body['jobapp_ids']
        if 'jobapp_id' in body:
            job_application_ids.append(body['jobapp_id'])
        if len(job_application_ids) == 0:
            return JsonResponse(create_response(success=False, error_code=ResponseCodes.record_not_found), safe=False)
        elif rejected is None and status_id is None:
            return JsonResponse(create_response(success=False, error_code=ResponseCodes.record_not_found), safe=False)
        else:
            user_job_apps = JobApplication.objects.filter(pk__in=job_application_ids)
            if user_job_apps.count() == 0:
                return JsonResponse(create_response(success=False, error_code=ResponseCodes.record_not_found), safe=False)
            else:
                for user_job_app in user_job_apps:
                    if user_job_app.user == request.user:
                        if status_id is None:
                            user_job_app.is_rejected = rejected
                        else:
                            new_status = ApplicationStatus.objects.filter(pk=status_id)
                            if new_status.count() == 0:
                                return JsonResponse(
                                    create_response(data=None, success=False,
                                                    error_code=ResponseCodes.invalid_parameters),
                                    safe=False)
                            else:
                                if rejected is None:
                                    user_job_app.application_status = new_status[0]
                                else:
                                    user_job_app.application_status = new_status[0]
                                    user_job_app.is_rejected = rejected
                                status_history = StatusHistory(
                                    job_post=user_job_app, application_status=new_status[0])
                                status_history.save()
                        if rejected is not None:
                            user_job_app.rejected_date = timezone.now()
                        user_job_app.updated_date = timezone.now()
                        user_job_app.save()
                return JsonResponse(create_response(data=None), safe=False)
    elif request.method == "PATCH":
        job_app_id = body.get('jobapp_id')
        if job_app_id is None:
            return JsonResponse(create_response(data=None, success=False, error_code=ResponseCodes.record_not_found),
                                safe=False)
        user_job_app = JobApplication.objects.get(pk=job_app_id)
        if user_job_app.user != request.user:
            return JsonResponse(create_response(data=None, success=False, error_code=ResponseCodes.record_not_found),
                                safe=False)
        if user_job_app.msg_id is not None and user_job_app.msg_id != '':
            return JsonResponse(create_response(data=None, success=False, error_code=ResponseCodes.record_not_found),
                                safe=False)
        job_title = body.get('job_title')
        company = body.get('company')
        application_date = body.get('application_date')
        source = body.get('source')
        if application_date is not None:
            user_job_app.apply_date = application_date
        if job_title is not None:
            user_job_app.position = get_or_insert_position(job_title)
        if company is not None:
            user_job_app.company_object = get_or_create_company(company)
        if source is not None:
            if Source.objects.filter(value__iexact=source).count() == 0:
                source = Source.objects.create(value=source)
            else:
                source = Source.objects.get(value__iexact=source)
            user_job_app.app_source = source
        user_job_app.updated_date = timezone.now()
        user_job_app.save()
        return JsonResponse(create_response(
            data=JobApplicationSerializer(instance=user_job_app, many=False, context={'user': request.user}).data),
            safe=False)
    elif request.method == "DELETE":
        job_application_ids = []
        if 'jobapp_ids' in body:
            job_application_ids = body['jobapp_ids']
        if 'jobapp_id' in body:
            job_application_ids.append(body['jobapp_id'])
        if len(job_application_ids) == 0 or JobApplication.objects.filter(pk__in=job_application_ids).count() == 0:
            return JsonResponse(create_response(success=False, error_code=ResponseCodes.record_not_found), safe=False)
        else:
            user_job_apps = JobApplication.objects.filter(pk__in=job_application_ids)
            for user_job_app in user_job_apps:
                if user_job_app.user == request.user:
                    user_job_app.deleted_date = timezone.now()
                    user_job_app.is_deleted = True
                    user_job_app.save()
            return JsonResponse(create_response(data=None), safe=False)

@csrf_exempt
@api_view(["GET"])
def statuses(request):
    statuses_list = ApplicationStatus.objects.all()
    statuses_list = ApplicationStatusSerializer(instance=statuses_list, many=True).data
    return JsonResponse(create_response(data=statuses_list), safe=False)


@csrf_exempt
@api_view(["GET"])
def sources(request):
    source_list = SourceSerializer(instance=Source.objects.all(), many=True).data
    return JsonResponse(create_response(data=source_list), safe=False)


@csrf_exempt
@api_view(["GET"])
def status_history(request, job_app_pk):
    if job_app_pk is None:
        return JsonResponse(create_response(data=None, success=False, error_code=ResponseCodes.invalid_parameters),
                            safe=False)
    else:
        statuses_list = StatusHistory.objects.filter(job_post__pk=job_app_pk)
        statuses_list = StatusHistorySerializer(instance=statuses_list, many=True).data
        return JsonResponse(create_response(data=statuses_list), safe=False)

@csrf_exempt
@api_view(["GET", "POST", "PUT", "DELETE"])
def contacts(request, job_app_pk):
    body = request.data
    if request.method == "GET":
        data = {}
        if job_app_pk is None:
            return JsonResponse(create_response(data=None, success=False, error_code=ResponseCodes.invalid_parameters),
                                safe=False)
        else:
            contacts_list = Contact.objects.filter(job_post__pk=job_app_pk)
            contacts_list = ContactSerializer(instance=contacts_list, many=True).data
            data['contacts'] = contacts_list
            user_profile = request.user
            if not user_profile.user_type.alumni_listing_enabled:
                alumni = []
            else:
                jobapp = JobApplication.objects.get(pk=job_app_pk)
                alumni_list = User.objects.filter(college=user_profile.college, company=jobapp.company_object,
                                                  user_type__name__iexact='Alumni', is_demo=False)
                alumni = AlumniSerializer(
                    instance=alumni_list, many=True, context={'user': request.user}).data
            data['alumni'] = alumni
            return JsonResponse(create_response(data=data, success=True, error_code=ResponseCodes.success), safe=False)
    elif request.method == "POST":
        first_name = body.get('first_name')
        last_name = body.get('last_name')
        if job_app_pk is None or first_name is None or last_name is None:
            return JsonResponse(create_response(data=None, success=False, error_code=ResponseCodes.invalid_parameters),
                                safe=False)
        user_job_app = JobApplication.objects.get(pk=job_app_pk)
        if user_job_app.user == request.user:
            phone_number = body.get('phone_number')
            linkedin_url = body.get('linkedin_url')
            description = body.get('description')
            email = body.get('email')
            job_title = body.get('job_title')
            jt = None
            jc = None
            if job_title is not None:
                jt = get_or_insert_position(job_title)
            company = body.get('company')
            if company is not None:
                jc = get_or_create_company(company)
            contact = Contact(
                job_post=user_job_app, first_name=first_name, last_name=last_name, phone_number=phone_number,
                linkedin_url=linkedin_url,
                description=description, email=email,
                position=jt, company=jc)
            contact.save()
            data = ContactSerializer(instance=contact, many=False).data
            return JsonResponse(create_response(data=data), safe=False)
        else:
            return JsonResponse(
                create_response(data=None, success=False, error_code=ResponseCodes.record_not_found),
                safe=False)
    elif request.method == "PUT":
        contact_id = body.get('contact_id')
        if contact_id is None:
            return JsonResponse(create_response(data=None, success=False, error_code=ResponseCodes.invalid_parameters),
                                safe=False)
        contact = Contact.objects.get(pk=contact_id)
        if contact.job_post.user == request.user:
            first_name = body.get('first_name')
            if first_name is not None:
                contact.first_name = first_name
            last_name = body.get('last_name')
            if last_name is not None:
                contact.last_name = last_name
            email = body.get('email')
            if email is not None:
                contact.email = email
            phone_number = body.get('phone_number')
            if phone_number is not None:
                contact.phone_number = phone_number
            linkedin_url = body.get('linkedin_url')
            if linkedin_url is not None:
                contact.linkedin_url = linkedin_url
            description = body.get('description')
            if description is not None:
                contact.description = description
            job_title = body.get('job_title')
            if job_title is not None:
                contact.position = get_or_insert_position(job_title)
            company = body.get('company')
            if company is not None:
                contact.company = get_or_create_company(company)
            contact.update_date = timezone.now()
            contact.save()
            data = ContactSerializer(instance=contact, many=False).data
            return JsonResponse(create_response(data=data, success=True, error_code=ResponseCodes.success),
                                safe=False)
        else:
            return JsonResponse(
                create_response(data=None, success=False, error_code=ResponseCodes.record_not_found),
                safe=False)
    elif request.method == "DELETE":
        contact_id = body.get('contact_id')
        if contact_id is None:
            return JsonResponse(create_response(data=None, success=False, error_code=ResponseCodes.invalid_parameters),
                                safe=False)
        user_job_app_contact = Contact.objects.filter(pk=contact_id)
        if user_job_app_contact.count() == 0:
            return JsonResponse(create_response(data=None, success=False, error_code=ResponseCodes.record_not_found),
                                safe=False)
        user_job_app_contact = user_job_app_contact[0]
        if user_job_app_contact.job_post.user == request.user:
            user_job_app_contact.delete()
            return JsonResponse(create_response(data=None, success=True, error_code=ResponseCodes.success), safe=False)
        else:
            return JsonResponse(create_response(data=None, success=False, error_code=ResponseCodes.record_not_found),
                                safe=False)

@csrf_exempt
@api_view(["GET", "POST", "PUT", "DELETE"])
def notes(request, job_app_pk):
    body = request.data
    if 'recaptcha_token' in body and utils.verify_recaptcha(None, body['recaptcha_token'],
                                                            'jobapp_note') == ResponseCodes.verify_recaptcha_failed:
        return JsonResponse(
            create_response(data=None, success=False, error_code=ResponseCodes.verify_recaptcha_failed),
            safe=False)
    if request.method == "GET":
        if job_app_pk is None:
            return JsonResponse(create_response(data=None, success=False, error_code=ResponseCodes.invalid_parameters),
                                safe=False)
        else:
            notes_list = JobApplicationNote.objects.filter(
                job_post__pk=job_app_pk).order_by('-update_date', '-created_date')
            notes_list = JobApplicationNoteSerializer(instance=notes_list, many=True).data
            return JsonResponse(create_response(data=notes_list, success=True, error_code=ResponseCodes.success),
                                safe=False)
    elif request.method == "POST":
        description = body['description']
        if job_app_pk is None or description is None:
            return JsonResponse(create_response(data=None, success=False, error_code=ResponseCodes.invalid_parameters),
                                safe=False)
        else:
            user_job_app = JobApplication.objects.get(pk=job_app_pk)
            if user_job_app.user == request.user:
                note = JobApplicationNote(job_post=user_job_app, description=description)
                note.save()
                data = JobApplicationNoteSerializer(instance=note, many=False).data
                return JsonResponse(create_response(data=data, success=True, error_code=ResponseCodes.success),
                                    safe=False)
            else:
                return JsonResponse(
                    create_response(data=None, success=False, error_code=ResponseCodes.record_not_found), safe=False)
    elif request.method == "PUT":
        jobapp_note_id = body['jobapp_note_id']
        description = body['description']
        if jobapp_note_id is None:
            return JsonResponse(
                create_response(data=None, success=False, error_code=ResponseCodes.invalid_parameters), safe=False)
        else:
            note = JobApplicationNote.objects.get(pk=jobapp_note_id)
            if note.job_post.user == request.user:
                note.description = description
                note.update_date = timezone.now()
                note.save()
                data = JobApplicationNoteSerializer(instance=note, many=False).data
                return JsonResponse(create_response(data=data, success=True, error_code=ResponseCodes.success),
                                    safe=False)
            else:
                return JsonResponse(create_response(data=None, success=False, error_code=ResponseCodes.record_not_found),
                                    safe=False)
    elif request.method == "DELETE":
        jobapp_note_id = body['jobapp_note_id']
        if jobapp_note_id is None:
            return JsonResponse(
                create_response(data=None, success=False, error_code=ResponseCodes.invalid_parameters), safe=False)
        else:
            user_job_app_note = JobApplicationNote.objects.get(pk=jobapp_note_id)
            if user_job_app_note.job_post.user == request.user:
                user_job_app_note.delete()
                return JsonResponse(create_response(data=None, success=True, error_code=ResponseCodes.success),
                                    safe=False)
            else:
                return JsonResponse(
                    create_response(data=None, success=False, error_code=ResponseCodes.record_not_found),
                    safe=False)

@csrf_exempt
@api_view(["GET", "POST", "DELETE"])
def files(request, job_app_pk):
    body = request.data
    if request.method == "GET":
        if job_app_pk is None:
            return JsonResponse(create_response(data=None, success=False, error_code=ResponseCodes.invalid_parameters),
                                safe=False)
        else:
            files_list = JobApplicationFile.objects.filter(
                job_post__pk=job_app_pk).order_by('-update_date', '-created_date')
            files_list = JobApplicationFileSerializer(instance=files_list, many=True).data
            return JsonResponse(create_response(data=files_list, success=True, error_code=ResponseCodes.success),
                                safe=False)
    elif request.method == "POST":
        file = body['file']
        if job_app_pk is None or file is None:
            return JsonResponse(create_response(data=None, success=False, error_code=ResponseCodes.invalid_parameters),
                                safe=False)
        else:
            ext = file.name.split('.')[-1]
            filename = "%s.%s" % (uuid.uuid4(), ext)
            name = file.name.replace(('.' + ext), '')
            filename = name + '_' + filename
            user_job_app = JobApplication.objects.get(pk=job_app_pk)
            if user_job_app.user == request.user:
                jobapp_file = JobApplicationFile(job_post=user_job_app, name=name)
                jobapp_file.save()
                jobapp_file.file.save(filename, file, save=True)
                data = JobApplicationFileSerializer(instance=jobapp_file, many=False).data
                return JsonResponse(create_response(data=data, success=True, error_code=ResponseCodes.success),
                                    safe=False)
            else:
                return JsonResponse(
                    create_response(data=None, success=False, error_code=ResponseCodes.record_not_found), safe=False)
    elif request.method == "DELETE":
        jobapp_file_id = body['jobapp_file_id']
        if jobapp_file_id is None:
            return JsonResponse(
                create_response(data=None, success=False, error_code=ResponseCodes.invalid_parameters), safe=False)
        else:
            user_job_app_file = JobApplicationFile.objects.get(pk=jobapp_file_id)
            if user_job_app_file.job_post.user == request.user:
                user_job_app_file.delete()
                return JsonResponse(create_response(data=None, success=True, error_code=ResponseCodes.success),
                                    safe=False)
            else:
                return JsonResponse(
                    create_response(data=None, success=False, error_code=ResponseCodes.record_not_found),
                    safe=False)


# --- tests/test_env.py ---
# Licensed under the MIT License
from mymusichere import env


class TestEnv:
    def test_get_config_from_env(self, monkeypatch):
        monkeypatch.setenv('CONFIG', 'value')
        assert env.get_str_config('config') == 'value'

    def test_get_secret_from_env(self, monkeypatch):
        monkeypatch.setenv('SECRET', 'value')
        assert env.get_secret('secret') == 'value'

    def test_get_config_from_file(self, fs):
        fs.create_file('/config', contents='value')
        assert env.get_str_config('config') == 'value'

    def test_get_secret_from_file(self, fs):
        fs.create_file('/run/secrets/secret', contents='value')
        assert env.get_secret('secret') == 'value'

    def test_config_default(self):
        assert env.get_str_config('config', 'default') == 'default'

    def test_secret_default(self):
        assert env.get_secret('secret', 'default') == 'default'

    def test_bool_config(self, monkeypatch):
        monkeypatch.setenv('CONFIG_TRUE', '1')
        monkeypatch.setenv('CONFIG_FALSE', '0')
        assert env.get_bool_config('config_true') is True
        assert env.get_bool_config('config_false') is False
        assert env.get_bool_config('config_default', default=True) is True
        assert env.get_bool_config('config_default', default=False) is False

    def test_str_config(self, monkeypatch):
        monkeypatch.setenv('CONFIG', 'config')
        assert env.get_str_config('config') == 'config'
| 35.658537 | 76 | 0.678523 | 189 | 1,462 | 4.978836 | 0.185185 | 0.105207 | 0.140276 | 0.136026 | 0.757705 | 0.658874 | 0.399575 | 0.348565 | 0.289054 | 0.121148 | 0 | 0.001696 | 0.19357 | 1,462 | 40 | 77 | 36.55 | 0.796438 | 0.02052 | 0 | 0.142857 | 0 | 0 | 0.169231 | 0 | 0 | 0 | 0 | 0 | 0.392857 | 1 | 0.285714 | false | 0 | 0.035714 | 0 | 0.357143 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
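The tests in test_env.py above pin down a common configuration-lookup pattern: check an upper-cased environment variable first, then fall back to a file (`/<name>` for plain config, `/run/secrets/<name>` for secrets, matching Docker secrets), then a caller-supplied default. A minimal helper satisfying those tests might look like this — an assumed sketch, not the actual `mymusichere.env` module:

```python
import os


def _read_file(path):
    """Return the stripped contents of path, or None if it is unreadable."""
    try:
        with open(path) as f:
            return f.read().strip()
    except OSError:
        return None


def get_str_config(name, default=None):
    """Look up NAME in the environment, then in /<name>, then use default."""
    value = os.environ.get(name.upper())
    if value is None:
        value = _read_file(f"/{name}")
    return value if value is not None else default


def get_secret(name, default=None):
    """Like get_str_config, but the file fallback lives under /run/secrets/."""
    value = os.environ.get(name.upper())
    if value is None:
        value = _read_file(f"/run/secrets/{name}")
    return value if value is not None else default


def get_bool_config(name, default=False):
    """Interpret '1'/'true'/'yes' (case-insensitive) as True."""
    raw = get_str_config(name)
    if raw is None:
        return default
    return raw.lower() in ('1', 'true', 'yes')
```

The file fallbacks are why the tests use pyfakefs's `fs` fixture: `fs.create_file('/run/secrets/secret', ...)` fakes the secret file without touching the real filesystem.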
3f31a6445fe5a4545fbddede8dd570fd945d12b3 | 596 | py | Python | web-interface/app/application/misc/pages/misc_options/purposes_sampling.py | horvathi94/seqmeta | 94f2f04c372181c93a6f68b6efe15b141ef02779 | [
"MIT"
] | null | null | null | web-interface/app/application/misc/pages/misc_options/purposes_sampling.py | horvathi94/seqmeta | 94f2f04c372181c93a6f68b6efe15b141ef02779 | [
"MIT"
] | null | null | null | web-interface/app/application/misc/pages/misc_options/purposes_sampling.py | horvathi94/seqmeta | 94f2f04c372181c93a6f68b6efe15b141ef02779 | [
"MIT"
] | null | null | null | from dataclasses import dataclass
from .base import _MiscOptionBase
from application.src.misc.sampling import PurposesOfSampling


@dataclass
class Editor(_MiscOptionBase):
    name = "Purpose of sampling"
    id = "purpose_of_sampling"
    link = "misc_bp.submit_purpose_of_sampling"
    description = "The reason the sample was collected " \
                  "<em>e.g. diagnostic testing</em>"

    @classmethod
    def get_values(cls) -> list:
        return PurposesOfSampling.fetch_list()

    @classmethod
    def save(cls, data: list) -> None:
        PurposesOfSampling.save_by_procedure(data)
| 24.833333 | 60 | 0.718121 | 69 | 596 | 6.028986 | 0.637681 | 0.064904 | 0.122596 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.196309 | 596 | 23 | 61 | 25.913043 | 0.868476 | 0 | 0 | 0.125 | 0 | 0 | 0.235294 | 0.057143 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.1875 | 0.0625 | 0.6875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
3f374d2a724eacf4543f9a4bee934b7b700f04f6 | 396 | py | Python | python/phevaluator/table_tests/test_hashtable8.py | StTronn/PokerHandEvaluator | 3611a7072c2a62844d6aca32d798aafa59e4606d | [
"Apache-2.0"
] | 1 | 2020-11-12T14:35:02.000Z | 2020-11-12T14:35:02.000Z | python/phevaluator/table_tests/test_hashtable8.py | StTronn/PokerHandEvaluator | 3611a7072c2a62844d6aca32d798aafa59e4606d | [
"Apache-2.0"
] | null | null | null | python/phevaluator/table_tests/test_hashtable8.py | StTronn/PokerHandEvaluator | 3611a7072c2a62844d6aca32d798aafa59e4606d | [
"Apache-2.0"
] | null | null | null | import unittest
from table_tests.utils import BaseTestNoFlushTable
from evaluator.hashtable8 import NO_FLUSH_8


class TestNoFlush8Table(BaseTestNoFlushTable):
    TOCOMPARE = NO_FLUSH_8
    TABLE = [0] * len(TOCOMPARE)
    VISIT = [0] * len(TOCOMPARE)
    NUM_CARDS = 8

    def test_noflush8_table(self):
        self.assertListEqual(self.TABLE, self.TOCOMPARE)


if __name__ == "__main__":
    unittest.main()
| 23.294118 | 52 | 0.765152 | 49 | 396 | 5.857143 | 0.571429 | 0.04878 | 0.055749 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023599 | 0.143939 | 396 | 16 | 53 | 24.75 | 0.823009 | 0 | 0 | 0 | 0 | 0 | 0.020202 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 1 | 0.083333 | false | 0 | 0.25 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
3f39db2a6e3725e4d6d3a964e14a0df2e6772218 | 655 | py | Python | days/day01/part2.py | jaredbancroft/aoc2021 | 4eaf339cc0c8566da2af13f7cb9cf6fe87355aac | [
"MIT"
] | null | null | null | days/day01/part2.py | jaredbancroft/aoc2021 | 4eaf339cc0c8566da2af13f7cb9cf6fe87355aac | [
"MIT"
] | null | null | null | days/day01/part2.py | jaredbancroft/aoc2021 | 4eaf339cc0c8566da2af13f7cb9cf6fe87355aac | [
"MIT"
] | null | null | null | from helpers import inputs
def solution(day):
    depths = inputs.read_to_list(f"inputs/{day}.txt")
    part2_total = 0
    for index, depth in enumerate(depths):
        if index - 3 >= 0:
            current_window = (
                int(depth) + int(depths[index - 1]) + int(depths[index - 2])
            )
            previous_window = (
                int(depths[index - 1])
                + int(depths[index - 2])
                + int(depths[index - 3])
            )
            diff = current_window - previous_window
            if diff > 0:
                part2_total += 1
    return f"Day 01 Part 2 Total Depth Increase: {part2_total}"
| 31.190476 | 76 | 0.51145 | 77 | 655 | 4.233766 | 0.441558 | 0.138037 | 0.214724 | 0.092025 | 0.184049 | 0.184049 | 0.184049 | 0.184049 | 0 | 0 | 0 | 0.039506 | 0.381679 | 655 | 20 | 77 | 32.75 | 0.765432 | 0 | 0 | 0 | 0 | 0 | 0.099237 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0 | 0.055556 | 0 | 0.166667 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
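The three-measurement sliding windows in part2.py above can be simplified: consecutive windows share their two middle terms, so `current_window > previous_window` reduces to `depths[i] > depths[i - 3]`. A stand-alone sketch of that equivalence (independent of the repo's `inputs` helper; function name is mine):

```python
def count_window_increases(depths, window=3):
    """Count positions where a sliding-window sum exceeds the previous one.

    Windows i and i-1 overlap in all but their end elements, so
    sum(window i) > sum(window i-1) iff depths[i] > depths[i - window].
    """
    return sum(
        1 for i in range(window, len(depths))
        if depths[i] > depths[i - window]
    )
```

With `window=1` the same function solves part 1, which is why the two parts differ only in the offset being compared.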