hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | 
qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
9c13699f5f2423fa33e54734cadfdc08430a5008 | 14,575 | py | Python | webrtc_pkg/webrtc_pkg/RTCCam/rtc_cam.py | Road-Balance/rb_nanosaur | cc17d184e926f3a021698e6f52e916b107dc0fc5 | [
"RSA-MD"
] | 7 | 2021-05-29T09:28:08.000Z | 2021-12-11T10:57:10.000Z | webrtc_pkg/webrtc_pkg/RTCCam/rtc_cam.py | Road-Balance/rb_nanosaur | cc17d184e926f3a021698e6f52e916b107dc0fc5 | [
"RSA-MD"
] | 1 | 2021-05-31T20:49:13.000Z | 2021-06-01T11:39:16.000Z | webrtc_pkg/webrtc_pkg/RTCCam/rtc_cam.py | Road-Balance/rb_nanosaur | cc17d184e926f3a021698e6f52e916b107dc0fc5 | [
"RSA-MD"
] | 1 | 2022-03-29T08:38:57.000Z | 2022-03-29T08:38:57.000Z | import av  # import PyAV
import cv2
import gi
import time
import logging
import asyncio
import numpy as np
from rtcbot import CVCamera, CVDisplay
gi.require_version("Gst", "1.0")
from gi.repository import GObject, Gst
# TODO: Where should this be placed?
Gst.init(None)
class WebCam(CVCamera):
    # TODO: set the camera number (0, 1, etc.)
_log = logging.getLogger("rtcbot.WebCam")
def __init__(
self,
width=640, # 320,
height=480, # 240,
camID=0,
fps=30,
preprocessframe=lambda x: x,
loop=None,
):
self._width = width
self._height = height
self._cameranumber = camID
self._fps = fps
self._processframe = preprocessframe
self._is_camera_on = False
super().__init__(
self._width, self._height, self._cameranumber, self._fps, self._processframe
)
def _producer(self):
"""
Runs the actual frame capturing code.
"""
cap = cv2.VideoCapture(self._cameranumber)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, self._width)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, self._height)
cap.set(cv2.CAP_PROP_FPS, self._fps)
        if not self._is_camera_on:
ret, frame = cap.read()
if not ret:
self._log.error("Camera Read Failed %s", str(ret))
cap.release()
self._setError(ret)
return
else:
self._is_camera_on = True
self._log.debug("Camera Ready")
self._setReady(True)
while not self._shouldClose:
ret, frame = cap.read()
if not ret:
self._log.error("CV read error %s", str(ret))
else:
                # Apply the optional user-supplied preprocessing function
                # (defaults to the identity x -> x); re-enabled here so the
                # preprocessframe argument is not silently ignored.
                frame = self._processframe(frame)
# Send the frame to all subscribers
self._put_nowait(frame)
cap.release()
self._setReady(False)
class CSICam(CVCamera):
"""
GSTCam For Jetson Nano
"""
_log = logging.getLogger("rtcbot.CSICam")
def __init__(
self,
width=640,
height=480,
camID=0,
fps=60,
flip_method=2,
preprocessframe=lambda x: x,
loop=None,
capture_mode="CV",
):
self._width = width
self._height = height
self._cameranumber = camID
self._fps = fps
self._flip_method = flip_method
self._processframe = preprocessframe
self._capture_mode = capture_mode
self._is_camera_on = False
super().__init__(
self._width, self._height, self._cameranumber, self._fps, self._processframe
)
def gstreamer_pipeline(
self,
capture_width=1280,
capture_height=720,
framerate=60,
flip_method=2,
):
return (
"nvarguscamerasrc sensor_id=%d ! "
"video/x-raw(memory:NVMM), "
"width=(int)%d, height=(int)%d, "
"format=(string)NV12, framerate=(fraction)%d/1 ! "
"nvvidconv flip-method=%d ! "
"video/x-raw, width=(int)%d, height=(int)%d, format=(string)BGRx ! "
"videoconvert ! "
"video/x-raw, format=(string)BGR ! "
"appsink"
% (
self._cameranumber,
capture_width,
capture_height,
self._fps,
flip_method,
self._width,
self._height,
)
)
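    # For reference, with the defaults above (camID=0, fps=60, flip_method=2,
    # width=640, height=480) the generated launch string looks roughly like:
    #
    #   nvarguscamerasrc sensor_id=0 !
    #   video/x-raw(memory:NVMM), width=(int)1280, height=(int)720,
    #   format=(string)NV12, framerate=(fraction)60/1 !
    #   nvvidconv flip-method=2 !
    #   video/x-raw, width=(int)640, height=(int)480, format=(string)BGRx !
    #   videoconvert ! video/x-raw, format=(string)BGR !
    #   appsink name=sink emit-signals=true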
def gst_to_opencv(self, sample):
buf = sample.get_buffer()
caps = sample.get_caps()
# print(caps.get_structure(0).get_value("format"))
# print(caps.get_structure(0).get_value("height"))
# print(caps.get_structure(0).get_value("width"))
# print(buf.get_size())
arr = np.ndarray(
(
caps.get_structure(0).get_value("height"),
caps.get_structure(0).get_value("width"),
3,
),
buffer=buf.extract_dup(0, buf.get_size()),
dtype=np.uint8,
)
return arr
def _producer(self):
"""
Runs the actual frame capturing code.
"""
gst_cmd = self.gstreamer_pipeline(
capture_width=1280, capture_height=720, flip_method=self._flip_method
)
print(gst_cmd)
if self._capture_mode == "GST":
pipeline = Gst.parse_launch(gst_cmd)
sink = pipeline.get_by_name("sink")
pipeline.set_state(Gst.State.PLAYING)
elif self._capture_mode == "CV":
print("CV Mode")
cap = cv2.VideoCapture(gst_cmd, cv2.CAP_GSTREAMER)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, self._width)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, self._height)
cap.set(cv2.CAP_PROP_FPS, self._fps)
        # The warm-up read only applies to the OpenCV capture path; in "GST"
        # mode there is no `cap` object, and reading from it would raise a
        # NameError.
        if self._capture_mode == "CV" and not self._is_camera_on:
            ret, frame = cap.read()
            if not ret:
                self._log.error("Camera Read Failed %s", str(ret))
                cap.release()
                self._setError(ret)
                return
            else:
                self._is_camera_on = True
                self._log.debug("Camera Ready")
        self._setReady(True)
while not self._shouldClose:
if self._capture_mode == "GST":
sample = sink.emit("pull-sample")
if not sample:
continue
self._log.error("GST read error")
else:
new_frame = self.gst_to_opencv(sample)
self._put_nowait(new_frame)
elif self._capture_mode == "CV":
ret, frame = cap.read()
if not ret:
self._log.error("CV read error %s", str(ret))
else:
self._put_nowait(frame)
if self._capture_mode == "CV":
cap.release()
pipeline.set_state(Gst.State.NULL)
self._setReady(False)
self._log.info("Ended camera capture")
class GSTCam(CVCamera):
"""
    Captures frames from a V4L2 camera through a GStreamer pipeline and
    delivers them as OpenCV-style BGR arrays.

    An optional preprocessing function can be supplied at initialization; it
    receives each frame as it is read and returns the modified version.
    Preprocessing runs synchronously in the capture thread, so it should be
    fast and avoid pure Python code (due to the GIL); NumPy and OpenCV
    operations are fine.
"""
_log = logging.getLogger("rtcbot.GSTCam")
def __init__(
self,
width=640,
height=480,
camID=0,
fps=30,
preprocessframe=lambda x: x,
loop=None,
):
self._width = width
self._height = height
self._cameranumber = camID
self._fps = fps
self._processframe = preprocessframe
self._is_camera_on = False
super().__init__(
self._width, self._height, self._cameranumber, self._fps, self._processframe
)
def gstreamer_pipeline(
self,
capture_width=640,
capture_height=480,
display_width=640,
display_height=480,
framerate=30,
flip_method=0,
):
return (
"v4l2src device=/dev/video%d ! "
"videoconvert ! videorate ! "
"video/x-raw, framerate=%d/1, width=%d, height=%d, format=(string)BGR ! "
"videoconvert ! "
"appsink sync=false max-buffers=2 drop=true name=sink emit-signals=true"
% (
self._cameranumber,
self._fps,
self._width,
self._height,
)
)
def gst_to_opencv(self, sample):
buf = sample.get_buffer()
caps = sample.get_caps()
# print(caps.get_structure(0).get_value("format"))
# print(caps.get_structure(0).get_value("height"))
# print(caps.get_structure(0).get_value("width"))
# print(buf.get_size())
arr = np.ndarray(
(
caps.get_structure(0).get_value("height"),
caps.get_structure(0).get_value("width"),
3,
),
buffer=buf.extract_dup(0, buf.get_size()),
dtype=np.uint8,
)
return arr
def _producer(self):
"""
Runs the actual frame capturing code.
"""
gst_cmd = self.gstreamer_pipeline()
print(gst_cmd)
pipeline = Gst.parse_launch(gst_cmd)
sink = pipeline.get_by_name("sink")
pipeline.set_state(Gst.State.PLAYING)
        # There is no OpenCV warm-up read on this path; the source is marked
        # ready once the pipeline reaches the PLAYING state.
        self._setReady(True)
        while not self._shouldClose:
            sample = sink.emit("pull-sample")
            if not sample:
                continue
            new_frame = self.gst_to_opencv(sample)
            # Send the frame to all subscribers
            self._put_nowait(new_frame)
pipeline.set_state(Gst.State.NULL)
self._setReady(False)
self._log.info("Ended camera capture")
class GSTCamH264(CVCamera):
    _log = logging.getLogger("rtcbot.GSTCamH264")
def __init__(
self,
width=640,
height=480,
camID=0,
fps=30,
preprocessframe=lambda x: x,
loop=None,
):
self._width = width
self._height = height
self._cameranumber = camID
self._fps = fps
self._processframe = preprocessframe
self._is_camera_on = False
super().__init__(
self._width, self._height, self._cameranumber, self._fps, self._processframe
)
def gstreamer_pipeline(
self,
capture_width=640,
capture_height=480,
display_width=640,
display_height=480,
framerate=30,
flip_method=0,
):
return (
"v4l2src device=/dev/video%d ! "
"videoconvert ! videorate ! "
"video/x-raw, framerate=%d/1, width=%d, height=%d ! "
"videoconvert ! x264enc tune=zerolatency ! "
"appsink sync=false max-buffers=2 drop=true name=sink emit-signals=true"
% (
self._cameranumber,
self._fps,
self._width,
self._height,
)
)
    def gst_parse(self, sample):
        # Unlike gst_to_opencv in the classes above, the x264enc pipeline
        # delivers an encoded H.264 bytestream, so the buffer is returned as
        # raw bytes instead of being reshaped into an image array.
        buf = sample.get_buffer()
        return buf.extract_dup(0, buf.get_size())
def _producer(self):
"""
Runs the actual frame capturing code.
"""
gst_cmd = self.gstreamer_pipeline()
print(gst_cmd)
pipeline = Gst.parse_launch(gst_cmd)
sink = pipeline.get_by_name("sink")
pipeline.set_state(Gst.State.PLAYING)
        while not self._shouldClose:
            sample = sink.emit("pull-sample")
            if not sample:
                continue
            new_frame = self.gst_parse(sample)
            # Send the encoded chunk to all subscribers
            self._put_nowait(new_frame)
pipeline.set_state(Gst.State.NULL)
self._setReady(False)
self._log.info("Ended camera capture")
class RawCam(CVCamera):
"""
Uses a camera supported by OpenCV.
When initializing, can give an optional function which preprocesses frames as they are read, and returns the
modified versions thereof. Please note that the preprocessing happens synchronously in the camera capture thread,
so any processing should be relatively fast, and should avoid pure python code due to the GIL. Numpy and openCV functions
should be OK.
"""
_log = logging.getLogger("rtcbot.RawCam")
def __init__(
self,
width=640,
height=480,
camID=0,
fps=30,
preprocessframe=lambda x: x,
loop=None,
):
self._width = width
self._height = height
self._cameranumber = camID
self._fps = fps
self._processframe = preprocessframe
self._is_camera_on = False
self._container = av.open(
f"/dev/video{self._cameranumber}",
options={"s": "1280x720", "framerate": "60"},
)
self._video = self._container.streams.video[0]
super().__init__(
self._width, self._height, self._cameranumber, self._fps, self._processframe
)
def _producer(self):
"""
Runs the actual frame capturing code.
"""
frames = self._container.decode(video=0)
while not self._shouldClose:
            frame = next(frames)
            # to_image() yields a PIL image; note that the other camera
            # classes emit OpenCV-style numpy arrays instead.
            img = frame.to_image()
            self._put_nowait(img)
self._log.info("Ended camera capture")
if __name__ == "__main__":
# camera = WebCam(camID=0)
camera = CSICam(camID=0)
# camera = GSTCam(camID=0)
# camera = RawCam(camID=0)
display = CVDisplay()
frameSubscription = camera.subscribe()
display.putSubscription(frameSubscription)
try:
asyncio.get_event_loop().run_forever()
finally:
camera.close()
display.close()
| 28.137066 | 125 | 0.539417 | 1,596 | 14,575 | 4.701754 | 0.157268 | 0.023987 | 0.025986 | 0.027186 | 0.79451 | 0.775053 | 0.760128 | 0.751199 | 0.742937 | 0.737473 | 0 | 0.018378 | 0.357873 | 14,575 | 517 | 126 | 28.191489 | 0.783417 | 0.175094 | 0 | 0.750733 | 0 | 0.005865 | 0.09578 | 0.006708 | 0 | 0 | 0 | 0.001934 | 0 | 1 | 0.046921 | false | 0 | 0.026393 | 0.008798 | 0.1261 | 0.01173 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
9c7c379a5c997bda1208504175bb14147bd121dd | 216 | py | Python | grr/core/grr_response_core/lib/local/plugins.py | tsehori/grr | 048506f22f74642bfe61749069a45ddf496fdab3 | [
"Apache-2.0"
] | 1 | 2021-07-01T01:43:06.000Z | 2021-07-01T01:43:06.000Z | grr/core/grr_response_core/lib/local/plugins.py | tsehori/grr | 048506f22f74642bfe61749069a45ddf496fdab3 | [
"Apache-2.0"
] | 44 | 2021-05-14T22:49:24.000Z | 2022-03-13T21:54:02.000Z | grr/core/grr_response_core/lib/local/plugins.py | tsehori/grr | 048506f22f74642bfe61749069a45ddf496fdab3 | [
"Apache-2.0"
] | 1 | 2020-06-25T14:25:54.000Z | 2020-06-25T14:25:54.000Z | #!/usr/bin/env python
# Lint as: python3
"""Imports for local site-specific plugins implementations."""
from __future__ import absolute_import
from __future__ import division
from __future__ import unicode_literals
| 27 | 62 | 0.814815 | 28 | 216 | 5.785714 | 0.75 | 0.185185 | 0.296296 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005236 | 0.115741 | 216 | 7 | 63 | 30.857143 | 0.842932 | 0.435185 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
92c54f2d7c97659f7ead59ac21d6cf640c43c21b | 29 | py | Python | euclidean/R2/polygon/__init__.py | keystonetowersystems/euclidean | 72965b5ea1b0d70438376024d0c9a14457bdfb13 | [
"MIT"
] | 1 | 2021-05-26T19:18:38.000Z | 2021-05-26T19:18:38.000Z | euclidean/R2/polygon/__init__.py | keystonetowersystems/euclidean | 72965b5ea1b0d70438376024d0c9a14457bdfb13 | [
"MIT"
] | 1 | 2021-06-30T14:13:13.000Z | 2021-06-30T15:34:33.000Z | euclidean/R2/polygon/__init__.py | keystonetowersystems/euclidean | 72965b5ea1b0d70438376024d0c9a14457bdfb13 | [
"MIT"
] | null | null | null | from .polygon import Polygon
| 14.5 | 28 | 0.827586 | 4 | 29 | 6 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137931 | 29 | 1 | 29 | 29 | 0.96 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
131a00742f30578928d28962a994fce8d23d0052 | 138 | py | Python | dashboard/accounts/views.py | bibekebib/improved-dollop | 6a55296c5193906380e3bd6c7edb78383c0b183b | [
"MIT"
] | null | null | null | dashboard/accounts/views.py | bibekebib/improved-dollop | 6a55296c5193906380e3bd6c7edb78383c0b183b | [
"MIT"
] | null | null | null | dashboard/accounts/views.py | bibekebib/improved-dollop | 6a55296c5193906380e3bd6c7edb78383c0b183b | [
"MIT"
] | null | null | null | from django.shortcuts import render
# Create your views here.
def Home(request):
return render(request, 'accounts/dashboard.html')
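# A hypothetical URLconf entry wiring this view up (route and name are
# illustrative, not taken from this project):
#
#   from django.urls import path
#   from .views import Home
#
#   urlpatterns = [path('', Home, name='dashboard-home')]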
| 17.25 | 53 | 0.753623 | 18 | 138 | 5.777778 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.152174 | 138 | 7 | 54 | 19.714286 | 0.888889 | 0.166667 | 0 | 0 | 0 | 0 | 0.20354 | 0.20354 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
13234bda2815b2d224544d502b964d5c52dfc7c1 | 6,330 | py | Python | tests/test_chatbase.py | Olegt0rr/aioChatbase | 8a922d1b400fd67af9f0914d7cbf86dc784f7179 | [
"MIT"
] | 16 | 2018-07-04T08:44:37.000Z | 2021-12-09T21:03:29.000Z | tests/test_chatbase.py | Olegt0rr/aioChatbase | 8a922d1b400fd67af9f0914d7cbf86dc784f7179 | [
"MIT"
] | 6 | 2018-06-21T22:18:49.000Z | 2018-08-03T08:43:29.000Z | tests/test_chatbase.py | Olegt0rr/aioChatbase | 8a922d1b400fd67af9f0914d7cbf86dc784f7179 | [
"MIT"
] | 3 | 2018-08-03T19:34:56.000Z | 2018-11-18T13:10:07.000Z | import pytest
import logging
import asyncio
from aiochatbase import Chatbase
from aiochatbase import types
from . import FakeChatbaseServer, CLICK_RESPONSE_DICT, BULK_RESPONSE_DICT, EVENT_RESPONSE_DICT
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger('TrueModerTest')
pytestmark = pytest.mark.asyncio
CHATBASE_TOKEN = '123456789:AABBCCDDEEFFaabbccddeeff-1234567890'
CHATBOT_PLATFORM = 'TestPlatform'
USER_ID = '123456'
MESSAGE_TEXT = 'test message text'
INTENT = 'Another message'
CB_MESSAGE_ID = '12345'
@pytest.fixture
def cb(event_loop: asyncio.AbstractEventLoop):
_chatbase = Chatbase(CHATBASE_TOKEN, CHATBOT_PLATFORM, loop=event_loop)
yield _chatbase
event_loop.run_until_complete(_chatbase.close())
async def test_cb_init_without_loop(event_loop):
chatbase = Chatbase(CHATBASE_TOKEN, CHATBOT_PLATFORM)
await chatbase.close()
async def test_prepare_message(cb, event_loop):
msg = await cb.prepare_message(user_id=USER_ID, intent=INTENT, message=MESSAGE_TEXT)
assert isinstance(msg, types.Message)
assert msg.user_id == USER_ID
assert msg.intent == INTENT
assert msg.message == MESSAGE_TEXT
async def test_register_message(cb, event_loop):
""" Registering message in basic mode """
async with FakeChatbaseServer(message_dict={'message_id': CB_MESSAGE_ID, 'status': 200}, loop=event_loop):
result = await cb.register_message(user_id=USER_ID, intent=INTENT)
assert result == CB_MESSAGE_ID
async def test_register_message_without_task(cb, event_loop):
""" Registering message in basic mode strongly without task """
async with FakeChatbaseServer(message_dict={'message_id': CB_MESSAGE_ID, 'status': 200}, loop=event_loop):
result = await cb.register_message(user_id=USER_ID, intent=INTENT, task=False)
assert result == CB_MESSAGE_ID
async def test_register_message_with_task(cb, event_loop):
""" Registering message in basic mode with task """
async with FakeChatbaseServer(message_dict={'message_id': CB_MESSAGE_ID, 'status': 200}, loop=event_loop):
result = await cb.register_message(user_id=USER_ID, intent=INTENT, task=True)
assert isinstance(result, asyncio.Task)
done, pending = await asyncio.wait([result], return_when=asyncio.ALL_COMPLETED)
assert done.pop().result() == CB_MESSAGE_ID
async def test_register_messages(cb, event_loop):
msg_1 = await cb.prepare_message('1', 'test bulk', message=MESSAGE_TEXT)
msg_2 = await cb.prepare_message('2', 'test bulk', not_handled=True)
msg_3 = await cb.prepare_message('3', 'test bulk', version='Test', session_id='12345')
messages_list = [msg_1, msg_2, msg_3]
async with FakeChatbaseServer(message_dict=BULK_RESPONSE_DICT, loop=event_loop):
result = await cb.register_messages(messages_list)
assert result == [5917431215, 5917431216, 5917431217]
async def test_register_messages_without_task(cb, event_loop):
msg_1 = await cb.prepare_message('1', 'test bulk')
msg_2 = await cb.prepare_message('2', 'test bulk')
msg_3 = await cb.prepare_message('3', 'test bulk')
messages_list = [msg_1, msg_2, msg_3]
async with FakeChatbaseServer(message_dict=BULK_RESPONSE_DICT, loop=event_loop):
result = await cb.register_messages(messages_list, task=False)
assert result == [5917431215, 5917431216, 5917431217]
async def test_register_messages_with_task(cb, event_loop):
msg_1 = await cb.prepare_message('1', 'test bulk')
msg_2 = await cb.prepare_message('2', 'test bulk')
msg_3 = await cb.prepare_message('3', 'test bulk')
messages_list = [msg_1, msg_2, msg_3]
async with FakeChatbaseServer(message_dict=BULK_RESPONSE_DICT, loop=event_loop):
result = await cb.register_messages(messages_list, task=True)
assert isinstance(result, asyncio.Task)
done, pending = await asyncio.wait([result], return_when=asyncio.ALL_COMPLETED)
assert done.pop().result() == [5917431215, 5917431216, 5917431217]
async def test_register_click(cb, event_loop):
async with FakeChatbaseServer(message_dict=CLICK_RESPONSE_DICT, loop=event_loop):
result = await cb.register_click(url='google.com')
assert result is True
async def test_register_click_without_task(cb, event_loop):
async with FakeChatbaseServer(message_dict=CLICK_RESPONSE_DICT, loop=event_loop):
result = await cb.register_click(url='google.com', task=False)
assert result is True
async def test_register_click_with_task(cb, event_loop):
async with FakeChatbaseServer(message_dict=CLICK_RESPONSE_DICT, loop=event_loop):
result = await cb.register_click(url='google.com', task=True)
assert isinstance(result, asyncio.Task)
done, pending = await asyncio.wait([result], return_when=asyncio.ALL_COMPLETED)
assert done.pop().result() is True
async def test_register_event(cb, event_loop):
any_dict = {
'property 1 (int)': 1,
'property 2 (str)': 'two',
'property 3 (float)': 3.0,
'property 4 (bool)': True,
}
async with FakeChatbaseServer(message_dict=EVENT_RESPONSE_DICT, loop=event_loop):
result = await cb.register_event('123456', 'test event', properties=any_dict, version='TestVersion')
assert result is True
async def test_register_event_without_task(cb, event_loop):
any_dict = {
'property 1 (int)': 1,
'property 2 (str)': 'two',
'property 3 (float)': 3.0,
'property 4 (bool)': True,
}
async with FakeChatbaseServer(message_dict=EVENT_RESPONSE_DICT, loop=event_loop):
result = await cb.register_event('123456', 'test event', properties=any_dict, task=False)
assert result is True
async def test_register_event_with_task(cb, event_loop):
any_dict = {
'property 1 (int)': 1,
'property 2 (str)': 'two',
'property 3 (float)': 3.0,
'property 4 (bool)': True,
}
async with FakeChatbaseServer(message_dict=EVENT_RESPONSE_DICT, loop=event_loop):
result = await cb.register_event('123456', 'test event', properties=any_dict, task=True)
assert isinstance(result, asyncio.Task)
done, pending = await asyncio.wait([result], return_when=asyncio.ALL_COMPLETED)
assert done.pop().result() is True
| 40.576923 | 110 | 0.726224 | 853 | 6,330 | 5.134818 | 0.121923 | 0.059589 | 0.03516 | 0.054795 | 0.811416 | 0.780822 | 0.760731 | 0.753653 | 0.72032 | 0.662785 | 0 | 0.037972 | 0.16793 | 6,330 | 155 | 111 | 40.83871 | 0.793621 | 0 | 0 | 0.469565 | 0 | 0 | 0.090526 | 0.007287 | 0 | 0 | 0 | 0 | 0.173913 | 1 | 0.008696 | false | 0 | 0.052174 | 0 | 0.06087 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
135f6e2207c67c524197aafa3abfdcf16d1aef8d | 1,025 | py | Python | 046 Remove Kth Node From End/Remove_Kth_Node_From_End.py | Iftakharpy/AlgoExpert-Questions | f4aef449bfe0ee651d84a92487c3b3bedb3aa739 | [
"Apache-2.0"
] | 3 | 2021-11-19T07:32:27.000Z | 2022-03-22T13:46:27.000Z | 046 Remove Kth Node From End/Remove_Kth_Node_From_End.py | Iftakharpy/AlgoExpert-Questions | f4aef449bfe0ee651d84a92487c3b3bedb3aa739 | [
"Apache-2.0"
] | null | null | null | 046 Remove Kth Node From End/Remove_Kth_Node_From_End.py | Iftakharpy/AlgoExpert-Questions | f4aef449bfe0ee651d84a92487c3b3bedb3aa739 | [
"Apache-2.0"
] | 5 | 2022-01-02T11:51:12.000Z | 2022-03-22T13:53:32.000Z | class LinkedList:
def __init__(self, value):
self.value = value
self.next = None
# time O(n) | space O(1)
def removeKthNodeFromEnd(head, k):
endFinder = head
for i in range(k):
endFinder = endFinder.next
if endFinder is None:
head.value = head.next.value
head.next = head.next.next
return
node = head
while endFinder.next is not None:
node = node.next
endFinder = endFinder.next
node.next = node.next.next
# time O(n) | space O(1)
# NOTE: this second definition shadows the one above at import time; it is an
# alternative that avoids the special-case head handling by walking from a
# dummy predecessor node.
def removeKthNodeFromEnd(head, k):
endFinder = head
for i in range(k):
endFinder = endFinder.next
if endFinder is None:
head.value = head.next.value
head.next = head.next.next
return
# Little variation here
node = LinkedList(None)
node.next = head
while endFinder is not None:
node = node.next
endFinder = endFinder.next
node.next = node.next.next
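# Worked sketch (hypothetical list): removing the 2nd node from the end of
# 0 -> 1 -> 2 -> 3 should leave 0 -> 1 -> 3.
#
#   head = LinkedList(0)
#   head.next = LinkedList(1)
#   head.next.next = LinkedList(2)
#   head.next.next.next = LinkedList(3)
#   removeKthNodeFromEnd(head, 2)  # head is now 0 -> 1 -> 3
#
# endFinder is advanced k nodes ahead of node, so when it falls off the end,
# node sits just before the node to unlink.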
| 23.837209 | 38 | 0.567805 | 126 | 1,025 | 4.587302 | 0.238095 | 0.096886 | 0.152249 | 0.038062 | 0.747405 | 0.747405 | 0.747405 | 0.747405 | 0.747405 | 0.747405 | 0 | 0.002999 | 0.349268 | 1,025 | 42 | 39 | 24.404762 | 0.863568 | 0.065366 | 0 | 0.709677 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.096774 | false | 0 | 0 | 0 | 0.193548 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
136da822f0fe864ca564bc76c195704d3a5bc793 | 83 | py | Python | mlaction/test/test_mlaction.py | iOSDevLog/mlaction | 4c5efd9bc0320f1a66760f997eb8477abcb3e688 | [
"MIT"
] | null | null | null | mlaction/test/test_mlaction.py | iOSDevLog/mlaction | 4c5efd9bc0320f1a66760f997eb8477abcb3e688 | [
"MIT"
] | 1 | 2019-06-11T06:59:51.000Z | 2019-06-12T13:31:44.000Z | mlaction/test/test_mlaction.py | iOSDevLog/mlaction | 4c5efd9bc0320f1a66760f997eb8477abcb3e688 | [
"MIT"
] | null | null | null | import mlaction
def test_mlaction_name():
assert mlaction.name == "mlaction"
| 13.833333 | 38 | 0.73494 | 10 | 83 | 5.9 | 0.6 | 0.40678 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.168675 | 83 | 5 | 39 | 16.6 | 0.855072 | 0 | 0 | 0 | 0 | 0 | 0.096386 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
137a360d44f663f87c43c18392046f6ea4934a4f | 166 | py | Python | tests/example.py | emfree/systemtap-python-tools | badc02f886cd738b3afd1ba143e6b84e6408b280 | [
"MIT"
] | 45 | 2016-08-20T23:57:23.000Z | 2021-08-23T13:11:38.000Z | tests/example.py | emfree/pystap | badc02f886cd738b3afd1ba143e6b84e6408b280 | [
"MIT"
] | 2 | 2016-07-25T19:31:36.000Z | 2016-08-04T22:59:43.000Z | tests/example.py | emfree/systemtap-python-tools | badc02f886cd738b3afd1ba143e6b84e6408b280 | [
"MIT"
] | 6 | 2016-10-09T03:31:26.000Z | 2020-02-16T10:13:01.000Z | def callee_a():
pass
def callee_b():
callee_c()
def callee_c():
pass
def caller():
callee_a()
callee_b()
callee_a()
while True:
caller()
| 9.764706 | 15 | 0.584337 | 24 | 166 | 3.75 | 0.375 | 0.3 | 0.288889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.277108 | 166 | 16 | 16 | 10.375 | 0.75 | 0 | 0 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0.166667 | 0 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
138cadd857540752a519ef76bf26a49f420783d8 | 49 | py | Python | xonsh/ptk2/__init__.py | ion201/xonsh | 7cf0307a0d53d198b8c05c83456d86af14c0daa4 | [
"BSD-2-Clause-FreeBSD"
] | 4,716 | 2016-06-07T05:48:42.000Z | 2022-03-31T22:30:15.000Z | xonsh/ptk2/__init__.py | ion201/xonsh | 7cf0307a0d53d198b8c05c83456d86af14c0daa4 | [
"BSD-2-Clause-FreeBSD"
] | 3,644 | 2016-06-07T05:55:42.000Z | 2022-03-31T13:25:57.000Z | xonsh/ptk2/__init__.py | ion201/xonsh | 7cf0307a0d53d198b8c05c83456d86af14c0daa4 | [
"BSD-2-Clause-FreeBSD"
] | 576 | 2016-06-07T06:28:32.000Z | 2022-03-31T02:46:15.000Z | from xonsh.ptk_shell import * # noqa: F403 F401
| 24.5 | 48 | 0.734694 | 8 | 49 | 4.375 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15 | 0.183673 | 49 | 1 | 49 | 49 | 0.725 | 0.306122 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
13c64034abce30546ff60ba40d05421e6d6f8f81 | 32 | py | Python | stubbs/defs/snde.py | holy-crust/reclaimer | 0aa693da3866ce7999c68d5f71f31a9c932cdb2c | [
"MIT"
] | null | null | null | stubbs/defs/snde.py | holy-crust/reclaimer | 0aa693da3866ce7999c68d5f71f31a9c932cdb2c | [
"MIT"
] | null | null | null | stubbs/defs/snde.py | holy-crust/reclaimer | 0aa693da3866ce7999c68d5f71f31a9c932cdb2c | [
"MIT"
] | null | null | null | from ...hek.defs.snde import *
| 16 | 31 | 0.65625 | 5 | 32 | 4.2 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15625 | 32 | 1 | 32 | 32 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
13d8a12153d834d34e95080003941e8341a48d8e | 292 | py | Python | analysis/spec/qtensor_specs/__init__.py | marwahaha/QTensor | 936d078825a6418f9d32d2c176332422d8a4c137 | [
"BSD-3-Clause"
] | 20 | 2020-09-08T20:32:44.000Z | 2022-03-18T11:27:57.000Z | analysis/spec/qtensor_specs/__init__.py | marwahaha/QTensor | 936d078825a6418f9d32d2c176332422d8a4c137 | [
"BSD-3-Clause"
] | 21 | 2020-10-09T04:44:48.000Z | 2021-10-05T03:32:35.000Z | analysis/spec/qtensor_specs/__init__.py | marwahaha/QTensor | 936d078825a6418f9d32d2c176332422d8a4c137 | [
"BSD-3-Clause"
] | 4 | 2020-12-18T01:37:10.000Z | 2021-07-26T21:24:20.000Z | # AUTOGENERATED! DO NOT EDIT! File to edit: notebooks/index.ipynb (unless otherwise specified).
__all__ = ['cli']
# Cell
import click
@click.group()
def cli():
pass
from qtensor_specs import speed_comparison
from qtensor_specs import qaoa_bench
from qtensor_specs import time_vs_flop
| 19.466667 | 95 | 0.777397 | 42 | 292 | 5.142857 | 0.714286 | 0.152778 | 0.222222 | 0.305556 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.14726 | 292 | 14 | 96 | 20.857143 | 0.86747 | 0.335616 | 0 | 0 | 1 | 0 | 0.015707 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0.125 | 0.5 | 0 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
13e3d78e611402c5e67e5f8af423596f6140ccec | 36 | py | Python | subtractor/__init__.py | shimomura314/Arithmetic-Logic-Unit | a547d2fe4ac3cbc2e38a64d654e26141f4c9c81a | [
"MIT"
] | null | null | null | subtractor/__init__.py | shimomura314/Arithmetic-Logic-Unit | a547d2fe4ac3cbc2e38a64d654e26141f4c9c81a | [
"MIT"
] | null | null | null | subtractor/__init__.py | shimomura314/Arithmetic-Logic-Unit | a547d2fe4ac3cbc2e38a64d654e26141f4c9c81a | [
"MIT"
] | null | null | null | from .substractor import substractor | 36 | 36 | 0.888889 | 4 | 36 | 8 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 36 | 1 | 36 | 36 | 0.969697 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
b9302aecec322b8b58dd709cf29bac47644cf1eb | 4,711 | py | Python | models/classifier.py | iQua/InfoCensor | 0fcc840c3a57354e9fadfaed4545284b5e0d79f6 | [
"Apache-2.0"
] | null | null | null | models/classifier.py | iQua/InfoCensor | 0fcc840c3a57354e9fadfaed4545284b5e0d79f6 | [
"Apache-2.0"
] | null | null | null | models/classifier.py | iQua/InfoCensor | 0fcc840c3a57354e9fadfaed4545284b5e0d79f6 | [
"Apache-2.0"
] | 1 | 2022-03-09T20:27:31.000Z | 2022-03-09T20:27:31.000Z | import torch
from torch import nn
from torch.nn import functional as F
class mlp_classifier(nn.Module):
def __init__(self, in_dim, hidden_dims=None, bn=True, drop_rate=0.0, num_classes=2):
super(mlp_classifier, self).__init__()
self.drop_rate = drop_rate
modules = []
if hidden_dims is None:
hidden_dims = []
hidden_dims = [in_dim] + hidden_dims
for layer_idx in range(len(hidden_dims)-1):
if bn:
modules.append(
nn.Sequential(
nn.Linear(hidden_dims[layer_idx], hidden_dims[layer_idx+1]),
nn.BatchNorm1d(hidden_dims[layer_idx+1]),
nn.ReLU(),
nn.Dropout(drop_rate))
)
else:
modules.append(
nn.Sequential(
nn.Linear(hidden_dims[layer_idx], hidden_dims[layer_idx+1]),
nn.ReLU(),
nn.Dropout(drop_rate))
)
self.features = None if len(modules) == 0 else nn.Sequential(*modules)
self.logits = nn.Linear(hidden_dims[-1], num_classes)
def forward(self, input):
features = F.dropout(input, p=self.drop_rate, training=self.training)
if self.features is not None: features = self.features(features)
return self.logits(features)
class binary_classifier(nn.Module):
def __init__(self, in_dim, hidden_dims=None, bn=True, drop_rate=0.0):
super(binary_classifier, self).__init__()
self.drop_rate = drop_rate
modules = []
if hidden_dims is None:
hidden_dims = []
hidden_dims = [in_dim] + hidden_dims
for layer_idx in range(len(hidden_dims)-1):
if bn:
modules.append(
nn.Sequential(
nn.Linear(hidden_dims[layer_idx], hidden_dims[layer_idx+1]),
nn.BatchNorm1d(hidden_dims[layer_idx+1]),
nn.ReLU(),
nn.Dropout(drop_rate))
)
else:
modules.append(
nn.Sequential(
nn.Linear(hidden_dims[layer_idx], hidden_dims[layer_idx+1]),
nn.ReLU(),
nn.Dropout(drop_rate))
)
self.features = None if len(modules) == 0 else nn.Sequential(*modules)
self.logit = nn.Linear(hidden_dims[-1], 1)
self.sigmoid = nn.Sigmoid()
def forward(self, input):
features = F.dropout(input, p=self.drop_rate, training=self.training)
if self.features is not None: features = self.features(features)
return self.sigmoid(self.logit(features))
class vgg_classifier(nn.Module):
def __init__(self, num_classes=2):
super(vgg_classifier, self).__init__()
self.convnet = nn.Sequential(
nn.Conv2d(128, 256, kernel_size=3, padding=1),
nn.BatchNorm2d(256),
nn.ReLU(inplace=True),
nn.Conv2d(256, 256, kernel_size=3, padding=1),
nn.BatchNorm2d(256),
nn.ReLU(inplace=True),
nn.Conv2d(256, 256, kernel_size=3, padding=1),
nn.BatchNorm2d(256),
nn.ReLU(inplace=True),
nn.MaxPool2d(kernel_size=2, stride=2),
nn.Conv2d(256, 512, kernel_size=3, padding=1),
nn.BatchNorm2d(512),
nn.ReLU(inplace=True),
nn.Conv2d(512, 512, kernel_size=3, padding=1),
nn.BatchNorm2d(512),
nn.ReLU(inplace=True),
nn.Conv2d(512, 512, kernel_size=3, padding=1),
nn.BatchNorm2d(512),
nn.ReLU(inplace=True),
nn.MaxPool2d(kernel_size=2, stride=2),
nn.Conv2d(512, 512, kernel_size=3, padding=1),
nn.BatchNorm2d(512),
nn.ReLU(inplace=True),
nn.Conv2d(512, 512, kernel_size=3, padding=1),
nn.BatchNorm2d(512),
nn.ReLU(inplace=True),
nn.Conv2d(512, 512, kernel_size=3, padding=1),
nn.BatchNorm2d(512),
nn.ReLU(inplace=True),
nn.MaxPool2d(kernel_size=2, stride=2),
)
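        # Shape note (inferred from the layer sizes, not stated in the code):
        # three stride-2 poolings take an 8x8, 128-channel input feature map
        # down to 1x1x512, matching the 512 * 1 input of the first linear
        # layer below.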
self.fcnet = nn.Sequential(
nn.Linear(512 * 1, 4096),
nn.ReLU(inplace=True),
nn.Dropout(),
nn.Linear(4096, 4096),
nn.ReLU(inplace=True),
nn.Dropout(),
nn.Linear(4096, num_classes),
)
def forward(self, x):
out = self.convnet(x)
out = out.view(out.size(0), -1)
out = self.fcnet(out)
return out
| 34.639706 | 88 | 0.535131 | 563 | 4,711 | 4.30373 | 0.131439 | 0.099051 | 0.059018 | 0.077177 | 0.855138 | 0.825423 | 0.813454 | 0.813454 | 0.813454 | 0.813454 | 0 | 0.055067 | 0.348546 | 4,711 | 135 | 89 | 34.896296 | 0.734441 | 0 | 0 | 0.690265 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.053097 | false | 0 | 0.026549 | 0 | 0.132743 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
b949542135858be2437ca5c4838e571f8adc6197 | 1,415 | py | Python | xero_python/project/models/__init__.py | parasharrk/xero-python | e8416f3bd893520a343af014f5bb65acbf1f4f13 | [
"MIT"
] | null | null | null | xero_python/project/models/__init__.py | parasharrk/xero-python | e8416f3bd893520a343af014f5bb65acbf1f4f13 | [
"MIT"
] | null | null | null | xero_python/project/models/__init__.py | parasharrk/xero-python | e8416f3bd893520a343af014f5bb65acbf1f4f13 | [
"MIT"
] | null | null | null | # coding: utf-8
# flake8: noqa
"""
Xero Projects API
This is the Xero Projects API # noqa: E501
OpenAPI spec version: 2.4.0
Contact: api@xero.com
Generated by: https://openapi-generator.tech
"""
# import models into model package
from xero_python.project.models.amount import Amount
from xero_python.project.models.charge_type import ChargeType
from xero_python.project.models.currency_code import CurrencyCode
from xero_python.project.models.error import Error
from xero_python.project.models.pagination import Pagination
from xero_python.project.models.project import Project
from xero_python.project.models.project_create_or_update import ProjectCreateOrUpdate
from xero_python.project.models.project_patch import ProjectPatch
from xero_python.project.models.project_status import ProjectStatus
from xero_python.project.models.project_user import ProjectUser
from xero_python.project.models.project_users import ProjectUsers
from xero_python.project.models.projects import Projects
from xero_python.project.models.task import Task
from xero_python.project.models.task_create_or_update import TaskCreateOrUpdate
from xero_python.project.models.tasks import Tasks
from xero_python.project.models.time_entries import TimeEntries
from xero_python.project.models.time_entry import TimeEntry
from xero_python.project.models.time_entry_create_or_update import (
TimeEntryCreateOrUpdate,
)
| 39.305556 | 85 | 0.84311 | 199 | 1,415 | 5.81407 | 0.316583 | 0.12446 | 0.217805 | 0.326707 | 0.482282 | 0.318928 | 0.06223 | 0 | 0 | 0 | 0 | 0.00627 | 0.098233 | 1,415 | 35 | 86 | 40.428571 | 0.90047 | 0.15477 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.9 | 0 | 0.9 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
b95144c5eb4b28a47456b82d862cab7583f75d6c | 116 | py | Python | domains/entry/action.py | GRParasky/finance-project | fc7b834cd97edb7f16dac78d141d638c5f43970f | [
"MIT"
] | null | null | null | domains/entry/action.py | GRParasky/finance-project | fc7b834cd97edb7f16dac78d141d638c5f43970f | [
"MIT"
] | null | null | null | domains/entry/action.py | GRParasky/finance-project | fc7b834cd97edb7f16dac78d141d638c5f43970f | [
"MIT"
] | null | null | null | from domains.entry.model import *
from database import save, commit
def create(obj: Entry):
return save(obj)
| 14.5 | 33 | 0.732759 | 17 | 116 | 5 | 0.705882 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.181034 | 116 | 7 | 34 | 16.571429 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.5 | 0.25 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
b9c77e3dc5abdc09e4ba4c35316165f0e6bb2154 | 5,005 | py | Python | legacy/phi/tf/util.py | tailintalent/PDE-Control | 7031909188e7ce217da2b1628236011d1dff161a | [
"MIT"
] | 22 | 2020-04-27T12:48:32.000Z | 2022-03-23T10:41:48.000Z | legacy/phi/tf/util.py | tailintalent/PDE-Control | 7031909188e7ce217da2b1628236011d1dff161a | [
"MIT"
] | 5 | 2020-12-18T14:19:23.000Z | 2022-01-22T18:29:27.000Z | legacy/phi/tf/util.py | tailintalent/PDE-Control | 7031909188e7ce217da2b1628236011d1dff161a | [
"MIT"
] | 3 | 2021-05-29T23:30:53.000Z | 2022-02-14T06:30:32.000Z | # coding=utf-8
import tensorflow as tf
from phi.math.nd import *
def group_normalization(x, group_count, eps=1e-5):
    # Static shapes are needed here: C parameterizes the gamma/beta variable
    # shapes below, which a runtime tensor from tf.shape(x) could not. The
    # batch dimension may be unknown, so reshape with -1 instead.
    _, H, W, C = x.shape.as_list()
    gamma = tf.Variable(np.ones([1, 1, 1, C]), dtype=tf.float32, name="GN_gamma")
    beta = tf.Variable(np.zeros([1, 1, 1, C]), dtype=tf.float32, name="GN_beta")
    x = tf.reshape(x, [-1, group_count, H, W, C // group_count])
    mean, var = tf.nn.moments(x, [2, 3, 4], keep_dims=True)
    x = (x - mean) / tf.sqrt(var + eps)
    x = tf.reshape(x, [-1, H, W, C])
return x * gamma + beta
def residual_block(y, nb_channels, kernel_size=(3, 3), _strides=(1, 1), activation=tf.nn.leaky_relu,
_project_shortcut=False, padding="SYMMETRIC", name=None, training=False, trainable=True, reuse=tf.AUTO_REUSE):
shortcut = y
if isinstance(kernel_size, int):
kernel_size = (kernel_size, kernel_size)
pad1 = [(kernel_size[0] - 1) // 2, kernel_size[0] // 2]
pad2 = [(kernel_size[1] - 1) // 2, kernel_size[1] // 2]
# down-sampling is performed with a stride of 2
y = tf.pad(y, [[0,0], pad1, pad2, [0,0]], mode=padding)
y = tf.layers.conv2d(y, nb_channels, kernel_size=kernel_size, strides=_strides, padding='valid',
name=None if name is None else name+"/conv1", trainable=trainable, reuse=reuse)
# y = tf.layers.batch_normalization(y, name=None if name is None else name+"/norm1", training=training, trainable=trainable, reuse=reuse)
y = activation(y)
y = tf.pad(y, [[0,0], pad1, pad2, [0,0]], mode=padding)
y = tf.layers.conv2d(y, nb_channels, kernel_size=kernel_size, strides=(1, 1), padding='valid',
name=None if name is None else name + "/conv2", trainable=trainable, reuse=reuse)
# y = tf.layers.batch_normalization(y, name=None if name is None else name+"/norm2", training=training, trainable=trainable, reuse=reuse)
# identity shortcuts used directly when the input and output are of the same dimensions
if _project_shortcut or _strides != (1, 1):
# when the dimensions increase projection shortcut is used to match dimensions (done by 1×1 convolutions)
# when the shortcuts go across feature maps of two sizes, they are performed with a stride of 2
shortcut = tf.pad(shortcut, [[0,0], pad1, pad2, [0,0]], mode=padding)
shortcut = tf.layers.conv2d(shortcut, nb_channels, kernel_size=(1, 1), strides=_strides, padding='valid',
name=None if name is None else name + "/convid", trainable=trainable, reuse=reuse)
# shortcut = tf.layers.batch_normalization(shortcut, name=None if name is None else name+"/normid", training=training, trainable=trainable, reuse=reuse)
y += shortcut
y = activation(y)
return y
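# Usage sketch (shapes and names are illustrative, not from this module):
#
#   y = residual_block(x, nb_channels=64, name="rb1")
#   y = residual_block(y, nb_channels=128, _project_shortcut=True, name="rb2")
#
# The second call projects the shortcut with a 1x1 convolution so the add
# matches the increased channel count.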
def residual_block_1d(y, nb_channels, kernel_size=(3,), _strides=(1,), activation=tf.nn.leaky_relu,
_project_shortcut=False, padding="SYMMETRIC", name=None, training=False, trainable=True, reuse=tf.AUTO_REUSE):
shortcut = y
if isinstance(kernel_size, int):
kernel_size = (kernel_size,)
pad1 = [(kernel_size[0] - 1) // 2, kernel_size[0] // 2]
# down-sampling is performed with a stride of 2
y = tf.pad(y, [[0,0], pad1, [0,0]], mode=padding)
y = tf.layers.conv1d(y, nb_channels, kernel_size=kernel_size, strides=_strides, padding='valid',
name=None if name is None else name+"/conv1", trainable=trainable, reuse=reuse)
# y = tf.layers.batch_normalization(y, name=None if name is None else name+"/norm1", training=training, trainable=trainable, reuse=reuse)
y = activation(y)
y = tf.pad(y, [[0,0], pad1, [0,0]], mode=padding)
y = tf.layers.conv1d(y, nb_channels, kernel_size=kernel_size, strides=(1,), padding='valid',
name=None if name is None else name + "/conv2", trainable=trainable, reuse=reuse)
# y = tf.layers.batch_normalization(y, name=None if name is None else name+"/norm2", training=training, trainable=trainable, reuse=reuse)
# identity shortcuts used directly when the input and output are of the same dimensions
if _project_shortcut or _strides != (1,):
# when the dimensions increase projection shortcut is used to match dimensions (done by 1×1 convolutions)
# when the shortcuts go across feature maps of two sizes, they are performed with a stride of 2
shortcut = tf.pad(shortcut, [[0,0], pad1, [0,0]], mode=padding)
        shortcut = tf.layers.conv1d(shortcut, nb_channels, kernel_size=(1,), strides=_strides, padding='valid',  # conv1d takes a length-1 kernel tuple
name=None if name is None else name + "/convid", trainable=trainable, reuse=reuse)
# shortcut = tf.layers.batch_normalization(shortcut, name=None if name is None else name+"/normid", training=training, trainable=trainable, reuse=reuse)
y += shortcut
y = activation(y)
return y
def istensor(object):
if isinstance(object, StaggeredGrid):
object = object.staggered
return isinstance(object, tf.Tensor)
| 53.244681 | 160 | 0.671728 | 751 | 5,005 | 4.383489 | 0.173103 | 0.075942 | 0.036452 | 0.051033 | 0.872418 | 0.867861 | 0.842345 | 0.828676 | 0.823512 | 0.808627 | 0 | 0.026342 | 0.196004 | 5,005 | 93 | 161 | 53.817204 | 0.791252 | 0.303297 | 0 | 0.464286 | 0 | 0 | 0.029098 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.071429 | false | 0 | 0.035714 | 0 | 0.178571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
6a27a94fe78530ef19934721d2bf3077fbe218c8 | 8,861 | py | Python | examples/python/ducctest.py | mjc87/SHTOOLS | 8d83c42d1313d5624c4db8c2e57300c5d819834e | [
"BSD-3-Clause"
] | 251 | 2015-01-27T12:58:28.000Z | 2022-03-29T17:19:36.000Z | examples/python/ducctest.py | mjc87/SHTOOLS | 8d83c42d1313d5624c4db8c2e57300c5d819834e | [
"BSD-3-Clause"
] | 193 | 2015-03-11T06:21:08.000Z | 2022-03-31T14:05:45.000Z | examples/python/ducctest.py | mreineck/SHTOOLS | fec33f203ee0b47008fd69d4080304d6ebd272e7 | [
"BSD-3-Clause"
] | 100 | 2015-04-03T07:11:05.000Z | 2022-03-23T23:46:33.000Z | import numpy as np
import pyshtools as pysh
from time import time
nthreads = 1
def _l2error(a, b):
    # Relative L2 error between arrays a and b: sqrt(sum|a-b|^2 / sum|a|^2).
    return np.sqrt(np.sum(np.abs(a - b) ** 2) / np.sum(np.abs(a) ** 2))
# force SHTOOLS to deallocate temporary buffers
def flush_buffers(grd):
degrees = np.arange(1, dtype=float)
degrees[0] = np.inf
power = degrees ** (-2)
pysh.backends.select_preferred_backend("shtools")
clm = pysh.SHCoeffs.from_random(power, seed=12345)
grid2 = clm.expand(grid=grd)
_ = grid2.expand()
clm = pysh.SHCoeffs.from_random(power, seed=12345, kind="complex")
grid2 = clm.expand(grid=grd)
_ = grid2.expand()
def test_SHT(lmax, grd, csphase, normalization, extend):
degrees = np.arange(lmax + 1, dtype=float)
degrees[0] = np.inf
power = degrees ** (-2)
clm = pysh.SHCoeffs.from_random(power, seed=12345)
clm = clm.convert(normalization=normalization, csphase=csphase, lmax=lmax)
pysh.backends.select_preferred_backend("ducc", nthreads=nthreads)
t0 = time()
grid = clm.expand(grid=grd, extend=extend)
cilm = grid.expand()
tducc = time() - t0
pysh.backends.select_preferred_backend("shtools")
t0 = time()
grid2 = clm.expand(grid=grd, extend=extend)
cilm2 = grid2.expand()
tshtools = time() - t0
flush_buffers(grd)
return (
_l2error(grid.to_array(), grid2.to_array())
+ _l2error(cilm.to_array(), cilm2.to_array()),
tshtools / tducc,
)
def test_SHTducc(lmax, grd, nthreads):
degrees = np.arange(lmax + 1, dtype=float)
degrees[0] = np.inf
power = degrees ** (-2)
clm = pysh.SHCoeffs.from_random(power, seed=12345)
clm = clm.convert(normalization="ortho", csphase=1, lmax=lmax)
pysh.backends.select_preferred_backend("ducc", nthreads=nthreads)
t0 = time()
grid = clm.expand(grid=grd, extend=False)
cilm = grid.expand(normalization="ortho", csphase=1)
tducc = time() - t0
return _l2error(clm.to_array(), cilm.to_array()), tducc
def test_SHTC(lmax, grd, csphase, normalization, extend):
degrees = np.arange(lmax + 1, dtype=float)
degrees[0] = np.inf
power = degrees ** (-2)
clm = pysh.SHCoeffs.from_random(power, seed=12345, kind="complex")
clm = clm.convert(normalization=normalization, csphase=csphase, lmax=lmax)
pysh.backends.select_preferred_backend("ducc", nthreads=nthreads)
t0 = time()
grid = clm.expand(grid=grd, extend=extend)
cilm = grid.expand(normalization=normalization, csphase=csphase)
tducc = time() - t0
pysh.backends.select_preferred_backend("shtools")
t0 = time()
grid2 = clm.expand(grid=grd, extend=extend)
cilm2 = grid2.expand(normalization=normalization, csphase=csphase)
tshtools = time() - t0
flush_buffers(grd)
return (
_l2error(grid.to_array(), grid2.to_array())
+ _l2error(cilm.to_array(), cilm2.to_array()),
tshtools / tducc,
)
def test_SHT_deriv(lmax, grd, csphase, extend):
degrees = np.arange(lmax + 1, dtype=float)
degrees[0] = 1.0
power = degrees ** (-2)
clm = pysh.SHCoeffs.from_random(power, seed=12345)
clm = clm.convert(csphase=csphase, lmax=lmax)
pysh.backends.select_preferred_backend("ducc", nthreads=nthreads)
t0 = time()
grad = clm.gradient(extend=extend, radius=3.4)
tducc = time() - t0
pysh.backends.select_preferred_backend("shtools")
t0 = time()
grad2 = clm.gradient(extend=extend, radius=1.0)
tshtools = time() - t0
flush_buffers(grd)
return (
_l2error(3.4 * grad.phi.to_array(), grad2.phi.to_array())
+ _l2error(3.4 * grad.theta.to_array(), grad2.theta.to_array()),
tshtools / tducc,
)
def test_rot(lmax, alpha, beta, gamma):
degrees = np.arange(lmax + 1, dtype=float)
degrees[0] = np.inf
power = degrees ** (-2)
clm = pysh.SHCoeffs.from_random(power, seed=12345)
pysh.backends.select_preferred_backend("ducc", nthreads=nthreads)
t0 = time()
clm_rotated = clm.rotate(alpha, beta, gamma, degrees=True)
tducc = time() - t0
pysh.backends.select_preferred_backend("shtools")
t0 = time()
clm_rotated2 = clm.rotate(alpha, beta, gamma, degrees=True)
tshtools = time() - t0
return (
_l2error(clm_rotated.to_array(), clm_rotated2.to_array()),
tshtools / tducc,
)
def test_rotc(lmax, alpha, beta, gamma):
degrees = np.arange(lmax + 1, dtype=float)
degrees[0] = np.inf
power = degrees ** (-2)
clm = pysh.SHCoeffs.from_random(power, seed=12345, kind="complex")
pysh.backends.select_preferred_backend("ducc", nthreads=nthreads)
t0 = time()
clm_rotated = clm.rotate(alpha, beta, gamma, degrees=True)
tducc = time() - t0
pysh.backends.select_preferred_backend("shtools")
t0 = time()
clm_rotated2 = clm.rotate(alpha, beta, gamma, degrees=True)
tshtools = time() - t0
return (
_l2error(clm_rotated.to_array(), clm_rotated2.to_array()),
tshtools / tducc,
)
def test_rot2(lmax, alpha, beta, gamma):
degrees = np.arange(lmax + 1, dtype=float)
degrees[0] = np.inf
power = degrees ** (-2)
clm = pysh.SHCoeffs.from_random(power, seed=12345)
pysh.backends.select_preferred_backend("ducc", nthreads=nthreads)
t0 = time()
clm_rotated = clm.rotate(alpha, beta, gamma, degrees=True)
clm_rotated = clm_rotated.rotate(-gamma, -beta, -alpha, degrees=True)
tducc = time() - t0
return _l2error(clm.to_array(), clm_rotated.to_array()), tducc
lmax_list = [127, 255, 511, 1023]
print("SHRealCoeff rotation tests:")
for lmax in lmax_list:
for alpha in [47]:
res = test_rot(lmax, alpha, 27, 59)
print(
"lmax={:4}: L2 error={:e}, speedup factor={:f}".format(
lmax, res[0], res[1]
)
)
print("SHComplexCoeff rotation tests:")
for lmax in lmax_list:
for alpha in [47]:
res = test_rotc(lmax, alpha, 27, 59)
print(
"lmax={:4}: L2 error={:e}, speedup factor={:f}".format(
lmax, res[0], res[1]
)
)
lmax_list = [80]
print("SHT tests unnorm:")
for grid in ["GLQ", "DH", "DH2"]:
for csphase in [-1, 1]:
for norm in ["unnorm"]:
for extend in [True, False]:
for lmax in [5, 10, 20, 85]:
res = test_SHT(lmax, grid, csphase, norm, extend)
print(
"{:3}, CS={:2}, norm={:7}, extend={:5}, lmax={:4}: "
"L2 error={:e}, speedup factor={:f}".format(
grid, csphase, norm, extend, lmax, res[0], res[1]
)
)
lmax_list = [127, 255, 511, 1023, 2047]
print("SHT tests:")
for grid in ["GLQ", "DH", "DH2"]:
for csphase in [-1, 1]:
for norm in ["ortho", "4pi", "schmidt"]:
for extend in [True, False]:
for lmax in lmax_list:
res = test_SHT(lmax, grid, csphase, norm, extend)
print(
"{:3}, CS={:2}, norm={:7}, extend={:5}, lmax={:4}: "
"L2 error={:e}, speedup factor={:f}".format(
grid, csphase, norm, extend, lmax, res[0], res[1]
)
)
print("SHTC tests:")
for grid in ["GLQ", "DH", "DH2"]:
for csphase in [-1, 1]:
for norm in ["ortho", "4pi", "schmidt"]:
for extend in [True, False]:
for lmax in lmax_list:
res = test_SHTC(lmax, grid, csphase, norm, extend)
print(
"{:3}, CS={:2}, norm={:7}, extend={:5}, lmax={:4}: "
"L2 error={:e}, speedup factor={:f}".format(
grid, csphase, norm, extend, lmax, res[0], res[1]
)
)
print("SHT gradient tests:")
for grid in ["DH", "DH2"]:
for csphase in [-1, 1]:
for extend in [True, False]:
for lmax in lmax_list:
res = test_SHT_deriv(lmax, grid, csphase, extend)
print(
"{:3}, CS={:2}, extend={:5}, lmax={:4}: L2 error={:e}, "
"speedup factor={:f}".format(
grid, csphase, extend, lmax, res[0], res[1]
)
)
print("DUCC: forward/backward rotation with high band limits:")
for lmax in [4095]:
for alpha in [47]:
res = test_rot2(lmax, alpha, 27, 59)
print(
"lmax={:4}: L2 error={:e}, time={:f}".format(lmax, res[0], res[1])
)
print("DUCC: forward/backward SHT with high band limits:")
for lmax in [8191]:
res = test_SHTducc(lmax, "GLQ", nthreads=nthreads)
print("lmax={:4}: L2 error={:e}, time={:f}".format(lmax, res[0], res[1]))
| 31.421986 | 78 | 0.578039 | 1,134 | 8,861 | 4.420635 | 0.11552 | 0.027927 | 0.046679 | 0.070018 | 0.869938 | 0.832835 | 0.808697 | 0.78077 | 0.740475 | 0.728506 | 0 | 0.041518 | 0.271527 | 8,861 | 281 | 79 | 31.533808 | 0.735089 | 0.005078 | 0 | 0.656109 | 0 | 0 | 0.099274 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.040724 | false | 0 | 0.013575 | 0.004525 | 0.090498 | 0.072398 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
e041338b03fd4cbfde1ea0b318a2f61c1fb5ce55 | 46 | py | Python | nntoolbox/sequence/models/__init__.py | nhatsmrt/nn-toolbox | 689b9924d3c88a433f8f350b89c13a878ac7d7c3 | [
"Apache-2.0"
] | 16 | 2019-07-11T15:57:41.000Z | 2020-09-08T13:52:45.000Z | nntoolbox/sequence/models/__init__.py | nhatsmrt/nn-toolbox | 689b9924d3c88a433f8f350b89c13a878ac7d7c3 | [
"Apache-2.0"
] | 1 | 2022-01-18T22:21:57.000Z | 2022-01-18T22:21:57.000Z | nntoolbox/sequence/models/__init__.py | nhatsmrt/nn-toolbox | 689b9924d3c88a433f8f350b89c13a878ac7d7c3 | [
"Apache-2.0"
] | 1 | 2019-08-07T10:07:09.000Z | 2019-08-07T10:07:09.000Z | from .encoder import *
from .decoder import *
| 15.333333 | 22 | 0.73913 | 6 | 46 | 5.666667 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173913 | 46 | 2 | 23 | 23 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e043f30b895db0ebcf148842d65a755cc1166a94 | 9,184 | py | Python | tests/test_reference_metric.py | Steve-Hawk/nrpytutorial | 42d7450dba8bf43aa9c2d8f38f85f18803de69b7 | [
"BSD-2-Clause"
] | 1 | 2019-12-23T05:31:25.000Z | 2019-12-23T05:31:25.000Z | tests/test_reference_metric.py | Steve-Hawk/nrpytutorial | 42d7450dba8bf43aa9c2d8f38f85f18803de69b7 | [
"BSD-2-Clause"
] | null | null | null | tests/test_reference_metric.py | Steve-Hawk/nrpytutorial | 42d7450dba8bf43aa9c2d8f38f85f18803de69b7 | [
"BSD-2-Clause"
] | 2 | 2019-11-14T03:31:18.000Z | 2019-12-12T13:42:52.000Z | from UnitTesting.create_test import create_test
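# Each test below follows the same pattern: list the reference_metric globals to
# validate for one coordinate system, supply an initialization string that sets
# the reference_metric::CoordSystem parameter, and hand both to create_test.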
def test_Spherical():
module = 'reference_metric'
module_name = 'rfm_Spherical'
function_and_global_dict = {'reference_metric(True)': ['xxmin', 'xxmax', 'UnitVectors', 'ReU', 'ReDD', 'ghatDD', 'ghatUU', 'detgammahat',
'detgammahatdD', 'detgammahatdDD', 'ReUdD', 'ReUdDD', 'ReDDdD', 'ReDDdDD', 'ghatDDdD',
'ghatDDdDD', 'GammahatUDD', 'GammahatUDDdD', 'Cart_to_xx','xxCart','xxSph','scalefactor_orthog']}
initialization_string_dict = {'reference_metric(True)': '''
import NRPy_param_funcs as par
par.set_parval_from_str("reference_metric::CoordSystem", "Spherical")
'''}
create_test(module, module_name, function_and_global_dict, initialization_string_dict=initialization_string_dict)
def test_SinhSpherical():
module = 'reference_metric'
module_name = 'rfm_SinhSpherical'
function_and_global_dict = {'reference_metric(True)': ['xxmin', 'xxmax', 'UnitVectors', 'ReU', 'ReDD', 'ghatDD', 'ghatUU', 'detgammahat',
'detgammahatdD', 'detgammahatdDD', 'ReUdD', 'ReUdDD', 'ReDDdD', 'ReDDdDD', 'ghatDDdD',
'ghatDDdDD', 'GammahatUDD', 'GammahatUDDdD', 'Cart_to_xx','xxCart','xxSph','scalefactor_orthog']}
initialization_string_dict = {'reference_metric(True)': '''
import NRPy_param_funcs as par
par.set_parval_from_str("reference_metric::CoordSystem", "SinhSpherical")
'''}
create_test(module, module_name, function_and_global_dict, initialization_string_dict=initialization_string_dict)
def test_SinhSphericalv2():
module = 'reference_metric'
module_name = 'rfm_SinhSphericalv2'
function_and_global_dict = {'reference_metric(True)': ['xxmin', 'xxmax', 'UnitVectors', 'ReU', 'ReDD', 'ghatDD', 'ghatUU', 'detgammahat',
'detgammahatdD', 'detgammahatdDD', 'ReUdD', 'ReUdDD', 'ReDDdD', 'ReDDdDD', 'ghatDDdD',
'ghatDDdDD', 'GammahatUDD', 'GammahatUDDdD', 'Cart_to_xx','xxCart','xxSph','scalefactor_orthog']}
initialization_string_dict = {'reference_metric(True)': '''
import NRPy_param_funcs as par
par.set_parval_from_str("reference_metric::CoordSystem", "SinhSphericalv2")
'''}
create_test(module, module_name, function_and_global_dict, initialization_string_dict=initialization_string_dict)
def test_NobleSphericalThetaOptionOne():
module = 'reference_metric'
module_name = 'rfm_NobleSphericalThetaOptionOne'
function_and_global_dict = {'reference_metric(False)': ['UnitVectors', 'ReU', 'ReDD', 'ghatDD', 'ghatUU', 'detgammahat',
'detgammahatdD', 'detgammahatdDD', 'ReUdD', 'ReUdDD', 'ReDDdD', 'ReDDdDD', 'ghatDDdD',
'ghatDDdDD', 'GammahatUDD', 'GammahatUDDdD', 'Cart_to_xx','xxCart','xxSph','scalefactor_orthog']}
initialization_string_dict = {'reference_metric(False)': '''
import NRPy_param_funcs as par
par.set_parval_from_str("reference_metric::CoordSystem", "NobleSphericalThetaOptionOne")
'''}
create_test(module, module_name, function_and_global_dict, initialization_string_dict=initialization_string_dict)
def test_NobleSphericalThetaOptionTwo():
module = 'reference_metric'
module_name = 'rfm_NobleSphericalThetaOptionTwo'
function_and_global_dict = {'reference_metric(False)': ['UnitVectors', 'ReU', 'ReDD', 'ghatDD', 'ghatUU', 'detgammahat',
'detgammahatdD', 'detgammahatdDD', 'ReUdD', 'ReUdDD', 'ReDDdD', 'ReDDdDD', 'ghatDDdD',
'ghatDDdDD', 'GammahatUDD', 'GammahatUDDdD', 'Cart_to_xx','xxCart','xxSph','scalefactor_orthog']}
initialization_string_dict = {'reference_metric(False)': '''
import NRPy_param_funcs as par
par.set_parval_from_str("reference_metric::CoordSystem", "NobleSphericalThetaOptionTwo")
'''}
create_test(module, module_name, function_and_global_dict, initialization_string_dict=initialization_string_dict)
def test_Cylindrical():
module = 'reference_metric'
module_name = 'rfm_Cylindrical'
function_and_global_dict = {'reference_metric(True)': ['xxmin', 'xxmax', 'UnitVectors', 'ReU', 'ReDD', 'ghatDD', 'ghatUU', 'detgammahat',
'detgammahatdD', 'detgammahatdDD', 'ReUdD', 'ReUdDD', 'ReDDdD', 'ReDDdDD', 'ghatDDdD',
'ghatDDdDD', 'GammahatUDD', 'GammahatUDDdD', 'Cart_to_xx','xxCart','xxSph','scalefactor_orthog']}
initialization_string_dict = {'reference_metric(True)': '''
import NRPy_param_funcs as par
par.set_parval_from_str("reference_metric::CoordSystem", "Cylindrical")
'''}
create_test(module, module_name, function_and_global_dict, initialization_string_dict=initialization_string_dict)
def test_SinhCylindrical():
module = 'reference_metric'
module_name = 'rfm_SinhCylindrical'
function_and_global_dict = {'reference_metric(True)': ['xxmin', 'xxmax', 'UnitVectors', 'ReU', 'ReDD', 'ghatDD', 'ghatUU', 'detgammahat',
'detgammahatdD', 'detgammahatdDD', 'ReUdD', 'ReUdDD', 'ReDDdD', 'ReDDdDD', 'ghatDDdD',
'ghatDDdDD', 'GammahatUDD', 'GammahatUDDdD', 'Cart_to_xx','xxCart','xxSph','scalefactor_orthog']}
initialization_string_dict = {'reference_metric(True)': '''
import NRPy_param_funcs as par
par.set_parval_from_str("reference_metric::CoordSystem", "SinhCylindrical")
'''}
create_test(module, module_name, function_and_global_dict, initialization_string_dict=initialization_string_dict)
def test_SinhCylindricalv2():
module = 'reference_metric'
module_name = 'rfm_SinhCylindricalv2'
function_and_global_dict = {'reference_metric(True)': ['xxmin', 'xxmax', 'UnitVectors', 'ReU', 'ReDD', 'ghatDD', 'ghatUU', 'detgammahat',
'detgammahatdD', 'detgammahatdDD', 'ReUdD', 'ReUdDD', 'ReDDdD', 'ReDDdDD', 'ghatDDdD',
'ghatDDdDD', 'GammahatUDD', 'GammahatUDDdD', 'Cart_to_xx','xxCart','xxSph','scalefactor_orthog']}
initialization_string_dict = {'reference_metric(True)': '''
import NRPy_param_funcs as par
par.set_parval_from_str("reference_metric::CoordSystem", "SinhCylindricalv2")
'''}
create_test(module, module_name, function_and_global_dict, initialization_string_dict=initialization_string_dict)
def test_SymTP():
module = 'reference_metric'
module_name = 'rfm_SymTP'
function_and_global_dict = {'reference_metric(True)': ['xxmin', 'xxmax', 'UnitVectors', 'ReU', 'ReDD', 'ghatDD', 'ghatUU', 'detgammahat',
'detgammahatdD', 'detgammahatdDD', 'ReUdD', 'ReUdDD', 'ReDDdD', 'ReDDdDD', 'ghatDDdD',
'ghatDDdDD', 'GammahatUDD', 'GammahatUDDdD', 'Cart_to_xx','xxCart','xxSph','scalefactor_orthog']}
initialization_string_dict = {'reference_metric(True)': '''
import NRPy_param_funcs as par
par.set_parval_from_str("reference_metric::CoordSystem", "SymTP")
'''}
create_test(module, module_name, function_and_global_dict, initialization_string_dict=initialization_string_dict)
def test_SinhSymTP():
module = 'reference_metric'
module_name = 'rfm_SinhSymTP'
function_and_global_dict = {'reference_metric(True)': ['xxmin', 'xxmax', 'UnitVectors', 'ReU', 'ReDD', 'ghatDD', 'ghatUU', 'detgammahat',
'detgammahatdD', 'detgammahatdDD', 'ReUdD', 'ReUdDD', 'ReDDdD', 'ReDDdDD', 'ghatDDdD',
'ghatDDdDD', 'GammahatUDD', 'GammahatUDDdD', 'Cart_to_xx','xxCart','xxSph','scalefactor_orthog']}
initialization_string_dict = {'reference_metric(True)': '''
import NRPy_param_funcs as par
par.set_parval_from_str("reference_metric::CoordSystem", "SinhSymTP")
'''}
create_test(module, module_name, function_and_global_dict, initialization_string_dict=initialization_string_dict)
def test_Cartesian():
module = 'reference_metric'
module_name = 'rfm_Cartesian'
function_and_global_dict = {'reference_metric(True)': ['UnitVectors', 'ReU', 'ReDD', 'ghatDD', 'ghatUU', 'detgammahat',
'detgammahatdD', 'detgammahatdDD', 'ReUdD', 'ReUdDD', 'ReDDdD', 'ReDDdDD', 'ghatDDdD',
'ghatDDdDD', 'GammahatUDD', 'GammahatUDDdD', 'Cart_to_xx','xxCart','xxSph','scalefactor_orthog']}
initialization_string_dict = {'reference_metric(True)': '''
import NRPy_param_funcs as par
par.set_parval_from_str("reference_metric::CoordSystem", "Cartesian")
'''}
create_test(module, module_name, function_and_global_dict, initialization_string_dict=initialization_string_dict)
if __name__ == '__main__':
    import sys
    # With no extra CLI arguments, run every test_* function in this module and
    # record any failures; otherwise dispatch to the single test named in
    # sys.argv[4]. The guard must therefore cover len(sys.argv) == 4 as well,
    # since sys.argv[4] does not exist in that case.
    if len(sys.argv) <= 4:
        failed_functions = []
        for fun in dir():
            if fun[0:5] == 'test_':
                print('\nTesting ' + str(fun) + '...\n')
                try:
                    exec(fun + '()')
                except SystemExit:
                    failed_functions.append(fun)
        if failed_functions:
            import os
            with open(os.path.join('UnitTesting', 'failed_tests.txt'), 'a') as file:
                for function in failed_functions:
                    file.write(sys.argv[0] + ': ' + str(function) + '\n')
            exit(1)
    else:
        globals()[sys.argv[4]]()
| 42.716279 | 141 | 0.683036 | 932 | 9,184 | 6.371245 | 0.108369 | 0.111149 | 0.133378 | 0.077804 | 0.861401 | 0.861401 | 0.798417 | 0.791681 | 0.791681 | 0.791681 | 0 | 0.001576 | 0.170949 | 9,184 | 214 | 142 | 42.915888 | 0.778303 | 0 | 0 | 0.617021 | 0 | 0 | 0.444578 | 0.136215 | 0 | 0 | 0 | 0 | 0 | 1 | 0.078014 | false | 0 | 0.099291 | 0 | 0.177305 | 0.007092 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
e057fea135a781a1f3e343cccc2c8d03caae0408 | 23,721 | py | Python | src/backend/marsha/core/tests/test_xapi.py | insad/marsha | 3c6627b9a1debbb594e43233df7b7edb88f57f45 | [
"MIT"
] | 64 | 2018-04-26T23:46:14.000Z | 2022-03-26T21:32:23.000Z | src/backend/marsha/core/tests/test_xapi.py | insad/marsha | 3c6627b9a1debbb594e43233df7b7edb88f57f45 | [
"MIT"
] | 533 | 2018-04-17T10:17:24.000Z | 2022-03-31T13:07:49.000Z | src/backend/marsha/core/tests/test_xapi.py | insad/marsha | 3c6627b9a1debbb594e43233df7b7edb88f57f45 | [
"MIT"
] | 16 | 2018-09-21T12:52:34.000Z | 2021-11-29T16:44:51.000Z | """Tests for the xapi module of the Marsha project."""
from unittest import mock
from django.test import TestCase, override_settings
from rest_framework_simplejwt.tokens import AccessToken
from ..defaults import RAW, RUNNING
from ..factories import DocumentFactory, VideoFactory
from ..xapi import (
XAPI,
XAPIDocumentStatement,
XAPIVideoStatement,
get_xapi_statement,
requests,
)
class XAPIVideoStatementTest(TestCase):
"""Test the XAPIVideoStatement class."""
def test_xapi_statement_missing_user(self):
"""Missing lti user should fallback on session_id."""
video = VideoFactory(
id="68333c45-4b8c-4018-a195-5d5e1706b838",
playlist__consumer_site__domain="example.com",
title="test video xapi",
)
jwt_token = AccessToken()
jwt_token.payload["session_id"] = "326c0689-48c1-493e-8d2d-9fb0c289de7f"
jwt_token.payload["context_id"] = "course-v1:ufr+mathematics+0001"
base_statement = {
"context": {
"extensions": {
"https://w3id.org/xapi/video/extensions/session-id": "a6151456-18b7-"
"43b4-8452-2037fed588df"
}
},
"result": {
"extensions": {
"https://w3id.org/xapi/video/extensions/time-from": 0,
"https://w3id.org/xapi/video/extensions/time-to": 0,
"https://w3id.org/xapi/video/extensions/length": 104.304,
"https://w3id.org/xapi/video/extensions/progress": 0,
"https://w3id.org/xapi/video/extensions/played-segments": "0",
}
},
"verb": {
"display": {"en-US": "seeked"},
"id": "https://w3id.org/xapi/video/verbs/seeked",
},
"id": "17dfcd44-b3e0-403d-ab96-e3ef7da616d4",
}
xapi_statement = XAPIVideoStatement(video, base_statement, jwt_token)
statement = xapi_statement.get_statement()
self.assertIsNotNone(statement["timestamp"])
self.assertEqual(
statement["actor"],
{
"objectType": "Agent",
"account": {
"name": "326c0689-48c1-493e-8d2d-9fb0c289de7f",
"homePage": "http://example.com",
},
},
)
self.assertEqual(
statement["object"],
{
"definition": {
"type": "https://w3id.org/xapi/video/activity-type/video",
"name": {"en-US": "test video xapi"},
},
"id": "uuid://68333c45-4b8c-4018-a195-5d5e1706b838",
"objectType": "Activity",
},
)
self.assertEqual(
statement["context"],
{
"extensions": {
"https://w3id.org/xapi/video/extensions/session-id": "a6151456-18b7-"
"43b4-8452-2037fed588df"
},
"contextActivities": {
"category": [{"id": "https://w3id.org/xapi/video"}],
"parent": [
{
"id": "course-v1:ufr+mathematics+0001",
"objectType": "Activity",
"definition": {
"type": "http://adlnet.gov/expapi/activities/course"
},
}
],
},
},
)
self.assertEqual(statement["verb"], base_statement["verb"])
self.assertEqual(statement["id"], base_statement["id"])
self.assertEqual(statement["result"], base_statement["result"])
@override_settings(LANGUAGE_CODE="en-us")
def test_xapi_statement_enrich_statement(self):
"""XAPI statement sent by the front application should be enriched."""
video = VideoFactory(
id="68333c45-4b8c-4018-a195-5d5e1706b838",
playlist__consumer_site__domain="example.com",
title="test video xapi",
)
jwt_token = AccessToken()
jwt_token.payload["user"] = {"id": "b2584aa405540758db2a6278521b6478"}
jwt_token.payload["session_id"] = "326c0689-48c1-493e-8d2d-9fb0c289de7f"
jwt_token.payload["context_id"] = "course-v1:ufr+mathematics+0001"
base_statement = {
"context": {
"extensions": {
"https://w3id.org/xapi/video/extensions/session-id": "a6151456-18b7-"
"43b4-8452-2037fed588df"
}
},
"result": {
"extensions": {
"https://w3id.org/xapi/video/extensions/time-from": 0,
"https://w3id.org/xapi/video/extensions/time-to": 0,
"https://w3id.org/xapi/video/extensions/length": 104.304,
"https://w3id.org/xapi/video/extensions/progress": 0,
"https://w3id.org/xapi/video/extensions/played-segments": "0",
}
},
"verb": {
"display": {"en-US": "seeked"},
"id": "https://w3id.org/xapi/video/verbs/seeked",
},
"id": "17dfcd44-b3e0-403d-ab96-e3ef7da616d4",
}
xapi_statement = XAPIVideoStatement(video, base_statement, jwt_token)
statement = xapi_statement.get_statement()
self.assertIsNotNone(statement["timestamp"])
self.assertEqual(
statement["actor"],
{
"objectType": "Agent",
"account": {
"name": "b2584aa405540758db2a6278521b6478",
"homePage": "http://example.com",
},
},
)
self.assertEqual(
statement["object"],
{
"definition": {
"type": "https://w3id.org/xapi/video/activity-type/video",
"name": {"en-US": "test video xapi"},
},
"id": "uuid://68333c45-4b8c-4018-a195-5d5e1706b838",
"objectType": "Activity",
},
)
self.assertEqual(
statement["context"],
{
"extensions": {
"https://w3id.org/xapi/video/extensions/session-id": "a6151456-18b7-"
"43b4-8452-2037fed588df"
},
"contextActivities": {
"category": [{"id": "https://w3id.org/xapi/video"}],
"parent": [
{
"id": "course-v1:ufr+mathematics+0001",
"objectType": "Activity",
"definition": {
"type": "http://adlnet.gov/expapi/activities/course"
},
}
],
},
},
)
self.assertEqual(statement["verb"], base_statement["verb"])
self.assertEqual(statement["id"], base_statement["id"])
self.assertEqual(statement["result"], base_statement["result"])
def test_xapi_statement_live_video(self):
"""A live video should send a webinar activity type."""
video = VideoFactory(
id="68333c45-4b8c-4018-a195-5d5e1706b838",
playlist__consumer_site__domain="example.com",
title="test video xapi",
live_state=RUNNING,
live_type=RAW,
)
jwt_token = AccessToken()
jwt_token.payload["user"] = {"id": "b2584aa405540758db2a6278521b6478"}
jwt_token.payload["session_id"] = "326c0689-48c1-493e-8d2d-9fb0c289de7f"
jwt_token.payload["context_id"] = "course-v1:ufr+mathematics+0001"
base_statement = {
"context": {
"extensions": {
"https://w3id.org/xapi/video/extensions/session-id": "a6151456-18b7-"
"43b4-8452-2037fed588df"
}
},
"result": {
"extensions": {
"https://w3id.org/xapi/video/extensions/time-from": 0,
"https://w3id.org/xapi/video/extensions/time-to": 0,
"https://w3id.org/xapi/video/extensions/length": 104.304,
"https://w3id.org/xapi/video/extensions/progress": 0,
"https://w3id.org/xapi/video/extensions/played-segments": "0",
}
},
"verb": {
"display": {"en-US": "seeked"},
"id": "https://w3id.org/xapi/video/verbs/seeked",
},
"id": "17dfcd44-b3e0-403d-ab96-e3ef7da616d4",
}
xapi_statement = XAPIVideoStatement(video, base_statement, jwt_token)
statement = xapi_statement.get_statement()
self.assertIsNotNone(statement["timestamp"])
self.assertEqual(
statement["actor"],
{
"objectType": "Agent",
"account": {
"name": "b2584aa405540758db2a6278521b6478",
"homePage": "http://example.com",
},
},
)
self.assertEqual(
statement["object"],
{
"definition": {
"type": "http://id.tincanapi.com/activitytype/webinar",
"name": {"en-US": "test video xapi"},
},
"id": "uuid://68333c45-4b8c-4018-a195-5d5e1706b838",
"objectType": "Activity",
},
)
self.assertEqual(
statement["context"],
{
"extensions": {
"https://w3id.org/xapi/video/extensions/session-id": "a6151456-18b7-"
"43b4-8452-2037fed588df"
},
"contextActivities": {
"category": [{"id": "https://w3id.org/xapi/video"}],
"parent": [
{
"id": "course-v1:ufr+mathematics+0001",
"objectType": "Activity",
"definition": {
"type": "http://adlnet.gov/expapi/activities/course"
},
}
],
},
},
)
self.assertEqual(statement["verb"], base_statement["verb"])
self.assertEqual(statement["id"], base_statement["id"])
self.assertEqual(statement["result"], base_statement["result"])
def test_xapi_statement_missing_context_id(self):
"""Parent contextActivities should be missing without context_id."""
video = VideoFactory(
id="68333c45-4b8c-4018-a195-5d5e1706b838",
playlist__consumer_site__domain="example.com",
title="test video xapi",
)
jwt_token = AccessToken()
jwt_token.payload["session_id"] = "326c0689-48c1-493e-8d2d-9fb0c289de7f"
base_statement = {
"context": {
"extensions": {
"https://w3id.org/xapi/video/extensions/session-id": "a6151456-18b7-"
"43b4-8452-2037fed588df"
}
},
"result": {
"extensions": {
"https://w3id.org/xapi/video/extensions/time-from": 0,
"https://w3id.org/xapi/video/extensions/time-to": 0,
"https://w3id.org/xapi/video/extensions/length": 104.304,
"https://w3id.org/xapi/video/extensions/progress": 0,
"https://w3id.org/xapi/video/extensions/played-segments": "0",
}
},
"verb": {
"display": {"en-US": "seeked"},
"id": "https://w3id.org/xapi/video/verbs/seeked",
},
"id": "17dfcd44-b3e0-403d-ab96-e3ef7da616d4",
}
xapi_statement = XAPIVideoStatement(video, base_statement, jwt_token)
statement = xapi_statement.get_statement()
self.assertIsNotNone(statement["timestamp"])
self.assertEqual(
statement["actor"],
{
"objectType": "Agent",
"account": {
"name": "326c0689-48c1-493e-8d2d-9fb0c289de7f",
"homePage": "http://example.com",
},
},
)
self.assertEqual(
statement["object"],
{
"definition": {
"type": "https://w3id.org/xapi/video/activity-type/video",
"name": {"en-US": "test video xapi"},
},
"id": "uuid://68333c45-4b8c-4018-a195-5d5e1706b838",
"objectType": "Activity",
},
)
self.assertEqual(
statement["context"],
{
"extensions": {
"https://w3id.org/xapi/video/extensions/session-id": "a6151456-18b7-"
"43b4-8452-2037fed588df"
},
"contextActivities": {
"category": [{"id": "https://w3id.org/xapi/video"}]
},
},
)
self.assertEqual(statement["verb"], base_statement["verb"])
self.assertEqual(statement["id"], base_statement["id"])
self.assertEqual(statement["result"], base_statement["result"])
class XAPIDocumentStatementTest(TestCase):
"""Test the XAPIDocumentStatement class."""
@override_settings(LANGUAGE_CODE="en-us")
def test_xapi_statement_enrich_statement(self):
"""XAPI statement sent by the front application should be enriched."""
document = DocumentFactory(
id="68333c45-4b8c-4018-a195-5d5e1706b838",
playlist__consumer_site__domain="example.com",
title="test document xapi",
)
jwt_token = AccessToken()
jwt_token.payload["user"] = {"id": "b2584aa405540758db2a6278521b6478"}
jwt_token.payload["session_id"] = "326c0689-48c1-493e-8d2d-9fb0c289de7f"
jwt_token.payload["context_id"] = "course-v1:ufr+mathematics+0001"
base_statement = {
"context": {
"extensions": {
"https://w3id.org/xapi/video/extensions/session-id": "a6151456-18b7-"
"43b4-8452-2037fed588df"
}
},
"verb": {
"display": {"en-US": "downloaded"},
"id": "http://id.tincanapi.com/verb/downloaded",
},
"id": "17dfcd44-b3e0-403d-ab96-e3ef7da616d4",
}
xapi_statement = XAPIDocumentStatement(document, base_statement, jwt_token)
statement = xapi_statement.get_statement()
self.assertIsNotNone(statement["timestamp"])
self.assertEqual(
statement["actor"],
{
"objectType": "Agent",
"account": {
"name": "b2584aa405540758db2a6278521b6478",
"homePage": "http://example.com",
},
},
)
self.assertEqual(
statement["object"],
{
"definition": {
"type": "http://id.tincanapi.com/activitytype/document",
"name": {"en-US": "test document xapi"},
},
"id": "uuid://68333c45-4b8c-4018-a195-5d5e1706b838",
"objectType": "Activity",
},
)
self.assertEqual(
statement["context"],
{
"extensions": {
"https://w3id.org/xapi/video/extensions/session-id": "a6151456-18b7-"
"43b4-8452-2037fed588df"
},
"contextActivities": {
"parent": [
{
"id": "course-v1:ufr+mathematics+0001",
"objectType": "Activity",
"definition": {
"type": "http://adlnet.gov/expapi/activities/course"
},
}
],
},
},
)
self.assertEqual(statement["verb"], base_statement["verb"])
self.assertEqual(statement["id"], base_statement["id"])
@override_settings(LANGUAGE_CODE="en-us")
def test_xapi_statement_missing_context_id(self):
"""Parent contextActivities should be missing without context_id."""
document = DocumentFactory(
id="68333c45-4b8c-4018-a195-5d5e1706b838",
playlist__consumer_site__domain="example.com",
title="test document xapi",
)
jwt_token = AccessToken()
jwt_token.payload["user"] = {"id": "b2584aa405540758db2a6278521b6478"}
jwt_token.payload["session_id"] = "326c0689-48c1-493e-8d2d-9fb0c289de7f"
base_statement = {
"context": {
"extensions": {
"https://w3id.org/xapi/video/extensions/session-id": "a6151456-18b7-"
"43b4-8452-2037fed588df"
}
},
"verb": {
"display": {"en-US": "downloaded"},
"id": "http://id.tincanapi.com/verb/downloaded",
},
"id": "17dfcd44-b3e0-403d-ab96-e3ef7da616d4",
}
xapi_statement = XAPIDocumentStatement(document, base_statement, jwt_token)
statement = xapi_statement.get_statement()
self.assertIsNotNone(statement["timestamp"])
self.assertEqual(
statement["actor"],
{
"objectType": "Agent",
"account": {
"name": "b2584aa405540758db2a6278521b6478",
"homePage": "http://example.com",
},
},
)
self.assertEqual(
statement["object"],
{
"definition": {
"type": "http://id.tincanapi.com/activitytype/document",
"name": {"en-US": "test document xapi"},
},
"id": "uuid://68333c45-4b8c-4018-a195-5d5e1706b838",
"objectType": "Activity",
},
)
self.assertEqual(
statement["context"],
{
"extensions": {
"https://w3id.org/xapi/video/extensions/session-id": "a6151456-18b7-"
"43b4-8452-2037fed588df"
},
},
)
self.assertEqual(statement["verb"], base_statement["verb"])
self.assertEqual(statement["id"], base_statement["id"])
@override_settings(LANGUAGE_CODE="en-us")
def test_xapi_statement_missing_user_id(self):
"""Missing lti user should fallback on session_id."""
document = DocumentFactory(
id="68333c45-4b8c-4018-a195-5d5e1706b838",
playlist__consumer_site__domain="example.com",
title="test document xapi",
)
jwt_token = AccessToken()
jwt_token.payload["session_id"] = "326c0689-48c1-493e-8d2d-9fb0c289de7f"
jwt_token.payload["context_id"] = "course-v1:ufr+mathematics+0001"
base_statement = {
"context": {
"extensions": {
"https://w3id.org/xapi/video/extensions/session-id": "a6151456-18b7-"
"43b4-8452-2037fed588df"
}
},
"verb": {
"display": {"en-US": "downloaded"},
"id": "http://id.tincanapi.com/verb/downloaded",
},
"id": "17dfcd44-b3e0-403d-ab96-e3ef7da616d4",
}
xapi_statement = XAPIDocumentStatement(document, base_statement, jwt_token)
statement = xapi_statement.get_statement()
self.assertIsNotNone(statement["timestamp"])
self.assertEqual(
statement["actor"],
{
"objectType": "Agent",
"account": {
"name": "326c0689-48c1-493e-8d2d-9fb0c289de7f",
"homePage": "http://example.com",
},
},
)
self.assertEqual(
statement["object"],
{
"definition": {
"type": "http://id.tincanapi.com/activitytype/document",
"name": {"en-US": "test document xapi"},
},
"id": "uuid://68333c45-4b8c-4018-a195-5d5e1706b838",
"objectType": "Activity",
},
)
self.assertEqual(
statement["context"],
{
"extensions": {
"https://w3id.org/xapi/video/extensions/session-id": "a6151456-18b7-"
"43b4-8452-2037fed588df"
},
"contextActivities": {
"parent": [
{
"id": "course-v1:ufr+mathematics+0001",
"objectType": "Activity",
"definition": {
"type": "http://adlnet.gov/expapi/activities/course"
},
}
],
},
},
)
self.assertEqual(statement["verb"], base_statement["verb"])
self.assertEqual(statement["id"], base_statement["id"])
class XAPITest(TestCase):
"""Test the xapi module."""
@mock.patch.object(requests, "post")
def test_xapi_enrich_and_send_statement(self, mock_requests_post):
"""XAPI statement sent by the front application should be enriched.
Before sending a statement, the xapi module is responsible for enriching it.
"""
xapi = XAPI("https://lrs.example.com", "auth_token")
mock_response = mock.MagicMock()
mock_response.raise_for_status.return_value = 200
mock_requests_post.return_value = mock_response
statement = {"foo": "bar"}
mock_xapi_statement = mock.MagicMock()
mock_xapi_statement.get_statement.return_value = statement
xapi.send(mock_xapi_statement)
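        # send() is expected to POST the enriched statement to the LRS endpoint
        # with the xAPI headers; the assertions below inspect the mocked
        # requests.post call to verify the URL, headers, and JSON body.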
args, kwargs = mock_requests_post.call_args_list[0]
self.assertEqual(args[0], "https://lrs.example.com")
self.assertEqual(
kwargs["headers"],
{
"Authorization": "auth_token",
"Content-Type": "application/json",
"X-Experience-API-Version": "1.0.3",
},
)
self.assertEqual(kwargs["json"], statement)
class GetXapiStatementTest(TestCase):
"""Test get_xapi_statement function."""
def test_get_xapi_statement_with_video(self):
"""With video parameter must return XAPIVideoStatement."""
statement_class = get_xapi_statement("video")
self.assertEqual(statement_class, XAPIVideoStatement)
def test_get_xapi_statement_with_document(self):
"""With document parameter must return XAPIDocumentStatement."""
statement_class = get_xapi_statement("document")
self.assertEqual(statement_class, XAPIDocumentStatement)
def test_get_xapi_statement_with_unknown_resource(self):
"""With unknown resource must throw an exception."""
with self.assertRaises(NotImplementedError):
get_xapi_statement("unknown")
| 37.772293 | 89 | 0.497323 | 1,931 | 23,721 | 5.986018 | 0.099948 | 0.035038 | 0.046717 | 0.062289 | 0.862185 | 0.856995 | 0.846959 | 0.846959 | 0.846959 | 0.839865 | 0 | 0.090757 | 0.369209 | 23,721 | 627 | 90 | 37.832536 | 0.681748 | 0.03714 | 0 | 0.638989 | 0 | 0 | 0.317079 | 0.090274 | 0 | 0 | 0 | 0 | 0.093863 | 1 | 0.019856 | false | 0 | 0.01083 | 0 | 0.037906 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
e06390f6b16bdad14a4ad992ecccb82d19ebe26b | 46,201 | py | Python | integration_tests/src/main/python/window_function_test.py | ekrivokonmapr/spark-rapids | f774e1a231416aab7cacc86f06010ef797fc19c7 | [
"Apache-2.0"
] | null | null | null | integration_tests/src/main/python/window_function_test.py | ekrivokonmapr/spark-rapids | f774e1a231416aab7cacc86f06010ef797fc19c7 | [
"Apache-2.0"
] | null | null | null | integration_tests/src/main/python/window_function_test.py | ekrivokonmapr/spark-rapids | f774e1a231416aab7cacc86f06010ef797fc19c7 | [
"Apache-2.0"
] | null | null | null | # Copyright (c) 2020-2021, NVIDIA CORPORATION.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import math
import pytest
from asserts import assert_gpu_and_cpu_are_equal_collect, assert_gpu_and_cpu_are_equal_sql, assert_gpu_fallback_collect
from data_gen import *
from marks import *
from pyspark.sql.types import *
from pyspark.sql.types import NumericType
from pyspark.sql.window import Window
import pyspark.sql.functions as f
def meta_idfn(meta):
def tmp(something):
return meta + idfn(something)
return tmp
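# For example, meta_idfn('orderBy:') returns an id function that prefixes each
# generated pytest parameter id with 'orderBy:' (used in the parametrize ids below).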
_grpkey_longs_with_no_nulls = [
('a', RepeatSeqGen(LongGen(nullable=False), length=20)),
('b', IntegerGen()),
('c', IntegerGen())]
_grpkey_longs_with_nulls = [
('a', RepeatSeqGen(LongGen(nullable=(True, 10.0)), length=20)),
('b', IntegerGen()),
('c', IntegerGen())]
_grpkey_longs_with_dates = [
('a', RepeatSeqGen(LongGen(), length=2048)),
('b', DateGen(nullable=False, start=date(year=2020, month=1, day=1), end=date(year=2020, month=12, day=31))),
('c', IntegerGen())]
_grpkey_longs_with_nullable_dates = [
('a', RepeatSeqGen(LongGen(nullable=False), length=20)),
('b', DateGen(nullable=(True, 5.0), start=date(year=2020, month=1, day=1), end=date(year=2020, month=12, day=31))),
('c', IntegerGen())]
_grpkey_longs_with_timestamps = [
('a', RepeatSeqGen(LongGen(), length=2048)),
('b', TimestampGen(nullable=False)),
('c', IntegerGen())]
_grpkey_longs_with_nullable_timestamps = [
('a', RepeatSeqGen(LongGen(nullable=False), length=20)),
('b', TimestampGen(nullable=(True, 5.0))),
('c', IntegerGen())]
_grpkey_longs_with_decimals = [
('a', RepeatSeqGen(LongGen(nullable=False), length=20)),
('b', DecimalGen(precision=18, scale=3, nullable=False)),
('c', IntegerGen())]
_grpkey_longs_with_nullable_decimals = [
('a', RepeatSeqGen(LongGen(nullable=(True, 10.0)), length=20)),
('b', DecimalGen(precision=18, scale=10, nullable=True)),
('c', IntegerGen())]
_grpkey_decimals_with_nulls = [
('a', RepeatSeqGen(LongGen(nullable=(True, 10.0)), length=20)),
('b', IntegerGen()),
    # the maximum decimal precision supported by the sum operation is 8
('c', DecimalGen(precision=8, scale=3, nullable=True))]
_grpkey_byte_with_nulls = [
('a', RepeatSeqGen(int_gen, length=20)),
    # restrict the generated values via min_val/max_val so the aggregation does not overflow
('b', ByteGen(nullable=True, min_val=-98, max_val=98, special_cases=[])),
('c', IntegerGen())]
_grpkey_short_with_nulls = [
('a', RepeatSeqGen(int_gen, length=20)),
    # restrict the generated values via min_val/max_val so the aggregation does not overflow
('b', ShortGen(nullable=True, min_val=-32700, max_val=32700, special_cases=[])),
('c', IntegerGen())]
_grpkey_int_with_nulls = [
('a', RepeatSeqGen(int_gen, length=20)),
    # restrict the generated values via min_val/max_val so the aggregation does not overflow
('b', IntegerGen(nullable=True, min_val=-2147483000, max_val=2147483000, special_cases=[])),
('c', IntegerGen())]
_grpkey_long_with_nulls = [
('a', RepeatSeqGen(int_gen, length=20)),
    # restrict the generated values via min_val/max_val so the aggregation does not overflow
('b', LongGen(nullable=True, min_val=-9223372036854775000, max_val=9223372036854775000, special_cases=[])),
('c', IntegerGen())]
_grpkey_date_with_nulls = [
('a', RepeatSeqGen(int_gen, length=20)),
('b', DateGen(nullable=(True, 5.0), start=date(year=2020, month=1, day=1), end=date(year=2020, month=12, day=31))),
('c', IntegerGen())]
_grpkey_byte_with_nulls_with_overflow = [
('a', IntegerGen()),
('b', ByteGen(nullable=True))]
_grpkey_short_with_nulls_with_overflow = [
('a', IntegerGen()),
('b', ShortGen(nullable=True))]
_grpkey_int_with_nulls_with_overflow = [
('a', IntegerGen()),
('b', IntegerGen(nullable=True))]
_grpkey_long_with_nulls_with_overflow = [
('a', IntegerGen()),
('b', LongGen(nullable=True))]
part_and_order_gens = [long_gen, DoubleGen(no_nans=True, special_cases=[]),
string_gen, boolean_gen, timestamp_gen, DecimalGen(precision=18, scale=1)]
running_part_and_order_gens = [long_gen, DoubleGen(no_nans=True, special_cases=[]),
string_gen, byte_gen, timestamp_gen, DecimalGen(precision=18, scale=1)]
lead_lag_data_gens = [long_gen, DoubleGen(no_nans=True, special_cases=[]),
boolean_gen, timestamp_gen, string_gen, DecimalGen(precision=18, scale=3),
StructGen(children=[
['child_int', IntegerGen()],
['child_time', DateGen()],
['child_string', StringGen()]
])]
all_basic_gens_no_nans = [byte_gen, short_gen, int_gen, long_gen,
FloatGen(no_nans=True, special_cases=[]), DoubleGen(no_nans=True, special_cases=[]),
string_gen, boolean_gen, date_gen, timestamp_gen, null_gen]
@pytest.mark.xfail(reason="[UNSUPPORTED] Ranges over order by byte column overflow "
"(https://github.com/NVIDIA/spark-rapids/pull/2020#issuecomment-838127070)")
@ignore_order
@pytest.mark.parametrize('data_gen', [_grpkey_byte_with_nulls_with_overflow], ids=idfn)
def test_window_aggs_for_ranges_numeric_byte_overflow(data_gen):
assert_gpu_and_cpu_are_equal_sql(
lambda spark: gen_df(spark, data_gen, length=2048),
"window_agg_table",
'select '
' sum(b) over '
' (partition by a order by b asc '
        ' range between 127 preceding and 127 following) as sum_c_asc '
'from window_agg_table',
conf={'spark.rapids.sql.window.range.byte.enabled': True})
@pytest.mark.xfail(reason="[UNSUPPORTED] Ranges over order by short column overflow "
"(https://github.com/NVIDIA/spark-rapids/pull/2020#issuecomment-838127070)")
@ignore_order
@pytest.mark.parametrize('data_gen', [_grpkey_short_with_nulls_with_overflow], ids=idfn)
def test_window_aggs_for_ranges_numeric_short_overflow(data_gen):
assert_gpu_and_cpu_are_equal_sql(
lambda spark: gen_df(spark, data_gen, length=2048),
"window_agg_table",
'select '
' sum(b) over '
' (partition by a order by b asc '
        ' range between 32767 preceding and 32767 following) as sum_c_asc '
'from window_agg_table',
conf={'spark.rapids.sql.window.range.short.enabled': True})
@pytest.mark.xfail(reason="[UNSUPPORTED] Ranges over order by int column overflow "
"(https://github.com/NVIDIA/spark-rapids/pull/2020#issuecomment-838127070)")
@ignore_order
@pytest.mark.parametrize('data_gen', [_grpkey_int_with_nulls_with_overflow], ids=idfn)
def test_window_aggs_for_ranges_numeric_int_overflow(data_gen):
assert_gpu_and_cpu_are_equal_sql(
lambda spark: gen_df(spark, data_gen, length=2048),
"window_agg_table",
'select '
' sum(b) over '
' (partition by a order by b asc '
        ' range between 2147483647 preceding and 2147483647 following) as sum_c_asc '
'from window_agg_table')
@pytest.mark.xfail(reason="[UNSUPPORTED] Ranges over order by long column overflow "
"(https://github.com/NVIDIA/spark-rapids/pull/2020#issuecomment-838127070)")
@ignore_order
@pytest.mark.parametrize('data_gen', [_grpkey_long_with_nulls_with_overflow], ids=idfn)
def test_window_aggs_for_ranges_numeric_long_overflow(data_gen):
assert_gpu_and_cpu_are_equal_sql(
lambda spark: gen_df(spark, data_gen, length=2048),
"window_agg_table",
'select '
' sum(b) over '
' (partition by a order by b asc '
        ' range between 9223372036854775807 preceding and 9223372036854775807 following) as sum_c_asc '
'from window_agg_table')
# In a distributed setup the order of the partitions returned might be different, so we must ignore the order
# but small batch sizes can make sort very slow, so do the final order by locally
@ignore_order(local=True)
@pytest.mark.parametrize('batch_size', ['1000', '1g'], ids=idfn) # set the batch size so we can test multiple stream batches
@pytest.mark.parametrize('data_gen', [
_grpkey_byte_with_nulls,
_grpkey_short_with_nulls,
_grpkey_int_with_nulls,
_grpkey_long_with_nulls,
_grpkey_date_with_nulls,
], ids=idfn)
def test_window_aggs_for_range_numeric_date(data_gen, batch_size):
conf = {'spark.rapids.sql.batchSizeBytes': batch_size,
'spark.rapids.sql.window.range.byte.enabled': True,
'spark.rapids.sql.window.range.short.enabled': True}
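    # The byte/short range configs opt in to RANGE frames over narrow integral
    # order-by columns; they appear to be disabled by default (see the overflow
    # xfail tests above).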
assert_gpu_and_cpu_are_equal_sql(
lambda spark: gen_df(spark, data_gen, length=2048),
'window_agg_table',
'select '
' sum(c) over '
' (partition by a order by b asc '
' range between 1 preceding and 3 following) as sum_c_asc, '
' avg(c) over '
' (partition by a order by b asc '
' range between 1 preceding and 3 following) as avg_b_asc, '
' max(c) over '
' (partition by a order by b asc '
' range between 1 preceding and 3 following) as max_b_desc, '
' min(c) over '
' (partition by a order by b asc '
' range between 1 preceding and 3 following) as min_b_asc, '
' count(1) over '
' (partition by a order by b asc '
' range between CURRENT ROW and UNBOUNDED following) as count_1_asc, '
' count(c) over '
' (partition by a order by b asc '
' range between CURRENT ROW and UNBOUNDED following) as count_b_asc, '
' avg(c) over '
' (partition by a order by b asc '
' range between UNBOUNDED preceding and CURRENT ROW) as avg_b_unbounded, '
' sum(c) over '
' (partition by a order by b asc '
' range between UNBOUNDED preceding and CURRENT ROW) as sum_b_unbounded, '
' max(c) over '
' (partition by a order by b asc '
' range between UNBOUNDED preceding and UNBOUNDED following) as max_b_unbounded '
'from window_agg_table ',
conf = conf)
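# Note the distinction exercised above (RANGE) versus in the ROWS tests below: a
# RANGE frame such as "range between 1 preceding and 3 following" picks peers by
# the *value* of the order-by column (rows with b in [b - 1, b + 3]), while a
# ROWS frame counts physical row offsets from the current row regardless of value.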
# In a distributed setup the order of the partitions returned might be different, so we must ignore the order
# but small batch sizes can make sort very slow, so do the final order by locally
@ignore_order(local=True)
@pytest.mark.parametrize('batch_size', ['1000', '1g'], ids=idfn) # set the batch size so we can test multiple stream batches
@pytest.mark.parametrize('data_gen', [_grpkey_longs_with_no_nulls,
_grpkey_longs_with_nulls,
_grpkey_longs_with_dates,
_grpkey_longs_with_nullable_dates,
_grpkey_longs_with_decimals,
_grpkey_longs_with_nullable_decimals,
_grpkey_decimals_with_nulls], ids=idfn)
def test_window_aggs_for_rows(data_gen, batch_size):
conf = {'spark.rapids.sql.batchSizeBytes': batch_size,
'spark.rapids.sql.castFloatToDecimal.enabled': True}
assert_gpu_and_cpu_are_equal_sql(
lambda spark : gen_df(spark, data_gen, length=2048),
"window_agg_table",
'select '
' sum(c) over '
' (partition by a order by b,c asc rows between 1 preceding and 1 following) as sum_c_asc, '
' max(c) over '
' (partition by a order by b desc, c desc rows between 2 preceding and 1 following) as max_c_desc, '
' min(c) over '
' (partition by a order by b,c rows between 2 preceding and current row) as min_c_asc, '
' count(1) over '
' (partition by a order by b,c rows between UNBOUNDED preceding and UNBOUNDED following) as count_1, '
' count(c) over '
' (partition by a order by b,c rows between UNBOUNDED preceding and UNBOUNDED following) as count_c, '
' avg(c) over '
' (partition by a order by b,c rows between UNBOUNDED preceding and UNBOUNDED following) as avg_c, '
' rank() over '
' (partition by a order by b,c rows between UNBOUNDED preceding and CURRENT ROW) as rank_val, '
' dense_rank() over '
' (partition by a order by b,c rows between UNBOUNDED preceding and CURRENT ROW) as dense_rank_val, '
' row_number() over '
' (partition by a order by b,c rows between UNBOUNDED preceding and CURRENT ROW) as row_num '
'from window_agg_table ',
conf = conf)
# This is for aggregations that work with a running window optimization. They don't need to be batched
# specially, but the optimization only applies if every aggregation in the query supports it.
# the order returned should be consistent because the data ends up in a single task (no partitioning)
@pytest.mark.parametrize('batch_size', ['1000', '1g'], ids=idfn) # set the batch size so we can test multiple stream batches
@pytest.mark.parametrize('b_gen', all_basic_gens_no_nans + [decimal_gen_scale_precision], ids=meta_idfn('data:'))
def test_window_running_no_part(b_gen, batch_size):
conf = {'spark.rapids.sql.batchSizeBytes': batch_size,
'spark.rapids.sql.hasNans': False,
'spark.rapids.sql.castFloatToDecimal.enabled': True}
query_parts = ['row_number() over (order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as row_num',
'rank() over (order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as rank_val',
'dense_rank() over (order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as dense_rank_val',
'count(b) over (order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as count_col',
'min(b) over (order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as min_col',
'max(b) over (order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as max_col']
if isinstance(b_gen.data_type, NumericType) and not isinstance(b_gen, FloatGen) and not isinstance(b_gen, DoubleGen):
query_parts.append('sum(b) over (order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as sum_col')
assert_gpu_and_cpu_are_equal_sql(
lambda spark : two_col_df(spark, LongRangeGen(), b_gen, length=1024 * 14),
"window_agg_table",
'select ' +
', '.join(query_parts) +
' from window_agg_table ',
validate_execs_in_gpu_plan = ['GpuRunningWindowExec'],
conf = conf)
# Test that we can do a running window sum on floats and doubles. This becomes problematic because we do the agg in parallel
# which means that the result can switch back and forth from Inf to not Inf depending on the order of aggregations.
# We test this by limiting the range of the values in the sum to never hit Inf, and by using abs so we don't have
# positive and negative values that interfere with each other.
# the order returned should be consistent because the data ends up in a single task (no partitioning)
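# A minimal illustration (plain Python, not part of the test) of why grouping
# matters for float sums near the overflow boundary:
#
#     (1.7e308 + 1.7e308) + -1.7e308   # -> inf       (partial sum overflows first)
#     1.7e308 + (1.7e308 + -1.7e308)   # -> 1.7e308   (same operands, regrouped)
#
# A parallel aggregation is free to pick either grouping, so the test bounds the
# value range and uses abs() to keep every partial sum finite.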
@approximate_float
@pytest.mark.parametrize('batch_size', ['1000', '1g'], ids=idfn) # set the batch size so we can test multiple stream batches
def test_running_float_sum_no_part(batch_size):
conf = {'spark.rapids.sql.batchSizeBytes': batch_size,
'spark.rapids.sql.variableFloatAgg.enabled': True,
'spark.rapids.sql.castFloatToDecimal.enabled': True}
query_parts = ['a',
'sum(cast(b as double)) over (order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as shrt_dbl_sum',
'sum(abs(dbl)) over (order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as dbl_sum',
'sum(cast(b as float)) over (order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as shrt_flt_sum',
'sum(abs(flt)) over (order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as flt_sum']
gen = StructGen([('a', LongRangeGen()),('b', short_gen),('flt', float_gen),('dbl', double_gen)], nullable=False)
assert_gpu_and_cpu_are_equal_sql(
lambda spark : gen_df(spark, gen, length=1024 * 14),
"window_agg_table",
'select ' +
', '.join(query_parts) +
' from window_agg_table ',
validate_execs_in_gpu_plan = ['GpuRunningWindowExec'],
conf = conf)
# Rank aggregations are running window aggregations, but they care about the ordering. In most tests we don't
# allow duplicate order-by values, because that makes the results ambiguous: if two rows with the same
# order-by value end up swapped, something like a running sum can return different results. Here we are going
# to allow duplicates in the ordering, because there will be no other columns. That means a swap cannot
# matter: the only rows that can be switched are rows that are exactly the same.
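# For example, over a hypothetical order-by column with duplicates:
#
#     values:      10, 10, 20, 30, 30, 30
#     rank:         1,  1,  3,  4,  4,  4   (ties share a rank; gaps follow)
#     dense_rank:   1,  1,  2,  3,  3,  3   (ties share a rank; no gaps)
#
# Swapping the two 10s cannot change the output, because those rows are identical.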
@pytest.mark.parametrize('data_gen', all_basic_gens_no_nans + [decimal_gen_scale_precision], ids=meta_idfn('data:'))
def test_window_running_rank_no_part(data_gen):
    # Keep the batch size small. We have already tested these operators with exact inputs; this is mostly
    # testing the fixup operation.
conf = {'spark.rapids.sql.batchSizeBytes': 1000}
query_parts = ['a',
'rank() over (order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as rank_val',
'dense_rank() over (order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as dense_rank_val']
# When generating the ordering try really hard to have duplicate values
assert_gpu_and_cpu_are_equal_sql(
lambda spark : unary_op_df(spark, RepeatSeqGen(data_gen, length=500), length=1024 * 14),
"window_agg_table",
'select ' +
', '.join(query_parts) +
' from window_agg_table ',
validate_execs_in_gpu_plan = ['GpuRunningWindowExec'],
conf = conf)
# Rank aggregations are running window aggregations, but they care about the ordering. In most tests we don't
# allow duplicate order-by values, because that makes the results ambiguous: if two rows with the same
# order-by value end up swapped, something like a running sum can return different results. Here we are going
# to allow duplicates in the ordering, because there will be no other columns. That means a swap cannot
# matter: the only rows that can be switched are rows that are exactly the same.
# In a distributed setup the order of the partitions returned might be different, so we must ignore the order
# but small batch sizes can make sort very slow, so do the final order by locally
@ignore_order(local=True)
@pytest.mark.parametrize('data_gen', all_basic_gens + [decimal_gen_scale_precision], ids=idfn)
def test_window_running_rank(data_gen):
    # Keep the batch size small. We have already tested these operators with exact inputs; this is mostly
    # testing the fixup operation.
conf = {'spark.rapids.sql.batchSizeBytes': 1000}
query_parts = ['b', 'a',
'rank() over (partition by b order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as rank_val',
'dense_rank() over (partition by b order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as dense_rank_val']
# When generating the ordering try really hard to have duplicate values
assert_gpu_and_cpu_are_equal_sql(
lambda spark : two_col_df(spark, RepeatSeqGen(data_gen, length=500), RepeatSeqGen(data_gen, length=100), length=1024 * 14),
"window_agg_table",
'select ' +
', '.join(query_parts) +
' from window_agg_table ',
validate_execs_in_gpu_plan = ['GpuRunningWindowExec'],
conf = conf)
# This is for aggregations that work with a running window optimization. They don't need to be batched
# specially, but the optimization only applies if every aggregation in the query supports it.
# In a distributed setup the order of the partitions returned might be different, so we must ignore the order
# but small batch sizes can make sort very slow, so do the final order by locally
@ignore_order(local=True)
@pytest.mark.parametrize('batch_size', ['1000', '1g'], ids=idfn) # set the batch size so we can test multiple stream batches
@pytest.mark.parametrize('b_gen, c_gen', [(long_gen, x) for x in running_part_and_order_gens] +
[(x, long_gen) for x in all_basic_gens_no_nans + [decimal_gen_scale_precision]], ids=idfn)
def test_window_running(b_gen, c_gen, batch_size):
conf = {'spark.rapids.sql.batchSizeBytes': batch_size,
'spark.rapids.sql.hasNans': False,
'spark.rapids.sql.variableFloatAgg.enabled': True,
'spark.rapids.sql.castFloatToDecimal.enabled': True}
query_parts = ['b', 'a', 'row_number() over (partition by b order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as row_num',
'rank() over (partition by b order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as rank_val',
'dense_rank() over (partition by b order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as dense_rank_val',
'count(c) over (partition by b order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as count_col',
'min(c) over (partition by b order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as min_col',
'max(c) over (partition by b order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as max_col']
# Decimal precision can grow too large. Float and Double can get odd results for Inf/-Inf because of ordering
if isinstance(c_gen.data_type, NumericType) and (not isinstance(c_gen, FloatGen)) and (not isinstance(c_gen, DoubleGen)) and (not isinstance(c_gen, DecimalGen)):
query_parts.append('sum(c) over (partition by b order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as sum_col')
assert_gpu_and_cpu_are_equal_sql(
lambda spark : three_col_df(spark, LongRangeGen(), RepeatSeqGen(b_gen, length=100), c_gen, length=1024 * 14),
"window_agg_table",
'select ' +
', '.join(query_parts) +
' from window_agg_table ',
validate_execs_in_gpu_plan = ['GpuRunningWindowExec'],
conf = conf)
# Test that we can do a running window sum on floats, doubles and decimals. This becomes problematic because we do the agg in parallel
# which means that the result can switch back and forth from Inf to not Inf depending on the order of aggregations.
# We test this by limiting the range of the values in the sum to never hit Inf, and by using abs so we don't have
# positive and negative values that interfere with each other.
# decimals are problematic if the precision grows so high that the operation falls back to the CPU.
# In a distributed setup the order of the partitions returned might be different, so we must ignore the order
# but small batch sizes can make sort very slow, so do the final order by locally
@ignore_order(local=True)
@pytest.mark.parametrize('batch_size', ['1000', '1g'], ids=idfn) # set the batch size so we can test multiple stream batches
def test_window_running_float_decimal_sum(batch_size):
conf = {'spark.rapids.sql.batchSizeBytes': batch_size,
'spark.rapids.sql.variableFloatAgg.enabled': True,
'spark.rapids.sql.castFloatToDecimal.enabled': True}
query_parts = ['b', 'a',
                   'sum(cast(c as double)) over (partition by b order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as shrt_dbl_sum',
                   'sum(abs(dbl)) over (partition by b order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as dbl_sum',
                   'sum(cast(c as float)) over (partition by b order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as shrt_flt_sum',
                   'sum(abs(flt)) over (partition by b order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as flt_sum',
'sum(cast(c as Decimal(6,1))) over (partition by b order by a rows between UNBOUNDED PRECEDING AND CURRENT ROW) as dec_sum']
gen = StructGen([('a', LongRangeGen()),('b', RepeatSeqGen(int_gen, length=1000)),('c', short_gen),('flt', float_gen),('dbl', double_gen)], nullable=False)
assert_gpu_and_cpu_are_equal_sql(
lambda spark : gen_df(spark, gen, length=1024 * 14),
"window_agg_table",
'select ' +
', '.join(query_parts) +
' from window_agg_table ',
validate_execs_in_gpu_plan = ['GpuRunningWindowExec'],
conf = conf)
# In a distributed setup the order of the partitions returned might be different, so we must ignore the order
# but small batch sizes can make sort very slow, so do the final order by locally
@ignore_order(local=True)
@approximate_float
@pytest.mark.parametrize('batch_size', ['1000', '1g'], ids=idfn) # set the batch size so we can test multiple stream batches
@pytest.mark.parametrize('c_gen', lead_lag_data_gens, ids=idfn)
@pytest.mark.parametrize('a_b_gen', part_and_order_gens, ids=meta_idfn('partAndOrderBy:'))
def test_multi_types_window_aggs_for_rows_lead_lag(a_b_gen, c_gen, batch_size):
conf = {'spark.rapids.sql.batchSizeBytes': batch_size,
'spark.rapids.sql.hasNans': False}
data_gen = [
('a', RepeatSeqGen(a_b_gen, length=20)),
('b', a_b_gen),
('c', c_gen)]
# By default for many operations a range of unbounded to unbounded is used
# This will not work until https://github.com/NVIDIA/spark-rapids/issues/216
# is fixed.
# Ordering needs to include c because with nulls and especially on booleans
# it is possible to get a different ordering when it is ambiguous.
base_window_spec = Window.partitionBy('a').orderBy('b', 'c')
inclusive_window_spec = base_window_spec.rowsBetween(-10, 100)
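    # rowsBetween(-10, 100): a physical frame spanning 10 rows before through
    # 100 rows after the current row, within each partition.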
def do_it(spark):
df = gen_df(spark, data_gen, length=2048) \
.withColumn('inc_count_1', f.count('*').over(inclusive_window_spec)) \
.withColumn('inc_count_c', f.count('c').over(inclusive_window_spec)) \
.withColumn('lead_5_c', f.lead('c', 5).over(base_window_spec)) \
.withColumn('lag_1_c', f.lag('c', 1).over(base_window_spec)) \
.withColumn('row_num', f.row_number().over(base_window_spec))
if isinstance(c_gen, StructGen):
"""
The MIN()/MAX() aggregations amount to a RANGE query. These are not
currently supported on STRUCT columns.
Also, LEAD()/LAG() defaults cannot currently be specified for STRUCT
columns. `[ 10, 3.14159, "foobar" ]` isn't recognized as a valid STRUCT scalar.
"""
return df.withColumn('lead_def_c', f.lead('c', 2, None).over(base_window_spec)) \
.withColumn('lag_def_c', f.lag('c', 4, None).over(base_window_spec))
else:
default_val = gen_scalar_value(c_gen, force_no_nulls=False)
return df.withColumn('inc_max_c', f.max('c').over(inclusive_window_spec)) \
.withColumn('inc_min_c', f.min('c').over(inclusive_window_spec)) \
.withColumn('lead_def_c', f.lead('c', 2, default_val).over(base_window_spec)) \
.withColumn('lag_def_c', f.lag('c', 4, default_val).over(base_window_spec))
assert_gpu_and_cpu_are_equal_collect(do_it, conf=conf)
struct_with_arrays = StructGen(children=[
['child_int', int_gen],
['child_time', date_gen],
['child_string', string_gen],
['child_array', ArrayGen(int_gen, max_length=10)]])
lead_lag_struct_with_arrays_gen = [struct_with_arrays,
ArrayGen(struct_with_arrays, max_length=10),
StructGen(children=[['child_struct', struct_with_arrays]])]
@ignore_order(local=True)
@approximate_float
@pytest.mark.parametrize('struct_gen', lead_lag_struct_with_arrays_gen, ids=idfn)
@pytest.mark.parametrize('a_b_gen', part_and_order_gens, ids=meta_idfn('partAndOrderBy:'))
def test_lead_lag_for_structs_with_arrays(a_b_gen, struct_gen):
conf = {'spark.rapids.sql.hasNans': False}
data_gen = [
('a', RepeatSeqGen(a_b_gen, length=20)),
('b', IntegerGen(nullable=False, special_cases=[])),
('c', struct_gen)]
# By default for many operations a range of unbounded to unbounded is used
# This will not work until https://github.com/NVIDIA/spark-rapids/issues/216
# is fixed.
# Ordering needs to include c because with nulls and especially on booleans
# it is possible to get a different ordering when it is ambiguous.
base_window_spec = Window.partitionBy('a').orderBy('b')
def do_it(spark):
return gen_df(spark, data_gen, length=2048) \
.withColumn('lead_5_c', f.lead('c', 5).over(base_window_spec)) \
.withColumn('lag_1_c', f.lag('c', 1).over(base_window_spec))
assert_gpu_and_cpu_are_equal_collect(do_it, conf=conf)
lead_lag_array_data_gens =\
[ArrayGen(sub_gen, max_length=10) for sub_gen in lead_lag_data_gens] + \
[ArrayGen(ArrayGen(sub_gen, max_length=10), max_length=10) for sub_gen in lead_lag_data_gens] + \
[ArrayGen(ArrayGen(ArrayGen(sub_gen, max_length=10), max_length=10), max_length=10) \
for sub_gen in lead_lag_data_gens]
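# i.e. arrays over each leaf generator, nested one, two, and three levels deep.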
@ignore_order(local=True)
@pytest.mark.parametrize('d_gen', lead_lag_array_data_gens, ids=meta_idfn('agg:'))
@pytest.mark.parametrize('c_gen', [LongRangeGen()], ids=meta_idfn('orderBy:'))
@pytest.mark.parametrize('b_gen', [long_gen], ids=meta_idfn('orderBy:'))
@pytest.mark.parametrize('a_gen', [long_gen], ids=meta_idfn('partBy:'))
def test_window_aggs_for_rows_lead_lag_on_arrays(a_gen, b_gen, c_gen, d_gen):
data_gen = [
('a', RepeatSeqGen(a_gen, length=20)),
('b', b_gen),
('c', c_gen),
('d', d_gen),
('d_default', d_gen)]
assert_gpu_and_cpu_are_equal_sql(
lambda spark: gen_df(spark, data_gen, length=2048),
"window_agg_table",
'''
SELECT
LEAD(d, 5) OVER (PARTITION by a ORDER BY b,c) lead_d_5,
LEAD(d, 2, d_default) OVER (PARTITION by a ORDER BY b,c) lead_d_2_default,
LAG(d, 5) OVER (PARTITION by a ORDER BY b,c) lag_d_5,
LAG(d, 2, d_default) OVER (PARTITION by a ORDER BY b,c) lag_d_2_default
FROM window_agg_table
''')
# lead and lag don't currently work for string columns, so redo the tests, but just for strings
# without lead and lag
# In a distributed setup the order of the partitions returned might be different, so we must ignore the order
# but small batch sizes can make sort very slow, so do the final order by locally
@ignore_order(local=True)
@approximate_float
@pytest.mark.parametrize('c_gen', [string_gen], ids=idfn)
@pytest.mark.parametrize('a_b_gen', part_and_order_gens, ids=meta_idfn('partAndOrderBy:'))
def test_multi_types_window_aggs_for_rows(a_b_gen, c_gen):
data_gen = [
('a', RepeatSeqGen(a_b_gen, length=20)),
('b', a_b_gen),
('c', c_gen)]
# By default for many operations a range of unbounded to unbounded is used
# This will not work until https://github.com/NVIDIA/spark-rapids/issues/216
# is fixed.
# Ordering needs to include c because with nulls and especially on booleans
# it is possible to get a different ordering when it is ambiguous.
baseWindowSpec = Window.partitionBy('a').orderBy('b', 'c')
inclusiveWindowSpec = baseWindowSpec.rowsBetween(-10, 100)
def do_it(spark):
return gen_df(spark, data_gen, length=2048) \
.withColumn('inc_count_1', f.count('*').over(inclusiveWindowSpec)) \
.withColumn('inc_count_c', f.count('c').over(inclusiveWindowSpec)) \
.withColumn('inc_max_c', f.max('c').over(inclusiveWindowSpec)) \
.withColumn('inc_min_c', f.min('c').over(inclusiveWindowSpec)) \
.withColumn('rank_val', f.rank().over(baseWindowSpec)) \
.withColumn('dense_rank_val', f.dense_rank().over(baseWindowSpec)) \
.withColumn('row_num', f.row_number().over(baseWindowSpec))
assert_gpu_and_cpu_are_equal_collect(do_it, conf={'spark.rapids.sql.hasNans': 'false'})
# Test for RANGE queries, with timestamp order-by expressions.
# In a distributed setup the order of the partitions returned might be different, so we must ignore the order
# but small batch sizes can make sort very slow, so do the final order by locally
@ignore_order(local=True)
@pytest.mark.parametrize('data_gen', [_grpkey_longs_with_timestamps,
pytest.param(_grpkey_longs_with_nullable_timestamps)],
ids=idfn)
def test_window_aggs_for_ranges_timestamps(data_gen):
assert_gpu_and_cpu_are_equal_sql(
lambda spark: gen_df(spark, data_gen, length=2048),
"window_agg_table",
'select '
' sum(c) over '
' (partition by a order by b asc '
' range between interval 1 DAY 5 HOUR 3 MINUTE 2 SECOND 1 MILLISECOND 5 MICROSECOND preceding '
' and interval 1 DAY 5 HOUR 3 MINUTE 2 SECOND 1 MILLISECOND 5 MICROSECOND following) as sum_c_asc, '
' avg(c) over '
' (partition by a order by b asc '
' range between interval 1 DAY 5 HOUR 3 MINUTE 2 SECOND 1 MILLISECOND 5 MICROSECOND preceding '
' and interval 1 DAY 5 HOUR 3 MINUTE 2 SECOND 1 MILLISECOND 5 MICROSECOND following) as avg_c_asc, '
' max(c) over '
' (partition by a order by b desc '
' range between interval 2 DAY 5 HOUR 3 MINUTE 2 SECOND 1 MILLISECOND 5 MICROSECOND preceding '
' and interval 1 DAY 5 HOUR 3 MINUTE 2 SECOND 1 MILLISECOND 5 MICROSECOND following) as max_c_desc, '
' min(c) over '
' (partition by a order by b asc '
' range between interval 2 DAY 5 HOUR 3 MINUTE 2 SECOND 1 MILLISECOND 5 MICROSECOND preceding '
' and current row) as min_c_asc, '
' count(1) over '
' (partition by a order by b asc '
' range between CURRENT ROW and UNBOUNDED following) as count_1_asc, '
' count(c) over '
' (partition by a order by b asc '
' range between CURRENT ROW and UNBOUNDED following) as count_c_asc, '
' avg(c) over '
' (partition by a order by b asc '
' range between UNBOUNDED preceding and CURRENT ROW) as avg_c_unbounded, '
' sum(c) over '
' (partition by a order by b asc '
' range between UNBOUNDED preceding and CURRENT ROW) as sum_c_unbounded, '
' max(c) over '
' (partition by a order by b asc '
' range between UNBOUNDED preceding and UNBOUNDED following) as max_c_unbounded '
'from window_agg_table',
conf = {'spark.rapids.sql.castFloatToDecimal.enabled': True})
_gen_data_for_collect_list = [
('a', RepeatSeqGen(LongGen(), length=20)),
('b', LongRangeGen()),
('c_bool', BooleanGen()),
('c_short', ShortGen()),
('c_int', IntegerGen()),
('c_long', LongGen()),
('c_date', DateGen()),
('c_ts', TimestampGen()),
('c_byte', ByteGen()),
('c_string', StringGen()),
('c_float', FloatGen()),
('c_double', DoubleGen()),
('c_decimal', DecimalGen(precision=8, scale=3)),
('c_struct', StructGen(children=[
['child_int', IntegerGen()],
['child_time', DateGen()],
['child_string', StringGen()],
['child_decimal', DecimalGen(precision=8, scale=3)]])),
('c_array', ArrayGen(int_gen)),
('c_map', simple_string_to_string_map_gen)]
# SortExec does not support array type, so sort the result locally.
@ignore_order(local=True)
def test_window_aggs_for_rows_collect_list():
assert_gpu_and_cpu_are_equal_sql(
lambda spark : gen_df(spark, _gen_data_for_collect_list),
"window_collect_table",
'''
select
collect_list(c_bool) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as collect_bool,
collect_list(c_short) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as collect_short,
collect_list(c_int) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as collect_int,
collect_list(c_long) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as collect_long,
collect_list(c_date) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as collect_date,
collect_list(c_ts) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as collect_ts,
collect_list(c_byte) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as collect_byte,
collect_list(c_string) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as collect_string,
collect_list(c_float) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as collect_float,
collect_list(c_double) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as collect_double,
collect_list(c_decimal) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as collect_decimal,
collect_list(c_struct) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as collect_struct,
collect_list(c_array) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as collect_array,
collect_list(c_map) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as collect_map
from window_collect_table
''')
# SortExec does not support array type, so sort the result locally.
@ignore_order(local=True)
# This test is aimed more at Databricks and their running window optimization than at ours;
# that is why we do not yet validate that a GpuRunningWindowExec was inserted.
def test_running_window_function_exec_for_all_aggs():
assert_gpu_and_cpu_are_equal_sql(
lambda spark : gen_df(spark, _gen_data_for_collect_list),
"window_collect_table",
'''
select
sum(c_int) over
(partition by a order by b,c_int rows between UNBOUNDED PRECEDING AND CURRENT ROW) as sum_int,
min(c_long) over
(partition by a order by b,c_int rows between UNBOUNDED PRECEDING AND CURRENT ROW) as min_long,
max(c_date) over
(partition by a order by b,c_int rows between UNBOUNDED PRECEDING AND CURRENT ROW) as max_date,
count(1) over
(partition by a order by b,c_int rows between UNBOUNDED PRECEDING AND CURRENT ROW) as count_1,
count(*) over
(partition by a order by b,c_int rows between UNBOUNDED PRECEDING AND CURRENT ROW) as count_star,
row_number() over
(partition by a order by b,c_int) as row_num,
rank() over
(partition by a order by b,c_int) as rank_val,
dense_rank() over
(partition by a order by b,c_int) as dense_rank_val,
collect_list(c_float) over
(partition by a order by b,c_int rows between UNBOUNDED PRECEDING AND CURRENT ROW) as collect_float,
collect_list(c_decimal) over
(partition by a order by b,c_int rows between UNBOUNDED PRECEDING AND CURRENT ROW) as collect_decimal,
collect_list(c_struct) over
(partition by a order by b,c_int rows between UNBOUNDED PRECEDING AND CURRENT ROW) as collect_struct
from window_collect_table
''')
# Generates some repeated values to test the deduplication of GpuCollectSet.
# Note that GpuCollectSet does not yet support the struct type, so none is included here.
_gen_data_for_collect_set = [
('a', RepeatSeqGen(LongGen(), length=20)),
('b', LongRangeGen()),
('c_bool', RepeatSeqGen(BooleanGen(), length=15)),
('c_int', RepeatSeqGen(IntegerGen(), length=15)),
('c_long', RepeatSeqGen(LongGen(), length=15)),
('c_short', RepeatSeqGen(ShortGen(), length=15)),
('c_date', RepeatSeqGen(DateGen(), length=15)),
('c_timestamp', RepeatSeqGen(TimestampGen(), length=15)),
('c_byte', RepeatSeqGen(ByteGen(), length=15)),
('c_string', RepeatSeqGen(StringGen(), length=15)),
('c_float', RepeatSeqGen(FloatGen(), length=15)),
('c_double', RepeatSeqGen(DoubleGen(), length=15)),
('c_decimal', RepeatSeqGen(DecimalGen(precision=8, scale=3), length=15)),
# case to verify the NAN_UNEQUAL strategy
('c_fp_nan', RepeatSeqGen(FloatGen().with_special_case(math.nan, 200.0), length=5)),
]
# SortExec does not support array type, so sort the result locally.
@ignore_order(local=True)
def test_window_aggs_for_rows_collect_set():
assert_gpu_and_cpu_are_equal_sql(
lambda spark: gen_df(spark, _gen_data_for_collect_set),
"window_collect_table",
'''
select a, b,
sort_array(cc_bool),
sort_array(cc_int),
sort_array(cc_long),
sort_array(cc_short),
sort_array(cc_date),
sort_array(cc_ts),
sort_array(cc_byte),
sort_array(cc_str),
sort_array(cc_float),
sort_array(cc_double),
sort_array(cc_decimal),
sort_array(cc_fp_nan)
from (
select a, b,
collect_set(c_bool) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as cc_bool,
collect_set(c_int) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as cc_int,
collect_set(c_long) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as cc_long,
collect_set(c_short) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as cc_short,
collect_set(c_date) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as cc_date,
collect_set(c_timestamp) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as cc_ts,
collect_set(c_byte) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as cc_byte,
collect_set(c_string) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as cc_str,
collect_set(c_float) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as cc_float,
collect_set(c_double) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as cc_double,
collect_set(c_decimal) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as cc_decimal,
collect_set(c_fp_nan) over
(partition by a order by b,c_int rows between CURRENT ROW and UNBOUNDED FOLLOWING) as cc_fp_nan
from window_collect_table
) t
''')
# In a distributed setup the order of the partitions returned might be different, so we must ignore the order
# but small batch sizes can make sort very slow, so do the final order by locally
@ignore_order(local=True)
@pytest.mark.parametrize('part_gen', [StructGen([["a", long_gen]]), ArrayGen(long_gen)], ids=meta_idfn('partBy:'))
# For arrays the sort and hash partition are also not supported
@allow_non_gpu('WindowExec', 'Alias', 'WindowExpression', 'AggregateExpression', 'Count', 'WindowSpecDefinition', 'SpecifiedWindowFrame', 'Literal', 'SortExec', 'SortOrder', 'ShuffleExchangeExec', 'HashPartitioning')
def test_nested_part_fallback(part_gen):
data_gen = [
('a', RepeatSeqGen(part_gen, length=20)),
('b', LongRangeGen()),
('c', int_gen)]
window_spec = Window.partitionBy('a').orderBy('b').rowsBetween(-5, 5)
def do_it(spark):
return gen_df(spark, data_gen, length=2048) \
.withColumn('rn', f.count('c').over(window_spec))
assert_gpu_fallback_collect(do_it, 'WindowExec')
# In a distributed setup the order of the partitions returned might be different, so we must ignore the order
# but small batch sizes can make sort very slow, so do the final order by locally
@ignore_order(local=True)
@pytest.mark.parametrize('ride_along', all_basic_gens + decimal_gens + array_gens_sample + struct_gens_sample + map_gens_sample, ids=idfn)
def test_window_ride_along(ride_along):
assert_gpu_and_cpu_are_equal_sql(
lambda spark : gen_df(spark, [('a', LongRangeGen()), ('b', ride_along)]),
"window_agg_table",
'select *,'
' row_number() over (order by a) as row_num '
'from window_agg_table ',
conf = allow_negative_scale_of_decimal_conf)
| 53.784633 | 216 | 0.673557 | 6,622 | 46,201 | 4.492298 | 0.0746 | 0.027531 | 0.043364 | 0.038725 | 0.815584 | 0.786103 | 0.762102 | 0.73689 | 0.70835 | 0.690366 | 0 | 0.017963 | 0.228826 | 46,201 | 858 | 217 | 53.847319 | 0.816975 | 0.172594 | 0 | 0.497345 | 0 | 0.033628 | 0.329777 | 0.030167 | 0 | 0 | 0 | 0 | 0.040708 | 1 | 0.049558 | false | 0 | 0.015929 | 0.00708 | 0.077876 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
0ef64d6c5d081831904e96f1a495fc326c781bd3 | 4,145 | py | Python | collect_images.py | hwyncho/FacialRecognition | af5eed8f28ade79918a24cd6cfb5722b14814461 | [
"MIT"
] | 9 | 2017-09-25T14:47:31.000Z | 2022-01-17T10:12:41.000Z | collect_images.py | hwyncho/FacialRecognition | af5eed8f28ade79918a24cd6cfb5722b14814461 | [
"MIT"
] | null | null | null | collect_images.py | hwyncho/FacialRecognition | af5eed8f28ade79918a24cd6cfb5722b14814461 | [
"MIT"
] | 9 | 2017-09-25T14:47:32.000Z | 2020-07-24T02:19:31.000Z | _TIMEOUT = 10
def _collect_from_bing(q, start=0, stop=28, save_dir='./'):
"""
Searches for and collects images from Bing.
Parameters
==========
q : str
search keyword
start : int
first index of the images to collect
stop : int
last index of the images to collect
save_dir : str
directory in which to save the images
"""
import math
import os
import requests
from pyquery import PyQuery as pq
# create the save directory if it does not already exist.
if not os.path.exists('{0}/{1}'.format(save_dir, q)):
os.makedirs('{0}/{1}'.format(save_dir, q))
url = 'https://www.bing.com/images/search'
params = {
'q': q,
'form': 'A',
'qft': '+filterui:face-face'
}
# crawl links of images
links = []
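# page through the results 28 at a time (the page size this scraper assumes Bing uses)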
for n in range(start, stop, 28):
params['first'] = n
params['count'] = 28
response = requests.get(url=url, params=params, timeout=_TIMEOUT)
html = pq(response.text)
# parse links of images
count = 0
for item in html('#main .row .item .thumb').items():
links.append(item.attr('href'))
count += 1
if count < 28:
break
# save images
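# 'a' is the zero-padding width for the numeric part of the file names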
a = int(math.log10(stop)) + 1
save_count = 0
if len(links) > 0:
for (i, link) in enumerate(links):
try:
img = requests.get(url=link, timeout=_TIMEOUT).content
except requests.RequestException:
continue
file_name = str(i).zfill(a)
with open('{0}/{1}/bing_{2}.jpg'.format(save_dir, q, file_name), 'wb') as f:
f.write(img)
save_count += 1
print('Number of images saved: {}'.format(save_count))
def _collect_from_google(q, start=0, stop=20, save_dir='./'):
"""
Searches for and collects images from Google.
Parameters
==========
q : str
search keyword
start : int
first index of the images to collect
stop : int
last index of the images to collect
save_dir : str
directory in which to save the images
"""
import math
import os
import requests
from pyquery import PyQuery as pq
# create the save directory if it does not already exist.
if not os.path.exists('{0}/{1}'.format(save_dir, q)):
os.makedirs('{0}/{1}'.format(save_dir, q))
url = 'https://www.google.com/search'
params = {
'q': q,
'tbm': 'isch',
'tbs': 'itp:face'
}
# crawl links of images
links = []
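# page through the results 20 at a time (the page size this scraper assumes Google uses)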
for n in range(start, stop, 20):
params['start'] = n
response = requests.get(url=url, params=params, timeout=_TIMEOUT)
html = pq(response.text)
# parse links of images
for item in html('#ires tr a img').items():
links.append(item.attr('src'))
# save images
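# 'a' is the zero-padding width for the numeric part of the file names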
a = int(math.log10(stop)) + 1
save_count = 0
if len(links) > 0:
for (i, link) in enumerate(links):
try:
img = requests.get(url=link, timeout=_TIMEOUT).content
except requests.RequestException:
continue
file_name = str(i).zfill(a)
with open('{0}/{1}/google_{2}.jpg'.format(save_dir, q, file_name), 'wb') as f:
f.write(img)
save_count += 1
print('Number of images saved: {}'.format(save_count))
def collect(from_, q, start=0, stop=20, save_dir='./'):
"""
Searches for and collects images.
Parameters
==========
from_ : str
which source to collect from: one of 'all', 'bing', or 'google'
q : str
search keyword
start : int
first index of the images to collect
stop : int
last index of the images to collect
save_dir : str
directory in which to save the images
"""
if from_ == 'all':
_collect_from_bing(q, start, stop, save_dir)
_collect_from_google(q, start, stop, save_dir)
elif from_ == 'bing':
_collect_from_bing(q, start, stop, save_dir)
elif from_ == 'google':
_collect_from_google(q, start, stop, save_dir)
else:
raise ValueError("argument 'from_' must be one of 'all', 'bing', and 'google'.")
| 26.401274 | 90 | 0.550784 | 548 | 4,145 | 4.062044 | 0.217153 | 0.050314 | 0.048518 | 0.056604 | 0.836029 | 0.802336 | 0.802336 | 0.778077 | 0.718778 | 0.718778 | 0 | 0.016732 | 0.322316 | 4,145 | 156 | 91 | 26.570513 | 0.775721 | 0.227021 | 0 | 0.609756 | 0 | 0 | 0.126365 | 0.007278 | 0 | 0 | 0 | 0 | 0 | 1 | 0.036585 | false | 0 | 0.097561 | 0 | 0.134146 | 0.02439 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
164fb8583a195b2c1253489c868a6642832056c1 | 679 | py | Python | python/anyascii/_data/_025.py | casept/anyascii | d4f426b91751254b68eaa84c6cd23099edd668e6 | [
"ISC"
] | null | null | null | python/anyascii/_data/_025.py | casept/anyascii | d4f426b91751254b68eaa84c6cd23099edd668e6 | [
"ISC"
] | null | null | null | python/anyascii/_data/_025.py | casept/anyascii | d4f426b91751254b68eaa84c6cd23099edd668e6 | [
"ISC"
] | null | null | null | b='- - | | - - | | - - | | + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + - - | | - | + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + / \\ X - | - | - | - | - | - | # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # :black_small_square: :white_small_square: # # # # # # ^ ^ ^ ^ :arrow_forward: > > > > > v v v v :arrow_backward: < < < < < * * * * * * * * * * * * * * * * ( ) * * * * * * * * * * / \\ / \\ * # # # # # ^ ^ ^ * # # # # * * * * / \\ \\ :white_medium_square: :black_medium_square: :white_medium_small_square: :black_medium_small_square: /' | 679 | 679 | 0.231222 | 30 | 679 | 4.7 | 0.366667 | 0.312057 | 0.042553 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.375552 | 679 | 1 | 679 | 679 | 0.332547 | 0 | 0 | 0 | 0 | 1 | 0.992647 | 0.141176 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
1682ac558dff860c7c0f8e88943157dab077c76e | 70 | py | Python | datasets/__init__.py | Cppowboy/StaticHyperNetwork | 63c9cc17d1ebf9809129e736bbfddf1bf0374fdd | [
"Apache-2.0"
] | null | null | null | datasets/__init__.py | Cppowboy/StaticHyperNetwork | 63c9cc17d1ebf9809129e736bbfddf1bf0374fdd | [
"Apache-2.0"
] | null | null | null | datasets/__init__.py | Cppowboy/StaticHyperNetwork | 63c9cc17d1ebf9809129e736bbfddf1bf0374fdd | [
"Apache-2.0"
] | null | null | null | from datasets.mnist import Mnist
from datasets.cifar10 import Cifar10
| 23.333333 | 36 | 0.857143 | 10 | 70 | 6 | 0.5 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.064516 | 0.114286 | 70 | 2 | 37 | 35 | 0.903226 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
1695ab45c352ee1488606757ec9244f4692dc9c4 | 66 | py | Python | dlist_top/types/__init__.py | dlist-top/client-py | f69c2c1fff7f93836cb4bbdc7247f9b675c6a940 | [
"MIT"
] | null | null | null | dlist_top/types/__init__.py | dlist-top/client-py | f69c2c1fff7f93836cb4bbdc7247f9b675c6a940 | [
"MIT"
] | null | null | null | dlist_top/types/__init__.py | dlist-top/client-py | f69c2c1fff7f93836cb4bbdc7247f9b675c6a940 | [
"MIT"
] | null | null | null | from .entity import *
from .payload import *
from .events import * | 22 | 22 | 0.742424 | 9 | 66 | 5.444444 | 0.555556 | 0.408163 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 66 | 3 | 23 | 22 | 0.890909 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
16e29a738da69e33c8e8d534e74e38b338e347ae | 25,745 | py | Python | tests/layer_tests/onnx_tests/test_conv.py | pfinashx/openvino | 1d417e888b508415510fb0a92e4a9264cf8bdef7 | [
"Apache-2.0"
] | 1 | 2022-02-26T17:33:44.000Z | 2022-02-26T17:33:44.000Z | tests/layer_tests/onnx_tests/test_conv.py | pfinashx/openvino | 1d417e888b508415510fb0a92e4a9264cf8bdef7 | [
"Apache-2.0"
] | 18 | 2022-01-21T08:42:58.000Z | 2022-03-28T13:21:31.000Z | tests/layer_tests/onnx_tests/test_conv.py | pfinashx/openvino | 1d417e888b508415510fb0a92e4a9264cf8bdef7 | [
"Apache-2.0"
] | null | null | null | # Copyright (C) 2018-2022 Intel Corporation
# SPDX-License-Identifier: Apache-2.0
import os
import numpy as np
import pytest
from common.layer_test_class import check_ir_version
from common.onnx_layer_test_class import OnnxRuntimeLayerTest
from unit_tests.utils.graph import build_graph
class TestConv(OnnxRuntimeLayerTest):
def _prepare_input(self, inputs_dict):
for input in inputs_dict.keys():
inputs_dict[input] = np.random.randn(*inputs_dict[input]).astype(np.float32)
return inputs_dict
def create_net(self, shape, weights_shape, dilations, group, pads, strides, bias, ir_version, auto_pad=None):
"""
ONNX net IR net
Input->Conv->Output => Input->Convolution
"""
#
# Create ONNX model
#
import onnx
from onnx import helper
from onnx import TensorProto
output_shape = np.array(shape)
output_shape[1] = group
_pads = np.array(pads).reshape([2, -1])
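# Expected output size per spatial dim follows the standard convolution formula:
# out = floor((in + pad_begin + pad_end - kernel_extent) / stride) + 1,
# where kernel_extent = dilation * (kernel - 1) + 1 is the dilated kernel span.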
kernel_extent = np.array(dilations) * (np.array(weights_shape[2:]) - 1) + 1
spatial_val_wo_stride = shape[2:] + np.add(_pads[0, :], _pads[1, :]) - kernel_extent
output_shape[2:] = (spatial_val_wo_stride.astype(np.float64) / strides + 1).astype(np.int64)
output_shape = output_shape.astype(np.int64).tolist()
input = helper.make_tensor_value_info('input', TensorProto.FLOAT, shape)
output = helper.make_tensor_value_info('output', TensorProto.FLOAT, output_shape)
weights_const = np.random.randn(*weights_shape).astype(np.float32)
node_weights_def = onnx.helper.make_node(
'Constant',
inputs=[],
outputs=['weights'],
value=helper.make_tensor(
name='const_tensor',
data_type=TensorProto.FLOAT,
dims=weights_const.shape,
vals=weights_const.flatten(),
),
)
conv_args = dict(kernel_shape=weights_shape[2:],
dilations=dilations,
group=group,
strides=strides)
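# With SAME_UPPER/SAME_LOWER the runtime computes the padding itself, so the
# explicit pads are omitted from the node below; they are still used above to
# derive the expected output shape for the reference IR.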
if pads and auto_pad not in ['SAME_UPPER', 'SAME_LOWER']:
conv_args['pads'] = pads
if auto_pad:
conv_args['auto_pad'] = auto_pad
if bias:
bias_const = np.random.randint(-10, 10, weights_shape[0]).astype(np.float32)
node_bias_def = onnx.helper.make_node(
'Constant',
inputs=[],
outputs=['bias'],
value=helper.make_tensor(
name='const_tensor',
data_type=TensorProto.FLOAT,
dims=bias_const.shape,
vals=bias_const.flatten(),
),
)
node_def = onnx.helper.make_node(
'Conv',
inputs=['input', 'weights', 'bias'],
outputs=['output'],
**conv_args
)
nodes = [node_weights_def, node_bias_def, node_def]
else:
node_def = onnx.helper.make_node(
'Conv',
inputs=['input', 'weights'],
outputs=['output'],
**conv_args
)
nodes = [node_weights_def, node_def]
# Create the graph (GraphProto)
graph_def = helper.make_graph(
nodes,
'test_model',
[input],
[output],
)
# Create the model (ModelProto)
onnx_net = helper.make_model(graph_def, producer_name='test_model')
#
# Create reference IR net
#
ref_net = None
if check_ir_version(10, None, ir_version):
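# For 3-dim (1D conv) inputs the reference graph wraps the (Group)Convolution
# in Reshape nodes that add and then remove a unit spatial dimension, since the
# IR convolution here is modeled on 4D shapes.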
if len(shape) == 3:
input_shape = shape.copy()
input_shape.insert(2, 1)
node_shape = output_shape.copy()
node_shape.insert(2, 1)
nodes_attributes = {
'input': {'kind': 'op', 'type': 'Parameter'},
'input_data': {'shape': shape, 'kind': 'data'},
'before_shape_const_indata': {'shape': [len(input_shape)], 'value': input_shape, 'kind': 'data'},
'before_shape_const': {'kind': 'op', 'type': 'Const'},
'before_shape_const_data': {'shape': [len(input_shape)], 'kind': 'data'},
'reshape_before': {'kind': 'op', 'type': 'Reshape'},
'reshape_before_data': {'shape': input_shape, 'kind': 'data'},
'kernel_indata': {'kind': 'data', 'shape': [len(weights_const.flatten())]},
'kernel': {'kind': 'op', 'type': 'Const'},
'kernel_data': {'kind': 'data', 'value': None},
'node': {'kind': 'op', 'type': 'Convolution' if group == 1 else 'GroupConvolution',
'dilations': [1, dilations[0]],
'pads_begin': [0, _pads[0, 0]], 'pads_end': [0, _pads[1, 0]]},
'node_data': {'shape': node_shape, 'kind': 'data'},
'after_shape_const_indata': {'shape': [len(output_shape)], 'value': output_shape, 'kind': 'data'},
'after_shape_const': {'kind': 'op', 'type': 'Const'},
'after_shape_const_data': {'shape': [len(output_shape)], 'kind': 'data'},
'reshape_after': {'kind': 'op', 'type': 'Reshape'},
'reshape_after_data': {'shape': output_shape, 'kind': 'data'},
'result': {'kind': 'op', 'type': 'Result'}}
edges = [('input', 'input_data'),
('input_data', 'reshape_before'),
('before_shape_const_indata', 'before_shape_const'),
('before_shape_const', 'before_shape_const_data'),
('before_shape_const_data', 'reshape_before'),
('reshape_before', 'reshape_before_data'),
('reshape_before_data', 'node'),
('kernel_indata', 'kernel'),
('kernel', 'kernel_data'),
('kernel_data', 'node'),
('node', 'node_data'),
('node_data', 'reshape_after'),
('after_shape_const_indata', 'after_shape_const'),
('after_shape_const', 'after_shape_const_data'),
('after_shape_const_data', 'reshape_after'),
('reshape_after', 'reshape_after_data')]
if bias:
nodes_attributes.update({'const_indata': {'kind': 'data', 'value': bias_const.flatten()},
'const': {'kind': 'op', 'type': 'Const'},
'const_data': {'kind': 'data', 'shape': None},
'bias': {'type': 'Add', 'kind': 'op'},
'bias_data': {'kind': 'data', 'shape': output_shape}})
edges += [('reshape_after_data', 'bias'),
('const_indata', 'const'),
('const', 'const_data'),
('const_data', 'bias'),
('bias', 'bias_data'),
('bias_data', 'result')]
else:
edges += [('reshape_after_data', 'result')]
ref_net = build_graph(nodes_attributes, edges)
else:
_weights_shape = weights_shape.copy()
if group != 1:
_weights_shape.insert(1, 1)
nodes_attributes = {
'input': {'kind': 'op', 'type': 'Parameter'},
'input_data': {'shape': shape, 'kind': 'data'},
'kernel_indata': {'kind': 'data', 'value': weights_const.flatten()},
'kernel': {'kind': 'op', 'type': 'Const'},
'kernel_data': {'kind': 'data', 'shape': _weights_shape},
'node': {'kind': 'op', 'type': 'Convolution' if group == 1 else 'GroupConvolution',
'dilations': dilations, 'pads_begin': _pads[0, :], 'pads_end': _pads[1, :]},
'node_data': {'shape': output_shape, 'kind': 'data'},
'result': {'kind': 'op', 'type': 'Result'}}
edges = [('input', 'input_data'),
('input_data', 'node'),
('kernel_indata', 'kernel'),
('kernel', 'kernel_data'),
('kernel_data', 'node'),
('node', 'node_data')]
if bias:
nodes_attributes.update({'const_indata': {'kind': 'data', 'value': bias_const.flatten()},
'const': {'kind': 'op', 'type': 'Const'},
'const_data': {'kind': 'data', 'shape': None},
'bias': {'type': 'Add', 'kind': 'op'},
'bias_data': {'kind': 'data', 'shape': output_shape}})
edges += [('node_data', 'bias'),
('const_indata', 'const'),
('const', 'const_data'),
('const_data', 'bias'),
('bias', 'bias_data'),
('bias_data', 'result')]
else:
edges += [('node_data', 'result')]
ref_net = build_graph(nodes_attributes, edges)
return onnx_net, ref_net
test_data_3D = [
dict(weights_shape=[1, 3, 3], group=1),
dict(weights_shape=[1, 3, 5], group=1),
dict(weights_shape=[3, 1, 3], group=3),
dict(weights_shape=[3, 1, 5], group=3)]
test_data_3D_autopad = [
dict(weights_shape=[1, 3, 3], group=1, pads=[1, 1], strides=[1], dilations=[1]),
dict(weights_shape=[1, 3, 3], group=1, pads=[2, 2], strides=[1], dilations=[2]),
dict(weights_shape=[1, 3, 3], group=1, pads=[1, 1], strides=[2], dilations=[1]),
dict(weights_shape=[1, 3, 3], group=1, pads=[2, 2], strides=[2], dilations=[2]),
dict(weights_shape=[1, 3, 5], group=1, pads=[2, 2], strides=[1], dilations=[1]),
dict(weights_shape=[1, 3, 5], group=1, pads=[4, 4], strides=[1], dilations=[2]),
dict(weights_shape=[1, 3, 5], group=1, pads=[2, 2], strides=[2], dilations=[1]),
dict(weights_shape=[1, 3, 5], group=1, pads=[4, 4], strides=[2], dilations=[2]),
dict(weights_shape=[3, 1, 3], group=3, pads=[1, 1], strides=[1], dilations=[1]),
dict(weights_shape=[3, 1, 3], group=3, pads=[2, 2], strides=[1], dilations=[2]),
dict(weights_shape=[3, 1, 3], group=3, pads=[1, 1], strides=[2], dilations=[1]),
dict(weights_shape=[3, 1, 3], group=3, pads=[2, 2], strides=[2], dilations=[2]),
dict(weights_shape=[3, 1, 5], group=3, pads=[2, 2], strides=[1], dilations=[1]),
dict(weights_shape=[3, 1, 5], group=3, pads=[4, 4], strides=[1], dilations=[2]),
dict(weights_shape=[3, 1, 5], group=3, pads=[2, 2], strides=[2], dilations=[1]),
dict(weights_shape=[3, 1, 5], group=3, pads=[4, 4], strides=[2], dilations=[2])]
test_data_4D_precommit = [
dict(weights_shape=[1, 3, 3, 3], group=1),
dict(weights_shape=[3, 1, 3, 3], group=3)]
test_data_4D = [
dict(weights_shape=[1, 3, 3, 3], group=1),
dict(weights_shape=[1, 3, 5, 3], group=1),
dict(weights_shape=[3, 1, 3, 3], group=3),
dict(weights_shape=[3, 1, 3, 5], group=3)]
test_data_4D_autopad = [
dict(weights_shape=[1, 3, 3, 3], group=1, pads=[1, 1, 1, 1], strides=[1, 1], dilations=[1, 1]),
dict(weights_shape=[1, 3, 3, 3], group=1, pads=[2, 2, 2, 2], strides=[1, 1], dilations=[2, 2]),
dict(weights_shape=[1, 3, 3, 3], group=1, pads=[3, 5, 3, 5], strides=[1, 1], dilations=[3, 5]),
dict(weights_shape=[1, 3, 3, 3], group=1, pads=[1, 1, 1, 1], strides=[2, 2], dilations=[1, 1]),
dict(weights_shape=[1, 3, 3, 3], group=1, pads=[2, 2, 2, 2], strides=[2, 2], dilations=[2, 2]),
dict(weights_shape=[1, 3, 3, 3], group=1, pads=[3, 5, 3, 5], strides=[2, 2], dilations=[3, 5]),
dict(weights_shape=[1, 3, 3, 3], group=1, pads=[1, 0, 1, 0], strides=[3, 5], dilations=[1, 1]),
dict(weights_shape=[1, 3, 3, 3], group=1, pads=[2, 0, 2, 0], strides=[3, 5], dilations=[2, 2]),
dict(weights_shape=[1, 3, 3, 3], group=1, pads=[3, 3, 3, 3], strides=[3, 5], dilations=[3, 5]),
dict(weights_shape=[1, 3, 5, 3], group=1, pads=[2, 1, 2, 1], strides=[1, 1], dilations=[1, 1]),
dict(weights_shape=[1, 3, 5, 3], group=1, pads=[4, 2, 4, 2], strides=[1, 1], dilations=[2, 2]),
dict(weights_shape=[1, 3, 5, 3], group=1, pads=[6, 5, 6, 5], strides=[1, 1], dilations=[3, 5]),
dict(weights_shape=[1, 3, 5, 3], group=1, pads=[2, 1, 2, 1], strides=[2, 2], dilations=[1, 1]),
dict(weights_shape=[1, 3, 5, 3], group=1, pads=[4, 2, 4, 2], strides=[2, 2], dilations=[2, 2]),
dict(weights_shape=[1, 3, 5, 3], group=1, pads=[6, 5, 6, 5], strides=[2, 2], dilations=[3, 5]),
dict(weights_shape=[1, 3, 5, 3], group=1, pads=[2, 0, 2, 0], strides=[3, 5], dilations=[1, 1]),
dict(weights_shape=[1, 3, 5, 3], group=1, pads=[4, 0, 4, 0], strides=[3, 5], dilations=[2, 2]),
dict(weights_shape=[1, 3, 5, 3], group=1, pads=[6, 3, 6, 3], strides=[3, 5], dilations=[3, 5]),
dict(weights_shape=[3, 1, 3, 3], group=3, pads=[1, 1, 1, 1], strides=[1, 1], dilations=[1, 1]),
dict(weights_shape=[3, 1, 3, 3], group=3, pads=[2, 2, 2, 2], strides=[1, 1], dilations=[2, 2]),
dict(weights_shape=[3, 1, 3, 3], group=3, pads=[3, 5, 3, 5], strides=[1, 1], dilations=[3, 5]),
dict(weights_shape=[3, 1, 3, 3], group=3, pads=[1, 1, 1, 1], strides=[2, 2], dilations=[1, 1]),
dict(weights_shape=[3, 1, 3, 3], group=3, pads=[2, 2, 2, 2], strides=[2, 2], dilations=[2, 2]),
dict(weights_shape=[3, 1, 3, 3], group=3, pads=[3, 5, 3, 5], strides=[2, 2], dilations=[3, 5]),
dict(weights_shape=[3, 1, 3, 3], group=3, pads=[1, 0, 1, 0], strides=[3, 5], dilations=[1, 1]),
dict(weights_shape=[3, 1, 3, 3], group=3, pads=[2, 0, 2, 0], strides=[3, 5], dilations=[2, 2]),
dict(weights_shape=[3, 1, 3, 3], group=3, pads=[3, 3, 3, 3], strides=[3, 5], dilations=[3, 5]),
dict(weights_shape=[3, 1, 3, 5], group=3, pads=[1, 2, 1, 2], strides=[1, 1], dilations=[1, 1]),
dict(weights_shape=[3, 1, 3, 5], group=3, pads=[2, 4, 2, 4], strides=[1, 1], dilations=[2, 2]),
dict(weights_shape=[3, 1, 3, 5], group=3, pads=[3, 10, 3, 10], strides=[1, 1], dilations=[3, 5]),
dict(weights_shape=[3, 1, 3, 5], group=3, pads=[1, 2, 1, 2], strides=[2, 2], dilations=[1, 1]),
dict(weights_shape=[3, 1, 3, 5], group=3, pads=[2, 4, 2, 4], strides=[2, 2], dilations=[2, 2]),
dict(weights_shape=[3, 1, 3, 5], group=3, pads=[3, 10, 3, 10], strides=[2, 2], dilations=[3, 5]),
dict(weights_shape=[3, 1, 3, 5], group=3, pads=[1, 0, 1, 0], strides=[3, 5], dilations=[1, 1]),
dict(weights_shape=[3, 1, 3, 5], group=3, pads=[2, 2, 2, 2], strides=[3, 5], dilations=[2, 2]),
dict(weights_shape=[3, 1, 3, 5], group=3, pads=[3, 8, 3, 8], strides=[3, 5], dilations=[3, 5])]
test_data_5D_precommit = [
dict(weights_shape=[1, 3, 3, 3, 3], group=1),
dict(weights_shape=[3, 1, 3, 3, 3], group=3)]
test_data_5D = [
dict(weights_shape=[1, 3, 3, 3, 3], group=1),
dict(weights_shape=[1, 3, 3, 4, 5], group=1),
dict(weights_shape=[3, 1, 3, 3, 3], group=3),
dict(weights_shape=[3, 1, 5, 4, 3], group=3)]
test_data_5D_autopad = [
dict(weights_shape=[1, 3, 3, 3, 3], group=1, pads=[1, 1, 1, 1, 1, 1], strides=[1, 1, 1], dilations=[1, 1, 1]),
dict(weights_shape=[1, 3, 3, 3, 3], group=1, pads=[2, 2, 2, 2, 2, 2], strides=[1, 1, 1], dilations=[2, 2, 2]),
dict(weights_shape=[1, 3, 3, 3, 3], group=1, pads=[3, 4, 5, 3, 4, 5], strides=[1, 1, 1], dilations=[3, 4, 5]),
dict(weights_shape=[1, 3, 3, 3, 3], group=1, pads=[1, 1, 1, 1, 1, 1], strides=[2, 2, 2], dilations=[1, 1, 1]),
dict(weights_shape=[1, 3, 3, 3, 3], group=1, pads=[2, 2, 2, 2, 2, 2], strides=[2, 2, 2], dilations=[2, 2, 2]),
dict(weights_shape=[1, 3, 3, 3, 3], group=1, pads=[3, 4, 5, 3, 4, 5], strides=[2, 2, 2], dilations=[3, 4, 5]),
dict(weights_shape=[1, 3, 3, 3, 3], group=1, pads=[1, 1, 0, 1, 1, 0], strides=[3, 4, 5], dilations=[1, 1, 1]),
dict(weights_shape=[1, 3, 3, 3, 3], group=1, pads=[2, 2, 0, 2, 2, 0], strides=[3, 4, 5], dilations=[2, 2, 2]),
dict(weights_shape=[1, 3, 3, 3, 3], group=1, pads=[3, 4, 3, 3, 4, 3], strides=[3, 4, 5], dilations=[3, 4, 5]),
dict(weights_shape=[1, 3, 3, 4, 5], group=1, pads=[1, 1, 2, 1, 2, 2], strides=[1, 1, 1], dilations=[1, 1, 1]),
dict(weights_shape=[1, 3, 3, 4, 5], group=1, pads=[2, 3, 4, 2, 3, 4], strides=[1, 1, 1], dilations=[2, 2, 2]),
dict(weights_shape=[1, 3, 3, 4, 5], group=1, pads=[3, 6, 10, 3, 6, 10], strides=[1, 1, 1], dilations=[3, 4, 5]),
dict(weights_shape=[1, 3, 3, 4, 5], group=1, pads=[1, 1, 2, 1, 2, 2], strides=[2, 2, 2], dilations=[1, 1, 1]),
dict(weights_shape=[1, 3, 3, 4, 5], group=1, pads=[2, 3, 4, 2, 3, 4], strides=[2, 2, 2], dilations=[2, 2, 2]),
dict(weights_shape=[1, 3, 3, 4, 5], group=1, pads=[3, 6, 10, 3, 6, 10], strides=[2, 2, 2], dilations=[3, 4, 5]),
dict(weights_shape=[1, 3, 3, 4, 5], group=1, pads=[1, 1, 0, 1, 2, 0], strides=[3, 4, 5], dilations=[1, 1, 1]),
dict(weights_shape=[1, 3, 3, 4, 5], group=1, pads=[2, 3, 2, 2, 3, 2], strides=[3, 4, 5], dilations=[2, 2, 2]),
dict(weights_shape=[1, 3, 3, 4, 5], group=1, pads=[3, 6, 8, 3, 6, 8], strides=[3, 4, 5], dilations=[3, 4, 5]),
dict(weights_shape=[3, 1, 3, 3, 3], group=3, pads=[1, 1, 1, 1, 1, 1], strides=[1, 1, 1], dilations=[1, 1, 1]),
dict(weights_shape=[3, 1, 3, 3, 3], group=3, pads=[2, 2, 2, 2, 2, 2], strides=[1, 1, 1], dilations=[2, 2, 2]),
dict(weights_shape=[3, 1, 3, 3, 3], group=3, pads=[3, 4, 5, 3, 4, 5], strides=[1, 1, 1], dilations=[3, 4, 5]),
dict(weights_shape=[3, 1, 3, 3, 3], group=3, pads=[1, 1, 1, 1, 1, 1], strides=[2, 2, 2], dilations=[1, 1, 1]),
dict(weights_shape=[3, 1, 3, 3, 3], group=3, pads=[2, 2, 2, 2, 2, 2], strides=[2, 2, 2], dilations=[2, 2, 2]),
dict(weights_shape=[3, 1, 3, 3, 3], group=3, pads=[3, 4, 5, 3, 4, 5], strides=[2, 2, 2], dilations=[3, 4, 5]),
dict(weights_shape=[3, 1, 3, 3, 3], group=3, pads=[1, 1, 0, 1, 1, 0], strides=[3, 4, 5], dilations=[1, 1, 1]),
dict(weights_shape=[3, 1, 3, 3, 3], group=3, pads=[2, 2, 0, 2, 2, 0], strides=[3, 4, 5], dilations=[2, 2, 2]),
dict(weights_shape=[3, 1, 3, 3, 3], group=3, pads=[3, 4, 3, 3, 4, 3], strides=[3, 4, 5], dilations=[3, 4, 5]),
dict(weights_shape=[3, 1, 5, 4, 3], group=3, pads=[2, 1, 1, 2, 2, 1], strides=[1, 1, 1], dilations=[1, 1, 1]),
dict(weights_shape=[3, 1, 5, 4, 3], group=3, pads=[4, 3, 2, 4, 3, 2], strides=[1, 1, 1], dilations=[2, 2, 2]),
dict(weights_shape=[3, 1, 5, 4, 3], group=3, pads=[6, 6, 5, 6, 6, 5], strides=[1, 1, 1], dilations=[3, 4, 5]),
dict(weights_shape=[3, 1, 5, 4, 3], group=3, pads=[2, 1, 1, 2, 2, 1], strides=[2, 2, 2], dilations=[1, 1, 1]),
dict(weights_shape=[3, 1, 5, 4, 3], group=3, pads=[4, 3, 2, 4, 3, 2], strides=[2, 2, 2], dilations=[2, 2, 2]),
dict(weights_shape=[3, 1, 5, 4, 3], group=3, pads=[6, 6, 5, 6, 6, 5], strides=[2, 2, 2], dilations=[3, 4, 5]),
dict(weights_shape=[3, 1, 5, 4, 3], group=3, pads=[2, 1, 0, 2, 2, 0], strides=[3, 4, 5], dilations=[1, 1, 1]),
dict(weights_shape=[3, 1, 5, 4, 3], group=3, pads=[4, 3, 0, 4, 3, 0], strides=[3, 4, 5], dilations=[2, 2, 2]),
dict(weights_shape=[3, 1, 5, 4, 3], group=3, pads=[6, 6, 3, 6, 6, 3], strides=[3, 4, 5], dilations=[3, 4, 5])]
@pytest.mark.parametrize("params", test_data_3D)
@pytest.mark.parametrize("dilations", [[1], [2]])
@pytest.mark.parametrize("pads", [[0, 0], [1, 1], [1, 2]])
@pytest.mark.parametrize("strides", [[1], [2]])
@pytest.mark.parametrize("bias", [False, True])
@pytest.mark.nightly
def test_conv_3D(self, params, dilations, pads, strides, bias, ie_device, precision, ir_version, temp_dir):
self._test(*self.create_net(**params, shape=[2, 3, 25], dilations=dilations, pads=pads, strides=strides,
bias=bias, ir_version=ir_version),
ie_device, precision, ir_version, temp_dir=temp_dir)
@pytest.mark.parametrize("params", test_data_3D_autopad[:-1])
@pytest.mark.parametrize("auto_pad", ['SAME_UPPER', 'SAME_LOWER'])
@pytest.mark.parametrize("bias", [False, True])
@pytest.mark.nightly
@pytest.mark.xfail(reason='autopad dimensions do not agree with framework')
def test_conv_3D_autopad(self, params, auto_pad, bias, ie_device, precision, ir_version, temp_dir):
self._test(*self.create_net(**params, shape=[2, 3, 25], bias=bias, auto_pad=auto_pad, ir_version=ir_version),
ie_device, precision, ir_version, temp_dir=temp_dir)
@pytest.mark.parametrize("params", test_data_4D_precommit)
@pytest.mark.parametrize("dilations", [[3, 5]])
@pytest.mark.parametrize("pads", [[1, 2, 3, 4]])
@pytest.mark.parametrize("strides", [[3, 5]])
@pytest.mark.parametrize("bias", [False, True])
@pytest.mark.precommit
def test_conv_4D_precommit(self, params, dilations, pads, strides, bias, ie_device, precision,
ir_version, temp_dir):
self._test(*self.create_net(**params, shape=[2, 3, 25, 25], dilations=dilations, pads=pads, strides=strides,
bias=bias, ir_version=ir_version),
ie_device, precision, ir_version, temp_dir=temp_dir)
@pytest.mark.parametrize("params", test_data_4D)
@pytest.mark.parametrize("dilations", [[1, 1], [2, 2], [3, 5]])
@pytest.mark.parametrize("pads", [[0, 0, 0, 0], [1, 1, 1, 1], [1, 2, 3, 4]])
@pytest.mark.parametrize("strides", [[1, 1], [2, 2], [3, 5]])
@pytest.mark.parametrize("bias", [False, True])
@pytest.mark.nightly
def test_conv_4D(self, params, dilations, pads, strides, bias, ie_device, precision, ir_version, temp_dir):
self._test(
*self.create_net(**params, shape=[2, 3, 25, 25], dilations=dilations, pads=pads, strides=strides, bias=bias,
ir_version=ir_version), ie_device, precision, ir_version, temp_dir=temp_dir)
@pytest.mark.parametrize("params", test_data_4D_autopad[:-1])
@pytest.mark.parametrize("auto_pad", ['SAME_UPPER', 'SAME_LOWER'])
@pytest.mark.parametrize("bias", [False, True])
@pytest.mark.nightly
@pytest.mark.xfail(reason='autopad dimensions do not agree with framework')
def test_conv_4D_autopad(self, params, auto_pad, bias, ie_device, precision, ir_version, temp_dir):
self._test(*self.create_net(**params, shape=[2, 3, 25, 25], bias=bias, auto_pad=auto_pad,
ir_version=ir_version), ie_device, precision, ir_version, temp_dir=temp_dir)
@pytest.mark.parametrize("params", test_data_5D_precommit)
@pytest.mark.parametrize("dilations", [[3, 4, 5]])
@pytest.mark.parametrize("pads", [[1, 2, 3, 4, 5, 6]])
@pytest.mark.parametrize("strides", [[3, 4, 5]])
@pytest.mark.parametrize("bias", [False, True])
@pytest.mark.precommit
def test_conv_5D_precommit(self, params, dilations, pads, strides, bias, ie_device, precision,
ir_version, temp_dir):
self._test(*self.create_net(**params, shape=[2, 3, 25, 25, 25], dilations=dilations, pads=pads, strides=strides,
bias=bias, ir_version=ir_version),
ie_device, precision, ir_version, temp_dir=temp_dir)
@pytest.mark.parametrize("params", test_data_5D)
@pytest.mark.parametrize("dilations", [[1, 1, 1], [2, 2, 2], [3, 4, 5]])
@pytest.mark.parametrize("pads", [[0, 0, 0, 0, 0, 0], [1, 1, 1, 1, 1, 1], [1, 2, 3, 4, 5, 6]])
@pytest.mark.parametrize("strides", [[1, 1, 1], [2, 2, 2], [3, 4, 5]])
@pytest.mark.parametrize("bias", [False, True])
@pytest.mark.nightly
def test_conv_5D(self, params, dilations, pads, strides, bias, ie_device, precision, ir_version, temp_dir):
self._test(*self.create_net(**params, shape=[2, 3, 25, 25, 25], dilations=dilations, pads=pads, strides=strides,
bias=bias, ir_version=ir_version),
ie_device, precision, ir_version, temp_dir=temp_dir)
@pytest.mark.parametrize("params", test_data_5D_autopad[:-1])
@pytest.mark.parametrize("auto_pad", ['SAME_UPPER', 'SAME_LOWER'])
@pytest.mark.parametrize("bias", [False, True])
@pytest.mark.nightly
@pytest.mark.xfail(reason='autopad dimensions do not agree with framework')
def test_conv_5D_autopad(self, params, auto_pad, bias, ie_device, precision, ir_version, temp_dir):
self._test(*self.create_net(**params, shape=[2, 3, 25, 25, 25], bias=bias, auto_pad=auto_pad,
ir_version=ir_version), ie_device, precision, ir_version, temp_dir=temp_dir)
| 62.18599 | 120 | 0.526238 | 3,681 | 25,745 | 3.537897 | 0.04238 | 0.02104 | 0.127774 | 0.06788 | 0.824772 | 0.803655 | 0.770713 | 0.761269 | 0.745758 | 0.715273 | 0 | 0.08375 | 0.277413 | 25,745 | 413 | 121 | 62.336562 | 0.616298 | 0.010371 | 0 | 0.30663 | 0 | 0 | 0.092139 | 0.009163 | 0 | 0 | 0 | 0 | 0 | 1 | 0.027624 | false | 0 | 0.024862 | 0 | 0.082873 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
16ec46c28813714e005ff0fc83a793ec8ff13892 | 53 | py | Python | DocxTemplateManager/__init__.py | TheRobotCarlson/DocxTemplateManager | 475ec12953ab7f8fe868bc6208dcf730ec6b3658 | [
"MIT"
] | null | null | null | DocxTemplateManager/__init__.py | TheRobotCarlson/DocxTemplateManager | 475ec12953ab7f8fe868bc6208dcf730ec6b3658 | [
"MIT"
] | null | null | null | DocxTemplateManager/__init__.py | TheRobotCarlson/DocxTemplateManager | 475ec12953ab7f8fe868bc6208dcf730ec6b3658 | [
"MIT"
] | null | null | null | from .DocxTemplateManager import DocxTemplateManager
| 26.5 | 52 | 0.90566 | 4 | 53 | 12 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.075472 | 53 | 1 | 53 | 53 | 0.979592 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
bc9bb7f496ac706a8e378543234ccdaad9e157d3 | 116 | py | Python | src/sage/graphs/base/all.py | bopopescu/sage-5 | 9d85b34956ca2edd55af307f99c5d3859acd30bf | [
"BSL-1.0"
] | 2 | 2021-08-20T00:30:35.000Z | 2021-11-17T10:54:00.000Z | src/sage/graphs/base/all.py | bopopescu/sage-5 | 9d85b34956ca2edd55af307f99c5d3859acd30bf | [
"BSL-1.0"
] | null | null | null | src/sage/graphs/base/all.py | bopopescu/sage-5 | 9d85b34956ca2edd55af307f99c5d3859acd30bf | [
"BSL-1.0"
] | null | null | null | from sparse_graph import SparseGraph
from dense_graph import DenseGraph
import sage.graphs.base.static_sparse_graph
| 29 | 43 | 0.887931 | 17 | 116 | 5.823529 | 0.647059 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086207 | 116 | 3 | 44 | 38.666667 | 0.933962 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
bcaf84c3f5b588e7269cdfcbe5fedaa3a61caa3c | 3,967 | py | Python | robot-server/tests/errors/test_error_responses.py | knownmed/opentrons | d02eb3c6cbf9f1c8c05c5e9e1dac30a92a8c5e6c | [
"Apache-2.0"
] | 235 | 2017-10-27T20:37:27.000Z | 2022-03-30T14:09:49.000Z | robot-server/tests/errors/test_error_responses.py | knownmed/opentrons | d02eb3c6cbf9f1c8c05c5e9e1dac30a92a8c5e6c | [
"Apache-2.0"
] | 8,425 | 2017-10-26T15:25:43.000Z | 2022-03-31T23:54:26.000Z | robot-server/tests/errors/test_error_responses.py | knownmed/opentrons | d02eb3c6cbf9f1c8c05c5e9e1dac30a92a8c5e6c | [
"Apache-2.0"
] | 130 | 2017-11-09T21:02:37.000Z | 2022-03-15T18:01:24.000Z | """Tests for API error exceptions and response model serialization."""
from robot_server.errors.error_responses import (
ApiError,
ErrorSource,
ErrorDetails,
ErrorResponse,
LegacyErrorResponse,
MultiErrorResponse,
)
def test_error_details() -> None:
"""It should serialize an error response from an ErrorDetails."""
result = ErrorDetails(
id="SomeErrorId",
title="Some Error Title",
detail="Some error detail",
).as_error(status_code=400)
assert isinstance(result, ApiError)
assert result.status_code == 400
assert result.content == {
"errors": (
{
"id": "SomeErrorId",
"title": "Some Error Title",
"detail": "Some error detail",
},
)
}
def test_error_details_with_meta() -> None:
"""It should serialize an error with meta and source from ErrorDetails."""
result = ErrorDetails(
id="SomeErrorId",
title="Some Error Title",
detail="Some error detail",
source=ErrorSource(pointer="/foo/bar/baz"),
meta={"some": "meta information"},
).as_error(status_code=400)
assert isinstance(result, ApiError)
assert result.status_code == 400
assert result.content == {
"errors": (
{
"id": "SomeErrorId",
"title": "Some Error Title",
"detail": "Some error detail",
"source": {"pointer": "/foo/bar/baz"},
"meta": {"some": "meta information"},
},
)
}
def test_legacy_error_response() -> None:
"""It should serialize an error response from a LegacyErrorResponse."""
result = LegacyErrorResponse(
message="Some error detail",
).as_error(status_code=400)
assert isinstance(result, ApiError)
assert result.status_code == 400
assert result.content == {"message": "Some error detail"}
def test_error_response() -> None:
"""It should serialize an error response from an ErrorResponse."""
result = ErrorResponse(
errors=(
ErrorDetails(
id="SomeErrorId",
title="Some Error Title",
detail="Some error detail",
meta={"some": "meta information"},
),
)
).as_error(status_code=400)
assert isinstance(result, ApiError)
assert result.status_code == 400
assert result.content == {
"errors": (
{
"id": "SomeErrorId",
"title": "Some Error Title",
"detail": "Some error detail",
"meta": {"some": "meta information"},
},
)
}
def test_multi_error_response() -> None:
"""It should serialize an error response from a MultiErrorResponse."""
result = MultiErrorResponse(
errors=[
ErrorDetails(
id="SomeErrorId",
title="Some Error Title",
detail="Some error detail",
meta={"some": "meta information"},
),
ErrorDetails(
id="SomeOtherErrorId",
title="Some Other Error Title",
detail="Some other error detail",
meta={"some": "other meta information"},
),
]
).as_error(status_code=400)
assert isinstance(result, ApiError)
assert result.status_code == 400
assert result.content == {
"errors": [
{
"id": "SomeErrorId",
"title": "Some Error Title",
"detail": "Some error detail",
"meta": {"some": "meta information"},
},
{
"id": "SomeOtherErrorId",
"title": "Some Other Error Title",
"detail": "Some other error detail",
"meta": {"some": "other meta information"},
},
]
}
| 29.827068 | 78 | 0.532644 | 359 | 3,967 | 5.799443 | 0.155989 | 0.07781 | 0.076849 | 0.096061 | 0.808838 | 0.808838 | 0.772334 | 0.772334 | 0.745917 | 0.724784 | 0 | 0.01157 | 0.346357 | 3,967 | 132 | 79 | 30.05303 | 0.791361 | 0.097051 | 0 | 0.540541 | 0 | 0 | 0.233455 | 0 | 0 | 0 | 0 | 0 | 0.135135 | 1 | 0.045045 | false | 0 | 0.009009 | 0 | 0.054054 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
bccbf44e9739facb6c12d6ac9966b0877989ec95 | 103 | py | Python | src/qutip_tensornetwork/__init__.py | AGaliciaMartinez/qutip-tensornetwork | e8223e6f5fc59fb67e7d1438e7ad94d271ee9d64 | [
"BSD-3-Clause"
] | 3 | 2021-11-23T09:51:58.000Z | 2022-02-04T17:28:55.000Z | src/qutip_tensornetwork/__init__.py | AGaliciaMartinez/qutip-tensornetwork | e8223e6f5fc59fb67e7d1438e7ad94d271ee9d64 | [
"BSD-3-Clause"
] | 9 | 2021-11-05T10:59:15.000Z | 2022-02-03T17:04:52.000Z | src/qutip_tensornetwork/__init__.py | AGaliciaMartinez/qutip-tensornetwork | e8223e6f5fc59fb67e7d1438e7ad94d271ee9d64 | [
"BSD-3-Clause"
] | 2 | 2021-11-18T20:57:47.000Z | 2022-02-26T08:27:02.000Z | from .version import version as __version__
from .core import *
from .core.data.network import Network
| 25.75 | 43 | 0.805825 | 15 | 103 | 5.266667 | 0.466667 | 0.202532 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.135922 | 103 | 3 | 44 | 34.333333 | 0.88764 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
bce1e854fd78e8244fb9203244ddb3211e2e14c4 | 37 | py | Python | crank/feature/__init__.py | abeersaqib/crank | 0241ef46a618e24212e4a73b399e2293b4ffca08 | [
"MIT"
] | 162 | 2020-05-28T05:24:50.000Z | 2022-03-26T00:12:40.000Z | crank/feature/__init__.py | abeersaqib/crank | 0241ef46a618e24212e4a73b399e2293b4ffca08 | [
"MIT"
] | 39 | 2020-05-29T08:18:03.000Z | 2022-01-08T13:32:47.000Z | crank/feature/__init__.py | abeersaqib/crank | 0241ef46a618e24212e4a73b399e2293b4ffca08 | [
"MIT"
] | 31 | 2020-05-28T12:31:08.000Z | 2022-02-19T14:58:35.000Z | from .feature import Feature # noqa
| 18.5 | 36 | 0.756757 | 5 | 37 | 5.6 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.189189 | 37 | 1 | 37 | 37 | 0.933333 | 0.108108 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
4c0c1012abbfa93c9bc487a21cedbd80aee7e797 | 24 | py | Python | prometheus_toolbox/metrics/__init__.py | vbilyi/prometheus_toolbox | 6b21fa39148cf685fc16117716b0374bf9962f44 | [
"MIT"
] | null | null | null | prometheus_toolbox/metrics/__init__.py | vbilyi/prometheus_toolbox | 6b21fa39148cf685fc16117716b0374bf9962f44 | [
"MIT"
] | null | null | null | prometheus_toolbox/metrics/__init__.py | vbilyi/prometheus_toolbox | 6b21fa39148cf685fc16117716b0374bf9962f44 | [
"MIT"
] | null | null | null | from .measures import *
| 12 | 23 | 0.75 | 3 | 24 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 24 | 1 | 24 | 24 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
4c42cd36d3acbe716ee0aaf0330a55c9b308fd1e | 29,540 | py | Python | Peach/Publishers/file.py | aleasims/Peach | bb56841e943d719d5101fee0a503ed34308eda04 | [
"MIT"
] | null | null | null | Peach/Publishers/file.py | aleasims/Peach | bb56841e943d719d5101fee0a503ed34308eda04 | [
"MIT"
] | null | null | null | Peach/Publishers/file.py | aleasims/Peach | bb56841e943d719d5101fee0a503ed34308eda04 | [
"MIT"
] | 1 | 2020-07-26T03:57:45.000Z | 2020-07-26T03:57:45.000Z |
'''
Some default file publishers. Output generated data to a file, etc.
@author: Michael Eddington
@version: $Id: file.py 2280 2011-02-17 05:54:04Z meddingt $
'''
#
# Copyright (c) Michael Eddington
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.
#
# Authors:
# Michael Eddington (mike@phed.org)
# $Id: file.py 2280 2011-02-17 05:54:04Z meddingt $
import os, sys, time
from Peach.Engine.engine import Engine
from Peach.Engine.dom import Data, State, Action
from Peach.publisher import Publisher
import base64
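# Optional Windows-only modules (performance counters etc.); import failures
# are deliberately swallowed so the publishers still work on other platforms.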
try:
import win32pdh
import win32pdhutil
import win32pdhquery
import ctypes
import win32api
except:
pass
class FileWriter(Publisher):
'''
Publishes generated data to a file. No concept of receiving data
yet.
'''
def __init__(self, filename):
'''
@type filename: string
@param filename: Filename to write to
'''
Publisher.__init__(self)
self._filename = None
self._fd = None
self._state = 0 # 0 - stopped; 1 - started
self.setFilename(filename)
def getFilename(self):
'''
Get current filename.
@rtype: string
@return: current filename
'''
return self._filename
def setFilename(self, filename):
'''
Set new filename.
@type filename: string
@param filename: Filename to set
'''
self._filename = filename
def start(self):
pass
def connect(self):
if self._state == 1:
raise Exception('FileWriter::connect(): Already started!')
if self._fd != None:
self._fd.close()
self.mkdir()
self._fd = open(self._filename, "w+b")
self._state = 1
def stop(self):
self.close()
def mkdir(self):
# lets try and create the folder this file lives in
delim = ""
if self._filename.find("\\") != -1:
delim = '\\'
elif self._filename.find("/") != -1:
delim = '/'
else:
return
# strip filename
try:
path = self._filename[: self._filename.rfind(delim) ]
os.mkdir(path)
except:
pass
def close(self):
if self._state == 0:
return
self._fd.close()
self._fd = None
self._state = 0
def send(self, data):
if type(data) != str:
data = data.encode('iso-8859-1')
self._fd.write(data)
def receive(self, size = None):
if size != None:
return self._fd.read(size)
return self._fd.read()
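# A hypothetical, minimal driver sketch (normally the Peach Engine drives the
# Publisher lifecycle; the call order below is inferred from the methods above):
#
#   pub = FileWriter('fuzzed.bin')   # 'fuzzed.bin' is an illustrative name
#   pub.start()
#   pub.connect()                    # opens the file, creating its folder first
#   pub.send('some generated data')
#   pub.close()
#   pub.stop()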
class FileWriterAS3StringRecorder(Publisher):
'''
Records every test case as a base64-encoded <string> element,
one per line.
'''
def __init__(self, filename):
'''
@type filename: string
@param filename: Filename to write to
'''
Publisher.__init__(self)
self._filename = None
self._fd = None
self._state = 0 # 0 - stopped; 1 - started
self.setFilename(filename)
def getFilename(self):
'''
Get current filename.
@rtype: string
@return: current filename
'''
return self._filename
def setFilename(self, filename):
'''
Set new filename.
@type filename: string
@param filename: Filename to set
'''
self._filename = filename
def start(self):
pass
def connect(self):
if self._fd != None:
return
self.mkdir()
self._fd = open(self._filename, "w+b")
self._state = 1
def stop(self):
#if self._state == 0:
# return
#
#self._fd.close()
#self._fd = None
#self._state = 0
pass
def mkdir(self):
# lets try and create the folder this file lives in
delim = ""
if self._filename.find("\\") != -1:
delim = '\\'
elif self._filename.find("/") != -1:
delim = '/'
else:
return
# strip filename
try:
path = self._filename[: self._filename.rfind(delim) ]
os.mkdir(path)
except:
pass
def close(self):
pass
def send(self, data):
self._fd.write(" <string>" + base64.b64encode(data) + "</string>\n")
def receive(self, size = None):
if size != None:
return self._fd.read(size)
return self._fd.read()
class FileWriterAS3NumberRecorder(Publisher):
'''
Records every test case as a <number> element, one per
line.
'''
def __init__(self, filename):
'''
@type filename: string
@param filename: Filename to write to
'''
Publisher.__init__(self)
self._filename = None
self._fd = None
self._state = 0 # 0 - stopped; 1 - started
self.setFilename(filename)
def getFilename(self):
'''
Get current filename.
@rtype: string
@return: current filename
'''
return self._filename
def setFilename(self, filename):
'''
Set new filename.
@type filename: string
@param filename: Filename to set
'''
self._filename = filename
def start(self):
pass
def connect(self):
if self._fd != None:
return
self.mkdir()
self._fd = open(self._filename, "w+b")
self._state = 1
def stop(self):
pass
def mkdir(self):
# lets try and create the folder this file lives in
delim = ""
if self._filename.find("\\") != -1:
delim = '\\'
elif self._filename.find("/") != -1:
delim = '/'
else:
return
# strip filename
try:
path = self._filename[: self._filename.rfind(delim) ]
os.mkdir(path)
except:
pass
def close(self):
pass
def send(self, data):
buff = " <number>" + data + "</number>\n";
self._fd.write(buff)
def receive(self, size = None):
if size != None:
return self._fd.read(size)
return self._fd.read()
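# Illustrative sketch of the recorder output (sample values assumed). Each
# send() appends one element, so the string recorder emits lines such as
#
#    <string>QUFBQQ==</string>     (base64 of "AAAA")
#
# while the number recorder emits lines such as
#
#    <number>1234</number>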
class FileReader(Publisher):
'''
Reads data from a file. The file is opened read/write, so both
send() and receive() operate on the same handle.
'''
def __init__(self, filename):
'''
@type filename: string
@param filename: Filename to read from
'''
Publisher.__init__(self)
self._filename = None
self._fd = None
self._state = 0 # 0 - stopped; 1 - started
self.setFilename(filename)
def getFilename(self):
'''
Get current filename.
@rtype: string
@return: current filename
'''
return self._filename
def setFilename(self, filename):
'''
Set new filename.
@type filename: string
@param filename: Filename to set
'''
self._filename = filename
def start(self):
pass
def connect(self):
if self._state == 1:
return
if self._fd != None:
self._fd.close()
self._fd = open(self._filename, "r+b")
self._state = 1
def stop(self):
self.close()
def close(self):
try:
if self._state == 0:
return
self._fd.close()
self._fd = None
self._state = 0
except:
pass
def send(self, data):
self._fd.write(data)
def receive(self, size = None):
if size != None:
return self._fd.read(size)
return self._fd.read()
class FilePerIteration(FileWriter):
'''
This publisher differs from FileWriter in that each round
will generate a new filename. Very handy for generating
bogus content (media files, etc).
'''
def __init__(self, filename):
'''
@type filename: string
@param filename: Filename to write to; should contain a %d
placeholder for the round number :)
'''
FileWriter.__init__(self, filename)
self._roundCount = 0
self._origFilename = filename
self.setFilename(filename % self._roundCount)
self._closed = True
self.data = None
self.dataLookedFor = False
def _getStateByName(self, stateMachine, stateName):
'''
Locate a State object by name in the StateMachine.
'''
for child in stateMachine:
if child.elementType == 'state' and child.name == stateName:
return child
return None
def _getDataWithFileName(self):
'''
Will search state model for a <Data> and get the
filename from it.
'''
stateMachine = self.parent.stateMachine
for state in stateMachine:
if isinstance(state, State):
for action in state:
if isinstance(action, Action):
if action.data != None and action.data.fileName != None:
return action.data
return None
def connect(self):
if self.data == None and self.dataLookedFor == False:
self.data = self._getDataWithFileName()
self.dataLookedFor = True
if self.data != None:
fileBase = self.data.fileName
if fileBase.find('\\') != -1:
fileBase = fileBase.split('\\')[-1]
if fileBase.find('/') != -1:
fileBase = fileBase.split('/')[-1]
fileBase = fileBase.split('.')[0]
self.setFilename( (self._origFilename % self._roundCount).replace("##FILEBASE##", fileBase) )
else:
self.setFilename(self._origFilename % self._roundCount)
FileWriter.connect(self)
self._closed = False
def stop(self):
self.close()
def close(self):
FileWriter.close(self)
if not self._closed:
self._roundCount += 1
if self.data != None:
fileBase = self.data.fileName
if fileBase.find('\\') != -1:
fileBase = fileBase.split('\\')[-1]
if fileBase.find('/') != -1:
fileBase = fileBase.split('/')[-1]
fileBase = fileBase.split('.')[0]
self.setFilename( (self._origFilename % self._roundCount).replace("##FILEBASE##", fileBase) )
else:
self.setFilename(self._origFilename % self._roundCount)
self._closed = True
def send(self, data):
FileWriter.send(self, data)
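# Illustrative naming sketch (the template below is hypothetical). With a
# template of "out/fuzz-%d.bin" the publisher writes out/fuzz-0.bin on
# round 0, out/fuzz-1.bin on round 1, and so on. If the state model holds
# a <Data> element with a fileName, any "##FILEBASE##" token in the
# template is replaced by that file's base name (path and extension
# stripped).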
class FileWriterLauncher(Publisher):
'''
Writes a file to disk and then launches a program.
To use, first use this publisher like the FileWriter
stream publisher. Close, then call a program (or two).
'''
def __init__(self, filename, debugger = "False", waitTime = 3):
'''
@type filename: string
@param filename: Filename to write to
@type waitTime: integer
@param waitTime: Time in seconds to wait before killing process
'''
Publisher.__init__(self)
self._filename = None
self._fd = None
self._state = 0 # 0 - stopped; 1 - started
self.setFilename(filename)
self.waitTime = float(waitTime)
self.debugger = False
if debugger.lower() == "true":
self.debugger = True
def getFilename(self):
'''
Get current filename.
@rtype: string
@return: current filename
'''
return self._filename
def setFilename(self, filename):
'''
Set new filename.
@type filename: string
@param filename: Filename to set
'''
self._filename = filename
def start(self):
pass
def connect(self):
if self._state == 1:
raise Exception('FileWriterLauncher::connect(): Already started!')
if self._fd != None:
self._fd.close()
self.mkdir()
self._fd = open(self._filename, "w+b")
self._state = 1
def stop(self):
self.close()
def mkdir(self):
# Let's try to create the folder this file lives in,
# building the path up one component at a time.
if os.path.sep not in self._filename:
return
paths = os.path.dirname(self._filename).split(os.path.sep)
curpath = ""
for p in paths:
if len(curpath) == 0:
curpath = p
else:
curpath = os.path.join(curpath, p)
try:
os.mkdir(curpath)
except:
pass
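# Note: the loop above could also be collapsed into a single stdlib call;
# shown only as a sketch, not a change to the publisher's behavior:
#
#   try:
#       os.makedirs(os.path.dirname(self._filename))
#   except OSError:
#       pass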
def close(self):
if self._state == 0:
return
self._fd.close()
self._fd = None
self._state = 0
def send(self, data):
self._fd.write(data)
def receive(self, size = None):
if size != None:
return self._fd.read(size)
return self._fd.read()
def FindChildrenOf(self, parentid):
'''
Locate the PIDs of all child processes of parentid using the
Windows performance counter ("pdh") API.
'''
childPids = []
objectName = "Process"
items, instances = win32pdh.EnumObjectItems(None, None, objectName, win32pdh.PERF_DETAIL_WIZARD)
instance_dict = {}
for instance in instances:
if instance in instance_dict:
instance_dict[instance] = instance_dict[instance] + 1
else:
instance_dict[instance] = 0
for instance, max_instances in instance_dict.items():
for inum in xrange(max_instances+1):
hq = win32pdh.OpenQuery()
try:
hcs = []
path = win32pdh.MakeCounterPath((None, objectName, instance, None, inum, "ID Process"))
hcs.append(win32pdh.AddCounter(hq, path))
path = win32pdh.MakeCounterPath((None, objectName, instance, None, inum, "Creating Process ID"))
hcs.append(win32pdh.AddCounter(hq, path))
try:
# If the process goes away unexpectedly this call will fail
win32pdh.CollectQueryData(hq)
counterType, pid = win32pdh.GetFormattedCounterValue(hcs[0], win32pdh.PDH_FMT_LONG)
counterType, ppid = win32pdh.GetFormattedCounterValue(hcs[1], win32pdh.PDH_FMT_LONG)
if int(ppid) == parentid:
childPids.append(int(pid))
except:
pass
finally:
win32pdh.CloseQuery(hq)
return childPids
def call(self, method, args):
# windows or unix?
if sys.platform == 'win32':
return self.callWindows(method, args)
return self.callUnix(method,args)
def callUnix(self, method, args):
'''
Launch program to consume file
@type method: string
@param method: Command to execute
@type args: array of objects
@param args: Arguments to pass
'''
## Make sure we close the file first :)
self.close()
## Figure out how we are calling the program
if self.debugger:
# Launch via agent
Engine.context.agent.OnPublisherCall(method)
methodRunning = method + "_isrunning"
for i in range(long(self.waitTime/0.25)):
ret = Engine.context.agent.OnPublisherCall(methodRunning)
if ret == False:
# Process exited already
break
time.sleep(0.25)
else:
# Launch via spawn
#realArgs = [os.path.basename(method)]
realArgs = [method]
for a in args:
realArgs.append(a)
pid = os.spawnv(os.P_NOWAIT, method, realArgs)
for i in range(0, long(self.waitTime/0.15)):
(pid1, ret) = os.waitpid(pid, os.WNOHANG)
if not (pid1 == 0 and ret == 0):
break
time.sleep(0.15)
try:
import signal
os.kill(pid, signal.SIGTERM)
time.sleep(0.25)
(pid1, ret) = os.waitpid(pid, os.WNOHANG)
if not (pid1 == 0 and ret == 0):
return
os.kill(pid, signal.SIGKILL)
except:
print sys.exc_info()
def callWindows(self, method, args):
'''
Launch program to consume file
@type method: string
@param method: Command to execute
@type args: array of objects
@param args: Arguments to pass
'''
## Make sure we close the file first :)
self.close()
## Figure out how we are calling the program
if self.debugger:
# Launch via agent
Engine.context.agent.OnPublisherCall(method)
methodRunning = method + "_isrunning"
for i in range(long(self.waitTime/0.25)):
ret = Engine.context.agent.OnPublisherCall(methodRunning)
if ret == False:
# Process exited already
break
time.sleep(0.25)
else:
# Launch via spawn
realArgs = ["cmd.exe", "/c", method]
for a in args:
realArgs.append(a)
phandle = os.spawnv(os.P_NOWAIT, os.path.join( os.getenv('SystemRoot'), 'system32','cmd.exe'), realArgs)
# Give it some time before we KILL!
for i in range(long(self.waitTime/0.25)):
if win32process.GetExitCodeProcess(phandle) != win32con.STILL_ACTIVE:
# Process exited already
break
time.sleep(0.25)
try:
pid = ctypes.windll.kernel32.GetProcessId( ctypes.c_ulong(phandle) )
if pid > 0:
for cid in self.FindChildrenOf(pid):
chandle = win32api.OpenProcess(1, 0, cid)
win32process.TerminateProcess(chandle, 0)
try:
win32api.CloseHandle(chandle)
except:
pass
win32process.TerminateProcess(phandle, 0)
try:
win32api.CloseHandle(phandle)
except:
pass
except:
pass
try:
import win32gui, win32con, win32process, win32event, win32api
import sys, time, os, signal, subprocess, ctypes
TH32CS_SNAPPROCESS = 0x00000002
class PROCESSENTRY32(ctypes.Structure):
_fields_ = [("dwSize", ctypes.c_ulong),
("cntUsage", ctypes.c_ulong),
("th32ProcessID", ctypes.c_ulong),
("th32DefaultHeapID", ctypes.c_ulong),
("th32ModuleID", ctypes.c_ulong),
("cntThreads", ctypes.c_ulong),
("th32ParentProcessID", ctypes.c_ulong),
("pcPriClassBase", ctypes.c_ulong),
("dwFlags", ctypes.c_ulong),
("szExeFile", ctypes.c_char * 260)]
class FileWriterLauncherGui(Publisher):
'''
Writes a file to disk and then launches a program. After
some defined amount of time we will try to close the GUI
application by sending WM_CLOSE, then kill it.
To use, first use this publisher like the FileWriter
stream publisher. Close, then call a program (or two).
'''
def __init__(self, filename, windowname, debugger = "false", waitTime = 3):
'''
@type filename: string
@param filename: Filename to write to
@type windowname: string
@param windowname: Partial window name to locate and kill
'''
Publisher.__init__(self)
self._filename = None
self._fd = None
self._state = 0 # 0 - stopped; 1 - started
self.setFilename(filename)
self._windowName = windowname
self.waitTime = float(waitTime)
self.debugger = False
self.count = 0
self._fd_sequencial = None
if debugger.lower() == "true":
self.debugger = True
if sys.platform != 'win32':
raise PeachException("Error, publisher FileWriterLauncherGui not supported on non-Windows platforms.")
def getFilename(self):
'''
Get current filename.
@rtype: string
@return: current filename
'''
return self._filename
def setFilename(self, filename):
'''
Set new filename.
@type filename: string
@param filename: Filename to set
'''
self._filename = filename
def start(self):
pass
def connect(self):
if self._state == 1:
raise Exception('FileWriterLauncherGui::connect(): Already started!')
if self._fd != None:
self._fd.close()
self.mkdir()
# First let's remove the old file if there is one
try:
os.unlink(self._filename)
except:
pass
# If we can't open the file it might
# still be open. Let's retry a few times.
for i in range(10):
try:
self._fd = open(self._filename, "w+b")
break
except:
try:
os.unlink(self._filename)
except:
pass
if i == 9:
raise
time.sleep(1)
self._state = 1
def stop(self):
self.close()
def mkdir(self):
# Let's try to create the folder this file lives in
delim = ""
if self._filename.find("\\") != -1:
delim = '\\'
elif self._filename.find("/") != -1:
delim = '/'
else:
return
# strip filename
try:
path = self._filename[: self._filename.rfind(delim) ]
os.mkdir(path)
except:
pass
def close(self):
if self._state == 0:
return
if self._fd_sequencial != None:
self._fd_sequencial.close()
self.count += 1
self._fd.close()
self._fd = None
self._state = 0
def send(self, data):
self._fd.write(data)
if self._fd_sequencial != None:
self._fd_sequencial.write(data)
def receive(self, size = None):
if size != None:
return self._fd.read(size)
return self._fd.read()
def call(self, method, args):
'''
Launch program to consume file
@type method: string
@param method: Command to execute
@type args: array of objects
@param args: Arguments to pass
'''
proc = None
if self.debugger:
# Launch via agent
Engine.context.agent.OnPublisherCall(method)
methodRunning = method + "_isrunning"
for i in range(long(self.waitTime/0.25)):
ret = Engine.context.agent.OnPublisherCall(methodRunning)
if ret == False:
# Process exited already
break
time.sleep(0.25)
else:
realArgs = [method]
for a in args:
realArgs.append(a)
proc = None
try:
proc = subprocess.Popen(realArgs, shell=True)
except:
print "Error: Exception thrown creating process"
raise
# Give the application waitTime seconds to consume the file
time.sleep(self.waitTime)
self.closeApp(proc, self._windowName)
def enumCallback(hwnd, args):
'''
Will get called by win32gui.EnumWindows, once for each
top level application window.
'''
proc = args[0]
windowName = args[1]
try:
# Get window title
title = win32gui.GetWindowText(hwnd)
# Is this our guy?
if title.find(windowName) == -1:
win32gui.EnumChildWindows(hwnd, FileWriterLauncherGui.enumChildCallback, args)
return
# Send WM_CLOSE message
win32gui.PostMessage(hwnd, win32con.WM_CLOSE, 0, 0)
except:
pass
enumCallback = staticmethod(enumCallback)
def enumChildCallback(hwnd, args):
'''
Will get called by win32gui.EnumChildWindows, once for each
child window of a top-level application window.
'''
proc = args[0]
windowName = args[1]
try:
# Get window title
title = win32gui.GetWindowText(hwnd)
# Is this our guy?
if title.find(windowName) == -1:
return
# Send WM_CLOSE message
win32gui.PostMessage(hwnd, win32con.WM_CLOSE, 0, 0)
except:
pass
#print sys.exc_info()
enumChildCallback = staticmethod(enumChildCallback)
def genChildProcesses(self, proc):
parentPid = proc.pid
for p in self.genProcesses():
if p.th32ParentProcessID == parentPid:
yield p.th32ProcessID
def genProcesses(self):
CreateToolhelp32Snapshot = ctypes.windll.kernel32.CreateToolhelp32Snapshot
Process32First = ctypes.windll.kernel32.Process32First
Process32Next = ctypes.windll.kernel32.Process32Next
CloseHandle = ctypes.windll.kernel32.CloseHandle
hProcessSnap = CreateToolhelp32Snapshot(TH32CS_SNAPPROCESS, 0)
pe32 = PROCESSENTRY32()
pe32.dwSize = ctypes.sizeof(PROCESSENTRY32)
if Process32First(hProcessSnap, ctypes.byref(pe32)) == win32con.FALSE:
print >> sys.stderr, "Failed getting first process."
return
while True:
yield pe32
if Process32Next(hProcessSnap, ctypes.byref(pe32)) == win32con.FALSE:
break
CloseHandle(hProcessSnap)
def closeApp(self, proc, title):
'''
Close Application by window title
'''
try:
win32gui.EnumWindows(FileWriterLauncherGui.enumCallback, [proc, title])
if proc != None and not self.debugger:
win32event.WaitForSingleObject(int(proc._handle), 5*1000)
for pid in self.genChildProcesses(proc):
try:
handle = win32api.OpenProcess(1, False, pid)
win32process.TerminateProcess(handle, -1)
win32api.CloseHandle(handle)
except:
pass
except:
pass
###class FileRegressionGui(Publisher):
### '''
### Writes a file to disk and then launches a program. After
### some defined amount of time we will try to close the GUI
### application by sending WM_CLOSE, then kill it.
###
### To use, first use this publisher like the FileWriter
### stream publisher. Close, then call a program (or two).
### '''
###
### def __init__(self, folder, windowname, debugger = "false", waitTime = 3):
### '''
### @type filename: string
### @param filename: Log folder with PoC files
### @type windowname: string
### @param windowname: Partial window name to locate and kill
### '''
### Publisher.__init__(self)
### self._windowName = windowname
### self.waitTime = float(waitTime)
### self.debugger = False
### if debugger.lower() == "true":
### self.debugger = True
###
### self._files = []
### self._currentFile = 0
###
### ## INSERT CODE TO LOCATE FILES
### ## c:\cygwin\bin\find folder -iname "*.pdf"
### ## put them into self._files
###
### def start(self):
### pass
###
### def connect(self):
### pass
###
### def stop(self):
### pass
###
### def close(self):
### pass
###
### def send(self, data):
### pass
###
### def receive(self, size = None):
### pass
###
### def call(self, method, args):
### '''
### Launch program to consume file
###
### @type method: string
### @param method: Command to execute
### @type args: array of objects
### @param args: Arguments to pass
### '''
###
### if self._currentFile >= len(self._files):
### raise Exception("We are done regressing")
###
### fileName = self._files[self._currentFile]
### self._currentFile += 1
###
### proc = None
### if self.debugger:
### # Launch via agent
###
### ## NOTE: Will need to copy PoC file ontop of
### ## expected file!
###
### Engine.context.agent.OnPublisherCall(method)
###
### else:
### realArgs = [method]
### for a in args:
### if a == "FILENAME":
### realArgs.append(fileName)
### else:
### realArgs.append(a)
###
### proc = None
### try:
### proc = subprocess.Popen(realArgs, shell=True)
###
### except:
### print "Error: Exception thrown creating process"
### raise
###
### # Wait 5 seconds
### time.sleep(self.waitTime)
###
### self.closeApp(proc, self._windowName)
###
### def enumCallback(hwnd, args):
### '''
### Will get called by win32gui.EnumWindows, once for each
### top level application window.
### '''
###
### proc = args[0]
### windowName = args[1]
###
### try:
###
### # Get window title
### title = win32gui.GetWindowText(hwnd)
###
### # Is this our guy?
### if title.find(windowName) == -1:
### win32gui.EnumChildWindows(hwnd, FileWriterLauncherGui.enumChildCallback, args)
### return
###
### # Send WM_CLOSE message
### win32gui.PostMessage(hwnd, win32con.WM_CLOSE, 0, 0)
### win32gui.PostQuitMessage(hwnd)
### except:
### pass
###
### enumCallback = staticmethod(enumCallback)
###
### def enumChildCallback(hwnd, args):
### '''
### Will get called by win32gui.EnumWindows, once for each
### top level application window.
### '''
###
### proc = args[0]
### windowName = args[1]
###
### try:
###
### # Get window title
### title = win32gui.GetWindowText(hwnd)
###
### # Is this our guy?
### if title.find(windowName) == -1:
### return
###
### # Send WM_CLOSE message
### win32gui.PostMessage(hwnd, win32con.WM_CLOSE, 0, 0)
### win32gui.PostQuitMessage(hwnd)
### except:
### pass
###
### enumChildCallback = staticmethod(enumChildCallback)
###
### def genChildProcesses(self, proc):
### parentPid = proc.pid
###
### for p in self.genProcesses():
### if p.th32ParentProcessID == parentPid:
### yield p.th32ProcessID
###
### def genProcesses(self):
###
### CreateToolhelp32Snapshot = ctypes.windll.kernel32.CreateToolhelp32Snapshot
### Process32First = ctypes.windll.kernel32.Process32First
### Process32Next = ctypes.windll.kernel32.Process32Next
### CloseHandle = ctypes.windll.kernel32.CloseHandle
###
### hProcessSnap = CreateToolhelp32Snapshot(TH32CS_SNAPPROCESS, 0)
### pe32 = PROCESSENTRY32()
### pe32.dwSize = ctypes.sizeof(PROCESSENTRY32)
### if Process32First(hProcessSnap, ctypes.byref(pe32)) == win32con.FALSE:
### print >> sys.stderr, "Failed getting first process."
### return
###
### while True:
### yield pe32
### if Process32Next(hProcessSnap, ctypes.byref(pe32)) == win32con.FALSE:
### break
###
### CloseHandle(hProcessSnap)
###
### def closeApp(self, proc, title):
### '''
### Close Application by window title
### '''
###
### try:
### win32gui.EnumWindows(FileWriterLauncherGui.enumCallback, [proc, title])
###
### if proc:
### win32event.WaitForSingleObject(int(proc._handle), 5*1000)
###
### for pid in self.genChildProcesses(proc):
### try:
### handle = win32api.OpenProcess(1, False, pid)
### win32process.TerminateProcess(handle, -1)
### win32api.CloseHandle(handle)
### except:
### pass
### except:
### pass
except:
pass
# end
| 23.745981 | 108 | 0.613406 | 3,448 | 29,540 | 5.175464 | 0.145882 | 0.039003 | 0.009526 | 0.011768 | 0.719305 | 0.70552 | 0.701821 | 0.695041 | 0.669992 | 0.662819 | 0 | 0.021102 | 0.263643 | 29,540 | 1,243 | 109 | 23.765084 | 0.799283 | 0.226676 | 0 | 0.708772 | 1 | 0 | 0.036187 | 0.00123 | 0 | 0 | 0.000586 | 0 | 0 | 0 | null | null | 0.049123 | 0.022807 | null | null | 0.005263 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
4c5542f3eee56259baa9f5bc247daf43ec10d043 | 156 | py | Python | config.py | Aradhya-B/nama | d27a03e2649cae57f66569845d7bd627fe014797 | [
"MIT"
] | null | null | null | config.py | Aradhya-B/nama | d27a03e2649cae57f66569845d7bd627fe014797 | [
"MIT"
] | null | null | null | config.py | Aradhya-B/nama | d27a03e2649cae57f66569845d7bd627fe014797 | [
"MIT"
] | null | null | null | dataDir = '/home/brad/Dropbox/Data/nama'
trainingDir = '/home/brad/Dropbox/Data/nama/trainingData'
modelDir = '/home/brad/Dropbox/Data/nama/trainedModels'
| 31.2 | 57 | 0.769231 | 20 | 156 | 6 | 0.5 | 0.2 | 0.375 | 0.475 | 0.575 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.064103 | 156 | 4 | 58 | 39 | 0.821918 | 0 | 0 | 0 | 0 | 0 | 0.711538 | 0.711538 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
4c6dd30128edfc4ec770fa8892ad72d3dc41893b | 117 | py | Python | sprint/tmp.py | jumphone/sprint | 94a5e5450d73b357497fba11eef818c6cc8792aa | [
"MIT"
] | 44 | 2018-03-09T22:22:50.000Z | 2021-09-15T09:40:54.000Z | sprint/tmp.py | jumphone/sprint | 94a5e5450d73b357497fba11eef818c6cc8792aa | [
"MIT"
] | 30 | 2018-03-19T05:30:05.000Z | 2022-01-21T06:54:45.000Z | sprint/tmp.py | jumphone/sprint | 94a5e5450d73b357497fba11eef818c6cc8792aa | [
"MIT"
] | 13 | 2018-06-30T10:07:02.000Z | 2021-06-10T13:25:43.000Z | from tools_fq import *
from tools_sam import *
from tools_bed import *
from tools_zf import *
from tools_fa import *
| 19.5 | 23 | 0.786325 | 20 | 117 | 4.35 | 0.4 | 0.517241 | 0.689655 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.17094 | 117 | 5 | 24 | 23.4 | 0.896907 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
d5e05ca9e2f7c8b18d07eba0865326aa9c43e85b | 25,213 | py | Python | sdk/agrifood/azure-agrifood-farming/azure/agrifood/farming/operations/_weather_operations.py | rsdoherty/azure-sdk-for-python | 6bba5326677468e6660845a703686327178bb7b1 | [
"MIT"
] | 2,728 | 2015-01-09T10:19:32.000Z | 2022-03-31T14:50:33.000Z | sdk/agrifood/azure-agrifood-farming/azure/agrifood/farming/operations/_weather_operations.py | rsdoherty/azure-sdk-for-python | 6bba5326677468e6660845a703686327178bb7b1 | [
"MIT"
] | 17,773 | 2015-01-05T15:57:17.000Z | 2022-03-31T23:50:25.000Z | sdk/agrifood/azure-agrifood-farming/azure/agrifood/farming/operations/_weather_operations.py | rsdoherty/azure-sdk-for-python | 6bba5326677468e6660845a703686327178bb7b1 | [
"MIT"
] | 1,916 | 2015-01-19T05:05:41.000Z | 2022-03-31T19:36:44.000Z | # coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is regenerated.
# --------------------------------------------------------------------------
import datetime
from typing import TYPE_CHECKING
import warnings
from azure.core.exceptions import ClientAuthenticationError, HttpResponseError, ResourceExistsError, ResourceNotFoundError, map_error
from azure.core.paging import ItemPaged
from azure.core.pipeline import PipelineResponse
from azure.core.pipeline.transport import HttpRequest, HttpResponse
from azure.core.polling import LROPoller, NoPolling, PollingMethod
from azure.core.polling.base_polling import LROBasePolling
from .. import models as _models
if TYPE_CHECKING:
# pylint: disable=unused-import,ungrouped-imports
from typing import Any, Callable, Dict, Generic, Iterable, Optional, TypeVar, Union
T = TypeVar('T')
ClsType = Optional[Callable[[PipelineResponse[HttpRequest, HttpResponse], T, Dict[str, Any]], Any]]
class WeatherOperations(object):
"""WeatherOperations operations.
You should not instantiate this class directly. Instead, you should create a Client instance that
instantiates it for you and attaches it as an attribute.
:ivar models: Alias to model classes used in this operation group.
:type models: ~azure.agrifood.farming.models
:param client: Client for service requests.
:param config: Configuration of service client.
:param serializer: An object model serializer.
:param deserializer: An object model deserializer.
"""
models = _models
def __init__(self, client, config, serializer, deserializer):
self._client = client
self._serialize = serializer
self._deserialize = deserializer
self._config = config
def list(
self,
farmer_id, # type: str
boundary_id, # type: str
extension_id, # type: str
weather_data_type, # type: str
granularity, # type: str
start_date_time=None, # type: Optional[datetime.datetime]
end_date_time=None, # type: Optional[datetime.datetime]
max_page_size=50, # type: Optional[int]
skip_token=None, # type: Optional[str]
**kwargs # type: Any
):
# type: (...) -> Iterable["_models.WeatherDataListResponse"]
"""Returns a paginated list of weather data.
:param farmer_id: Farmer ID.
:type farmer_id: str
:param boundary_id: Boundary ID.
:type boundary_id: str
:param extension_id: ID of the weather extension.
:type extension_id: str
:param weather_data_type: Type of weather data (forecast/historical).
:type weather_data_type: str
:param granularity: Granularity of weather data (daily/hourly).
:type granularity: str
:param start_date_time: Weather data start UTC date-time (inclusive), sample format:
yyyy-MM-ddTHH:mm:ssZ.
:type start_date_time: ~datetime.datetime
:param end_date_time: Weather data end UTC date-time (inclusive), sample format:
yyyy-MM-ddTHH:mm:ssZ.
:type end_date_time: ~datetime.datetime
:param max_page_size: Maximum number of items needed (inclusive).
Minimum = 10, Maximum = 1000, Default value = 50.
:type max_page_size: int
:param skip_token: Skip token for getting next set of results.
:type skip_token: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: An iterator like instance of either WeatherDataListResponse or the result of cls(response)
:rtype: ~azure.core.paging.ItemPaged[~azure.agrifood.farming.models.WeatherDataListResponse]
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.WeatherDataListResponse"]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-03-31-preview"
accept = "application/json"
def prepare_request(next_link=None):
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
if not next_link:
# Construct URL
url = self.list.metadata['url'] # type: ignore
path_format_arguments = {
'Endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['farmerId'] = self._serialize.query("farmer_id", farmer_id, 'str')
query_parameters['boundaryId'] = self._serialize.query("boundary_id", boundary_id, 'str')
query_parameters['extensionId'] = self._serialize.query("extension_id", extension_id, 'str', pattern=r'^[A-za-z]{3,50}[.][A-za-z]{3,100}$')
query_parameters['weatherDataType'] = self._serialize.query("weather_data_type", weather_data_type, 'str', max_length=50, min_length=0)
query_parameters['granularity'] = self._serialize.query("granularity", granularity, 'str', max_length=50, min_length=0)
if start_date_time is not None:
query_parameters['startDateTime'] = self._serialize.query("start_date_time", start_date_time, 'iso-8601')
if end_date_time is not None:
query_parameters['endDateTime'] = self._serialize.query("end_date_time", end_date_time, 'iso-8601')
if max_page_size is not None:
query_parameters['$maxPageSize'] = self._serialize.query("max_page_size", max_page_size, 'int', maximum=1000, minimum=10)
if skip_token is not None:
query_parameters['$skipToken'] = self._serialize.query("skip_token", skip_token, 'str')
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
request = self._client.get(url, query_parameters, header_parameters)
else:
url = next_link
query_parameters = {} # type: Dict[str, Any]
path_format_arguments = {
'Endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
}
url = self._client.format_url(url, **path_format_arguments)
request = self._client.get(url, query_parameters, header_parameters)
return request
def extract_data(pipeline_response):
deserialized = self._deserialize('WeatherDataListResponse', pipeline_response)
list_of_elem = deserialized.value
if cls:
list_of_elem = cls(list_of_elem)
return deserialized.next_link or None, iter(list_of_elem)
def get_next(next_link=None):
request = prepare_request(next_link)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
map_error(status_code=response.status_code, response=response, error_map=error_map)
raise HttpResponseError(response=response, model=error)
return pipeline_response
return ItemPaged(
get_next, extract_data
)
list.metadata = {'url': '/weather'} # type: ignore
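# Illustrative usage sketch (not part of the generated client). It assumes
# a FarmBeatsClient instance named ``client``; the IDs and extension name
# below are hypothetical:
#
#   for weather_item in client.weather.list(
#           farmer_id="farmer1",
#           boundary_id="boundary1",
#           extension_id="provider.extension",
#           weather_data_type="forecast",
#           granularity="daily"):
#       print(weather_item)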
def get_data_ingestion_job_details(
self,
job_id, # type: str
**kwargs # type: Any
):
# type: (...) -> "_models.WeatherDataIngestionJob"
"""Get weather ingestion job.
:param job_id: ID of the job.
:type job_id: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: WeatherDataIngestionJob, or the result of cls(response)
:rtype: ~azure.agrifood.farming.models.WeatherDataIngestionJob
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.WeatherDataIngestionJob"]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-03-31-preview"
accept = "application/json"
# Construct URL
url = self.get_data_ingestion_job_details.metadata['url'] # type: ignore
path_format_arguments = {
'Endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
'jobId': self._serialize.url("job_id", job_id, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
request = self._client.get(url, query_parameters, header_parameters)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
raise HttpResponseError(response=response, model=error)
deserialized = self._deserialize('WeatherDataIngestionJob', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
get_data_ingestion_job_details.metadata = {'url': '/weather/ingest-data/{jobId}'} # type: ignore
def _create_data_ingestion_job_initial(
self,
job_id, # type: str
job=None, # type: Optional["_models.WeatherDataIngestionJob"]
**kwargs # type: Any
):
# type: (...) -> "_models.WeatherDataIngestionJob"
cls = kwargs.pop('cls', None) # type: ClsType["_models.WeatherDataIngestionJob"]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-03-31-preview"
content_type = kwargs.pop("content_type", "application/json")
accept = "application/json"
# Construct URL
url = self._create_data_ingestion_job_initial.metadata['url'] # type: ignore
path_format_arguments = {
'Endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
'jobId': self._serialize.url("job_id", job_id, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Content-Type'] = self._serialize.header("content_type", content_type, 'str')
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
body_content_kwargs = {} # type: Dict[str, Any]
if job is not None:
body_content = self._serialize.body(job, 'WeatherDataIngestionJob')
else:
body_content = None
body_content_kwargs['content'] = body_content
request = self._client.put(url, query_parameters, header_parameters, **body_content_kwargs)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [202]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
raise HttpResponseError(response=response, model=error)
deserialized = self._deserialize('WeatherDataIngestionJob', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
_create_data_ingestion_job_initial.metadata = {'url': '/weather/ingest-data/{jobId}'} # type: ignore
def begin_create_data_ingestion_job(
self,
job_id, # type: str
job=None, # type: Optional["_models.WeatherDataIngestionJob"]
**kwargs # type: Any
):
# type: (...) -> LROPoller["_models.WeatherDataIngestionJob"]
"""Create a weather data ingestion job.
:param job_id: Job id supplied by user.
:type job_id: str
:param job: Job parameters supplied by user.
:type job: ~azure.agrifood.farming.models.WeatherDataIngestionJob
:keyword callable cls: A custom type or function that will be passed the direct response
:keyword str continuation_token: A continuation token to restart a poller from a saved state.
:keyword polling: By default, your polling method will be LROBasePolling.
Pass in False for this operation to not poll, or pass in your own initialized polling object for a personal polling strategy.
:paramtype polling: bool or ~azure.core.polling.PollingMethod
:keyword int polling_interval: Default waiting time between two polls for LRO operations if no Retry-After header is present.
:return: An instance of LROPoller that returns either WeatherDataIngestionJob or the result of cls(response)
:rtype: ~azure.core.polling.LROPoller[~azure.agrifood.farming.models.WeatherDataIngestionJob]
:raises ~azure.core.exceptions.HttpResponseError:
"""
polling = kwargs.pop('polling', True) # type: Union[bool, PollingMethod]
cls = kwargs.pop('cls', None) # type: ClsType["_models.WeatherDataIngestionJob"]
lro_delay = kwargs.pop(
'polling_interval',
self._config.polling_interval
)
cont_token = kwargs.pop('continuation_token', None) # type: Optional[str]
if cont_token is None:
raw_result = self._create_data_ingestion_job_initial(
job_id=job_id,
job=job,
cls=lambda x,y,z: x,
**kwargs
)
kwargs.pop('error_map', None)
kwargs.pop('content_type', None)
def get_long_running_output(pipeline_response):
deserialized = self._deserialize('WeatherDataIngestionJob', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
path_format_arguments = {
'Endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
'jobId': self._serialize.url("job_id", job_id, 'str'),
}
if polling is True: polling_method = LROBasePolling(lro_delay, lro_options={'final-state-via': 'location'}, path_format_arguments=path_format_arguments, **kwargs)
elif polling is False: polling_method = NoPolling()
else: polling_method = polling
if cont_token:
return LROPoller.from_continuation_token(
polling_method=polling_method,
continuation_token=cont_token,
client=self._client,
deserialization_callback=get_long_running_output
)
else:
return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
begin_create_data_ingestion_job.metadata = {'url': '/weather/ingest-data/{jobId}'} # type: ignore
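# Illustrative usage sketch for the long-running operation (assumed
# ``client`` variable; the empty job payload is a placeholder, not taken
# from this file):
#
#   job = _models.WeatherDataIngestionJob()  # populate as required
#   poller = client.weather.begin_create_data_ingestion_job("job1", job=job)
#   result = poller.result()  # blocks until the ingestion job completes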
def get_data_delete_job_details(
self,
job_id, # type: str
**kwargs # type: Any
):
# type: (...) -> "_models.WeatherDataDeleteJob"
"""Get weather data delete job.
:param job_id: ID of the job.
:type job_id: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: WeatherDataDeleteJob, or the result of cls(response)
:rtype: ~azure.agrifood.farming.models.WeatherDataDeleteJob
:raises: ~azure.core.exceptions.HttpResponseError
"""
cls = kwargs.pop('cls', None) # type: ClsType["_models.WeatherDataDeleteJob"]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-03-31-preview"
accept = "application/json"
# Construct URL
url = self.get_data_delete_job_details.metadata['url'] # type: ignore
path_format_arguments = {
'Endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
'jobId': self._serialize.url("job_id", job_id, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
request = self._client.get(url, query_parameters, header_parameters)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [200]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
raise HttpResponseError(response=response, model=error)
deserialized = self._deserialize('WeatherDataDeleteJob', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
get_data_delete_job_details.metadata = {'url': '/weather/delete-data/{jobId}'} # type: ignore
def _create_data_delete_job_initial(
self,
job_id, # type: str
job=None, # type: Optional["_models.WeatherDataDeleteJob"]
**kwargs # type: Any
):
# type: (...) -> "_models.WeatherDataDeleteJob"
cls = kwargs.pop('cls', None) # type: ClsType["_models.WeatherDataDeleteJob"]
error_map = {
401: ClientAuthenticationError, 404: ResourceNotFoundError, 409: ResourceExistsError
}
error_map.update(kwargs.pop('error_map', {}))
api_version = "2021-03-31-preview"
content_type = kwargs.pop("content_type", "application/json")
accept = "application/json"
# Construct URL
url = self._create_data_delete_job_initial.metadata['url'] # type: ignore
path_format_arguments = {
'Endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
'jobId': self._serialize.url("job_id", job_id, 'str'),
}
url = self._client.format_url(url, **path_format_arguments)
# Construct parameters
query_parameters = {} # type: Dict[str, Any]
query_parameters['api-version'] = self._serialize.query("api_version", api_version, 'str')
# Construct headers
header_parameters = {} # type: Dict[str, Any]
header_parameters['Content-Type'] = self._serialize.header("content_type", content_type, 'str')
header_parameters['Accept'] = self._serialize.header("accept", accept, 'str')
body_content_kwargs = {} # type: Dict[str, Any]
if job is not None:
body_content = self._serialize.body(job, 'WeatherDataDeleteJob')
else:
body_content = None
body_content_kwargs['content'] = body_content
request = self._client.put(url, query_parameters, header_parameters, **body_content_kwargs)
pipeline_response = self._client._pipeline.run(request, stream=False, **kwargs)
response = pipeline_response.http_response
if response.status_code not in [202]:
map_error(status_code=response.status_code, response=response, error_map=error_map)
error = self._deserialize.failsafe_deserialize(_models.ErrorResponse, response)
raise HttpResponseError(response=response, model=error)
deserialized = self._deserialize('WeatherDataDeleteJob', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
_create_data_delete_job_initial.metadata = {'url': '/weather/delete-data/{jobId}'} # type: ignore
def begin_create_data_delete_job(
self,
job_id, # type: str
job=None, # type: Optional["_models.WeatherDataDeleteJob"]
**kwargs # type: Any
):
# type: (...) -> LROPoller["_models.WeatherDataDeleteJob"]
"""Create a weather data delete job.
:param job_id: Job ID supplied by end user.
:type job_id: str
:param job: Job parameters supplied by user.
:type job: ~azure.agrifood.farming.models.WeatherDataDeleteJob
:keyword callable cls: A custom type or function that will be passed the direct response
:keyword str continuation_token: A continuation token to restart a poller from a saved state.
:keyword polling: By default, your polling method will be LROBasePolling.
Pass in False for this operation to not poll, or pass in your own initialized polling object for a personal polling strategy.
:paramtype polling: bool or ~azure.core.polling.PollingMethod
:keyword int polling_interval: Default waiting time between two polls for LRO operations if no Retry-After header is present.
:return: An instance of LROPoller that returns either WeatherDataDeleteJob or the result of cls(response)
:rtype: ~azure.core.polling.LROPoller[~azure.agrifood.farming.models.WeatherDataDeleteJob]
:raises ~azure.core.exceptions.HttpResponseError:
"""
polling = kwargs.pop('polling', True) # type: Union[bool, PollingMethod]
cls = kwargs.pop('cls', None) # type: ClsType["_models.WeatherDataDeleteJob"]
lro_delay = kwargs.pop(
'polling_interval',
self._config.polling_interval
)
cont_token = kwargs.pop('continuation_token', None) # type: Optional[str]
if cont_token is None:
raw_result = self._create_data_delete_job_initial(
job_id=job_id,
job=job,
cls=lambda x,y,z: x,
**kwargs
)
kwargs.pop('error_map', None)
kwargs.pop('content_type', None)
def get_long_running_output(pipeline_response):
deserialized = self._deserialize('WeatherDataDeleteJob', pipeline_response)
if cls:
return cls(pipeline_response, deserialized, {})
return deserialized
path_format_arguments = {
'Endpoint': self._serialize.url("self._config.endpoint", self._config.endpoint, 'str', skip_quote=True),
'jobId': self._serialize.url("job_id", job_id, 'str'),
}
if polling is True: polling_method = LROBasePolling(lro_delay, lro_options={'final-state-via': 'location'}, path_format_arguments=path_format_arguments, **kwargs)
elif polling is False: polling_method = NoPolling()
else: polling_method = polling
if cont_token:
return LROPoller.from_continuation_token(
polling_method=polling_method,
continuation_token=cont_token,
client=self._client,
deserialization_callback=get_long_running_output
)
else:
return LROPoller(self._client, raw_result, get_long_running_output, polling_method)
begin_create_data_delete_job.metadata = {'url': '/weather/delete-data/{jobId}'} # type: ignore
| 48.300766 | 171 | 0.653155 | 2,792 | 25,213 | 5.676934 | 0.112822 | 0.031167 | 0.021577 | 0.011483 | 0.787192 | 0.760757 | 0.746877 | 0.725047 | 0.712744 | 0.68429 | 0 | 0.007218 | 0.24174 | 25,213 | 521 | 172 | 48.393474 | 0.821843 | 0.276524 | 0 | 0.702381 | 0 | 0.002976 | 0.100439 | 0.027662 | 0 | 0 | 0 | 0 | 0 | 1 | 0.03869 | false | 0 | 0.032738 | 0 | 0.136905 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
d5f984cf363d9697314215ade9881bcf26c5bcde | 32 | py | Python | dags/__init__.py | k2k1422/airflow | 065dee55345637134d87b9add7f4819200f6668e | [
"Apache-2.0"
] | null | null | null | dags/__init__.py | k2k1422/airflow | 065dee55345637134d87b9add7f4819200f6668e | [
"Apache-2.0"
] | null | null | null | dags/__init__.py | k2k1422/airflow | 065dee55345637134d87b9add7f4819200f6668e | [
"Apache-2.0"
] | null | null | null | print("Inside dags folder init") | 32 | 32 | 0.78125 | 5 | 32 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09375 | 32 | 1 | 32 | 32 | 0.862069 | 0 | 0 | 0 | 0 | 0 | 0.69697 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
91107bac69679943895ed3ddf785960392b72cc0 | 59 | py | Python | test/test_dot_render.py | gulan/jsdtools | 1707f7c1571dcde6eac456caadb625f691a16bba | [
"0BSD"
] | null | null | null | test/test_dot_render.py | gulan/jsdtools | 1707f7c1571dcde6eac456caadb625f691a16bba | [
"0BSD"
] | 4 | 2018-09-04T14:40:24.000Z | 2018-09-04T19:36:27.000Z | test/test_dot_render.py | gulan/jsdtools | 1707f7c1571dcde6eac456caadb625f691a16bba | [
"0BSD"
] | null | null | null | #!python
import jsdtools.dot as dot
def test_xxx(): pass
| 9.833333 | 26 | 0.728814 | 10 | 59 | 4.2 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.169492 | 59 | 5 | 27 | 11.8 | 0.857143 | 0.118644 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0.5 | 0.5 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 6 |
911225e0c759015806ad7c1d6866b621680ab03e | 236 | py | Python | themodelshop/utils/data/handlers/handle_pyarrow_Table.py | laraib-sidd/themodelshop | e811036eaf22f0d1b56b7b9c60912930a1fed3cb | [
"MIT"
] | 1 | 2021-01-12T16:13:14.000Z | 2021-01-12T16:13:14.000Z | themodelshop/utils/data/handlers/handle_pyarrow_Table.py | laraib-sidd/themodelshop | e811036eaf22f0d1b56b7b9c60912930a1fed3cb | [
"MIT"
] | 4 | 2020-11-30T12:32:39.000Z | 2021-01-08T12:20:39.000Z | themodelshop/utils/data/handlers/handle_pyarrow_Table.py | laraib-sidd/themodelshop | e811036eaf22f0d1b56b7b9c60912930a1fed3cb | [
"MIT"
] | 1 | 2021-01-12T16:13:20.000Z | 2021-01-12T16:13:20.000Z | """Provides read and write functionality for PyArrow Table"""
def read():
"""This is a function that reads a pyarrow Table from disk"""
pass
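# Minimal sketch of what these stubs might grow into, assuming Parquet on
# local disk and hypothetical ``path``/``table`` arguments (none of which
# the stubs specify):
#
#   import pyarrow.parquet as pq
#
#   def read(path):
#       return pq.read_table(path)
#
#   def write(table, path):
#       pq.write_table(table, path)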
def write():
"""This is a function that writes a pyarrow Table to disk"""
pass | 29.5 | 65 | 0.677966 | 36 | 236 | 4.444444 | 0.555556 | 0.225 | 0.0875 | 0.1875 | 0.2375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.220339 | 236 | 8 | 66 | 29.5 | 0.869565 | 0.70339 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0.5 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
9120f3a3b4afd94a06c4d15ba12253538c1d1d9a | 151 | py | Python | tests/test_game.py | yukinarit/ctetris | 5dcfcdb6f1b0e8d8aeeb864b4f6b7e1ed86d8a49 | [
"MIT"
] | 1 | 2020-10-24T14:18:05.000Z | 2020-10-24T14:18:05.000Z | tests/test_game.py | yukinarit/ctetris | 5dcfcdb6f1b0e8d8aeeb864b4f6b7e1ed86d8a49 | [
"MIT"
] | 3 | 2018-04-03T04:38:24.000Z | 2018-04-19T13:25:58.000Z | tests/test_game.py | yukinarit/py-tetris | 5dcfcdb6f1b0e8d8aeeb864b4f6b7e1ed86d8a49 | [
"MIT"
] | null | null | null | from tetris.game import GameObject
def test_renderable():
pass
def test_game_object():
obj = GameObject()
assert obj is not None
def test_collision():
pass
| 10.785714 | 34 | 0.695364 | 19 | 151 | 5.315789 | 0.631579 | 0.207921 | 0.336634 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.218543 | 151 | 13 | 35 | 11.615385 | 0.855932 | 0 | 0 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.428571 | false | 0.285714 | 0.142857 | 0 | 0.571429 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 6 |
e68dc48829717a52c2fc1ddb99d6974ceeb05a1d | 95 | py | Python | scenes/__init__.py | TheLokin/Kabalayn | 2034364e03e8eca909df11dcc393d70edd18493b | [
"MIT"
] | null | null | null | scenes/__init__.py | TheLokin/Kabalayn | 2034364e03e8eca909df11dcc393d70edd18493b | [
"MIT"
] | null | null | null | scenes/__init__.py | TheLokin/Kabalayn | 2034364e03e8eca909df11dcc393d70edd18493b | [
"MIT"
] | null | null | null | from .menu import *
from .stage import *
from .cutscene import *
from .director import Director | 23.75 | 30 | 0.768421 | 13 | 95 | 5.615385 | 0.461538 | 0.410959 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.157895 | 95 | 4 | 30 | 23.75 | 0.9125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
e696bfaffdee608bc91225b9146b413dba561bdb | 11,881 | py | Python | tests/unit/apis/test_queue.py | guvenbz/amazon-s3-find-and-forget | 398f7d86d38068c8a9d77ddc9183758946c9dbe4 | [
"Apache-2.0"
] | 165 | 2020-05-29T08:12:17.000Z | 2022-03-30T22:35:57.000Z | tests/unit/apis/test_queue.py | guvenbz/amazon-s3-find-and-forget | 398f7d86d38068c8a9d77ddc9183758946c9dbe4 | [
"Apache-2.0"
] | 101 | 2020-06-24T12:59:49.000Z | 2022-03-28T13:32:15.000Z | tests/unit/apis/test_queue.py | guvenbz/amazon-s3-find-and-forget | 398f7d86d38068c8a9d77ddc9183758946c9dbe4 | [
"Apache-2.0"
] | 23 | 2020-06-18T10:53:49.000Z | 2022-03-29T03:38:04.000Z | import json
import os
from types import SimpleNamespace
from decimal import Decimal
import pytest
from mock import patch, ANY
with patch.dict(os.environ, {"DeletionQueueTable": "DeletionQueueTable"}):
from backend.lambdas.queue import handlers
pytestmark = [pytest.mark.unit, pytest.mark.api, pytest.mark.queue]
autorization_mock = {
"authorizer": {
"claims": {"sub": "cognitoSub", "cognito:username": "cognitoUsername"}
}
}
@patch("backend.lambdas.queue.handlers.deletion_queue_table")
def test_it_retrieves_all_items(table):
table.scan.return_value = {"Items": []}
response = handlers.get_handler({}, SimpleNamespace())
assert {
"statusCode": 200,
"body": json.dumps({"MatchIds": [], "NextStart": None}),
"headers": ANY,
} == response
table.scan.assert_called_with(Limit=10)
@patch("backend.lambdas.queue.handlers.deletion_queue_table")
def test_it_retrieves_all_items_with_size_and_pagination(table):
table.scan.return_value = {
"Items": [
{
"DeletionQueueItemId": "id123",
"MatchId": "foo",
"DataMappers": [],
"CreatedAt": 123456789,
}
]
}
response = handlers.get_handler(
{"queryStringParameters": {"page_size": "1", "start_at": "id000"}},
SimpleNamespace(),
)
assert {
"statusCode": 200,
"body": json.dumps(
{
"MatchIds": [
{
"Type": "Simple",
"DeletionQueueItemId": "id123",
"MatchId": "foo",
"DataMappers": [],
"CreatedAt": 123456789,
}
],
"NextStart": "id123",
}
),
"headers": ANY,
} == response
table.scan.assert_called_with(
Limit=1, ExclusiveStartKey={"DeletionQueueItemId": "id000"}
)
@patch("backend.lambdas.queue.handlers.deletion_queue_table")
def test_it_adds_to_queue(table):
response = handlers.enqueue_handler(
{
"body": json.dumps({"MatchId": "test", "DataMappers": ["a"]}),
"requestContext": autorization_mock,
},
SimpleNamespace(),
)
assert 201 == response["statusCode"]
assert {
"DeletionQueueItemId": ANY,
"MatchId": "test",
"Type": "Simple",
"CreatedAt": ANY,
"DataMappers": ["a"],
"CreatedBy": {"Username": "cognitoUsername", "Sub": "cognitoSub"},
} == json.loads(response["body"])
@patch("backend.lambdas.queue.handlers.deletion_queue_table")
def test_it_adds_composite_to_queue(table):
mid = [{"Column": "first_name", "Value": "test"}]
response = handlers.enqueue_handler(
{
"body": json.dumps(
{"MatchId": mid, "Type": "Composite", "DataMappers": ["a"],}
),
"requestContext": autorization_mock,
},
SimpleNamespace(),
)
assert 201 == response["statusCode"]
assert {
"DeletionQueueItemId": ANY,
"MatchId": mid,
"Type": "Composite",
"CreatedAt": ANY,
"DataMappers": ["a"],
"CreatedBy": {"Username": "cognitoUsername", "Sub": "cognitoSub"},
} == json.loads(response["body"])
@patch("backend.lambdas.queue.handlers.deletion_queue_table")
def test_it_adds_batch_to_queue(table):
response = handlers.enqueue_batch_handler(
{
"body": json.dumps(
{
"Matches": [
{"MatchId": "test", "DataMappers": ["a"]},
{"MatchId": "test2", "DataMappers": ["a"]},
]
}
),
"requestContext": autorization_mock,
},
SimpleNamespace(),
)
assert 201 == response["statusCode"]
assert {
"Matches": [
{
"DeletionQueueItemId": ANY,
"MatchId": "test",
"Type": "Simple",
"CreatedAt": ANY,
"DataMappers": ["a"],
"CreatedBy": {"Username": "cognitoUsername", "Sub": "cognitoSub"},
},
{
"DeletionQueueItemId": ANY,
"MatchId": "test2",
"Type": "Simple",
"CreatedAt": ANY,
"DataMappers": ["a"],
"CreatedBy": {"Username": "cognitoUsername", "Sub": "cognitoSub"},
},
]
} == json.loads(response["body"])
@patch("backend.lambdas.queue.handlers.deletion_queue_table")
def test_it_provides_default_data_mappers(table):
response = handlers.enqueue_handler(
{"body": json.dumps({"MatchId": "test",}), "requestContext": autorization_mock},
SimpleNamespace(),
)
assert 201 == response["statusCode"]
assert {
"DeletionQueueItemId": ANY,
"MatchId": "test",
"Type": "Simple",
"CreatedAt": ANY,
"DataMappers": [],
"CreatedBy": {"Username": "cognitoUsername", "Sub": "cognitoSub"},
} == json.loads(response["body"])
@patch("backend.lambdas.queue.handlers.running_job_exists")
@patch("backend.lambdas.queue.handlers.deletion_queue_table")
def test_it_cancels_deletions(table, mock_running_job):
mock_running_job.return_value = False
response = handlers.cancel_handler(
{"body": json.dumps({"Matches": [{"DeletionQueueItemId": "id123"}],})},
SimpleNamespace(),
)
assert {"statusCode": 204, "headers": ANY} == response
@patch("backend.lambdas.queue.handlers.running_job_exists")
def test_it_prevents_cancelling_whilst_running_jobs(mock_running_job):
mock_running_job.return_value = True
response = handlers.cancel_handler(
{
"body": json.dumps(
{"Matches": [{"MatchId": "test", "CreatedAt": 123456789,}],}
)
},
SimpleNamespace(),
)
assert 400 == response["statusCode"]
assert "headers" in response
@patch("backend.lambdas.queue.handlers.bucket_count", 1)
@patch("backend.lambdas.queue.handlers.uuid")
@patch("backend.lambdas.queue.handlers.jobs_table")
@patch("backend.lambdas.queue.handlers.running_job_exists")
@patch("backend.lambdas.queue.handlers.get_config")
def test_it_process_queue(mock_config, mock_running_job, job_table, uuid):
mock_running_job.return_value = False
mock_config.return_value = {
"AthenaConcurrencyLimit": 15,
"AthenaQueryMaxRetries": 2,
"DeletionTasksMaxNumber": 50,
"QueryExecutionWaitSeconds": 5,
"QueryQueueWaitSeconds": 5,
"ForgetQueueWaitSeconds": 30,
}
uuid.uuid4.return_value = 123
response = handlers.process_handler(
{"body": "", "requestContext": autorization_mock}, SimpleNamespace()
)
job_table.put_item.assert_called_with(
Item={
"Id": "123",
"Sk": "123",
"Type": "Job",
"JobStatus": "QUEUED",
"GSIBucket": "0",
"CreatedAt": ANY,
"AthenaConcurrencyLimit": 15,
"AthenaQueryMaxRetries": 2,
"DeletionTasksMaxNumber": 50,
"QueryExecutionWaitSeconds": 5,
"QueryQueueWaitSeconds": 5,
"ForgetQueueWaitSeconds": 30,
"CreatedBy": {"Username": "cognitoUsername", "Sub": "cognitoSub"},
}
)
assert 202 == response["statusCode"]
assert "headers" in response
assert {
"Id": "123",
"Sk": "123",
"Type": "Job",
"JobStatus": "QUEUED",
"GSIBucket": "0",
"CreatedAt": ANY,
"AthenaConcurrencyLimit": 15,
"AthenaQueryMaxRetries": 2,
"DeletionTasksMaxNumber": 50,
"QueryExecutionWaitSeconds": 5,
"QueryQueueWaitSeconds": 5,
"ForgetQueueWaitSeconds": 30,
"CreatedBy": {"Username": "cognitoUsername", "Sub": "cognitoSub"},
} == json.loads(response["body"])
@patch("backend.lambdas.queue.handlers.bucket_count", 1)
@patch("backend.lambdas.queue.handlers.uuid")
@patch("backend.lambdas.queue.handlers.jobs_table")
@patch("backend.lambdas.queue.handlers.running_job_exists")
@patch("backend.lambdas.queue.handlers.get_config")
@patch("backend.lambdas.queue.handlers.utc_timestamp")
def test_it_applies_expiry(mock_utc, mock_config, mock_running_job, job_table, uuid):
mock_running_job.return_value = False
mock_utc.return_value = 12346789
mock_config.return_value = {
"AthenaConcurrencyLimit": 15,
"AthenaQueryMaxRetries": 2,
"DeletionTasksMaxNumber": 50,
"JobDetailsRetentionDays": 30,
"QueryExecutionWaitSeconds": 5,
"QueryQueueWaitSeconds": 5,
"ForgetQueueWaitSeconds": 30,
}
uuid.uuid4.return_value = 123
response = handlers.process_handler(
{"body": "", "requestContext": autorization_mock}, SimpleNamespace()
)
mock_utc.assert_called_with(days=30)
job_table.put_item.assert_called_with(
Item={
"Id": "123",
"Sk": "123",
"Type": "Job",
"JobStatus": "QUEUED",
"GSIBucket": "0",
"CreatedAt": ANY,
"Expires": 12346789,
"AthenaConcurrencyLimit": 15,
"AthenaQueryMaxRetries": 2,
"DeletionTasksMaxNumber": 50,
"QueryExecutionWaitSeconds": 5,
"QueryQueueWaitSeconds": 5,
"ForgetQueueWaitSeconds": 30,
"CreatedBy": {"Username": "cognitoUsername", "Sub": "cognitoSub"},
}
)
assert 202 == response["statusCode"]
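# When JobDetailsRetentionDays is present in the config, the handler appears
# to compute an expiry timestamp via utc_timestamp(days=<retention>) and
# store it as the Expires attribute (a DynamoDB TTL-style field).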
@patch("backend.lambdas.queue.handlers.running_job_exists")
def test_it_prevents_concurrent_running_jobs(mock_running_job):
mock_running_job.return_value = True
response = handlers.process_handler(
{"body": "", "requestContext": autorization_mock}, SimpleNamespace()
)
assert 400 == response["statusCode"]
assert "headers" in response
def test_it_validates_composite_queue_item_for_matchid_not_array():
items = [
{
"Type": "Composite",
"MatchId": "Test",
"Columns": ["column"],
"DataMappers": [],
}
]
with pytest.raises(ValueError) as e:
handlers.validate_queue_items(items)
assert e.value.args[0] == "MatchIds of Composite type need to be specified as array"
def test_it_validates_composite_queue_item_for_matchid_empty_array():
items = [
{"Type": "Composite", "MatchId": [], "Columns": ["column"], "DataMappers": []}
]
with pytest.raises(ValueError) as e:
handlers.validate_queue_items(items)
assert (
e.value.args[0]
== "MatchIds of Composite type need to have a value for at least one column"
)
def test_it_validates_composite_queue_item_for_data_mapper_empty():
items = [
{
"Type": "Composite",
"MatchId": [{"Column": "first_name", "Value": "Test"}],
"Columns": ["column"],
"DataMappers": [],
}
]
with pytest.raises(ValueError) as e:
handlers.validate_queue_items(items)
assert (
e.value.args[0]
== "MatchIds of Composite type need to be associated to exactly one Data Mapper"
)
def test_it_validates_composite_queue_item_for_too_many_data_mappers():
items = [
{
"Type": "Composite",
"MatchId": [{"Column": "first_name", "Value": "Test"}],
"Columns": ["column"],
"DataMappers": ["foo", "bar"],
}
]
with pytest.raises(ValueError) as e:
handlers.validate_queue_items(items)
assert (
e.value.args[0]
== "MatchIds of Composite type need to be associated to exactly one Data Mapper"
)
| 31.938172 | 88 | 0.581685 | 1,065 | 11,881 | 6.294836 | 0.161502 | 0.045943 | 0.062351 | 0.075179 | 0.84114 | 0.823837 | 0.803252 | 0.777148 | 0.723598 | 0.669004 | 0 | 0.021403 | 0.276408 | 11,881 | 371 | 89 | 32.024259 | 0.758404 | 0 | 0 | 0.638298 | 0 | 0 | 0.32371 | 0.140981 | 0 | 0 | 0 | 0 | 0.085106 | 1 | 0.045593 | false | 0 | 0.021277 | 0 | 0.066869 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
e6c385f37782b2984534b67503385ff09b4d41ce | 54 | py | Python | pytest/test-discovery/filetest.py | imsardine/learning | 925841ddd93d60c740a62e12d9f57ef15b6e0a20 | [
"MIT"
] | null | null | null | pytest/test-discovery/filetest.py | imsardine/learning | 925841ddd93d60c740a62e12d9f57ef15b6e0a20 | [
"MIT"
] | null | null | null | pytest/test-discovery/filetest.py | imsardine/learning | 925841ddd93d60c740a62e12d9f57ef15b6e0a20 | [
"MIT"
] | null | null | null | import unittest
def test_method():
    assert False
| 9 | 18 | 0.722222 | 7 | 54 | 5.428571 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.222222 | 54 | 5 | 19 | 10.8 | 0.904762 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
fc10b1923f3a88be7ed4cacbe0bf2a6c10d4ac89 | 45,628 | py | Python | os_ken/tests/unit/packet/test_slow.py | faucetsdn/python3-os-ken | 31037f6388b7885c859391802451b867c30f1694 | [
"Apache-2.0"
] | 4 | 2018-10-25T08:42:56.000Z | 2019-04-24T04:01:26.000Z | os_ken/tests/unit/packet/test_slow.py | anlaneg/os-ken | 379a7694c3129cc0156343af71f4fca8830d9de5 | [
"Apache-2.0"
] | 1 | 2021-05-09T06:14:16.000Z | 2021-05-09T06:14:18.000Z | os_ken/tests/unit/packet/test_slow.py | anlaneg/os-ken | 379a7694c3129cc0156343af71f4fca8830d9de5 | [
"Apache-2.0"
] | 5 | 2019-04-24T04:01:01.000Z | 2020-06-20T14:38:04.000Z | # Copyright (C) 2013 Nippon Telegraph and Telephone Corporation.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# vim: tabstop=4 shiftwidth=4 softtabstop=4
import copy
import logging
from struct import pack, unpack_from
import unittest
from nose.tools import ok_, eq_, raises
from os_ken.ofproto import ether
from os_ken.lib.packet.ethernet import ethernet
from os_ken.lib.packet.packet import Packet
from os_ken.lib import addrconv
from os_ken.lib.packet.slow import slow, lacp
from os_ken.lib.packet.slow import SLOW_PROTOCOL_MULTICAST
from os_ken.lib.packet.slow import SLOW_SUBTYPE_LACP
from os_ken.lib.packet.slow import SLOW_SUBTYPE_MARKER
LOG = logging.getLogger(__name__)
class Test_slow(unittest.TestCase):
""" Test case for Slow Protocol
"""
def setUp(self):
self.subtype = SLOW_SUBTYPE_LACP
self.version = lacp.LACP_VERSION_NUMBER
self.actor_tag = lacp.LACP_TLV_TYPE_ACTOR
self.actor_length = 20
self.actor_system_priority = 65534
self.actor_system = '00:07:0d:af:f4:54'
self.actor_key = 1
self.actor_port_priority = 65535
self.actor_port = 1
self.actor_state_activity = lacp.LACP_STATE_ACTIVE
self.actor_state_timeout = lacp.LACP_STATE_LONG_TIMEOUT
self.actor_state_aggregation = lacp.LACP_STATE_AGGREGATEABLE
self.actor_state_synchronization = lacp.LACP_STATE_IN_SYNC
self.actor_state_collecting = lacp.LACP_STATE_COLLECTING_ENABLED
self.actor_state_distributing = lacp.LACP_STATE_DISTRIBUTING_ENABLED
self.actor_state_defaulted = lacp.LACP_STATE_OPERATIONAL_PARTNER
self.actor_state_expired = lacp.LACP_STATE_EXPIRED
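# The actor-state octet packs eight one-bit flags, LSB first:
# bit 0 activity, 1 timeout, 2 aggregation, 3 synchronization,
# bit 4 collecting, 5 distributing, 6 defaulted, 7 expired.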
self.actor_state = (
(self.actor_state_activity << 0) |
(self.actor_state_timeout << 1) |
(self.actor_state_aggregation << 2) |
(self.actor_state_synchronization << 3) |
(self.actor_state_collecting << 4) |
(self.actor_state_distributing << 5) |
(self.actor_state_defaulted << 6) |
(self.actor_state_expired << 7))
self.partner_tag = lacp.LACP_TLV_TYPE_PARTNER
self.partner_length = 20
self.partner_system_priority = 0
self.partner_system = '00:00:00:00:00:00'
self.partner_key = 0
self.partner_port_priority = 0
self.partner_port = 0
self.partner_state_activity = 0
self.partner_state_timeout = lacp.LACP_STATE_SHORT_TIMEOUT
self.partner_state_aggregation = 0
self.partner_state_synchronization = 0
self.partner_state_collecting = 0
self.partner_state_distributing = 0
self.partner_state_defaulted = 0
self.partner_state_expired = 0
self.partner_state = (
(self.partner_state_activity << 0) |
(self.partner_state_timeout << 1) |
(self.partner_state_aggregation << 2) |
(self.partner_state_synchronization << 3) |
(self.partner_state_collecting << 4) |
(self.partner_state_distributing << 5) |
(self.partner_state_defaulted << 6) |
(self.partner_state_expired << 7))
self.collector_tag = lacp.LACP_TLV_TYPE_COLLECTOR
self.collector_length = 16
self.collector_max_delay = 0
self.terminator_tag = lacp.LACP_TLV_TYPE_TERMINATOR
self.terminator_length = 0
self.head_fmt = lacp._HLEN_PACK_STR
self.head_len = lacp._HLEN_PACK_LEN
self.act_fmt = lacp._ACTPRT_INFO_PACK_STR
self.act_len = lacp._ACTPRT_INFO_PACK_LEN
self.prt_fmt = lacp._ACTPRT_INFO_PACK_STR
self.prt_len = lacp._ACTPRT_INFO_PACK_LEN
self.col_fmt = lacp._COL_INFO_PACK_STR
self.col_len = lacp._COL_INFO_PACK_LEN
self.trm_fmt = lacp._TRM_PACK_STR
self.trm_len = lacp._TRM_PACK_LEN
self.length = lacp._ALL_PACK_LEN
self.head_buf = pack(self.head_fmt,
self.subtype,
self.version)
self.act_buf = pack(self.act_fmt,
self.actor_tag,
self.actor_length,
self.actor_system_priority,
addrconv.mac.text_to_bin(self.actor_system),
self.actor_key,
self.actor_port_priority,
self.actor_port,
self.actor_state)
self.prt_buf = pack(self.prt_fmt,
self.partner_tag,
self.partner_length,
self.partner_system_priority,
addrconv.mac.text_to_bin(self.partner_system),
self.partner_key,
self.partner_port_priority,
self.partner_port,
self.partner_state)
self.col_buf = pack(self.col_fmt,
self.collector_tag,
self.collector_length,
self.collector_max_delay)
self.trm_buf = pack(self.trm_fmt,
self.terminator_tag,
self.terminator_length)
self.buf = self.head_buf + self.act_buf + self.prt_buf + \
self.col_buf + self.trm_buf
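# self.buf now mirrors a complete on-wire LACPDU per IEEE 802.1AX:
# slow-protocol header, actor TLV (type 1, len 20), partner TLV
# (type 2, len 20), collector TLV (type 3, len 16), terminator TLV
# (type 0, len 0).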
def tearDown(self):
pass
def test_parser(self):
slow.parser(self.buf)
def test_not_implemented_subtype(self):
not_implemented_buf = pack(
slow._PACK_STR, SLOW_SUBTYPE_MARKER) + self.buf[1:]
(instance, nexttype, last) = slow.parser(not_implemented_buf)
assert instance is None
assert nexttype is None
assert last is not None
def test_invalid_subtype(self):
invalid_buf = b'\xff' + self.buf[1:]
(instance, nexttype, last) = slow.parser(invalid_buf)
assert instance is None
assert nexttype is None
assert last is not None
class Test_lacp(unittest.TestCase):
""" Test case for lacp
"""
def setUp(self):
self.subtype = SLOW_SUBTYPE_LACP
self.version = lacp.LACP_VERSION_NUMBER
self.actor_tag = lacp.LACP_TLV_TYPE_ACTOR
self.actor_length = 20
self.actor_system_priority = 65534
self.actor_system = '00:07:0d:af:f4:54'
self.actor_key = 1
self.actor_port_priority = 65535
self.actor_port = 1
self.actor_state_activity = lacp.LACP_STATE_ACTIVE
self.actor_state_timeout = lacp.LACP_STATE_LONG_TIMEOUT
self.actor_state_aggregation = lacp.LACP_STATE_AGGREGATEABLE
self.actor_state_synchronization = lacp.LACP_STATE_IN_SYNC
self.actor_state_collecting = lacp.LACP_STATE_COLLECTING_ENABLED
self.actor_state_distributing = lacp.LACP_STATE_DISTRIBUTING_ENABLED
self.actor_state_defaulted = lacp.LACP_STATE_OPERATIONAL_PARTNER
self.actor_state_expired = lacp.LACP_STATE_EXPIRED
self.actor_state = (
(self.actor_state_activity << 0) |
(self.actor_state_timeout << 1) |
(self.actor_state_aggregation << 2) |
(self.actor_state_synchronization << 3) |
(self.actor_state_collecting << 4) |
(self.actor_state_distributing << 5) |
(self.actor_state_defaulted << 6) |
(self.actor_state_expired << 7))
self.partner_tag = lacp.LACP_TLV_TYPE_PARTNER
self.partner_length = 20
self.partner_system_priority = 0
self.partner_system = '00:00:00:00:00:00'
self.partner_key = 0
self.partner_port_priority = 0
self.partner_port = 0
self.partner_state_activity = 0
self.partner_state_timeout = lacp.LACP_STATE_SHORT_TIMEOUT
self.partner_state_aggregation = 0
self.partner_state_synchronization = 0
self.partner_state_collecting = 0
self.partner_state_distributing = 0
self.partner_state_defaulted = 0
self.partner_state_expired = 0
self.partner_state = (
(self.partner_state_activity << 0) |
(self.partner_state_timeout << 1) |
(self.partner_state_aggregation << 2) |
(self.partner_state_synchronization << 3) |
(self.partner_state_collecting << 4) |
(self.partner_state_distributing << 5) |
(self.partner_state_defaulted << 6) |
(self.partner_state_expired << 7))
self.collector_tag = lacp.LACP_TLV_TYPE_COLLECTOR
self.collector_length = 16
self.collector_max_delay = 0
self.terminator_tag = lacp.LACP_TLV_TYPE_TERMINATOR
self.terminator_length = 0
self.head_fmt = lacp._HLEN_PACK_STR
self.head_len = lacp._HLEN_PACK_LEN
self.act_fmt = lacp._ACTPRT_INFO_PACK_STR
self.act_len = lacp._ACTPRT_INFO_PACK_LEN
self.prt_fmt = lacp._ACTPRT_INFO_PACK_STR
self.prt_len = lacp._ACTPRT_INFO_PACK_LEN
self.col_fmt = lacp._COL_INFO_PACK_STR
self.col_len = lacp._COL_INFO_PACK_LEN
self.trm_fmt = lacp._TRM_PACK_STR
self.trm_len = lacp._TRM_PACK_LEN
self.length = lacp._ALL_PACK_LEN
self.head_buf = pack(self.head_fmt,
self.subtype,
self.version)
self.act_buf = pack(self.act_fmt,
self.actor_tag,
self.actor_length,
self.actor_system_priority,
addrconv.mac.text_to_bin(self.actor_system),
self.actor_key,
self.actor_port_priority,
self.actor_port,
self.actor_state)
self.prt_buf = pack(self.prt_fmt,
self.partner_tag,
self.partner_length,
self.partner_system_priority,
addrconv.mac.text_to_bin(self.partner_system),
self.partner_key,
self.partner_port_priority,
self.partner_port,
self.partner_state)
self.col_buf = pack(self.col_fmt,
self.collector_tag,
self.collector_length,
self.collector_max_delay)
self.trm_buf = pack(self.trm_fmt,
self.terminator_tag,
self.terminator_length)
self.buf = self.head_buf + self.act_buf + self.prt_buf + \
self.col_buf + self.trm_buf
self.l = lacp(self.version,
self.actor_system_priority,
self.actor_system,
self.actor_key,
self.actor_port_priority,
self.actor_port,
self.actor_state_activity,
self.actor_state_timeout,
self.actor_state_aggregation,
self.actor_state_synchronization,
self.actor_state_collecting,
self.actor_state_distributing,
self.actor_state_defaulted,
self.actor_state_expired,
self.partner_system_priority,
self.partner_system,
self.partner_key,
self.partner_port_priority,
self.partner_port,
self.partner_state_activity,
self.partner_state_timeout,
self.partner_state_aggregation,
self.partner_state_synchronization,
self.partner_state_collecting,
self.partner_state_distributing,
self.partner_state_defaulted,
self.partner_state_expired,
self.collector_max_delay)
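# Note that the lacp() constructor takes the individual one-bit state flags
# rather than the packed state octets; the class derives the packed
# _actor_state/_partner_state values that test_init checks below.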
def tearDown(self):
pass
def find_protocol(self, pkt, name):
for p in pkt.protocols:
if p.protocol_name == name:
return p
def test_init(self):
eq_(self.subtype, self.l._subtype)
eq_(self.version, self.l.version)
eq_(self.actor_tag, self.l._actor_tag)
eq_(self.actor_length, self.l._actor_length)
eq_(self.actor_system_priority, self.l.actor_system_priority)
eq_(self.actor_system, self.l.actor_system)
eq_(self.actor_key, self.l.actor_key)
eq_(self.actor_port_priority, self.l.actor_port_priority)
eq_(self.actor_port, self.l.actor_port)
eq_(self.actor_state_activity, self.l.actor_state_activity)
eq_(self.actor_state_timeout, self.l.actor_state_timeout)
eq_(self.actor_state_aggregation,
self.l.actor_state_aggregation)
eq_(self.actor_state_synchronization,
self.l.actor_state_synchronization)
eq_(self.actor_state_collecting,
self.l.actor_state_collecting)
eq_(self.actor_state_distributing,
self.l.actor_state_distributing)
eq_(self.actor_state_defaulted, self.l.actor_state_defaulted)
eq_(self.actor_state_expired, self.l.actor_state_expired)
eq_(self.actor_state, self.l._actor_state)
eq_(self.partner_tag, self.l._partner_tag)
eq_(self.partner_length, self.l._partner_length)
eq_(self.partner_system_priority,
self.l.partner_system_priority)
eq_(self.partner_system, self.l.partner_system)
eq_(self.partner_key, self.l.partner_key)
eq_(self.partner_port_priority, self.l.partner_port_priority)
eq_(self.partner_port, self.l.partner_port)
eq_(self.partner_state_activity, self.l.partner_state_activity)
eq_(self.partner_state_timeout, self.l.partner_state_timeout)
eq_(self.partner_state_aggregation,
self.l.partner_state_aggregation)
eq_(self.partner_state_synchronization,
self.l.partner_state_synchronization)
eq_(self.partner_state_collecting,
self.l.partner_state_collecting)
eq_(self.partner_state_distributing,
self.l.partner_state_distributing)
eq_(self.partner_state_defaulted,
self.l.partner_state_defaulted)
eq_(self.partner_state_expired, self.l.partner_state_expired)
eq_(self.partner_state, self.l._partner_state)
eq_(self.collector_tag, self.l._collector_tag)
eq_(self.collector_length, self.l._collector_length)
eq_(self.collector_max_delay, self.l.collector_max_delay)
eq_(self.terminator_tag, self.l._terminator_tag)
eq_(self.terminator_length, self.l._terminator_length)
def test_parser(self):
_res = self.l.parser(self.buf)
if type(_res) is tuple:
res = _res[0]
else:
res = _res
eq_(res._subtype, self.subtype)
eq_(res.version, self.version)
eq_(res._actor_tag, self.actor_tag)
eq_(res._actor_length, self.actor_length)
eq_(res.actor_system_priority, self.actor_system_priority)
eq_(res.actor_system, self.actor_system)
eq_(res.actor_key, self.actor_key)
eq_(res.actor_port_priority, self.actor_port_priority)
eq_(res.actor_port, self.actor_port)
eq_(res.actor_state_activity, self.actor_state_activity)
eq_(res.actor_state_timeout, self.actor_state_timeout)
eq_(res.actor_state_aggregation, self.actor_state_aggregation)
eq_(res.actor_state_synchronization,
self.actor_state_synchronization)
eq_(res.actor_state_collecting, self.actor_state_collecting)
eq_(res.actor_state_distributing, self.actor_state_distributing)
eq_(res.actor_state_defaulted, self.actor_state_defaulted)
eq_(res.actor_state_expired, self.actor_state_expired)
eq_(res._actor_state, self.actor_state)
eq_(res._partner_tag, self.partner_tag)
eq_(res._partner_length, self.partner_length)
eq_(res.partner_system_priority, self.partner_system_priority)
eq_(res.partner_system, self.partner_system)
eq_(res.partner_key, self.partner_key)
eq_(res.partner_port_priority, self.partner_port_priority)
eq_(res.partner_port, self.partner_port)
eq_(res.partner_state_activity, self.partner_state_activity)
eq_(res.partner_state_timeout, self.partner_state_timeout)
eq_(res.partner_state_aggregation,
self.partner_state_aggregation)
eq_(res.partner_state_synchronization,
self.partner_state_synchronization)
eq_(res.partner_state_collecting, self.partner_state_collecting)
eq_(res.partner_state_distributing,
self.partner_state_distributing)
eq_(res.partner_state_defaulted, self.partner_state_defaulted)
eq_(res.partner_state_expired, self.partner_state_expired)
eq_(res._partner_state, self.partner_state)
eq_(res._collector_tag, self.collector_tag)
eq_(res._collector_length, self.collector_length)
eq_(res.collector_max_delay, self.collector_max_delay)
eq_(res._terminator_tag, self.terminator_tag)
eq_(res._terminator_length, self.terminator_length)
def test_serialize(self):
data = bytearray()
prev = None
buf = self.l.serialize(data, prev)
offset = 0
head_res = unpack_from(self.head_fmt, buf, offset)
offset += self.head_len
act_res = unpack_from(self.act_fmt, buf, offset)
offset += self.act_len
prt_res = unpack_from(self.prt_fmt, buf, offset)
offset += self.prt_len
col_res = unpack_from(self.col_fmt, buf, offset)
offset += self.col_len
trm_res = unpack_from(self.trm_fmt, buf, offset)
eq_(head_res[0], self.subtype)
eq_(head_res[1], self.version)
eq_(act_res[0], self.actor_tag)
eq_(act_res[1], self.actor_length)
eq_(act_res[2], self.actor_system_priority)
eq_(act_res[3], addrconv.mac.text_to_bin(self.actor_system))
eq_(act_res[4], self.actor_key)
eq_(act_res[5], self.actor_port_priority)
eq_(act_res[6], self.actor_port)
eq_(act_res[7], self.actor_state)
eq_(prt_res[0], self.partner_tag)
eq_(prt_res[1], self.partner_length)
eq_(prt_res[2], self.partner_system_priority)
eq_(prt_res[3], addrconv.mac.text_to_bin(self.partner_system))
eq_(prt_res[4], self.partner_key)
eq_(prt_res[5], self.partner_port_priority)
eq_(prt_res[6], self.partner_port)
eq_(prt_res[7], self.partner_state)
eq_(col_res[0], self.collector_tag)
eq_(col_res[1], self.collector_length)
eq_(col_res[2], self.collector_max_delay)
eq_(trm_res[0], self.terminator_tag)
eq_(trm_res[1], self.terminator_length)
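# test_serialize walks the emitted bytes with the same struct formats used
# to build the fixtures in setUp, giving a field-by-field round-trip check
# that does not depend on the parser.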
def _build_lacp(self):
ethertype = ether.ETH_TYPE_SLOW
dst = SLOW_PROTOCOL_MULTICAST
e = ethernet(dst, self.actor_system, ethertype)
p = Packet()
p.add_protocol(e)
p.add_protocol(self.l)
p.serialize()
return p
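# _build_lacp wraps the LACPDU in an Ethernet frame addressed to
# SLOW_PROTOCOL_MULTICAST (the slow-protocols group address
# 01:80:c2:00:00:02) with ethertype ETH_TYPE_SLOW (0x8809).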
def test_build_lacp(self):
p = self._build_lacp()
e = self.find_protocol(p, "ethernet")
ok_(e)
eq_(e.ethertype, ether.ETH_TYPE_SLOW)
l = self.find_protocol(p, "lacp")
ok_(l)
eq_(l._subtype, self.subtype)
eq_(l.version, self.version)
eq_(l._actor_tag, self.actor_tag)
eq_(l._actor_length, self.actor_length)
eq_(l.actor_system_priority, self.actor_system_priority)
eq_(l.actor_system, self.actor_system)
eq_(l.actor_key, self.actor_key)
eq_(l.actor_port_priority, self.actor_port_priority)
eq_(l.actor_port, self.actor_port)
eq_(l.actor_state_activity, self.actor_state_activity)
eq_(l.actor_state_timeout, self.actor_state_timeout)
eq_(l.actor_state_aggregation, self.actor_state_aggregation)
eq_(l.actor_state_synchronization,
self.actor_state_synchronization)
eq_(l.actor_state_collecting, self.actor_state_collecting)
eq_(l.actor_state_distributing, self.actor_state_distributing)
eq_(l.actor_state_defaulted, self.actor_state_defaulted)
eq_(l.actor_state_expired, self.actor_state_expired)
eq_(l._actor_state, self.actor_state)
eq_(l._partner_tag, self.partner_tag)
eq_(l._partner_length, self.partner_length)
eq_(l.partner_system_priority, self.partner_system_priority)
eq_(l.partner_system, self.partner_system)
eq_(l.partner_key, self.partner_key)
eq_(l.partner_port_priority, self.partner_port_priority)
eq_(l.partner_port, self.partner_port)
eq_(l.partner_state_activity, self.partner_state_activity)
eq_(l.partner_state_timeout, self.partner_state_timeout)
eq_(l.partner_state_aggregation, self.partner_state_aggregation)
eq_(l.partner_state_synchronization,
self.partner_state_synchronization)
eq_(l.partner_state_collecting, self.partner_state_collecting)
eq_(l.partner_state_distributing,
self.partner_state_distributing)
eq_(l.partner_state_defaulted, self.partner_state_defaulted)
eq_(l.partner_state_expired, self.partner_state_expired)
eq_(l._partner_state, self.partner_state)
eq_(l._collector_tag, self.collector_tag)
eq_(l._collector_length, self.collector_length)
eq_(l.collector_max_delay, self.collector_max_delay)
eq_(l._terminator_tag, self.terminator_tag)
eq_(l._terminator_length, self.terminator_length)
@raises(Exception)
def test_malformed_lacp(self):
m_short_buf = self.buf[1:self.length]
slow.parser(m_short_buf)
@raises(Exception)
def test_invalid_subtype(self):
invalid_lacv = copy.deepcopy(self.l)
invalid_lacv.subtype = 0xff
invalid_buf = invalid_lacv.serialize()
slow.parser(invalid_buf)
@raises(Exception)
def test_invalid_version(self):
invalid_lacv = copy.deepcopy(self.l)
invalid_lacv.version = 0xff
invalid_buf = invalid_lacv.serialize()
slow.parser(invalid_buf)
@raises(Exception)
def test_invalid_actor_tag(self):
invalid_lacv = copy.deepcopy(self.l)
invalid_lacv.actor_tag = 0x04
invalid_buf = invalid_lacv.serialize()
slow.parser(invalid_buf)
@raises(Exception)
def test_invalid_actor_length(self):
invalid_lacv = copy.deepcopy(self.l)
invalid_lacv.actor_length = 50
invalid_buf = invalid_lacv.serialize()
slow.parser(invalid_buf)
@raises(Exception)
def test_invalid_partner_tag(self):
invalid_lacv = copy.deepcopy(self.l)
invalid_lacv.partner_tag = 0x01
invalid_buf = invalid_lacv.serialize()
slow.parser(invalid_buf)
@raises(Exception)
def test_invalid_partner_length(self):
invalid_lacv = copy.deepcopy(self.l)
invalid_lacv.partner_length = 0
invalid_buf = invalid_lacv.serialize()
slow.parser(invalid_buf)
@raises(Exception)
def test_invalid_collector_tag(self):
invalid_lacv = copy.deepcopy(self.l)
invalid_lacv.collector_tag = 0x00
invalid_buf = invalid_lacv.serialize()
slow.parser(invalid_buf)
@raises(Exception)
def test_invalid_collector_length(self):
invalid_lacv = copy.deepcopy(self.l)
invalid_lacv.collector_length = 20
invalid_buf = invalid_lacv.serialize()
slow.parser(invalid_buf)
@raises(Exception)
def test_invalid_terminator_tag(self):
invalid_lacv = copy.deepcopy(self.l)
invalid_lacv.terminator_tag = 0x04
invalid_buf = invalid_lacv.serialize()
slow.parser(invalid_buf)
@raises(Exception)
def test_invalid_terminator_length(self):
invalid_lacv = copy.deepcopy(self.l)
invalid_lacv.terminator_length = self.trm_len
invalid_buf = invalid_lacv.serialize()
slow.parser(invalid_buf)
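# The @raises cases below feed an out-of-range flag value (2 or -1) into the
# lacp constructor one position at a time; serialize() is expected to reject
# anything that does not fit the corresponding one-bit field.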
@raises(Exception)
def test_invalid_actor_state_activity(self):
l = lacp(self.version,
self.actor_system_priority,
self.actor_system,
self.actor_key,
self.actor_port_priority,
self.actor_port,
2,
self.actor_state_timeout,
self.actor_state_aggregation,
self.actor_state_synchronization,
self.actor_state_collecting,
self.actor_state_distributing,
self.actor_state_defaulted,
self.actor_state_expired,
self.partner_system_priority,
self.partner_system,
self.partner_key,
self.partner_port_priority,
self.partner_port,
self.partner_state_activity,
self.partner_state_timeout,
self.partner_state_aggregation,
self.partner_state_synchronization,
self.partner_state_collecting,
self.partner_state_distributing,
self.partner_state_defaulted,
self.partner_state_expired,
self.collector_max_delay)
l.serialize()
@raises(Exception)
def test_invalid_actor_state_timeout(self):
l = lacp(self.version,
self.actor_system_priority,
self.actor_system,
self.actor_key,
self.actor_port_priority,
self.actor_port,
self.actor_state_activity,
2,
self.actor_state_aggregation,
self.actor_state_synchronization,
self.actor_state_collecting,
self.actor_state_distributing,
self.actor_state_defaulted,
self.actor_state_expired,
self.partner_system_priority,
self.partner_system,
self.partner_key,
self.partner_port_priority,
self.partner_port,
self.partner_state_activity,
self.partner_state_timeout,
self.partner_state_aggregation,
self.partner_state_synchronization,
self.partner_state_collecting,
self.partner_state_distributing,
self.partner_state_defaulted,
self.partner_state_expired,
self.collector_max_delay)
l.serialize()
@raises(Exception)
def test_invalid_actor_state_aggregation(self):
l = lacp(self.version,
self.actor_system_priority,
self.actor_system,
self.actor_key,
self.actor_port_priority,
self.actor_port,
self.actor_state_activity,
self.actor_state_timeout,
2,
self.actor_state_synchronization,
self.actor_state_collecting,
self.actor_state_distributing,
self.actor_state_defaulted,
self.actor_state_expired,
self.partner_system_priority,
self.partner_system,
self.partner_key,
self.partner_port_priority,
self.partner_port,
self.partner_state_activity,
self.partner_state_timeout,
self.partner_state_aggregation,
self.partner_state_synchronization,
self.partner_state_collecting,
self.partner_state_distributing,
self.partner_state_defaulted,
self.partner_state_expired,
self.collector_max_delay)
l.serialize()
@raises(Exception)
def test_invalid_actor_state_synchronization(self):
l = lacp(self.version,
self.actor_system_priority,
self.actor_system,
self.actor_key,
self.actor_port_priority,
self.actor_port,
self.actor_state_activity,
self.actor_state_timeout,
self.actor_state_aggregation,
2,
self.actor_state_collecting,
self.actor_state_distributing,
self.actor_state_defaulted,
self.actor_state_expired,
self.partner_system_priority,
self.partner_system,
self.partner_key,
self.partner_port_priority,
self.partner_port,
self.partner_state_activity,
self.partner_state_timeout,
self.partner_state_aggregation,
self.partner_state_synchronization,
self.partner_state_collecting,
self.partner_state_distributing,
self.partner_state_defaulted,
self.partner_state_expired,
self.collector_max_delay)
l.serialize()
@raises(Exception)
def test_invalid_actor_state_collecting(self):
l = lacp(self.version,
self.actor_system_priority,
self.actor_system,
self.actor_key,
self.actor_port_priority,
self.actor_port,
self.actor_state_activity,
self.actor_state_timeout,
self.actor_state_aggregation,
self.actor_state_synchronization,
2,
self.actor_state_distributing,
self.actor_state_defaulted,
self.actor_state_expired,
self.partner_system_priority,
self.partner_system,
self.partner_key,
self.partner_port_priority,
self.partner_port,
self.partner_state_activity,
self.partner_state_timeout,
self.partner_state_aggregation,
self.partner_state_synchronization,
self.partner_state_collecting,
self.partner_state_distributing,
self.partner_state_defaulted,
self.partner_state_expired,
self.collector_max_delay)
l.serialize()
@raises(Exception)
def test_invalid_actor_state_distributing(self):
l = lacp(self.version,
self.actor_system_priority,
self.actor_system,
self.actor_key,
self.actor_port_priority,
self.actor_port,
self.actor_state_activity,
self.actor_state_timeout,
self.actor_state_aggregation,
self.actor_state_synchronization,
self.actor_state_collecting,
2,
self.actor_state_defaulted,
self.actor_state_expired,
self.partner_system_priority,
self.partner_system,
self.partner_key,
self.partner_port_priority,
self.partner_port,
self.partner_state_activity,
self.partner_state_timeout,
self.partner_state_aggregation,
self.partner_state_synchronization,
self.partner_state_collecting,
self.partner_state_distributing,
self.partner_state_defaulted,
self.partner_state_expired,
self.collector_max_delay)
l.serialize()
@raises(Exception)
def test_invalid_actor_state_defaulted(self):
l = lacp(self.version,
self.actor_system_priority,
self.actor_system,
self.actor_key,
self.actor_port_priority,
self.actor_port,
self.actor_state_activity,
self.actor_state_timeout,
self.actor_state_aggregation,
self.actor_state_synchronization,
self.actor_state_collecting,
self.actor_state_distributing,
2,
self.actor_state_expired,
self.partner_system_priority,
self.partner_system,
self.partner_key,
self.partner_port_priority,
self.partner_port,
self.partner_state_activity,
self.partner_state_timeout,
self.partner_state_aggregation,
self.partner_state_synchronization,
self.partner_state_collecting,
self.partner_state_distributing,
self.partner_state_defaulted,
self.partner_state_expired,
self.collector_max_delay)
l.serialize()
@raises(Exception)
def test_invalid_actor_state_expired(self):
l = lacp(self.version,
self.actor_system_priority,
self.actor_system,
self.actor_key,
self.actor_port_priority,
self.actor_port,
self.actor_state_activity,
self.actor_state_timeout,
self.actor_state_aggregation,
self.actor_state_synchronization,
self.actor_state_collecting,
self.actor_state_distributing,
self.actor_state_defaulted,
2,
self.partner_system_priority,
self.partner_system,
self.partner_key,
self.partner_port_priority,
self.partner_port,
self.partner_state_activity,
self.partner_state_timeout,
self.partner_state_aggregation,
self.partner_state_synchronization,
self.partner_state_collecting,
self.partner_state_distributing,
self.partner_state_defaulted,
self.partner_state_expired,
self.collector_max_delay)
l.serialize()
@raises(Exception)
def test_invalid_partner_state_activity(self):
l = lacp(self.version,
self.actor_system_priority,
self.actor_system,
self.actor_key,
self.actor_port_priority,
self.actor_port,
self.actor_state_activity,
self.actor_state_timeout,
self.actor_state_aggregation,
self.actor_state_synchronization,
self.actor_state_collecting,
self.actor_state_distributing,
self.actor_state_defaulted,
self.actor_state_expired,
self.partner_system_priority,
self.partner_system,
self.partner_key,
self.partner_port_priority,
self.partner_port,
-1,
self.partner_state_timeout,
self.partner_state_aggregation,
self.partner_state_synchronization,
self.partner_state_collecting,
self.partner_state_distributing,
self.partner_state_defaulted,
self.partner_state_expired,
self.collector_max_delay)
l.serialize()
@raises(Exception)
def test_invalid_partner_state_timeout(self):
l = lacp(self.version,
self.actor_system_priority,
self.actor_system,
self.actor_key,
self.actor_port_priority,
self.actor_port,
self.actor_state_activity,
self.actor_state_timeout,
self.actor_state_aggregation,
self.actor_state_synchronization,
self.actor_state_collecting,
self.actor_state_distributing,
self.actor_state_defaulted,
self.actor_state_expired,
self.partner_system_priority,
self.partner_system,
self.partner_key,
self.partner_port_priority,
self.partner_port,
self.partner_state_activity,
-1,
self.partner_state_aggregation,
self.partner_state_synchronization,
self.partner_state_collecting,
self.partner_state_distributing,
self.partner_state_defaulted,
self.partner_state_expired,
self.collector_max_delay)
l.serialize()
@raises(Exception)
def test_invalid_partner_state_aggregation(self):
l = lacp(self.version,
self.actor_system_priority,
self.actor_system,
self.actor_key,
self.actor_port_priority,
self.actor_port,
self.actor_state_activity,
self.actor_state_timeout,
self.actor_state_aggregation,
self.actor_state_synchronization,
self.actor_state_collecting,
self.actor_state_distributing,
self.actor_state_defaulted,
self.actor_state_expired,
self.partner_system_priority,
self.partner_system,
self.partner_key,
self.partner_port_priority,
self.partner_port,
self.partner_state_activity,
self.partner_state_timeout,
-1,
self.partner_state_synchronization,
self.partner_state_collecting,
self.partner_state_distributing,
self.partner_state_defaulted,
self.partner_state_expired,
self.collector_max_delay)
l.serialize()
@raises(Exception)
def test_invalid_partner_state_synchronization(self):
l = lacp(self.version,
self.actor_system_priority,
self.actor_system,
self.actor_key,
self.actor_port_priority,
self.actor_port,
self.actor_state_activity,
self.actor_state_timeout,
self.actor_state_aggregation,
self.actor_state_synchronization,
self.actor_state_collecting,
self.actor_state_distributing,
self.actor_state_defaulted,
self.actor_state_expired,
self.partner_system_priority,
self.partner_system,
self.partner_key,
self.partner_port_priority,
self.partner_port,
self.partner_state_activity,
self.partner_state_timeout,
self.partner_state_aggregation,
-1,
self.partner_state_collecting,
self.partner_state_distributing,
self.partner_state_defaulted,
self.partner_state_expired,
self.collector_max_delay)
l.serialize()
@raises(Exception)
def test_invalid_partner_state_collecting(self):
l = lacp(self.version,
self.actor_system_priority,
self.actor_system,
self.actor_key,
self.actor_port_priority,
self.actor_port,
self.actor_state_activity,
self.actor_state_timeout,
self.actor_state_aggregation,
self.actor_state_synchronization,
self.actor_state_collecting,
self.actor_state_distributing,
self.actor_state_defaulted,
self.actor_state_expired,
self.partner_system_priority,
self.partner_system,
self.partner_key,
self.partner_port_priority,
self.partner_port,
self.partner_state_activity,
self.partner_state_timeout,
self.partner_state_aggregation,
self.partner_state_synchronization,
-1,
self.partner_state_distributing,
self.partner_state_defaulted,
self.partner_state_expired,
self.collector_max_delay)
l.serialize()
@raises(Exception)
def test_invalid_partner_state_distributing(self):
l = lacp(self.version,
self.actor_system_priority,
self.actor_system,
self.actor_key,
self.actor_port_priority,
self.actor_port,
self.actor_state_activity,
self.actor_state_timeout,
self.actor_state_aggregation,
self.actor_state_synchronization,
self.actor_state_collecting,
self.actor_state_distributing,
self.actor_state_defaulted,
self.actor_state_expired,
self.partner_system_priority,
self.partner_system,
self.partner_key,
self.partner_port_priority,
self.partner_port,
self.partner_state_activity,
self.partner_state_timeout,
self.partner_state_aggregation,
self.partner_state_synchronization,
self.partner_state_collecting,
-1,
self.partner_state_defaulted,
self.partner_state_expired,
self.collector_max_delay)
l.serialize()
@raises(Exception)
def test_invalid_partner_state_defaulted(self):
l = lacp(self.version,
self.actor_system_priority,
self.actor_system,
self.actor_key,
self.actor_port_priority,
self.actor_port,
self.actor_state_activity,
self.actor_state_timeout,
self.actor_state_aggregation,
self.actor_state_synchronization,
self.actor_state_collecting,
self.actor_state_distributing,
self.actor_state_defaulted,
self.actor_state_expired,
self.partner_system_priority,
self.partner_system,
self.partner_key,
self.partner_port_priority,
self.partner_port,
self.partner_state_activity,
self.partner_state_timeout,
self.partner_state_aggregation,
self.partner_state_synchronization,
self.partner_state_collecting,
self.partner_state_distributing,
-1,
self.partner_state_expired,
self.collector_max_delay)
l.serialize()
@raises(Exception)
def test_invalid_partner_state_expired(self):
l = lacp(self.version,
self.actor_system_priority,
self.actor_system,
self.actor_key,
self.actor_port_priority,
self.actor_port,
self.actor_state_activity,
self.actor_state_timeout,
self.actor_state_aggregation,
self.actor_state_synchronization,
self.actor_state_collecting,
self.actor_state_distributing,
self.actor_state_defaulted,
self.actor_state_expired,
self.partner_system_priority,
self.partner_system,
self.partner_key,
self.partner_port_priority,
self.partner_port,
self.partner_state_activity,
self.partner_state_timeout,
self.partner_state_aggregation,
self.partner_state_synchronization,
self.partner_state_collecting,
self.partner_state_distributing,
self.partner_state_defaulted,
-1,
self.collector_max_delay)
l.serialize()
def test_json(self):
jsondict = self.l.to_jsondict()
l = lacp.from_jsondict(jsondict['lacp'])
eq_(str(self.l), str(l))
| 41.254973 | 76 | 0.600487 | 4,953 | 45,628 | 5.117101 | 0.043408 | 0.118603 | 0.106056 | 0.023437 | 0.861985 | 0.819294 | 0.781969 | 0.781969 | 0.697139 | 0.689091 | 0 | 0.006823 | 0.33188 | 45,628 | 1,105 | 77 | 41.292308 | 0.824569 | 0.014969 | 0 | 0.749755 | 0 | 0 | 0.001959 | 0 | 0 | 0 | 0.000534 | 0 | 0.005888 | 1 | 0.040236 | false | 0.001963 | 0.012758 | 0 | 0.056919 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
fc1c94d186b31cef755ee174569e7e2ecf2abd0a | 129 | py | Python | accountit/invoices/admin.py | nicolasmesa/Accountit | d5d0caf3c827b63be25fac3ef4cde8d482f69911 | [
"MIT"
] | null | null | null | accountit/invoices/admin.py | nicolasmesa/Accountit | d5d0caf3c827b63be25fac3ef4cde8d482f69911 | [
"MIT"
] | 2 | 2022-01-13T00:39:11.000Z | 2022-03-11T23:15:08.000Z | accountit/invoices/admin.py | nicolasmesa/Accountit | d5d0caf3c827b63be25fac3ef4cde8d482f69911 | [
"MIT"
] | 1 | 2019-12-18T18:01:04.000Z | 2019-12-18T18:01:04.000Z | from django.contrib import admin
from . import models
admin.site.register(models.Invoice)
admin.site.register(models.ItemSold)
| 18.428571 | 36 | 0.813953 | 18 | 129 | 5.833333 | 0.555556 | 0.171429 | 0.32381 | 0.438095 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.093023 | 129 | 6 | 37 | 21.5 | 0.897436 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
fc36957d9ec3049ef336a023fa42ce12fa3694a3 | 1,871 | py | Python | BFS/Leetcode1293.py | Rylie-W/LeetRecord | 623c4efe88b3af54b8a65f6ec23db850b8c6f46f | [
"Apache-2.0"
] | null | null | null | BFS/Leetcode1293.py | Rylie-W/LeetRecord | 623c4efe88b3af54b8a65f6ec23db850b8c6f46f | [
"Apache-2.0"
] | null | null | null | BFS/Leetcode1293.py | Rylie-W/LeetRecord | 623c4efe88b3af54b8a65f6ec23db850b8c6f46f | [
"Apache-2.0"
] | null | null | null | class Solution:
def shortestPath(self, grid, k: int) -> int:
q=[[0,0,k]]
visited={(0,0,0)}
res=0
direction=[[-1,0],[0,-1],[0,1],[1,0]]
while q:
size=len(q)
for s in range(size):
cur=q[0]
q.pop(0)
if (cur[0],cur[1])==(len(grid)-1,len(grid[0])-1):
return res
for ax,ay in direction:
nx=cur[0]+ax
ny=cur[1]+ay
nadd=cur[2]
if nx>-1 and nx<len(grid) and ny>-1 and ny<len(grid[0]):
if grid[nx][ny]==1:
nadd-=1
if nadd>= 0 and (nx,ny,nadd) not in visited:
visited.add((nx,ny,nadd))
# if (nx,ny) not in visited:
# if grid[nx][ny]==0:
q.append([nx,ny,nadd])
# elif cur[2]>0:
# q.append([nx,ny,cur[2]-1])
# visited.add((nx,ny))
res+=1
return -1
if __name__ == '__main__':
sol=Solution()
# grid =[[0, 0, 0],
# [1, 1, 0],
# [0, 0, 0],
# [0, 1, 1],
# [0, 0, 0]]
# k = 1
# grid =[[0, 1, 1],
# [1, 1, 1],
# [1, 0, 0]]
# k = 1
grid=[[0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 1, 1, 1, 1, 1, 1, 1, 1, 0], [0, 1, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 1, 0, 1, 1, 1, 1, 1, 1, 1], [0, 1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 1, 1, 1, 1, 1, 1, 1, 1, 0],
[0, 1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 1, 0, 1, 1, 1, 1, 1, 1, 1], [0, 1, 0, 1, 1, 1, 1, 0, 0, 0],
[0, 1, 0, 0, 0, 0, 0, 0, 1, 0], [0, 1, 1, 1, 1, 1, 1, 0, 1, 0], [0, 0, 0, 0, 0, 0, 0, 0, 1, 0]]
k=1
# 5374 9520 5899 9878 0266
print(sol.shortestPath(grid,k))
| 36.686275 | 105 | 0.334046 | 314 | 1,871 | 1.964968 | 0.143312 | 0.213938 | 0.23825 | 0.246353 | 0.303079 | 0.262561 | 0.233387 | 0.223663 | 0.179903 | 0.165316 | 0 | 0.200787 | 0.456975 | 1,871 | 50 | 106 | 37.42 | 0.406496 | 0.146446 | 0 | 0 | 0 | 0 | 0.005063 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.030303 | false | 0 | 0 | 0 | 0.121212 | 0.030303 | 0 | 0 | 1 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
fc37609763e53164b24cf2649d26c10f40fb2d1d | 7,945 | py | Python | cminx/parser/CMakeLexer.py | AutonomicPerfectionist/CMakeDoc | b2121714963d44a529232539ec119e0cbc4f191d | [
"Apache-2.0"
] | null | null | null | cminx/parser/CMakeLexer.py | AutonomicPerfectionist/CMakeDoc | b2121714963d44a529232539ec119e0cbc4f191d | [
"Apache-2.0"
] | 22 | 2020-03-15T02:54:58.000Z | 2022-03-06T22:46:09.000Z | cminx/parser/CMakeLexer.py | AutonomicPerfectionist/CMakeDoc | b2121714963d44a529232539ec119e0cbc4f191d | [
"Apache-2.0"
] | 2 | 2020-04-06T22:45:09.000Z | 2022-01-31T22:06:23.000Z | # Generated from CMake.g4 by ANTLR 4.7.2
from antlr4 import *
from io import StringIO
from typing.io import TextIO
import sys
def serializedATN():
with StringIO() as buf:
buf.write("\3\u608b\ua72a\u8133\ub9ed\u417c\u3be7\u7786\u5964\2\16")
buf.write("\u00c0\b\1\4\2\t\2\4\3\t\3\4\4\t\4\4\5\t\5\4\6\t\6\4\7")
buf.write("\t\7\4\b\t\b\4\t\t\t\4\n\t\n\4\13\t\13\4\f\t\f\4\r\t\r")
buf.write("\4\16\t\16\4\17\t\17\4\20\t\20\4\21\t\21\4\22\t\22\3\2")
buf.write("\3\2\3\3\3\3\3\4\3\4\7\4,\n\4\f\4\16\4/\13\4\3\5\3\5\6")
buf.write("\5\63\n\5\r\5\16\5\64\3\6\3\6\3\6\5\6:\n\6\3\7\3\7\3\7")
buf.write("\3\b\3\b\3\b\3\b\3\b\3\b\5\bE\n\b\3\t\3\t\3\t\3\n\3\n")
buf.write("\3\n\3\n\7\nN\n\n\f\n\16\nQ\13\n\3\n\3\n\3\13\3\13\3\13")
buf.write("\5\13X\n\13\3\13\5\13[\n\13\3\f\3\f\3\f\3\f\3\r\3\r\3")
buf.write("\r\3\r\3\r\3\r\7\rg\n\r\f\r\16\rj\13\r\3\r\5\rm\n\r\3")
buf.write("\16\3\16\3\16\3\16\3\16\3\16\7\16u\n\16\f\16\16\16x\13")
buf.write("\16\3\16\3\16\3\16\3\16\3\17\3\17\3\17\3\17\3\17\3\17")
buf.write("\3\17\3\17\3\20\3\20\3\20\3\20\7\20\u008a\n\20\f\20\16")
buf.write("\20\u008d\13\20\3\20\3\20\7\20\u0091\n\20\f\20\16\20\u0094")
buf.write("\13\20\3\20\3\20\7\20\u0098\n\20\f\20\16\20\u009b\13\20")
buf.write("\3\20\3\20\7\20\u009f\n\20\f\20\16\20\u00a2\13\20\5\20")
buf.write("\u00a4\n\20\3\20\3\20\5\20\u00a8\n\20\3\20\5\20\u00ab")
buf.write("\n\20\3\20\3\20\3\21\3\21\5\21\u00b1\n\21\3\21\6\21\u00b4")
buf.write("\n\21\r\21\16\21\u00b5\3\21\3\21\3\22\6\22\u00bb\n\22")
buf.write("\r\22\16\22\u00bc\3\22\3\22\4hv\2\23\3\3\5\4\7\5\t\6\13")
buf.write("\7\r\2\17\2\21\2\23\b\25\2\27\t\31\2\33\n\35\13\37\f!")
buf.write("\r#\16\3\2\f\5\2C\\aac|\6\2\62;C\\aac|\b\2\13\f\17\17")
buf.write("\"\"$%*+^^\6\2\62;==C\\c|\4\2$$^^\6\2\f\f\17\17??]]\4")
buf.write("\2\f\f\17\17\5\2\f\f\17\17]]\3\3\f\f\4\2\13\13\"\"\2\u00d6")
buf.write("\2\3\3\2\2\2\2\5\3\2\2\2\2\7\3\2\2\2\2\t\3\2\2\2\2\13")
buf.write("\3\2\2\2\2\23\3\2\2\2\2\27\3\2\2\2\2\33\3\2\2\2\2\35\3")
buf.write("\2\2\2\2\37\3\2\2\2\2!\3\2\2\2\2#\3\2\2\2\3%\3\2\2\2\5")
buf.write("\'\3\2\2\2\7)\3\2\2\2\t\62\3\2\2\2\139\3\2\2\2\r;\3\2")
buf.write("\2\2\17D\3\2\2\2\21F\3\2\2\2\23I\3\2\2\2\25T\3\2\2\2\27")
buf.write("\\\3\2\2\2\31l\3\2\2\2\33n\3\2\2\2\35}\3\2\2\2\37\u0085")
buf.write("\3\2\2\2!\u00b3\3\2\2\2#\u00ba\3\2\2\2%&\7*\2\2&\4\3\2")
buf.write("\2\2\'(\7+\2\2(\6\3\2\2\2)-\t\2\2\2*,\t\3\2\2+*\3\2\2")
buf.write("\2,/\3\2\2\2-+\3\2\2\2-.\3\2\2\2.\b\3\2\2\2/-\3\2\2\2")
buf.write("\60\63\n\4\2\2\61\63\5\13\6\2\62\60\3\2\2\2\62\61\3\2")
buf.write("\2\2\63\64\3\2\2\2\64\62\3\2\2\2\64\65\3\2\2\2\65\n\3")
buf.write("\2\2\2\66:\5\r\7\2\67:\5\17\b\28:\5\21\t\29\66\3\2\2\2")
buf.write("9\67\3\2\2\298\3\2\2\2:\f\3\2\2\2;<\7^\2\2<=\n\5\2\2=")
buf.write("\16\3\2\2\2>?\7^\2\2?E\7v\2\2@A\7^\2\2AE\7t\2\2BC\7^\2")
buf.write("\2CE\7p\2\2D>\3\2\2\2D@\3\2\2\2DB\3\2\2\2E\20\3\2\2\2")
buf.write("FG\7^\2\2GH\7=\2\2H\22\3\2\2\2IO\7$\2\2JN\n\6\2\2KN\5")
buf.write("\13\6\2LN\5\25\13\2MJ\3\2\2\2MK\3\2\2\2ML\3\2\2\2NQ\3")
buf.write("\2\2\2OM\3\2\2\2OP\3\2\2\2PR\3\2\2\2QO\3\2\2\2RS\7$\2")
buf.write("\2S\24\3\2\2\2TZ\7^\2\2UW\7\17\2\2VX\7\f\2\2WV\3\2\2\2")
buf.write("WX\3\2\2\2X[\3\2\2\2Y[\7\f\2\2ZU\3\2\2\2ZY\3\2\2\2[\26")
buf.write("\3\2\2\2\\]\7]\2\2]^\5\31\r\2^_\7_\2\2_\30\3\2\2\2`a\7")
buf.write("?\2\2ab\5\31\r\2bc\7?\2\2cm\3\2\2\2dh\7]\2\2eg\13\2\2")
buf.write("\2fe\3\2\2\2gj\3\2\2\2hi\3\2\2\2hf\3\2\2\2ik\3\2\2\2j")
buf.write("h\3\2\2\2km\7_\2\2l`\3\2\2\2ld\3\2\2\2m\32\3\2\2\2no\7")
buf.write("%\2\2op\7]\2\2pq\7]\2\2qr\7]\2\2rv\3\2\2\2su\13\2\2\2")
buf.write("ts\3\2\2\2ux\3\2\2\2vw\3\2\2\2vt\3\2\2\2wy\3\2\2\2xv\3")
buf.write("\2\2\2yz\7%\2\2z{\7_\2\2{|\7_\2\2|\34\3\2\2\2}~\7%\2\2")
buf.write("~\177\7]\2\2\177\u0080\3\2\2\2\u0080\u0081\5\31\r\2\u0081")
buf.write("\u0082\7_\2\2\u0082\u0083\3\2\2\2\u0083\u0084\b\17\2\2")
buf.write("\u0084\36\3\2\2\2\u0085\u00a3\7%\2\2\u0086\u00a4\3\2\2")
buf.write("\2\u0087\u008b\7]\2\2\u0088\u008a\7?\2\2\u0089\u0088\3")
buf.write("\2\2\2\u008a\u008d\3\2\2\2\u008b\u0089\3\2\2\2\u008b\u008c")
buf.write("\3\2\2\2\u008c\u00a4\3\2\2\2\u008d\u008b\3\2\2\2\u008e")
buf.write("\u0092\7]\2\2\u008f\u0091\7?\2\2\u0090\u008f\3\2\2\2\u0091")
buf.write("\u0094\3\2\2\2\u0092\u0090\3\2\2\2\u0092\u0093\3\2\2\2")
buf.write("\u0093\u0095\3\2\2\2\u0094\u0092\3\2\2\2\u0095\u0099\n")
buf.write("\7\2\2\u0096\u0098\n\b\2\2\u0097\u0096\3\2\2\2\u0098\u009b")
buf.write("\3\2\2\2\u0099\u0097\3\2\2\2\u0099\u009a\3\2\2\2\u009a")
buf.write("\u00a4\3\2\2\2\u009b\u0099\3\2\2\2\u009c\u00a0\n\t\2\2")
buf.write("\u009d\u009f\n\b\2\2\u009e\u009d\3\2\2\2\u009f\u00a2\3")
buf.write("\2\2\2\u00a0\u009e\3\2\2\2\u00a0\u00a1\3\2\2\2\u00a1\u00a4")
buf.write("\3\2\2\2\u00a2\u00a0\3\2\2\2\u00a3\u0086\3\2\2\2\u00a3")
buf.write("\u0087\3\2\2\2\u00a3\u008e\3\2\2\2\u00a3\u009c\3\2\2\2")
buf.write("\u00a4\u00aa\3\2\2\2\u00a5\u00a7\7\17\2\2\u00a6\u00a8")
buf.write("\7\f\2\2\u00a7\u00a6\3\2\2\2\u00a7\u00a8\3\2\2\2\u00a8")
buf.write("\u00ab\3\2\2\2\u00a9\u00ab\t\n\2\2\u00aa\u00a5\3\2\2\2")
buf.write("\u00aa\u00a9\3\2\2\2\u00ab\u00ac\3\2\2\2\u00ac\u00ad\b")
buf.write("\20\2\2\u00ad \3\2\2\2\u00ae\u00b0\7\17\2\2\u00af\u00b1")
buf.write("\7\f\2\2\u00b0\u00af\3\2\2\2\u00b0\u00b1\3\2\2\2\u00b1")
buf.write("\u00b4\3\2\2\2\u00b2\u00b4\7\f\2\2\u00b3\u00ae\3\2\2\2")
buf.write("\u00b3\u00b2\3\2\2\2\u00b4\u00b5\3\2\2\2\u00b5\u00b3\3")
buf.write("\2\2\2\u00b5\u00b6\3\2\2\2\u00b6\u00b7\3\2\2\2\u00b7\u00b8")
buf.write("\b\21\2\2\u00b8\"\3\2\2\2\u00b9\u00bb\t\13\2\2\u00ba\u00b9")
buf.write("\3\2\2\2\u00bb\u00bc\3\2\2\2\u00bc\u00ba\3\2\2\2\u00bc")
buf.write("\u00bd\3\2\2\2\u00bd\u00be\3\2\2\2\u00be\u00bf\b\22\2")
buf.write("\2\u00bf$\3\2\2\2\32\2-\62\649DMOWZhlv\u008b\u0092\u0099")
buf.write("\u00a0\u00a3\u00a7\u00aa\u00b0\u00b3\u00b5\u00bc\3\b\2")
buf.write("\2")
return buf.getvalue()
class CMakeLexer(Lexer):
atn = ATNDeserializer().deserialize(serializedATN())
decisionsToDFA = [ DFA(ds, i) for i, ds in enumerate(atn.decisionToState) ]
T__0 = 1
T__1 = 2
Identifier = 3
Unquoted_argument = 4
Escape_sequence = 5
Quoted_argument = 6
Bracket_argument = 7
Bracket_doccomment = 8
Bracket_comment = 9
Line_comment = 10
Newline = 11
Space = 12
channelNames = [ u"DEFAULT_TOKEN_CHANNEL", u"HIDDEN" ]
modeNames = [ "DEFAULT_MODE" ]
literalNames = [ "<INVALID>",
"'('", "')'" ]
symbolicNames = [ "<INVALID>",
"Identifier", "Unquoted_argument", "Escape_sequence", "Quoted_argument",
"Bracket_argument", "Bracket_doccomment", "Bracket_comment",
"Line_comment", "Newline", "Space" ]
ruleNames = [ "T__0", "T__1", "Identifier", "Unquoted_argument", "Escape_sequence",
"Escape_identity", "Escape_encoded", "Escape_semicolon",
"Quoted_argument", "Quoted_cont", "Bracket_argument",
"Bracket_arg_nested", "Bracket_doccomment", "Bracket_comment",
"Line_comment", "Newline", "Space" ]
grammarFileName = "CMake.g4"
def __init__(self, input=None, output:TextIO = sys.stdout):
super().__init__(input, output)
self.checkVersion("4.7.2")
self._interp = LexerATNSimulator(self, self.atn, self.decisionsToDFA, PredictionContextCache())
self._actions = None
self._predicates = None
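# Minimal usage sketch (assumes the matching antlr4==4.7.2 Python runtime is
# installed; tokenising only, no parser involved):
#
#   from antlr4 import InputStream, CommonTokenStream
#   lexer = CMakeLexer(InputStream("add_library(foo foo.c)"))
#   stream = CommonTokenStream(lexer)
#   stream.fill()  # forces the lexer to produce all tokens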
| 55.950704 | 103 | 0.560856 | 1,911 | 7,945 | 2.302459 | 0.150706 | 0.135 | 0.091364 | 0.09 | 0.254773 | 0.167727 | 0.078864 | 0.059773 | 0.023636 | 0.017045 | 0 | 0.302851 | 0.148018 | 7,945 | 141 | 104 | 56.347518 | 0.347171 | 0.004783 | 0 | 0.016129 | 1 | 0.620968 | 0.603315 | 0.547893 | 0 | 0 | 0 | 0 | 0 | 1 | 0.016129 | false | 0 | 0.032258 | 0 | 0.225806 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
fc6ab041eae3961b06bb39f0518fdf0a957dc93d | 4,686 | py | Python | egret/models/tests/test_acopf.py | dilr/Egret-1 | a4afe34e377e65b3b538042bb8a98ce352add10e | [
"BSD-3-Clause"
] | null | null | null | egret/models/tests/test_acopf.py | dilr/Egret-1 | a4afe34e377e65b3b538042bb8a98ce352add10e | [
"BSD-3-Clause"
] | 1 | 2019-12-11T22:45:12.000Z | 2019-12-11T22:45:12.000Z | egret/models/tests/test_acopf.py | austinshort/Egret | e1fe4ece9f524dcd76f77768cf0d8048dc2b9fd7 | [
"BSD-3-Clause"
] | null | null | null | # ___________________________________________________________________________
#
# EGRET: Electrical Grid Research and Engineering Tools
# Copyright 2019 National Technology & Engineering Solutions of Sandia, LLC
# (NTESS). Under the terms of Contract DE-NA0003525 with NTESS, the U.S.
# Government retains certain rights in this software.
# This software is distributed under the Revised BSD License.
# ___________________________________________________________________________
'''
acopf tester
'''
import os
import math
import unittest
from pyomo.opt import SolverFactory, TerminationCondition
from egret.models.acopf import *
from egret.data.model_data import ModelData
from parameterized import parameterized
from egret.parsers.matpower_parser import create_ModelData
current_dir = os.path.dirname(os.path.abspath(__file__))
case_names = ['pglib_opf_case3_lmbd','pglib_opf_case30_ieee','pglib_opf_case300_ieee','pglib_opf_case3012wp_k']
test_cases = [os.path.join(current_dir, 'transmission_test_instances', 'pglib-opf-master', '{}.m'.format(i)) for i in case_names]
soln_cases = [os.path.join(current_dir, 'transmission_test_instances', 'acopf_solution_files', '{}_acopf_solution.json'.format(i)) for i in case_names]
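# Each test class below pairs every pglib case with its stored reference
# solution via parameterized.expand(zip(test_cases, soln_cases)); the three
# classes differ only in the ACOPF formulation they build (PSV, RSV, RIV).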
class TestPSVACOPF(unittest.TestCase):
show_output = True
@classmethod
def setUpClass(self):
download_dir = os.path.join(current_dir, 'transmission_test_instances')
if not os.path.exists(os.path.join(download_dir, 'pglib-opf-master')):
from egret.thirdparty.get_pglib_opf import get_pglib_opf
get_pglib_opf(download_dir)
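# setUpClass fetches the pglib-opf benchmark archive on first use, so the
# first run presumably needs network access; later runs reuse the download.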
@parameterized.expand(zip(test_cases, soln_cases))
def test_acopf_model(self, test_case, soln_case, include_kwargs=False):
acopf_model = create_psv_acopf_model
md_soln = ModelData.read(soln_case)
md_dict = create_ModelData(test_case)
kwargs = {}
if include_kwargs:
kwargs = {'include_feasibility_slack':True}
md, results = solve_acopf(md_dict, "ipopt", acopf_model_generator=acopf_model, solver_tee=False, return_results=True, **kwargs)
self.assertTrue(results.solver.termination_condition == TerminationCondition.optimal)
comparison = math.isclose(md.data['system']['total_cost'], md_soln.data['system']['total_cost'], rel_tol=1e-6)
self.assertTrue(comparison)
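# A solve is accepted only if IPOPT reports an optimal termination and the
# total cost matches the reference objective to a relative tolerance of 1e-6.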
class TestRSVACOPF(unittest.TestCase):
show_output = True
@classmethod
def setUpClass(self):
download_dir = os.path.join(current_dir, 'transmission_test_instances')
if not os.path.exists(os.path.join(download_dir, 'pglib-opf-master')):
from egret.thirdparty.get_pglib_opf import get_pglib_opf
get_pglib_opf(download_dir)
@parameterized.expand(zip(test_cases, soln_cases))
def test_acopf_model(self, test_case, soln_case, include_kwargs=False):
acopf_model = create_rsv_acopf_model
md_soln = ModelData.read(soln_case)
md_dict = create_ModelData(test_case)
kwargs = {}
if include_kwargs:
kwargs = {'include_feasibility_slack':True}
md, results = solve_acopf(md_dict, "ipopt", acopf_model_generator=acopf_model, solver_tee=False, return_results=True, **kwargs)
self.assertTrue(results.solver.termination_condition == TerminationCondition.optimal)
comparison = math.isclose(md.data['system']['total_cost'], md_soln.data['system']['total_cost'], rel_tol=1e-6)
self.assertTrue(comparison)
class TestRIVACOPF(unittest.TestCase):
show_output = True
@classmethod
def setUpClass(self):
download_dir = os.path.join(current_dir, 'transmission_test_instances')
if not os.path.exists(os.path.join(download_dir, 'pglib-opf-master')):
from egret.thirdparty.get_pglib_opf import get_pglib_opf
get_pglib_opf(download_dir)
@parameterized.expand(zip(test_cases, soln_cases))
def test_acopf_model(self, test_case, soln_case, include_kwargs=False):
acopf_model = create_riv_acopf_model
md_soln = ModelData.read(soln_case)
md_dict = create_ModelData(test_case)
kwargs = {}
if include_kwargs:
kwargs = {'include_feasibility_slack':True}
md, results = solve_acopf(md_dict, "ipopt", acopf_model_generator=acopf_model, solver_tee=False, return_results=True, **kwargs)
self.assertTrue(results.solver.termination_condition == TerminationCondition.optimal)
comparison = math.isclose(md.data['system']['total_cost'], md_soln.data['system']['total_cost'], rel_tol=1e-6)
self.assertTrue(comparison)
if __name__ == '__main__':
unittest.main()
| 41.105263 | 151 | 0.733675 | 588 | 4,686 | 5.294218 | 0.239796 | 0.043688 | 0.031802 | 0.036621 | 0.758754 | 0.758754 | 0.758754 | 0.744619 | 0.744619 | 0.712496 | 0 | 0.006905 | 0.1656 | 4,686 | 113 | 152 | 41.469027 | 0.789258 | 0.103073 | 0 | 0.72973 | 0 | 0 | 0.125119 | 0.070917 | 0 | 0 | 0 | 0 | 0.081081 | 1 | 0.081081 | false | 0 | 0.148649 | 0 | 0.310811 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
fc9656bba6cf456ae1d707357c15f74ec153f878 | 38,706 | py | Python | pybind/slxos/v17s_1_02/capabilities/l2/__init__.py | extremenetworks/pybind | 44c467e71b2b425be63867aba6e6fa28b2cfe7fb | [
"Apache-2.0"
] | null | null | null | pybind/slxos/v17s_1_02/capabilities/l2/__init__.py | extremenetworks/pybind | 44c467e71b2b425be63867aba6e6fa28b2cfe7fb | [
"Apache-2.0"
] | null | null | null | pybind/slxos/v17s_1_02/capabilities/l2/__init__.py | extremenetworks/pybind | 44c467e71b2b425be63867aba6e6fa28b2cfe7fb | [
"Apache-2.0"
] | 1 | 2021-11-05T22:15:42.000Z | 2021-11-05T22:15:42.000Z |
from operator import attrgetter
import pyangbind.lib.xpathhelper as xpathhelper
from pyangbind.lib.yangtypes import RestrictedPrecisionDecimalType, RestrictedClassType, TypedListType
from pyangbind.lib.yangtypes import YANGBool, YANGListType, YANGDynClass, ReferenceType
from pyangbind.lib.base import PybindBase
from decimal import Decimal
from bitarray import bitarray
import __builtin__
class l2(PybindBase):
"""
This class was auto-generated by the PythonClass plugin for PYANG
from YANG module brocade-system-capabilities - based on the path /capabilities/l2. Each member element of
the container is represented as a class variable - with a specific
YANG type.
"""
__slots__ = ('_pybind_generated_by', '_path_helper', '_yang_name', '_rest_name', '_extmethods', '__port_profile','__overlap_vlan','__rspan','__mac_move','__consistency_check','__learning_mode','__priority_tag','__internal_nsm','__lif_untagged_vlan_id','__lif_egress','__lif_inner_vlan','__bridgedomain_local_switching','__bridgedomain_transparent','__dot1x',)
_yang_name = 'l2'
_rest_name = 'l2'
_pybind_generated_by = 'container'
def __init__(self, *args, **kwargs):
path_helper_ = kwargs.pop("path_helper", None)
if path_helper_ is False:
self._path_helper = False
elif path_helper_ is not None and isinstance(path_helper_, xpathhelper.YANGPathHelper):
self._path_helper = path_helper_
elif hasattr(self, "_parent"):
path_helper_ = getattr(self._parent, "_path_helper", False)
self._path_helper = path_helper_
else:
self._path_helper = False
extmethods = kwargs.pop("extmethods", None)
if extmethods is False:
self._extmethods = False
elif extmethods is not None and isinstance(extmethods, dict):
self._extmethods = extmethods
elif hasattr(self, "_parent"):
extmethods = getattr(self._parent, "_extmethods", None)
self._extmethods = extmethods
else:
self._extmethods = False
self.__lif_egress = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="lif_egress", rest_name="lif_egress", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
self.__bridgedomain_local_switching = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="bridgedomain_local_switching", rest_name="bridgedomain_local_switching", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
self.__bridgedomain_transparent = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="bridgedomain_transparent", rest_name="bridgedomain_transparent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
self.__lif_untagged_vlan_id = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="lif_untagged_vlan_id", rest_name="lif_untagged_vlan_id", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
self.__dot1x = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="dot1x", rest_name="dot1x", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
self.__consistency_check = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="consistency_check", rest_name="consistency_check", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
self.__mac_move = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="mac_move", rest_name="mac_move", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
self.__learning_mode = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="learning_mode", rest_name="learning_mode", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
self.__overlap_vlan = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="overlap_vlan", rest_name="overlap_vlan", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
self.__port_profile = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="port_profile", rest_name="port_profile", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
self.__internal_nsm = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="internal_nsm", rest_name="internal_nsm", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
self.__priority_tag = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="priority_tag", rest_name="priority_tag", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
self.__lif_inner_vlan = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="lif_inner_vlan", rest_name="lif_inner_vlan", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
self.__rspan = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="rspan", rest_name="rspan", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
load = kwargs.pop("load", None)
if args:
if len(args) > 1:
raise TypeError("cannot create a YANG container with >1 argument")
all_attr = True
for e in self._pyangbind_elements:
if not hasattr(args[0], e):
all_attr = False
break
if not all_attr:
raise ValueError("Supplied object did not have the correct attributes")
for e in self._pyangbind_elements:
nobj = getattr(args[0], e)
if nobj._changed() is False:
continue
setmethod = getattr(self, "_set_%s" % e)
if load is None:
setmethod(getattr(args[0], e))
else:
setmethod(getattr(args[0], e), load=load)
def _path(self):
if hasattr(self, "_parent"):
return self._parent._path()+[self._yang_name]
else:
return [u'capabilities', u'l2']
def _rest_path(self):
if hasattr(self, "_parent"):
if self._rest_name:
return self._parent._rest_path()+[self._rest_name]
else:
return self._parent._rest_path()
else:
return [u'capabilities', u'l2']
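  # Every leaf below follows the same generated accessor triplet: _get_<leaf>()
  # returns the cached YANGDynClass instance, _set_<leaf>() rebuilds it from the
  # supplied value and raises ValueError when the value is not coercible to
  # YANGBool, and _unset_<leaf>() restores the default instance. Only the getter
  # is exposed as a read-only property, since every leaf here is config:false.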
def _get_port_profile(self):
"""
Getter method for port_profile, mapped from YANG variable /capabilities/l2/port_profile (boolean)
"""
return self.__port_profile
def _set_port_profile(self, v, load=False):
"""
Setter method for port_profile, mapped from YANG variable /capabilities/l2/port_profile (boolean)
If this variable is read-only (config: false) in the
source YANG file, then _set_port_profile is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_port_profile() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGBool, is_leaf=True, yang_name="port_profile", rest_name="port_profile", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """port_profile must be of a type compatible with boolean""",
'defined-type': "boolean",
'generated-type': """YANGDynClass(base=YANGBool, is_leaf=True, yang_name="port_profile", rest_name="port_profile", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)""",
})
self.__port_profile = t
if hasattr(self, '_set'):
self._set()
def _unset_port_profile(self):
self.__port_profile = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="port_profile", rest_name="port_profile", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
def _get_overlap_vlan(self):
"""
Getter method for overlap_vlan, mapped from YANG variable /capabilities/l2/overlap_vlan (boolean)
"""
return self.__overlap_vlan
def _set_overlap_vlan(self, v, load=False):
"""
Setter method for overlap_vlan, mapped from YANG variable /capabilities/l2/overlap_vlan (boolean)
If this variable is read-only (config: false) in the
source YANG file, then _set_overlap_vlan is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_overlap_vlan() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGBool, is_leaf=True, yang_name="overlap_vlan", rest_name="overlap_vlan", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """overlap_vlan must be of a type compatible with boolean""",
'defined-type': "boolean",
'generated-type': """YANGDynClass(base=YANGBool, is_leaf=True, yang_name="overlap_vlan", rest_name="overlap_vlan", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)""",
})
self.__overlap_vlan = t
if hasattr(self, '_set'):
self._set()
def _unset_overlap_vlan(self):
self.__overlap_vlan = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="overlap_vlan", rest_name="overlap_vlan", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
def _get_rspan(self):
"""
Getter method for rspan, mapped from YANG variable /capabilities/l2/rspan (boolean)
"""
return self.__rspan
def _set_rspan(self, v, load=False):
"""
Setter method for rspan, mapped from YANG variable /capabilities/l2/rspan (boolean)
If this variable is read-only (config: false) in the
source YANG file, then _set_rspan is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_rspan() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGBool, is_leaf=True, yang_name="rspan", rest_name="rspan", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """rspan must be of a type compatible with boolean""",
'defined-type': "boolean",
'generated-type': """YANGDynClass(base=YANGBool, is_leaf=True, yang_name="rspan", rest_name="rspan", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)""",
})
self.__rspan = t
if hasattr(self, '_set'):
self._set()
def _unset_rspan(self):
self.__rspan = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="rspan", rest_name="rspan", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
def _get_mac_move(self):
"""
Getter method for mac_move, mapped from YANG variable /capabilities/l2/mac_move (boolean)
"""
return self.__mac_move
def _set_mac_move(self, v, load=False):
"""
Setter method for mac_move, mapped from YANG variable /capabilities/l2/mac_move (boolean)
If this variable is read-only (config: false) in the
source YANG file, then _set_mac_move is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_mac_move() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGBool, is_leaf=True, yang_name="mac_move", rest_name="mac_move", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """mac_move must be of a type compatible with boolean""",
'defined-type': "boolean",
'generated-type': """YANGDynClass(base=YANGBool, is_leaf=True, yang_name="mac_move", rest_name="mac_move", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)""",
})
self.__mac_move = t
if hasattr(self, '_set'):
self._set()
def _unset_mac_move(self):
self.__mac_move = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="mac_move", rest_name="mac_move", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
def _get_consistency_check(self):
"""
Getter method for consistency_check, mapped from YANG variable /capabilities/l2/consistency_check (boolean)
"""
return self.__consistency_check
def _set_consistency_check(self, v, load=False):
"""
Setter method for consistency_check, mapped from YANG variable /capabilities/l2/consistency_check (boolean)
If this variable is read-only (config: false) in the
source YANG file, then _set_consistency_check is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_consistency_check() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGBool, is_leaf=True, yang_name="consistency_check", rest_name="consistency_check", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """consistency_check must be of a type compatible with boolean""",
'defined-type': "boolean",
'generated-type': """YANGDynClass(base=YANGBool, is_leaf=True, yang_name="consistency_check", rest_name="consistency_check", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)""",
})
self.__consistency_check = t
if hasattr(self, '_set'):
self._set()
def _unset_consistency_check(self):
self.__consistency_check = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="consistency_check", rest_name="consistency_check", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
def _get_learning_mode(self):
"""
Getter method for learning_mode, mapped from YANG variable /capabilities/l2/learning_mode (boolean)
"""
return self.__learning_mode
def _set_learning_mode(self, v, load=False):
"""
Setter method for learning_mode, mapped from YANG variable /capabilities/l2/learning_mode (boolean)
If this variable is read-only (config: false) in the
source YANG file, then _set_learning_mode is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_learning_mode() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGBool, is_leaf=True, yang_name="learning_mode", rest_name="learning_mode", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """learning_mode must be of a type compatible with boolean""",
'defined-type': "boolean",
'generated-type': """YANGDynClass(base=YANGBool, is_leaf=True, yang_name="learning_mode", rest_name="learning_mode", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)""",
})
self.__learning_mode = t
if hasattr(self, '_set'):
self._set()
def _unset_learning_mode(self):
self.__learning_mode = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="learning_mode", rest_name="learning_mode", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
def _get_priority_tag(self):
"""
Getter method for priority_tag, mapped from YANG variable /capabilities/l2/priority_tag (boolean)
"""
return self.__priority_tag
def _set_priority_tag(self, v, load=False):
"""
Setter method for priority_tag, mapped from YANG variable /capabilities/l2/priority_tag (boolean)
If this variable is read-only (config: false) in the
source YANG file, then _set_priority_tag is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_priority_tag() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGBool, is_leaf=True, yang_name="priority_tag", rest_name="priority_tag", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """priority_tag must be of a type compatible with boolean""",
'defined-type': "boolean",
'generated-type': """YANGDynClass(base=YANGBool, is_leaf=True, yang_name="priority_tag", rest_name="priority_tag", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)""",
})
self.__priority_tag = t
if hasattr(self, '_set'):
self._set()
def _unset_priority_tag(self):
self.__priority_tag = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="priority_tag", rest_name="priority_tag", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
def _get_internal_nsm(self):
"""
Getter method for internal_nsm, mapped from YANG variable /capabilities/l2/internal_nsm (boolean)
"""
return self.__internal_nsm
def _set_internal_nsm(self, v, load=False):
"""
Setter method for internal_nsm, mapped from YANG variable /capabilities/l2/internal_nsm (boolean)
If this variable is read-only (config: false) in the
source YANG file, then _set_internal_nsm is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_internal_nsm() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGBool, is_leaf=True, yang_name="internal_nsm", rest_name="internal_nsm", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """internal_nsm must be of a type compatible with boolean""",
'defined-type': "boolean",
'generated-type': """YANGDynClass(base=YANGBool, is_leaf=True, yang_name="internal_nsm", rest_name="internal_nsm", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)""",
})
self.__internal_nsm = t
if hasattr(self, '_set'):
self._set()
def _unset_internal_nsm(self):
self.__internal_nsm = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="internal_nsm", rest_name="internal_nsm", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
def _get_lif_untagged_vlan_id(self):
"""
Getter method for lif_untagged_vlan_id, mapped from YANG variable /capabilities/l2/lif_untagged_vlan_id (boolean)
"""
return self.__lif_untagged_vlan_id
def _set_lif_untagged_vlan_id(self, v, load=False):
"""
Setter method for lif_untagged_vlan_id, mapped from YANG variable /capabilities/l2/lif_untagged_vlan_id (boolean)
If this variable is read-only (config: false) in the
source YANG file, then _set_lif_untagged_vlan_id is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_lif_untagged_vlan_id() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGBool, is_leaf=True, yang_name="lif_untagged_vlan_id", rest_name="lif_untagged_vlan_id", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """lif_untagged_vlan_id must be of a type compatible with boolean""",
'defined-type': "boolean",
'generated-type': """YANGDynClass(base=YANGBool, is_leaf=True, yang_name="lif_untagged_vlan_id", rest_name="lif_untagged_vlan_id", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)""",
})
self.__lif_untagged_vlan_id = t
if hasattr(self, '_set'):
self._set()
def _unset_lif_untagged_vlan_id(self):
self.__lif_untagged_vlan_id = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="lif_untagged_vlan_id", rest_name="lif_untagged_vlan_id", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
def _get_lif_egress(self):
"""
Getter method for lif_egress, mapped from YANG variable /capabilities/l2/lif_egress (boolean)
"""
return self.__lif_egress
def _set_lif_egress(self, v, load=False):
"""
Setter method for lif_egress, mapped from YANG variable /capabilities/l2/lif_egress (boolean)
If this variable is read-only (config: false) in the
source YANG file, then _set_lif_egress is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_lif_egress() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGBool, is_leaf=True, yang_name="lif_egress", rest_name="lif_egress", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """lif_egress must be of a type compatible with boolean""",
'defined-type': "boolean",
'generated-type': """YANGDynClass(base=YANGBool, is_leaf=True, yang_name="lif_egress", rest_name="lif_egress", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)""",
})
self.__lif_egress = t
if hasattr(self, '_set'):
self._set()
def _unset_lif_egress(self):
self.__lif_egress = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="lif_egress", rest_name="lif_egress", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
def _get_lif_inner_vlan(self):
"""
Getter method for lif_inner_vlan, mapped from YANG variable /capabilities/l2/lif_inner_vlan (boolean)
"""
return self.__lif_inner_vlan
def _set_lif_inner_vlan(self, v, load=False):
"""
Setter method for lif_inner_vlan, mapped from YANG variable /capabilities/l2/lif_inner_vlan (boolean)
If this variable is read-only (config: false) in the
source YANG file, then _set_lif_inner_vlan is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_lif_inner_vlan() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGBool, is_leaf=True, yang_name="lif_inner_vlan", rest_name="lif_inner_vlan", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """lif_inner_vlan must be of a type compatible with boolean""",
'defined-type': "boolean",
'generated-type': """YANGDynClass(base=YANGBool, is_leaf=True, yang_name="lif_inner_vlan", rest_name="lif_inner_vlan", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)""",
})
self.__lif_inner_vlan = t
if hasattr(self, '_set'):
self._set()
def _unset_lif_inner_vlan(self):
self.__lif_inner_vlan = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="lif_inner_vlan", rest_name="lif_inner_vlan", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
def _get_bridgedomain_local_switching(self):
"""
Getter method for bridgedomain_local_switching, mapped from YANG variable /capabilities/l2/bridgedomain_local_switching (boolean)
"""
return self.__bridgedomain_local_switching
def _set_bridgedomain_local_switching(self, v, load=False):
"""
Setter method for bridgedomain_local_switching, mapped from YANG variable /capabilities/l2/bridgedomain_local_switching (boolean)
If this variable is read-only (config: false) in the
source YANG file, then _set_bridgedomain_local_switching is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_bridgedomain_local_switching() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGBool, is_leaf=True, yang_name="bridgedomain_local_switching", rest_name="bridgedomain_local_switching", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """bridgedomain_local_switching must be of a type compatible with boolean""",
'defined-type': "boolean",
'generated-type': """YANGDynClass(base=YANGBool, is_leaf=True, yang_name="bridgedomain_local_switching", rest_name="bridgedomain_local_switching", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)""",
})
self.__bridgedomain_local_switching = t
if hasattr(self, '_set'):
self._set()
def _unset_bridgedomain_local_switching(self):
self.__bridgedomain_local_switching = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="bridgedomain_local_switching", rest_name="bridgedomain_local_switching", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
def _get_bridgedomain_transparent(self):
"""
Getter method for bridgedomain_transparent, mapped from YANG variable /capabilities/l2/bridgedomain_transparent (boolean)
"""
return self.__bridgedomain_transparent
def _set_bridgedomain_transparent(self, v, load=False):
"""
Setter method for bridgedomain_transparent, mapped from YANG variable /capabilities/l2/bridgedomain_transparent (boolean)
If this variable is read-only (config: false) in the
source YANG file, then _set_bridgedomain_transparent is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_bridgedomain_transparent() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGBool, is_leaf=True, yang_name="bridgedomain_transparent", rest_name="bridgedomain_transparent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """bridgedomain_transparent must be of a type compatible with boolean""",
'defined-type': "boolean",
'generated-type': """YANGDynClass(base=YANGBool, is_leaf=True, yang_name="bridgedomain_transparent", rest_name="bridgedomain_transparent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)""",
})
self.__bridgedomain_transparent = t
if hasattr(self, '_set'):
self._set()
def _unset_bridgedomain_transparent(self):
self.__bridgedomain_transparent = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="bridgedomain_transparent", rest_name="bridgedomain_transparent", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
def _get_dot1x(self):
"""
Getter method for dot1x, mapped from YANG variable /capabilities/l2/dot1x (boolean)
"""
return self.__dot1x
def _set_dot1x(self, v, load=False):
"""
Setter method for dot1x, mapped from YANG variable /capabilities/l2/dot1x (boolean)
If this variable is read-only (config: false) in the
source YANG file, then _set_dot1x is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_dot1x() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGBool, is_leaf=True, yang_name="dot1x", rest_name="dot1x", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
except (TypeError, ValueError):
raise ValueError({
'error-string': """dot1x must be of a type compatible with boolean""",
'defined-type': "boolean",
'generated-type': """YANGDynClass(base=YANGBool, is_leaf=True, yang_name="dot1x", rest_name="dot1x", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)""",
})
self.__dot1x = t
if hasattr(self, '_set'):
self._set()
def _unset_dot1x(self):
self.__dot1x = YANGDynClass(base=YANGBool, is_leaf=True, yang_name="dot1x", rest_name="dot1x", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='urn:brocade.com:mgmt:brocade-system-capabilities', defining_module='brocade-system-capabilities', yang_type='boolean', is_config=False)
port_profile = __builtin__.property(_get_port_profile)
overlap_vlan = __builtin__.property(_get_overlap_vlan)
rspan = __builtin__.property(_get_rspan)
mac_move = __builtin__.property(_get_mac_move)
consistency_check = __builtin__.property(_get_consistency_check)
learning_mode = __builtin__.property(_get_learning_mode)
priority_tag = __builtin__.property(_get_priority_tag)
internal_nsm = __builtin__.property(_get_internal_nsm)
lif_untagged_vlan_id = __builtin__.property(_get_lif_untagged_vlan_id)
lif_egress = __builtin__.property(_get_lif_egress)
lif_inner_vlan = __builtin__.property(_get_lif_inner_vlan)
bridgedomain_local_switching = __builtin__.property(_get_bridgedomain_local_switching)
bridgedomain_transparent = __builtin__.property(_get_bridgedomain_transparent)
dot1x = __builtin__.property(_get_dot1x)
_pyangbind_elements = {'port_profile': port_profile, 'overlap_vlan': overlap_vlan, 'rspan': rspan, 'mac_move': mac_move, 'consistency_check': consistency_check, 'learning_mode': learning_mode, 'priority_tag': priority_tag, 'internal_nsm': internal_nsm, 'lif_untagged_vlan_id': lif_untagged_vlan_id, 'lif_egress': lif_egress, 'lif_inner_vlan': lif_inner_vlan, 'bridgedomain_local_switching': bridgedomain_local_switching, 'bridgedomain_transparent': bridgedomain_transparent, 'dot1x': dot1x, }
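
# A minimal usage sketch for the generated container above. Assumptions: the
# pyangbind runtime is importable and this runs under Python 2 (note the
# __builtin__ import at the top of the module); nothing below comes from the
# source tree itself.
#
#   caps = l2()
#   print caps._path()        # [u'capabilities', u'l2']
#   print bool(caps.dot1x)    # YANGBool leaf; falsy until populated
#   caps._set_dot1x(True)     # how a backend would fill this config:false leaf
#   print bool(caps.dot1x)    # True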
| 66.965398 | 494 | 0.754327 | 5,100 | 38,706 | 5.418627 | 0.035294 | 0.045594 | 0.058766 | 0.063941 | 0.875231 | 0.855184 | 0.847946 | 0.833038 | 0.828406 | 0.819142 | 0 | 0.001988 | 0.129179 | 38,706 | 577 | 495 | 67.081456 | 0.817896 | 0.166047 | 0 | 0.505747 | 0 | 0.04023 | 0.365882 | 0.221674 | 0 | 0 | 0 | 0 | 0 | 1 | 0.12931 | false | 0 | 0.022989 | 0 | 0.264368 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
5dc31bb6ee632474b439812ea76ef65c9da2803e | 107 | py | Python | micropolis/MicropolisCore/src/pyMicropolis/__init__.py | zegerk/gym-micropolis | 554bf41e9c4001140cdba90c5bbb3cc6bacf4c65 | [
"MIT"
] | 775 | 2015-03-15T13:15:10.000Z | 2022-03-27T01:35:59.000Z | micropolis/MicropolisCore/src/pyMicropolis/__init__.py | zegerk/gym-micropolis | 554bf41e9c4001140cdba90c5bbb3cc6bacf4c65 | [
"MIT"
] | 16 | 2015-04-18T05:41:37.000Z | 2021-06-30T19:03:28.000Z | micropolis/MicropolisCore/src/pyMicropolis/__init__.py | zegerk/gym-micropolis | 554bf41e9c4001140cdba90c5bbb3cc6bacf4c65 | [
"MIT"
] | 192 | 2015-03-15T15:33:59.000Z | 2022-03-25T05:15:56.000Z | """
@package pyMicropolis
Python code of the Micropolis project.
@todo Move all Python code to here.
"""
| 13.375 | 38 | 0.728972 | 15 | 107 | 5.2 | 0.866667 | 0.25641 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.17757 | 107 | 7 | 39 | 15.285714 | 0.886364 | 0.915888 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0.142857 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
5dceabe1f99c1f4aad838d0df723886a2b7ee319 | 8,670 | py | Python | Integrations/MicrosoftGraphCalendar/MicrosoftGraphCalendar_test.py | RichieB2B/content | 4916d6c5f024da79c22bda85272091e41a700bfa | [
"MIT"
] | 1 | 2020-04-19T11:05:42.000Z | 2020-04-19T11:05:42.000Z | Integrations/MicrosoftGraphCalendar/MicrosoftGraphCalendar_test.py | RichieB2B/content | 4916d6c5f024da79c22bda85272091e41a700bfa | [
"MIT"
] | 9 | 2021-02-08T20:51:18.000Z | 2021-09-23T23:27:38.000Z | Integrations/MicrosoftGraphCalendar/MicrosoftGraphCalendar_test.py | RichieB2B/content | 4916d6c5f024da79c22bda85272091e41a700bfa | [
"MIT"
] | 1 | 2020-05-27T15:26:48.000Z | 2020-05-27T15:26:48.000Z | from MicrosoftGraphCalendar import *
def test_epoch_seconds():
    integer = epoch_seconds()
    assert isinstance(integer, int)


def test_snakecase_to_camelcase():
    assert snakecase_to_camelcase('snake_case_snake_case') == 'SnakeCaseSnakeCase'


def test_camel_case_to_readable():
    assert camel_case_to_readable('id') == 'ID'
    assert camel_case_to_readable('createdDateTime') == 'Created Date Time'
def test_parse_calendar():
parsed_readable, parsed_outputs = parse_calendar(MOCK_CALENDAR_JSON)
expected_readable = [{'Name': None, 'Owner Name': None, 'Owner Address': None, 'ID': 'some_id'}]
expected_outputs = [
{
'@Odata.Etag': '',
'Attendees': [
{
'emailAddress': {'address': 'someemail@test.com', 'name': 'someemail@test.com'},
'status': {'response': 'none', 'time': '0001-01-01T00:00:00Z'},
'type': 'required'
}
],
'Body': {'content': '<html>', 'contentType': 'html'},
'BodyPreview': '',
'Categories': [],
'ChangeKey': '',
'CreatedDateTime': '2019-11-25T14:20:50.7017675Z',
'End': {'dateTime': '2019-11-25T15:30:00.0000000', 'timeZone': 'UTC'},
'HasAttachments': False,
'ICalUId': '',
'ID': 'some_id',
'Importance': 'normal',
'IsAllDay': False,
'IsCancelled': False,
'IsOrganizer': True,
'IsReminderOn': True,
'LastModifiedDateTime': '2019-11-25T19:17:12.9656678Z',
'Location': {'address': {}, 'coordinates': {}, 'displayName': '', 'locationType': 'default',
'uniqueIdType': 'unknown'},
'Locations': [],
'OnlineMeetingUrl': None,
'Organizer': {'emailAddress': {'address': 'someemail@test.com', 'name': 'Some Name'}},
'OriginalEndTimeZone': 'Israel Standard Time',
'OriginalStartTimeZone': 'Israel Standard Time',
'Recurrence': None,
'ReminderMinutesBeforeStart': 15,
'ResponseRequested': True,
'ResponseStatus': {'response': 'organizer', 'time': '0001-01-01T00:00:00Z'},
'Sensitivity': 'normal',
'SeriesMasterId': None,
'ShowAs': 'busy',
'Start': {'dateTime': '2019-11-25T15:00:00.0000000', 'timeZone': 'UTC'},
'Subject': 'Some Subject ', 'Type': 'singleInstance', 'WebLink': ''
}
]
assert parsed_readable == expected_readable
assert parsed_outputs == expected_outputs
def test_parse_event():
parsed_readable, parsed_outputs = parse_events(MOCK_EVENT_JSON)
expected_readable = [
{'Subject': 'Some Subject ', 'ID': 'some_id', 'Organizer': 'Some Name', 'Attendees': ['somemail@test.com'],
'Start': '2019-11-25T15:00:00.0000000', 'End': '2019-11-25T15:30:00.0000000'}]
expected_outputs = [{'Attendees': [{'emailAddress': {'address': 'somemail@test.com', 'name': 'somemail@test.com'},
'status': {'response': 'none', 'time': '0001-01-01T00:00:00Z'},
'type': 'required'}], 'Body': {'content': '<html>', 'contentType': 'html'},
'BodyPreview': '', 'Categories': [], 'ChangeKey': '',
'CreatedDateTime': '2019-11-25T14:20:50.7017675Z',
'End': {'dateTime': '2019-11-25T15:30:00.0000000', 'timeZone': 'UTC'}, 'HasAttachments': False,
'ICalUId': '', 'ID': 'some_id', 'Importance': 'normal', 'IsAllDay': False,
'IsCancelled': False,
'IsOrganizer': True, 'IsReminderOn': True,
'LastModifiedDateTime': '2019-11-25T19:17:12.9656678Z',
'Location': {'address': {}, 'coordinates': {}, 'displayName': '', 'locationType': 'default',
'uniqueIdType': 'unknown'}, 'Locations': [], 'OnlineMeetingUrl': None,
'Organizer': {'emailAddress': {'address': 'somemail@test.com', 'name': 'Some Name'}},
'OriginalEndTimeZone': 'Israel Standard Time', 'OriginalStartTimeZone': 'Israel Standard Time',
'Recurrence': None, 'ReminderMinutesBeforeStart': 15, 'ResponseRequested': True,
'ResponseStatus': {'response': 'organizer', 'time': '0001-01-01T00:00:00Z'},
'Sensitivity': 'normal', 'SeriesMasterId': None, 'ShowAs': 'busy',
'Start': {'dateTime': '2019-11-25T15:00:00.0000000', 'timeZone': 'UTC'},
'Subject': 'Some Subject ', 'Type': 'singleInstance', 'WebLink': ''}]
assert parsed_readable == expected_readable
assert parsed_outputs == expected_outputs
MOCK_CALENDAR_JSON = [{
"@odata.context": "",
"@odata.etag": "",
"attendees": [
{
"emailAddress": {
"address": "someemail@test.com",
"name": "someemail@test.com"
},
"status": {
"response": "none",
"time": "0001-01-01T00:00:00Z"
},
"type": "required"
}
],
"body": {
"content": "<html>",
"contentType": "html"
},
"bodyPreview": "",
"categories": [],
"changeKey": "",
"createdDateTime": "2019-11-25T14:20:50.7017675Z",
"end": {
"dateTime": "2019-11-25T15:30:00.0000000",
"timeZone": "UTC"
},
"hasAttachments": False,
"iCalUId": "",
"id": "some_id",
"importance": "normal",
"isAllDay": False,
"isCancelled": False,
"isOrganizer": True,
"isReminderOn": True,
"lastModifiedDateTime": "2019-11-25T19:17:12.9656678Z",
"location": {
"address": {},
"coordinates": {},
"displayName": "",
"locationType": "default",
"uniqueIdType": "unknown"
},
"locations": [],
"onlineMeetingUrl": None,
"organizer": {
"emailAddress": {
"address": "someemail@test.com",
"name": "Some Name"
}
},
"originalEndTimeZone": "Israel Standard Time",
"originalStartTimeZone": "Israel Standard Time",
"recurrence": None,
"reminderMinutesBeforeStart": 15,
"responseRequested": True,
"responseStatus": {
"response": "organizer",
"time": "0001-01-01T00:00:00Z"
},
"sensitivity": "normal",
"seriesMasterId": None,
"showAs": "busy",
"start": {
"dateTime": "2019-11-25T15:00:00.0000000",
"timeZone": "UTC"
},
"subject": "Some Subject ",
"type": "singleInstance",
"webLink": ""
}]
MOCK_EVENT_JSON = {
"@odata.etag": "",
"attendees": [
{
"emailAddress": {
"address": "somemail@test.com",
"name": "somemail@test.com"
},
"status": {
"response": "none",
"time": "0001-01-01T00:00:00Z"
},
"type": "required"
}
],
"body": {
"content": "<html>",
"contentType": "html"
},
"bodyPreview": "",
"categories": [],
"changeKey": "",
"createdDateTime": "2019-11-25T14:20:50.7017675Z",
"end": {
"dateTime": "2019-11-25T15:30:00.0000000",
"timeZone": "UTC"
},
"hasAttachments": False,
"iCalUId": "",
"id": "some_id",
"importance": "normal",
"isAllDay": False,
"isCancelled": False,
"isOrganizer": True,
"isReminderOn": True,
"lastModifiedDateTime": "2019-11-25T19:17:12.9656678Z",
"location": {
"address": {},
"coordinates": {},
"displayName": "",
"locationType": "default",
"uniqueIdType": "unknown"
},
"locations": [],
"onlineMeetingUrl": None,
"organizer": {
"emailAddress": {
"address": "somemail@test.com",
"name": "Some Name"
}
},
"originalEndTimeZone": "Israel Standard Time",
"originalStartTimeZone": "Israel Standard Time",
"recurrence": None,
"reminderMinutesBeforeStart": 15,
"responseRequested": True,
"responseStatus": {
"response": "organizer",
"time": "0001-01-01T00:00:00Z"
},
"sensitivity": "normal",
"seriesMasterId": None,
"showAs": "busy",
"start": {
"dateTime": "2019-11-25T15:00:00.0000000",
"timeZone": "UTC"
},
"subject": "Some Subject ",
"type": "singleInstance",
"webLink": ""
}
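
# The fixture above can also drive the parser directly, e.g. from a REPL -- a
# minimal sketch (parse_events is the function under test; the printing loop is
# illustrative only):
#
#   readable, outputs = parse_events(MOCK_EVENT_JSON)
#   for row in readable:
#       print(f"{row['Subject']}: {row['Start']} -> {row['End']}")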
| 35.826446 | 120 | 0.51857 | 692 | 8,670 | 6.41474 | 0.169075 | 0.02433 | 0.02478 | 0.027033 | 0.876549 | 0.84884 | 0.838928 | 0.838928 | 0.838928 | 0.838928 | 0 | 0.082233 | 0.301499 | 8,670 | 241 | 121 | 35.975104 | 0.65076 | 0 | 0 | 0.588496 | 0 | 0 | 0.446482 | 0.081084 | 0 | 0 | 0 | 0 | 0.035398 | 1 | 0.022124 | false | 0 | 0.022124 | 0 | 0.044248 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
5d38a4727aac681d68f69419288d20946e955ef7 | 44 | py | Python | tensorview/watch/__init__.py | Hourout/tensorview | 6a4f1f62aebf15efee08166922eb86196bfbf71e | [
"Apache-2.0"
] | 13 | 2019-06-28T05:56:31.000Z | 2020-08-20T01:33:30.000Z | tensorview/watch/__init__.py | Hourout/tensorview | 6a4f1f62aebf15efee08166922eb86196bfbf71e | [
"Apache-2.0"
] | null | null | null | tensorview/watch/__init__.py | Hourout/tensorview | 6a4f1f62aebf15efee08166922eb86196bfbf71e | [
"Apache-2.0"
] | 2 | 2020-05-29T03:47:24.000Z | 2020-06-17T10:03:08.000Z | from tensorview.watch._watch_image import *
| 22 | 43 | 0.840909 | 6 | 44 | 5.833333 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 44 | 1 | 44 | 44 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
5d47c68942fd7397f8310764f06841879848664d | 39 | py | Python | script.py | streib/Casimir-programming | 565325d6044fe64d4ed2261a214861052ff03c7d | [
"MIT"
] | null | null | null | script.py | streib/Casimir-programming | 565325d6044fe64d4ed2261a214861052ff03c7d | [
"MIT"
] | null | null | null | script.py | streib/Casimir-programming | 565325d6044fe64d4ed2261a214861052ff03c7d | [
"MIT"
] | null | null | null | import test
print(test.circle_area(1))
| 13 | 26 | 0.794872 | 7 | 39 | 4.285714 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.027778 | 0.076923 | 39 | 2 | 27 | 19.5 | 0.805556 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0.5 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 6 |
538edcbce531ae7a063571c463fb9e67ac638dbf | 207 | py | Python | ms_mint/vis/plotly/__init__.py | rokm/ms-mint | 4e2d9c71aa5ff83db55303284e9fd2d22230c0fb | [
"MIT"
] | null | null | null | ms_mint/vis/plotly/__init__.py | rokm/ms-mint | 4e2d9c71aa5ff83db55303284e9fd2d22230c0fb | [
"MIT"
] | null | null | null | ms_mint/vis/plotly/__init__.py | rokm/ms-mint | 4e2d9c71aa5ff83db55303284e9fd2d22230c0fb | [
"MIT"
] | null | null | null |
from .plotly_heatmap import plotly_heatmap
from .plotly_peak_shapes import plotly_peak_shapes
from .plotly_peak_shapes_3d import plotly_peak_shapes_3d
from .plotly_tools import set_template
set_template()
| 25.875 | 56 | 0.879227 | 32 | 207 | 5.21875 | 0.3125 | 0.239521 | 0.383234 | 0.239521 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010638 | 0.091787 | 207 | 7 | 57 | 29.571429 | 0.87766 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.8 | 0 | 0.8 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
5394ef0240a7850752a40d90842cf3c6b733fdf0 | 44 | py | Python | marl_env/marl_env/envs/__init__.py | vijay092/floris | 85f2a56fa0ab7c2237d308690a554c6101dbcd34 | [
"Apache-2.0"
] | 2 | 2021-11-04T23:52:02.000Z | 2021-12-09T12:43:21.000Z | marl_env/marl_env/envs/__init__.py | vijay092/floris | 85f2a56fa0ab7c2237d308690a554c6101dbcd34 | [
"Apache-2.0"
] | null | null | null | marl_env/marl_env/envs/__init__.py | vijay092/floris | 85f2a56fa0ab7c2237d308690a554c6101dbcd34 | [
"Apache-2.0"
] | 1 | 2020-07-23T18:30:05.000Z | 2020-07-23T18:30:05.000Z | from marl_env.envs.marl_farm import FarmMARL | 44 | 44 | 0.886364 | 8 | 44 | 4.625 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.068182 | 44 | 1 | 44 | 44 | 0.902439 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
54e3c77021adb435c4aa65d984c3ecaf72696ace | 37 | py | Python | visdialch/utils/__init__.py | awesome-archive/visdial-challenge-starter-pytorch | e45ab120f4efd599f9d42856f3b58be837783427 | [
"BSD-3-Clause"
] | null | null | null | visdialch/utils/__init__.py | awesome-archive/visdial-challenge-starter-pytorch | e45ab120f4efd599f9d42856f3b58be837783427 | [
"BSD-3-Clause"
] | null | null | null | visdialch/utils/__init__.py | awesome-archive/visdial-challenge-starter-pytorch | e45ab120f4efd599f9d42856f3b58be837783427 | [
"BSD-3-Clause"
] | null | null | null | from .dynamic_rnn import DynamicRNN
| 12.333333 | 35 | 0.837838 | 5 | 37 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.135135 | 37 | 2 | 36 | 18.5 | 0.9375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
54e42eec3905468a35a5ddce0988e3b1aea5c3d8 | 4,952 | py | Python | transparency_epias/markets/ancillaryServiceClient.py | ErenEla/transparencyEpias | ad74ea3568b29d7610898243b0b778adb10951d3 | [
"MIT"
] | 8 | 2020-05-14T12:10:19.000Z | 2021-08-15T15:20:25.000Z | transparency_epias/markets/ancillaryServiceClient.py | ErenEla/transparencyEpias | ad74ea3568b29d7610898243b0b778adb10951d3 | [
"MIT"
] | 2 | 2020-05-30T15:59:43.000Z | 2020-06-25T12:39:14.000Z | transparency_epias/markets/ancillaryServiceClient.py | ErenEla/transparencyEpias | ad74ea3568b29d7610898243b0b778adb10951d3 | [
"MIT"
] | 1 | 2020-10-30T03:27:21.000Z | 2020-10-30T03:27:21.000Z | import pandas as pd
import requests
import json
from datetime import timedelta
from datetime import datetime
from transparency_epias.markets import validate as val
class ancillaryServicesClient:
def get_request_result(self, query):
#main_url = "https://seffaflik.epias.com.tr/transparency/service/market/"
url = "https://seffaflik.epias.com.tr/transparency/service/"+query
payload = {}
headers = {
'Cookie': 'TS01f69930=01cbc7c0b229af3f9e170f80092f828abac28c9cacff2f44fbd6391713e0e3f0af97eecc2694f5fc77aefc033595cc62fe9c469b52'
}
response = requests.request("GET", url, headers=headers, data = payload)
json_data = json.loads(response.text.encode('utf8'))
return json_data
def pfc_amount(self, startDate, endDate):
'''
This function returns 3 lists including;
-Date list for specified date as first item.
-Hour information as second item.
        -Primary Frequency Reserve amounts as third item.
Parameters:
startDate: Start date in YYYY-MM-DD format.
endDate: End date in YYYY-MM-DD format.
'''
val.date_check(startDate, endDate)
query = "market/pfc-amount?startDate="+f'{startDate}'+"&endDate="+f'{endDate}'
json_result = self.get_request_result(query)
key_list = list(json_result['body'].keys())
key_name = key_list[0]
response_list = json_result['body'][f'{key_name}']
date_list = []
hour_list = []
amount_list = []
        for item in response_list:
            date_list.append(item['effectiveDate'])
            hour_list.append(item['hour'])
            amount_list.append(item['totalAmount'])

        return date_list, hour_list, amount_list
def pfc_price(self, startDate, endDate):
'''
This function returns 3 lists including;
-Date list for specified date as first item.
-Hour information as second item.
        -Primary Frequency price values as third item.
Parameters:
startDate: Start date in YYYY-MM-DD format.
endDate: End date in YYYY-MM-DD format.
'''
val.date_check(startDate, endDate)
query = "market/pfc-price?startDate="+f'{startDate}'+"&endDate="+f'{endDate}'
json_result = self.get_request_result(query)
key_list = list(json_result['body'].keys())
key_name = key_list[0]
response_list = json_result['body'][f'{key_name}']
date_list = []
hour_list = []
price_list = []
        for item in response_list:
            date_list.append(item['effectiveDate'])
            hour_list.append(item['hour'])
            price_list.append(item['price'])

        return date_list, hour_list, price_list
def sfc_amount(self, startDate, endDate):
'''
This function returns 3 lists including;
-Date list for specified date as first item.
-Hour information as second item.
        -Secondary Frequency Reserve amounts as third item.
Parameters:
startDate: Start date in YYYY-MM-DD format.
endDate: End date in YYYY-MM-DD format.
'''
val.date_check(startDate, endDate)
query = "market/sfc-amount?startDate="+f'{startDate}'+"&endDate="+f'{endDate}'
json_result = self.get_request_result(query)
key_list = list(json_result['body'].keys())
key_name = key_list[0]
response_list = json_result['body'][f'{key_name}']
date_list = []
hour_list = []
amount_list = []
        for item in response_list:
            date_list.append(item['effectiveDate'])
            hour_list.append(item['hour'])
            amount_list.append(item['totalAmount'])

        return date_list, hour_list, amount_list
def sfc_price(self, startDate, endDate):
'''
This function returns 3 lists including;
-Date list for specified date as first item.
-Hour information as second item.
        -Secondary Frequency Reserve prices as third item.
Parameters:
startDate: Start date in YYYY-MM-DD format.
endDate: End date in YYYY-MM-DD format.
'''
val.date_check(startDate, endDate)
query = "market/sfc-price?startDate="+f'{startDate}'+"&endDate="+f'{endDate}'
json_result = self.get_request_result(query)
key_list = list(json_result['body'].keys())
key_name = key_list[0]
response_list = json_result['body'][f'{key_name}']
date_list = []
hour_list = []
price_list = []
        for item in response_list:
            date_list.append(item['effectiveDate'])
            hour_list.append(item['hour'])
            price_list.append(item['price'])

        return date_list, hour_list, price_list
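
# Example usage (assumptions: outbound network access to the EPIAS transparency
# endpoint and valid YYYY-MM-DD dates; the DataFrame step just reuses the pandas
# import already at the top of this module):
#
#   client = ancillaryServicesClient()
#   dates, hours, prices = client.pfc_price('2020-01-01', '2020-01-07')
#   df = pd.DataFrame({'date': dates, 'hour': hours, 'pfc_price': prices})
#   print(df.head())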
ancillary = ancillaryServicesClient() | 28.297143 | 137 | 0.616922 | 574 | 4,952 | 5.158537 | 0.155052 | 0.043229 | 0.056738 | 0.032421 | 0.826072 | 0.826072 | 0.826072 | 0.826072 | 0.795002 | 0.795002 | 0 | 0.02095 | 0.27706 | 4,952 | 175 | 138 | 28.297143 | 0.806145 | 0.240913 | 0 | 0.675325 | 0 | 0 | 0.167099 | 0.065399 | 0.051948 | 0 | 0 | 0 | 0 | 1 | 0.064935 | false | 0 | 0.077922 | 0 | 0.220779 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
54ec930bc4aeda620f33b78310263681eca8a615 | 170 | py | Python | plugins/logging_mod.py | yoavrotems/kube-hunter | e7585f4ed38a6fa08b692676ee417dbc1b919b91 | [
"Apache-2.0"
] | 1 | 2021-09-13T21:52:52.000Z | 2021-09-13T21:52:52.000Z | plugins/logging_mod.py | yoavrotems/kube-hunter | e7585f4ed38a6fa08b692676ee417dbc1b919b91 | [
"Apache-2.0"
] | 2 | 2021-05-20T20:17:17.000Z | 2022-02-26T09:20:16.000Z | plugins/logging_mod.py | yoavrotems/kube-hunter | e7585f4ed38a6fa08b692676ee417dbc1b919b91 | [
"Apache-2.0"
] | 1 | 2020-08-13T13:49:38.000Z | 2020-08-13T13:49:38.000Z | import logging
# Suppress logging from scapy
logging.getLogger("scapy.runtime").setLevel(logging.CRITICAL)
logging.getLogger("scapy.loading").setLevel(logging.CRITICAL)
| 28.333333 | 61 | 0.817647 | 20 | 170 | 6.95 | 0.5 | 0.230216 | 0.302158 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.058824 | 170 | 5 | 62 | 34 | 0.86875 | 0.158824 | 0 | 0 | 0 | 0 | 0.184397 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
07045cb539f37ad49b4941c5400a34e8b542128f | 102 | py | Python | src/cbint/__init__.py | carbonblack/cb-integration | 0a32cf3d921be2a6e7589463bf5cc24882d4697e | [
"MIT"
] | 14 | 2015-08-17T14:12:37.000Z | 2021-12-27T06:11:08.000Z | src/cbint/__init__.py | carbonblack/cb-integration | 0a32cf3d921be2a6e7589463bf5cc24882d4697e | [
"MIT"
] | 17 | 2016-03-02T21:09:23.000Z | 2020-04-03T00:01:07.000Z | src/cbint/__init__.py | carbonblack/cb-integration | 0a32cf3d921be2a6e7589463bf5cc24882d4697e | [
"MIT"
] | 6 | 2015-10-20T12:36:05.000Z | 2019-10-10T08:42:12.000Z | from cbint.utils.bridge import CbIntegrationBridge
from cbint.utils.daemon import CbIntegrationDaemon
| 34 | 50 | 0.882353 | 12 | 102 | 7.5 | 0.666667 | 0.2 | 0.311111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.078431 | 102 | 2 | 51 | 51 | 0.957447 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
07172cc81a19113d099971428532241db485e203 | 3,747 | py | Python | colosseum/agents/policy.py | MichelangeloConserva/Colosseum | b0711fd9ce75520deb74cda75c148984a8e4152f | [
"MIT"
] | null | null | null | colosseum/agents/policy.py | MichelangeloConserva/Colosseum | b0711fd9ce75520deb74cda75c148984a8e4152f | [
"MIT"
] | null | null | null | colosseum/agents/policy.py | MichelangeloConserva/Colosseum | b0711fd9ce75520deb74cda75c148984a8e4152f | [
"MIT"
] | null | null | null | from typing import Dict, Tuple, Union
import numpy as np
class ContinuousPolicy:
"""
    Stores the policy of an infinite horizon agent.
"""
def __init__(self, policy: Dict[int, Union[int, np.ndarray]], num_actions: int):
"""
Parameters
----------
policy : Dict[int, Union[int, np.ndarray]]
the deterministic or stochastic policy of the agent.
num_actions : int
the number of actions.
"""
if type(list(policy.values())[0]) in [
int,
np.int8,
np.int16,
np.int64,
np.int32,
]:
self.is_deterministic = True
else:
self.is_deterministic = False
assert type(list(policy.values())[0]) == np.ndarray
self._policy = policy
self._num_actions = num_actions
self._num_states = len(policy.keys())
self._pi_matrix = None
self._pi = dict()
    def pi(self, s: int) -> np.ndarray:
"""
Returns the policy for a given state.
"""
        if s not in self._pi:
            if self.is_deterministic:
                p = np.zeros(self._num_actions, np.float32)
                p[self._policy[s]] = 1.0
                self._pi[s] = p
            else:
                self._pi[s] = self._policy[s]
return self._pi[s]
@property
def pi_matrix(self) -> np.ndarray:
"""
Returns a matrix |S| x |A| containing the policy of the agent.
"""
if self._pi_matrix is None:
self._pi_matrix = np.zeros(
(self._num_states, self._num_actions), np.float32
)
for s in self._policy:
self._pi_matrix[s] = self.pi(s)
return self._pi_matrix
    def __hash__(self) -> int:
        # hash the raw bytes of the (lazily built) policy matrix;
        # a tuple of the lists produced by .tolist() is unhashable
        return hash(self.pi_matrix.tobytes())
class EpisodicPolicy:
"""
Stores the policy of a finite horizon agent.
"""
def __init__(
self, policy: Dict[Tuple[int, int], Union[int, np.ndarray]], num_actions, H
):
"""
Parameters
----------
policy : Dict[int, Union[int, np.ndarray]]
the deterministic or stochastic policy of the agent.
num_actions : int
the number of actions.
H : int
the horizon of the MDP.
"""
if type(list(policy.values())[0]) in [int, np.int64, np.int32]:
self.is_deterministic = True
else:
self.is_deterministic = False
assert type(list(policy.values())[0]) == np.ndarray
self._policy = policy
self._num_actions = num_actions
self._H = H
self._num_states = len(policy.keys())
self._pi_matrix = None
self._pi = dict()
    def pi(self, h: int, s: int) -> np.ndarray:
"""
Returns the policy for a given state and time step.
"""
if (h, s) not in self._pi:
if self.is_deterministic:
p = np.zeros(self._num_actions, np.float32)
p[self._policy[h, s]] = 1.0
self._pi[h, s] = p
else:
self._pi[h, s] = self._policy[h, s]
return self._pi[h, s]
@property
def pi_matrix(self) -> np.ndarray:
"""
Returns a matrix |H| x |S| x |A| containing the policy of the agent.
"""
if self._pi_matrix is None:
self._pi_matrix = np.zeros(
(self._H, self._num_states, self._num_actions), np.float32
)
for h, s in self._policy:
self._pi_matrix[h, s] = self.pi(h, s)
return self._pi_matrix
    def __hash__(self) -> int:
        # hash the raw bytes of the (lazily built) policy matrix;
        # a tuple of the lists produced by .tolist() is unhashable
        return hash(self.pi_matrix.tobytes())
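# Editor's usage sketch (not part of the original module; the state/action
# counts below are illustrative assumptions):
if __name__ == "__main__":
    cont = ContinuousPolicy({0: 1, 1: 0}, num_actions=2)
    print(cont.pi(0))           # one-hot over 2 actions: [0. 1.]
    print(cont.pi_matrix)       # |S| x |A| = 2 x 2 matrix
    epi = EpisodicPolicy({(0, 0): 1, (0, 1): 0}, num_actions=2, H=1)
    print(epi.pi_matrix.shape)  # |H| x |S| x |A| = (1, 2, 2)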
| 29.273438 | 84 | 0.523619 | 471 | 3,747 | 3.96603 | 0.163482 | 0.077088 | 0.077088 | 0.027837 | 0.849572 | 0.831906 | 0.831906 | 0.737687 | 0.737687 | 0.667024 | 0 | 0.011269 | 0.360555 | 3,747 | 127 | 85 | 29.503937 | 0.768364 | 0.182546 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.027027 | 1 | 0.108108 | false | 0 | 0.027027 | 0.027027 | 0.243243 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
071a90fbabd5e294d6564144af2d00ed9cd13c8d | 16,399 | py | Python | tests/st/ops/cpu/test_arithmetic_op.py | PowerOlive/mindspore | bda20724a94113cedd12c3ed9083141012da1f15 | [
"Apache-2.0"
] | 5 | 2021-06-04T02:23:01.000Z | 2021-12-13T10:41:07.000Z | tests/st/ops/cpu/test_arithmetic_op.py | zimo-geek/mindspore | 665ec683d4af85c71b2a1f0d6829356f2bc0e1ff | [
"Apache-2.0"
] | null | null | null | tests/st/ops/cpu/test_arithmetic_op.py | zimo-geek/mindspore | 665ec683d4af85c71b2a1f0d6829356f2bc0e1ff | [
"Apache-2.0"
] | 1 | 2021-12-14T06:22:31.000Z | 2021-12-14T06:22:31.000Z | # Copyright 2020 Huawei Technologies Co., Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ============================================================================
import numpy as np
import pytest
import mindspore.context as context
import mindspore.nn as nn
import mindspore
from mindspore import Tensor
from mindspore.ops import operations as P
context.set_context(mode=context.GRAPH_MODE, device_target='CPU')
class SubNet(nn.Cell):
def __init__(self):
super(SubNet, self).__init__()
self.sub = P.Sub()
def construct(self, x, y):
return self.sub(x, y)
class DivNet(nn.Cell):
def __init__(self):
super(DivNet, self).__init__()
self.div = P.Div()
def construct(self, x, y):
return self.div(x, y)
class FloorDivNet(nn.Cell):
def __init__(self):
super(FloorDivNet, self).__init__()
self.floor_div = P.FloorDiv()
def construct(self, x, y):
return self.floor_div(x, y)
class ModNet(nn.Cell):
def __init__(self):
super(ModNet, self).__init__()
self.mod = P.Mod()
def construct(self, x, y):
return self.mod(x, y)
class FloorModNet(nn.Cell):
def __init__(self):
super(FloorModNet, self).__init__()
self.floor_mod = P.FloorMod()
def construct(self, x, y):
return self.floor_mod(x, y)
@pytest.mark.level0
@pytest.mark.platform_x86_cpu
@pytest.mark.env_onecard
def test_sub():
x = np.random.rand(2, 3, 4, 4).astype(np.float32)
y = np.random.rand(4, 1).astype(np.float32)
net = SubNet()
output = net(Tensor(x), Tensor(y, mindspore.float32))
expect_output = x - y
assert np.all(output.asnumpy() == expect_output)
# float64
x = np.random.rand(2, 3, 4, 4).astype(np.float64)
y = np.random.rand(4, 1).astype(np.float64)
net = SubNet()
output = net(Tensor(x), Tensor(y, mindspore.float64))
expect_output = x - y
assert np.all(output.asnumpy() == expect_output)
@pytest.mark.level0
@pytest.mark.platform_x86_cpu
@pytest.mark.env_onecard
def test_div():
prop = 1 if np.random.random() < 0.5 else -1
x0_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.float32) * prop
y0_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.float32) * prop
x1_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.float32) * prop
y1_np = np.random.randint(1, 100, (2, 1, 4, 4)).astype(np.float32) * prop
x2_np = np.random.randint(1, 100, (2, 1, 1, 4)).astype(np.float16) * prop
y2_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.float16) * prop
x3_np = np.random.randint(1, 100, 1).astype(np.float32) * prop
y3_np = np.random.randint(1, 100, 1).astype(np.float32) * prop
x4_np = np.array(768).astype(np.float32) * prop
y4_np = np.array(3072.5).astype(np.float32) * prop
x5_np = np.random.randint(1, 100, (2, 1, 1, 4)).astype(np.int32) * prop
y5_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.int32) * prop
x6_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.int32) * prop
y6_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.float32) * prop
x7_np = np.random.randint(1, 100, (2, 1, 1, 4)).astype(np.int64) * prop
y7_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.int64) * prop
x8_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.float64) * prop
y8_np = np.random.randint(1, 100, (2, 1, 4, 4)).astype(np.float64) * prop
x0 = Tensor(x0_np)
y0 = Tensor(y0_np)
x1 = Tensor(x1_np)
y1 = Tensor(y1_np)
x2 = Tensor(x2_np)
y2 = Tensor(y2_np)
x3 = Tensor(x3_np)
y3 = Tensor(y3_np)
x4 = Tensor(x4_np)
y4 = Tensor(y4_np)
x5 = Tensor(x5_np)
y5 = Tensor(y5_np)
x6 = Tensor(x6_np)
y6 = Tensor(y6_np)
x7 = Tensor(x7_np)
y7 = Tensor(y7_np)
x8 = Tensor(x8_np)
y8 = Tensor(y8_np)
context.set_context(mode=context.GRAPH_MODE, device_target='CPU')
div = DivNet()
output0 = div(x0, y0)
expect0 = np.divide(x0_np, y0_np)
diff0 = output0.asnumpy() - expect0
error0 = np.ones(shape=expect0.shape) * 1.0e-5
assert np.all(diff0 < error0)
assert output0.shape == expect0.shape
output1 = div(x1, y1)
expect1 = np.divide(x1_np, y1_np)
diff1 = output1.asnumpy() - expect1
error1 = np.ones(shape=expect1.shape) * 1.0e-5
assert np.all(diff1 < error1)
assert output1.shape == expect1.shape
output2 = div(x2, y2)
expect2 = np.divide(x2_np, y2_np).astype(np.float16)
diff2 = output2.asnumpy() - expect2
error2 = np.ones(shape=expect2.shape) * 1.0e-5
assert np.all(diff2 < error2)
assert output2.shape == expect2.shape
output3 = div(x3, y3)
expect3 = np.divide(x3_np, y3_np)
diff3 = output3.asnumpy() - expect3
error3 = np.ones(shape=expect3.shape) * 1.0e-5
assert np.all(diff3 < error3)
assert output3.shape == expect3.shape
output4 = div(x4, y4)
expect4 = np.divide(x4_np, y4_np)
diff4 = output4.asnumpy() - expect4
error4 = np.ones(shape=expect4.shape) * 1.0e-5
assert np.all(diff4 < error4)
assert output4.shape == expect4.shape
output5 = div(x5, y5)
expect5 = x5_np // y5_np
assert np.all(output5.asnumpy() == expect5)
output6 = div(x6, y6)
expect6 = np.divide(x6_np, y6_np)
diff6 = output6.asnumpy() - expect6
error6 = np.ones(shape=expect6.shape) * 1.0e-5
assert np.all(diff6 < error6)
assert output6.shape == expect6.shape
output7 = div(x7, y7)
expect7 = np.divide(x7_np, y7_np).astype(np.int64)
assert np.all(output7.asnumpy() == expect7)
assert output7.shape == expect7.shape
output8 = div(x8, y8)
expect8 = np.divide(x8_np, y8_np)
diff8 = output8.asnumpy() - expect8
error8 = np.ones(shape=expect8.shape) * 1.0e-7
assert np.all(diff8 < error8)
assert output8.shape == expect8.shape
@pytest.mark.level0
@pytest.mark.platform_x86_cpu
@pytest.mark.env_onecard
def test_floor_div():
prop = 1 if np.random.random() < 0.5 else -1
x0_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.float32) * prop
y0_np = np.random.randint(1, 100, (2, 1, 4, 4)).astype(np.float32) * prop
x1_np = np.random.randint(1, 100, (2, 1, 1, 4)).astype(np.float16) * prop
y1_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.float16) * prop
x2_np = np.random.randint(1, 100, (2, 1, 1, 4)).astype(np.int32) * prop
y2_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.int32) * prop
x3_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.int32) * prop
y3_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.float32) * prop
x4_np = np.random.randint(1, 100, (2, 1, 1, 4)).astype(np.int64) * prop
y4_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.int64) * prop
x5_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.float64) * prop
y5_np = np.random.randint(1, 100, (2, 1, 4, 4)).astype(np.float64) * prop
x0 = Tensor(x0_np)
y0 = Tensor(y0_np)
x1 = Tensor(x1_np)
y1 = Tensor(y1_np)
x2 = Tensor(x2_np)
y2 = Tensor(y2_np)
x3 = Tensor(x3_np)
y3 = Tensor(y3_np)
x4 = Tensor(x4_np)
y4 = Tensor(y4_np)
x5 = Tensor(x5_np)
y5 = Tensor(y5_np)
context.set_context(mode=context.GRAPH_MODE, device_target='CPU')
floor_div = FloorDivNet()
output0 = floor_div(x0, y0)
expect0 = np.floor_divide(x0_np, y0_np)
diff0 = output0.asnumpy() - expect0
error0 = np.ones(shape=expect0.shape) * 1.0e-5
assert np.all(diff0 < error0)
assert output0.shape == expect0.shape
output1 = floor_div(x1, y1)
expect1 = np.floor_divide(x1_np, y1_np)
diff1 = output1.asnumpy() - expect1
error1 = np.ones(shape=expect1.shape) * 1.0e-5
assert np.all(diff1 < error1)
assert output1.shape == expect1.shape
output2 = floor_div(x2, y2)
expect2 = np.floor_divide(x2_np, y2_np).astype(np.float16)
diff2 = output2.asnumpy() - expect2
error2 = np.ones(shape=expect2.shape) * 1.0e-5
assert np.all(diff2 < error2)
assert output2.shape == expect2.shape
output3 = floor_div(x3, y3)
expect3 = np.floor_divide(x3_np, y3_np)
diff3 = output3.asnumpy() - expect3
error3 = np.ones(shape=expect3.shape) * 1.0e-5
assert np.all(diff3 < error3)
assert output3.shape == expect3.shape
output4 = floor_div(x4, y4)
expect4 = np.floor_divide(x4_np, y4_np)
diff4 = output4.asnumpy() - expect4
error4 = np.ones(shape=expect4.shape) * 1.0e-5
assert np.all(diff4 < error4)
assert output4.shape == expect4.shape
output5 = floor_div(x5, y5)
expect5 = np.floor_divide(x5_np, y5_np)
diff5 = output5.asnumpy() - expect5
error5 = np.ones(shape=expect5.shape) * 1.0e-7
assert np.all(diff5 < error5)
assert output5.shape == expect5.shape
@pytest.mark.level0
@pytest.mark.platform_x86_cpu
@pytest.mark.env_onecard
def test_mod():
prop = 1 if np.random.random() < 0.5 else -1
x0_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.float32) * prop
y0_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.float32) * prop
x1_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.float32) * prop
y1_np = np.random.randint(1, 100, (2, 1, 4, 4)).astype(np.float32) * prop
x2_np = np.random.randint(1, 100, (2, 1, 1, 4)).astype(np.float16) * prop
y2_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.float16) * prop
x3_np = np.random.randint(1, 100, 1).astype(np.float32) * prop
y3_np = np.random.randint(1, 100, 1).astype(np.float32) * prop
x4_np = np.array(768).astype(np.float32) * prop
y4_np = np.array(3072.5).astype(np.float32) * prop
x5_np = np.random.randint(1, 100, (2, 1, 1, 4)).astype(np.int32) * prop
y5_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.int32) * prop
x6_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.int32) * prop
y6_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.float32) * prop
x7_np = np.random.randint(1, 100, (2, 1, 1, 4)).astype(np.int64) * prop
y7_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.int64) * prop
x0 = Tensor(x0_np)
y0 = Tensor(y0_np)
x1 = Tensor(x1_np)
y1 = Tensor(y1_np)
x2 = Tensor(x2_np)
y2 = Tensor(y2_np)
x3 = Tensor(x3_np)
y3 = Tensor(y3_np)
x4 = Tensor(x4_np)
y4 = Tensor(y4_np)
x5 = Tensor(x5_np)
y5 = Tensor(y5_np)
x6 = Tensor(x6_np)
y6 = Tensor(y6_np)
x7 = Tensor(x7_np)
y7 = Tensor(y7_np)
context.set_context(mode=context.GRAPH_MODE, device_target='CPU')
mod = ModNet()
output0 = mod(x0, y0)
expect0 = np.mod(x0_np, y0_np)
diff0 = output0.asnumpy() - expect0
error0 = np.ones(shape=expect0.shape) * 1.0e-5
assert np.all(diff0 < error0)
assert output0.shape == expect0.shape
output1 = mod(x1, y1)
expect1 = np.mod(x1_np, y1_np)
diff1 = output1.asnumpy() - expect1
error1 = np.ones(shape=expect1.shape) * 1.0e-5
assert np.all(diff1 < error1)
assert output1.shape == expect1.shape
output2 = mod(x2, y2)
expect2 = np.mod(x2_np, y2_np).astype(np.float16)
diff2 = output2.asnumpy() - expect2
error2 = np.ones(shape=expect2.shape) * 1.0e-5
assert np.all(diff2 < error2)
assert output2.shape == expect2.shape
output3 = mod(x3, y3)
expect3 = np.mod(x3_np, y3_np)
diff3 = output3.asnumpy() - expect3
error3 = np.ones(shape=expect3.shape) * 1.0e-5
assert np.all(diff3 < error3)
assert output3.shape == expect3.shape
output4 = mod(x4, y4)
expect4 = np.mod(x4_np, y4_np)
diff4 = output4.asnumpy() - expect4
error4 = np.ones(shape=expect4.shape) * 1.0e-5
assert np.all(diff4 < error4)
assert output4.shape == expect4.shape
output5 = mod(x5, y5)
expect5 = np.mod(x5_np, y5_np)
assert np.all(output5.asnumpy() == expect5)
assert output5.shape == expect5.shape
output6 = mod(x6, y6)
expect6 = np.mod(x6_np, y6_np)
diff6 = output6.asnumpy() - expect6
error6 = np.ones(shape=expect6.shape) * 1.0e-5
assert np.all(diff6 < error6)
assert output6.shape == expect6.shape
output7 = mod(x7, y7)
expect7 = np.mod(x7_np, y7_np).astype(np.int64)
assert np.all(output7.asnumpy() == expect7)
    assert output7.shape == expect7.shape
@pytest.mark.level0
@pytest.mark.platform_x86_cpu
@pytest.mark.env_onecard
def test_floor_mod():
prop = 1 if np.random.random() < 0.5 else -1
x0_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.float32) * prop
y0_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.float32) * prop
x1_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.float32) * prop
y1_np = np.random.randint(1, 100, (2, 1, 4, 4)).astype(np.float32) * prop
x2_np = np.random.randint(1, 100, (2, 1, 1, 4)).astype(np.float16) * prop
y2_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.float16) * prop
x3_np = np.random.randint(1, 100, 1).astype(np.float32) * prop
y3_np = np.random.randint(1, 100, 1).astype(np.float32) * prop
x4_np = np.array(768).astype(np.float32) * prop
y4_np = np.array(3072.5).astype(np.float32) * prop
x5_np = np.random.randint(1, 100, (2, 1, 1, 4)).astype(np.int32) * prop
y5_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.int32) * prop
x6_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.int32) * prop
y6_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.float32) * prop
x7_np = np.random.randint(1, 100, (2, 1, 1, 4)).astype(np.int64) * prop
y7_np = np.random.randint(1, 100, (2, 3, 4, 4)).astype(np.int64) * prop
x0 = Tensor(x0_np)
y0 = Tensor(y0_np)
x1 = Tensor(x1_np)
y1 = Tensor(y1_np)
x2 = Tensor(x2_np)
y2 = Tensor(y2_np)
x3 = Tensor(x3_np)
y3 = Tensor(y3_np)
x4 = Tensor(x4_np)
y4 = Tensor(y4_np)
x5 = Tensor(x5_np)
y5 = Tensor(y5_np)
x6 = Tensor(x6_np)
y6 = Tensor(y6_np)
x7 = Tensor(x7_np)
y7 = Tensor(y7_np)
context.set_context(mode=context.GRAPH_MODE, device_target='CPU')
floor_mod = FloorModNet()
output0 = floor_mod(x0, y0)
expect0 = np.mod(x0_np, y0_np)
diff0 = output0.asnumpy() - expect0
error0 = np.ones(shape=expect0.shape) * 1.0e-5
assert np.all(diff0 < error0)
assert output0.shape == expect0.shape
output1 = floor_mod(x1, y1)
expect1 = np.mod(x1_np, y1_np)
diff1 = output1.asnumpy() - expect1
error1 = np.ones(shape=expect1.shape) * 1.0e-5
assert np.all(diff1 < error1)
assert output1.shape == expect1.shape
output2 = floor_mod(x2, y2)
expect2 = np.mod(x2_np, y2_np).astype(np.float16)
diff2 = output2.asnumpy() - expect2
error2 = np.ones(shape=expect2.shape) * 1.0e-5
assert np.all(diff2 < error2)
assert output2.shape == expect2.shape
output3 = floor_mod(x3, y3)
expect3 = np.mod(x3_np, y3_np)
diff3 = output3.asnumpy() - expect3
error3 = np.ones(shape=expect3.shape) * 1.0e-5
assert np.all(diff3 < error3)
assert output3.shape == expect3.shape
output4 = floor_mod(x4, y4)
expect4 = np.mod(x4_np, y4_np)
diff4 = output4.asnumpy() - expect4
error4 = np.ones(shape=expect4.shape) * 1.0e-5
assert np.all(diff4 < error4)
assert output4.shape == expect4.shape
output5 = floor_mod(x5, y5)
expect5 = np.mod(x5_np, y5_np)
assert np.all(output5.asnumpy() == expect5)
assert output5.shape == expect5.shape
output6 = floor_mod(x6, y6)
expect6 = np.mod(x6_np, y6_np)
diff6 = output6.asnumpy() - expect6
error6 = np.ones(shape=expect6.shape) * 1.0e-5
assert np.all(diff6 < error6)
assert output6.shape == expect6.shape
output7 = floor_mod(x7, y7)
expect7 = np.mod(x7_np, y7_np).astype(np.int64)
assert np.all(output7.asnumpy() == expect7)
    assert output7.shape == expect7.shape
test_sub()
test_div()
test_floor_div()
test_mod()
test_floor_mod()
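# Editor's note (sketch): the direct calls above run every suite at import
# time; with pytest available, `pytest test_arithmetic_op.py` would run the
# same functions under the declared level0 / platform_x86_cpu markers.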
| 35.266667 | 78 | 0.636868 | 2,665 | 16,399 | 3.806379 | 0.072796 | 0.057571 | 0.055205 | 0.093849 | 0.863072 | 0.844834 | 0.833991 | 0.820189 | 0.808754 | 0.800079 | 0 | 0.103636 | 0.205073 | 16,399 | 464 | 79 | 35.342672 | 0.674517 | 0.039393 | 0 | 0.697917 | 0 | 0 | 0.000953 | 0 | 0 | 0 | 0 | 0 | 0.164063 | 1 | 0.039063 | false | 0 | 0.018229 | 0.013021 | 0.083333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
07464c6572f0cbbf170c2d8150cd75dc74ac2a8d | 42 | py | Python | tests/project/__main__.py | DonaldWhyte/module-dependency | 0c4a1bddf3901340f44c28501ff677f2e9caef70 | [
"MIT"
] | 5 | 2015-08-12T15:36:27.000Z | 2021-06-27T22:49:00.000Z | tests/project/__main__.py | DonaldWhyte/module-dependency | 0c4a1bddf3901340f44c28501ff677f2e9caef70 | [
"MIT"
] | null | null | null | tests/project/__main__.py | DonaldWhyte/module-dependency | 0c4a1bddf3901340f44c28501ff677f2e9caef70 | [
"MIT"
] | 1 | 2016-09-20T07:05:08.000Z | 2016-09-20T07:05:08.000Z | from .pack import subpack2
from . import a | 21 | 26 | 0.785714 | 7 | 42 | 4.714286 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.028571 | 0.166667 | 42 | 2 | 27 | 21 | 0.914286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
4acda77926484e779248b8ee5aff61b4fc28d43c | 2,165 | py | Python | palindrome_check_test.py | igelfiend/Python.Structures.Deque | 4d296615ab1a4c5d7fd4af03164228cb877a0d00 | [
"MIT"
] | null | null | null | palindrome_check_test.py | igelfiend/Python.Structures.Deque | 4d296615ab1a4c5d7fd4af03164228cb877a0d00 | [
"MIT"
] | null | null | null | palindrome_check_test.py | igelfiend/Python.Structures.Deque | 4d296615ab1a4c5d7fd4af03164228cb877a0d00 | [
"MIT"
] | null | null | null | import unittest
from palindrome_check import check_palindrome
class TestPalindromeCheck(unittest.TestCase):
# ------------ TEST PALINDROME CASES-------------------------
def test_palindrome_case_1(self):
check_string = "abccba"
self.assertTrue(check_palindrome(check_string),
"String {0} must be palindrome".format(check_string))
def test_palindrome_case_2(self):
check_string = "a"
self.assertTrue(check_palindrome(check_string),
"String {0} must be palindrome".format(check_string))
def test_palindrome_case_3(self):
check_string = "aa"
self.assertTrue(check_palindrome(check_string),
"String {0} must be palindrome".format(check_string))
def test_palindrome_case_4(self):
check_string = ""
self.assertTrue(check_palindrome(check_string),
"String {0} must be palindrome".format(check_string))
def test_palindrome_case_5(self):
check_string = "abcdcba"
self.assertTrue(check_palindrome(check_string),
"String {0} must be palindrome".format(check_string))
# --------------- TEST NOT PALINDROME CASES -------------------------
def test_not_palindrome_case_1(self):
check_string = "abcde"
self.assertFalse(check_palindrome(check_string),
"String {0} mustn't be palindrome".format(check_string))
def test_not_palindrome_case_2(self):
check_string = "abccbb"
self.assertFalse(check_palindrome(check_string),
"String {0} mustn't be palindrome".format(check_string))
def test_not_palindrome_case_3(self):
check_string = "abcdba"
self.assertFalse(check_palindrome(check_string),
"String {0} mustn't be palindrome".format(check_string))
def test_not_palindrome_case_4(self):
check_string = "aab"
self.assertFalse(check_palindrome(check_string),
"String {0} mustn't be palindrome".format(check_string))
if __name__ == '__main__':
unittest.main()
| 38.660714 | 80 | 0.622171 | 238 | 2,165 | 5.336134 | 0.163866 | 0.233858 | 0.106299 | 0.184252 | 0.829921 | 0.822047 | 0.699213 | 0.699213 | 0.699213 | 0.699213 | 0 | 0.011159 | 0.254965 | 2,165 | 55 | 81 | 39.363636 | 0.776193 | 0.058661 | 0 | 0.439024 | 0 | 0 | 0.155774 | 0 | 0 | 0 | 0 | 0 | 0.219512 | 1 | 0.219512 | false | 0 | 0.04878 | 0 | 0.292683 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
db71d4e1e7320ab2551ee2c81c0d554e7945a7a4 | 62 | py | Python | groundhog/models/__init__.py | lvapeab/GroundHog_INMT | d5ad1d466eaf5040e99b9aaaa1b28c96402436ce | [
"BSD-3-Clause"
] | null | null | null | groundhog/models/__init__.py | lvapeab/GroundHog_INMT | d5ad1d466eaf5040e99b9aaaa1b28c96402436ce | [
"BSD-3-Clause"
] | null | null | null | groundhog/models/__init__.py | lvapeab/GroundHog_INMT | d5ad1d466eaf5040e99b9aaaa1b28c96402436ce | [
"BSD-3-Clause"
] | null | null | null | from .LM_model import LM_Model
from .BLM_model import BLM_Model
| 20.666667 | 31 | 0.870968 | 12 | 62 | 4.166667 | 0.416667 | 0.28 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.129032 | 62 | 2 | 32 | 31 | 0.925926 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
db8d81772be0fafc652df446acf7345b8ebcf712 | 30 | py | Python | coding/classifier/__init__.py | deep-learning-algorithm/cs231n | b4da574a00622f1993ae3fe9ef777d751ed7e591 | [
"Apache-2.0"
] | 1 | 2020-04-03T08:37:19.000Z | 2020-04-03T08:37:19.000Z | coding/classifier/__init__.py | deep-learning-algorithm/cs231n | b4da574a00622f1993ae3fe9ef777d751ed7e591 | [
"Apache-2.0"
] | 5 | 2021-02-02T22:05:24.000Z | 2022-03-11T23:52:44.000Z | coding/classifier/__init__.py | deep-learning-algorithm/cs231n | b4da574a00622f1993ae3fe9ef777d751ed7e591 | [
"Apache-2.0"
] | null | null | null |
from .nn_classifier import NN | 15 | 29 | 0.833333 | 5 | 30 | 4.8 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 30 | 2 | 29 | 15 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
dbb12b6cf0685b7256b8b66e02cbccf5921bded3 | 242 | py | Python | python/testData/inspections/MoveFromFutureImport.py | truthiswill/intellij-community | fff88cfb0dc168eea18ecb745d3e5b93f57b0b95 | [
"Apache-2.0"
] | 2 | 2018-12-29T09:53:39.000Z | 2018-12-29T09:53:42.000Z | python/testData/inspections/MoveFromFutureImport.py | truthiswill/intellij-community | fff88cfb0dc168eea18ecb745d3e5b93f57b0b95 | [
"Apache-2.0"
] | 173 | 2018-07-05T13:59:39.000Z | 2018-08-09T01:12:03.000Z | python/testData/inspections/MoveFromFutureImport.py | truthiswill/intellij-community | fff88cfb0dc168eea18ecb745d3e5b93f57b0b95 | [
"Apache-2.0"
] | 2 | 2020-03-15T08:57:37.000Z | 2020-04-07T04:48:14.000Z | from __future__ import print_function
#comment
from __future__ import absolute_import
class A:
pass
<warning descr="from __future__ imports must occur at the beginning of the file"><caret>from __future__ import with_statement</warning>
| 26.888889 | 135 | 0.81405 | 34 | 242 | 5.235294 | 0.676471 | 0.224719 | 0.269663 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.136364 | 242 | 8 | 136 | 30.25 | 0.851675 | 0.028926 | 0 | 0 | 0 | 0 | 0.269231 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.2 | 0.6 | null | null | 0.2 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 6 |
dbbdcd029c6d31dd9b22dad071d86ba67419b723 | 3,555 | py | Python | metrics/dataset.py | cassianobecker/dnn | bb2ea04f77733de9df10f795bb049ac3b9d30478 | [
"MIT"
] | 3 | 2020-02-21T21:35:07.000Z | 2020-09-29T15:20:00.000Z | metrics/dataset.py | cassianobecker/dnn | bb2ea04f77733de9df10f795bb049ac3b9d30478 | [
"MIT"
] | 27 | 2020-02-20T21:00:23.000Z | 2020-05-22T15:23:25.000Z | metrics/dataset.py | cassianobecker/dnn | bb2ea04f77733de9df10f795bb049ac3b9d30478 | [
"MIT"
] | null | null | null | from fwk.metrics import Metric
class SubjectTotals(Metric):
def __init__(self) -> None:
super().__init__()
self.number_of_train_subjects = None
self.number_of_test_subjects = None
self.regime = None
def on_after_setup(self, local_variables):
self.number_of_train_subjects = len(local_variables['self'].data_loaders['train'].dataset.subjects)
self.number_of_test_subjects = len(local_variables['self'].data_loaders['test'].dataset.subjects)
self.print_metric()
def text_record(self):
train_str = f'number of subjects: {self.number_of_train_subjects} (train)\n'
test_str = f'number of subjects: {self.number_of_test_subjects} (test)\n'
return train_str + test_str
def numpy_record(self, records=None):
if 'number_of_train_subjects' not in records.keys():
records['number_of_train_subjects'] = self.number_of_train_subjects
if 'number_of_test_subjects' not in records.keys():
records['number_of_test_subjects'] = self.number_of_test_subjects
return records
class SubjectList(Metric):
def __init__(self) -> None:
super().__init__()
self.train_subjects = None
self.test_subjects = None
self.regime = None
def on_after_setup(self, local_variables):
self.train_subjects = local_variables['self'].data_loaders['train'].dataset.subjects
self.test_subjects = local_variables['self'].data_loaders['test'].dataset.subjects
self.print_metric()
def text_record(self):
train_str = f'TRAIN:\n'
train_str = train_str + '\n'.join(self.train_subjects)
test_str = f'TEST:\n'
test_str = test_str + '\n'.join(self.test_subjects)
return train_str + '\n\n' + test_str
class ImageTotals(Metric):
def __init__(self) -> None:
super().__init__()
self.number_of_train_images = None
self.number_of_test_images = None
self.regime = None
def on_after_setup(self, local_variables):
self.number_of_train_images = len(local_variables['self'].data_loaders['train'].dataset.images)
self.number_of_test_images = len(local_variables['self'].data_loaders['test'].dataset.images)
self.print_metric()
def text_record(self):
        train_str = f'number of images: {self.number_of_train_images} (train)\n'
        test_str = f'number of images: {self.number_of_test_images} (test)\n'
return train_str + test_str
def numpy_record(self, records=None):
        if 'number_of_train_images' not in records.keys():
            records['number_of_train_images'] = self.number_of_train_images
        if 'number_of_test_images' not in records.keys():
            records['number_of_test_images'] = self.number_of_test_images
return records
class ImageList(Metric):
def __init__(self) -> None:
super().__init__()
self.train_images = None
self.test_images = None
self.regime = None
def on_after_setup(self, local_variables):
self.train_images = local_variables['self'].data_loaders['train'].dataset.images
self.test_images = local_variables['self'].data_loaders['test'].dataset.images
self.print_metric()
def text_record(self):
train_str = f'TRAIN:\n'
train_str = train_str + '\n'.join(self.train_images)
test_str = f'TEST:\n'
test_str = test_str + '\n'.join(self.test_images)
return train_str + '\n\n' + test_str
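# Editor's sketch of the assumed call pattern (the fwk.Metric base class is
# not shown here, so the names below are assumptions): the framework is
# expected to invoke on_after_setup with the running experiment in scope, e.g.
#   metric = SubjectTotals()
#   metric.on_after_setup({"self": experiment})  # experiment has .data_loaders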
| 32.916667 | 107 | 0.670042 | 473 | 3,555 | 4.659619 | 0.095137 | 0.101633 | 0.087114 | 0.08167 | 0.917423 | 0.891107 | 0.84755 | 0.813975 | 0.80853 | 0.678766 | 0 | 0 | 0.218284 | 3,555 | 107 | 108 | 33.224299 | 0.793091 | 0 | 0 | 0.547945 | 0 | 0 | 0.151336 | 0.086076 | 0 | 0 | 0 | 0 | 0 | 1 | 0.191781 | false | 0 | 0.013699 | 0 | 0.342466 | 0.054795 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
dbd08eae0c8ab0ec291d2639814147b2c60eb9e1 | 66 | py | Python | config/settings/configurations/__init__.py | vaibhav-jain/skeleton | 90646bd77011facf99e84bb74242691f24b77dcc | [
"Apache-2.0"
] | 2 | 2016-02-07T14:50:21.000Z | 2017-08-23T15:44:42.000Z | config/settings/configurations/__init__.py | vaibhav-jain/skeleton | 90646bd77011facf99e84bb74242691f24b77dcc | [
"Apache-2.0"
] | 35 | 2021-01-19T17:47:07.000Z | 2022-03-01T23:11:36.000Z | config/settings/configurations/__init__.py | vaibhav-jain/skeleton | 90646bd77011facf99e84bb74242691f24b77dcc | [
"Apache-2.0"
] | 1 | 2019-01-06T20:17:55.000Z | 2019-01-06T20:17:55.000Z | from .ENV import *
from .DJANGO import *
from .GRAPPELLI import *
| 16.5 | 24 | 0.727273 | 9 | 66 | 5.333333 | 0.555556 | 0.416667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.181818 | 66 | 3 | 25 | 22 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
dbd41914e62bd4cd16b03ea971d4834ba2d679cc | 99 | py | Python | bclm/__init__.py | OnlpLab/bclm | 981a87200dcc3c4b94ce17a665ac0b032e9ebe13 | [
"Apache-2.0"
] | null | null | null | bclm/__init__.py | OnlpLab/bclm | 981a87200dcc3c4b94ce17a665ac0b032e9ebe13 | [
"Apache-2.0"
] | null | null | null | bclm/__init__.py | OnlpLab/bclm | 981a87200dcc3c4b94ce17a665ac0b032e9ebe13 | [
"Apache-2.0"
] | null | null | null | from .transforms import *
from .readers import *
from .outputs import *
from .evaluations import *
| 19.8 | 26 | 0.757576 | 12 | 99 | 6.25 | 0.5 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.161616 | 99 | 4 | 27 | 24.75 | 0.903614 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
915fa83edacf94166268427a62373ca1961b9c76 | 158 | py | Python | files_in_python/people1_exercise.py | Ena-Sharma/Meraki_Solution | 1bfff62f6aeb69354712d0b5a9e46ddacff357f5 | [
"MIT"
] | null | null | null | files_in_python/people1_exercise.py | Ena-Sharma/Meraki_Solution | 1bfff62f6aeb69354712d0b5a9e46ddacff357f5 | [
"MIT"
] | null | null | null | files_in_python/people1_exercise.py | Ena-Sharma/Meraki_Solution | 1bfff62f6aeb69354712d0b5a9e46ddacff357f5 | [
"MIT"
] | null | null | null | # with open("people1_exercise.txt", "r") as file:
#     print(file.read())
# (a with-block closes the file automatically, so no close() is needed inside it)
file = open("people1_exercise.txt", "r")
print(file.read())
file.close() | 19.75 | 47 | 0.670886 | 24 | 158 | 4.333333 | 0.458333 | 0.211538 | 0.365385 | 0.423077 | 0.865385 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013986 | 0.094937 | 158 | 8 | 48 | 19.75 | 0.713287 | 0.5 | 0 | 0 | 0 | 0 | 0.276316 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0 | 0 | null | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
9174cffc150cbf2ca83554fab31762ad78e38dd5 | 81 | py | Python | NCM/solvers/__init__.py | OpenXAIProject/Neural-Conversation-Models | 72c2e71f0a5068733762078dfbb5f1405ea0be00 | [
"MIT"
] | 11 | 2020-11-24T00:05:50.000Z | 2020-11-25T07:52:07.000Z | NCM/solvers/__init__.py | OpenXAIProject/Neural-Conversation-Models-Response-Evaluation | 72c2e71f0a5068733762078dfbb5f1405ea0be00 | [
"MIT"
] | null | null | null | NCM/solvers/__init__.py | OpenXAIProject/Neural-Conversation-Models-Response-Evaluation | 72c2e71f0a5068733762078dfbb5f1405ea0be00 | [
"MIT"
] | null | null | null | from .solver import *
from .hred_solver import *
from .speakaddr_solver import *
| 20.25 | 31 | 0.777778 | 11 | 81 | 5.545455 | 0.454545 | 0.590164 | 0.52459 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148148 | 81 | 3 | 32 | 27 | 0.884058 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
37d757f89a77e1c3c7f27a361b4fbabf5ff0b1e8 | 7,818 | py | Python | integration/api_setu.py | arpitkjain7/VacciDate | b542028490c76d44d53880798d3e5e8cf13e7c22 | [
"MIT"
] | 4 | 2021-05-23T13:48:22.000Z | 2022-02-23T04:27:35.000Z | integration/api_setu.py | arpitkjain7/VacciDate | b542028490c76d44d53880798d3e5e8cf13e7c22 | [
"MIT"
] | null | null | null | integration/api_setu.py | arpitkjain7/VacciDate | b542028490c76d44d53880798d3e5e8cf13e7c22 | [
"MIT"
] | null | null | null | import json
def get_state_id_by_state_name(state_name: str):
    with open("data/state_code.json", "r") as f:
        data = json.load(f)
    for state in data.get("states"):
        if state.get("state_name") == state_name:
            return state.get("state_id")
    return None
def get_district_id_from_file(file_path: str, district_name: str):
    with open(file_path, "r") as f:
        data = json.load(f)
    for dist in data.get("districts"):
        if dist.get("district_name") == district_name:
            return dist.get("district_id")
    return None
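# Editor's usage sketch (assumes the bundled JSON files exist with the shape
# read above; the district file name and lookup values are illustrative):
#   state_id = get_state_id_by_state_name("Maharashtra")
#   district_id = get_district_id_from_file("data/district_code.json", "Pune")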
def get_instant_applicable_slots(slot_details: dict, age_group: int):
master_list = []
for slot in slot_details.get("centers"):
if len(slot.get("sessions")) > 0:
for session in slot.get("sessions"):
if (
session.get("min_age_limit") == age_group
and session.get("available_capacity") > 0
):
# booking_details = {
# "Center Name": slot.get("name"),
# "Address": slot.get("address"),
# "Date": session.get("date"),
# "Available": session.get("available_capacity"),
# "Vaccine Type": session.get("vaccine"),
# "Payment Type": slot.get("fee_type"),
# "Time slots": session.get("slots"),
# }
booking_details = f"VACCINE AVAILABLE({session.get('min_age_limit')}+)\n{session.get('date')}\n{slot.get('district_name')}-{slot.get('state_name')}\n{slot.get('name')},{slot.get('address')}\n{session.get('available_capacity')} shots\n\nhttps://selfregistration.cowin.gov.in/"
master_list.append(booking_details)
return master_list
def get_applicable_slots(slot_details: dict, age_group: list):
master_list = []
for slot in slot_details.get("centers"):
if len(slot.get("sessions")) > 0:
for session in slot.get("sessions"):
for age in age_group:
if (
session.get("min_age_limit") == age
and session.get("available_capacity") > 0
):
# booking_details = {
# "Center Name": slot.get("name"),
# "Address": slot.get("address"),
# "Date": session.get("date"),
# "Available": session.get("available_capacity"),
# "Vaccine Type": session.get("vaccine"),
# "Payment Type": slot.get("fee_type"),
# "Time slots": session.get("slots"),
# }
booking_details = f"VACCINE AVAILABLE({session.get('min_age_limit')}+)\n{session.get('date')}\n{slot.get('district_name')}-{slot.get('state_name')}\n{slot.get('name')},{slot.get('address')}\n{session.get('available_capacity')} shots\n\nhttps://selfregistration.cowin.gov.in/"
master_list.append(booking_details)
return master_list
def get_generic_slots(slot_details: dict, age_group):
master_list = []
# n = 0
# m = 0
for slot in slot_details.get("centers"):
if len(slot.get("sessions")) > 0:
for session in slot.get("sessions"):
if (
session.get("min_age_limit") == 18
and session.get("available_capacity") >= 1
# and n < 3
):
booking_details = f"VACCINE AVAILABLE({session.get('min_age_limit')}+)\n{session.get('date')}\n{slot.get('district_name')}-{slot.get('state_name')}\n{slot.get('name')},{slot.get('address')}\nVaccine Type: {session.get('vaccine')}\n{session.get('available_capacity')} shots\n\nhttps://selfregistration.cowin.gov.in/"
master_list.append(booking_details)
if len(master_list) == 5:
return master_list
# n += 1
# elif (
# session.get("min_age_limit") == 45
# and session.get("available_capacity") > 0
# and m < 3
# ):
# booking_details = f"VACCINE AVAILABLE({session.get('min_age_limit')}+)\n{session.get('date')}\n{slot.get('district_name')}-{slot.get('state_name')}\n{slot.get('name')},{slot.get('address')}\n{session.get('available_capacity')} shots\n\nhttps://selfregistration.cowin.gov.in/"
# master_list.append(booking_details)
# m += 1
return master_list
def filter_results(slot_details: dict, age_group, dose):
master_list = []
n = 0
if dose is not None:
dose_filter = f"available_capacity_dose{dose}"
else:
dose_filter = "available_capacity"
for slot in slot_details.get("centers"):
if len(slot.get("sessions")) > 0:
for session in slot.get("sessions"):
if age_group is None:
if (
session.get("min_age_limit") == 18
and session.get(dose_filter) > 1
and n < 3
):
booking_details = f"VACCINE AVAILABLE({session.get('min_age_limit')}+)\n{session.get('date')}\n{slot.get('district_name')}-{slot.get('state_name')}\n{slot.get('name')},{slot.get('address')}\nVaccine Type: {session.get('vaccine')}\n1st Dose availablility --> {session.get('available_capacity_dose1')} shots\n2nd Dose availablility --> {session.get('available_capacity_dose2')} shots\n\nRegister now from : https://selfregistration.cowin.gov.in/"
master_list.append(booking_details)
n += 1
elif (
session.get("min_age_limit") == 45
and session.get(dose_filter) > 1
and n < 6
):
booking_details = f"VACCINE AVAILABLE({session.get('min_age_limit')}+)\n{session.get('date')}\n{slot.get('district_name')}-{slot.get('state_name')}\n{slot.get('name')},{slot.get('address')}\nVaccine Type: {session.get('vaccine')}\n1st Dose availablility --> {session.get('available_capacity_dose1')} shots\n2nd Dose availablility --> {session.get('available_capacity_dose2')} shots\n\nRegister now from : https://selfregistration.cowin.gov.in/"
master_list.append(booking_details)
n += 1
if len(master_list) == 6:
return master_list
else:
if (
session.get("min_age_limit") == int(age_group)
and session.get(dose_filter) > 1
):
booking_details = f"VACCINE AVAILABLE({session.get('min_age_limit')}+)\n{session.get('date')}\n{slot.get('district_name')}-{slot.get('state_name')}\n{slot.get('name')},{slot.get('address')}\nVaccine Type: {session.get('vaccine')}\n1st Dose availablility --> {session.get('available_capacity_dose1')} shots\n2nd Dose availablility --> {session.get('available_capacity_dose2')} shots\n\nRegister now from : https://selfregistration.cowin.gov.in/"
master_list.append(booking_details)
if len(master_list) == 6:
return master_list
return master_list
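# Editor's usage sketch for filter_results (the dict mirrors the CoWIN API
# response consumed above; all values are illustrative assumptions):
#   slots = {"centers": [{"name": "PHC", "address": "Main Rd",
#                         "district_name": "Pune", "state_name": "Maharashtra",
#                         "sessions": [{"date": "01-06-2021", "min_age_limit": 18,
#                                       "vaccine": "COVISHIELD",
#                                       "available_capacity": 10,
#                                       "available_capacity_dose1": 6,
#                                       "available_capacity_dose2": 4}]}]}
#   messages = filter_results(slots, age_group=18, dose=None)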
| 53.547945 | 468 | 0.537094 | 883 | 7,818 | 4.573046 | 0.106455 | 0.123824 | 0.075285 | 0.106984 | 0.872214 | 0.859089 | 0.836057 | 0.810054 | 0.791729 | 0.791729 | 0 | 0.008484 | 0.321566 | 7,818 | 145 | 469 | 53.917241 | 0.752828 | 0.138143 | 0 | 0.699029 | 0 | 0.058252 | 0.360966 | 0.223498 | 0 | 0 | 0 | 0 | 0 | 1 | 0.058252 | false | 0 | 0.009709 | 0 | 0.174757 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
37dcbc7e08d4ad89edacf1d885e34d7f171db499 | 5,935 | py | Python | sciann/utils/validations.py | wangcj05/sciann | 48a0a829303205dc16c7bf7c62f3c1128a39c1c5 | [
"MIT"
] | null | null | null | sciann/utils/validations.py | wangcj05/sciann | 48a0a829303205dc16c7bf7c62f3c1128a39c1c5 | [
"MIT"
] | null | null | null | sciann/utils/validations.py | wangcj05/sciann | 48a0a829303205dc16c7bf7c62f3c1128a39c1c5 | [
"MIT"
] | 1 | 2021-11-05T03:49:25.000Z | 2021-11-05T03:49:25.000Z | """ Utilities to process functionals.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import sciann
def is_functional(f):
""" Checks whether `f` is a functional object.
# Arguments
f: an object to be tested.
# Returns
True if functional.
# Raises
ValueError: if the object cannot be tested with `isinstance`.
"""
if isinstance(f, (sciann.Functional, sciann.functionals.RNNFunctional)):
return True
else:
return False
def validate_functional(f):
""" if `f` is not a functional object, raises value error.
# Arguments
f: an object to be tested.
# Returns
True if functional, False otherwise.
# Raises
ValueError: if the object is not a Functional object.
"""
if isinstance(f, (sciann.Functional, sciann.functionals.rnn_functional.RNNFunctional)):
return True
else:
raise ValueError(
'These operations can only be applied to the `functional` object. '
'Use `Keras` or `TensorFlow` functions when applying to tensors.'
)
def is_constraint(f):
""" Checks whether `f` is a `Constraint` object.
# Arguments
f: an object to be tested.
# Returns
True if Constraint.
# Raises
ValueError: if the object cannot be tested with `isinstance`.
"""
if isinstance(f, sciann.Constraint):
return True
else:
return False
def validate_constraint(f):
""" if `f` is not a Constraint object, raises value error.
# Arguments
f: an object to be tested.
# Returns
True if Constraint, False otherwise.
# Raises
ValueError: if the object is not a Constraint object.
"""
if isinstance(f, sciann.Constraint):
return True
else:
raise ValueError(
'These operations can only be applied to the `Constraint` object. '
'Use `Keras` or `TensorFlow` functions when applying to tensors '
'or layers. '
)
def is_parameter(f):
""" Checks whether `f` is a parameter object.
# Arguments
f: an object to be tested.
# Returns
True if a parameter.
# Raises
ValueError: if the object cannot be tested with `isinstance`.
"""
if isinstance(f, sciann.Parameter):
return True
else:
return False
def validate_parameter(f):
""" if `f` is not a parameter object, raises value error.
# Arguments
f: an object to be tested.
# Returns
True if parameter, False otherwise.
# Raises
ValueError: if the object is not a Parameter object.
"""
if isinstance(f, sciann.Parameter):
return True
else:
raise ValueError(
'These operations can only be applied to the `parameter` object. '
'Use `Keras` or `TensorFlow` functions when applying to tensors.'
)
def is_field(f):
""" Checks whether `f` is a `Field` object.
# Arguments
f: an object to be tested.
# Returns
True if Field.
# Raises
ValueError: if the object cannot be tested with `isinstance`.
"""
if isinstance(f, (sciann.Field, sciann.functionals.RNNField)):
return True
else:
return False
def validate_field(f):
""" if `f` is not a Field object, raises value error.
# Arguments
f: an object to be tested.
# Returns
True if Field, False otherwise.
# Raises
ValueError: if the object is not a Field object.
"""
if isinstance(f, (sciann.Field, sciann.functionals.RNNField)):
return True
else:
raise ValueError(
'These operations can only be applied to the `Field` object. '
'Use `Keras` or `TensorFlow` functions when applying to tensors '
'or layers. '
)
def is_variable(f):
""" Checks whether `f` is a `Variable` object.
# Arguments
f: an object to be tested.
# Returns
True if Variable.
# Raises
ValueError: if the object cannot be tested with `isinstance`.
"""
if isinstance(f, (sciann.Variable, sciann.functionals.RadialBasis, sciann.functionals.RNNVariable)):
return True
else:
return False
def validate_variable(f):
""" if `f` is not a Variable object, raises value error.
# Arguments
f: an object to be tested.
# Returns
True if Variable, False otherwise.
# Raises
ValueError: if the object is not a Variable object.
"""
if isinstance(f, (sciann.Variable, sciann.functionals.RadialBasis, sciann.functionals.RNNVariable)):
return True
else:
raise ValueError(
'These operations can only be applied to the `Variable` object. '
'Use `Keras` or `TensorFlow` functions when applying to tensors '
'or layers. '
)
def is_scimodel(f):
""" Checks whether `f` is a `SciModel` object.
# Arguments
f: an object to be tested.
# Returns
True if SciModel.
# Raises
ValueError: if the object cannot be tested with `isinstance`.
"""
if isinstance(f, sciann.SciModel):
return True
else:
return False
def validate_scimodel(f):
""" if `f` is not a SciModel object, raises value error.
# Arguments
f: an object to be tested.
# Returns
True if SciModel, False otherwise.
# Raises
ValueError: if the object is not a SciModel object.
"""
if isinstance(f, sciann.SciModel):
return True
else:
raise ValueError(
'These operations can only be applied to the `SciModel` object. '
'Use `Keras` or `TensorFlow` functions when applying to tensors '
'or layers. '
)
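# Editor's usage sketch (assumes sciann is importable): each validator either
# returns True or raises ValueError, so the is_*/validate_* pairs support both
# soft checks and hard guards, e.g.
#   x = sciann.Variable("x")
#   if is_variable(x):
#       validate_variable(x)  # passes; raises for non-Variable inputs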
| 21.819853 | 104 | 0.605729 | 703 | 5,935 | 5.075391 | 0.093883 | 0.040359 | 0.040359 | 0.060538 | 0.891536 | 0.853419 | 0.806334 | 0.744955 | 0.702074 | 0.702074 | 0 | 0 | 0.313395 | 5,935 | 271 | 105 | 21.900369 | 0.875583 | 0.420388 | 0 | 0.651163 | 0 | 0 | 0.267333 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.139535 | false | 0 | 0.046512 | 0 | 0.395349 | 0.011628 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
37ecc18236317369b5c1f01f9376c53b23473f0c | 25,015 | py | Python | resolwe/storage/tests/test_listener.py | dblenkus/resolwe | d64cd9bc7d77b383771a54e01b5db136abb23767 | [
"Apache-2.0"
] | null | null | null | resolwe/storage/tests/test_listener.py | dblenkus/resolwe | d64cd9bc7d77b383771a54e01b5db136abb23767 | [
"Apache-2.0"
] | null | null | null | resolwe/storage/tests/test_listener.py | dblenkus/resolwe | d64cd9bc7d77b383771a54e01b5db136abb23767 | [
"Apache-2.0"
] | null | null | null | # pylint: disable=missing-docstring
from pathlib import Path
from unittest.mock import MagicMock, call, patch
from resolwe.flow.managers.listener import ExecutorListener
from resolwe.flow.managers.protocol import ExecutorProtocol
from resolwe.flow.models import Data, DataDependency
from resolwe.storage.models import FileStorage, ReferencedPath, StorageLocation
from resolwe.test import TestCase
class ListenerTest(TestCase):
fixtures = ["storage_data.yaml", "storage_processes.yaml", "storage_users.yaml"]
@classmethod
def setUpTestData(cls):
cls.listener = ExecutorListener()
cls.file_storage = FileStorage.objects.get(id=1)
cls.storage_location = StorageLocation.objects.create(
file_storage=cls.file_storage, connector_name="GCS", status="OK"
)
cls.path = ReferencedPath.objects.create(
path="test.me", md5="md5", crc32c="crc", awss3etag="aws"
)
cls.storage_location.files.add(cls.path)
@patch("resolwe.flow.managers.listener.ExecutorListener._send_reply")
@patch("resolwe.flow.managers.listener.async_to_sync")
def test_handle_download_finished_missing_storage_location(
self, async_to_sync_mock, send_reply_mock
):
obj = {
"command": ExecutorProtocol.DOWNLOAD_FINISHED,
"data_id": -1,
"storage_location_id": -2,
}
send_wrapper = MagicMock()
async_to_sync_mock.return_value = send_wrapper
with patch(
"resolwe.storage.models.FileStorage.default_storage_location",
self.storage_location,
):
self.listener.handle_download_finished(obj)
async_to_sync_mock.assert_called_once_with(send_reply_mock)
send_wrapper.assert_called_once_with(
{
"command": "download_finished",
"data_id": -1,
"storage_location_id": -2,
},
{"result": "ER"},
)
@patch("resolwe.flow.managers.listener.ExecutorListener._send_reply")
@patch("resolwe.flow.managers.listener.async_to_sync")
def test_handle_download_finished(self, async_to_sync_mock, send_reply_mock):
storage_location = StorageLocation.objects.create(
file_storage=self.file_storage, connector_name="local"
)
obj = {
"command": ExecutorProtocol.DOWNLOAD_FINISHED,
"data_id": 1,
"storage_location_id": storage_location.id,
}
send_wrapper = MagicMock()
async_to_sync_mock.return_value = send_wrapper
with patch(
"resolwe.storage.models.FileStorage.default_storage_location",
self.storage_location,
):
self.listener.handle_download_finished(obj)
send_wrapper.assert_called_once_with(
{
"command": "download_finished",
"data_id": 1,
"storage_location_id": storage_location.id,
},
{"result": "OK"},
)
storage_location.refresh_from_db()
self.assertEqual(storage_location.status, StorageLocation.STATUS_DONE)
self.assertEqual(storage_location.files.count(), 1)
file = storage_location.files.get()
self.assertEqual(file.path, "test.me")
self.assertEqual(file.md5, "md5")
self.assertEqual(file.crc32c, "crc")
self.assertEqual(file.awss3etag, "aws")
@patch("resolwe.flow.managers.listener.logger.error")
def test_handle_download_aborted_missing_storage_location(self, error_logger_mock):
obj = {
"command": ExecutorProtocol.DOWNLOAD_ABORTED,
"data_id": -1,
"storage_location_id": -2,
}
self.listener.handle_download_aborted(obj)
error_logger_mock.assert_called_once_with(
"StorageLocation for data does not exist",
extra={"storage_location_id": -2, "data_id": -1},
)
@patch("resolwe.flow.managers.listener.logger.error")
def test_handle_download_aborted(self, error_logger_mock):
storage_location = StorageLocation.objects.create(
file_storage=self.file_storage,
connector_name="local",
status=StorageLocation.STATUS_UPLOADING,
)
obj = {
"command": ExecutorProtocol.DOWNLOAD_ABORTED,
"data_id": -1,
"storage_location_id": storage_location.id,
}
self.listener.handle_download_aborted(obj)
error_logger_mock.assert_not_called()
storage_location.refresh_from_db()
self.assertEqual(storage_location.status, StorageLocation.STATUS_PREPARING)
@patch("resolwe.flow.managers.listener.consumer.send_event")
@patch("resolwe.flow.managers.listener.logger.error")
@patch("resolwe.flow.managers.listener.ExecutorListener._send_reply")
@patch("resolwe.flow.managers.listener.async_to_sync")
def test_handle_download_started_no_location(
self, async_to_sync_mock, send_reply_mock, error_logger_mock, send_event_mock
):
obj = {
"command": ExecutorProtocol.DOWNLOAD_STARTED,
"data_id": -1,
"storage_location_id": -2,
"download_started_lock": True,
}
self.listener.handle_download_started(obj)
async_to_sync_mock.assert_has_calls(
[
call(send_reply_mock),
call()(
{
"command": "download_started",
"data_id": -1,
"storage_location_id": -2,
"download_started_lock": True,
},
{"result": "ER"},
),
call(send_event_mock),
call()(
{
"command": "abort_data",
"data_id": -1,
"communicate_kwargs": {
"executor": "resolwe.flow.executors.local"
},
}
),
]
)
error_logger_mock.assert_called_once_with(
"StorageLocation for downloaded data does not exist",
extra={"storage_location_id": -2},
)
@patch("resolwe.flow.managers.listener.consumer.send_event")
@patch("resolwe.flow.managers.listener.logger.error")
@patch("resolwe.flow.managers.listener.ExecutorListener._send_reply")
@patch("resolwe.flow.managers.listener.async_to_sync")
def test_handle_download_started_ok_no_lock_preparing(
self, async_to_sync_mock, send_reply_mock, error_logger_mock, send_event_mock
):
storage_location = StorageLocation.objects.create(
file_storage=self.file_storage, connector_name="local"
)
obj = {
"command": ExecutorProtocol.DOWNLOAD_STARTED,
"data_id": -1,
"storage_location_id": storage_location.id,
"download_started_lock": False,
}
self.listener.handle_download_started(obj)
async_to_sync_mock.assert_has_calls(
[
call(send_reply_mock),
call()(
{
"command": "download_started",
"data_id": -1,
"storage_location_id": storage_location.id,
"download_started_lock": False,
},
{"result": "OK", "download_result": "download_started"},
),
]
)
error_logger_mock.assert_not_called()
storage_location.refresh_from_db()
self.assertEqual(storage_location.status, StorageLocation.STATUS_PREPARING)
@patch("resolwe.flow.managers.listener.consumer.send_event")
@patch("resolwe.flow.managers.listener.logger.error")
@patch("resolwe.flow.managers.listener.ExecutorListener._send_reply")
@patch("resolwe.flow.managers.listener.async_to_sync")
def test_handle_download_started_ok_no_lock_uploading(
self, async_to_sync_mock, send_reply_mock, error_logger_mock, send_event_mock
):
storage_location = StorageLocation.objects.create(
file_storage=self.file_storage,
connector_name="local",
status=StorageLocation.STATUS_UPLOADING,
)
obj = {
"command": ExecutorProtocol.DOWNLOAD_STARTED,
"data_id": -1,
"storage_location_id": storage_location.id,
"download_started_lock": False,
}
self.listener.handle_download_started(obj)
async_to_sync_mock.assert_has_calls(
[
call(send_reply_mock),
call()(
{
"command": "download_started",
"data_id": -1,
"storage_location_id": storage_location.id,
"download_started_lock": False,
},
{"result": "OK", "download_result": "download_in_progress"},
),
]
)
error_logger_mock.assert_not_called()
storage_location.refresh_from_db()
self.assertEqual(storage_location.status, StorageLocation.STATUS_UPLOADING)
@patch("resolwe.flow.managers.listener.consumer.send_event")
@patch("resolwe.flow.managers.listener.logger.error")
@patch("resolwe.flow.managers.listener.ExecutorListener._send_reply")
@patch("resolwe.flow.managers.listener.async_to_sync")
def test_handle_download_started_ok_no_lock_done(
self, async_to_sync_mock, send_reply_mock, error_logger_mock, send_event_mock
):
storage_location = StorageLocation.objects.create(
file_storage=self.file_storage,
connector_name="local",
status=StorageLocation.STATUS_DONE,
)
obj = {
"command": ExecutorProtocol.DOWNLOAD_STARTED,
"data_id": -1,
"storage_location_id": storage_location.id,
"download_started_lock": False,
}
self.listener.handle_download_started(obj)
async_to_sync_mock.assert_has_calls(
[
call(send_reply_mock),
call()(
{
"command": "download_started",
"data_id": -1,
"storage_location_id": storage_location.id,
"download_started_lock": False,
},
{"result": "OK", "download_result": "download_finished"},
),
]
)
error_logger_mock.assert_not_called()
storage_location.refresh_from_db()
self.assertEqual(storage_location.status, StorageLocation.STATUS_DONE)
@patch("resolwe.flow.managers.listener.consumer.send_event")
@patch("resolwe.flow.managers.listener.logger.error")
@patch("resolwe.flow.managers.listener.ExecutorListener._send_reply")
@patch("resolwe.flow.managers.listener.async_to_sync")
def test_handle_download_started_ok_lock(
self, async_to_sync_mock, send_reply_mock, error_logger_mock, send_event_mock
):
storage_location = StorageLocation.objects.create(
file_storage=self.file_storage, connector_name="local"
)
obj = {
"command": ExecutorProtocol.DOWNLOAD_STARTED,
"data_id": -1,
"storage_location_id": storage_location.id,
"download_started_lock": True,
}
self.listener.handle_download_started(obj)
async_to_sync_mock.assert_has_calls(
[
call(send_reply_mock),
call()(
{
"command": "download_started",
"data_id": -1,
"storage_location_id": storage_location.id,
"download_started_lock": True,
},
{"result": "OK", "download_result": "download_started"},
),
]
)
error_logger_mock.assert_not_called()
storage_location.refresh_from_db()
self.assertEqual(storage_location.status, StorageLocation.STATUS_UPLOADING)
@patch("resolwe.flow.managers.listener.consumer.send_event")
@patch("resolwe.flow.managers.listener.logger.error")
@patch("resolwe.flow.managers.listener.ExecutorListener._send_reply")
@patch("resolwe.flow.managers.listener.async_to_sync")
def test_handle_get_files_to_download_missing_storage_location(
self, async_to_sync_mock, send_reply_mock, error_logger_mock, send_event_mock
):
obj = {
"command": ExecutorProtocol.GET_FILES_TO_DOWNLOAD,
"data_id": -1,
"storage_location_id": -2,
}
self.listener.handle_get_files_to_download(obj)
async_to_sync_mock.assert_has_calls(
[
call(send_reply_mock),
call()(
{
"command": "get_files_to_download",
"data_id": -1,
"storage_location_id": -2,
},
{"result": "ER"},
),
call(send_event_mock),
call()(
{
"command": "abort_data",
"data_id": -1,
"communicate_kwargs": {
"executor": "resolwe.flow.executors.local"
},
}
),
]
)
error_logger_mock.assert_called_once_with(
"StorageLocation object does not exist (handle_get_files_to_download).",
extra={"storage_location_id": -2},
)
@patch("resolwe.flow.managers.listener.consumer.send_event")
@patch("resolwe.flow.managers.listener.logger.error")
@patch("resolwe.flow.managers.listener.ExecutorListener._send_reply")
@patch("resolwe.flow.managers.listener.async_to_sync")
def test_handle_get_files_to_download(
self, async_to_sync_mock, send_reply_mock, error_logger_mock, send_event_mock
):
obj = {
"command": ExecutorProtocol.GET_FILES_TO_DOWNLOAD,
"data_id": -1,
"storage_location_id": self.storage_location.id,
}
self.listener.handle_get_files_to_download(obj)
async_to_sync_mock.assert_has_calls(
[
call(send_reply_mock),
call()(
{
"command": "get_files_to_download",
"data_id": -1,
"storage_location_id": self.storage_location.id,
},
{
"result": "OK",
"referenced_files": [
{
"id": self.path.id,
"path": "test.me",
"size": -1,
"md5": "md5",
"crc32c": "crc",
"awss3etag": "aws",
}
],
},
),
]
)
@patch("resolwe.flow.managers.listener.consumer.send_event")
@patch("resolwe.flow.managers.listener.logger.error")
@patch("resolwe.flow.managers.listener.ExecutorListener._send_reply")
@patch("resolwe.flow.managers.listener.async_to_sync")
def test_handle_get_referenced_files_missing_data(
self, async_to_sync_mock, send_reply_mock, error_logger_mock, send_event_mock
):
obj = {
"command": ExecutorProtocol.GET_REFERENCED_FILES,
"data_id": -1,
}
self.listener.handle_get_referenced_files(obj)
async_to_sync_mock.assert_has_calls(
[
call(send_reply_mock),
call()(
{"command": "get_referenced_files", "data_id": -1}, {"result": "ER"}
),
call(send_event_mock),
call()(
{
"command": "abort_data",
"data_id": -1,
"communicate_kwargs": {
"executor": "resolwe.flow.executors.local"
},
}
),
]
)
error_logger_mock.assert_called_once_with(
"Data object does not exist (handle_get_referenced_files).",
extra={"data_id": -1},
)
@patch("resolwe.flow.managers.listener.consumer.send_event")
@patch("resolwe.flow.managers.listener.logger.error")
@patch("resolwe.flow.managers.listener.ExecutorListener._send_reply")
@patch("resolwe.flow.managers.listener.async_to_sync")
def test_handle_get_referenced_files(
self, async_to_sync_mock, send_reply_mock, error_logger_mock, send_event_mock
):
obj = {
"command": ExecutorProtocol.GET_REFERENCED_FILES,
"data_id": 1,
}
storage_location = StorageLocation.objects.create(
file_storage=self.file_storage,
connector_name="local",
status=StorageLocation.STATUS_DONE,
url=str(self.file_storage.id),
)
path = Path(storage_location.get_path(filename="output.txt"))
path.parent.mkdir(exist_ok=True, parents=True)
path.touch()
data = Data.objects.get(id=1)
data.process.output_schema = [{"name": "output_file", "type": "basic:file:"}]
data.process.save()
data.output = {"output_file": {"file": "output.txt"}}
data.save()
self.listener.handle_get_referenced_files(obj)
async_to_sync_mock.assert_has_calls(
[
call(send_reply_mock),
call()(
{"command": "get_referenced_files", "data_id": 1},
{
"result": "OK",
"referenced_files": [
"jsonout.txt",
"stderr.txt",
"stdout.txt",
"output.txt",
],
},
),
]
)
error_logger_mock.assert_not_called()
@patch("resolwe.flow.managers.listener.consumer.send_event")
@patch("resolwe.flow.managers.listener.logger.error")
@patch("resolwe.flow.managers.listener.ExecutorListener._send_reply")
@patch("resolwe.flow.managers.listener.async_to_sync")
def test_handle_missing_data_locations_missing_data(
self, async_to_sync_mock, send_reply_mock, error_logger_mock, send_event_mock
):
obj = {
"command": ExecutorProtocol.MISSING_DATA_LOCATIONS,
"data_id": -1,
}
self.listener.handle_missing_data_locations(obj)
async_to_sync_mock.assert_has_calls(
[
call(send_reply_mock),
call()(
{"command": "missing_data_locations", "data_id": -1},
{"result": "ER"},
),
call(send_event_mock),
call()(
{
"command": "abort_data",
"data_id": -1,
"communicate_kwargs": {
"executor": "resolwe.flow.executors.local"
},
}
),
]
)
error_logger_mock.assert_called_once_with(
"Data object does not exist (handle_missing_data_locations).",
extra={"data_id": -1},
)
@patch("resolwe.flow.managers.listener.consumer.send_event")
@patch("resolwe.flow.managers.listener.logger.error")
@patch("resolwe.flow.managers.listener.ExecutorListener._send_reply")
@patch("resolwe.flow.managers.listener.async_to_sync")
def test_handle_missing_data_locations_missing_storage_location(
self, async_to_sync_mock, send_reply_mock, error_logger_mock, send_event_mock
):
obj = {
"command": ExecutorProtocol.MISSING_DATA_LOCATIONS,
"data_id": 1,
}
parent = Data.objects.get(id=2)
child = Data.objects.get(id=1)
DataDependency.objects.create(
parent=parent, child=child, kind=DataDependency.KIND_IO
)
self.listener.handle_missing_data_locations(obj)
async_to_sync_mock.assert_has_calls(
[
call(send_reply_mock),
call()(
{"command": "missing_data_locations", "data_id": 1},
{"result": "ER"},
),
call(send_event_mock),
call()(
{
"command": "abort_data",
"data_id": 1,
"communicate_kwargs": {
"executor": "resolwe.flow.executors.local"
},
}
),
]
)
error_logger_mock.assert_called_once_with(
"No storage location exists (handle_get_missing_data_locations).",
extra={"data_id": 1, "file_storage_id": 2},
)
self.assertEqual(StorageLocation.all_objects.count(), 1)
@patch("resolwe.flow.managers.listener.consumer.send_event")
@patch("resolwe.flow.managers.listener.logger.error")
@patch("resolwe.flow.managers.listener.ExecutorListener._send_reply")
@patch("resolwe.flow.managers.listener.async_to_sync")
def test_handle_missing_data_locations_none(
self, async_to_sync_mock, send_reply_mock, error_logger_mock, send_event_mock
):
obj = {
"command": ExecutorProtocol.MISSING_DATA_LOCATIONS,
"data_id": 1,
}
parent = Data.objects.get(id=2)
child = Data.objects.get(id=1)
DataDependency.objects.create(
parent=parent, child=child, kind=DataDependency.KIND_IO
)
StorageLocation.objects.create(
file_storage=parent.location,
connector_name="local",
status=StorageLocation.STATUS_DONE,
url="url",
)
self.listener.handle_missing_data_locations(obj)
async_to_sync_mock.assert_has_calls(
[
call(send_reply_mock),
call()(
{"command": "missing_data_locations", "data_id": 1},
{"result": "OK", "storage_data_locations": []},
),
]
)
error_logger_mock.assert_not_called()
self.assertEqual(StorageLocation.all_objects.count(), 2)
@patch("resolwe.flow.managers.listener.consumer.send_event")
@patch("resolwe.flow.managers.listener.logger.error")
@patch("resolwe.flow.managers.listener.ExecutorListener._send_reply")
@patch("resolwe.flow.managers.listener.async_to_sync")
def test_handle_missing_data_locations(
self, async_to_sync_mock, send_reply_mock, error_logger_mock, send_event_mock
):
obj = {
"command": ExecutorProtocol.MISSING_DATA_LOCATIONS,
"data_id": 1,
}
parent = Data.objects.get(id=2)
child = Data.objects.get(id=1)
DataDependency.objects.create(
parent=parent, child=child, kind=DataDependency.KIND_IO
)
storage_location = StorageLocation.objects.create(
file_storage=parent.location,
connector_name="not_local",
status=StorageLocation.STATUS_DONE,
url="url",
)
self.listener.handle_missing_data_locations(obj)
self.assertEqual(StorageLocation.all_objects.count(), 3)
created = StorageLocation.all_objects.last()
async_to_sync_mock.assert_has_calls(
[
call(send_reply_mock),
call()(
{"command": "missing_data_locations", "data_id": child.id},
{
"result": "OK",
"storage_data_locations": [
{
"connector_name": "not_local",
"url": "url",
"data_id": child.id,
"to_storage_location_id": created.id,
"from_storage_location_id": storage_location.id,
}
],
},
),
]
)
error_logger_mock.assert_not_called()
| 39.270016 | 88 | 0.562103 | 2,397 | 25,015 | 5.508552 | 0.060075 | 0.082929 | 0.086337 | 0.120645 | 0.891851 | 0.882914 | 0.856256 | 0.845274 | 0.842396 | 0.823538 | 0 | 0.004753 | 0.335599 | 25,015 | 636 | 89 | 39.331761 | 0.789711 | 0.001319 | 0 | 0.649502 | 0 | 0 | 0.232706 | 0.147278 | 0 | 0 | 0 | 0 | 0.07309 | 1 | 0.0299 | false | 0 | 0.011628 | 0 | 0.044851 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
531256df64564065140bbe052ade7aeee2e9fc68 | 352 | py | Python | src/graphql/test/__init__.py | btrekkie/graphql | 6c118550267eeb57a9653f4f46d7bbd6c5902110 | [
"MIT"
] | null | null | null | src/graphql/test/__init__.py | btrekkie/graphql | 6c118550267eeb57a9653f4f46d7bbd6c5902110 | [
"MIT"
] | null | null | null | src/graphql/test/__init__.py | btrekkie/graphql | 6c118550267eeb57a9653f4f46d7bbd6c5902110 | [
"MIT"
] | null | null | null | """Runs all of the tests in the "graphql" module."""
if __name__ == '__main__':
import unittest
from graphql.document.test import *
from graphql.executor.test import *
from graphql.scalar_descriptors.lax.test import *
from graphql.scalar_descriptors.strict.test import *
from graphql.schema.test import *
unittest.main()
| 27.076923 | 56 | 0.713068 | 45 | 352 | 5.355556 | 0.488889 | 0.228216 | 0.232365 | 0.348548 | 0.315353 | 0.315353 | 0 | 0 | 0 | 0 | 0 | 0 | 0.190341 | 352 | 12 | 57 | 29.333333 | 0.845614 | 0.130682 | 0 | 0 | 0 | 0 | 0.026667 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.75 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
533b550579bd9b7e0f86acde53ca15db9907b36f | 10,272 | py | Python | models.py | usxxng/SI3DP | c41d841e4e6e87b24500be68960177a2bae7f0af | [
"MIT"
] | null | null | null | models.py | usxxng/SI3DP | c41d841e4e6e87b24500be68960177a2bae7f0af | [
"MIT"
] | null | null | null | models.py | usxxng/SI3DP | c41d841e4e6e87b24500be68960177a2bae7f0af | [
"MIT"
] | 2 | 2021-10-01T09:59:03.000Z | 2021-12-11T09:21:40.000Z | import torch
import torch.nn as nn
import geffnet
from resnest.torch import resnest101
from pretrainedmodels import se_resnext101_32x4d
'''
class Network_name(nn.Module):
    # Template for the network classes defined below.
    def __init__(self, enet_type, out_dim, n_meta_features=0, n_meta_dim=[512, 128], pretrained=False):
        # initialization
        enet_type : backbone network name taken from the argument
        out_dim : output layer size
        n_meta_dim : MLP hidden sizes (two base layers)
        pretrained : whether to load pre-trained weights
    def extract(self, x):
        # Extract the base network's output (deep image features)
        x = self.enet(x)
        return x
    def forward(self, x, x_meta=None):
        # Produce the final network output (including the fc layer)
'''
class Effnet_MMC(nn.Module):
def __init__(self, enet_type, out_dim, n_meta_dim=[512, 128], pretrained=False):
super(Effnet_MMC, self).__init__()
        # EfficientNet backbone created via geffnet
self.enet = geffnet.create_model(enet_type, pretrained=pretrained)
self.dropouts = nn.ModuleList([
nn.Dropout(0.5) for _ in range(5)
])
in_ch = self.enet.classifier.in_features
self.myfc = nn.Sequential(
nn.Linear(in_ch, n_meta_dim[0]),
nn.BatchNorm1d(n_meta_dim[0]),
# swish activation function
Swish_Module(),
nn.Dropout(p=0.3),
nn.Linear(n_meta_dim[0], n_meta_dim[1]),
nn.BatchNorm1d(n_meta_dim[1]),
Swish_Module(),
nn.Linear(n_meta_dim[1], out_dim)
)
self.enet.classifier = nn.Identity()
def extract(self, x):
x = self.enet(x)
return x
def forward(self, x, x_meta=None):
x = self.extract(x).squeeze(-1).squeeze(-1)
for i, dropout in enumerate(self.dropouts):
if i == 0:
out = self.myfc(dropout(x))
else:
out += self.myfc(dropout(x))
out /= len(self.dropouts)
return out
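# A minimal usage sketch (illustrative only: "tf_efficientnet_b0" and the
# 512x512 input size are assumptions, not this repository's training config):
#
#   model = Effnet_MMC("tf_efficientnet_b0", out_dim=5, pretrained=False)
#   dummy = torch.randn(2, 3, 512, 512)  # (batch, channels, height, width)
#   logits = model(dummy)                # -> shape (2, 5)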
class Effnet_MMC_Multitask(nn.Module):
def __init__(self, enet_type, out_dim, out_dim2, pretrained=False):
super(Effnet_MMC_Multitask, self).__init__()
        # EfficientNet backbone created via geffnet
self.enet = geffnet.create_model(enet_type, pretrained=pretrained)
self.dropouts = nn.ModuleList([
nn.Dropout(0.5) for _ in range(5)
])
in_ch = self.enet.classifier.in_features
self.myfc_1 = nn.Linear(in_ch, out_dim)
self.myfc_2 = nn.Linear(in_ch, out_dim2)
self.enet.classifier = nn.Identity()
def extract(self, x):
x = self.enet(x)
return x
def forward(self, x, x_meta=None):
x = self.extract(x).squeeze(-1).squeeze(-1)
for i, dropout in enumerate(self.dropouts):
if i == 0:
out1 = self.myfc_1(dropout(x))
out2 = self.myfc_2(dropout(x))
else:
out1 += self.myfc_1(dropout(x))
out2 += self.myfc_2(dropout(x))
out1 /= len(self.dropouts)
out2 /= len(self.dropouts)
return out1, out2
class Effnet_MMC_Multi_Modal(nn.Module):
def __init__(self, enet_type, out_dim, out_dim2, pretrained=False):
super(Effnet_MMC_Multi_Modal, self).__init__()
        # EfficientNet backbone created via geffnet
self.enet = geffnet.create_model(enet_type, pretrained=pretrained)
if out_dim2 == 5:
self.enet2 = geffnet.create_model(enet_type, pretrained=pretrained)
self.enet2.classifier = nn.Identity()
self.dropouts = nn.ModuleList([
nn.Dropout(0.5) for _ in range(5)
])
in_ch = self.enet.classifier.in_features
self.fc_device_closefull = nn.Linear(in_ch*2, out_dim)
self.fc_quality_closefull = nn.Linear(in_ch*2, out_dim2)
self.fc_quality_close = nn.Linear(in_ch, out_dim2)
self.enet.classifier = nn.Identity()
def extract(self, x, x2):
x = self.enet(x)
x2 = self.enet(x2)
return x, x2
def forward(self, close, full):
close = self.enet(close).squeeze(-1).squeeze(-1)
full = self.enet(full).squeeze(-1).squeeze(-1)
close_full = torch.cat((close, full), dim=1)
        # Monte Carlo dropout averaging: accumulate, then divide once
        # (the original reassigned the outputs on every iteration).
        for i, dropout in enumerate(self.dropouts):
            if i == 0:
                out1 = self.fc_device_closefull(dropout(close_full))
                out2 = self.fc_quality_closefull(dropout(close_full))
                out3 = self.fc_quality_close(dropout(close))
            else:
                out1 += self.fc_device_closefull(dropout(close_full))
                out2 += self.fc_quality_closefull(dropout(close_full))
                out3 += self.fc_quality_close(dropout(close))
        out1 /= len(self.dropouts)
        out2 /= len(self.dropouts)
        out3 /= len(self.dropouts)
        return out1, out2, out3
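# A minimal note on the fusion above (illustrative numbers, assuming an
# EfficientNet-B0 backbone where in_ch == 1280): after torch.cat, close_full
# has shape (batch, 2560), which is why the fused heads are declared as
# nn.Linear(in_ch * 2, ...).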
class Effnet_MMC_Multi_Modal_Single_Task(nn.Module):
def __init__(self, enet_type, out_dim, n_meta_dim=[512, 128], pretrained=False):
super(Effnet_MMC_Multi_Modal_Single_Task, self).__init__()
        # EfficientNet backbone created via geffnet
self.enet = geffnet.create_model(enet_type, pretrained=pretrained)
self.dropouts = nn.ModuleList([
nn.Dropout(0.5) for _ in range(5)
])
in_ch = self.enet.classifier.in_features
self.fc_device_closefull = nn.Linear(in_ch*2, out_dim)
self.fc_device_close = nn.Linear(in_ch, out_dim)
self.enet.classifier = nn.Identity()
def extract(self, x, x2):
x = self.enet(x)
x2 = self.enet(x2)
return x, x2
def forward(self, close, full):
close = self.enet(close).squeeze(-1).squeeze(-1)
full = self.enet(full).squeeze(-1).squeeze(-1)
close_full = torch.cat((close, full), dim=1)
        for i, dropout in enumerate(self.dropouts):
            if i == 0:
                out = self.fc_device_closefull(dropout(close_full))
                out2 = self.fc_device_close(dropout(close))
            else:
                out += self.fc_device_closefull(dropout(close_full))
                out2 += self.fc_device_close(dropout(close))
        out /= len(self.dropouts)
        out2 /= len(self.dropouts)
        return out, out2
class Resnest_MMC(nn.Module):
def __init__(self, enet_type, out_dim, n_meta_features=0, n_meta_dim=[512, 128], pretrained=False):
super(Resnest_MMC, self).__init__()
self.n_meta_features = n_meta_features
self.enet = resnest101(pretrained=pretrained)
self.dropouts = nn.ModuleList([
nn.Dropout(0.5) for _ in range(5)
])
in_ch = self.enet.fc.in_features
if n_meta_features > 0:
self.meta = nn.Sequential(
nn.Linear(n_meta_features, n_meta_dim[0]),
nn.BatchNorm1d(n_meta_dim[0]),
Swish_Module(),
nn.Dropout(p=0.3),
nn.Linear(n_meta_dim[0], n_meta_dim[1]),
nn.BatchNorm1d(n_meta_dim[1]),
Swish_Module(),
)
in_ch += n_meta_dim[1]
        # Mirrors the Effnet_MMC head: project in_ch -> n_meta_dim[0] first so
        # the BatchNorm1d sizes line up, and finish with a Linear to out_dim.
        self.myfc = nn.Sequential(
            nn.Linear(in_ch, n_meta_dim[0]),
            nn.BatchNorm1d(n_meta_dim[0]),
            # swish activation function
            Swish_Module(),
            nn.Dropout(p=0.3),
            nn.Linear(n_meta_dim[0], n_meta_dim[1]),
            nn.BatchNorm1d(n_meta_dim[1]),
            Swish_Module(),
            nn.Linear(n_meta_dim[1], out_dim),
        )
self.enet.fc = nn.Identity()
def extract(self, x):
x = self.enet(x)
return x
def forward(self, x, x_meta=None):
x = self.extract(x).squeeze(-1).squeeze(-1)
if self.n_meta_features > 0:
x_meta = self.meta(x_meta)
x = torch.cat((x, x_meta), dim=1)
for i, dropout in enumerate(self.dropouts):
if i == 0:
out = self.myfc(dropout(x))
else:
out += self.myfc(dropout(x))
out /= len(self.dropouts)
return out
class Seresnext_MMC(nn.Module):
def __init__(self, enet_type, out_dim, n_meta_features=0, n_meta_dim=[512, 128], pretrained=False):
super(Seresnext_MMC, self).__init__()
self.n_meta_features = n_meta_features
if pretrained:
self.enet = se_resnext101_32x4d(num_classes=1000, pretrained='imagenet')
else:
self.enet = se_resnext101_32x4d(num_classes=1000, pretrained=None)
self.enet.avg_pool = nn.AdaptiveAvgPool2d((1, 1))
self.dropouts = nn.ModuleList([
nn.Dropout(0.5) for _ in range(5)
])
in_ch = self.enet.last_linear.in_features
if n_meta_features > 0:
self.meta = nn.Sequential(
nn.Linear(n_meta_features, n_meta_dim[0]),
nn.BatchNorm1d(n_meta_dim[0]),
Swish_Module(),
nn.Dropout(p=0.3),
nn.Linear(n_meta_dim[0], n_meta_dim[1]),
nn.BatchNorm1d(n_meta_dim[1]),
Swish_Module(),
)
in_ch += n_meta_dim[1]
        # Mirrors the Effnet_MMC head: project in_ch -> n_meta_dim[0] first so
        # the BatchNorm1d sizes line up, and finish with a Linear to out_dim.
        self.myfc = nn.Sequential(
            nn.Linear(in_ch, n_meta_dim[0]),
            nn.BatchNorm1d(n_meta_dim[0]),
            # swish activation function
            Swish_Module(),
            nn.Dropout(p=0.3),
            nn.Linear(n_meta_dim[0], n_meta_dim[1]),
            nn.BatchNorm1d(n_meta_dim[1]),
            Swish_Module(),
            nn.Linear(n_meta_dim[1], out_dim),
        )
self.enet.last_linear = nn.Identity()
def extract(self, x):
x = self.enet(x)
return x
def forward(self, x, x_meta=None):
x = self.extract(x).squeeze(-1).squeeze(-1)
if self.n_meta_features > 0:
x_meta = self.meta(x_meta)
x = torch.cat((x, x_meta), dim=1)
for i, dropout in enumerate(self.dropouts):
if i == 0:
out = self.myfc(dropout(x))
else:
out += self.myfc(dropout(x))
out /= len(self.dropouts)
return out
sigmoid = nn.Sigmoid()
# swish activation function
# i.e., x multiplied by sigmoid(x)
class Swish(torch.autograd.Function):
@staticmethod
def forward(ctx, i):
result = i * sigmoid(i)
ctx.save_for_backward(i)
return result
@staticmethod
def backward(ctx, grad_output):
        i = ctx.saved_tensors[0]  # saved_variables is deprecated in modern PyTorch
sigmoid_i = sigmoid(i)
return grad_output * (sigmoid_i * (1 + i * (1 - sigmoid_i)))
class Swish_Module(nn.Module):
def forward(self, x):
return Swish.apply(x)
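# A quick sanity check for the custom autograd function above (a sketch for an
# interactive session, not part of the training code):
#
#   x = torch.randn(4, requires_grad=True)
#   assert torch.allclose(Swish.apply(x), x * torch.sigmoid(x))
#   Swish.apply(x).sum().backward()  # exercises the hand-written backward pass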
| 33.242718 | 103 | 0.581873 | 1,381 | 10,272 | 4.098479 | 0.101376 | 0.039753 | 0.04523 | 0.020671 | 0.79894 | 0.789753 | 0.774558 | 0.760601 | 0.711307 | 0.693993 | 0 | 0.029371 | 0.300623 | 10,272 | 308 | 104 | 33.350649 | 0.758491 | 0.031347 | 0 | 0.704846 | 0 | 0 | 0.000861 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.092511 | false | 0 | 0.022026 | 0.004405 | 0.215859 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
533c83958baf36c57f54afe128bd7cbc2647e207 | 23 | py | Python | src/Lib/browser/__init__.py | martinphellwig/brython_wf | e169afc1e048cba0c12118b4cd6f109df6fe67c9 | [
"BSD-3-Clause"
] | 3 | 2017-04-04T06:18:16.000Z | 2020-01-17T02:03:39.000Z | src/Lib/browser/__init__.py | martinphellwig/brython_wf | e169afc1e048cba0c12118b4cd6f109df6fe67c9 | [
"BSD-3-Clause"
] | 1 | 2017-10-20T19:11:27.000Z | 2017-10-20T19:11:27.000Z | src/Lib/browser/__init__.py | martinphellwig/brython_wf | e169afc1e048cba0c12118b4cd6f109df6fe67c9 | [
"BSD-3-Clause"
] | 8 | 2017-06-27T05:38:52.000Z | 2021-06-19T16:00:03.000Z | from _browser import *
| 11.5 | 22 | 0.782609 | 3 | 23 | 5.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173913 | 23 | 1 | 23 | 23 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
72bfe5e0e87591d2363af379bcd6a44af990d047 | 53 | py | Python | docker_lite_python/__init__.py | jeff-vincent/docker-py | efd7d9fd36b34e8c188c3f579f93e2b80c45bb48 | [
"MIT"
] | null | null | null | docker_lite_python/__init__.py | jeff-vincent/docker-py | efd7d9fd36b34e8c188c3f579f93e2b80c45bb48 | [
"MIT"
] | null | null | null | docker_lite_python/__init__.py | jeff-vincent/docker-py | efd7d9fd36b34e8c188c3f579f93e2b80c45bb48 | [
"MIT"
] | null | null | null | from docker_lite_python.docker_lite import DockerLite | 53 | 53 | 0.924528 | 8 | 53 | 5.75 | 0.75 | 0.434783 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.056604 | 53 | 1 | 53 | 53 | 0.92 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
72e3ef2a3ec0052478eff8d158ad9a5ab9dd8c4c | 37,842 | py | Python | firebot/modules/inline.py | ultra-noob/Vivek-UserBot | 6c371a4aaa0c05397efa36237e9a2118deeb0d91 | [
"MIT"
] | null | null | null | firebot/modules/inline.py | ultra-noob/Vivek-UserBot | 6c371a4aaa0c05397efa36237e9a2118deeb0d91 | [
"MIT"
] | null | null | null | firebot/modules/inline.py | ultra-noob/Vivek-UserBot | 6c371a4aaa0c05397efa36237e9a2118deeb0d91 | [
"MIT"
] | 1 | 2021-10-02T00:37:07.000Z | 2021-10-02T00:37:07.000Z | import os
import re
import urllib
from math import ceil
from re import findall
from urllib.parse import quote
import requests
from pornhub_api import PornhubApi
from search_engine_parser import GoogleSearch
from telethon import Button, custom, events, functions
from youtube_search import YoutubeSearch
# Config, bot, borg, tgbot and _phdl are used below but were never imported;
# it is assumed (unverified) that the firebot package exports them as shown.
from firebot import ALIVE_NAME, CMD_HELP, CMD_LIST, Config, bot, borg, lang, tgbot
from firebot.function import _deezer_dl, _phdl, _ytdl
from firebot.modules import inlinestats
PMPERMIT_PIC = os.environ.get("PMPERMIT_PIC", None)
if PMPERMIT_PIC is None:
WARN_PIC = "https://telegra.ph/file/3dd42b44d10528fa1f925.jpg"
else:
WARN_PIC = PMPERMIT_PIC
LOG_CHAT = Config.PRIVATE_GROUP_ID
DEFAULTUSER = str(ALIVE_NAME) if ALIVE_NAME else "Fire-X"
if lang == "si":
@tgbot.on(events.InlineQuery)
async def inline_handler(event):
builder = event.builder
result = None
query = event.text
if event.query.user_id == bot.uid and query.startswith("Fire-X"):
rev_text = query[::-1]
buttons = paginate_help(0, CMD_HELP, "helpme")
result = builder.article(
"© Userbot Help",
text="{}\nCurrently Loaded Plugins: {}".format(query, len(CMD_LIST)),
buttons=buttons,
link_preview=False,
)
await event.answer([result])
elif event.query.user_id == bot.uid and query == "stats":
result = builder.article(
title="Stats",
text=f"**Showing Stats For {DEFAULTUSER}'s Fire-XBot** \nNote --> Only Owner Can Check This \n(C) [Fire-X](https://github.com/FireXbot/Fire-X)",
buttons=[
[custom.Button.inline("Show Stats ?", data="terminator")],
[Button.url("Developed By", "https://github.com/FireXbot")],
[Button.url("Support Chat❤️", "t.me/FireXUserBot")],
],
)
await event.answer([result])
elif event.query.user_id == bot.uid and query.startswith("**Hello"):
result = builder.photo(
file=WARN_PIC,
text=query,
buttons=[
[custom.Button.inline("Spamming", data="dontspamnigga")],
[
custom.Button.inline(
"Casual Talk",
data="whattalk",
)
],
[custom.Button.inline("Requesting", data="askme")],
],
)
await event.answer([result])
@tgbot.on(
events.callbackquery.CallbackQuery( # pylint:disable=E0602
data=re.compile(b"helpme_next\((.+?)\)")
)
)
async def on_plug_in_callback_query_handler(event):
if event.query.user_id == bot.uid:
current_page_number = int(event.data_match.group(1).decode("UTF-8"))
buttons = paginate_help(current_page_number + 1, CMD_HELP, "helpme")
# https://t.me/TelethonChat/115200
await event.edit(buttons=buttons)
else:
reply_popp_up_alert = "ඔය මොකද කරන්නෙ, මේක ඔයාගෙ නෙමේ!"
await event.answer(reply_popp_up_alert, cache_time=0, alert=True)
@tgbot.on(
events.callbackquery.CallbackQuery( # pylint:disable=E0602
data=re.compile(b"helpme_prev\((.+?)\)")
)
)
async def on_plug_in_callback_query_handler(event):
if event.query.user_id == bot.uid: # pylint:disable=E0602
current_page_number = int(event.data_match.group(1).decode("UTF-8"))
buttons = paginate_help(
current_page_number - 1, CMD_HELP, "helpme" # pylint:disable=E0602
)
# https://t.me/TelethonChat/115200
await event.edit(buttons=buttons)
else:
reply_pop_up_alert = "මොන පිස්සෙක්ද තෝ? උඹටම කියල බොටෙක් හදාගනිම්.!"
await event.answer(reply_pop_up_alert, cache_time=0, alert=True)
@tgbot.on(
events.callbackquery.CallbackQuery( # pylint:disable=E0602
data=re.compile(b"us_plugin_(.*)")
)
)
async def on_plug_in_callback_query_handler(event):
if not event.query.user_id == bot.uid:
sedok = "මොන පිස්සෙක්ද තෝ? උඹටම කියල බොටෙක් හදාගනිම්."
await event.answer(sedok, cache_time=0, alert=True)
return
plugin_name = event.data_match.group(1).decode("UTF-8")
if plugin_name in CMD_HELP:
help_string = (
f"**🦹♀️ PLUGIN NAME 🦹♀️ :** `{plugin_name}` \n{CMD_HELP[plugin_name]}"
)
reply_pop_up_alert = help_string
            reply_pop_up_alert += "\n\n**(C) Fire-X**"
if len(reply_pop_up_alert) >= 4096:
crackexy = "`Pasting Your Help Menu.`"
await event.answer(crackexy, cache_time=0, alert=True)
out_file = reply_pop_up_alert
url = "https://del.dog/documents"
r = requests.post(url, data=out_file.encode("UTF-8")).json()
url = f"https://del.dog/{r['key']}"
await event.edit(
f"Pasted {plugin_name} to {url}",
link_preview=False,
buttons=[[custom.Button.inline("Go Back", data="backme")]],
)
else:
await event.edit(
message=reply_pop_up_alert,
buttons=[[custom.Button.inline("Go Back", data="backme")]],
)
@tgbot.on(events.callbackquery.CallbackQuery(data=re.compile(b"terminator")))
async def rip(event):
if event.query.user_id == bot.uid:
text = inlinestats
await event.answer(text, alert=True)
else:
txt = "You Can't View My Boss Stats"
await event.answer(txt, alert=True)
@tgbot.on(events.callbackquery.CallbackQuery(data=re.compile(b"yt_dla_(.*)")))
async def rip(event):
yt_dl_data = event.data_match.group(1).decode("UTF-8")
link_s = yt_dl_data
if event.query.user_id != bot.uid:
text = f"Please Get Your Own Fire-X And Don't Waste My Resources"
await event.answer(text, alert=True)
return
is_it = True
await _ytdl(link_s, is_it, event, tgbot)
@tgbot.on(events.callbackquery.CallbackQuery(data=re.compile(b"deezer_dl_(.*)")))
async def rip(event):
sun = event.data_match.group(1).decode("UTF-8")
if event.query.user_id != bot.uid:
text = f"Please Get Your Own FIRE-X And Don't Waste My Resources"
await event.answer(text, alert=True)
return
await _deezer_dl(sun, event, tgbot)
@tgbot.on(events.callbackquery.CallbackQuery(data=re.compile(b"yt_vid_(.*)")))
async def rip(event):
yt_dl_data = event.data_match.group(1).decode("UTF-8")
link_s = yt_dl_data
if event.query.user_id != bot.uid:
text = f"Please Get Your Own Fire-X And Don't Waste My Resources"
await event.answer(text, alert=True)
return
is_it = False
await _ytdl(link_s, is_it, event, tgbot)
@tgbot.on(events.callbackquery.CallbackQuery(data=re.compile(b"ph_dl_(.*)")))
async def rip(event):
        link_s = event.data_match.group(1).decode("UTF-8")
if event.query.user_id != bot.uid:
text = f"Please Get Your Own Fire-X And Don't Waste My Resources."
await event.answer(text, alert=True)
return
await _phdl(link_s, event, tgbot)
@tgbot.on(events.callbackquery.CallbackQuery(data=re.compile(b"dontspamnigga")))
async def rip(event):
if event.query.user_id == bot.uid:
sedok = "Boss, You Don't Need To Use This."
await event.answer(sedok, cache_time=0, alert=True)
return
await event.get_chat()
him_id = event.query.user_id
text1 = "ඔයා ඇවිත් තියෙන්නෙ හොඳ දේකට නෙමේ.. ඔයා තෝරපු එක පිළිගන්න බෑ.. ඒක නිසා ඔයාව Block කරනවා"
await event.edit("ඔයා තෝරපු එක පිළිගන්න බෑ ❌")
await borg.send_message(event.query.user_id, text1)
await borg(functions.contacts.BlockRequest(event.query.user_id))
await borg.send_message(
LOG_CHAT,
f"ආයුබෝවන්, මෝඩ [පකයා](tg://user?id={him_id}) තහන්ම් එකක් තෝරපු නිසා Block කරා",
)
@tgbot.on(events.callbackquery.CallbackQuery(data=re.compile(b"backme")))
async def sed(event):
if event.query.user_id != bot.uid:
sedok = "මොන පිස්සෙක්ද තෝ? උඹටම කියල බොටෙක් හදාගනිම්."
await event.answer(sedok, cache_time=0, alert=True)
return
await event.answer("Back", cache_time=0, alert=False)
# This Is Copy of Above Code. (C) @SpEcHiDe
buttons = paginate_help(0, CMD_HELP, "helpme")
sed = f"""Fire-X Modules Are Listed Here !\n
For More Help or Support contact {DEFAULTUSER} \nCurrently Loaded Plugins: {len(CMD_LIST)}\nCurrently using Language - Sinhala (Sinhalese)"""
await event.edit(message=sed, buttons=buttons)
@tgbot.on(events.callbackquery.CallbackQuery(data=re.compile(b"whattalk")))
async def rip(event):
if event.query.user_id == bot.uid:
sedok = "Boss, You Don't Need To Use This."
await event.answer(sedok, cache_time=0, alert=True)
return
await event.get_chat()
him_id = event.query.user_id
await event.edit("ඔයා තෝරපු එක මම පිළිගන්නවා ✔️")
text2 = "හරි දැන් මගේ අයිතිකාරයා ඔයාට මැසේජ් එකක් දානකන් ටිකක් ඉවසල ඉන්න. \nගොඩාක් ස්තූතී මැසේජ් කරාට."
await borg.send_message(event.query.user_id, text2)
await borg.send_message(
LOG_CHAT,
message=f"Hello, [අලුත් පොරක්](tg://user?id={him_id}). ඔයා එක්ක කතා කරන්න ඉල්ලනවා.",
)
@tgbot.on(events.callbackquery.CallbackQuery(data=re.compile(b"askme")))
async def rip(event):
if event.query.user_id == bot.uid:
sedok = "මහත්තයෝ, ඔයා මේක පාවිච්චි කරන්න ඕන නෑ"
await event.answer(sedok, cache_time=0, alert=True)
return
await event.get_chat()
him_id = event.query.user_id
await event.edit("ඔයා තෝරපු එක මම පිළිගන්නවා ✔️")
text3 = "හරි දැන් මගේ අයිතිකාරයා ඔයාට මැසේජ් එකක් දානකන් ටිකක් ඉවසල ඉන්න. \nගොඩාක් ස්තූතී මැසේජ් කරාට."
await borg.send_message(event.query.user_id, text3)
await borg.send_message(
LOG_CHAT,
message=f"Hello, [අලුත් පොරකට](tg://user?id={him_id}). ඔයාගෙන් දෙයක් ඉල්ලන්න තියේලු.",
)
@tgbot.on(events.callbackquery.CallbackQuery(data=re.compile(b"close")))
async def on_plug_in_callback_query_handler(event):
if event.query.user_id == bot.uid:
await event.edit("menu closed")
else:
reply_pop_up_alert = "මොන පිස්සෙක්ද තෝ? උඹටම කියල බොටෙක් හදාගනිම්. "
await event.answer(reply_pop_up_alert, cache_time=0, alert=True)
def paginate_help(page_number, loaded_modules, prefix):
number_of_rows = 8
number_of_cols = 2
helpable_modules = []
for p in loaded_modules:
if not p.startswith("_"):
helpable_modules.append(p)
helpable_modules = sorted(helpable_modules)
modules = [
custom.Button.inline(
"{} {} {}".format(
Config.EMOJI_TO_DISPLAY_IN_HELP, x, Config.EMOJI_TO_DISPLAY_IN_HELP
),
data="us_plugin_{}".format(x),
)
for x in helpable_modules
]
pairs = list(zip(modules[::number_of_cols], modules[1::number_of_cols]))
if len(modules) % number_of_cols == 1:
pairs.append((modules[-1],))
max_num_pages = ceil(len(pairs) / number_of_rows)
modulo_page = page_number % max_num_pages
if len(pairs) > number_of_rows:
pairs = pairs[
modulo_page * number_of_rows : number_of_rows * (modulo_page + 1)
] + [
(
custom.Button.inline(
"⏪ Previous", data="{}_prev({})".format(prefix, modulo_page)
),
custom.Button.inline("Close", data="close"),
custom.Button.inline(
"Next ⏩", data="{}_next({})".format(prefix, modulo_page)
),
)
]
return pairs
else:
@tgbot.on(events.InlineQuery)
async def inline_handler(event):
builder = event.builder
result = None
query = event.text
if event.query.user_id == bot.uid and query.startswith("Fire-X"):
rev_text = query[::-1]
buttons = paginate_help(0, CMD_HELP, "helpme")
result = builder.article(
"© Userbot Help",
text="{}\nCurrently Loaded Plugins: {}".format(query, len(CMD_LIST)),
buttons=buttons,
link_preview=False,
)
await event.answer([result])
elif event.query.user_id == bot.uid and query == "stats":
result = builder.article(
title="Stats",
text=f"**Showing Stats For {DEFAULTUSER}'s Fire-X** \nNote --> Only Owner Can Check This \n(C) Fire-X",
buttons=[
[custom.Button.inline("Show Stats ?", data="terminator")],
[
Button.url(
"Repo Here", "https://github.com/FireXbot/Fire-X"
)
],
[Button.url("Join Channel ❤️", "t.me/https://t.me/Fire_X_CHANNEL")],
],
)
await event.answer([result])
elif event.query.user_id == bot.uid and query.startswith("**Hello"):
result = builder.photo(
file=WARN_PIC,
text=query,
buttons=[
[custom.Button.inline("Spamming", data="dontspamnigga")],
[
custom.Button.inline(
"Casual Talk",
data="whattalk",
)
],
[custom.Button.inline("Requesting", data="askme")],
],
)
await event.answer([result])
@tgbot.on(
events.callbackquery.CallbackQuery( # pylint:disable=E0602
data=re.compile(b"helpme_next\((.+?)\)")
)
)
async def on_plug_in_callback_query_handler(event):
if event.query.user_id == bot.uid:
current_page_number = int(event.data_match.group(1).decode("UTF-8"))
buttons = paginate_help(current_page_number + 1, CMD_HELP, "helpme")
# https://t.me/TelethonChat/115200
await event.edit(buttons=buttons)
else:
reply_popp_up_alert = "Please get your own Userbot, and don't use mine!"
await event.answer(reply_popp_up_alert, cache_time=0, alert=True)
@tgbot.on(
events.callbackquery.CallbackQuery( # pylint:disable=E0602
data=re.compile(b"helpme_prev\((.+?)\)")
)
)
async def on_plug_in_callback_query_handler(event):
if event.query.user_id == bot.uid: # pylint:disable=E0602
current_page_number = int(event.data_match.group(1).decode("UTF-8"))
buttons = paginate_help(
current_page_number - 1, CMD_HELP, "helpme" # pylint:disable=E0602
)
# https://t.me/TelethonChat/115200
await event.edit(buttons=buttons)
else:
reply_pop_up_alert = "Please get your own Userbot, and don't use mine!"
await event.answer(reply_pop_up_alert, cache_time=0, alert=True)
@tgbot.on(
events.callbackquery.CallbackQuery( # pylint:disable=E0602
data=re.compile(b"us_plugin_(.*)")
)
)
async def on_plug_in_callback_query_handler(event):
if not event.query.user_id == bot.uid:
sedok = "Who The Fuck Are You? Get Your Own Fire-X ."
await event.answer(sedok, cache_time=0, alert=True)
return
plugin_name = event.data_match.group(1).decode("UTF-8")
if plugin_name in CMD_HELP:
help_string = (
f"**🦹♀️ PLUGIN NAME 🦹♀️ :** `{plugin_name}` \n{CMD_HELP[plugin_name]}"
)
reply_pop_up_alert = help_string
            reply_pop_up_alert += "\n\n**(C) @FIRE_X_CHANNEL**"
if len(reply_pop_up_alert) >= 4096:
crackexy = "`Pasting Your Help Menu.`"
await event.answer(crackexy, cache_time=0, alert=True)
out_file = reply_pop_up_alert
url = "https://del.dog/documents"
r = requests.post(url, data=out_file.encode("UTF-8")).json()
url = f"https://del.dog/{r['key']}"
await event.edit(
f"Pasted {plugin_name} to {url}",
link_preview=False,
buttons=[[custom.Button.inline("Go Back", data="backme")]],
)
else:
await event.edit(
message=reply_pop_up_alert,
buttons=[[custom.Button.inline("Go Back", data="backme")]],
)
@tgbot.on(events.callbackquery.CallbackQuery(data=re.compile(b"terminator")))
async def rip(event):
if event.query.user_id == bot.uid:
text = inlinestats
await event.answer(text, alert=True)
else:
txt = "You Can't View My Masters Stats"
await event.answer(txt, alert=True)
@tgbot.on(events.callbackquery.CallbackQuery(data=re.compile(b"yt_dla_(.*)")))
async def rip(event):
yt_dl_data = event.data_match.group(1).decode("UTF-8")
link_s = yt_dl_data
if event.query.user_id != bot.uid:
text = f"Please Get Your Own Fire-X And Don't Waste My Resources"
await event.answer(text, alert=True)
return
is_it = True
await _ytdl(link_s, is_it, event, tgbot)
@tgbot.on(events.callbackquery.CallbackQuery(data=re.compile(b"deezer_dl_(.*)")))
async def rip(event):
sun = event.data_match.group(1).decode("UTF-8")
if event.query.user_id != bot.uid:
text = f"Please Get Your Own Fire-X And Don't Waste My Resources"
await event.answer(text, alert=True)
return
await _deezer_dl(sun, event, tgbot)
@tgbot.on(events.callbackquery.CallbackQuery(data=re.compile(b"yt_vid_(.*)")))
async def rip(event):
yt_dl_data = event.data_match.group(1).decode("UTF-8")
link_s = yt_dl_data
if event.query.user_id != bot.uid:
text = f"Please Get Your Own Fire-X And Don't Waste My Resources"
await event.answer(text, alert=True)
return
is_it = False
await _ytdl(link_s, is_it, event, tgbot)
@tgbot.on(events.callbackquery.CallbackQuery(data=re.compile(b"ph_dl_(.*)")))
async def rip(event):
        link_s = event.data_match.group(1).decode("UTF-8")
if event.query.user_id != bot.uid:
text = f"Please Get Your Own Fire-X And Don't Waste My Resources."
await event.answer(text, alert=True)
return
await _phdl(link_s, event, tgbot)
@tgbot.on(events.callbackquery.CallbackQuery(data=re.compile(b"dontspamnigga")))
async def rip(event):
if event.query.user_id == bot.uid:
sedok = "Master, You Don't Need To Use This."
await event.answer(sedok, cache_time=0, alert=True)
return
await event.get_chat()
him_id = event.query.user_id
text1 = "You Have Chosed A Probhited Option. Therefore, You Have Been Blocked"
await event.edit("Choice Not Accepted ❌")
await borg.send_message(event.query.user_id, text1)
await borg(functions.contacts.BlockRequest(event.query.user_id))
await borg.send_message(
LOG_CHAT,
f"Hello, A Noob [Nibba](tg://user?id={him_id}) Selected Probhited Option, Therefore Blocked.",
)
@tgbot.on(events.callbackquery.CallbackQuery(data=re.compile(b"backme")))
async def sed(event):
if event.query.user_id != bot.uid:
sedok = "Who The Fuck Are You? Get Your Own bot."
await event.answer(sedok, cache_time=0, alert=True)
return
await event.answer("Back", cache_time=0, alert=False)
# This Is Copy of Above Code. (C) @SpEcHiDe
buttons = paginate_help(0, CMD_HELP, "helpme")
sed = f"""Fire-X Userbot Modules Are Listed Here !\n
For More Help or Support contact {DEFAULTUSER} \nCurrently Loaded Plugins: {len(CMD_LIST)}\nCurrently using Language - English (Standard)"""
await event.edit(message=sed, buttons=buttons)
@tgbot.on(events.callbackquery.CallbackQuery(data=re.compile(b"whattalk")))
async def rip(event):
if event.query.user_id == bot.uid:
sedok = "Master, You Don't Need To Use This."
await event.answer(sedok, cache_time=0, alert=True)
return
await event.get_chat()
him_id = event.query.user_id
await event.edit("Your Choice Accepted ✔️")
text2 = "Ok. Please Wait Until My Master will Approve you soon. Don't Spam Here Or Try Anything Stupid. \nThank You For Contacting Me."
await borg.send_message(event.query.user_id, text2)
await borg.send_message(
LOG_CHAT,
message=f"Hello, A [New User](tg://user?id={him_id}). Wants To Talk With You.",
)
@tgbot.on(events.callbackquery.CallbackQuery(data=re.compile(b"askme")))
async def rip(event):
if event.query.user_id == bot.uid:
sedok = "Master, You Don't Need To Use This."
await event.answer(sedok, cache_time=0, alert=True)
return
await event.get_chat()
him_id = event.query.user_id
await event.edit("Your choice is Accepted ✔️")
text3 = "Ok, Wait. My Master will reply you soon. Kindly, Wait."
await borg.send_message(event.query.user_id, text3)
await borg.send_message(
LOG_CHAT,
message=f"Hello, A [New User](tg://user?id={him_id}). Wants To Ask You Something.",
)
@tgbot.on(events.callbackquery.CallbackQuery(data=re.compile(b"close")))
async def on_plug_in_callback_query_handler(event):
if event.query.user_id == bot.uid:
await event.edit("menu closed")
else:
reply_pop_up_alert = "WTF are you Doing.. "
await event.answer(reply_pop_up_alert, cache_time=0, alert=True)
def paginate_help(page_number, loaded_modules, prefix):
number_of_rows = 8
number_of_cols = 2
helpable_modules = []
for p in loaded_modules:
if not p.startswith("_"):
helpable_modules.append(p)
helpable_modules = sorted(helpable_modules)
modules = [
custom.Button.inline(
"{} {} {}".format(
Config.EMOJI_TO_DISPLAY_IN_HELP, x, Config.EMOJI_TO_DISPLAY_IN_HELP
),
data="us_plugin_{}".format(x),
)
for x in helpable_modules
]
pairs = list(zip(modules[::number_of_cols], modules[1::number_of_cols]))
if len(modules) % number_of_cols == 1:
pairs.append((modules[-1],))
max_num_pages = ceil(len(pairs) / number_of_rows)
modulo_page = page_number % max_num_pages
if len(pairs) > number_of_rows:
pairs = pairs[
modulo_page * number_of_rows : number_of_rows * (modulo_page + 1)
] + [
(
custom.Button.inline(
"⏪ Previous", data="{}_prev({})".format(prefix, modulo_page)
),
custom.Button.inline("Close", data="close"),
custom.Button.inline(
"Next ⏩", data="{}_next({})".format(prefix, modulo_page)
),
)
]
return pairs
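    # Worked example (illustrative numbers only): with 20 helpable modules and
    # number_of_cols = 2 there are 10 button pairs; number_of_rows = 8 then
    # gives ceil(10 / 8) = 2 pages, so page 0 shows pairs[0:8] plus a
    # navigation row and page 1 shows the remaining pairs[8:10].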
@tgbot.on(events.InlineQuery(pattern=r"torrent (.*)"))
async def inline_id_handler(event: events.InlineQuery.Event):
builder = event.builder
if event.query.user_id != bot.uid:
resultm = builder.article(
title="Not Allowded",
text=f"You Can't Use This Bot. \nDeploy Fire-X To Get Your Own Assistant, Repo Link [Here](https://github.com/FireXbot/Fire-X)",
)
await event.answer([resultm])
return
testinput = event.pattern_match.group(1)
starkisnub = urllib.parse.quote_plus(testinput)
results = []
sedlyf = "https://api.sumanjay.cf/torrent/?query=" + starkisnub
    okpro = []
    try:
        # expected response: a list of dicts with keys like "name", "size",
        # "type", "age", "seeder", "leecher" and "magnet"
        okpro = requests.get(url=sedlyf, timeout=10).json()
    except Exception:
        # treat network/JSON failures as an empty result set; the original
        # bare "except: pass" left okpro undefined on failure
        pass
    sed = len(okpro)
if sed == 0:
resultm = builder.article(
title="No Results Found.",
description="Check Your Spelling / Keyword",
text="**Please, Search Again With Correct Keyword, Thank you !**",
buttons=[
[
Button.switch_inline(
"Search Again", query="torrent ", same_peer=True
)
],
],
)
await event.answer([resultm])
return
if sed > 30:
for i in range(30):
seds = okpro[i]["age"]
okpros = okpro[i]["leecher"]
sadstark = okpro[i]["magnet"]
okiknow = okpro[i]["name"]
starksize = okpro[i]["size"]
starky = okpro[i]["type"]
seeders = okpro[i]["seeder"]
okayz = f"**Title :** `{okiknow}` \n**Size :** `{starksize}` \n**Type :** `{starky}` \n**Seeder :** `{seeders}` \n**Leecher :** `{okpros}` \n**Magnet :** `{sadstark}` "
sedme = f"Size : {starksize} Type : {starky} Age : {seds}"
results.append(
await event.builder.article(
title=okiknow,
description=sedme,
text=okayz,
buttons=Button.switch_inline(
"Search Again", query="torrent ", same_peer=True
),
)
)
else:
for sedz in okpro:
seds = sedz["age"]
okpros = sedz["leecher"]
sadstark = sedz["magnet"]
okiknow = sedz["name"]
starksize = sedz["size"]
starky = sedz["type"]
seeders = sedz["seeder"]
okayz = f"**Title :** `{okiknow}` \n**Size :** `{starksize}` \n**Type :** `{starky}` \n**Seeder :** `{seeders}` \n**Leecher :** `{okpros}` \n**Magnet :** `{sadstark}` "
sedme = f"Size : {starksize} Type : {starky} Age : {seds}"
results.append(
await event.builder.article(
title=okiknow,
description=sedme,
text=okayz,
buttons=[
Button.switch_inline(
"Search Again", query="torrent ", same_peer=True
)
],
)
)
await event.answer(results)
@tgbot.on(events.InlineQuery(pattern=r"yt (.*)"))
async def inline_id_handler(event: events.InlineQuery.Event):
builder = event.builder
if event.query.user_id != bot.uid:
resultm = builder.article(
title="Not Allowded",
text=f"You Can't Use This Bot. \nDeploy Fire-X To Get Your Own Assistant, Repo Link [Here](https://github.com/FireXbot/Fire-X)",
)
await event.answer([resultm])
return
testinput = event.pattern_match.group(1)
urllib.parse.quote_plus(testinput)
results = []
moi = YoutubeSearch(testinput, max_results=9).to_dict()
if not moi:
resultm = builder.article(
title="No Results Found.",
description="Check Your Spelling / Keyword",
text="**Please, Search Again With Correct Keyword, Thank you !**",
buttons=[
[Button.switch_inline("Search Again", query="yt ", same_peer=True)],
],
)
await event.answer([resultm])
return
for moon in moi:
hmm = moon["id"]
mo = f"https://www.youtube.com/watch?v={hmm}"
kek = f"https://www.youtube.com/watch?v={hmm}"
stark_name = moon["title"]
stark_chnnl = moon["channel"]
total_stark = moon["duration"]
stark_views = moon["views"]
moon["long_desc"]
kekme = f"https://img.youtube.com/vi/{hmm}/hqdefault.jpg"
okayz = f"**Title :** `{stark_name}` \n**Link :** `{kek}` \n**Channel :** `{stark_chnnl}` \n**Views :** `{stark_views}` \n**Duration :** `{total_stark}`"
hmmkek = f"Video Name : {stark_name} \nChannel : {stark_chnnl} \nDuration : {total_stark} \nViews : {stark_views}"
results.append(
await event.builder.document(
file=kekme,
title=stark_name,
description=hmmkek,
text=okayz,
include_media=True,
buttons=[
[custom.Button.inline("Download Video - mp4", data=f"yt_vid_{mo}")],
[custom.Button.inline("Download Audio - mp3", data=f"yt_dla_{mo}")],
[Button.switch_inline("Search Again", query="yt ", same_peer=True)],
],
)
)
await event.answer(results)
@tgbot.on(events.InlineQuery(pattern=r"jm (.*)"))
async def inline_id_handler(event: events.InlineQuery.Event):
builder = event.builder
if event.query.user_id != bot.uid:
resultm = builder.article(
title="Not Allowded",
text=f"You Can't Use This Bot. \nDeploy Fire-X To Get Your Own Assistant, Repo Link [Here](https://github.com/FireXbot/Fire-X)",
)
await event.answer([resultm])
return
testinput = event.pattern_match.group(1)
starkisnub = urllib.parse.quote_plus(testinput)
results = []
search = f"http://starkmusic.herokuapp.com/result/?query={starkisnub}"
seds = requests.get(url=search).json()
for okz in seds:
okz["album"]
okmusic = okz["music"]
hmmstar = okz["perma_url"]
singer = okz["singers"]
hmm = okz["duration"]
langs = okz["language"]
hidden_url = okz["media_url"]
okayz = (
f"**Song Name :** `{okmusic}` \n**Singer :** `{singer}` \n**Song Url :** `{hmmstar}`"
f"\n**Language :** `{langs}` \n**Download Able Url :** `{hidden_url}`"
f"\n**Duration :** `{hmm}`"
)
hmmkek = (
f"Song : {okmusic} Singer : {singer} Duration : {hmm} \nLanguage : {langs}"
)
results.append(
await event.builder.article(
title=okmusic,
description=hmmkek,
text=okayz,
buttons=Button.switch_inline(
"Search Again", query="jm ", same_peer=True
),
)
)
await event.answer(results)
@tgbot.on(events.InlineQuery(pattern=r"google (.*)"))
async def inline_id_handler(event: events.InlineQuery.Event):
builder = event.builder
if event.query.user_id != bot.uid:
resultm = builder.article(
title="- Not Allowded -",
text=f"You Can't Use This Bot. \nDeploy Fire-X To Get Your Own Assistant, Repo Link [Here](https://github.com/FireXbot/Fire-X)",
)
await event.answer([resultm])
return
results = []
match = event.pattern_match.group(1)
page = findall(r"page=\d+", match)
try:
page = page[0]
page = page.replace("page=", "")
match = match.replace("page=" + page[0], "")
except IndexError:
page = 1
search_args = (str(match), int(page))
gsearch = GoogleSearch()
gresults = await gsearch.async_search(*search_args)
for i in range(len(gresults["links"])):
try:
title = gresults["titles"][i]
link = gresults["links"][i]
desc = gresults["descriptions"][i]
okiknow = f"**GOOGLE - SEARCH** \n[{title}]({link})\n\n`{desc}`"
results.append(
await event.builder.article(
title=title,
description=desc,
text=okiknow,
buttons=[
Button.switch_inline(
"Search Again", query="google ", same_peer=True
)
],
)
)
except IndexError:
break
await event.answer(results)
@tgbot.on(events.InlineQuery(pattern=r"ph (.*)"))
async def inline_id_handler(event: events.InlineQuery.Event):
builder = event.builder
if event.query.user_id != bot.uid:
resultm = builder.article(
title="- Not Allowded -",
text=f"You Can't Use This Bot. \nDeploy Fire-X To Get Your Own Assistant, Repo Link [Here](https://github.com/FireXbot/Fire-X)",
)
await event.answer([resultm])
return
results = []
input_str = event.pattern_match.group(1)
api = PornhubApi()
data = api.search.search(input_str, ordering="mostviewed")
ok = 1
for vid in data.videos:
if ok <= 5:
lul_m = f"**PORN-HUB SEARCH** \n**Video title :** `{vid.title}` \n**Video link :** `https://www.pornhub.com/view_video.php?viewkey={vid.video_id}`"
results.append(
await event.builder.article(
title=vid.title,
text=lul_m,
buttons=[
Button.switch_inline(
"Search Again", query="ph ", same_peer=True
)
],
)
)
            ok += 1  # advance the counter so the 5-result cap actually applies
        else:
            break
await event.answer(results)
@tgbot.on(events.InlineQuery(pattern=r"xkcd (.*)"))
async def inline_id_handler(event: events.InlineQuery.Event):
builder = event.builder
if event.query.user_id != bot.uid:
resultm = builder.article(
title="- Not Allowded -",
text=f"You Can't Use This Bot. \nDeploy Fire-X To Get Your Own Assistant, Repo Link [Here](https://github.com/FireXbot/Fire-X)",
)
await event.answer([resultm])
return
input_str = event.pattern_match.group(1)
xkcd_id = None
if input_str:
if input_str.isdigit():
xkcd_id = input_str
else:
xkcd_search_url = "https://relevantxkcd.appspot.com/process?"
queryresult = requests.get(
xkcd_search_url, params={"action": "xkcd", "query": quote(input_str)}
).text
xkcd_id = queryresult.split(" ")[2].lstrip("\n")
if xkcd_id is None:
xkcd_url = "https://xkcd.com/info.0.json"
else:
xkcd_url = "https://xkcd.com/{}/info.0.json".format(xkcd_id)
r = requests.get(xkcd_url)
if r.ok:
data = r.json()
year = data.get("year")
month = data["month"].zfill(2)
day = data["day"].zfill(2)
xkcd_link = "https://xkcd.com/{}".format(data.get("num"))
safe_title = data.get("safe_title")
data.get("transcript")
alt = data.get("alt")
img = data.get("img")
data.get("title")
output_str = """
[XKCD]({})
Title: {}
Alt: {}
Day: {}
Month: {}
Year: {}""".format(
xkcd_link, safe_title, alt, day, month, year
)
lul_k = builder.photo(file=img, text=output_str)
await event.answer([lul_k])
else:
resultm = builder.article(title="- No Results :/ -", text=f"No Results Found !")
await event.answer([resultm])
@tgbot.on(events.InlineQuery(pattern=r"deezer ?(.*)"))
async def inline_id_handler(event):
builder = event.builder
if event.query.user_id != bot.uid:
resultm = builder.article(
title="- Not Allowded -",
text=f"You Can't Use This Bot. \nDeploy Fire-X To Get Your Own Assistant, Repo Link [Here](https://github.com/FireXbot/Fire-X)",
)
await event.answer([resultm])
return
results = []
input_str = event.pattern_match.group(1)
link = f"https://api.deezer.com/search?q={input_str}&limit=7"
dato = requests.get(url=link).json()
# data_s = json.loads(data_s)
    for match in dato.get("data", []):  # tolerate a missing "data" key
hmm_m = f"Title : {match['title']} \nLink : {match['link']} \nDuration : {match['duration']} seconds \nBy : {match['artist']['name']}"
results.append(
await event.builder.document(
file=match["album"]["cover_medium"],
title=match["title"],
text=hmm_m,
description=f"Artist: {match['artist']['name']}\nAlbum: {match['album']['title']}",
buttons=[
[
custom.Button.inline(
"Download Audio - mp3", data=f"deezer_dl_{match['title']}"
)
],
[
Button.switch_inline(
"Search Again", query="deezer ", same_peer=True
)
],
],
),
)
if results:
try:
await event.answer(results)
except TypeError:
pass
| 40.822006 | 180 | 0.553301 | 4,855 | 37,842 | 4.237693 | 0.109784 | 0.0418 | 0.042772 | 0.041217 | 0.800282 | 0.793331 | 0.78293 | 0.764168 | 0.75367 | 0.751239 | 0 | 0.007592 | 0.314333 | 37,842 | 926 | 181 | 40.866091 | 0.776043 | 0.011971 | 0 | 0.667436 | 0 | 0.039261 | 0.211635 | 0.011212 | 0 | 0 | 0 | 0 | 0 | 1 | 0.002309 | false | 0.003464 | 0.016166 | 0 | 0.051963 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
f41e87a10c471f4cc61a54b411ef37230acc98f5 | 128 | py | Python | graphene_django/utils/str_converters.py | radekwlsk/graphene-django | b552dcac24364d3ef824f865ba419c74605942b2 | [
"MIT"
] | 2 | 2020-11-20T08:04:31.000Z | 2020-11-20T08:04:33.000Z | graphene_django/utils/str_converters.py | radekwlsk/graphene-django | b552dcac24364d3ef824f865ba419c74605942b2 | [
"MIT"
] | 9 | 2021-03-30T13:56:06.000Z | 2021-09-22T19:27:32.000Z | graphene_django/utils/str_converters.py | radekwlsk/graphene-django | b552dcac24364d3ef824f865ba419c74605942b2 | [
"MIT"
] | null | null | null | import re
from unidecode import unidecode
def to_const(string):
return re.sub(r"[\W|^]+", "_", unidecode(string)).upper()
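# Quick illustrations of the conversion (not part of the original module):
#   to_const("Hello, World!")  ->  "HELLO_WORLD_"
#   to_const("Fußball 2024")   ->  "FUSSBALL_2024"   (unidecode folds "ß" to "ss")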
| 18.285714 | 61 | 0.679688 | 18 | 128 | 4.722222 | 0.722222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.140625 | 128 | 6 | 62 | 21.333333 | 0.772727 | 0 | 0 | 0 | 0 | 0 | 0.0625 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.5 | 0.25 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 6 |
f43150c10ac118b283e85652524b3a462c677ce5 | 8,254 | py | Python | tests/pytests/functional/states/file/test__check_directory_win.py | babs/salt | c536ea716d5308880b244e7980f4b659d86fc104 | [
"Apache-2.0"
] | 2 | 2015-08-21T01:05:03.000Z | 2015-09-02T07:30:45.000Z | tests/pytests/functional/states/file/test__check_directory_win.py | babs/salt | c536ea716d5308880b244e7980f4b659d86fc104 | [
"Apache-2.0"
] | 2 | 2021-04-30T21:38:06.000Z | 2021-12-13T20:51:39.000Z | tests/pytests/functional/states/file/test__check_directory_win.py | babs/salt | c536ea716d5308880b244e7980f4b659d86fc104 | [
"Apache-2.0"
] | 1 | 2020-06-02T14:15:24.000Z | 2020-06-02T14:15:24.000Z | import pytest
import salt.states.file as file
import salt.utils.win_dacl as win_dacl
pytestmark = [pytest.mark.windows_whitelisted, pytest.mark.skip_unless_on_windows]
@pytest.fixture
def configure_loader_modules():
return {
file: {"__opts__": {"test": False}},
}
def test__check_directory_win_owner(tmp_path):
path = str(tmp_path)
_, comment, changes = file._check_directory_win(name=path, win_owner="Everyone")
assert path in comment
assert changes == {"owner": "Everyone"}
def test__check_directory_win_grant_perms_basic(tmp_path):
path = str(tmp_path)
perms = {
"Guest": {
"applies_to": "this_folder_subfolders_files",
"perms": "full_control",
}
}
expected = {
"grant_perms": {
"Guest": {
"applies_to": "this_folder_subfolders_files",
"permissions": "full_control",
}
}
}
_, comment, changes = file._check_directory_win(name=path, win_perms=perms)
assert path in comment
assert changes == expected
def test__check_directory_win_grant_perms_basic_existing_user(tmp_path):
path = str(tmp_path)
win_dacl.set_permissions(
obj_name=path,
principal="Guest",
permissions=["write_data", "write_attributes"],
access_mode="grant",
)
perms = {"Guest": {"perms": "full_control"}}
expected = {"grant_perms": {"Guest": {"permissions": "full_control"}}}
_, comment, changes = file._check_directory_win(name=path, win_perms=perms)
assert path in comment
assert changes == expected
def test__check_directory_win_grant_perms_advanced(tmp_path):
path = str(tmp_path)
perms = {
"Guest": {
"applies_to": "this_folder_subfolders_files",
"perms": ["read_data", "write_data", "create_files"],
}
}
expected = {
"grant_perms": {
"Guest": {
"applies_to": "this_folder_subfolders_files",
"permissions": ["read_data", "write_data", "create_files"],
}
}
}
_, comment, changes = file._check_directory_win(name=path, win_perms=perms)
assert path in comment
assert changes == expected
def test__check_directory_win_grant_perms_advanced_existing_user(tmp_path):
path = str(tmp_path)
win_dacl.set_permissions(
obj_name=path,
principal="Guest",
permissions="full_control",
access_mode="grant",
)
perms = {
"Guest": {
"applies_to": "this_folder_subfolders_files",
"perms": ["read_data", "write_data", "create_files"],
}
}
expected = {
"grant_perms": {
"Guest": {"permissions": ["read_data", "write_data", "create_files"]}
}
}
_, comment, changes = file._check_directory_win(name=path, win_perms=perms)
assert path in comment
assert changes == expected
def test__check_directory_win_grant_perms_basic_no_applies_to(tmp_path):
path = str(tmp_path)
perms = {"Guest": {"perms": "full_control"}}
expected = {"grant_perms": {"Guest": {"permissions": "full_control"}}}
_, comment, changes = file._check_directory_win(name=path, win_perms=perms)
assert path in comment
assert changes == expected
def test__check_directory_win_deny_perms_basic(tmp_path):
path = str(tmp_path)
perms = {
"Guest": {
"applies_to": "this_folder_subfolders_files",
"perms": "full_control",
}
}
expected = {
"deny_perms": {
"Guest": {
"applies_to": "this_folder_subfolders_files",
"permissions": "full_control",
}
}
}
_, comment, changes = file._check_directory_win(name=path, win_deny_perms=perms)
assert path in comment
assert changes == expected
def test__check_directory_win_deny_perms_basic_existing_user(tmp_path):
path = str(tmp_path)
win_dacl.set_permissions(
obj_name=path,
principal="Guest",
permissions=["write_data", "write_attributes"],
access_mode="deny",
)
perms = {"Guest": {"perms": "full_control"}}
expected = {"deny_perms": {"Guest": {"permissions": "full_control"}}}
_, comment, changes = file._check_directory_win(name=path, win_deny_perms=perms)
assert path in comment
assert changes == expected
def test__check_directory_win_deny_perms_advanced(tmp_path):
path = str(tmp_path)
perms = {
"Guest": {
"applies_to": "this_folder_subfolders_files",
"perms": ["read_data", "write_data", "create_files"],
}
}
expected = {
"deny_perms": {
"Guest": {
"applies_to": "this_folder_subfolders_files",
"permissions": ["read_data", "write_data", "create_files"],
}
}
}
_, comment, changes = file._check_directory_win(name=path, win_deny_perms=perms)
assert path in comment
assert changes == expected
def test__check_directory_win_deny_perms_advanced_existing_user(tmp_path):
path = str(tmp_path)
win_dacl.set_permissions(
obj_name=path,
principal="Guest",
permissions="full_control",
access_mode="deny",
)
perms = {
"Guest": {
"applies_to": "this_folder_subfolders_files",
"perms": ["read_data", "write_data", "create_files"],
}
}
expected = {
"deny_perms": {
"Guest": {"permissions": ["read_data", "write_data", "create_files"]}
}
}
_, comment, changes = file._check_directory_win(name=path, win_deny_perms=perms)
assert path in comment
assert changes == expected
def test__check_directory_win_deny_perms_basic_no_applies_to(tmp_path):
path = str(tmp_path)
perms = {"Guest": {"perms": "full_control"}}
expected = {"deny_perms": {"Guest": {"permissions": "full_control"}}}
_, comment, changes = file._check_directory_win(name=path, win_deny_perms=perms)
assert path in comment
assert changes == expected
def test__check_directory_win_inheritance(tmp_path):
path = str(tmp_path)
expected = {}
_, comment, changes = file._check_directory_win(name=path, win_inheritance=True)
assert path in comment
assert changes == expected
def test__check_directory_win_inheritance_false(tmp_path):
path = str(tmp_path)
expected = {"inheritance": False}
_, comment, changes = file._check_directory_win(name=path, win_inheritance=False)
assert path in comment
assert changes == expected
def test__check_directory_reset_no_non_inherited_users(tmp_path):
path = str(tmp_path)
expected = {}
_, comment, changes = file._check_directory_win(name=path, win_perms_reset=True)
assert path in comment
assert changes == expected
def test__check_directory_reset_non_inherited_users_grant(tmp_path):
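    # With win_perms_reset=True, explicitly set (non-inherited) ACEs such as
    # this Guest grant should be flagged under "remove_perms".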
path = str(tmp_path)
win_dacl.set_permissions(
obj_name=path,
principal="Guest",
permissions="full_control",
access_mode="grant",
reset_perms=True,
)
expected = {
"remove_perms": {
"Guest": {
"grant": {
"applies to": "This folder, subfolders and files",
"permissions": "Full control",
}
}
}
}
_, comment, changes = file._check_directory_win(name=path, win_perms_reset=True)
assert path in comment
assert changes == expected
def test__check_directory_reset_non_inherited_users_deny(tmp_path):
path = str(tmp_path)
win_dacl.set_permissions(
obj_name=path,
principal="Guest",
permissions="full_control",
access_mode="deny",
reset_perms=True,
)
expected = {
"remove_perms": {
"Guest": {
"deny": {
"applies to": "This folder, subfolders and files",
"permissions": "Full control",
}
}
}
}
_, comment, changes = file._check_directory_win(name=path, win_perms_reset=True)
assert path in comment
assert changes == expected
| 30.345588 | 85 | 0.624425 | 918 | 8,254 | 5.202614 | 0.08061 | 0.093802 | 0.103224 | 0.070352 | 0.93907 | 0.932161 | 0.921064 | 0.899079 | 0.893007 | 0.883375 | 0 | 0 | 0.260964 | 8,254 | 271 | 86 | 30.457565 | 0.782951 | 0 | 0 | 0.686441 | 0 | 0 | 0.183305 | 0.033923 | 0 | 0 | 0 | 0 | 0.135593 | 1 | 0.072034 | false | 0 | 0.012712 | 0.004237 | 0.088983 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
be70c723b79fb31ca3e8114689cbfcacd6a8441e | 371 | py | Python | Exercicios/108.py | Geisianny/Curso-em-Video-M2-e-M3 | 4f22145184436518815d76eff6c55d171213f699 | [
"MIT"
] | null | null | null | Exercicios/108.py | Geisianny/Curso-em-Video-M2-e-M3 | 4f22145184436518815d76eff6c55d171213f699 | [
"MIT"
] | null | null | null | Exercicios/108.py | Geisianny/Curso-em-Video-M2-e-M3 | 4f22145184436518815d76eff6c55d171213f699 | [
"MIT"
] | null | null | null | import moeda  # could also be aliased: import moeda as mo
# Assumes the course's local moeda module, which provides moeda() for
# currency formatting plus metade(), dobro(), aumentar() and diminuir().
preço = float(input('Enter the price: R$'))
print(f'Half of {moeda.moeda(preço)} is {moeda.moeda(moeda.metade(preço))}')
print(f'Double {moeda.moeda(preço)} is {moeda.moeda(moeda.dobro(preço))}')
print(f'Increasing by 10% gives {moeda.moeda(moeda.aumentar(preço, 10))}')
print(f'Decreasing by 13% gives {moeda.moeda(moeda.diminuir(preço, 13))}')
| 41.222222 | 79 | 0.703504 | 61 | 371 | 4.278689 | 0.393443 | 0.383142 | 0.229885 | 0.130268 | 0.252874 | 0.252874 | 0.252874 | 0.252874 | 0 | 0 | 0 | 0.023881 | 0.097035 | 371 | 8 | 80 | 46.375 | 0.755224 | 0.013477 | 0 | 0 | 0 | 0.333333 | 0.763736 | 0.370879 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.166667 | 0.666667 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
be8e7e89909fa694d62aba1af7fb24b8236a99d9 | 2,950 | py | Python | interpreter/src/virtual_machine/test/vm/test_io.py | Cdayz/simple_lang | dc19d6ef76bb69c87981c8b826cf8f71b0cc475b | [
"MIT"
] | 3 | 2019-08-22T01:20:16.000Z | 2021-02-05T09:11:50.000Z | interpreter/src/virtual_machine/test/vm/test_io.py | Cdayz/simple_lang | dc19d6ef76bb69c87981c8b826cf8f71b0cc475b | [
"MIT"
] | null | null | null | interpreter/src/virtual_machine/test/vm/test_io.py | Cdayz/simple_lang | dc19d6ef76bb69c87981c8b826cf8f71b0cc475b | [
"MIT"
] | 2 | 2019-08-22T01:20:18.000Z | 2021-05-27T14:40:12.000Z | import io
import struct
import mock
import pytest
from interpreter.src.virtual_machine.vm.io_ops import (
vm_input,
vm_print,
VmState
)
from interpreter.src.virtual_machine.test.vm.test_binary_ops import (
gen_bytecode
)
def test_vm_print():
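    # Each encoded instruction is 12 bytes ('=hbibi' packs 2+1+4+1+4), so a
    # successful PRINT should advance vm_code_pointer by 12 for register,
    # indirect and immediate operands alike.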
base_state = VmState(
vm_code_buffer=io.BytesIO(gen_bytecode("PRINT r1")),
vm_code_pointer=0,
)
with mock.patch('interpreter.src.virtual_machine.vm.io_ops.print') as p:
p.return_value = 1
state = vm_print(base_state)
p.assert_called_with("VM PRINT: 0")
assert state.vm_code_pointer == 12
base_state = VmState(
vm_code_buffer=io.BytesIO(gen_bytecode("PRINT @r1")),
vm_code_pointer=0,
)
with mock.patch('interpreter.src.virtual_machine.vm.io_ops.print') as p:
p.return_value = 1
state = vm_print(base_state)
p.assert_called_with("VM PRINT: 0")
assert state.vm_code_pointer == 12
base_state = VmState(
vm_code_buffer=io.BytesIO(gen_bytecode("PRINT 12")),
vm_code_pointer=0,
)
with mock.patch('interpreter.src.virtual_machine.vm.io_ops.print') as p:
p.return_value = 1
state = vm_print(base_state)
p.assert_called_with("VM PRINT: 12")
assert state.vm_code_pointer == 12
def test_vm_print_error():
bcode = gen_bytecode("PRINT r1")
op_code = struct.unpack('=hbibi', bcode)
op_code = list(op_code)
op_code[1] = 0
bcode = struct.pack('=hbibi', *op_code)
base_state = VmState(
vm_code_buffer=io.BytesIO(bcode),
vm_code_pointer=0,
)
with pytest.raises(Exception):
vm_print(base_state)
def test_vm_input():
base_state = VmState(
vm_code_buffer=io.BytesIO(gen_bytecode("INPUT r1")),
vm_code_pointer=0,
)
with mock.patch('interpreter.src.virtual_machine.vm.io_ops.input') as inp:
inp.side_effect = ['a', 1]
state = vm_input(base_state)
assert state.vm_code_pointer == 12
assert state.vm_registers[0].value == 1
base_state = VmState(
vm_code_buffer=io.BytesIO(gen_bytecode("INPUT @r1")),
vm_code_pointer=0,
)
with mock.patch('interpreter.src.virtual_machine.vm.io_ops.input') as inp:
inp.side_effect = ['a', 1]
state = vm_input(base_state)
assert state.vm_code_pointer == 12
mem_addr = state.vm_registers[0].value
assert state.vm_memory[mem_addr] == 1
def test_vm_input_error():
bcode = gen_bytecode("INPUT r1")
op_code = struct.unpack('=hbibi', bcode)
op_code = list(op_code)
op_code[1] = 0
bcode = struct.pack('=hbibi', *op_code)
base_state = VmState(
vm_code_buffer=io.BytesIO(bcode),
vm_code_pointer=0,
)
with mock.patch('interpreter.src.virtual_machine.vm.io_ops.input') as inp:
inp.side_effect = ['a', 1]
with pytest.raises(Exception):
vm_input(base_state)
| 24.180328 | 78 | 0.653559 | 429 | 2,950 | 4.207459 | 0.128205 | 0.063158 | 0.086427 | 0.1241 | 0.841551 | 0.769529 | 0.755125 | 0.735734 | 0.735734 | 0.735734 | 0 | 0.018926 | 0.229831 | 2,950 | 121 | 79 | 24.380165 | 0.775528 | 0 | 0 | 0.581395 | 0 | 0 | 0.135932 | 0.095593 | 0 | 0 | 0 | 0 | 0.116279 | 1 | 0.046512 | false | 0 | 0.069767 | 0 | 0.116279 | 0.116279 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
fe2c456a8cb6bdfadd6ece00fd2ac4600af233e6 | 202 | py | Python | packages/plugins/model-define/tensorflow-cycle-gan-model-define/CycleGAN/models/base.py | CandyQiu/pipcook | 12d482d6dcfb828bf80fcf908aee2c8ba5e9bd8a | [
"Apache-2.0"
] | 2 | 2020-04-21T05:49:02.000Z | 2021-03-01T15:14:29.000Z | packages/plugins/model-define/tensorflow-cycle-gan-model-define/CycleGAN/models/base.py | CandyQiu/pipcook | 12d482d6dcfb828bf80fcf908aee2c8ba5e9bd8a | [
"Apache-2.0"
] | null | null | null | packages/plugins/model-define/tensorflow-cycle-gan-model-define/CycleGAN/models/base.py | CandyQiu/pipcook | 12d482d6dcfb828bf80fcf908aee2c8ba5e9bd8a | [
"Apache-2.0"
] | null | null | null | class BaseModel(object):
    name = 'BaseModel'
    # Abstract base: subclasses are expected to override these stubs.
    # NotImplementedError is the correct exception to raise here; bare
    # NotImplemented is a constant, not an exception class.
    def __init__(self):
        raise NotImplementedError
    def save(self):
        raise NotImplementedError
    def plot(self):
        raise NotImplementedError
| 16.833333 | 28 | 0.628713 | 20 | 202 | 6.15 | 0.55 | 0.219512 | 0.560976 | 0.422764 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.29703 | 202 | 11 | 29 | 18.363636 | 0.866197 | 0 | 0 | 0.375 | 0 | 0 | 0.044554 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.375 | false | 0 | 0 | 0 | 0.625 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
feb1933acbcc0ffe8d9da17628a0ce7a8695e401 | 5,422 | py | Python | test/test_format_addresses.py | jwodder/email2dict | 42596a2dafc764e28a9f7bb3f9058f89fe4bab9b | [
"MIT"
] | 1 | 2021-11-30T03:54:00.000Z | 2021-11-30T03:54:00.000Z | test/test_format_addresses.py | jwodder/mailbits | 42596a2dafc764e28a9f7bb3f9058f89fe4bab9b | [
"MIT"
] | null | null | null | test/test_format_addresses.py | jwodder/mailbits | 42596a2dafc764e28a9f7bb3f9058f89fe4bab9b | [
"MIT"
] | null | null | null | from email.headerregistry import Address, Group
import sys
from typing import List, Union
import pytest
from mailbits import format_addresses
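# Each case pairs a mixed list of plain strings, Address and Group objects
# with the display form format_addresses is expected to render.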
@pytest.mark.parametrize(
"addresses,fmt",
[
([], ""),
(["foo@example.com"], "foo@example.com"),
(["foo@example.com", "bar@example.org"], "foo@example.com, bar@example.org"),
(
[Address("Fabian Oo", addr_spec="foo@example.com")],
"Fabian Oo <foo@example.com>",
),
(
[
Address("Fabian Oo", addr_spec="foo@example.com"),
Address("Bastian Arrr", addr_spec="bar@example.org"),
],
"Fabian Oo <foo@example.com>, Bastian Arrr <bar@example.org>",
),
(
[Address("Fabian O. Oh", addr_spec="foo@example.com")],
'"Fabian O. Oh" <foo@example.com>',
),
(
[Address("Zoë Façade", addr_spec="zoe.facade@naïveté.fr")],
"Zoë Façade <zoe.facade@naïveté.fr>",
),
(
[
Group("undisclosed recipients", ()),
"luser@example.nil",
Group(
"friends",
(
Address("", addr_spec="you@there.net"),
Address("Thaddeus Hem", addr_spec="them@hither.yon"),
),
),
],
"undisclosed recipients:;, luser@example.nil,"
" friends: you@there.net, Thaddeus Hem <them@hither.yon>;",
),
(
[
Address(
"John Jacob Jingleheimer Smith",
addr_spec="john.jacob.jingleheimer.smith@his-name-is-my-name-too.com",
),
Address(
"Jebediah Obadiah Zachariah Jedediah Springfield",
addr_spec="jebediah.obadiah.zachariah.jedediah.springfield@simpsons.state",
),
],
"John Jacob Jingleheimer Smith <john.jacob.jingleheimer.smith@his-name-is-my-name-too.com>, Jebediah Obadiah Zachariah Jedediah Springfield <jebediah.obadiah.zachariah.jedediah.springfield@simpsons.state>",
),
],
)
def test_format_addresses(
addresses: List[Union[str, Address, Group]], fmt: str
) -> None:
assert format_addresses(addresses) == fmt
@pytest.mark.parametrize(
"addresses,fmt",
[
([], ""),
(["foo@example.com"], "foo@example.com"),
(["foo@example.com", "bar@example.org"], "foo@example.com, bar@example.org"),
(
[Address("Fabian Oo", addr_spec="foo@example.com")],
"Fabian Oo <foo@example.com>",
),
(
[
Address("Fabian Oo", addr_spec="foo@example.com"),
Address("Bastian Arrr", addr_spec="bar@example.org"),
],
"Fabian Oo <foo@example.com>, Bastian Arrr <bar@example.org>",
),
(
[Address("Fabian O. Oh", addr_spec="foo@example.com")],
'"Fabian O. Oh" <foo@example.com>',
),
(
[Address("Zoe Facade", addr_spec="zoe.facade@naïveté.fr")],
"Zoe Facade <zoe.facade@xn--navet-fsa2b.fr>",
),
pytest.param(
[Address("Zoë Façade", addr_spec="zoe.facade@naïveté.fr")],
"=?utf-8?q?Zo=C3=AB_Fa=C3=A7ade?= <zoe.facade@xn--navet-fsa2b.fr>",
marks=pytest.mark.xfail(
sys.version_info[:2] < (3, 7),
reason="Cannot encode non-ASCII display names on pre-Python 3.7",
),
),
(
[
Group(
"internationalized",
(
Address("Zoe Facade", addr_spec="zoe.facade@naïveté.fr"),
Address(addr_spec="wong@example.珠宝"),
),
),
],
"internationalized: Zoe Facade <zoe.facade@xn--navet-fsa2b.fr>, wong@example.xn--pbt977c;",
),
(
[
Group("undisclosed recipients", ()),
"luser@example.nil",
Group(
"friends",
(
Address("", addr_spec="you@there.net"),
Address("Thaddeus Hem", addr_spec="them@hither.yon"),
),
),
],
"undisclosed recipients:;, luser@example.nil,"
" friends: you@there.net, Thaddeus Hem <them@hither.yon>;",
),
(
[
Address(
"John Jacob Jingleheimer Smith",
addr_spec="john.jacob.jingleheimer.smith@his-name-is-my-name-too.com",
),
Address(
"Jebediah Obadiah Zachariah Jedediah Springfield",
addr_spec="jebediah.obadiah.zachariah.jedediah.springfield@simpsons.state",
),
],
"John Jacob Jingleheimer Smith <john.jacob.jingleheimer.smith@his-name-is-my-name-too.com>, Jebediah Obadiah Zachariah Jedediah Springfield <jebediah.obadiah.zachariah.jedediah.springfield@simpsons.state>",
),
],
)
def test_format_addresses_encode(
addresses: List[Union[str, Address, Group]], fmt: str
) -> None:
assert format_addresses(addresses, encode=True) == fmt
| 36.635135 | 218 | 0.495205 | 515 | 5,422 | 5.153398 | 0.201942 | 0.063301 | 0.097965 | 0.078372 | 0.858704 | 0.858704 | 0.850038 | 0.850038 | 0.825923 | 0.767898 | 0 | 0.004338 | 0.362228 | 5,422 | 147 | 219 | 36.884354 | 0.763158 | 0 | 0 | 0.664336 | 0 | 0.041958 | 0.416267 | 0.137219 | 0 | 0 | 0 | 0 | 0.013986 | 1 | 0.013986 | false | 0 | 0.034965 | 0 | 0.048951 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
22d65e4975504eadb3086d95ecee92cee6710a6e | 48 | py | Python | src/dewloosh/geom/topo/__init__.py | dewloosh/dewloosh-geom | 5c97fbab4b68f4748bf4309184b9e0e877f94cd6 | [
"MIT"
] | 2 | 2021-12-11T17:25:51.000Z | 2022-01-06T15:36:27.000Z | src/dewloosh/geom/topo/__init__.py | dewloosh/dewloosh-geom | 5c97fbab4b68f4748bf4309184b9e0e877f94cd6 | [
"MIT"
] | null | null | null | src/dewloosh/geom/topo/__init__.py | dewloosh/dewloosh-geom | 5c97fbab4b68f4748bf4309184b9e0e877f94cd6 | [
"MIT"
] | null | null | null | from .topo import *
from .topologyarray import * | 24 | 28 | 0.770833 | 6 | 48 | 6.166667 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.145833 | 48 | 2 | 28 | 24 | 0.902439 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a3fc2a0780ae0a7912159f4b1fb2fe36be1f098a | 92 | py | Python | notes-n-resources/Data-Structures-N-Algo/_DS-n-Algos/__MY_OPRIGINAL_DS/_Extra-Practice/03_recursion/python/07_sum_array.py | side-projects-42/INTERVIEW-PREP-COMPLETE | 627a3315cee4bbc38a0e81c256f27f928eac2d63 | [
"MIT"
] | 13 | 2021-03-11T00:25:22.000Z | 2022-03-19T00:19:23.000Z | notes-n-resources/Data-Structures-N-Algo/_DS-n-Algos/__MY_OPRIGINAL_DS/_Extra-Practice/03_recursion/python/07_sum_array.py | side-projects-42/INTERVIEW-PREP-COMPLETE | 627a3315cee4bbc38a0e81c256f27f928eac2d63 | [
"MIT"
] | 160 | 2021-04-26T19:04:15.000Z | 2022-03-26T20:18:37.000Z | notes-n-resources/Data-Structures-N-Algo/_DS-n-Algos/__MY_OPRIGINAL_DS/_Extra-Practice/03_recursion/python/07_sum_array.py | side-projects-42/INTERVIEW-PREP-COMPLETE | 627a3315cee4bbc38a0e81c256f27f928eac2d63 | [
"MIT"
] | 12 | 2021-04-26T19:43:01.000Z | 2022-01-31T08:36:29.000Z | def sum_array(arr):
if not arr:
return 0
return arr[0] + sum_array(arr[1:])
| 18.4 | 38 | 0.576087 | 16 | 92 | 3.1875 | 0.5625 | 0.313725 | 0.431373 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.046154 | 0.293478 | 92 | 4 | 39 | 23 | 0.738462 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 6 |
436723cac8ad79d938437fed1830dd3f8fa91fa0 | 24 | py | Python | python/guided/seeds/other.py | theostanton/guided | 407095fee1cb809a798c32f15cd9ec711cd7819f | [
"MIT"
] | null | null | null | python/guided/seeds/other.py | theostanton/guided | 407095fee1cb809a798c32f15cd9ec711cd7819f | [
"MIT"
] | 3 | 2021-03-10T13:32:32.000Z | 2022-02-13T19:08:13.000Z | python/guided/seeds/other.py | theostanton/guided | 407095fee1cb809a798c32f15cd9ec711cd7819f | [
"MIT"
] | null | null | null | def execute():
pass
| 8 | 14 | 0.583333 | 3 | 24 | 4.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.291667 | 24 | 2 | 15 | 12 | 0.823529 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
438a52fdbd77e47a50aefa0e69f54d53d9009e67 | 188 | py | Python | inventory/admin.py | anurag0singh/Jagrati | d4487e08368da38cf53a77dc1303ea8841c71ba9 | [
"MIT"
] | null | null | null | inventory/admin.py | anurag0singh/Jagrati | d4487e08368da38cf53a77dc1303ea8841c71ba9 | [
"MIT"
] | null | null | null | inventory/admin.py | anurag0singh/Jagrati | d4487e08368da38cf53a77dc1303ea8841c71ba9 | [
"MIT"
] | null | null | null | from django.contrib import admin
# Register your models here.
from .models import *
admin.site.register(asset_donation)
admin.site.register(asset)
admin.site.register(asset_transaction)
| 20.888889 | 38 | 0.81383 | 26 | 188 | 5.807692 | 0.5 | 0.178808 | 0.337748 | 0.437086 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.095745 | 188 | 8 | 39 | 23.5 | 0.888235 | 0.138298 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
43944408aff4c6aca365d32fc50df86ce3bffb71 | 7,321 | py | Python | tests/unit/sdb/test_vault.py | Noah-Huppert/salt | 998c382f5f2c3b4cbf7d96aa6913ada6993909b3 | [
"Apache-2.0"
] | 19 | 2016-01-29T14:37:52.000Z | 2022-03-30T18:08:01.000Z | tests/unit/sdb/test_vault.py | Noah-Huppert/salt | 998c382f5f2c3b4cbf7d96aa6913ada6993909b3 | [
"Apache-2.0"
] | 223 | 2016-03-02T16:39:41.000Z | 2022-03-03T12:26:35.000Z | tests/unit/sdb/test_vault.py | Noah-Huppert/salt | 998c382f5f2c3b4cbf7d96aa6913ada6993909b3 | [
"Apache-2.0"
] | 64 | 2016-02-04T19:45:26.000Z | 2021-12-15T02:02:31.000Z | """
Test case for the vault SDB module
"""
# Import python libs
# Import Salt libs
import salt.sdb.vault as vault
from tests.support.mixins import LoaderModuleMockMixin
from tests.support.mock import MagicMock, call, patch
# Import Salt Testing libs
from tests.support.unit import TestCase
class TestVaultSDB(LoaderModuleMockMixin, TestCase):
"""
Test case for the vault SDB module
"""
def setup_loader_modules(self):
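        # LoaderModuleMockMixin injects this dict as the module's dunders,
        # giving salt.sdb.vault a minimal __opts__ with token auth.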
return {
vault: {
"__opts__": {
"vault": {
"url": "http://127.0.0.1",
"auth": {"token": "test", "method": "token"},
}
}
}
}
def test_set(self):
"""
Test salt.sdb.vault.set function
"""
version = {"v2": False, "data": None, "metadata": None, "type": None}
mock_version = MagicMock(return_value=version)
mock_vault = MagicMock()
mock_vault.return_value.status_code = 200
with patch.dict(
vault.__utils__, {"vault.make_request": mock_vault}
), patch.dict(vault.__utils__, {"vault.is_v2": mock_version}):
vault.set_("sdb://myvault/path/to/foo/bar", "super awesome")
self.assertEqual(
mock_vault.call_args_list,
[
call(
"POST",
"v1/sdb://myvault/path/to/foo",
json={"bar": "super awesome"},
)
],
)
def test_set_v2(self):
"""
Test salt.sdb.vault.set function with kv v2 backend
"""
version = {
"v2": True,
"data": "path/data/to/foo",
"metadata": "path/metadata/to/foo",
"type": "kv",
}
mock_version = MagicMock(return_value=version)
mock_vault = MagicMock()
mock_vault.return_value.status_code = 200
with patch.dict(
vault.__utils__, {"vault.make_request": mock_vault}
), patch.dict(vault.__utils__, {"vault.is_v2": mock_version}):
vault.set_("sdb://myvault/path/to/foo/bar", "super awesome")
self.assertEqual(
mock_vault.call_args_list,
[
call(
"POST",
"v1/path/data/to/foo",
json={"data": {"bar": "super awesome"}},
)
],
)
def test_set_question_mark(self):
"""
Test salt.sdb.vault.set_ while using the old
deprecated solution with a question mark.
"""
version = {"v2": False, "data": None, "metadata": None, "type": None}
mock_version = MagicMock(return_value=version)
mock_vault = MagicMock()
mock_vault.return_value.status_code = 200
with patch.dict(
vault.__utils__, {"vault.make_request": mock_vault}
), patch.dict(vault.__utils__, {"vault.is_v2": mock_version}):
vault.set_("sdb://myvault/path/to/foo?bar", "super awesome")
self.assertEqual(
mock_vault.call_args_list,
[
call(
"POST",
"v1/sdb://myvault/path/to/foo",
json={"bar": "super awesome"},
)
],
)
def test_get(self):
"""
Test salt.sdb.vault.get function
"""
version = {"v2": False, "data": None, "metadata": None, "type": None}
mock_version = MagicMock(return_value=version)
mock_vault = MagicMock()
mock_vault.return_value.status_code = 200
mock_vault.return_value.json.return_value = {"data": {"bar": "test"}}
with patch.dict(
vault.__utils__, {"vault.make_request": mock_vault}
), patch.dict(vault.__utils__, {"vault.is_v2": mock_version}):
self.assertEqual(vault.get("sdb://myvault/path/to/foo/bar"), "test")
self.assertEqual(
mock_vault.call_args_list, [call("GET", "v1/sdb://myvault/path/to/foo")],
)
def test_get_v2(self):
"""
Test salt.sdb.vault.get function with kv v2 backend
"""
version = {
"v2": True,
"data": "path/data/to/foo",
"metadata": "path/metadata/to/foo",
"type": "kv",
}
mock_version = MagicMock(return_value=version)
mock_vault = MagicMock()
mock_vault.return_value.status_code = 200
mock_vault.return_value.json.return_value = {"data": {"data": {"bar": "test"}}}
with patch.dict(
vault.__utils__, {"vault.make_request": mock_vault}
), patch.dict(vault.__utils__, {"vault.is_v2": mock_version}):
self.assertEqual(vault.get("sdb://myvault/path/to/foo/bar"), "test")
self.assertEqual(
mock_vault.call_args_list, [call("GET", "v1/path/data/to/foo")]
)
def test_get_question_mark(self):
"""
Test salt.sdb.vault.get while using the old
deprecated solution with a question mark.
"""
version = {"v2": False, "data": None, "metadata": None, "type": None}
mock_version = MagicMock(return_value=version)
mock_vault = MagicMock()
mock_vault.return_value.status_code = 200
mock_vault.return_value.json.return_value = {"data": {"bar": "test"}}
with patch.dict(
vault.__utils__, {"vault.make_request": mock_vault}
), patch.dict(vault.__utils__, {"vault.is_v2": mock_version}):
self.assertEqual(vault.get("sdb://myvault/path/to/foo?bar"), "test")
self.assertEqual(
mock_vault.call_args_list, [call("GET", "v1/sdb://myvault/path/to/foo")],
)
def test_get_missing(self):
"""
Test salt.sdb.vault.get function returns None
if vault does not have an entry
"""
version = {"v2": False, "data": None, "metadata": None, "type": None}
mock_version = MagicMock(return_value=version)
mock_vault = MagicMock()
mock_vault.return_value.status_code = 404
with patch.dict(
vault.__utils__, {"vault.make_request": mock_vault}
), patch.dict(vault.__utils__, {"vault.is_v2": mock_version}):
self.assertIsNone(vault.get("sdb://myvault/path/to/foo/bar"))
assert mock_vault.call_args_list == [
call("GET", "v1/sdb://myvault/path/to/foo")
]
def test_get_missing_key(self):
"""
Test salt.sdb.vault.get function returns None
if vault does not have the key but does have the entry
"""
version = {"v2": False, "data": None, "metadata": None, "type": None}
mock_version = MagicMock(return_value=version)
mock_vault = MagicMock()
mock_vault.return_value.status_code = 200
mock_vault.return_value.json.return_value = {"data": {"bar": "test"}}
with patch.dict(
vault.__utils__, {"vault.make_request": mock_vault}
), patch.dict(vault.__utils__, {"vault.is_v2": mock_version}):
self.assertIsNone(vault.get("sdb://myvault/path/to/foo/foo"))
assert mock_vault.call_args_list == [
call("GET", "v1/sdb://myvault/path/to/foo")
]
| 35.538835 | 87 | 0.553749 | 829 | 7,321 | 4.640531 | 0.118215 | 0.084221 | 0.058227 | 0.079023 | 0.900442 | 0.891084 | 0.881206 | 0.826098 | 0.825318 | 0.825318 | 0 | 0.011449 | 0.308018 | 7,321 | 205 | 88 | 35.712195 | 0.747927 | 0.089195 | 0 | 0.666667 | 0 | 0 | 0.174887 | 0.062237 | 0 | 0 | 0 | 0 | 0.088435 | 1 | 0.061224 | false | 0 | 0.027211 | 0.006803 | 0.102041 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
4396ac426650172fba1917c9372e0a5bc93be442 | 50 | py | Python | by-session/ta-922/j7/x/b.py | amiraliakbari/sharif-mabani-python | 5d14a08d165267fe71c28389ddbafe29af7078c5 | [
"MIT"
] | 2 | 2015-04-29T20:59:35.000Z | 2018-09-26T13:33:43.000Z | by-session/ta-922/j7/x/b.py | amiraliakbari/sharif-mabani-python | 5d14a08d165267fe71c28389ddbafe29af7078c5 | [
"MIT"
] | null | null | null | by-session/ta-922/j7/x/b.py | amiraliakbari/sharif-mabani-python | 5d14a08d165267fe71c28389ddbafe29af7078c5 | [
"MIT"
] | null | null | null | def f1():
return 4
def g():
return 5
| 8.333333 | 12 | 0.46 | 8 | 50 | 2.875 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.103448 | 0.42 | 50 | 5 | 13 | 10 | 0.689655 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
6028fc3bfa04aaa47ccd361fea2383a05e593aca | 173 | py | Python | msteams/adaptivecard/containers/layout.py | HarshadRanganathan/pyteams | d9ced98281e594b454ab7d98dce5b997d1711c8b | [
"MIT"
] | 6 | 2019-08-09T05:29:25.000Z | 2021-08-02T10:27:51.000Z | msteams/adaptivecard/containers/layout.py | HarshadRanganathan/pyteams | d9ced98281e594b454ab7d98dce5b997d1711c8b | [
"MIT"
] | 3 | 2020-03-24T17:06:42.000Z | 2021-02-02T22:11:50.000Z | msteams/adaptivecard/containers/layout.py | HarshadRanganathan/pyteams | d9ced98281e594b454ab7d98dce5b997d1711c8b | [
"MIT"
] | 3 | 2019-10-07T21:59:25.000Z | 2021-11-18T09:12:56.000Z |
class Layout:
def __init__(self, layout_type):
self.layout = dict()
self.layout['type'] = layout_type
def build(self):
return self.layout
| 17.3 | 41 | 0.601156 | 21 | 173 | 4.666667 | 0.428571 | 0.408163 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.289017 | 173 | 9 | 42 | 19.222222 | 0.796748 | 0 | 0 | 0 | 0 | 0 | 0.023256 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0.166667 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 6 |
6055b2eb6142e45a2a9c7bd026fa6be674244a25 | 32 | py | Python | flexx/myui/__init__.py | ceprio/flask_reverse_proxy_for_flexx | a5cf63f5e602ae5bfb81898c289c84d924de6b61 | [
"MIT"
] | null | null | null | flexx/myui/__init__.py | ceprio/flask_reverse_proxy_for_flexx | a5cf63f5e602ae5bfb81898c289c84d924de6b61 | [
"MIT"
] | null | null | null | flexx/myui/__init__.py | ceprio/flask_reverse_proxy_for_flexx | a5cf63f5e602ae5bfb81898c289c84d924de6b61 | [
"MIT"
] | null | null | null | from ._markdown import Markdown
| 16 | 31 | 0.84375 | 4 | 32 | 6.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 32 | 1 | 32 | 32 | 0.928571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
6058fc546e96a63641851d5354100df6f0bb4f72 | 196 | py | Python | baja_app/backend/admin.py | kprohith/django-playlist-manager | d32b711cc6654dbe221a615ef3c81c7e407c854c | [
"MIT"
] | null | null | null | baja_app/backend/admin.py | kprohith/django-playlist-manager | d32b711cc6654dbe221a615ef3c81c7e407c854c | [
"MIT"
] | 6 | 2021-04-08T21:25:33.000Z | 2022-03-12T00:40:42.000Z | baja_app/backend/admin.py | kprohith/django-playlist-manager | d32b711cc6654dbe221a615ef3c81c7e407c854c | [
"MIT"
] | null | null | null | from django.contrib import admin
from .models import Song, Artist, Album, PlayList
admin.site.register(Artist)
admin.site.register(Album)
admin.site.register(Song)
admin.site.register(PlayList)
| 21.777778 | 49 | 0.806122 | 28 | 196 | 5.642857 | 0.428571 | 0.227848 | 0.43038 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086735 | 196 | 8 | 50 | 24.5 | 0.882682 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
60613c7c917a73f285ce77016d800b4f2950fbe4 | 1,631 | py | Python | testcases/indicator_tests/datadifferencetests.py | quantwizard-com/pythonbacktest | 7056c2804c30ca571eb43dc1ae4cc3d537f6613e | [
"Apache-2.0"
] | null | null | null | testcases/indicator_tests/datadifferencetests.py | quantwizard-com/pythonbacktest | 7056c2804c30ca571eb43dc1ae4cc3d537f6613e | [
"Apache-2.0"
] | null | null | null | testcases/indicator_tests/datadifferencetests.py | quantwizard-com/pythonbacktest | 7056c2804c30ca571eb43dc1ae4cc3d537f6613e | [
"Apache-2.0"
] | null | null | null | import unittest
from pythonbacktest.indicator import DataDifference
import math
class DataDifferenceTests(unittest.TestCase):
def test_number_set_individual_numbers(self):
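        # Feeding one pair of values at a time should yield elementwise
        # differences, with None propagated whenever either input is None.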
input_values_1 = [1, 4, None, 2, 8, 12, 14, None, 1]
input_values_2 = [3, 1, None, 0, None, 3, 8, 6, 2]
all_expected_results = [t - k if t is not None and k is not None else None for (t, k) in zip(input_values_1, input_values_2)]
expected_result = input_values_1[-1] - input_values_2[-1]
data_difference = DataDifference()
for value_1, value_2 in zip(input_values_1, input_values_2):
data_difference.on_new_upstream_value(value_1, value_2)
all_actual_results = data_difference.all_result
actual_result = data_difference.result
self.assertEqual(all_expected_results, all_actual_results)
self.assertEqual(expected_result, actual_result)
def test_number_set_list_numbers(self):
input_values_1 = [1, 4, None, 2, 8, 12, 14, None, 1]
input_values_2 = [3, 1, None, 0, None, 3, 14, 6, 2]
all_expected_results = [t - k if t is not None and k is not None else None for (t, k) in zip(input_values_1, input_values_2)]
expected_result = input_values_1[-1] - input_values_2[-1]
data_difference = DataDifference()
data_difference.on_new_upstream_value(input_values_1, input_values_2)
all_actual_results = data_difference.all_result
actual_result = data_difference.result
self.assertEqual(all_expected_results, all_actual_results)
self.assertEqual(expected_result, actual_result) | 37.930233 | 133 | 0.706315 | 244 | 1,631 | 4.377049 | 0.213115 | 0.164794 | 0.089888 | 0.097378 | 0.825843 | 0.825843 | 0.744382 | 0.744382 | 0.717228 | 0.717228 | 0 | 0.044323 | 0.211527 | 1,631 | 43 | 134 | 37.930233 | 0.786159 | 0 | 0 | 0.592593 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148148 | 1 | 0.074074 | false | 0 | 0.111111 | 0 | 0.222222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
6062634d80de7d2fafad39272814f05a76e69fbb | 4,075 | py | Python | others/unpaired/baseline/dataset.py | ericlearning/Progressive-Image-Translation-Network | 972c54dfdbc4c065328f7fc54b2b47c2cefcc609 | [
"MIT"
] | 2 | 2019-05-11T12:25:55.000Z | 2019-10-17T16:10:34.000Z | voice_conversion/dataset.py | ericlearning/Progressive-Image-Translation-Network | 972c54dfdbc4c065328f7fc54b2b47c2cefcc609 | [
"MIT"
] | null | null | null | voice_conversion/dataset.py | ericlearning/Progressive-Image-Translation-Network | 972c54dfdbc4c065328f7fc54b2b47c2cefcc609 | [
"MIT"
] | null | null | null | import os
import torch
import random
import numpy as np
from torchvision import datasets, transforms
from torch.utils.data import DataLoader
from PIL import Image
class Dataset():
def __init__(self, train_dir, basic_types = None, shuffle = True, single_channel = False):
self.train_dir = train_dir
self.basic_types = basic_types
self.shuffle = shuffle
self.single_channel = single_channel
def get_loader(self, sz, bs, num_workers = 1):
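        # Build input/target pipelines: resize to sz x sz, optionally collapse
        # to a single channel, then normalize each channel from [0, 1] to [-1, 1].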
if(self.single_channel):
dt = {
'input' : transforms.Compose([
transforms.Resize((sz, sz)),
transforms.Grayscale(1),
transforms.ToTensor(),
transforms.Normalize([0.5], [0.5])
]),
'target' : transforms.Compose([
transforms.Resize((sz, sz)),
transforms.Grayscale(1),
transforms.ToTensor(),
transforms.Normalize([0.5], [0.5])
])
}
else:
dt = {
'input' : transforms.Compose([
transforms.Resize((sz, sz)),
transforms.ToTensor(),
transforms.Normalize([0.5, 0.5, 0.5], [0.5, 0.5, 0.5])
]),
'target' : transforms.Compose([
transforms.Resize((sz, sz)),
transforms.ToTensor(),
transforms.Normalize([0.5, 0.5, 0.5], [0.5, 0.5, 0.5])
])
}
if(self.basic_types == 'Pix2Pix'):
input_transform = dt['input']
target_transform = dt['target']
train_dataset = Pix2Pix_Dataset(self.train_dir[0], self.train_dir[1], input_transform, target_transform)
train_loader = DataLoader(train_dataset, batch_size = bs, shuffle = self.shuffle, num_workers = num_workers)
            returns = train_loader
elif(self.basic_types == 'CycleGan'):
input_transform = dt['input']
target_transform = dt['target']
train_dataset = CycleGan_Dataset(self.train_dir[0], self.train_dir[1], input_transform, target_transform)
train_loader = DataLoader(train_dataset, batch_size = bs, shuffle = self.shuffle, num_workers = num_workers)
            returns = train_loader
        else:
            # Fail loudly instead of returning an unbound name for an
            # unrecognized dataset type.
            raise ValueError('Unsupported basic_types: {}'.format(self.basic_types))
        return returns
class Pix2Pix_Dataset():
def __init__(self, input_dir, target_dir, input_transform, target_transform):
self.input_dir = input_dir
self.target_dir = target_dir
self.input_transform = input_transform
self.target_transform = target_transform
self.image_name_list = []
for file in os.listdir(input_dir):
if(file.endswith('.png') or file.endswith('.jpeg') or file.endswith('.jpg') or file.endswith('.bmp')):
self.image_name_list.append(file)
def __len__(self):
return len(self.image_name_list)
def __getitem__(self, idx):
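        # Paired (Pix2Pix-style) loading: input and target share a filename.
        # With no target directory the input is simply duplicated.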
if(self.target_dir == None):
input_img = Image.open(os.path.join(self.input_dir, self.image_name_list[idx]))
target_img = input_img.copy()
else:
input_img = Image.open(os.path.join(self.input_dir, self.image_name_list[idx]))
target_img = Image.open(os.path.join(self.target_dir, self.image_name_list[idx]))
input_img = self.input_transform(input_img)
target_img = self.target_transform(target_img)
sample = (input_img, target_img)
return sample
class CycleGan_Dataset():
def __init__(self, input_dir, target_dir, input_transform, target_transform):
self.input_dir = input_dir
self.target_dir = target_dir
self.input_transform = input_transform
self.target_transform = target_transform
self.A_image_name_list = []
for file in os.listdir(input_dir):
if(file.endswith('.png') or file.endswith('.jpeg') or file.endswith('.jpg') or file.endswith('.bmp')):
self.A_image_name_list.append(file)
self.B_image_name_list = []
for file in os.listdir(target_dir):
if(file.endswith('.png') or file.endswith('.jpeg') or file.endswith('.jpg') or file.endswith('.bmp')):
self.B_image_name_list.append(file)
def __len__(self):
return len(self.A_image_name_list)
def __getitem__(self, idx):
input_img = Image.open(os.path.join(self.input_dir, self.A_image_name_list[idx]))
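        # Unpaired (CycleGAN-style) loading: the domain-B image is drawn at
        # random rather than matched to the domain-A index.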
target_img = Image.open(os.path.join(self.target_dir, self.B_image_name_list[random.randint(0, len(self.B_image_name_list) - 1)]))
input_img = self.input_transform(input_img)
target_img = self.target_transform(target_img)
sample = (input_img, target_img)
return sample | 33.130081 | 132 | 0.715583 | 589 | 4,075 | 4.672326 | 0.140917 | 0.011628 | 0.066134 | 0.017442 | 0.811773 | 0.788154 | 0.78234 | 0.760538 | 0.749273 | 0.744186 | 0 | 0.012691 | 0.149202 | 4,075 | 123 | 133 | 33.130081 | 0.781079 | 0 | 0 | 0.623762 | 0 | 0 | 0.026987 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.079208 | false | 0 | 0.069307 | 0.019802 | 0.227723 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
60870a74ae49230338464bfd3e6558919b37b722 | 73 | py | Python | src/ankidmpy/__init__.py | gitonthescene/ankidmpy | 24c99da93778db2f3ffce83ac611fa0dfef21a80 | [
"MIT"
] | 1 | 2020-12-22T09:43:05.000Z | 2020-12-22T09:43:05.000Z | src/ankidmpy/__init__.py | gitonthescene/ankidmpy | 24c99da93778db2f3ffce83ac611fa0dfef21a80 | [
"MIT"
] | null | null | null | src/ankidmpy/__init__.py | gitonthescene/ankidmpy | 24c99da93778db2f3ffce83ac611fa0dfef21a80 | [
"MIT"
] | null | null | null | import sys
def main():
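    # Lazy import keeps package import cheap; the local name shadows this
    # wrapper and forwards the call to the runner's entry point.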
from ankidmpy.runner import main
main()
| 10.428571 | 36 | 0.657534 | 10 | 73 | 4.8 | 0.7 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.260274 | 73 | 6 | 37 | 12.166667 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0 | 0.5 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
6090ac0d57c9c3ee4c305a65cd51dd4b1b22093f | 84 | py | Python | elabjournal/elabjournal/ExperimentFiles.py | matthijsbrouwer/elabjournal-python | 4063b01993f0bf17ea2857009c1bedc5ace8b87b | [
"Apache-2.0"
] | 2 | 2021-06-29T11:17:27.000Z | 2022-01-11T18:41:49.000Z | elabjournal/elabjournal/ExperimentFiles.py | matthijsbrouwer/elabjournal-python | 4063b01993f0bf17ea2857009c1bedc5ace8b87b | [
"Apache-2.0"
] | null | null | null | elabjournal/elabjournal/ExperimentFiles.py | matthijsbrouwer/elabjournal-python | 4063b01993f0bf17ea2857009c1bedc5ace8b87b | [
"Apache-2.0"
] | 1 | 2019-06-06T13:23:11.000Z | 2019-06-06T13:23:11.000Z | from .eLABJournalPager import *
class ExperimentFiles(eLABJournalPager):
pass | 14 | 40 | 0.785714 | 7 | 84 | 9.428571 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.154762 | 84 | 6 | 41 | 14 | 0.929577 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
609cd0b260e8615de498e786cffb4e0e069279f3 | 95 | py | Python | src/core/py/all.py | bboolean/e1-lang | 6087151b00567f2a30272e1b92ee809c2111f684 | [
"MIT"
] | null | null | null | src/core/py/all.py | bboolean/e1-lang | 6087151b00567f2a30272e1b92ee809c2111f684 | [
"MIT"
] | null | null | null | src/core/py/all.py | bboolean/e1-lang | 6087151b00567f2a30272e1b92ee809c2111f684 | [
"MIT"
] | null | null | null | def _core_all(fn, a):
return len(list(filter(fn, a))) == len(a)
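# curry2 is assumed to be defined elsewhere in this codebase; it wraps the
# two-argument helper so it can be partially applied: core_all(fn)(a).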
core_all = curry2(_core_all)
| 23.75 | 43 | 0.673684 | 18 | 95 | 3.277778 | 0.555556 | 0.355932 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012195 | 0.136842 | 95 | 3 | 44 | 31.666667 | 0.707317 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0.333333 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
60d7bf890e768243aca802d2aefed93b627257d8 | 3,697 | py | Python | tests/test_error.py | nickfrostatx/frost-ci | 97fc234eb174a1242481b40e56aebba595827a69 | [
"MIT"
] | null | null | null | tests/test_error.py | nickfrostatx/frost-ci | 97fc234eb174a1242481b40e56aebba595827a69 | [
"MIT"
] | null | null | null | tests/test_error.py | nickfrostatx/frost-ci | 97fc234eb174a1242481b40e56aebba595827a69 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""Test error handling."""
import frost.error
import flask
import werkzeug.exceptions
def test_custom_handler():
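    # The registered handler should format explicit aborts, directly raised
    # HTTPExceptions, and uncaught exceptions (mapped to 500) the same way.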
app = flask.Flask(__name__)
def handler(e):
return e.name + '\n', e.code
frost.error.register_error_handler(app, handler)
@app.route('/good')
def good():
return 'OK'
@app.route('/418')
def teapot():
flask.abort(418)
@app.route('/500')
def internal():
raise werkzeug.exceptions.InternalServerError()
@app.route('/internal')
def divide_by_zero():
1/0
with app.test_client() as client:
rv = client.get('/good')
assert rv.status_code == 200
rv = client.get('/418')
assert rv.data == b'I\'m a teapot\n'
assert rv.status_code == 418
rv = client.get('/500')
assert rv.data == b'Internal Server Error\n'
assert rv.status_code == 500
rv = client.get('/internal')
assert rv.data == b'Internal Server Error\n'
assert rv.status_code == 500
def test_blueprint_handler():
app = flask.Flask(__name__)
bp = flask.Blueprint('bp', __name__)
def app_handler(e):
return 'App: ' + e.name + '\n', e.code
frost.error.register_error_handler(app, app_handler)
def bp_handler(e):
return 'Blueprint: ' + e.name + '\n', e.code
frost.error.register_error_handler(bp, bp_handler)
@bp.route('/good')
def good():
return 'OK'
@bp.route('/418')
def teapot():
flask.abort(418)
@bp.route('/500')
def internal():
raise werkzeug.exceptions.InternalServerError()
@bp.route('/internal')
def divide_by_zero():
1/0
app.register_blueprint(bp)
with app.test_client() as client:
rv = client.get('/good')
assert rv.data == b'OK'
assert rv.status_code == 200
rv = client.get('/418')
assert rv.data == b'Blueprint: I\'m a teapot\n'
assert rv.status_code == 418
rv = client.get('/500')
assert rv.data == b'App: Internal Server Error\n'
assert rv.status_code == 500
rv = client.get('/internal')
assert rv.data == b'App: Internal Server Error\n'
assert rv.status_code == 500
def test_html_handler():
app = flask.Flask(__name__, template_folder='../frost/templates')
frost.error.register_error_handler(app, frost.error.html_handler)
@app.route('/good')
def good():
return 'OK'
@app.route('/418')
def teapot():
flask.abort(418)
@app.route('/internal')
def divide_by_zero():
1/0
with app.test_client() as client:
rv = client.get('/good')
assert rv.status_code == 200
rv = client.get('/418')
        assert b'<title>Frost CI - I&#39;m a teapot</title>' in rv.data
assert b'>Error 418</h1>' in rv.data
assert rv.status_code == 418
rv = client.get('/internal')
assert b'<title>Frost CI - Internal Server Error</title>' in rv.data
assert b'>Error 500</h1>' in rv.data
assert rv.status_code == 500
def test_decorator():
app = flask.Flask(__name__, template_folder='../frost/templates')
@frost.error.errorhandler(app)
def handler(e):
return e.name + '\n', e.code
@app.route('/good')
def good():
return 'OK'
@app.route('/418')
def teapot():
flask.abort(418)
with app.test_client() as client:
rv = client.get('/good')
assert rv.data == b'OK'
assert rv.status_code == 200
rv = client.get('/418')
assert rv.data == b'I\'m a teapot\n'
assert rv.status_code == 418
| 24.322368 | 76 | 0.581012 | 499 | 3,697 | 4.172345 | 0.126253 | 0.084534 | 0.068684 | 0.112392 | 0.850624 | 0.804995 | 0.782421 | 0.739193 | 0.634966 | 0.613353 | 0 | 0.038547 | 0.270219 | 3,697 | 151 | 77 | 24.483444 | 0.733136 | 0.011631 | 0 | 0.740741 | 0 | 0 | 0.124452 | 0 | 0 | 0 | 0 | 0 | 0.240741 | 1 | 0.194444 | false | 0 | 0.027778 | 0.074074 | 0.296296 | 0.046296 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |