hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
d64f415897e749c6aa56a242b90bd89c895e2511 | 87 | py | Python | protocolws/__init__.py | panmpan17/ProtocolWebsocket | 05893ab47883fec4356f2f617213093eb7d9b4df | [
"MIT"
] | null | null | null | protocolws/__init__.py | panmpan17/ProtocolWebsocket | 05893ab47883fec4356f2f617213093eb7d9b4df | [
"MIT"
] | null | null | null | protocolws/__init__.py | panmpan17/ProtocolWebsocket | 05893ab47883fec4356f2f617213093eb7d9b4df | [
"MIT"
] | null | null | null | from .server import WebsocketServer, ErrMsg
name = "protocolws"
__version__ = "0.2.1"
| 17.4 | 43 | 0.747126 | 11 | 87 | 5.545455 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.04 | 0.137931 | 87 | 4 | 44 | 21.75 | 0.773333 | 0 | 0 | 0 | 0 | 0 | 0.172414 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
c38ac90ca808d2dd0ca78ec136aaa0e419c7ab9a | 88 | py | Python | CodeUP/Python basic 100/6081.py | cmsong111/NJ_code | 2df6176d179e168a2789a825ddeb977a82eb8d97 | [
"MIT"
] | null | null | null | CodeUP/Python basic 100/6081.py | cmsong111/NJ_code | 2df6176d179e168a2789a825ddeb977a82eb8d97 | [
"MIT"
] | null | null | null | CodeUP/Python basic 100/6081.py | cmsong111/NJ_code | 2df6176d179e168a2789a825ddeb977a82eb8d97 | [
"MIT"
] | null | null | null | n = int(input(),16)
for i in range(1,16):
print("%X"%n,"*%X"%i,"=%X"%(n*i),sep="")
| 17.6 | 44 | 0.454545 | 19 | 88 | 2.105263 | 0.631579 | 0.1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.066667 | 0.147727 | 88 | 4 | 45 | 22 | 0.466667 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
c392ee111fc105d1add3bfe72b6ea6451c22de9b | 71 | py | Python | scanapi/__init__.py | marcuxyz/scanapi | e42bcde18d4219fc603b0b9ee8f0b67d4085ec63 | [
"MIT"
] | 2 | 2020-08-26T01:54:19.000Z | 2021-07-23T14:06:34.000Z | scanapi/__init__.py | marcuxyz/scanapi | e42bcde18d4219fc603b0b9ee8f0b67d4085ec63 | [
"MIT"
] | null | null | null | scanapi/__init__.py | marcuxyz/scanapi | e42bcde18d4219fc603b0b9ee8f0b67d4085ec63 | [
"MIT"
] | null | null | null | name = "scanapi"
from scanapi.__main__ import main
__all__ = ["main"]
| 14.2 | 33 | 0.71831 | 9 | 71 | 4.777778 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15493 | 71 | 4 | 34 | 17.75 | 0.716667 | 0 | 0 | 0 | 0 | 0 | 0.15493 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
c39f85552fb76ed1c43748aba7b313d0e063acf8 | 243 | py | Python | BasicCache/BasicCacheModule.py | prodProject/WorkkerAndConsumerServer | 95496f026109279c9891e08af46040c7b9487c81 | [
"MIT"
] | null | null | null | BasicCache/BasicCacheModule.py | prodProject/WorkkerAndConsumerServer | 95496f026109279c9891e08af46040c7b9487c81 | [
"MIT"
] | null | null | null | BasicCache/BasicCacheModule.py | prodProject/WorkkerAndConsumerServer | 95496f026109279c9891e08af46040c7b9487c81 | [
"MIT"
] | null | null | null | from werkzeug.contrib.cache import SimpleCache
class BasicCache:
cache = SimpleCache()
def set(self, key, value):
self.cache.set(key, value, timeout=50 * 1000)
def get(self, key):
return self.cache.get(key=key)
| 20.25 | 53 | 0.658436 | 33 | 243 | 4.848485 | 0.545455 | 0.0875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.031915 | 0.226337 | 243 | 11 | 54 | 22.090909 | 0.819149 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0.142857 | 0.142857 | 0.857143 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
c3b239f366d0aa2e3fa43d5373d1a03a579fb69c | 1,104 | py | Python | tests/test_iter_ij_in_block.py | kalekundert/wellmap | 05a9029807276ec29aea63db10c664ad2ede093c | [
"MIT"
] | 7 | 2020-05-29T21:14:49.000Z | 2022-01-25T15:35:17.000Z | tests/test_iter_ij_in_block.py | kalekundert/wellmap | 05a9029807276ec29aea63db10c664ad2ede093c | [
"MIT"
] | 24 | 2020-06-09T14:29:03.000Z | 2022-03-25T22:43:24.000Z | tests/test_iter_ij_in_block.py | kalekundert/wellmap | 05a9029807276ec29aea63db10c664ad2ede093c | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
import pytest
from wellmap import *
@pytest.mark.parametrize(
'args, expected', [
(((0,0), 0, 0), []),
(((0,0), 0, 1), []),
(((0,0), 1, 0), []),
(((0,0), 1, 1), [(0,0)]),
(((0,0), 2, 1), [(0,0), (0,1)]),
(((0,0), 1, 2), [(0,0), (1,0)]),
(((0,0), 2, 2), [(0,0), (0,1), (1,0), (1,1)]),
(((0,1), 0, 0), []),
(((0,1), 0, 1), []),
(((0,1), 1, 0), []),
(((0,1), 1, 1), [(0,1)]),
(((0,1), 2, 1), [(0,1), (0,2)]),
(((0,1), 1, 2), [(0,1), (1,1)]),
(((0,1), 2, 2), [(0,1), (0,2), (1,1), (1,2)]),
(((1,0), 0, 0), []),
(((1,0), 0, 1), []),
(((1,0), 1, 0), []),
(((1,0), 1, 1), [(1,0)]),
(((1,0), 2, 1), [(1,0), (1,1)]),
(((1,0), 1, 2), [(1,0), (2,0)]),
(((1,0), 2, 2), [(1,0), (1,1), (2,0), (2,1)]),
],
)
def test_iter_ij_in_block(args, expected):
print(args)
assert set(iter_ij_in_block(*args)) == set(expected)
| 30.666667 | 58 | 0.272645 | 171 | 1,104 | 1.719298 | 0.140351 | 0.204082 | 0.153061 | 0.108844 | 0.55102 | 0.367347 | 0.221088 | 0.061224 | 0.061224 | 0 | 0 | 0.197724 | 0.363225 | 1,104 | 35 | 59 | 31.542857 | 0.220484 | 0.019022 | 0 | 0 | 0 | 0 | 0.012939 | 0 | 0 | 0 | 0 | 0 | 0.033333 | 1 | 0.033333 | false | 0 | 0.066667 | 0 | 0.1 | 0.033333 | 0 | 0 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
c3b44be37f288d5b48b6f0177fd4a4019cc7b623 | 215 | py | Python | py-server/src/enum/index.py | Jonnytoshen/wind-layer | 514e9b9c76b6d72faac21543fd5fb1c43e6bd9b5 | [
"BSD-3-Clause"
] | 285 | 2017-12-16T13:29:27.000Z | 2022-03-28T02:59:08.000Z | py-server/src/enum/index.py | Jonnytoshen/wind-layer | 514e9b9c76b6d72faac21543fd5fb1c43e6bd9b5 | [
"BSD-3-Clause"
] | 104 | 2018-01-01T01:40:13.000Z | 2022-03-26T18:20:45.000Z | py-server/src/enum/index.py | Jonnytoshen/wind-layer | 514e9b9c76b6d72faac21543fd5fb1c43e6bd9b5 | [
"BSD-3-Clause"
] | 97 | 2017-12-18T08:05:21.000Z | 2022-03-28T15:49:38.000Z | from enum import Enum
class noaa_site_label(Enum):
DIRECTORY = 'directory:'
FILE_LIST = '\n**NEW** Select one file only '
SURFACES = 'select the levels desired:'
VARIABLES = 'select the variables desired:'
| 26.875 | 47 | 0.716279 | 29 | 215 | 5.206897 | 0.689655 | 0.119205 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.176744 | 215 | 7 | 48 | 30.714286 | 0.853107 | 0 | 0 | 0 | 0 | 0 | 0.446512 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
c3d5b63b4aec63e711dc906333d611d89cd9fa71 | 1,360 | py | Python | app/data.py | leecrowe/bandersnatch-app | 575394a7b7c47b8525d182edd4d7cf22de985d18 | [
"MIT"
] | null | null | null | app/data.py | leecrowe/bandersnatch-app | 575394a7b7c47b8525d182edd4d7cf22de985d18 | [
"MIT"
] | null | null | null | app/data.py | leecrowe/bandersnatch-app | 575394a7b7c47b8525d182edd4d7cf22de985d18 | [
"MIT"
] | null | null | null | from os import getenv
from typing import Iterator, Dict, Iterable
from pymongo import MongoClient
import pandas as pd
from dotenv import load_dotenv
class Data:
""" MongoDB Data Model """
load_dotenv()
db_url = getenv("DB_URL")
db_name = getenv("DB_NAME")
db_table = getenv("DB_TABLE")
def connect(self):
return MongoClient(self.db_url)[self.db_name][self.db_table]
def find(self, query_obj: Dict) -> Dict:
return self.connect().find_one(query_obj)
def insert(self, insert_obj: Dict):
self.connect().insert_one(insert_obj)
def find_many(self, query_obj: Dict, limit=0) -> Iterator[Dict]:
return self.connect().find(query_obj, limit=limit)
def insert_many(self, insert_obj: Iterable[Dict]):
self.connect().insert_many(insert_obj)
def update(self, query: Dict, data_update: Dict):
self.connect().update_one(query, {"$set": data_update})
def delete(self, query_obj: Dict):
self.connect().delete_many(query_obj)
def reset_db(self):
self.connect().delete_many({})
def get_df(self, limit=0) -> pd.DataFrame:
return pd.DataFrame(self.find_many({}, limit=limit))
def get_count(self, query_obj: Dict) -> int:
return self.connect().count_documents(query_obj)
def __str__(self):
return f"{self.get_df()}"
| 28.333333 | 68 | 0.668382 | 193 | 1,360 | 4.487047 | 0.243523 | 0.073903 | 0.055427 | 0.073903 | 0.057737 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001842 | 0.201471 | 1,360 | 47 | 69 | 28.93617 | 0.79558 | 0.013235 | 0 | 0 | 0 | 0 | 0.029985 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.34375 | false | 0 | 0.15625 | 0.1875 | 0.8125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
c3dc65c6f95112e170d27e15fb7aca74b4ab07cd | 11,253 | py | Python | lib/fcn/test_common.py | LeiYangJustin/UnseenObjectClustering | 177d67c47bfd973d46b816e6b68bf20660f4b71d | [
"BSD-Source-Code"
] | 101 | 2020-12-13T22:20:12.000Z | 2022-03-22T07:58:58.000Z | lib/fcn/test_common.py | LeiYangJustin/UnseenObjectClustering | 177d67c47bfd973d46b816e6b68bf20660f4b71d | [
"BSD-Source-Code"
] | 6 | 2021-01-13T13:33:04.000Z | 2022-03-08T02:13:27.000Z | lib/fcn/test_common.py | LeiYangJustin/UnseenObjectClustering | 177d67c47bfd973d46b816e6b68bf20660f4b71d | [
"BSD-Source-Code"
] | 19 | 2021-01-21T18:19:42.000Z | 2022-03-07T14:17:21.000Z | # Copyright (c) 2020 NVIDIA Corporation. All rights reserved.
# This work is licensed under the NVIDIA Source Code License - Non-commercial. Full
# text can be found in LICENSE.md
import torch
import time
import sys, os
import numpy as np
import cv2
import matplotlib.pyplot as plt
from fcn.config import cfg
from utils.mask import visualize_segmentation
def normalize_descriptor(res, stats=None):
"""
Normalizes the descriptor into RGB color space
:param res: numpy.array [H,W,D]
Output of the network, per-pixel dense descriptor
:param stats: dict, with fields ['min', 'max', 'mean'], which are used to normalize descriptor
:return: numpy.array
normalized descriptor
"""
if stats is None:
res_min = res.min()
res_max = res.max()
else:
res_min = np.array(stats['min'])
res_max = np.array(stats['max'])
normed_res = np.clip(res, res_min, res_max)
eps = 1e-10
scale = (res_max - res_min) + eps
normed_res = (normed_res - res_min) / scale
return normed_res
def _vis_features(features, labels, rgb, intial_labels, selected_pixels=None):
num = features.shape[0]
height = features.shape[2]
width = features.shape[3]
fig = plt.figure()
start = 1
m = np.ceil((num * 4 ) / 8.0)
n = 8
im_blob = rgb.cpu().numpy()
for i in range(num):
if i < m * n / 4:
# show image
im = im_blob[i, :3, :, :].copy()
im = im.transpose((1, 2, 0)) * 255.0
im += cfg.PIXEL_MEANS
im = im[:, :, (2, 1, 0)]
im = np.clip(im, 0, 255)
im = im.astype(np.uint8)
ax = fig.add_subplot(m, n, start)
start += 1
plt.imshow(im)
ax.set_title('image')
plt.axis('off')
'''
if selected_pixels is not None:
selected_indices = selected_pixels[i]
for j in range(len(selected_indices)):
index = selected_indices[j]
y = index / width
x = index % width
plt.plot(x, y, 'ro', markersize=1.0)
'''
im = torch.cuda.FloatTensor(height, width, 3)
for j in range(3):
im[:, :, j] = torch.sum(features[i, j::3, :, :], dim=0)
im = normalize_descriptor(im.detach().cpu().numpy())
im *= 255
im = im.astype(np.uint8)
ax = fig.add_subplot(m, n, start)
start += 1
plt.imshow(im)
ax.set_title('features')
plt.axis('off')
ax = fig.add_subplot(m, n, start)
start += 1
label = labels[i].detach().cpu().numpy()
plt.imshow(label)
ax.set_title('labels')
plt.axis('off')
ax = fig.add_subplot(m, n, start)
start += 1
label = intial_labels[i].detach().cpu().numpy()
plt.imshow(label)
ax.set_title('intial labels')
plt.axis('off')
plt.show()
def _vis_minibatch_segmentation_final(image, depth, label, out_label=None, out_label_refined=None,
features=None, ind=None, selected_pixels=None, bbox=None):
if depth is None:
im_blob = image.cpu().numpy()
else:
im_blob = image.cpu().numpy()
depth_blob = depth.cpu().numpy()
num = im_blob.shape[0]
height = im_blob.shape[2]
width = im_blob.shape[3]
if label is not None:
label_blob = label.cpu().numpy()
if out_label is not None:
out_label_blob = out_label.cpu().numpy()
if out_label_refined is not None:
out_label_refined_blob = out_label_refined.cpu().numpy()
m = 2
n = 3
for i in range(num):
# image
im = im_blob[i, :3, :, :].copy()
im = im.transpose((1, 2, 0)) * 255.0
im += cfg.PIXEL_MEANS
im = im[:, :, (2, 1, 0)]
im = np.clip(im, 0, 255)
im = im.astype(np.uint8)
fig = plt.figure()
start = 1
ax = fig.add_subplot(m, n, start)
start += 1
plt.imshow(im)
ax.set_title('image')
plt.axis('off')
# depth
if depth is not None:
depth = depth_blob[i][2]
ax = fig.add_subplot(m, n, start)
start += 1
plt.imshow(depth)
ax.set_title('depth')
plt.axis('off')
# feature
if features is not None:
im_feature = torch.cuda.FloatTensor(height, width, 3)
for j in range(3):
im_feature[:, :, j] = torch.sum(features[i, j::3, :, :], dim=0)
im_feature = normalize_descriptor(im_feature.detach().cpu().numpy())
im_feature *= 255
im_feature = im_feature.astype(np.uint8)
ax = fig.add_subplot(m, n, start)
start += 1
plt.imshow(im_feature)
ax.set_title('feature map')
plt.axis('off')
# initial seeds
if selected_pixels is not None:
ax = fig.add_subplot(m, n, start)
start += 1
plt.imshow(im)
ax.set_title('initial seeds')
plt.axis('off')
selected_indices = selected_pixels[i]
for j in range(len(selected_indices)):
index = selected_indices[j]
                y = index // width  # integer division: row index (Python 3)
                x = index % width
plt.plot(x, y, 'ro', markersize=2.0)
# intial mask
mask = out_label_blob[i, :, :]
im_label = visualize_segmentation(im, mask, return_rgb=True)
ax = fig.add_subplot(m, n, start)
start += 1
plt.imshow(im_label)
ax.set_title('initial label')
plt.axis('off')
# refined mask
if out_label_refined is not None:
mask = out_label_refined_blob[i, :, :]
im_label = visualize_segmentation(im, mask, return_rgb=True)
ax = fig.add_subplot(m, n, start)
start += 1
plt.imshow(im_label)
ax.set_title('refined label')
plt.axis('off')
elif label is not None:
# show gt label
mask = label_blob[i, 0, :, :]
im_label = visualize_segmentation(im, mask, return_rgb=True)
ax = fig.add_subplot(m, n, start)
start += 1
plt.imshow(im_label)
ax.set_title('gt label')
plt.axis('off')
if ind is not None:
mng = plt.get_current_fig_manager()
mng.resize(*mng.window.maxsize())
plt.pause(0.001)
# plt.show(block=False)
filename = 'output/images/%06d.png' % ind
fig.savefig(filename)
plt.close()
else:
plt.show()
def _vis_minibatch_segmentation(image, depth, label, out_label=None, out_label_refined=None,
features=None, ind=None, selected_pixels=None, bbox=None):
if depth is None:
im_blob = image.cpu().numpy()
m = 2
n = 3
else:
im_blob = image.cpu().numpy()
depth_blob = depth.cpu().numpy()
m = 3
n = 3
num = im_blob.shape[0]
height = im_blob.shape[2]
width = im_blob.shape[3]
if label is not None:
label_blob = label.cpu().numpy()
if out_label is not None:
out_label_blob = out_label.cpu().numpy()
if out_label_refined is not None:
out_label_refined_blob = out_label_refined.cpu().numpy()
for i in range(num):
# image
im = im_blob[i, :3, :, :].copy()
im = im.transpose((1, 2, 0)) * 255.0
im += cfg.PIXEL_MEANS
im = im[:, :, (2, 1, 0)]
im = np.clip(im, 0, 255)
im = im.astype(np.uint8)
'''
if out_label_refined is not None:
mask = out_label_refined_blob[i, :, :]
visualize_segmentation(im, mask)
#'''
# show image
fig = plt.figure()
start = 1
ax = fig.add_subplot(m, n, start)
start += 1
plt.imshow(im)
ax.set_title('image')
plt.axis('off')
ax = fig.add_subplot(m, n, start)
start += 1
plt.imshow(im)
plt.axis('off')
if bbox is not None:
boxes = bbox[i].numpy()
for j in range(boxes.shape[0]):
x1 = boxes[j, 0]
y1 = boxes[j, 1]
x2 = boxes[j, 2]
y2 = boxes[j, 3]
plt.gca().add_patch(
plt.Rectangle((x1, y1), x2-x1, y2-y1, fill=False, edgecolor='g', linewidth=3))
if selected_pixels is not None:
selected_indices = selected_pixels[i]
for j in range(len(selected_indices)):
index = selected_indices[j]
                y = index // width  # integer division: row index (Python 3)
                x = index % width
plt.plot(x, y, 'ro', markersize=1.0)
if im_blob.shape[1] == 4:
label = im_blob[i, 3, :, :]
ax = fig.add_subplot(m, n, start)
start += 1
plt.imshow(label)
ax.set_title('initial label')
if depth is not None:
depth = depth_blob[i]
ax = fig.add_subplot(m, n, start)
start += 1
plt.imshow(depth[0])
ax.set_title('depth X')
plt.axis('off')
ax = fig.add_subplot(m, n, start)
start += 1
plt.imshow(depth[1])
ax.set_title('depth Y')
plt.axis('off')
ax = fig.add_subplot(m, n, start)
start += 1
plt.imshow(depth[2])
ax.set_title('depth Z')
plt.axis('off')
# show label
if label is not None:
label = label_blob[i, 0, :, :]
ax = fig.add_subplot(m, n, start)
start += 1
plt.imshow(label)
ax.set_title('gt label')
plt.axis('off')
# show out label
if out_label is not None:
label = out_label_blob[i, :, :]
ax = fig.add_subplot(m, n, start)
start += 1
plt.imshow(label)
ax.set_title('out label')
plt.axis('off')
# show out label refined
if out_label_refined is not None:
label = out_label_refined_blob[i, :, :]
ax = fig.add_subplot(m, n, start)
start += 1
plt.imshow(label)
ax.set_title('out label refined')
plt.axis('off')
if features is not None:
im = torch.cuda.FloatTensor(height, width, 3)
for j in range(3):
im[:, :, j] = torch.sum(features[i, j::3, :, :], dim=0)
im = normalize_descriptor(im.detach().cpu().numpy())
im *= 255
im = im.astype(np.uint8)
ax = fig.add_subplot(m, n, start)
start += 1
plt.imshow(im)
ax.set_title('features')
plt.axis('off')
if ind is not None:
mng = plt.get_current_fig_manager()
plt.show()
filename = 'output/images/%06d.png' % ind
fig.savefig(filename)
plt.show()
| 31.085635 | 98 | 0.50902 | 1,473 | 11,253 | 3.758995 | 0.130346 | 0.0419 | 0.035759 | 0.05689 | 0.707062 | 0.702547 | 0.660827 | 0.646018 | 0.646018 | 0.631389 | 0 | 0.022877 | 0.366836 | 11,253 | 361 | 99 | 31.171745 | 0.754246 | 0.055185 | 0 | 0.718412 | 0 | 0 | 0.029385 | 0.004368 | 0 | 0 | 0 | 0 | 0 | 1 | 0.01444 | false | 0 | 0.028881 | 0 | 0.046931 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
c3e63aa7608692e9e7131de69634555c66e2658b | 123 | py | Python | mantraml/templates/projects/default/settings.py | cclauss/mantra | 19e2f72960da8314f11768d9acfe7836629b817c | [
"Apache-2.0"
] | 330 | 2018-09-04T19:07:51.000Z | 2021-09-14T11:21:05.000Z | mantraml/templates/projects/default/settings.py | cclauss/mantra | 19e2f72960da8314f11768d9acfe7836629b817c | [
"Apache-2.0"
] | 13 | 2018-09-06T06:08:16.000Z | 2018-12-01T17:04:38.000Z | mantraml/templates/projects/default/settings.py | cclauss/mantra | 19e2f72960da8314f11768d9acfe7836629b817c | [
"Apache-2.0"
] | 20 | 2018-09-06T11:56:07.000Z | 2021-12-03T19:48:21.000Z | # AWS SETTINGS
AWS_AMI_IMAGE_ID = 'ami-6356761c'
AWS_INSTANCE_TYPE = 'p2.xlarge'
S3_BUCKET_NAME = 'default-s3-bucket-name' | 30.75 | 41 | 0.780488 | 20 | 123 | 4.45 | 0.7 | 0.179775 | 0.269663 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09009 | 0.097561 | 123 | 4 | 41 | 30.75 | 0.711712 | 0.097561 | 0 | 0 | 0 | 0 | 0.390909 | 0.2 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
c3f41031d68e0a2d6308085aa0e246d0f490465c | 968 | py | Python | lct/tasks/taskstore.py | pathbreak/linode-cluster-toolkit | 280257436105703c9a122e7ed111a72efa79adfc | [
"MIT"
] | 11 | 2017-07-19T15:25:39.000Z | 2021-12-02T20:03:21.000Z | lct/tasks/taskstore.py | pathbreak/linode-cluster-toolkit | 280257436105703c9a122e7ed111a72efa79adfc | [
"MIT"
] | null | null | null | lct/tasks/taskstore.py | pathbreak/linode-cluster-toolkit | 280257436105703c9a122e7ed111a72efa79adfc | [
"MIT"
] | 1 | 2021-12-02T20:03:22.000Z | 2021-12-02T20:03:22.000Z | class TaskStore(object):
    '''
    All task execution plans are stored and tracked in a TaskStore;
    both storage and tracking are crucial for supporting pause and
    resume of operations.
    This is the interface to be implemented by all providers that
    provide task storage capabilities.
    '''
def initialize(self):
'''
Initialization opportunity for providers.
This is called when the toolkit itself and all services, including
the Task Service, are initializing themselves.
'''
raise NotImplementedError('subclasses should override this')
def save_execution_plan(self, task_plan):
'''
Save or update the execution plan.
'''
raise NotImplementedError('subclasses should override this')
def close(self):
'''
Provider should release its resources here.
'''
raise NotImplementedError('subclasses should override this')
| 28.470588 | 74 | 0.636364 | 101 | 968 | 6.069307 | 0.623762 | 0.117455 | 0.166395 | 0.195759 | 0.264274 | 0.264274 | 0.179445 | 0 | 0 | 0 | 0 | 0 | 0.303719 | 968 | 33 | 75 | 29.333333 | 0.909496 | 0.469008 | 0 | 0.428571 | 0 | 0 | 0.234257 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.428571 | false | 0 | 0 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
c3f677ecc5dfde9410fda55667fad516d6d7ddf7 | 827 | py | Python | catweazle/applog.py | upendar245/CatWeazle | 58e4e37a71c61b998c6a3adae5e16343db50aff5 | [
"MIT"
] | 1 | 2020-12-17T04:23:32.000Z | 2020-12-17T04:23:32.000Z | catweazle/applog.py | upendar245/CatWeazle | 58e4e37a71c61b998c6a3adae5e16343db50aff5 | [
"MIT"
] | null | null | null | catweazle/applog.py | upendar245/CatWeazle | 58e4e37a71c61b998c6a3adae5e16343db50aff5 | [
"MIT"
] | 1 | 2020-04-24T10:17:03.000Z | 2020-04-24T10:17:03.000Z | import logging
import aiotask_context as context
class AppLogging:
def __init__(self):
self.log = logging.getLogger('application')
self.context = context
def info(self, msg):
self.log.info('{0} {1}'.format(self.context.get('X-Request-ID'), msg))
def warning(self, msg):
self.log.warning('{0} {1}'.format(self.context.get('X-Request-ID'), msg))
def error(self, msg):
self.log.error('{0} {1}'.format(self.context.get('X-Request-ID'), msg))
def critical(self, msg):
self.log.critical('{0} {1}'.format(self.context.get('X-Request-ID'), msg))
def fatal(self, msg):
self.log.fatal('{0} {1}'.format(self.context.get('X-Request-ID'), msg))
def debug(self, msg):
self.log.debug('{0} {1}'.format(self.context.get('X-Request-ID'), msg))
| 29.535714 | 82 | 0.610641 | 122 | 827 | 4.098361 | 0.221311 | 0.098 | 0.132 | 0.168 | 0.45 | 0.45 | 0.45 | 0.45 | 0.45 | 0.45 | 0 | 0.017804 | 0.185006 | 827 | 27 | 83 | 30.62963 | 0.724036 | 0 | 0 | 0 | 0 | 0 | 0.151149 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.388889 | false | 0 | 0.111111 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
7f17a75bcf20017304a34f189d671e3f94815fb9 | 387 | py | Python | service/models/employer/employer_filter.py | CyberArkForTheCommunity/jobli-backend | 2309c9ac33993cb89a8e1581630d99b46f8d55aa | [
"MIT"
] | null | null | null | service/models/employer/employer_filter.py | CyberArkForTheCommunity/jobli-backend | 2309c9ac33993cb89a8e1581630d99b46f8d55aa | [
"MIT"
] | 1 | 2021-12-23T13:36:43.000Z | 2021-12-23T13:36:43.000Z | service/models/employer/employer_filter.py | CyberArkForTheCommunity/jobli-backend | 2309c9ac33993cb89a8e1581630d99b46f8d55aa | [
"MIT"
] | null | null | null | from pydantic import BaseModel, Field
from typing import Optional
from service.lambdas.employer.constants import EmployerConstants
class EmployerFilter(BaseModel):
employer_id: Optional[str]
business_name: Optional[str]
city: Optional[str]
last_pagination_key: Optional[str]
limit_per_page: Optional[int] = Field(default=EmployerConstants.LIMITS_PER_EMPLOYER_PAGE)
| 32.25 | 93 | 0.803618 | 47 | 387 | 6.425532 | 0.595745 | 0.145695 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.126615 | 387 | 11 | 94 | 35.181818 | 0.893491 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
7f1abb5060e39a368256be6fdccc62d3097395a3 | 185 | py | Python | user_guide/urls.py | jmcriffey/django-user-guide | f4a1c462d2f7bf8569576f757e2f106b565e3e40 | [
"MIT"
] | null | null | null | user_guide/urls.py | jmcriffey/django-user-guide | f4a1c462d2f7bf8569576f757e2f106b565e3e40 | [
"MIT"
] | null | null | null | user_guide/urls.py | jmcriffey/django-user-guide | f4a1c462d2f7bf8569576f757e2f106b565e3e40 | [
"MIT"
] | null | null | null | from django.conf.urls import patterns, url
from user_guide import views
urlpatterns = patterns(
'',
url(r'^seen/?$', views.GuideSeenView.as_view(), name='user_guide.seen')
)
| 18.5 | 75 | 0.702703 | 25 | 185 | 5.08 | 0.68 | 0.173228 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.151351 | 185 | 9 | 76 | 20.555556 | 0.808917 | 0 | 0 | 0 | 0 | 0 | 0.124324 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
61696efc02baa4bd76628d5a785149b2ecf4ac45 | 11,300 | py | Python | 018-crackme-z3/asdf.py | gynvael/stream | 2d1a3f25b2f83241b39dab931d9ff03fca81d26e | [
"MIT"
] | 152 | 2016-02-04T10:40:46.000Z | 2022-03-03T18:25:54.000Z | 018-crackme-z3/asdf.py | gynvael/stream | 2d1a3f25b2f83241b39dab931d9ff03fca81d26e | [
"MIT"
] | 4 | 2016-03-11T23:49:46.000Z | 2017-06-16T18:58:53.000Z | 018-crackme-z3/asdf.py | gynvael/stream | 2d1a3f25b2f83241b39dab931d9ff03fca81d26e | [
"MIT"
] | 48 | 2016-01-31T19:13:36.000Z | 2021-09-03T19:50:17.000Z | from z3 import *
def movsx(v):
  # Widen an input byte to 32 bits. ZeroExt gives movzx semantics, which
  # coincides with movsx here because the inputs are printable ASCII (< 0x80).
  return ZeroExt(32 - 8, v)
def imul(a, b, c=None):
  # Two-operand imul computes a * b; the three-operand form sets the
  # destination (a) to b * c, so the old value of a is ignored.
  if c is None:
    return a * b
  return b * c
def xor_(r, v):
  return r ^ v
def or_(r, v):
  return r | v
def mov(_, r2):
  return r2
def shr(r1, c):
  # z3's >> is an arithmetic shift; LShR is the logical shift of x86 shr.
  return LShR(r1, c)
def shl(r1, c):
  return r1 << c
def calc():
esp_0x10 = BitVec("esp_0x10", 8)
esp_0x11 = BitVec("esp_0x11", 8)
esp_0x12 = BitVec("esp_0x12", 8)
esp_0x13 = BitVec("esp_0x13", 8)
esp_0x14 = BitVec("esp_0x14", 8)
esp_0x15 = BitVec("esp_0x15", 8)
esp_0x16 = BitVec("esp_0x16", 8)
esp_0x17 = BitVec("esp_0x17", 8)
esp_0x18 = BitVec("esp_0x18", 8)
esp_0x19 = BitVec("esp_0x19", 8)
esp_0x1A = BitVec("esp_0x1A", 8)
esp_0x1B = BitVec("esp_0x1B", 8)
esp_0x1C = BitVec("esp_0x1C", 8)
esp_0x1D = BitVec("esp_0x1D", 8)
esp_0x1E = BitVec("esp_0x1E", 8)
esp_0x1F = BitVec("esp_0x1F", 8)
eax = BitVec("eax", 32)
ebx = BitVec("ebx", 32)
ecx = BitVec("ecx", 32)
edx = BitVec("edx", 32)
esi = BitVec("esi", 32)
edi = BitVec("edi", 32)
ebp = BitVec("ebp", 32)
edi = movsx(esp_0x10)
edx = imul(edx, edi, 0x3039)
edx = xor_(edx, 0x93E6BBCF)
ebx = imul(ebx, edi, 0x0AEDCE)
ebx = xor_(ebx, 0x2ECBBAE2)
ecx = imul(ecx, edi, 0x2EF8F)
ecx = xor_(ecx, 0x0A0A2A282)
edi = imul(edi, 0x0DEDC7)
edi = xor_(edi, 0x9BDFE6F7)
eax = mov(eax, edx)
eax = shr(eax, 3)
edx = shl(edx, 3)
eax = or_(eax, edx)
edx = movsx(esp_0x11)
esi = imul(esi, edx, 0x3039)
eax = xor_(eax, esi)
esi = mov(esi, ebx)
esi = shr(esi, 5)
ebx = shl(ebx, 5)
esi = or_(esi, ebx)
ebx = imul(ebx, edx, 0x0AEDCE)
esi = xor_(esi, ebx)
ebx = mov(ebx, ecx)
ebx = shr(ebx, 7)
ecx = shl(ecx, 7)
ebx = or_(ebx, ecx)
ecx = imul(ecx, edx, 0x2EF8F)
ebx = xor_(ebx, ecx)
ecx = mov(ecx, edi)
ecx = shr(ecx, 9)
edi = shl(edi, 9)
ecx = or_(ecx, edi)
edx = imul(edx, 0x0DEDC7)
ecx = xor_(ecx, edx)
edx = mov(edx, eax)
edx = shr(edx, 3)
eax = shl(eax, 3)
edx = or_(edx, eax)
edi = movsx(esp_0x12)
eax = imul(eax, edi, 0x3039)
edx = xor_(edx, eax)
eax = mov(eax, esi)
eax = shr(eax, 5)
esi = shl(esi, 5)
eax = or_(eax, esi)
esi = imul(esi, edi, 0x0AEDCE)
eax = xor_(eax, esi)
esi = mov(esi, ebx)
esi = shr(esi, 7)
ebx = shl(ebx, 7)
esi = or_(esi, ebx)
ebx = imul(ebx, edi, 0x2EF8F)
esi = xor_(esi, ebx)
ebx = mov(ebx, ecx)
ebx = shr(ebx, 9)
ecx = shl(ecx, 9)
ebx = or_(ebx, ecx)
edi = imul(edi, 0x0DEDC7)
ebx = xor_(ebx, edi)
ecx = mov(ecx, edx)
ecx = shr(ecx, 3)
edx = shl(edx, 3)
ecx = or_(ecx, edx)
edi = movsx(esp_0x13)
edx = imul(edx, edi, 0x3039)
ecx = xor_(ecx, edx)
edx = mov(edx, eax)
edx = shr(edx, 5)
eax = shl(eax, 5)
edx = or_(edx, eax)
eax = imul(eax, edi, 0x0AEDCE)
edx = xor_(edx, eax)
eax = mov(eax, esi)
eax = shr(eax, 7)
esi = shl(esi, 7)
eax = or_(eax, esi)
esi = imul(esi, edi, 0x2EF8F)
eax = xor_(eax, esi)
esi = mov(esi, ebx)
esi = shr(esi, 9)
ebx = shl(ebx, 9)
esi = or_(esi, ebx)
edi = imul(edi, 0x0DEDC7)
esi = xor_(esi, edi)
ebx = mov(ebx, ecx)
ebx = shr(ebx, 3)
ecx = shl(ecx, 3)
ebx = or_(ebx, ecx)
edi = movsx(esp_0x14)
ecx = imul(ecx, edi, 0x3039)
ebx = xor_(ebx, ecx)
ecx = mov(ecx, edx)
ecx = shr(ecx, 5)
edx = shl(edx, 5)
ecx = or_(ecx, edx)
edx = imul(edx, edi, 0x0AEDCE)
ecx = xor_(ecx, edx)
edx = mov(edx, eax)
edx = shr(edx, 7)
eax = shl(eax, 7)
edx = or_(edx, eax)
eax = imul(eax, edi, 0x2EF8F)
edx = xor_(edx, eax)
eax = mov(eax, esi)
eax = shr(eax, 9)
esi = shl(esi, 9)
eax = or_(eax, esi)
edi = imul(edi, 0x0DEDC7)
eax = xor_(eax, edi)
esi = mov(esi, ebx)
esi = shr(esi, 3)
ebx = shl(ebx, 3)
esi = or_(esi, ebx)
edi = movsx(esp_0x15)
ebx = imul(ebx, edi, 0x3039)
esi = xor_(esi, ebx)
ebx = mov(ebx, ecx)
ebx = shr(ebx, 5)
ecx = shl(ecx, 5)
ebx = or_(ebx, ecx)
ecx = imul(ecx, edi, 0x0AEDCE)
ebx = xor_(ebx, ecx)
ecx = mov(ecx, edx)
ecx = shr(ecx, 7)
edx = shl(edx, 7)
ecx = or_(ecx, edx)
edx = imul(edx, edi, 0x2EF8F)
ecx = xor_(ecx, edx)
edx = mov(edx, eax)
edx = shr(edx, 9)
eax = shl(eax, 9)
edx = or_(edx, eax)
edi = imul(edi, 0x0DEDC7)
edx = xor_(edx, edi)
eax = mov(eax, esi)
eax = shr(eax, 3)
esi = shl(esi, 3)
eax = or_(eax, esi)
edi = movsx(esp_0x16)
esi = imul(esi, edi, 0x3039)
eax = xor_(eax, esi)
esi = mov(esi, ebx)
esi = shr(esi, 5)
ebx = shl(ebx, 5)
esi = or_(esi, ebx)
ebx = imul(ebx, edi, 0x0AEDCE)
esi = xor_(esi, ebx)
ebx = mov(ebx, ecx)
ebx = shr(ebx, 7)
ecx = shl(ecx, 7)
ebx = or_(ebx, ecx)
ecx = imul(ecx, edi, 0x2EF8F)
ebx = xor_(ebx, ecx)
ecx = mov(ecx, edx)
ecx = shr(ecx, 9)
edx = shl(edx, 9)
ecx = or_(ecx, edx)
edi = imul(edi, 0x0DEDC7)
ecx = xor_(ecx, edi)
edx = mov(edx, eax)
edx = shr(edx, 3)
eax = shl(eax, 3)
edx = or_(edx, eax)
edi = movsx(esp_0x17)
eax = imul(eax, edi, 0x3039)
edx = xor_(edx, eax)
eax = mov(eax, esi)
eax = shr(eax, 5)
esi = shl(esi, 5)
eax = or_(eax, esi)
esi = imul(esi, edi, 0x0AEDCE)
eax = xor_(eax, esi)
esi = mov(esi, ebx)
esi = shr(esi, 7)
ebx = shl(ebx, 7)
esi = or_(esi, ebx)
ebx = imul(ebx, edi, 0x2EF8F)
esi = xor_(esi, ebx)
ebx = mov(ebx, ecx)
ebx = shr(ebx, 9)
ecx = shl(ecx, 9)
ebx = or_(ebx, ecx)
edi = imul(edi, 0x0DEDC7)
ebx = xor_(ebx, edi)
ecx = mov(ecx, edx)
ecx = shr(ecx, 3)
edx = shl(edx, 3)
ecx = or_(ecx, edx)
edi = movsx(esp_0x18)
edx = imul(edx, edi, 0x3039)
ecx = xor_(ecx, edx)
edx = mov(edx, eax)
edx = shr(edx, 5)
eax = shl(eax, 5)
edx = or_(edx, eax)
eax = imul(eax, edi, 0x0AEDCE)
edx = xor_(edx, eax)
eax = mov(eax, esi)
eax = shr(eax, 7)
esi = shl(esi, 7)
eax = or_(eax, esi)
esi = imul(esi, edi, 0x2EF8F)
eax = xor_(eax, esi)
esi = mov(esi, ebx)
esi = shr(esi, 9)
ebx = shl(ebx, 9)
esi = or_(esi, ebx)
edi = imul(edi, 0x0DEDC7)
esi = xor_(esi, edi)
ebx = mov(ebx, ecx)
ebx = shr(ebx, 3)
ecx = shl(ecx, 3)
ebx = or_(ebx, ecx)
edi = movsx(esp_0x19)
ecx = imul(ecx, edi, 0x3039)
ebx = xor_(ebx, ecx)
ecx = mov(ecx, edx)
ecx = shr(ecx, 5)
edx = shl(edx, 5)
ecx = or_(ecx, edx)
edx = imul(edx, edi, 0x0AEDCE)
ecx = xor_(ecx, edx)
edx = mov(edx, eax)
edx = shr(edx, 7)
eax = shl(eax, 7)
edx = or_(edx, eax)
eax = imul(eax, edi, 0x2EF8F)
edx = xor_(edx, eax)
eax = mov(eax, esi)
eax = shr(eax, 9)
esi = shl(esi, 9)
eax = or_(eax, esi)
edi = imul(edi, 0x0DEDC7)
eax = xor_(eax, edi)
esi = mov(esi, ebx)
esi = shr(esi, 3)
ebx = shl(ebx, 3)
esi = or_(esi, ebx)
edi = movsx(esp_0x1A)
ebx = imul(ebx, edi, 0x3039)
esi = xor_(esi, ebx)
ebx = mov(ebx, ecx)
ebx = shr(ebx, 5)
ecx = shl(ecx, 5)
ebx = or_(ebx, ecx)
ecx = imul(ecx, edi, 0x0AEDCE)
ebx = xor_(ebx, ecx)
ecx = mov(ecx, edx)
ecx = shr(ecx, 7)
edx = shl(edx, 7)
ecx = or_(ecx, edx)
edx = imul(edx, edi, 0x2EF8F)
ecx = xor_(ecx, edx)
edx = mov(edx, eax)
edx = shr(edx, 9)
eax = shl(eax, 9)
edx = or_(edx, eax)
edi = imul(edi, 0x0DEDC7)
edx = xor_(edx, edi)
eax = mov(eax, esi)
eax = shr(eax, 3)
esi = shl(esi, 3)
eax = or_(eax, esi)
esi = movsx(esp_0x1B)
edi = imul(edi, esi, 0x3039)
eax = xor_(eax, edi)
edi = mov(edi, ebx)
edi = shr(edi, 5)
ebx = shl(ebx, 5)
edi = or_(edi, ebx)
ebx = imul(ebx, esi, 0x0AEDCE)
edi = xor_(edi, ebx)
ebx = mov(ebx, ecx)
ebx = shr(ebx, 7)
ecx = shl(ecx, 7)
ebx = or_(ebx, ecx)
ecx = imul(ecx, esi, 0x2EF8F)
ebx = xor_(ebx, ecx)
ecx = mov(ecx, edx)
ecx = shr(ecx, 9)
edx = shl(edx, 9)
ecx = or_(ecx, edx)
esi = imul(esi, 0x0DEDC7)
ecx = xor_(ecx, esi)
edx = mov(edx, eax)
edx = shr(edx, 3)
eax = shl(eax, 3)
edx = or_(edx, eax)
esi = movsx(esp_0x1C)
eax = imul(eax, esi, 0x3039)
edx = xor_(edx, eax)
eax = mov(eax, edi)
eax = shr(eax, 5)
edi = shl(edi, 5)
eax = or_(eax, edi)
edi = imul(edi, esi, 0x0AEDCE)
eax = xor_(eax, edi)
edi = mov(edi, ebx)
edi = shr(edi, 7)
ebx = shl(ebx, 7)
edi = or_(edi, ebx)
ebx = imul(ebx, esi, 0x2EF8F)
edi = xor_(edi, ebx)
ebx = mov(ebx, ecx)
ebx = shr(ebx, 9)
ecx = shl(ecx, 9)
ebx = or_(ebx, ecx)
esi = imul(esi, 0x0DEDC7)
ebx = xor_(ebx, esi)
ecx = mov(ecx, edx)
ecx = shr(ecx, 3)
edx = shl(edx, 3)
ecx = or_(ecx, edx)
esi = movsx(esp_0x1D)
edx = imul(edx, esi, 0x3039)
ecx = xor_(ecx, edx)
edx = mov(edx, eax)
edx = shr(edx, 5)
eax = shl(eax, 5)
edx = or_(edx, eax)
eax = imul(eax, esi, 0x0AEDCE)
edx = xor_(edx, eax)
eax = mov(eax, edi)
eax = shr(eax, 7)
edi = shl(edi, 7)
eax = or_(eax, edi)
edi = imul(edi, esi, 0x2EF8F)
eax = xor_(eax, edi)
edi = mov(edi, ebx)
edi = shr(edi, 9)
ebx = shl(ebx, 9)
edi = or_(edi, ebx)
esi = imul(esi, 0x0DEDC7)
edi = xor_(edi, esi)
ebx = mov(ebx, ecx)
ebx = shr(ebx, 3)
ecx = shl(ecx, 3)
ebx = or_(ebx, ecx)
esi = movsx(esp_0x1E)
ecx = imul(ecx, esi, 0x3039)
ebx = xor_(ebx, ecx)
ecx = mov(ecx, edx)
ecx = shr(ecx, 5)
edx = shl(edx, 5)
ecx = or_(ecx, edx)
edx = imul(edx, esi, 0x0AEDCE)
ecx = xor_(ecx, edx)
edx = mov(edx, eax)
edx = shr(edx, 7)
eax = shl(eax, 7)
edx = or_(edx, eax)
eax = imul(eax, esi, 0x2EF8F)
edx = xor_(edx, eax)
eax = mov(eax, edi)
eax = shr(eax, 9)
edi = shl(edi, 9)
eax = or_(eax, edi)
esi = imul(esi, 0x0DEDC7)
eax = xor_(eax, esi)
esi = mov(esi, ebx)
esi = shr(esi, 3)
ebx = shl(ebx, 3)
esi = or_(esi, ebx)
edi = movsx(esp_0x1F)
ebx = mov(ebx, ecx)
ebx = shr(ebx, 5)
ecx = shl(ecx, 5)
ebx = or_(ebx, ecx)
ecx = imul(ecx, edi, 0x0AEDCE)
ebx = xor_(ebx, ecx)
ecx = mov(ecx, edx)
ecx = shr(ecx, 7)
edx = shl(edx, 7)
ecx = or_(ecx, edx)
edx = imul(edx, edi, 0x2EF8F)
ecx = xor_(ecx, edx)
edx = mov(edx, eax)
edx = shr(edx, 9)
eax = shl(eax, 9)
edx = or_(edx, eax)
eax = imul(eax, edi, 0x0DEDC7)
edx = xor_(edx, eax)
edi = imul(edi, 0x3039)
esi = xor_(esi, edi)
#print simplify(esi)
s = Solver()
s.add(esi == 0xFFF4A1CE)
s.add(ebx == 0xB5A4A9A7)
s.add(ecx == 0xF05A945C)
s.add(edx == 0x9504A82D)
s.add(esp_0x10 >= 32, esp_0x10 <= 126)
s.add(esp_0x11 >= 32, esp_0x11 <= 126)
s.add(esp_0x12 >= 32, esp_0x12 <= 126)
s.add(esp_0x13 >= 32, esp_0x13 <= 126)
s.add(esp_0x14 >= 32, esp_0x14 <= 126)
s.add(esp_0x15 >= 32, esp_0x15 <= 126)
s.add(esp_0x16 >= 32, esp_0x16 <= 126)
s.add(esp_0x17 >= 32, esp_0x17 <= 126)
s.add(esp_0x18 >= 32, esp_0x18 <= 126)
s.add(esp_0x19 >= 32, esp_0x19 <= 126)
s.add(esp_0x1A >= 32, esp_0x1A <= 126)
s.add(esp_0x1B >= 32, esp_0x1B <= 126)
s.add(esp_0x1C >= 32, esp_0x1C <= 126)
s.add(esp_0x1D >= 32, esp_0x1D <= 126)
s.add(esp_0x1E >= 32, esp_0x1E <= 126)
s.add(esp_0x1F >= 32, esp_0x1F <= 126)
s.check()
  print(s.model())
calc()
| 24.197002 | 43 | 0.544867 | 1,926 | 11,300 | 3.089304 | 0.03946 | 0.035294 | 0.024202 | 0.02521 | 0.725882 | 0.689076 | 0.684706 | 0.684706 | 0.656134 | 0.656134 | 0 | 0.086348 | 0.285664 | 11,300 | 466 | 44 | 24.248927 | 0.650768 | 0.001681 | 0 | 0.739421 | 0 | 0 | 0.01378 | 0 | 0 | 0 | 0.050402 | 0 | 0 | 0 | null | null | 0 | 0.002227 | null | null | 0.002227 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
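The unrolled trace above repeats one building block: shift a 32-bit register right by n, shift a copy left by n, and OR the results. A concrete (non-symbolic) sketch of that pattern, handy for cross-checking the z3 model against the real binary (the names `mix32` and `MASK32` are mine, not from the crackme):

```python
MASK32 = 0xFFFFFFFF  # 32-bit register width


def mix32(x, n):
    # The recurring shr/shl/or pattern from the trace, truncated to
    # 32 bits the way the x86 registers are.
    return ((x >> n) | (x << n)) & MASK32


# The high bit shifted left falls off the register, so only the
# right-shifted part survives.
print(hex(mix32(0x80000000, 3)))
```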
617acae417a764a9156239c61c9cc59f10cf60e7 | 4,725 | py | Python | tests/models/torch/q_functions/test_qr_q_function.py | ningyixue/AIPI530_Final_Project | b95353ffd003692a37a59042dfcd744a18b7e802 | [
"MIT"
] | 565 | 2020-08-01T02:44:28.000Z | 2022-03-30T15:00:54.000Z | tests/models/torch/q_functions/test_qr_q_function.py | ningyixue/AIPI530_Final_Project | b95353ffd003692a37a59042dfcd744a18b7e802 | [
"MIT"
] | 144 | 2020-08-01T03:45:10.000Z | 2022-03-30T14:51:16.000Z | tests/models/torch/q_functions/test_qr_q_function.py | ningyixue/AIPI530_Final_Project | b95353ffd003692a37a59042dfcd744a18b7e802 | [
"MIT"
] | 103 | 2020-08-26T13:27:34.000Z | 2022-03-31T12:24:27.000Z | import numpy as np
import pytest
import torch
from d3rlpy.models.torch import ContinuousQRQFunction, DiscreteQRQFunction
from d3rlpy.models.torch.q_functions.qr_q_function import _make_taus
from d3rlpy.models.torch.q_functions.utility import (
pick_quantile_value_by_action,
)
from ..model_test import (
DummyEncoder,
check_parameter_updates,
ref_quantile_huber_loss,
)
@pytest.mark.parametrize("feature_size", [100])
@pytest.mark.parametrize("action_size", [2])
@pytest.mark.parametrize("n_quantiles", [200])
@pytest.mark.parametrize("batch_size", [32])
@pytest.mark.parametrize("gamma", [0.99])
def test_discrete_qr_q_function(
feature_size, action_size, n_quantiles, batch_size, gamma
):
encoder = DummyEncoder(feature_size)
q_func = DiscreteQRQFunction(encoder, action_size, n_quantiles)
# check output shape
x = torch.rand(batch_size, feature_size)
y = q_func(x)
assert y.shape == (batch_size, action_size)
# check taus
taus = _make_taus(encoder(x), n_quantiles)
step = 1 / n_quantiles
for i in range(n_quantiles):
assert np.allclose(taus[0][i].numpy(), i * step + step / 2.0)
# check compute_target
action = torch.randint(high=action_size, size=(batch_size,))
target = q_func.compute_target(x, action)
assert target.shape == (batch_size, n_quantiles)
# check compute_target with action=None
targets = q_func.compute_target(x)
assert targets.shape == (batch_size, action_size, n_quantiles)
# check quantile huber loss
obs_t = torch.rand(batch_size, feature_size)
act_t = torch.randint(action_size, size=(batch_size,))
rew_tp1 = torch.rand(batch_size, 1)
q_tp1 = torch.rand(batch_size, n_quantiles)
ter_tp1 = torch.randint(2, size=(batch_size, 1))
# shape check
loss = q_func.compute_error(
obs_t, act_t, rew_tp1, q_tp1, ter_tp1, reduction="none"
)
assert loss.shape == (batch_size, 1)
# mean loss
loss = q_func.compute_error(obs_t, act_t, rew_tp1, q_tp1, ter_tp1)
target = rew_tp1.numpy() + gamma * q_tp1.numpy() * (1 - ter_tp1.numpy())
y = pick_quantile_value_by_action(
q_func._compute_quantiles(encoder(obs_t), taus), act_t
)
reshaped_target = np.reshape(target, (batch_size, -1, 1))
reshaped_y = np.reshape(y.detach().numpy(), (batch_size, 1, -1))
reshaped_taus = np.reshape(taus, (1, 1, -1))
ref_loss = ref_quantile_huber_loss(
reshaped_y, reshaped_target, reshaped_taus, n_quantiles
)
assert np.allclose(loss.cpu().detach(), ref_loss.mean())
# check layer connection
check_parameter_updates(q_func, (obs_t, act_t, rew_tp1, q_tp1, ter_tp1))
@pytest.mark.parametrize("feature_size", [100])
@pytest.mark.parametrize("action_size", [2])
@pytest.mark.parametrize("n_quantiles", [200])
@pytest.mark.parametrize("batch_size", [32])
@pytest.mark.parametrize("gamma", [0.99])
def test_continuous_qr_q_function(
feature_size, action_size, n_quantiles, batch_size, gamma
):
encoder = DummyEncoder(feature_size, action_size, concat=True)
q_func = ContinuousQRQFunction(encoder, n_quantiles)
# check output shape
x = torch.rand(batch_size, feature_size)
action = torch.rand(batch_size, action_size)
y = q_func(x, action)
assert y.shape == (batch_size, 1)
# check taus
taus = _make_taus(encoder(x, action), n_quantiles)
step = 1 / n_quantiles
for i in range(n_quantiles):
assert np.allclose(taus[0][i].numpy(), i * step + step / 2.0)
target = q_func.compute_target(x, action)
assert target.shape == (batch_size, n_quantiles)
# check quantile huber loss
obs_t = torch.rand(batch_size, feature_size)
act_t = torch.rand(batch_size, action_size)
rew_tp1 = torch.rand(batch_size, 1)
q_tp1 = torch.rand(batch_size, n_quantiles)
ter_tp1 = torch.randint(2, size=(batch_size, 1))
# check shape
loss = q_func.compute_error(
obs_t, act_t, rew_tp1, q_tp1, ter_tp1, reduction="none"
)
assert loss.shape == (batch_size, 1)
# mean loss
loss = q_func.compute_error(obs_t, act_t, rew_tp1, q_tp1, ter_tp1)
target = rew_tp1.numpy() + gamma * q_tp1.numpy() * (1 - ter_tp1.numpy())
y = q_func._compute_quantiles(encoder(obs_t, act_t), taus).detach().numpy()
reshaped_target = target.reshape((batch_size, -1, 1))
reshaped_y = y.reshape((batch_size, 1, -1))
reshaped_taus = taus.reshape((1, 1, -1))
ref_loss = ref_quantile_huber_loss(
reshaped_y, reshaped_target, reshaped_taus, n_quantiles
)
assert np.allclose(loss.cpu().detach(), ref_loss.mean())
# check layer connection
check_parameter_updates(q_func, (obs_t, act_t, rew_tp1, q_tp1, ter_tp1))
| 35 | 79 | 0.699471 | 700 | 4,725 | 4.427143 | 0.125714 | 0.084221 | 0.035495 | 0.058083 | 0.815424 | 0.748629 | 0.681187 | 0.64182 | 0.64182 | 0.64182 | 0 | 0.023112 | 0.175873 | 4,725 | 134 | 80 | 35.261194 | 0.772727 | 0.055026 | 0 | 0.536082 | 0 | 0 | 0.02382 | 0 | 0 | 0 | 0 | 0 | 0.113402 | 1 | 0.020619 | false | 0 | 0.072165 | 0 | 0.092784 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
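Both tests check that `_make_taus` returns the midpoints of N equal-probability bins, `i * step + step / 2`. A minimal stdlib sketch of that construction (my own reimplementation for illustration, not the d3rlpy source):

```python
def make_taus(n_quantiles):
    # Midpoint of each of n equal-probability bins: (i + 0.5) / n.
    step = 1.0 / n_quantiles
    return [i * step + step / 2.0 for i in range(n_quantiles)]


# With four quantiles the midpoints are 0.125, 0.375, 0.625, 0.875.
print(make_taus(4))
```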
619ba5260e66d9abf9514a7f5139f96e733a0a46 | 4,085 | py | Python | src/lybica/loader.py | protone/lybica-runner | 185b3fc4ffc7a6e066baa6c2f030e109db97b4b1 | [
"MIT"
] | null | null | null | src/lybica/loader.py | protone/lybica-runner | 185b3fc4ffc7a6e066baa6c2f030e109db97b4b1 | [
"MIT"
] | 3 | 2015-11-04T09:43:35.000Z | 2015-11-12T09:41:21.000Z | src/lybica/loader.py | protone/lybica-runner | 185b3fc4ffc7a6e066baa6c2f030e109db97b4b1 | [
"MIT"
] | null | null | null | from .executor import ScriptExecutor
import logging
SCRIPT_CONFIG = [
# health check script before install package
{
"name": 'install_package_pre_check',
"search_path": ["${TESTCASE_PATH}/AreaCI/${PID}/install_package_pre_check",
"${BRANCH_ROOT}/PlatformCI/${PID}/${PRODUCT}/${TEST_TYPE}/install_package_pre_check",
"${BRANCH_ROOT}/PlatformCI/${PID}/${PRODUCT}/install_package_pre_check",
"${BRANCH_ROOT}/PlatformCI/${PID}/install_package_pre_check",
"${BRANCH_ROOT}/PlatformCI/system/install_package_pre_check",],
"failed_actions": ['action_pate_to_maintaining', 'stop_actions'],
"success_actions": ['no_action', ],
},
# health check script after install package
{
"name": 'install_package_post_check',
"search_path": ["${TESTCASE_PATH}/AreaCI/${PID}/install_package_post_check",
"${BRANCH_ROOT}/PlatformCI/${PID}/${PRODUCT}/${TEST_TYPE}/install_package_post_check",
"${BRANCH_ROOT}/PlatformCI/${PID}/${PRODUCT}/install_package_post_check",
"${BRANCH_ROOT}/PlatformCI/${PID}/install_package_post_check",
"${BRANCH_ROOT}/PlatformCI/system/install_package_post_check", ],
"failed_actions": ['action_pate_to_maintaining', 'stop_actions'],
"success_actions": ['no_action', ],
},
{
"name": 'health_check_before_crt',
"search_path": ["${TESTCASE_PATH}/AreaCI/${PID}/health_check_before_crt",
"${BRANCH_ROOT}/PlatformCI/${PID}/${PRODUCT}/${TEST_TYPE}/health_check_before_crt",
"${BRANCH_ROOT}/PlatformCI/${PID}/${PRODUCT}/health_check_before_crt",
"${BRANCH_ROOT}/PlatformCI/${PID}/health_check_before_crt",
"${BRANCH_ROOT}/PlatformCI/system/health_check_before_crt",
],
"failed_actions": ['action_pate_to_maintaining', 'stop_actions'],
"expired_actions": ['action_pkg_to_expired', 'stop_actions'],
"success_actions": ['no_action', ],
},
{
"name": 'health_check_after_crt',
"search_path": ["${TESTCASE_PATH}/AreaCI/${PID}/health_check_after_crt",
"${BRANCH_ROOT}/PlatformCI/${PID}/${PRODUCT}/${TEST_TYPE}/health_check_after_crt",
"${BRANCH_ROOT}/PlatformCI/${PID}/${PRODUCT}/health_check_after_crt",
"${BRANCH_ROOT}/PlatformCI/${PID}/health_check_after_crt",
"${BRANCH_ROOT}/PlatformCI/system/health_check_after_crt",
],
"failed_actions": ['action_pate_to_maintaining_and_task_error' ],
"expired_actions": ['action_pkg_to_expired', 'stop_actions'],
"success_actions": ['no_action', ],
},
{
"name": 'health_check_after_run_case',
"search_path": ["${TESTCASE_PATH}/AreaCI/${PID}/health_check_after_run_case",
"${BRANCH_ROOT}/PlatformCI/${PID}/${PRODUCT}/health_check_after_run_case",
"${BRANCH_ROOT}/PlatformCI/${PID}/health_check_after_run_case",
"${BRANCH_ROOT}/PlatformCI/system/health_check_after_run_case",
],
"failed_actions": ['no_action', ],
"expired_actions": ['no_action', ],
"success_actions": ['no_action', ],
},
]
class ScriptWraper(object):
    def __init__(self, configed_scripts=None):
        # Store the configured scripts (the original dropped this argument
        # and always assigned an empty list).
        self.configed_scripts = configed_scripts or []
def run_script(self, context, name, param={}, check_in_config=True):
if check_in_config and not name in self.configed_scripts:
            logging.info("The external script '%s' is not configured to run." % name)
return
if not hasattr(self, name):
            logging.info("The external script '%s' is not supported by ipaci." % name)
return
return getattr(self, name)(context, param)
def load_scripts():
wrapper = ScriptWraper()
for param in SCRIPT_CONFIG:
name = param['name']
setattr(wrapper, name, ScriptExecutor(param))
return wrapper
| 47.5 | 107 | 0.622766 | 436 | 4,085 | 5.408257 | 0.172018 | 0.088634 | 0.161154 | 0.136556 | 0.749788 | 0.716709 | 0.716709 | 0.656064 | 0.457591 | 0.227311 | 0 | 0 | 0.228641 | 4,085 | 85 | 108 | 48.058824 | 0.748334 | 0.020563 | 0 | 0.197368 | 0 | 0 | 0.576432 | 0.451589 | 0 | 0 | 0 | 0 | 0 | 1 | 0.039474 | false | 0 | 0.026316 | 0 | 0.118421 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
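Each entry in SCRIPT_CONFIG lists search paths containing `${VAR}` placeholders. A minimal sketch of expanding them with `string.Template` (the expansion strategy is my assumption; the actual `ScriptExecutor` may resolve them differently):

```python
from string import Template


def expand_search_paths(paths, env):
    # safe_substitute leaves unknown ${VAR} placeholders untouched
    # instead of raising KeyError.
    return [Template(p).safe_substitute(env) for p in paths]


env = {"TESTCASE_PATH": "/cases", "PID": "42"}
paths = ["${TESTCASE_PATH}/AreaCI/${PID}/install_package_pre_check"]
print(expand_search_paths(paths, env))
```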
61c7129583ba4b43eb7ad3e93b2aa91fac2d4fa7 | 372 | py | Python | djangae/contrib/locking/kinds.py | bocribbz/djangae | 8a118d755cd707e6a452593050c25d790edde944 | [
"BSD-3-Clause"
] | 467 | 2015-01-02T22:35:37.000Z | 2022-02-22T23:13:36.000Z | djangae/contrib/locking/kinds.py | bocribbz/djangae | 8a118d755cd707e6a452593050c25d790edde944 | [
"BSD-3-Clause"
] | 743 | 2015-01-02T15:55:34.000Z | 2021-01-29T09:43:19.000Z | djangae/contrib/locking/kinds.py | bocribbz/djangae | 8a118d755cd707e6a452593050c25d790edde944 | [
"BSD-3-Clause"
] | 154 | 2015-01-01T17:05:59.000Z | 2021-12-09T06:40:07.000Z |
class LOCK_KINDS(object):
""" The different kinds of lock which you can use.
WEAK is not guaranteed to be robust, but can be used for situations where avoiding
simultaneous code execution is preferable but not critical.
STRONG is for where preventing simultaneous code execution is *required*.
"""
WEAK = 'weak'
STRONG = 'strong'
| 33.818182 | 90 | 0.69086 | 50 | 372 | 5.12 | 0.64 | 0.125 | 0.195313 | 0.210938 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.252688 | 372 | 10 | 91 | 37.2 | 0.920863 | 0.706989 | 0 | 0 | 0 | 0 | 0.140845 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
f6036eaf498e2f4f6785b62692e626d23ee553a9 | 1,075 | py | Python | indy_common/test/auth/multi_sig/test_auth_multi_sig_for_1_owner.py | NeolithEra/indy-node | c1f5ee8643a19d84b06cbb16347df845fa60bdb0 | [
"Apache-2.0"
] | null | null | null | indy_common/test/auth/multi_sig/test_auth_multi_sig_for_1_owner.py | NeolithEra/indy-node | c1f5ee8643a19d84b06cbb16347df845fa60bdb0 | [
"Apache-2.0"
] | 1 | 2019-02-07T18:11:15.000Z | 2019-02-07T18:14:06.000Z | indy_common/test/auth/multi_sig/test_auth_multi_sig_for_1_owner.py | NeolithEra/indy-node | c1f5ee8643a19d84b06cbb16347df845fa60bdb0 | [
"Apache-2.0"
] | null | null | null | import pytest
from indy_common.authorize.auth_constraints import AuthConstraint, IDENTITY_OWNER
@pytest.fixture(scope='module')
def write_auth_req_validator(write_auth_req_validator, key):
write_auth_req_validator.auth_cons_strategy.get_auth_constraint = lambda a: AuthConstraint(IDENTITY_OWNER, 1)
return write_auth_req_validator
def test_claim_def_adding_success_1_owner(write_request_validation, req,
identity_owners, key):
req.signatures = {identity_owners[0]: "signature"}
assert write_request_validation(req, [key])
def test_claim_def_adding_success_2_owner(write_request_validation, req,
identity_owners, key):
req.signatures = {idr: "signature" for idr in identity_owners[:2]}
assert write_request_validation(req, [key])
def test_claim_def_adding_fail_1_trustee(write_request_validation, req,
trustees, key):
req.signatures = {trustees[0]: "signature"}
assert not write_request_validation(req, [key])
| 38.392857 | 113 | 0.713488 | 131 | 1,075 | 5.442748 | 0.351145 | 0.100982 | 0.185133 | 0.210379 | 0.41094 | 0.371669 | 0.322581 | 0.322581 | 0.322581 | 0.322581 | 0 | 0.008245 | 0.210233 | 1,075 | 27 | 114 | 39.814815 | 0.831567 | 0 | 0 | 0.222222 | 0 | 0 | 0.030698 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 1 | 0.222222 | false | 0 | 0.111111 | 0 | 0.388889 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
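The tests exercise an `AuthConstraint` requiring one signature from an IDENTITY_OWNER. A simplified sketch of such a role-threshold check (the role bookkeeping here is invented for illustration; the real indy-node validator is considerably more involved):

```python
def satisfies_constraint(required_role, sig_count, signatures, roles):
    # Count signers whose registered role matches the constraint,
    # then compare against the required threshold.
    matching = sum(1 for idr in signatures if roles.get(idr) == required_role)
    return matching >= sig_count


roles = {"owner1": "IDENTITY_OWNER", "trustee1": "TRUSTEE"}
print(satisfies_constraint("IDENTITY_OWNER", 1, {"owner1": "sig"}, roles))
print(satisfies_constraint("IDENTITY_OWNER", 1, {"trustee1": "sig"}, roles))
```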
f603fa533fa7372d6f36ecb1fb1141ca4cbdca2b | 53 | py | Python | code/answer_4-2-5.py | KoyanagiHitoshi/AtCoder-Python-Introduction | 6d014e333a873f545b4d32d438e57cf428b10b96 | [
"MIT"
] | 1 | 2022-03-29T13:50:12.000Z | 2022-03-29T13:50:12.000Z | code/answer_4-2-5.py | KoyanagiHitoshi/AtCoder-Python-Introduction | 6d014e333a873f545b4d32d438e57cf428b10b96 | [
"MIT"
] | null | null | null | code/answer_4-2-5.py | KoyanagiHitoshi/AtCoder-Python-Introduction | 6d014e333a873f545b4d32d438e57cf428b10b96 | [
"MIT"
] | null | null | null | S = input()
print(S+"s" if S[-1] != "s" else S+"es")
| 17.666667 | 40 | 0.471698 | 12 | 53 | 2.083333 | 0.583333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023256 | 0.188679 | 53 | 2 | 41 | 26.5 | 0.55814 | 0 | 0 | 0 | 0 | 0 | 0.075472 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 3 |
f6044ea76ca13f4f72c527f9840f77894b5d9897 | 693 | py | Python | computer-truevalue.py | yjnanan/Model_Free_Prediction | 135762a330ce6879973f005f370c886e53491922 | [
"Apache-2.0"
] | null | null | null | computer-truevalue.py | yjnanan/Model_Free_Prediction | 135762a330ce6879973f005f370c886e53491922 | [
"Apache-2.0"
] | null | null | null | computer-truevalue.py | yjnanan/Model_Free_Prediction | 135762a330ce6879973f005f370c886e53491922 | [
"Apache-2.0"
] | null | null | null | import numpy as np
def state_value_function(p, r, v):
    # Iteratively apply the Bellman expectation backup
    # v <- R + gamma * P * v until the estimate stops changing.
    gamma = 0.9
    while True:
        value = r.T + gamma * p * v
        if (value == v).all():
            break
        v = value
    return v
if __name__ == '__main__':
    # states: c1 c2 c3 pass pub fb sleep
    # probability matrix
P_matrix=np.mat([[0,0,0,0,0,0,0],
[0.5,0,0.5,0,0,0,0],
[0,0.5,0,0.5,0,0,0],
[0,0,0.5,0,0.5,0,0],
[0,0,0,0.5,0,0.5,0],
[0,0,0,0,0.5,0,0.5],
[0,0,0,0,0,0,0]])
R_matrix=np.mat([0,0,0,0,0,0,1])
v_function=np.mat(np.zeros((7,1)))
print(state_value_function(P_matrix,R_matrix,v_function)) | 28.875 | 61 | 0.464646 | 135 | 693 | 2.251852 | 0.281481 | 0.282895 | 0.305921 | 0.315789 | 0.286184 | 0.286184 | 0.286184 | 0.282895 | 0.282895 | 0.184211 | 0 | 0.159737 | 0.340548 | 693 | 24 | 61 | 28.875 | 0.50547 | 0.063492 | 0 | 0 | 0 | 0 | 0.012346 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.05 | false | 0 | 0.05 | 0 | 0.15 | 0.05 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
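For gamma < 1 the iteration above converges to the unique fixed point of v = R + gamma * P * v, which can also be solved directly as v = (I - gamma * P)^{-1} R. A numpy sketch over the same student-MRP matrices (a closed-form cross-check, not part of the original script):

```python
import numpy as np

gamma = 0.9
P = np.array([[0, 0, 0, 0, 0, 0, 0],
              [0.5, 0, 0.5, 0, 0, 0, 0],
              [0, 0.5, 0, 0.5, 0, 0, 0],
              [0, 0, 0.5, 0, 0.5, 0, 0],
              [0, 0, 0, 0.5, 0, 0.5, 0],
              [0, 0, 0, 0, 0.5, 0, 0.5],
              [0, 0, 0, 0, 0, 0, 0]], dtype=float)
R = np.array([[0], [0], [0], [0], [0], [0], [1.0]])

# Direct linear solve of (I - gamma * P) v = R.
v = np.linalg.solve(np.eye(7) - gamma * P, R)
print(v)
```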
f607b3974a5c6166e58b2e10c377a1e3e486d89a | 363 | py | Python | users/models.py | AhteshamSid/Blog-Post-Django | 12cfe49f9909b3f35decda396626bcd010fabc74 | [
"MIT"
] | null | null | null | users/models.py | AhteshamSid/Blog-Post-Django | 12cfe49f9909b3f35decda396626bcd010fabc74 | [
"MIT"
] | null | null | null | users/models.py | AhteshamSid/Blog-Post-Django | 12cfe49f9909b3f35decda396626bcd010fabc74 | [
"MIT"
] | null | null | null | from django.db import models
from django.contrib.auth.models import User
from PIL import Image
# Create your models here.
class Profile(models.Model):
user = models.OneToOneField(User,on_delete=models.CASCADE)
image = models.ImageField(default='default.jpg', upload_to='profile_pics')
def __str__(self):
return self.user.username
| 30.25 | 79 | 0.732782 | 49 | 363 | 5.285714 | 0.632653 | 0.07722 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173554 | 363 | 11 | 80 | 33 | 0.863333 | 0.066116 | 0 | 0 | 0 | 0 | 0.070769 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.375 | 0.125 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 3 |
f60fa01fcca07216f2bcd4f1876c31853730ce6d | 1,151 | py | Python | visual_mpc/envs/offline_env.py | Asap7772/visual_foresight | 13c631dc76ca1b61d7159473b3f2207ce2a3da04 | [
"MIT"
] | null | null | null | visual_mpc/envs/offline_env.py | Asap7772/visual_foresight | 13c631dc76ca1b61d7159473b3f2207ce2a3da04 | [
"MIT"
] | null | null | null | visual_mpc/envs/offline_env.py | Asap7772/visual_foresight | 13c631dc76ca1b61d7159473b3f2207ce2a3da04 | [
"MIT"
] | null | null | null | from visual_mpc.envs.base_env import BaseEnv
class OfflineSawyerEnv(BaseEnv):
"""
    Emulates a real-image Sawyer env without access to a robot; it only works together with the Offline Agent!
"""
def __init__(self, env_params_dict, reset_state=None):
self._hp = self._default_hparams()
self._adim, self._sdim = 4, 4
def _default_hparams(self):
default_dict = {}
parent_params = super()._default_hparams()
for k in default_dict.keys():
parent_params.add_hparam(k, default_dict[k])
return parent_params
def step(self, action):
"""
return None, the offline agent will append loaded observations
:param action:
:return:
"""
return None
def reset(self):
return self.step(None), None
def valid_rollout(self):
return True
@property
def adim(self):
return self._adim
@property
def sdim(self):
return self._sdim
@property
def ncam(self):
return 1
@property
def num_objects(self):
return 1
| 23.489796 | 106 | 0.588184 | 135 | 1,151 | 4.8 | 0.474074 | 0.092593 | 0.064815 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005181 | 0.329279 | 1,151 | 48 | 107 | 23.979167 | 0.834197 | 0.163336 | 0 | 0.206897 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.310345 | false | 0 | 0.034483 | 0.206897 | 0.655172 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
f62b5eeb8c988346be283fc9dfbe7ef972666e1e | 713 | py | Python | src/controllers/out.py | marijadebe/lab | b3b213116b25fc89db374f6e61a2513508d00934 | [
"MIT"
] | null | null | null | src/controllers/out.py | marijadebe/lab | b3b213116b25fc89db374f6e61a2513508d00934 | [
"MIT"
] | null | null | null | src/controllers/out.py | marijadebe/lab | b3b213116b25fc89db374f6e61a2513508d00934 | [
"MIT"
] | null | null | null | import sys
import re
from models.types import Types
def out(treechildren, variables):
print("")
for j in range(len(treechildren.children)):
if treechildren.getChild(j).getToken().getType() == Types.STRING:
sys.stdout.write(treechildren.getChild(j).getToken().getValue())
elif treechildren.getChild(j).getToken().getType() == Types.IDENTIFIER:
sys.stdout.write(variables[treechildren.getChild(j).getToken().getValue()])
elif treechildren.getChild(j).getToken().getType() == Types.ARGUMENT:
val = re.findall('[0-9]+',treechildren.getChild(j).getToken().getValue())
val = int(val[0])
sys.stdout.write(str(sys.argv[val+2])) | 44.5625 | 87 | 0.653576 | 84 | 713 | 5.547619 | 0.428571 | 0.257511 | 0.270386 | 0.373391 | 0.519313 | 0.439914 | 0.351931 | 0.351931 | 0.351931 | 0.351931 | 0 | 0.006861 | 0.182328 | 713 | 16 | 88 | 44.5625 | 0.792453 | 0 | 0 | 0 | 0 | 0 | 0.008403 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.071429 | false | 0 | 0.214286 | 0 | 0.285714 | 0.071429 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
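The ARGUMENT branch above extracts a numeric index from the token's value with a regex and then reads `sys.argv` at an offset. A small isolated sketch of just the index extraction (the token format `"arg[N]"` is my guess from the regex, not confirmed by the source):

```python
import re


def argument_index(token_value):
    # Pull the first run of digits out of an ARGUMENT token value;
    # the caller applies the +2 offset into sys.argv.
    return int(re.findall('[0-9]+', token_value)[0])


print(argument_index("arg[3]"))
```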
f63d7e029bf5a5228b9983d3d3b72e181cfc36c8 | 2,313 | py | Python | tests/test_prometheus_querybuilder.py | m-chrome/prometheus-query-builder | 14c391346b99742909265fc00b0e620fc0d6134c | [
"MIT"
] | null | null | null | tests/test_prometheus_querybuilder.py | m-chrome/prometheus-query-builder | 14c391346b99742909265fc00b0e620fc0d6134c | [
"MIT"
] | null | null | null | tests/test_prometheus_querybuilder.py | m-chrome/prometheus-query-builder | 14c391346b99742909265fc00b0e620fc0d6134c | [
"MIT"
] | null | null | null | from prometheus_query_builder.query import Query
import pytest
def test_query():
""" Test query with only metric name """
query = Query("http_requests_total")
assert str(query) == "http_requests_total"
def test_query_with_label():
query = Query("http_requests_total")
query.add_label("environment", "production")
assert str(query) == 'http_requests_total{environment="production"}'
def test_query_with_label_operators():
query = Query("http_requests_total")
query.add_label("environment", "production", "!=")
assert str(query) == 'http_requests_total{environment!="production"}'
def test_query_with_unsupported_operator():
query = Query("http_requests_total")
with pytest.raises(ValueError):
query.add_label("environment", "production", "!===")
def test_query_with_labels():
query = Query("http_requests_total")
query.add_label("environment", "production")
query.add_label("method", "GET")
assert str(query) == 'http_requests_total{environment="production",method="GET"}'
def test_query_with_label_update():
query = Query("http_requests_total")
query.add_label("environment", "production")
query.add_label("environment", "stage")
assert str(query) == 'http_requests_total{environment="stage"}'
def test_query_remove_label():
query = Query("http_requests_total")
query.add_label("environment", "production")
assert len(query.labels) == 1
query.remove_label("environment")
assert len(query.labels) == 0
def test_query_with_time_duration():
query = Query("http_requests_total")
query.add_label("environment", "production")
query.add_time_duration("5m")
assert str(query) == 'http_requests_total{environment="production"}[5m]'
def test_query_with_offset():
query = Query("http_requests_total")
query.add_offset("5m")
assert str(query) == 'http_requests_total offset 5m'
def test_query_with_at_modifier():
query = Query("http_requests_total")
query.add_at_modifier("1609746000")
assert str(query) == 'http_requests_total @ 1609746000'
def test_query_with_time_modifiers():
query = Query("http_requests_total")
query.add_offset("5m")
query.add_at_modifier("1609746000")
assert str(query) == 'http_requests_total offset 5m @ 1609746000'
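The tests above exercise a fluent PromQL query-builder pattern. As a rough, self-contained sketch of the label-rendering idea — this is not the actual `prometheus_query_builder` implementation, and the `MiniQuery` name is hypothetical — the core logic might look like:

```python
# Minimal sketch of a PromQL-style query renderer. The real
# prometheus_query_builder package has a richer API (durations,
# offsets, @ modifiers); only label handling is shown here.
class MiniQuery:
    def __init__(self, metric):
        self.metric = metric
        self.labels = {}  # label name -> (operator, value)

    def add_label(self, name, value, operator="="):
        # PromQL supports exactly these four label matchers.
        if operator not in ("=", "!=", "=~", "!~"):
            raise ValueError("unsupported label operator: " + operator)
        self.labels[name] = (operator, value)

    def __str__(self):
        if not self.labels:
            return self.metric
        body = ",".join(
            '{}{}"{}"'.format(name, op, value)
            for name, (op, value) in self.labels.items()
        )
        return "{}{{{}}}".format(self.metric, body)


q = MiniQuery("http_requests_total")
q.add_label("environment", "production")
q.add_label("method", "GET")
print(q)  # http_requests_total{environment="production",method="GET"}
```

Storing labels in a dict, as above, also reproduces the "re-adding a label updates it" behaviour that `test_query_with_label_update` checks.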
f6682ef64e0abbac8e8a75420b895edbc5772431 | 250 | py | Python | OOPs/inheritance.py | Shivams9/pythoncodecamp | e6cd27f4704a407ee360414a8c9236b254117a59 | [
"MIT"
] | 6 | 2021-08-04T08:15:22.000Z | 2022-02-02T11:15:56.000Z | OOPs/inheritance.py | Maurya232Abhishek/Python-repository-for-basics | 3dcec5c529a0847df07c9dcc1424675754ce6376 | [
"MIT"
] | 14 | 2021-08-02T06:28:00.000Z | 2022-03-25T10:44:15.000Z | OOPs/inheritance.py | Maurya232Abhishek/Python-repository-for-basics | 3dcec5c529a0847df07c9dcc1424675754ce6376 | [
"MIT"
] | 6 | 2021-07-16T04:56:41.000Z | 2022-02-16T04:40:06.000Z | class A:
def f1(self):
print("F1 in A")
class B:
def f2(self):
print("F2 in B")
def f1(self):
print("F1 in B")
class C(A,B):
def f(self):
A().f1()
B().f1()
c = C()
c.f1()
c.f2()
c.f()
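In the multiple-inheritance snippet above, `c.f1()` prints "F1 in A" rather than "F1 in B" because Python's method resolution order (MRO) lists `A` before `B` in `class C(A, B)`. A small self-contained illustration of how the MRO decides the lookup:

```python
# Method resolution order demo: C(A, B) looks up attributes
# left-to-right through C, A, B, then object.
class A:
    def f1(self):
        return "F1 in A"

class B:
    def f1(self):
        return "F1 in B"

class C(A, B):
    pass

print([cls.__name__ for cls in C.__mro__])  # ['C', 'A', 'B', 'object']
print(C().f1())  # F1 in A
```

This is also why the original file calls `A().f1()` and `B().f1()` explicitly inside `C.f`: that is the only way to reach `B.f1`, which the MRO otherwise shadows.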
9c9d85a146755f67b9bdd4855fe645173983f2c8 | 223 | py | Python | tests/run_tests.py | in-toto/layout-web-tool | cbadd131a692e8457d548da2d37012306ba4a7b9 | [
"MIT"
] | 1 | 2020-04-01T15:05:54.000Z | 2020-04-01T15:05:54.000Z | tests/run_tests.py | in-toto/layout-web-tool | cbadd131a692e8457d548da2d37012306ba4a7b9 | [
"MIT"
] | 42 | 2017-05-23T17:19:19.000Z | 2021-04-26T12:28:47.000Z | tests/run_tests.py | in-toto/layout-web-tool | cbadd131a692e8457d548da2d37012306ba4a7b9 | [
"MIT"
] | 7 | 2017-05-04T02:13:07.000Z | 2020-07-09T10:56:03.000Z | from unittest import defaultTestLoader, TextTestRunner
import sys
suite = defaultTestLoader.discover(start_dir=".")
result = TextTestRunner(verbosity=2, buffer=True).run(suite)
sys.exit(0 if result.wasSuccessful() else 1)
9cbc11953dd73fb6f511c74d730fad379ee2138f | 91 | py | Python | config.py | agithug/fan-controller | 046ecd9120c3a3fa0c581cbcd3dc89bd50a7373c | [
"MIT"
] | null | null | null | config.py | agithug/fan-controller | 046ecd9120c3a3fa0c581cbcd3dc89bd50a7373c | [
"MIT"
] | null | null | null | config.py | agithug/fan-controller | 046ecd9120c3a3fa0c581cbcd3dc89bd50a7373c | [
"MIT"
] | null | null | null | # Used as a configuration file
def fan():
    pin = 4
    dependencies = ["python3", "gpiozero"]
    return pin, dependencies
9cc696533b798f11c9e94346fe3734171f19a56d | 328 | py | Python | api/common/JQSDK_Connect.py | abcdcamey/stock-data | bfdc67e60b7d4de59c66dbb52159574b4e0a5e51 | [
"MIT"
] | 1 | 2019-04-10T09:07:59.000Z | 2019-04-10T09:07:59.000Z | api/common/JQSDK_Connect.py | abcdcamey/stock-data | bfdc67e60b7d4de59c66dbb52159574b4e0a5e51 | [
"MIT"
] | 2 | 2021-06-01T23:39:34.000Z | 2021-12-13T19:58:31.000Z | api/common/JQSDK_Connect.py | abcdcamey/stock-data | bfdc67e60b7d4de59c66dbb52159574b4e0a5e51 | [
"MIT"
] | null | null | null | #coding=utf-8
from jqdatasdk import *
class JQSDK_Connect:
def __init__(self):
        # auth('18818273592', 'Chenmin585858')  # The account is the mobile number given at registration; the password is the JoinQuant website login password (for new users it defaults to the last 6 digits of the mobile number).
auth('15821295829', 'Niuniuniu2020')
#auth('13816896174', '896174')
def get_query_count(self):
return get_query_count()
9cd61c204b329ca886ff80f063d2527cb83d49ab | 462 | py | Python | sanskrit_parser/generator/test/test_ajanta_stri.py | avinashvarna/sanskrit_parser | 79338213128b29927fe2f06031379bb1e3864a83 | [
"MIT"
] | 54 | 2017-06-30T09:11:53.000Z | 2022-03-22T15:35:41.000Z | sanskrit_parser/generator/test/test_ajanta_stri.py | avinashvarna/sanskrit_parser | 79338213128b29927fe2f06031379bb1e3864a83 | [
"MIT"
] | 159 | 2017-06-30T07:04:36.000Z | 2021-06-17T17:03:43.000Z | sanskrit_parser/generator/test/test_ajanta_stri.py | avinashvarna/sanskrit_parser | 79338213128b29927fe2f06031379bb1e3864a83 | [
"MIT"
] | 18 | 2017-08-17T13:22:00.000Z | 2022-01-20T01:08:58.000Z | from sanskrit_parser.generator.pratyaya import * # noqa: F403, F401
from sanskrit_parser.generator.dhatu import * # noqa: F403, F401
from sanskrit_parser.generator.pratipadika import * # noqa: F403, F401
from sanskrit_parser.base.sanskrit_base import DEVANAGARI
from sanskrit_parser.generator.sutras_yaml import sutra_list
from conftest import run_test
def test_vibhakti_ajanta_stri(ajanta_stri):
run_test(ajanta_stri, sutra_list, encoding=DEVANAGARI)
9cec9b5ef5691da4f654da65ea5fb44ee14018a8 | 63 | py | Python | certifi/__init__.py | andrew-aladev/certifi-shim | efbda777bfbdcf0676a789b15010b43a54e4da1a | [
"CC0-1.0"
] | null | null | null | certifi/__init__.py | andrew-aladev/certifi-shim | efbda777bfbdcf0676a789b15010b43a54e4da1a | [
"CC0-1.0"
] | null | null | null | certifi/__init__.py | andrew-aladev/certifi-shim | efbda777bfbdcf0676a789b15010b43a54e4da1a | [
"CC0-1.0"
] | null | null | null | from certifi.core import contents, where
__version__ = "9999"
9cf34c2b4493cdfb7aff2d968769acb0ab4ee469 | 578 | py | Python | song_generator/tests.py | gorbulus/song_generator | ba527e7a0151177f794995d0d79fdeffff45b7fb | [
"MIT"
] | null | null | null | song_generator/tests.py | gorbulus/song_generator | ba527e7a0151177f794995d0d79fdeffff45b7fb | [
"MIT"
] | null | null | null | song_generator/tests.py | gorbulus/song_generator | ba527e7a0151177f794995d0d79fdeffff45b7fb | [
"MIT"
] | null | null | null | '''
# tests.py
# William Ponton
# 5.9.21
# Tests for the song_generator project
'''
# import packages
import song_generator.generator as gen
import song_generator.song_data as s_data
import song_generator.string_literals as s_lit
from song_generator.SongClass import Song # Song class
# test output from each import file
def test_import_files():
print("\n...testing...")
print("Hello world ~ from main.py")
gen.test_output()
s_data.test_output()
Song.test_output()
s_lit.test_output()
print("Hello world ~ from tests.py")
    return print("...testing completed...")
1404d7c78ab63ab6c44fb7aef71aaac733eb3dc8 | 578 | py | Python | install_imports.py | surajssd/kube-hunter | 0157ac83ce6a735d4a431e9e3a6d4ec847aa6fc1 | [
"Apache-2.0"
] | 1 | 2019-09-25T12:31:33.000Z | 2019-09-25T12:31:33.000Z | install_imports.py | surajssd/kube-hunter | 0157ac83ce6a735d4a431e9e3a6d4ec847aa6fc1 | [
"Apache-2.0"
] | null | null | null | install_imports.py | surajssd/kube-hunter | 0157ac83ce6a735d4a431e9e3a6d4ec847aa6fc1 | [
"Apache-2.0"
] | 1 | 2020-08-17T16:05:45.000Z | 2020-08-17T16:05:45.000Z | from os.path import basename
import glob
def get_py_files(path):
for py_file in glob.glob("{}*.py".format(path)):
if not py_file.endswith("__init__.py"):
yield basename(py_file)[:-3]
def install_static_imports(path):
with open("{}__init__.py".format(path), 'w') as init_f:
for pf in get_py_files(path):
init_f.write("from .{} import *\n".format(pf))
install_static_imports("src/modules/discovery/")
install_static_imports("src/modules/hunting/")
install_static_imports("src/modules/report/")
install_static_imports("plugins/")
142c7611071a9367cf6a272e8d29329b7ce52109 | 193 | py | Python | api/serializers/server.py | lndba/apasa_backend | e0bb96e22a22f6e2a5a2826f225388113473e7e2 | [
"Apache-2.0"
] | 1 | 2019-08-06T07:31:40.000Z | 2019-08-06T07:31:40.000Z | api/serializers/server.py | lndba/apasa_backend | e0bb96e22a22f6e2a5a2826f225388113473e7e2 | [
"Apache-2.0"
] | null | null | null | api/serializers/server.py | lndba/apasa_backend | e0bb96e22a22f6e2a5a2826f225388113473e7e2 | [
"Apache-2.0"
] | null | null | null | from rest_framework import serializers
from api.models import *
class ServerListSerializers(serializers.ModelSerializer):
class Meta:
model = Server
fields = "__all__"
143f866b2d562d6e58d70545f8b51ba4cdbd1e78 | 486 | py | Python | gnosis/eth/oracles/__init__.py | titandac/gnosis-py | cf0af4f25e64b22256eabb415d0f3fe3a6180b14 | [
"MIT"
] | 64 | 2018-09-26T19:56:50.000Z | 2022-03-18T21:45:59.000Z | gnosis/eth/oracles/__init__.py | zhanghao-ic/gnosis-py | d2a5912547b7d1b576c826909f4c1d0155db536f | [
"MIT"
] | 151 | 2018-09-10T21:42:05.000Z | 2022-03-31T12:33:31.000Z | gnosis/eth/oracles/__init__.py | zhanghao-ic/gnosis-py | d2a5912547b7d1b576c826909f4c1d0155db536f | [
"MIT"
] | 50 | 2018-12-13T20:43:46.000Z | 2022-03-30T09:32:32.000Z | # flake8: noqa F401
from .oracles import (
AaveOracle,
BalancerOracle,
CannotGetPriceFromOracle,
ComposedPriceOracle,
CreamOracle,
CurveOracle,
EnzymeOracle,
InvalidPriceFromOracle,
KyberOracle,
MooniswapOracle,
OracleException,
PoolTogetherOracle,
PriceOracle,
PricePoolOracle,
SushiswapOracle,
UnderlyingToken,
UniswapOracle,
UniswapV2Oracle,
UsdPricePoolOracle,
YearnOracle,
ZerionComposedOracle,
)
145854982248ea84d9abe0f478e5e22274de2489 | 142 | py | Python | vet_care/doc_events/patient_appointment.py | neerajvkn/vet_care | 14914b22e7a83265d736f9f9dc5186271ae62d66 | [
"MIT"
] | 2 | 2020-11-23T11:14:32.000Z | 2021-02-03T06:40:33.000Z | vet_care/doc_events/patient_appointment.py | neerajvkn/vet_care | 14914b22e7a83265d736f9f9dc5186271ae62d66 | [
"MIT"
] | null | null | null | vet_care/doc_events/patient_appointment.py | neerajvkn/vet_care | 14914b22e7a83265d736f9f9dc5186271ae62d66 | [
"MIT"
] | 7 | 2019-11-16T14:36:33.000Z | 2021-08-25T07:54:51.000Z | import frappe
def validate(doc, method):
customer = frappe.db.get_value('Patient', doc.patient, 'customer')
doc.vc_owner = customer
1459e47e29f07327f77b66aae27e1f15b3c747d9 | 2,587 | py | Python | PyObjCTest/test_nsjavasetup.py | Khan/pyobjc-framework-Cocoa | f8b015ea2a72d8d78be6084fb12925c4785b8f1f | [
"MIT"
] | 132 | 2015-01-01T10:02:42.000Z | 2022-03-09T12:51:01.000Z | mac/pyobjc-framework-Cocoa/PyObjCTest/test_nsjavasetup.py | mba811/music-player | 7998986b34cfda2244ef622adefb839331b81a81 | [
"BSD-2-Clause"
] | 6 | 2015-01-06T08:23:19.000Z | 2019-03-14T12:22:06.000Z | mac/pyobjc-framework-Cocoa/PyObjCTest/test_nsjavasetup.py | mba811/music-player | 7998986b34cfda2244ef622adefb839331b81a81 | [
"BSD-2-Clause"
] | 27 | 2015-02-23T11:51:43.000Z | 2022-03-07T02:34:18.000Z | from PyObjCTools.TestSupport import *
import os
from Foundation import *
try:
unicode
except NameError:
unicode = str
class TestNSJavaSetup (TestCase):
@max_os_level('10.5')
def testConstants(self):
self.assertIsInstance(NSJavaClasses, unicode)
self.assertIsInstance(NSJavaRoot, unicode)
self.assertIsInstance(NSJavaPath, unicode)
self.assertIsInstance(NSJavaUserPath, unicode)
self.assertIsInstance(NSJavaLibraryPath, unicode)
self.assertIsInstance(NSJavaOwnVirtualMachine, unicode)
self.assertIsInstance(NSJavaPathSeparator, unicode)
self.assertIsInstance(NSJavaWillSetupVirtualMachineNotification, unicode)
self.assertIsInstance(NSJavaDidSetupVirtualMachineNotification, unicode)
self.assertIsInstance(NSJavaWillCreateVirtualMachineNotification, unicode)
self.assertIsInstance(NSJavaDidCreateVirtualMachineNotification, unicode)
@max_os_level('10.5')
def testFunctions(self):
v = NSJavaNeedsVirtualMachine({})
self.assertIs(v, False)
v = NSJavaProvidesClasses({})
self.assertIs(v, False)
v = NSJavaNeedsToLoadClasses({})
self.assertIs(v, False)
vm = NSJavaSetup({})
self.assertIsInstance(vm, objc.objc_object)
v = NSJavaSetupVirtualMachine()
self.assertIsInstance(v, objc.objc_object)
v = NSJavaObjectNamedInPath("java.lang.Object", None)
self.assertIsInstance(v, objc.objc_object)
v, vm = NSJavaClassesFromPath(None, ['java.lang.Object'], True, None)
self.assertIsInstance(v, NSArray)
self.assertEqual(len(v), 1)
self.assertIsInstance(vm, objc.objc_object)
v, vm = NSJavaClassesForBundle(NSBundle.mainBundle(), True, None)
self.assertIsInstance(v, NSArray)
self.assertEqual(len(v), 0)
self.assertIsInstance(vm, objc.objc_object)
vm = NSJavaBundleSetup(NSBundle.mainBundle(), {})
self.assertIsInstance(vm, objc.objc_object)
# FIXME: NSJavaBundleCleanup gives an exception
# This seems to be related to the way we call these APIs and I don't
# plan to fix is (there is no problem with PyObjC or the Foundation
# wrappers)
fd = os.dup(2)
x = os.open('/dev/null', os.O_WRONLY)
os.dup2(x, 2)
os.close(x)
try:
try:
NSJavaBundleCleanup(NSBundle.mainBundle(), {})
except ValueError:
pass
finally:
os.dup2(fd, 2)
if __name__ == "__main__":
main()
145dd3d7b020a6d88ed30bd57eb630dd062a950e | 378 | py | Python | DataStructures and Algorithms/Ammortization onArrays/Gameentry.py | abhishekratnam/Datastructuresandalgorithmsinpython | 9339319f441755797f4d2818ac9cf742a63ab5ea | [
"MIT"
] | null | null | null | DataStructures and Algorithms/Ammortization onArrays/Gameentry.py | abhishekratnam/Datastructuresandalgorithmsinpython | 9339319f441755797f4d2818ac9cf742a63ab5ea | [
"MIT"
] | null | null | null | DataStructures and Algorithms/Ammortization onArrays/Gameentry.py | abhishekratnam/Datastructuresandalgorithmsinpython | 9339319f441755797f4d2818ac9cf742a63ab5ea | [
"MIT"
] | null | null | null | class Gameentry:
"""Represents one entry of a list of high scores"""
def __init__(self,name,score):
self._name = name
self._score = score
def get_name(self):
return self._name
def get_score(self):
return self._score
def __str__(self):
return ' ({0},{1})'.format(self._name,self._score)#(bob,98)
| 23.625 | 68 | 0.574074 | 49 | 378 | 4.102041 | 0.469388 | 0.159204 | 0.129353 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015209 | 0.304233 | 378 | 15 | 69 | 25.2 | 0.749049 | 0.142857 | 0 | 0 | 0 | 0 | 0.033003 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0 | 0.3 | 0.8 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
148ceb1fb000771c1bef8a1a38a56aebb024d47d | 1,457 | py | Python | route/__init__.py | zyguan/pyroute | 16527374419780672f7ceae6f2a123a336603572 | [
"MIT"
] | null | null | null | route/__init__.py | zyguan/pyroute | 16527374419780672f7ceae6f2a123a336603572 | [
"MIT"
] | null | null | null | route/__init__.py | zyguan/pyroute | 16527374419780672f7ceae6f2a123a336603572 | [
"MIT"
] | null | null | null | import _route
VINCENTY = 1
HAVERSINE = 2
def distance(lat1, lng1, lat2, lng2, formula=VINCENTY, iterlimit=20):
"""
Calculate distance between two points.
:param lat1: latitude of point 1 in degrees
:param lng1: longitude of point 1 in degrees
:param lat2: latitude of point 2 in degrees
:param lng2: longitude of point 2 in degrees
:param formula: distance formula to use
:param iterlimit: max iterations, used by vincenty's formula
:return: the distance between (lat1, lng1) and (lat2, lng2) in meter
"""
return _route.distance(lat1, lng1, lat2, lng2, formula, iterlimit)
def distance_vincenty(lat1, lng1, lat2, lng2):
return _route.distance(lat1, lng1, lat2, lng2, VINCENTY)
def distance_haversine(lat1, lng1, lat2, lng2):
return _route.distance(lat1, lng1, lat2, lng2, HAVERSINE)
def measure(ps, formula=VINCENTY, iterlimit=20):
"""
Calculate cumulative distances of a route.
:param ps: points in route, eg: [(lng1, lat1), (lng2, lat2), ...]
:param formula: distance formula to use
:param iterlimit: max iterations, used by vincenty's formula
:return: the cumulative distance array `ds` where ds[j]-ds[i] is the
distance from (lat_i, lng_i) to (lat_j, lng_j) in meter.
"""
return _route.measure(ps, formula, iterlimit)
def measure_vincenty(ps):
return _route.measure(ps, VINCENTY)
def measure_haversine(ps):
return _route.measure(ps, HAVERSINE)
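`_route` is a compiled extension module, so its internals are not visible here. As a rough pure-Python sketch of the haversine formula that `distance_haversine` presumably wraps — same degree-based inputs, metre output; the `haversine_m` name and the mean-Earth-radius constant are assumptions, not the extension's actual code:

```python
import math

def haversine_m(lat1, lng1, lat2, lng2, radius_m=6371008.8):
    """Great-circle distance in metres between two points given in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lng2 - lng1)
    # a is the squared half-chord length between the points
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_m * math.asin(math.sqrt(a))

# Paris -> London is roughly 344 km
print(round(haversine_m(48.8566, 2.3522, 51.5074, -0.1278) / 1000), "km")
```

Vincenty's formula (the module's default) is iterative and more accurate on the ellipsoid, which is why the wrapper exposes an `iterlimit` parameter; haversine needs no iteration but assumes a spherical Earth.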
149096338c91567b2635fcffc0b59e5d56c47ad5 | 1,958 | py | Python | TMTool/Scripts/Scoring/owasp_rr.py | tmart234/TMTool | 054d055d73fa5e76be22bf04cf1136b88cb7efb9 | [
"MIT"
] | 7 | 2021-04-23T23:13:02.000Z | 2022-03-21T03:35:47.000Z | TMTool/Scripts/Scoring/owasp_rr.py | tmart234/TMTool | 054d055d73fa5e76be22bf04cf1136b88cb7efb9 | [
"MIT"
] | 1 | 2022-03-21T03:45:45.000Z | 2022-03-21T03:45:45.000Z | TMTool/Scripts/Scoring/owasp_rr.py | tmart234/TMTool | 054d055d73fa5e76be22bf04cf1136b88cb7efb9 | [
"MIT"
] | null | null | null | ## based on OWASP risk rating
# calculates impact or likelihood
def calc_scores(_dict):
    # set Numerical Score
    _dict["Numerical Score"] = 0
    for key, value in _dict.items():
        if "Score" in key:
            continue
        _dict["Numerical Score"] = value + _dict["Numerical Score"]
# set Categorical Score
if _dict["Numerical Score"] < 3:
_dict["Categorical Score"] = "Low"
elif _dict["Numerical Score"] < 6:
_dict["Categorical Score"] = "Medium"
elif _dict["Numerical Score"] < 9:
_dict["Categorical Score"] = "High"
else:
_dict["Categorical Score"] = None
return
# translating the risk matrix
def calc_risk(_likihood,_impact):
if _likihood["Categorical Score"] == "Low" and _impact["Categorical Score"] == "Low":
return "Note"
elif (_likihood["Categorical Score"] == "Low" and _impact["Categorical Score"] == "Medium") or \
(_likihood["Categorical Score"] == "Medium" and _impact["Categorical Score"] == "Low"):
return "Low"
elif (_likihood["Categorical Score"] == "Low" and _impact["Categorical Score"] == "High") or \
(_likihood["Categorical Score"] == "High" and _impact["Categorical Score"] == "Low") or \
(_likihood["Categorical Score"] == "Medium" and _impact["Categorical Score"] == "Medium"):
return "Medium"
elif (_likihood["Categorical Score"] == "High" and _impact["Categorical Score"] == "Medium") or \
(_likihood["Categorical Score"] == "Medium" and _impact["Categorical Score"] == "High"):
return "High"
elif _likihood["Categorical Score"] == "High" and _impact["Categorical Score"] == "High":
return "Critical"
else:
return None
likihood = dict.fromkeys(["SL", "M", "O", "S", "ED", "EE", "Aw", "ID"], 0)
likihood.update({"Numerical Score": 0, "Categorical Score": None})
impact = dict.fromkeys(["C", "I", "A", "Ac", "FD", "RD", "NCD", "PV"], 0)
impact.update({"Numerical Score": 0, "Categorical Score": None})
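Note that `dict.fromkeys` takes an iterable of keys plus one shared default value; passing each key as a separate positional argument (as the original lines did) raises `TypeError`. A quick self-contained illustration of the corrected initialisation and the score-summing step:

```python
# dict.fromkeys(iterable, default) builds one dict entry per key,
# all sharing the same default value.
factors = ["SL", "M", "O", "S", "ED", "EE", "Aw", "ID"]
likelihood = dict.fromkeys(factors, 0)
likelihood.update({"Numerical Score": 0, "Categorical Score": None})

likelihood["SL"] = 5   # e.g. skill level
likelihood["M"] = 4    # e.g. motive

# Sum every factor, skipping the two bookkeeping "Score" keys.
total = sum(v for k, v in likelihood.items() if "Score" not in k)
print(total)  # 9
```

Be careful with a mutable default (e.g. `dict.fromkeys(factors, [])`): every key would share the *same* list object, which is rarely what you want.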
14b10ba84092552ef5446166e93acde05dd54b70 | 376 | py | Python | nbdev_calc/core.py | alinaselega/nbdev_calc | 4e7cffb81a08770aea7652c10481e41d5d13a05e | [
"Apache-2.0"
] | null | null | null | nbdev_calc/core.py | alinaselega/nbdev_calc | 4e7cffb81a08770aea7652c10481e41d5d13a05e | [
"Apache-2.0"
] | null | null | null | nbdev_calc/core.py | alinaselega/nbdev_calc | 4e7cffb81a08770aea7652c10481e41d5d13a05e | [
"Apache-2.0"
] | null | null | null | # AUTOGENERATED! DO NOT EDIT! File to edit: 00_core.ipynb (unless otherwise specified).
__all__ = ['add', 'multiply', 'subtract', 'divide']
# Cell
def add(x, y):
"Add x and y"
return x+y
def multiply(x, y):
"Multiply x and y"
return x*y
def subtract(x, y):
"Subtract y from x"
return add(x, -y)
def divide(x, y):
"Divide x by y"
    return x/y
14b48aef525713da7768110b19cc3d3319051874 | 456 | py | Python | mainApp/Serializers.py | MSNP1381/rayanRooyeshCityBack | f3e8224538949c4951d4b25e1f14e4fcc02fdc53 | [
"MIT"
] | null | null | null | mainApp/Serializers.py | MSNP1381/rayanRooyeshCityBack | f3e8224538949c4951d4b25e1f14e4fcc02fdc53 | [
"MIT"
] | null | null | null | mainApp/Serializers.py | MSNP1381/rayanRooyeshCityBack | f3e8224538949c4951d4b25e1f14e4fcc02fdc53 | [
"MIT"
] | null | null | null | from rest_framework import serializers
from .models import Transaction,Sections
class TransactionSerializer(serializers.ModelSerializer):
class Meta:
model = Transaction
fields = '__all__'
def create(self, data):
print(100*"%")
print(data)
return Transaction.objects.create(**data)
class SectionSerializer(serializers.ModelSerializer):
class Meta:
model = Sections
        fields = '__all__'
1ad8843aee0e7aba85c685628079d06bf297c82d | 2,064 | py | Python | src/leetcode_384_shuffle_an_array.py | sungho-joo/leetcode2github | ce7730ef40f6051df23681dd3c0e1e657abba620 | [
"MIT"
] | null | null | null | src/leetcode_384_shuffle_an_array.py | sungho-joo/leetcode2github | ce7730ef40f6051df23681dd3c0e1e657abba620 | [
"MIT"
] | null | null | null | src/leetcode_384_shuffle_an_array.py | sungho-joo/leetcode2github | ce7730ef40f6051df23681dd3c0e1e657abba620 | [
"MIT"
] | null | null | null | # @l2g 384 python3
# [384] Shuffle an Array
# Difficulty: Medium
# https://leetcode.com/problems/shuffle-an-array
#
# Given an integer array nums,design an algorithm to randomly shuffle the array.
# All permutations of the array should be equally likely as a result of the shuffling.
# Implement the Solution class:
#
# Solution(int[] nums) Initializes the object with the integer array nums.
# int[] reset() Resets the array to its original configuration and returns it.
# int[] shuffle() Returns a random shuffling of the array.
#
#
# Example 1:
#
# Input
# ["Solution", "shuffle", "reset", "shuffle"]
# [[[1, 2, 3]], [], [], []]
# Output
# [null, [3, 1, 2], [1, 2, 3], [1, 3, 2]]
#
# Explanation
# Solution solution = new Solution([1, 2, 3]);
# solution.shuffle(); // Shuffle the array [1,2,3] and return its result.
# // Any permutation of [1,2,3] must be equally likely to be returned.
# // Example: return [3, 1, 2]
# solution.reset(); // Resets the array back to its original configuration [1,2,3].Return [1,2,3]
# solution.shuffle(); // Returns the random shuffling of array [1,2,3]. Example: return [1, 3, 2]
#
#
#
# Constraints:
#
# 1 <= nums.length <= 200
# -10^6 <= nums[i] <= 10^6
# All the elements of nums are unique.
# At most 5 * 10^4 calls in total will be made to reset and shuffle.
#
#
import random
from typing import List
class Solution:
def __init__(self, nums: List[int]):
self.nums = nums
def reset(self) -> List[int]:
"""
Resets the array to its original configuration and return it.
"""
return self.nums
def shuffle(self) -> List[int]:
"""
Returns a random shuffling of the array.
"""
return random.sample(self.nums, len(self.nums))
# Your Solution object will be instantiated and called as such:
# obj = Solution(nums)
# param_1 = obj.reset()
# param_2 = obj.shuffle()
if __name__ == "__main__":
import os
import pytest
pytest.main([os.path.join("tests", "test_384.py")])
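`random.sample(self.nums, len(self.nums))` above returns a uniformly random permutation without mutating the original list. The classic in-place alternative is the Fisher–Yates shuffle (what `random.shuffle` implements), sketched here as a standalone illustration:

```python
import random

def fisher_yates(items):
    """Shuffle a list in place; every permutation is equally likely."""
    for i in range(len(items) - 1, 0, -1):
        j = random.randint(0, i)  # pick any index in the not-yet-fixed prefix
        items[i], items[j] = items[j], items[i]
    return items

nums = [1, 2, 3, 4, 5]
shuffled = fisher_yates(nums[:])          # shuffle a copy, like reset() expects
print(sorted(shuffled) == [1, 2, 3, 4, 5])  # True: still a permutation
```

The key to uniformity is drawing `j` from `0..i` inclusive; drawing from the whole range at every step is a common bug that biases the result.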
1af37e5031f27006259d50bd37911e3e99b40d74 | 771 | py | Python | nest_py/knoweng/data_types/public_gene_sets.py | KnowEnG/platform | 7356fabf5e2db4171ef1f910514436b69ecaa701 | [
"MIT"
] | 2 | 2020-02-12T22:20:51.000Z | 2020-07-31T03:19:51.000Z | nest_py/knoweng/data_types/public_gene_sets.py | KnowEnG/platform | 7356fabf5e2db4171ef1f910514436b69ecaa701 | [
"MIT"
] | 1 | 2021-06-02T00:29:02.000Z | 2021-06-02T00:29:02.000Z | nest_py/knoweng/data_types/public_gene_sets.py | KnowEnG/platform | 7356fabf5e2db4171ef1f910514436b69ecaa701 | [
"MIT"
] | 1 | 2018-01-03T22:56:27.000Z | 2018-01-03T22:56:27.000Z | """
A collection of public gene sets.
"""
from nest_py.core.data_types.tablelike_schema import TablelikeSchema
COLLECTION_NAME = 'public_gene_sets'
def generate_schema():
schema = TablelikeSchema(COLLECTION_NAME)
schema.add_categoric_attribute('set_id', valid_values=None)
schema.add_categoric_attribute('set_name', valid_values=None)
schema.add_numeric_attribute('species_id')
schema.add_numeric_attribute('gene_count')
schema.add_categoric_attribute('collection', valid_values=None)
schema.add_categoric_attribute('edge_type_name', valid_values=None)
schema.add_categoric_attribute('supercollection', valid_values=None)
schema.add_categoric_attribute('url', valid_values=None)
schema.add_index(['set_id'])
return schema
1af6926099c1d7d07976357c858e6720fbdd0e0b | 565 | py | Python | server/utils/cors.py | amirdib/dapp_dashboard | e3f556b20ea2d866300e7501f8c720dc0eb56f4b | [
"MIT"
] | null | null | null | server/utils/cors.py | amirdib/dapp_dashboard | e3f556b20ea2d866300e7501f8c720dc0eb56f4b | [
"MIT"
] | null | null | null | server/utils/cors.py | amirdib/dapp_dashboard | e3f556b20ea2d866300e7501f8c720dc0eb56f4b | [
"MIT"
] | null | null | null | from bottle import response, request
def enable_cors(func):
def _enable_cors(*args, **kwargs):
# Set CORS headers
response.headers['Access-Control-Allow-Origin'] = '*'
response.headers['Access-Control-Allow-Methods'] = 'GET, POST, PUT, OPTIONS'
response.headers['Access-Control-Allow-Headers'] = 'Origin, Accept, Content-Type, X-Requested-With, X-CSRF-Token'
if request.method != 'OPTIONS':
# actual request; reply with the actual response
return func(*args, **kwargs)
return _enable_cors
| 35.3125 | 121 | 0.654867 | 67 | 565 | 5.447761 | 0.537313 | 0.082192 | 0.172603 | 0.230137 | 0.271233 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.217699 | 565 | 15 | 122 | 37.666667 | 0.825792 | 0.111504 | 0 | 0 | 0 | 0 | 0.348697 | 0.166333 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.111111 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
1af6c73c4393c2fe75719524db1ca5b86014c72d | 2,367 | py | Python | TerminalStory/Jack.py | TomlinsonJ03/4006CEM-Mini-Project | ac2ebcfa5793475d7bfb074034b5aa23cfd1858e | [
"CC0-1.0"
] | null | null | null | TerminalStory/Jack.py | TomlinsonJ03/4006CEM-Mini-Project | ac2ebcfa5793475d7bfb074034b5aa23cfd1858e | [
"CC0-1.0"
] | null | null | null | TerminalStory/Jack.py | TomlinsonJ03/4006CEM-Mini-Project | ac2ebcfa5793475d7bfb074034b5aa23cfd1858e | [
"CC0-1.0"
] | null | null | null | # Entering the Basement
def basement():
print("""
I will walk into the basement, there seems to be a robot there, working hard it seems. They don’t see the things the way I do, it exclaimed.
I wonder what it means, see what? Maybe I will approach it and find out. It seems to be working on some kind of budgeting, the finances of the family perhaps.
However, it is scribbling the words free will, programming. Perhaps the robots have some sort of sentience,
Maybe the robot in the garden has a passion for that sort of thing, maybe robots love to cook. But then again, this one doesn’t seem to love what it’s doing,
more like, it knows that it’s simply a slave who isn’t allowed to do as it pleases. Maybe it ignored me because it hopes that I will make a choice,
will I save the robot or leave it be. No surely not, a robot has no free thought, it’s just an AI with programming,
no-one would program a robot to hate what it is doing, do they have such emotions? How do you make it so a robot can experience such things,
would it be considered real if so? I suppose I must make a choice.
""")
print("1: Unplug the robot")
print("2: Leave the robot to suffer")
choice = int(input("Which option will you choose? "))
if choice == 1:
unplugRobot()
elif choice == 2:
robotSuffer()
# Unplug the robot
def unplugRobot():
print("""
I ponder the overall necessity for the robot and weigh the possible consequences that could unravel if it was no longer active.
None surpassed a high enough level of degree to garner concern, I lay my finger upon a circular button illuminated by a warm white led,
I press it and enjoy a moment of silence as its internal components hum until still.
I then unplug the robot and take the power cord with me in hope of the robot never being activated again.
""")
# Let the robot suffer
def robotSuffer():
print("""
I notice the robot in despair and locate the unorganized box of components and stripped wires, I turn off the robot and begin swapping out parts,
refitting incorrect capacitors and resistors, mismatch the organized wiring with a mesh of unsafe wires and leave the existing body panels in front of
its visual sensor to gaze upon until next reset.
""") | 62.289474 | 167 | 0.709759 | 402 | 2,367 | 4.179104 | 0.5 | 0.057143 | 0.025 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002233 | 0.243346 | 2,367 | 38 | 168 | 62.289474 | 0.93579 | 0.024926 | 0 | 0.193548 | 0 | 0.387097 | 0.895445 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.096774 | false | 0.064516 | 0 | 0 | 0.096774 | 0.16129 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 3 |
210693b9ec3b9f3189d565efa51e605d1e16d8d3 | 3,311 | py | Python | src/genie/libs/parser/iosxe/tests/ShowBgpAllDetail/cli/equal/golden_output4_expected.py | jmedina0911/genieparser | 2fcd6d3e44891551af4b9d05e2c053218ee25c32 | [
"Apache-2.0"
] | null | null | null | src/genie/libs/parser/iosxe/tests/ShowBgpAllDetail/cli/equal/golden_output4_expected.py | jmedina0911/genieparser | 2fcd6d3e44891551af4b9d05e2c053218ee25c32 | [
"Apache-2.0"
] | null | null | null | src/genie/libs/parser/iosxe/tests/ShowBgpAllDetail/cli/equal/golden_output4_expected.py | jmedina0911/genieparser | 2fcd6d3e44891551af4b9d05e2c053218ee25c32 | [
"Apache-2.0"
] | null | null | null | expected_output = {
"instance": {
"default": {
"vrf": {
"vrf1": {
"address_family": {
"": {
"default_vrf": "vrf1",
"prefixes": {
"0.0.0.0/0": {
"available_path": "2",
"best_path": "1",
"index": {
1: {
"community": "163:43242 2002:8 2002:35 2002:53 2002:100 2002:1000",
"ext_community": "RT:65002:3014",
"gateway": "9.2.2.2",
"localpref": 950,
"next_hop": "9.2.2.2",
"next_hop_via": "vrf vrf1",
"origin_codes": "i",
"originator": "9.207.128.21",
"recipient_pathid": "0",
"refresh_epoch": 2,
"route_info": "304 13",
"status_codes": "*>",
"transfer_pathid": "0x0",
"update_group": 64624,
},
2: {
"community": "163:43242 2002:8 2002:35 2002:53 2002:100 2002:1000",
"ext_community": "RT:65002:3014",
"gateway": "9.91.117.38",
"imported_path_from": "9.91.117.38:3014:0.0.0.0/0 (global)",
"localpref": 950,
"metric": 0,
"next_hop": "9.91.117.38",
"next_hop_igp_metric": "11",
"next_hop_via": "default",
"origin_codes": "i",
"originator": "9.91.117.38",
"recipient_pathid": "0",
"refresh_epoch": 17,
"route_info": "64624",
"status_codes": "* i",
"transfer_pathid": "0",
"update_group": 64624,
},
},
"paths": "2 available, best #1, table vrf1",
"table_version": "74438",
}
},
"route_distinguisher": "9.1.1.1:3014",
}
}
}
}
}
}
}
| 53.403226 | 111 | 0.231048 | 187 | 3,311 | 3.909091 | 0.385027 | 0.021888 | 0.024624 | 0.021888 | 0.372093 | 0.218878 | 0.218878 | 0.218878 | 0.218878 | 0.218878 | 0 | 0.202752 | 0.670794 | 3,311 | 61 | 112 | 54.278689 | 0.46789 | 0 | 0 | 0.196721 | 0 | 0.016393 | 0.245545 | 0.007853 | 0 | 0 | 0.000906 | 0 | 0 | 1 | 0 | false | 0 | 0.016393 | 0 | 0.016393 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
214a07d8cc0215d98c308b5d7c5428fd93ac98c9 | 355 | py | Python | cvat_reader/video_reader/dummy_reader.py | eyedl/cvat_reader | 9f4837d82b3bb7946f6fb03b3c8fb0991a6d929f | [
"BSD-3-Clause"
] | null | null | null | cvat_reader/video_reader/dummy_reader.py | eyedl/cvat_reader | 9f4837d82b3bb7946f6fb03b3c8fb0991a6d929f | [
"BSD-3-Clause"
] | 5 | 2021-12-03T12:56:38.000Z | 2022-01-18T20:35:01.000Z | cvat_reader/video_reader/dummy_reader.py | eyedl/cvat_reader | 9f4837d82b3bb7946f6fb03b3c8fb0991a6d929f | [
"BSD-3-Clause"
] | 2 | 2022-02-09T08:53:35.000Z | 2022-03-16T10:21:14.000Z | from typing import Tuple, Any
from .base import VideoReader
class DummyVideoReader(VideoReader):
def __init__(self):
self.frame_id = 0
def read_frame(self) -> Tuple[int, Any]:
frame_id = self.frame_id
self.frame_id += 1
return frame_id, None
def seek(self, frame_id: int):
self.frame_id = frame_id
| 20.882353 | 44 | 0.647887 | 50 | 355 | 4.34 | 0.42 | 0.258065 | 0.253456 | 0.147465 | 0.133641 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007663 | 0.264789 | 355 | 16 | 45 | 22.1875 | 0.823755 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.272727 | false | 0 | 0.181818 | 0 | 0.636364 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
2158eb73536fe9b11e0d8fd0dd4f2a37c52f7ad1 | 458 | py | Python | services/divnik/src/api/migrations/0005_auto_20200503_0933.py | pomo-mondreganto/innoctf-final-10-05-2020 | 8d31bb43b543c6948b1e09d9816728a10125c4be | [
"WTFPL"
] | 1 | 2021-01-04T23:52:34.000Z | 2021-01-04T23:52:34.000Z | services/divnik/src/api/migrations/0005_auto_20200503_0933.py | pomo-mondreganto/innoctf-final-10-05-2020 | 8d31bb43b543c6948b1e09d9816728a10125c4be | [
"WTFPL"
] | null | null | null | services/divnik/src/api/migrations/0005_auto_20200503_0933.py | pomo-mondreganto/innoctf-final-10-05-2020 | 8d31bb43b543c6948b1e09d9816728a10125c4be | [
"WTFPL"
] | 2 | 2020-10-22T12:27:25.000Z | 2020-10-22T12:27:28.000Z | # Generated by Django 3.0.3 on 2020-05-03 09:33
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('api', '0004_auto_20200502_1946'),
]
operations = [
migrations.AlterModelOptions(
name='course',
options={'ordering': ('-id',)},
),
migrations.AlterModelOptions(
name='user',
options={'ordering': ('-id',)},
),
]
| 21.809524 | 47 | 0.543668 | 42 | 458 | 5.857143 | 0.738095 | 0.219512 | 0.252033 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.098101 | 0.310044 | 458 | 20 | 48 | 22.9 | 0.68038 | 0.098253 | 0 | 0.4 | 1 | 0 | 0.141119 | 0.055961 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.066667 | 0 | 0.266667 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
21686e2c5a6fb53bb4999288166075c2a704a954 | 7,152 | py | Python | food_reference_listing/database/models.py | bfssi-forest-dussault/food_reference_listing | 85372a81a9201dda02797ab0c11b1bd710f9b70d | [
"MIT"
] | null | null | null | food_reference_listing/database/models.py | bfssi-forest-dussault/food_reference_listing | 85372a81a9201dda02797ab0c11b1bd710f9b70d | [
"MIT"
] | null | null | null | food_reference_listing/database/models.py | bfssi-forest-dussault/food_reference_listing | 85372a81a9201dda02797ab0c11b1bd710f9b70d | [
"MIT"
] | null | null | null | from django.db import models
from simple_history.models import HistoricalRecords
class Language(models.Model):
language_id = models.CharField(max_length=2, blank=True, null=True, unique=True)
description_en = models.CharField(max_length=16, blank=True, null=True, unique=True)
description_fr = models.CharField(max_length=16, blank=True, null=True, unique=True)
updated = models.DateTimeField() # Historical record of previous DB
history = HistoricalRecords()
def __str__(self):
try:
return f'{self.language_id}: {self.description_en}'
except TypeError as e:
return ''
class AcronymType(models.Model):
acronym_type_id = models.CharField(max_length=16, null=True, blank=True, unique=True)
description_en = models.CharField(max_length=16, null=True, blank=True)
description_fr = models.CharField(max_length=16, null=True, blank=True)
updated = models.DateTimeField() # Historical record of previous DB
history = HistoricalRecords()
class Acronym(models.Model):
acronym_id = models.CharField(max_length=8, null=True, blank=True, unique=True)
language = models.ForeignKey(Language, related_name="acronyms", on_delete=models.SET_NULL, null=True)
description_en = models.CharField(max_length=64, blank=True, null=True)
description_fr = models.CharField(max_length=64, blank=True, null=True)
acronym_type = models.ForeignKey(AcronymType, related_name="acronyms", on_delete=models.SET_NULL, null=True)
updated = models.DateTimeField() # Historical record of previous DB
history = HistoricalRecords()
class Category(models.Model):
category_id = models.IntegerField(null=True, blank=True, unique=True)
header_en = models.CharField(max_length=256, blank=True, null=True)
header_fr = models.CharField(max_length=256, blank=True, null=True)
note_en = models.TextField(blank=True, null=True)
note_fr = models.TextField(blank=True, null=True)
updated = models.DateTimeField() # Historical record of previous DB
history = HistoricalRecords()
def __str__(self):
try:
return f'{self.category_id}: {self.header_en}'
except TypeError as e:
return ''
class Meta:
verbose_name = 'Category'
verbose_name_plural = 'Categories'
class Country(models.Model):
country_id = models.IntegerField(null=True, blank=True, unique=True)
description_en = models.CharField(max_length=256, null=True, blank=True)
description_fr = models.CharField(max_length=256, null=True, blank=True)
updated = models.DateTimeField() # Historical record of previous DB
history = HistoricalRecords()
class ProvinceState(models.Model):
province_state_id = models.IntegerField(null=True, blank=True, unique=True)
province_state_code = models.CharField(max_length=8, unique=True, null=True, blank=True)
description_en = models.CharField(max_length=256, null=True, blank=True)
description_fr = models.CharField(max_length=256, null=True, blank=True)
country = models.ForeignKey(Country, related_name="provincestates", on_delete=models.CASCADE)
updated = models.DateTimeField() # Historical record of previous DB
history = HistoricalRecords()
class City(models.Model):
city_id = models.IntegerField(null=True, blank=True, unique=True)
description_en = models.CharField(max_length=256, null=True, blank=True)
description_fr = models.CharField(max_length=256, null=True, blank=True)
province_state = models.ForeignKey(ProvinceState, on_delete=models.SET_NULL, blank=True, null=True)
country = models.ForeignKey(Country, related_name="cities", on_delete=models.SET_NULL, null=True, blank=True)
updated = models.DateTimeField() # Historical record of previous DB
history = HistoricalRecords()
class Company(models.Model):
company_id = models.CharField(max_length=16, unique=True, blank=True, null=True)
name_en = models.CharField(max_length=256, blank=True, null=True)
name_fr = models.CharField(max_length=256, blank=True, null=True)
language = models.ForeignKey(Language, related_name="companies", on_delete=models.SET_NULL, null=True)
city = models.ForeignKey(City, related_name="companies", on_delete=models.SET_NULL, null=True)
postal_code = models.CharField(max_length=16, blank=True, null=True)
updated = models.DateTimeField() # Historical record of previous DB
history = HistoricalRecords()
def __str__(self):
try:
return f'{self.company_id}: {self.name_en}'
except TypeError as e:
return ''
class Meta:
verbose_name = 'Company'
verbose_name_plural = 'Companies'
class Subcategory(models.Model):
subcategory_id = models.IntegerField(null=True, blank=True, unique=True)
subcategory_code = models.CharField(max_length=8, null=True, blank=True, unique=True)
topic_en = models.CharField(max_length=256, null=True, blank=True)
topic_fr = models.CharField(max_length=256, null=True, blank=True)
long_topic_en = models.TextField(null=True, blank=True)
long_topic_fr = models.TextField(null=True, blank=True)
condition_use_en = models.TextField(null=True, blank=True)
condition_use_fr = models.TextField(null=True, blank=True)
category = models.ForeignKey(Category, related_name="subcategories", on_delete=models.CASCADE)
updated = models.DateTimeField() # Historical record of previous DB
history = HistoricalRecords()
def __str__(self):
try:
return f'{self.subcategory_code}: {self.topic_en}'
except TypeError as e:
return ''
class Meta:
verbose_name = 'Subcategory'
verbose_name_plural = 'Subcategories'
class Product(models.Model):
"""
Corresponds with extremely poorly named 'Final Web Update' table --> might need to combine with original Products
table, not sure how these datasets overlap though
"""
product_code = models.CharField(max_length=16, unique=True, blank=True, null=True)
product_name_en = models.CharField(max_length=256, unique=False, blank=True, null=True)
product_name_fr = models.CharField(max_length=256, unique=False, blank=True, null=True)
language = models.ForeignKey(Language, on_delete=models.SET_NULL, null=True, blank=True)
acceptance_date = models.DateTimeField(blank=True, null=True)
company = models.ForeignKey(Company, related_name="products", on_delete=models.CASCADE)
subcategory = models.ForeignKey(Subcategory, related_name="products", on_delete=models.CASCADE, blank=True,
null=True)
updated = models.DateTimeField(blank=True, null=True) # Historical record of previous DB
history = HistoricalRecords()
@property
def acceptance_date_pretty(self):
return f'{self.acceptance_date.strftime("%Y-%m-%d")}'
def __str__(self):
try:
return f'{self.product_code}: {self.product_name_en}'
except TypeError as e:
return ''
class Meta:
verbose_name = 'Product'
verbose_name_plural = 'Products'
| 44.981132 | 117 | 0.719938 | 909 | 7,152 | 5.49615 | 0.123212 | 0.078463 | 0.100881 | 0.134508 | 0.768415 | 0.754404 | 0.718975 | 0.623098 | 0.590072 | 0.522018 | 0 | 0.011152 | 0.172539 | 7,152 | 158 | 118 | 45.265823 | 0.833052 | 0.069072 | 0 | 0.395161 | 0 | 0 | 0.059134 | 0.016594 | 0 | 0 | 0 | 0 | 0 | 1 | 0.048387 | false | 0 | 0.016129 | 0.008065 | 0.83871 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
dcc424db20a8d7965b1e60f57bf13d6c0c24b2c1 | 308 | py | Python | GrAm/forms.py | claudianjeri/Instagram | 93b16495c0ec62eb4ab0327d18e3155b82f1b4bd | [
"MIT"
] | 3 | 2018-06-14T12:07:49.000Z | 2020-06-16T02:44:14.000Z | GrAm/forms.py | claudianjeri/Instagram | 93b16495c0ec62eb4ab0327d18e3155b82f1b4bd | [
"MIT"
] | null | null | null | GrAm/forms.py | claudianjeri/Instagram | 93b16495c0ec62eb4ab0327d18e3155b82f1b4bd | [
"MIT"
] | null | null | null | from .models import Image, Profile, Comment
from django import forms
class EditProfileForm(forms.ModelForm):
class Meta:
model = Profile
exclude = ['user']
class UploadForm(forms.ModelForm):
class Meta:
model = Image
exclude = ['user','likes','upload_date','profile'] | 28 | 58 | 0.662338 | 34 | 308 | 5.970588 | 0.558824 | 0.137931 | 0.187192 | 0.226601 | 0.275862 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.227273 | 308 | 11 | 58 | 28 | 0.852941 | 0 | 0 | 0.2 | 0 | 0 | 0.100324 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
dcc7cc7a40b45ab40e2a51260b50373958a8da92 | 187 | py | Python | data_collection/gazette/spiders/sc_tangara.py | Jefersonalves/diario-oficial | 9a4bdfe2e31414c993d88831a67160c49a5ee657 | [
"MIT"
] | 3 | 2021-08-18T17:50:31.000Z | 2021-11-12T23:36:33.000Z | data_collection/gazette/spiders/sc_tangara.py | Jefersonalves/diario-oficial | 9a4bdfe2e31414c993d88831a67160c49a5ee657 | [
"MIT"
] | 4 | 2021-02-10T02:36:48.000Z | 2022-03-02T14:55:34.000Z | data_collection/gazette/spiders/sc_tangara.py | Jefersonalves/diario-oficial | 9a4bdfe2e31414c993d88831a67160c49a5ee657 | [
"MIT"
] | null | null | null | from gazette.spiders.base import FecamGazetteSpider
class ScTangaraSpider(FecamGazetteSpider):
name = "sc_tangara"
FECAM_QUERY = "cod_entidade:268"
TERRITORY_ID = "4217907"
| 23.375 | 51 | 0.764706 | 20 | 187 | 6.95 | 0.95 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.063291 | 0.15508 | 187 | 7 | 52 | 26.714286 | 0.816456 | 0 | 0 | 0 | 0 | 0 | 0.176471 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
dcefdb99fe2524931fcc166cd7b76a8ef691d120 | 216 | py | Python | djconnectwise/management/commands/delete_callback.py | kti-cameron/django-connectwise | e3438d6f303163cd482ce31a98aca5abf15ec932 | [
"MIT"
] | null | null | null | djconnectwise/management/commands/delete_callback.py | kti-cameron/django-connectwise | e3438d6f303163cd482ce31a98aca5abf15ec932 | [
"MIT"
] | null | null | null | djconnectwise/management/commands/delete_callback.py | kti-cameron/django-connectwise | e3438d6f303163cd482ce31a98aca5abf15ec932 | [
"MIT"
] | null | null | null | from django.utils.translation import ugettext_lazy as _
from . import _callback
class Command(_callback.Command):
help = str(_('Deletes the callback from the target connectwise system.'))
ACTION = 'delete'
| 27 | 77 | 0.75463 | 27 | 216 | 5.851852 | 0.740741 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.162037 | 216 | 7 | 78 | 30.857143 | 0.872928 | 0 | 0 | 0 | 0 | 0 | 0.287037 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
dcfb3f72e395046eb90ede6f87900476cf79be0a | 43 | py | Python | conftest.py | sergei-bondarenko/exchange-simulator | a5bba00c38016823a5d18471da01fd4e04319b8e | [
"Unlicense"
] | 1 | 2021-03-16T06:04:12.000Z | 2021-03-16T06:04:12.000Z | conftest.py | sergei-bondarenko/exchange-simulator | a5bba00c38016823a5d18471da01fd4e04319b8e | [
"Unlicense"
] | null | null | null | conftest.py | sergei-bondarenko/exchange-simulator | a5bba00c38016823a5d18471da01fd4e04319b8e | [
"Unlicense"
] | null | null | null | pytest_plugins = [
'xchg.tests.data',
]
| 10.75 | 21 | 0.627907 | 5 | 43 | 5.2 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.186047 | 43 | 3 | 22 | 14.333333 | 0.742857 | 0 | 0 | 0 | 0 | 0 | 0.348837 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
0d096ee6b4123c7c0c3a2ec91a6b9b448d9095f1 | 131 | py | Python | code/webhookhandler/urls/hooks.py | superfluidity/RDCL3D | 3c5717941bd4046aa1be178e9004db1dc1c469a0 | [
"Apache-2.0"
] | 8 | 2017-03-13T16:34:28.000Z | 2021-11-16T11:35:56.000Z | code/webhookhandler/urls/hooks.py | superfluidity/RDCL3D | 3c5717941bd4046aa1be178e9004db1dc1c469a0 | [
"Apache-2.0"
] | null | null | null | code/webhookhandler/urls/hooks.py | superfluidity/RDCL3D | 3c5717941bd4046aa1be178e9004db1dc1c469a0 | [
"Apache-2.0"
] | 3 | 2017-03-28T09:26:40.000Z | 2020-12-08T14:16:12.000Z | from django.conf.urls import url
from webhookhandler import views
urlpatterns = [
url(r'^$', views.webhook, name='webhook'),
] | 21.833333 | 46 | 0.717557 | 17 | 131 | 5.529412 | 0.705882 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.145038 | 131 | 6 | 47 | 21.833333 | 0.839286 | 0 | 0 | 0 | 0 | 0 | 0.068182 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
0d17348f8f1d18cb7ea86e8791e3c59ce2012a91 | 240 | py | Python | kohtaaminen/__main__.py | sthagen/kohtaaminen | ce0784ccd8be109164d63f2b5dcea128bd6f4534 | [
"MIT"
] | 1 | 2021-11-13T10:57:55.000Z | 2021-11-13T10:57:55.000Z | kohtaaminen/__main__.py | sthagen/kohtaaminen | ce0784ccd8be109164d63f2b5dcea128bd6f4534 | [
"MIT"
] | 4 | 2021-11-14T15:12:06.000Z | 2021-11-30T13:54:47.000Z | kohtaaminen/__main__.py | sthagen/kohtaaminen | ce0784ccd8be109164d63f2b5dcea128bd6f4534 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# pylint: disable=expression-not-assigned,line-too-long,missing-module-docstring
import sys
from kohtaaminen.cli import app
if __name__ == '__main__':
sys.exit(app(prog_name='kohtaaminen')) # pragma: no cover
| 26.666667 | 80 | 0.725 | 33 | 240 | 5 | 0.848485 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004762 | 0.125 | 240 | 8 | 81 | 30 | 0.780952 | 0.4875 | 0 | 0 | 0 | 0 | 0.159664 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
0d2236c09fb61de0facb0db8ff5d2c14d3843c5c | 608 | py | Python | Virtualenv/Env/src/GoTravel/Home/models.py | Anoop01234/Go-Travel | aa91f1a4ce7e7ed78de8eadc55e6a25d1a73bdd8 | [
"MIT"
] | null | null | null | Virtualenv/Env/src/GoTravel/Home/models.py | Anoop01234/Go-Travel | aa91f1a4ce7e7ed78de8eadc55e6a25d1a73bdd8 | [
"MIT"
] | null | null | null | Virtualenv/Env/src/GoTravel/Home/models.py | Anoop01234/Go-Travel | aa91f1a4ce7e7ed78de8eadc55e6a25d1a73bdd8 | [
"MIT"
] | 1 | 2021-12-21T17:27:34.000Z | 2021-12-21T17:27:34.000Z | from django.db import models
class Chef(models.Model):
name = models.CharField(max_length=50)
position = models.CharField(max_length=50)
description = models.TextField()
photo = models.ImageField(upload_to='chef/')
class Meta:
verbose_name = 'Chef'
verbose_name_plural = 'Chefs'
def __str__(self):
return self.name
class Slider(models.Model):
image = models.ImageField(upload_to='slider/')
name = models.CharField(max_length=50)
class Meta:
verbose_name = 'Slider'
verbose_name_plural = 'Sliders'
def __str__(self):
return self.name
| 23.384615 | 48 | 0.685855 | 76 | 608 | 5.236842 | 0.407895 | 0.110553 | 0.135678 | 0.180905 | 0.336683 | 0.271357 | 0 | 0 | 0 | 0 | 0 | 0.012422 | 0.205592 | 608 | 25 | 49 | 24.32 | 0.811594 | 0 | 0 | 0.421053 | 0 | 0 | 0.055921 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.105263 | false | 0 | 0.052632 | 0.105263 | 0.789474 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
0d2792d0fd44769568c671a62acca11f98fc94a4 | 305 | py | Python | iotapp/Classes/httprequest.py | fcarbah/IOT | 6b323c75844d0a6744df12059466d6c54bd0b242 | [
"MIT"
] | null | null | null | iotapp/Classes/httprequest.py | fcarbah/IOT | 6b323c75844d0a6744df12059466d6c54bd0b242 | [
"MIT"
] | null | null | null | iotapp/Classes/httprequest.py | fcarbah/IOT | 6b323c75844d0a6744df12059466d6c54bd0b242 | [
"MIT"
] | null | null | null | import urllib3
class HttpRequest:
__http = None
def __init__(self):
self.__http = urllib3.PoolManager()
def post(self, url, data={}, headers={}):
return self.__http.request('POST', url, fields=data, headers=headers)
def get(self, url, headers={}):
return self.__http.request('GET', url, headers=headers) | 17.941176 | 52 | 0.629508 | 37 | 305 | 4.864865 | 0.432432 | 0.133333 | 0.155556 | 0.244444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008264 | 0.206557 | 305 | 17 | 53 | 17.941176 | 0.735537 | 0 | 0 | 0 | 0 | 0 | 0.022876 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.111111 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
0d2b31d40272b35f2023f3820d8bcc5908033214 | 1,795 | py | Python | UnityEngine/MatchTargetWeightMask/__init__.py | Grim-es/udon-pie-auto-completion | c2cd86554ed615cdbbb01e19fa40665eafdfaedc | [
"MIT"
] | null | null | null | UnityEngine/MatchTargetWeightMask/__init__.py | Grim-es/udon-pie-auto-completion | c2cd86554ed615cdbbb01e19fa40665eafdfaedc | [
"MIT"
] | null | null | null | UnityEngine/MatchTargetWeightMask/__init__.py | Grim-es/udon-pie-auto-completion | c2cd86554ed615cdbbb01e19fa40665eafdfaedc | [
"MIT"
] | null | null | null | from UdonPie import System
from UdonPie import UnityEngine
from UdonPie.Undefined import *
class MatchTargetWeightMask:
def __new__(cls, arg1=None):
'''
:returns: MatchTargetWeightMask
:rtype: UnityEngine.MatchTargetWeightMask
'''
pass
@staticmethod
def ctor(arg1, arg2):
'''
:param arg1: Vector3
:type arg1: UnityEngine.Vector3
:param arg2: Single
:type arg2: System.Single or float
:returns: MatchTargetWeightMask
:rtype: UnityEngine.MatchTargetWeightMask
'''
pass
@staticmethod
def get_positionXYZWeight():
'''
:returns: Vector3
:rtype: UnityEngine.Vector3
'''
pass
@staticmethod
def set_positionXYZWeight(arg1):
'''
:param arg1: Vector3
:type arg1: UnityEngine.Vector3
'''
pass
@staticmethod
def get_rotationWeight():
'''
:returns: Single
:rtype: System.Single
'''
pass
@staticmethod
def set_rotationWeight(arg1):
'''
:param arg1: Single
:type arg1: System.Single or float
'''
pass
@staticmethod
def Equals(arg1):
'''
:param arg1: Object
:type arg1: System.Object
:returns: Boolean
:rtype: System.Boolean
'''
pass
@staticmethod
def ToString():
'''
:returns: String
:rtype: System.String
'''
pass
@staticmethod
def GetHashCode():
'''
:returns: Int32
:rtype: System.Int32
'''
pass
@staticmethod
def GetType():
'''
:returns: Type
:rtype: System.Type
'''
pass
| 19.725275 | 49 | 0.528691 | 144 | 1,795 | 6.534722 | 0.256944 | 0.153029 | 0.181722 | 0.093518 | 0.327311 | 0.2678 | 0.2678 | 0.178533 | 0 | 0 | 0 | 0.023235 | 0.376602 | 1,795 | 90 | 50 | 19.944444 | 0.817694 | 0.359889 | 0 | 0.575758 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.30303 | false | 0.30303 | 0.090909 | 0 | 0.424242 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 3 |
0d2c7e6cf904b313acc9aca26daeafb4a2b97695 | 107 | py | Python | teagles_advent_2021/day_13/pt2.py | teagles/teagles-advent-2021 | d49d842663d6382bd0d4d93198ccd66ab68e681b | [
"MIT"
] | null | null | null | teagles_advent_2021/day_13/pt2.py | teagles/teagles-advent-2021 | d49d842663d6382bd0d4d93198ccd66ab68e681b | [
"MIT"
] | null | null | null | teagles_advent_2021/day_13/pt2.py | teagles/teagles-advent-2021 | d49d842663d6382bd0d4d93198ccd66ab68e681b | [
"MIT"
] | null | null | null | import sys
from .lib import parse_input
def main():
print()
if __name__ == '__main__':
main()
| 8.916667 | 28 | 0.626168 | 14 | 107 | 4.142857 | 0.785714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.252336 | 107 | 11 | 29 | 9.727273 | 0.725 | 0 | 0 | 0 | 0 | 0 | 0.074766 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | true | 0 | 0.333333 | 0 | 0.5 | 0.166667 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
b49c01b7f4d4caa36d93674286e808354fe98984 | 278 | py | Python | drogher/package/dhl.py | jbittel/drogher | 9a01a19aa6575f82949ef83a3e3b8834807d9cc9 | [
"BSD-3-Clause"
] | 13 | 2017-04-24T07:49:30.000Z | 2020-09-22T13:13:13.000Z | drogher/package/dhl.py | jbittel/drogher | 9a01a19aa6575f82949ef83a3e3b8834807d9cc9 | [
"BSD-3-Clause"
] | null | null | null | drogher/package/dhl.py | jbittel/drogher | 9a01a19aa6575f82949ef83a3e3b8834807d9cc9 | [
"BSD-3-Clause"
] | 4 | 2018-09-08T05:31:57.000Z | 2022-02-10T17:42:31.000Z | from .base import Package
class DHL(Package):
barcode_pattern = r'^\d{10}$'
shipper = 'DHL'
@property
def valid_checksum(self):
chars, check_digit = self.tracking_number[:-1], self.tracking_number[-1]
return int(chars) % 7 == int(check_digit)
| 23.166667 | 80 | 0.643885 | 37 | 278 | 4.675676 | 0.702703 | 0.115607 | 0.208092 | 0.219653 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023041 | 0.219424 | 278 | 11 | 81 | 25.272727 | 0.774194 | 0 | 0 | 0 | 0 | 0 | 0.039568 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.125 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
b4bee4983a1e8f4098187f57c27bfb93f46aeb4b | 338 | py | Python | src/addongen/src/addongen/__object__.py | KAIYOHUGO/addon-gen | 65c7db176bdef8c8746eb6db50b684f8cce23bd6 | [
"MIT"
] | null | null | null | src/addongen/src/addongen/__object__.py | KAIYOHUGO/addon-gen | 65c7db176bdef8c8746eb6db50b684f8cce23bd6 | [
"MIT"
] | null | null | null | src/addongen/src/addongen/__object__.py | KAIYOHUGO/addon-gen | 65c7db176bdef8c8746eb6db50b684f8cce23bd6 | [
"MIT"
] | null | null | null | class component:
    __type__ = str()
    data = object()


class query:
    __type__ = str()
    query_id = int()


class itemStack:
    __identifier__ = str()
    __type__ = str()
    count = str()
    item = str()


class block:
    __identifier__ = str()
    __type__ = str()
    block_position = object()
    ticking_area = object()
| 14.695652 | 29 | 0.591716 | 35 | 338 | 4.942857 | 0.485714 | 0.16185 | 0.196532 | 0.231214 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.289941 | 338 | 22 | 30 | 15.363636 | 0.720833 | 0 | 0 | 0.375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
b4c44cfa861a3cb2f7399a87693dea11f5c7ba5f | 481 | py | Python | search_service/models/base.py | verdan/amundsensearchlibrary | ca57770d82f7956339376655aaae736870671a66 | [
"Apache-2.0"
] | null | null | null | search_service/models/base.py | verdan/amundsensearchlibrary | ca57770d82f7956339376655aaae736870671a66 | [
"Apache-2.0"
] | null | null | null | search_service/models/base.py | verdan/amundsensearchlibrary | ca57770d82f7956339376655aaae736870671a66 | [
"Apache-2.0"
] | null | null | null | from abc import ABCMeta, abstractmethod
from typing import Set
class Base(metaclass=ABCMeta):
"""
A base class for ES model
"""
@abstractmethod
def get_id(cls) -> str:
# return a document id in ES
pass
@abstractmethod
def get_attrs(cls) -> Set:
# return a set of attributes for the class
pass
@staticmethod
@abstractmethod
def get_type() -> str:
# return a type string for the class
pass
| 19.24 | 50 | 0.609148 | 61 | 481 | 4.754098 | 0.491803 | 0.175862 | 0.206897 | 0.103448 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.322245 | 481 | 24 | 51 | 20.041667 | 0.889571 | 0.268191 | 0 | 0.461538 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.230769 | false | 0.230769 | 0.153846 | 0 | 0.461538 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 3 |
370211a081a208a56f69edfdacecf44b58bc1ce6 | 26,188 | py | Python | airflow/providers/google/cloud/operators/vertex_ai/auto_ml.py | arezamoosavi/airflow | c3c81c3144386d1de535c1c5e777270e727bb69e | [
"Apache-2.0"
] | 2 | 2016-08-23T14:22:15.000Z | 2017-09-28T19:45:26.000Z | airflow/providers/google/cloud/operators/vertex_ai/auto_ml.py | arezamoosavi/airflow | c3c81c3144386d1de535c1c5e777270e727bb69e | [
"Apache-2.0"
] | 4 | 2019-01-24T11:01:17.000Z | 2022-02-28T04:28:07.000Z | airflow/providers/google/cloud/operators/vertex_ai/auto_ml.py | arezamoosavi/airflow | c3c81c3144386d1de535c1c5e777270e727bb69e | [
"Apache-2.0"
] | 6 | 2018-04-09T07:46:05.000Z | 2019-07-16T00:13:15.000Z | #
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
#
"""This module contains Google Vertex AI operators."""
from typing import TYPE_CHECKING, Dict, List, Optional, Sequence, Tuple, Union
from google.api_core.exceptions import NotFound
from google.api_core.retry import Retry
from google.cloud.aiplatform import datasets
from google.cloud.aiplatform.models import Model
from google.cloud.aiplatform_v1.types.training_pipeline import TrainingPipeline
from airflow.models import BaseOperator
from airflow.providers.google.cloud.hooks.vertex_ai.auto_ml import AutoMLHook
from airflow.providers.google.cloud.links.vertex_ai import VertexAIModelLink, VertexAITrainingPipelinesLink
if TYPE_CHECKING:
    from airflow.utils.context import Context
class AutoMLTrainingJobBaseOperator(BaseOperator):
"""The base class for operators that launch AutoML jobs on VertexAI."""
def __init__(
self,
*,
project_id: str,
region: str,
display_name: str,
labels: Optional[Dict[str, str]] = None,
training_encryption_spec_key_name: Optional[str] = None,
model_encryption_spec_key_name: Optional[str] = None,
# RUN
training_fraction_split: Optional[float] = None,
test_fraction_split: Optional[float] = None,
model_display_name: Optional[str] = None,
model_labels: Optional[Dict[str, str]] = None,
sync: bool = True,
gcp_conn_id: str = "google_cloud_default",
delegate_to: Optional[str] = None,
impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.project_id = project_id
self.region = region
self.display_name = display_name
self.labels = labels
self.training_encryption_spec_key_name = training_encryption_spec_key_name
self.model_encryption_spec_key_name = model_encryption_spec_key_name
# START Run param
self.training_fraction_split = training_fraction_split
self.test_fraction_split = test_fraction_split
self.model_display_name = model_display_name
self.model_labels = model_labels
self.sync = sync
# END Run param
self.gcp_conn_id = gcp_conn_id
self.delegate_to = delegate_to
self.impersonation_chain = impersonation_chain
self.hook = None # type: Optional[AutoMLHook]
def on_kill(self) -> None:
"""
Callback called when the operator is killed.
Cancel any running job.
"""
if self.hook:
self.hook.cancel_auto_ml_job()
class CreateAutoMLForecastingTrainingJobOperator(AutoMLTrainingJobBaseOperator):
"""Create AutoML Forecasting Training job"""
template_fields = [
'region',
'impersonation_chain',
]
operator_extra_links = (VertexAIModelLink(),)
def __init__(
self,
*,
dataset_id: str,
target_column: str,
time_column: str,
time_series_identifier_column: str,
unavailable_at_forecast_columns: List[str],
available_at_forecast_columns: List[str],
forecast_horizon: int,
data_granularity_unit: str,
data_granularity_count: int,
optimization_objective: Optional[str] = None,
column_specs: Optional[Dict[str, str]] = None,
column_transformations: Optional[List[Dict[str, Dict[str, str]]]] = None,
validation_fraction_split: Optional[float] = None,
predefined_split_column_name: Optional[str] = None,
weight_column: Optional[str] = None,
time_series_attribute_columns: Optional[List[str]] = None,
context_window: Optional[int] = None,
export_evaluated_data_items: bool = False,
export_evaluated_data_items_bigquery_destination_uri: Optional[str] = None,
export_evaluated_data_items_override_destination: bool = False,
quantiles: Optional[List[float]] = None,
validation_options: Optional[str] = None,
budget_milli_node_hours: int = 1000,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.dataset_id = dataset_id
self.target_column = target_column
self.time_column = time_column
self.time_series_identifier_column = time_series_identifier_column
self.unavailable_at_forecast_columns = unavailable_at_forecast_columns
self.available_at_forecast_columns = available_at_forecast_columns
self.forecast_horizon = forecast_horizon
self.data_granularity_unit = data_granularity_unit
self.data_granularity_count = data_granularity_count
self.optimization_objective = optimization_objective
self.column_specs = column_specs
self.column_transformations = column_transformations
self.validation_fraction_split = validation_fraction_split
self.predefined_split_column_name = predefined_split_column_name
self.weight_column = weight_column
self.time_series_attribute_columns = time_series_attribute_columns
self.context_window = context_window
self.export_evaluated_data_items = export_evaluated_data_items
self.export_evaluated_data_items_bigquery_destination_uri = (
export_evaluated_data_items_bigquery_destination_uri
)
self.export_evaluated_data_items_override_destination = (
export_evaluated_data_items_override_destination
)
self.quantiles = quantiles
self.validation_options = validation_options
self.budget_milli_node_hours = budget_milli_node_hours
def execute(self, context: "Context"):
self.hook = AutoMLHook(
gcp_conn_id=self.gcp_conn_id,
delegate_to=self.delegate_to,
impersonation_chain=self.impersonation_chain,
)
model = self.hook.create_auto_ml_forecasting_training_job(
project_id=self.project_id,
region=self.region,
display_name=self.display_name,
dataset=datasets.TimeSeriesDataset(dataset_name=self.dataset_id),
target_column=self.target_column,
time_column=self.time_column,
time_series_identifier_column=self.time_series_identifier_column,
unavailable_at_forecast_columns=self.unavailable_at_forecast_columns,
available_at_forecast_columns=self.available_at_forecast_columns,
forecast_horizon=self.forecast_horizon,
data_granularity_unit=self.data_granularity_unit,
data_granularity_count=self.data_granularity_count,
optimization_objective=self.optimization_objective,
column_specs=self.column_specs,
column_transformations=self.column_transformations,
labels=self.labels,
training_encryption_spec_key_name=self.training_encryption_spec_key_name,
model_encryption_spec_key_name=self.model_encryption_spec_key_name,
training_fraction_split=self.training_fraction_split,
validation_fraction_split=self.validation_fraction_split,
test_fraction_split=self.test_fraction_split,
predefined_split_column_name=self.predefined_split_column_name,
weight_column=self.weight_column,
time_series_attribute_columns=self.time_series_attribute_columns,
context_window=self.context_window,
export_evaluated_data_items=self.export_evaluated_data_items,
export_evaluated_data_items_bigquery_destination_uri=(
self.export_evaluated_data_items_bigquery_destination_uri
),
export_evaluated_data_items_override_destination=(
self.export_evaluated_data_items_override_destination
),
quantiles=self.quantiles,
validation_options=self.validation_options,
budget_milli_node_hours=self.budget_milli_node_hours,
model_display_name=self.model_display_name,
model_labels=self.model_labels,
sync=self.sync,
)
result = Model.to_dict(model)
model_id = self.hook.extract_model_id(result)
VertexAIModelLink.persist(context=context, task_instance=self, model_id=model_id)
return result
class CreateAutoMLImageTrainingJobOperator(AutoMLTrainingJobBaseOperator):
"""Create Auto ML Image Training job"""
template_fields = [
'region',
'impersonation_chain',
]
operator_extra_links = (VertexAIModelLink(),)
def __init__(
self,
*,
dataset_id: str,
prediction_type: str = "classification",
multi_label: bool = False,
model_type: str = "CLOUD",
base_model: Optional[Model] = None,
validation_fraction_split: Optional[float] = None,
training_filter_split: Optional[str] = None,
validation_filter_split: Optional[str] = None,
test_filter_split: Optional[str] = None,
budget_milli_node_hours: Optional[int] = None,
disable_early_stopping: bool = False,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.dataset_id = dataset_id
self.prediction_type = prediction_type
self.multi_label = multi_label
self.model_type = model_type
self.base_model = base_model
self.validation_fraction_split = validation_fraction_split
self.training_filter_split = training_filter_split
self.validation_filter_split = validation_filter_split
self.test_filter_split = test_filter_split
self.budget_milli_node_hours = budget_milli_node_hours
self.disable_early_stopping = disable_early_stopping
def execute(self, context: "Context"):
self.hook = AutoMLHook(
gcp_conn_id=self.gcp_conn_id,
delegate_to=self.delegate_to,
impersonation_chain=self.impersonation_chain,
)
model = self.hook.create_auto_ml_image_training_job(
project_id=self.project_id,
region=self.region,
display_name=self.display_name,
dataset=datasets.ImageDataset(dataset_name=self.dataset_id),
prediction_type=self.prediction_type,
multi_label=self.multi_label,
model_type=self.model_type,
base_model=self.base_model,
labels=self.labels,
training_encryption_spec_key_name=self.training_encryption_spec_key_name,
model_encryption_spec_key_name=self.model_encryption_spec_key_name,
training_fraction_split=self.training_fraction_split,
validation_fraction_split=self.validation_fraction_split,
test_fraction_split=self.test_fraction_split,
training_filter_split=self.training_filter_split,
validation_filter_split=self.validation_filter_split,
test_filter_split=self.test_filter_split,
budget_milli_node_hours=self.budget_milli_node_hours,
model_display_name=self.model_display_name,
model_labels=self.model_labels,
disable_early_stopping=self.disable_early_stopping,
sync=self.sync,
)
result = Model.to_dict(model)
model_id = self.hook.extract_model_id(result)
VertexAIModelLink.persist(context=context, task_instance=self, model_id=model_id)
return result
class CreateAutoMLTabularTrainingJobOperator(AutoMLTrainingJobBaseOperator):
"""Create Auto ML Tabular Training job"""
template_fields = [
'region',
'impersonation_chain',
]
operator_extra_links = (VertexAIModelLink(),)
def __init__(
self,
*,
dataset_id: str,
target_column: str,
optimization_prediction_type: str,
optimization_objective: Optional[str] = None,
column_specs: Optional[Dict[str, str]] = None,
column_transformations: Optional[List[Dict[str, Dict[str, str]]]] = None,
optimization_objective_recall_value: Optional[float] = None,
optimization_objective_precision_value: Optional[float] = None,
validation_fraction_split: Optional[float] = None,
predefined_split_column_name: Optional[str] = None,
timestamp_split_column_name: Optional[str] = None,
weight_column: Optional[str] = None,
budget_milli_node_hours: int = 1000,
disable_early_stopping: bool = False,
export_evaluated_data_items: bool = False,
export_evaluated_data_items_bigquery_destination_uri: Optional[str] = None,
export_evaluated_data_items_override_destination: bool = False,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.dataset_id = dataset_id
self.target_column = target_column
self.optimization_prediction_type = optimization_prediction_type
self.optimization_objective = optimization_objective
self.column_specs = column_specs
self.column_transformations = column_transformations
self.optimization_objective_recall_value = optimization_objective_recall_value
self.optimization_objective_precision_value = optimization_objective_precision_value
self.validation_fraction_split = validation_fraction_split
self.predefined_split_column_name = predefined_split_column_name
self.timestamp_split_column_name = timestamp_split_column_name
self.weight_column = weight_column
self.budget_milli_node_hours = budget_milli_node_hours
self.disable_early_stopping = disable_early_stopping
self.export_evaluated_data_items = export_evaluated_data_items
self.export_evaluated_data_items_bigquery_destination_uri = (
export_evaluated_data_items_bigquery_destination_uri
)
self.export_evaluated_data_items_override_destination = (
export_evaluated_data_items_override_destination
)
def execute(self, context: "Context"):
self.hook = AutoMLHook(
gcp_conn_id=self.gcp_conn_id,
delegate_to=self.delegate_to,
impersonation_chain=self.impersonation_chain,
)
model = self.hook.create_auto_ml_tabular_training_job(
project_id=self.project_id,
region=self.region,
display_name=self.display_name,
dataset=datasets.TabularDataset(dataset_name=self.dataset_id),
target_column=self.target_column,
optimization_prediction_type=self.optimization_prediction_type,
optimization_objective=self.optimization_objective,
column_specs=self.column_specs,
column_transformations=self.column_transformations,
optimization_objective_recall_value=self.optimization_objective_recall_value,
optimization_objective_precision_value=self.optimization_objective_precision_value,
labels=self.labels,
training_encryption_spec_key_name=self.training_encryption_spec_key_name,
model_encryption_spec_key_name=self.model_encryption_spec_key_name,
training_fraction_split=self.training_fraction_split,
validation_fraction_split=self.validation_fraction_split,
test_fraction_split=self.test_fraction_split,
predefined_split_column_name=self.predefined_split_column_name,
timestamp_split_column_name=self.timestamp_split_column_name,
weight_column=self.weight_column,
budget_milli_node_hours=self.budget_milli_node_hours,
model_display_name=self.model_display_name,
model_labels=self.model_labels,
disable_early_stopping=self.disable_early_stopping,
export_evaluated_data_items=self.export_evaluated_data_items,
export_evaluated_data_items_bigquery_destination_uri=(
self.export_evaluated_data_items_bigquery_destination_uri
),
export_evaluated_data_items_override_destination=(
self.export_evaluated_data_items_override_destination
),
sync=self.sync,
)
result = Model.to_dict(model)
model_id = self.hook.extract_model_id(result)
VertexAIModelLink.persist(context=context, task_instance=self, model_id=model_id)
return result
class CreateAutoMLTextTrainingJobOperator(AutoMLTrainingJobBaseOperator):
"""Create Auto ML Text Training job"""
template_fields = [
'region',
'impersonation_chain',
]
operator_extra_links = (VertexAIModelLink(),)
def __init__(
self,
*,
dataset_id: str,
prediction_type: str,
multi_label: bool = False,
sentiment_max: int = 10,
validation_fraction_split: Optional[float] = None,
training_filter_split: Optional[str] = None,
validation_filter_split: Optional[str] = None,
test_filter_split: Optional[str] = None,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.dataset_id = dataset_id
self.prediction_type = prediction_type
self.multi_label = multi_label
self.sentiment_max = sentiment_max
self.validation_fraction_split = validation_fraction_split
self.training_filter_split = training_filter_split
self.validation_filter_split = validation_filter_split
self.test_filter_split = test_filter_split
def execute(self, context: "Context"):
self.hook = AutoMLHook(
gcp_conn_id=self.gcp_conn_id,
delegate_to=self.delegate_to,
impersonation_chain=self.impersonation_chain,
)
model = self.hook.create_auto_ml_text_training_job(
project_id=self.project_id,
region=self.region,
display_name=self.display_name,
dataset=datasets.TextDataset(dataset_name=self.dataset_id),
prediction_type=self.prediction_type,
multi_label=self.multi_label,
sentiment_max=self.sentiment_max,
labels=self.labels,
training_encryption_spec_key_name=self.training_encryption_spec_key_name,
model_encryption_spec_key_name=self.model_encryption_spec_key_name,
training_fraction_split=self.training_fraction_split,
validation_fraction_split=self.validation_fraction_split,
test_fraction_split=self.test_fraction_split,
training_filter_split=self.training_filter_split,
validation_filter_split=self.validation_filter_split,
test_filter_split=self.test_filter_split,
model_display_name=self.model_display_name,
model_labels=self.model_labels,
sync=self.sync,
)
result = Model.to_dict(model)
model_id = self.hook.extract_model_id(result)
VertexAIModelLink.persist(context=context, task_instance=self, model_id=model_id)
return result
class CreateAutoMLVideoTrainingJobOperator(AutoMLTrainingJobBaseOperator):
"""Create Auto ML Video Training job"""
template_fields = [
'region',
'impersonation_chain',
]
operator_extra_links = (VertexAIModelLink(),)
def __init__(
self,
*,
dataset_id: str,
prediction_type: str = "classification",
model_type: str = "CLOUD",
training_filter_split: Optional[str] = None,
test_filter_split: Optional[str] = None,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.dataset_id = dataset_id
self.prediction_type = prediction_type
self.model_type = model_type
self.training_filter_split = training_filter_split
self.test_filter_split = test_filter_split
def execute(self, context: "Context"):
self.hook = AutoMLHook(
gcp_conn_id=self.gcp_conn_id,
delegate_to=self.delegate_to,
impersonation_chain=self.impersonation_chain,
)
model = self.hook.create_auto_ml_video_training_job(
project_id=self.project_id,
region=self.region,
display_name=self.display_name,
dataset=datasets.VideoDataset(dataset_name=self.dataset_id),
prediction_type=self.prediction_type,
model_type=self.model_type,
labels=self.labels,
training_encryption_spec_key_name=self.training_encryption_spec_key_name,
model_encryption_spec_key_name=self.model_encryption_spec_key_name,
training_fraction_split=self.training_fraction_split,
test_fraction_split=self.test_fraction_split,
training_filter_split=self.training_filter_split,
test_filter_split=self.test_filter_split,
model_display_name=self.model_display_name,
model_labels=self.model_labels,
sync=self.sync,
)
result = Model.to_dict(model)
model_id = self.hook.extract_model_id(result)
VertexAIModelLink.persist(context=context, task_instance=self, model_id=model_id)
return result
class DeleteAutoMLTrainingJobOperator(BaseOperator):
"""Deletes an AutoMLForecastingTrainingJob, AutoMLImageTrainingJob, AutoMLTabularTrainingJob,
AutoMLTextTrainingJob, or AutoMLVideoTrainingJob.
"""
template_fields = ("region", "project_id", "impersonation_chain")
def __init__(
self,
*,
training_pipeline_id: str,
region: str,
project_id: str,
retry: Optional[Retry] = None,
timeout: Optional[float] = None,
metadata: Sequence[Tuple[str, str]] = (),
gcp_conn_id: str = "google_cloud_default",
delegate_to: Optional[str] = None,
impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.training_pipeline = training_pipeline_id
self.region = region
self.project_id = project_id
self.retry = retry
self.timeout = timeout
self.metadata = metadata
self.gcp_conn_id = gcp_conn_id
self.delegate_to = delegate_to
self.impersonation_chain = impersonation_chain
def execute(self, context: "Context"):
hook = AutoMLHook(
gcp_conn_id=self.gcp_conn_id,
delegate_to=self.delegate_to,
impersonation_chain=self.impersonation_chain,
)
try:
self.log.info("Deleting Auto ML training pipeline: %s", self.training_pipeline)
training_pipeline_operation = hook.delete_training_pipeline(
training_pipeline=self.training_pipeline,
region=self.region,
project_id=self.project_id,
retry=self.retry,
timeout=self.timeout,
metadata=self.metadata,
)
hook.wait_for_operation(timeout=self.timeout, operation=training_pipeline_operation)
self.log.info("Training pipeline was deleted.")
except NotFound:
self.log.info("The Training Pipeline ID %s does not exist.", self.training_pipeline)
class ListAutoMLTrainingJobOperator(BaseOperator):
"""Lists AutoMLForecastingTrainingJob, AutoMLImageTrainingJob, AutoMLTabularTrainingJob,
AutoMLTextTrainingJob, or AutoMLVideoTrainingJob in a Location.
"""
template_fields = [
"region",
"project_id",
"impersonation_chain",
]
operator_extra_links = [
VertexAITrainingPipelinesLink(),
]
def __init__(
self,
*,
region: str,
project_id: str,
page_size: Optional[int] = None,
page_token: Optional[str] = None,
filter: Optional[str] = None,
read_mask: Optional[str] = None,
retry: Optional[Retry] = None,
timeout: Optional[float] = None,
metadata: Sequence[Tuple[str, str]] = (),
gcp_conn_id: str = "google_cloud_default",
delegate_to: Optional[str] = None,
impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.region = region
self.project_id = project_id
self.page_size = page_size
self.page_token = page_token
self.filter = filter
self.read_mask = read_mask
self.retry = retry
self.timeout = timeout
self.metadata = metadata
self.gcp_conn_id = gcp_conn_id
self.delegate_to = delegate_to
self.impersonation_chain = impersonation_chain
def execute(self, context: "Context"):
hook = AutoMLHook(
gcp_conn_id=self.gcp_conn_id,
delegate_to=self.delegate_to,
impersonation_chain=self.impersonation_chain,
)
results = hook.list_training_pipelines(
region=self.region,
project_id=self.project_id,
page_size=self.page_size,
page_token=self.page_token,
filter=self.filter,
read_mask=self.read_mask,
retry=self.retry,
timeout=self.timeout,
metadata=self.metadata,
)
VertexAITrainingPipelinesLink.persist(context=context, task_instance=self)
return [TrainingPipeline.to_dict(result) for result in results]
| 41.967949 | 107 | 0.688941 | 2,883 | 26,188 | 5.855706 | 0.097121 | 0.035422 | 0.033764 | 0.042649 | 0.778699 | 0.730364 | 0.674446 | 0.652411 | 0.628835 | 0.605734 | 0 | 0.000754 | 0.240568 | 26,188 | 623 | 108 | 42.035313 | 0.848099 | 0.055942 | 0 | 0.702048 | 0 | 0 | 0.018409 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.029795 | false | 0 | 0.018622 | 0 | 0.098696 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
370239612665db191d987f35efe965057710194c | 18 | py | Python | ret_benchmark/utils/log_info.py | alibaba-edu/Ranking-based-Instance-Selection | 06e7fa2061d42f5e8181f7181fe591fc3d294f8d | [
"MIT"
] | 20 | 2021-04-15T09:15:28.000Z | 2022-03-30T02:31:20.000Z | vit_retri/utils/log_info.py | ludics/ViT-Retri | 4a17ae8392a0f8145a2f5ee37854e76503c26009 | [
"MIT"
] | 1 | 2021-06-03T05:51:52.000Z | 2021-06-19T05:52:33.000Z | vit_retri/utils/log_info.py | ludics/ViT-Retri | 4a17ae8392a0f8145a2f5ee37854e76503c26009 | [
"MIT"
] | 5 | 2021-05-17T09:05:38.000Z | 2022-02-28T10:10:50.000Z | log_info = dict()
| 9 | 17 | 0.666667 | 3 | 18 | 3.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 18 | 1 | 18 | 18 | 0.733333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
371c11466bbee73e5a5eb806d61588dcac84b8ca | 106 | py | Python | start.py | fahimfarhan/python-Package-Demo | ce4fe79dd0fe9494892baf620aca9d88107894a7 | [
"MIT"
] | null | null | null | start.py | fahimfarhan/python-Package-Demo | ce4fe79dd0fe9494892baf620aca9d88107894a7 | [
"MIT"
] | null | null | null | start.py | fahimfarhan/python-Package-Demo | ce4fe79dd0fe9494892baf620aca9d88107894a7 | [
"MIT"
] | null | null | null | from measure import metrics, norms
if __name__ == "__main__":
    metrics.metric(1)
    norms.norms()
| 11.777778 | 34 | 0.669811 | 13 | 106 | 4.846154 | 0.769231 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012048 | 0.216981 | 106 | 8 | 35 | 13.25 | 0.746988 | 0 | 0 | 0 | 0 | 0 | 0.07767 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.25 | 0 | 0.25 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
2ec18eb4c9d609c45f211bbba5ede5409418c196 | 173 | py | Python | crawl_cvs/handlers/my_func.py | MacHu-GWU/crawl_cvs-project | 1cf63254ea1d7c8026a9d0e0543aaa2d7b7f4918 | [
"MIT"
] | 1 | 2020-06-19T09:45:20.000Z | 2020-06-19T09:45:20.000Z | crawl_cvs/handlers/my_func.py | MacHu-GWU/crawl_cvs-project | 1cf63254ea1d7c8026a9d0e0543aaa2d7b7f4918 | [
"MIT"
] | 1 | 2019-12-27T18:41:21.000Z | 2019-12-27T18:41:21.000Z | crawl_cvs/handlers/my_func.py | MacHu-GWU/crawl_cvs-project | 1cf63254ea1d7c8026a9d0e0543aaa2d7b7f4918 | [
"MIT"
] | 1 | 2018-08-22T01:27:32.000Z | 2018-08-22T01:27:32.000Z | # -*- coding: utf-8 -*-
def handler(event, context):
if event.get("name"):
return "Hello {}!".format(event.get("name"))
else:
return "Hello World!"
| 21.625 | 52 | 0.549133 | 21 | 173 | 4.52381 | 0.714286 | 0.168421 | 0.252632 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007634 | 0.242775 | 173 | 7 | 53 | 24.714286 | 0.717557 | 0.121387 | 0 | 0 | 0 | 0 | 0.193333 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
2ecbc23da4493780a4b43db3b26245823e15d32e | 5,925 | py | Python | bsm/paradag/__init__.py | bsmsoft/bsm | e45ec5442de39e5f948023cd5b4c6181073cf9a2 | [
"MIT"
] | 3 | 2019-06-12T17:19:12.000Z | 2022-01-07T02:10:06.000Z | bsm/paradag/__init__.py | bsmsoft/bsm | e45ec5442de39e5f948023cd5b4c6181073cf9a2 | [
"MIT"
] | null | null | null | bsm/paradag/__init__.py | bsmsoft/bsm | e45ec5442de39e5f948023cd5b4c6181073cf9a2 | [
"MIT"
] | null | null | null | ''' This package comes from https://github.com/xianghuzhao/paradag
It can also be installed by "pip install paradag"
'''
import random
class DagVertexNotFoundError(Exception):
    pass


class DagEdgeNotFoundError(Exception):
    pass


class DagCycleError(Exception):
    pass


class DagData(object):
    def __init__(self):
        self.__graph = {}
        self.__graph_reverse = {}

    def vertice(self):
        return set(self.__graph.keys())

    def add_vertex(self, vertex):
        if vertex not in self.__graph:
            self.__graph[vertex] = set()
            self.__graph_reverse[vertex] = set()

    def add_edge(self, v_from, v_to):
        self.__graph[v_from].add(v_to)
        self.__graph_reverse[v_to].add(v_from)

    def remove_edge(self, v_from, v_to):
        self.__graph[v_from].remove(v_to)
        self.__graph_reverse[v_to].remove(v_from)

    def successors(self, vertex):
        return self.__graph[vertex]

    def predecessors(self, vertex):
        return self.__graph_reverse[vertex]
class Dag(object):
    def __init__(self):
        self.__data = DagData()

    def __validate_vertex(self, *vertice):
        for vertex in vertice:
            if vertex not in self.__data.vertice():
                raise DagVertexNotFoundError('Vertex "{0}" does not belong to DAG'.format(vertex))

    def __has_path_to(self, v_from, v_to):
        if v_from == v_to:
            return True
        for v in self.__data.successors(v_from):
            if self.__has_path_to(v, v_to):
                return True
        return False

    def vertice(self):
        return self.__data.vertice()

    def add_vertex(self, *vertice):
        for vertex in vertice:
            self.__data.add_vertex(vertex)

    def add_edge(self, v_from, *v_tos):
        self.__validate_vertex(v_from, *v_tos)
        for v_to in v_tos:
            if self.__has_path_to(v_to, v_from):
                raise DagCycleError('Cycle if add edge from "{0}" to "{1}"'.format(v_from, v_to))
            self.__data.add_edge(v_from, v_to)

    def remove_edge(self, v_from, v_to):
        self.__validate_vertex(v_from, v_to)
        if v_to not in self.__data.successors(v_from):
            raise DagEdgeNotFoundError('Edge not found from "{0}" to "{1}"'.format(v_from, v_to))
        self.__data.remove_edge(v_from, v_to)

    def vertex_size(self):
        return len(self.__data.vertice())

    def edge_size(self):
        size = 0
        for vertex in self.__data.vertice():
            size += self.outdegree(vertex)
        return size

    def successors(self, vertex):
        self.__validate_vertex(vertex)
        return self.__data.successors(vertex)

    def predecessors(self, vertex):
        self.__validate_vertex(vertex)
        return self.__data.predecessors(vertex)

    def indegree(self, vertex):
        return len(self.predecessors(vertex))

    def outdegree(self, vertex):
        return len(self.successors(vertex))

    def __endpoints(self, degree_callback):
        endpoints = set()
        for vertex in self.__data.vertice():
            if degree_callback(vertex) == 0:
                endpoints.add(vertex)
        return endpoints

    def all_starts(self):
        return self.__endpoints(self.indegree)

    def all_terminals(self):
        return self.__endpoints(self.outdegree)
class SingleSelector(object):
    def select(self, running, idle):
        return [next(iter(idle))]


class FullSelector(object):
    def select(self, running, idle):
        return list(idle)


class RandomSelector(object):
    def select(self, running, idle):
        return [random.choice(list(idle))]


class ShuffleSelector(object):
    def select(self, running, idle):
        idle_list = list(idle)
        random.shuffle(idle_list)
        return idle_list


class NullProcessor(object):
    def process(self, vertice, executor):
        return [(vertex, None) for vertex in vertice]
# TODO: report_*, deliver, abort should be optional
class NullExecutor(object):
    def param(self, vertex):
        return None

    def execute(self, param_vertex):
        return None

    def report_start(self, vertice):
        pass

    def report_finish(self, vertice_results):
        pass

    def report_running(self, vertice):
        pass

    def deliver(self, vertex, result):
        pass

    def abort(self, vertice):
        pass
# TODO: Use Vertice as core. Processor, Selector and Executor should all surround Vertice
def dag_run(dag, selector=None, processor=None, executor=None):
if selector is None:
selector = FullSelector()
if processor is None:
processor = NullProcessor()
if executor is None:
executor = NullExecutor()
indegree_dict = {}
for vertex in dag.vertice():
indegree_dict[vertex] = dag.indegree(vertex)
vertice_final = []
vertice_processing = set()
vertice_zero_indegree = dag.all_starts()
while vertice_zero_indegree:
vertice_to_run = selector.select(vertice_processing, vertice_zero_indegree-vertice_processing)
executor.report_start(vertice_to_run)
executor.report_running(set(vertice_to_run) | vertice_processing)
vertice_processed_results = processor.process(vertice_to_run, executor)
executor.report_finish(vertice_processed_results)
vertice_processed = [result[0] for result in vertice_processed_results]
vertice_processing |= set(vertice_to_run)
vertice_processing -= set(vertice_processed)
vertice_final += vertice_processed
vertice_zero_indegree -= set(vertice_processed)
for vertex, result in vertice_processed_results:
for v_to in dag.successors(vertex):
executor.deliver(v_to, result)
indegree_dict[v_to] -= 1
if indegree_dict[v_to] == 0:
vertice_zero_indegree.add(v_to)
return vertice_final
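The dag_run loop above is essentially Kahn's topological sort with pluggable Selector/Processor/Executor hooks. Stripped of those callbacks it reduces to the textbook algorithm; a self-contained sketch for comparison (kahn_order and its dict-of-lists input are illustrative names, not part of the module):

```python
from collections import deque

def kahn_order(succ):
    """Kahn's algorithm over a {vertex: [successors]} adjacency dict."""
    # count incoming edges for every vertex
    indeg = {v: 0 for v in succ}
    for v in succ:
        for w in succ[v]:
            indeg[w] += 1
    # start from vertices with no predecessors
    ready = deque(v for v in succ if indeg[v] == 0)
    order = []
    while ready:
        v = ready.popleft()
        order.append(v)
        for w in succ[v]:
            indeg[w] -= 1
            if indeg[w] == 0:
                ready.append(w)
    return order

# kahn_order({"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []})
# -> ["a", "b", "c", "d"]
```

dag_run keeps the same zero-indegree frontier, but lets the Selector decide which ready vertices to run and the Executor observe deliveries.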
# === models/metrics_and_losses.py (repo: marioviti/nn_segmentation, license: MIT) ===
from keras import backend as K
from skimage.transform import resize
import tensorflow as tf

K.set_image_data_format('channels_last')


def true_pos(y_true, y_pred):
    return K.sum(y_true * K.round(y_pred))


def false_pos(y_true, y_pred):
    # NOTE: the original swapped this body with false_neg's; a false positive
    # is a predicted positive absent from the ground truth.
    return K.sum((1. - y_true) * K.round(y_pred))


def false_neg(y_true, y_pred):
    return K.sum(y_true * (1. - K.round(y_pred)))


def precision(y_true, y_pred):
    return true_pos(y_true, y_pred) / \
           (true_pos(y_true, y_pred) + false_pos(y_true, y_pred))


def PSNR(y_true, y_pred):
    # NOTE: not a true peak signal-to-noise ratio; it only sums the residual
    # (the original also bound an unused `shape = y_pred.get_shape()`).
    return K.sum(y_true - K.round(y_pred))


def dice_coef(y_true, y_pred):
    """
    Attention:
    y_true can be weighted to modify learning therefore
    apply sign to get back to labels
    y_pred have to be rounded to nearest integer to obtain labels.
    """
    smooth = 1.
    y_true_f = K.flatten(K.sign(y_true))
    y_pred_f = K.flatten(K.round(y_pred))
    intersection = K.sum(y_true_f * y_pred_f)
    return (2. * intersection + smooth) / (K.sum(y_true_f) + K.sum(y_pred_f) + smooth)


def dice_coef_loss(y_true, y_pred):
    return -dice_coef(y_true, y_pred)


def softmax_categorical_crossentropy(target, output):
    """Categorical crossentropy between an output tensor and a target tensor.

    # Arguments
        target: A tensor of the same shape as `output`.
        output: result of a softmax, or is a tensor of logits.

    # Returns
        Output tensor.
    """
    # manual computation of crossentropy
    return - tf.reduce_sum(target * tf.log(output), len(output.get_shape()) - 1)
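The dice_coef above depends on the Keras backend, but its formula can be sanity-checked with a plain-Python restatement on toy lists (dice_py is an illustrative name, not part of the module):

```python
def dice_py(y_true, y_pred, smooth=1.0):
    """Dice coefficient on flat Python lists, mirroring dice_coef:
    sign() on the truth, round() on the prediction, smoothed ratio."""
    sign = lambda v: (v > 0) - (v < 0)
    t = [float(sign(v)) for v in y_true]
    p = [float(round(v)) for v in y_pred]
    inter = sum(a * b for a, b in zip(t, p))
    return (2.0 * inter + smooth) / (sum(t) + sum(p) + smooth)

# Perfect overlap on 4 positive pixels: (2*4 + 1) / (4 + 4 + 1) = 1.0
# Half overlap [1,1,0,0] vs [1,0,0,0]:  (2*1 + 1) / (2 + 1 + 1) = 0.75
```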
# === basics01/esacpechar.py (repo: DevAnuragGarg/Python-Learning-Basics, license: Apache-2.0) ===
split_string = "This string has been \nsplit over\nseveral\nlines"
print(split_string)
tabbed_string = "1\t2\t3\t"
print(tabbed_string)
print('Hello what\'s the situation like. He\'s is not responding')
print("Hello what's the situation like. He's is not responding")
print("""Hello what's the situation like. He's is not responding""")
print("""Hello
the thing is
this is going
to next line""")
print("""Hello \
the thing is \
this is going \
to next line""")
print("C:\\users\\abc\\tim\\python")
print(r"C:\users\abc\tim\python")
# === tests/app/libs/test_utils.py (repo: pricem14pc/eq-questionnaire-runner, license: MIT) ===
from app.helpers.uuid_helper import is_valid_uuid
from app.libs.utils import convert_tx_id

def test_convert_tx_id():
    tx_id_to_convert = "bc26d5ef-8475-4710-ac82-753a0a150708"
    assert is_valid_uuid(tx_id_to_convert)
    assert convert_tx_id(tx_id_to_convert) == "BC26 - D5EF - 8475 - 4710"
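The expected string in the assertion hints at what convert_tx_id does; a hypothetical reimplementation inferred only from that assertion, not the project's actual helper:

```python
def convert_tx_id_sketch(tx_id):
    # Assumption: take the first 16 hex digits of the UUID (dashes removed),
    # uppercase them, and join in groups of four with " - ".
    digits = tx_id.replace("-", "").upper()[:16]
    return " - ".join(digits[i:i + 4] for i in range(0, 16, 4))

# convert_tx_id_sketch("bc26d5ef-8475-4710-ac82-753a0a150708")
# -> "BC26 - D5EF - 8475 - 4710"
```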
# === tests/unit/bokeh/models/test_filters.py (repo: teresafds/bokeh, license: BSD-3-Clause) ===
#-----------------------------------------------------------------------------
# Copyright (c) 2012 - 2022, Anaconda, Inc., and Bokeh Contributors.
# All rights reserved.
#
# The full license is in the file LICENSE.txt, distributed with this software.
#-----------------------------------------------------------------------------
#-----------------------------------------------------------------------------
# Boilerplate
#-----------------------------------------------------------------------------
from __future__ import annotations # isort:skip
import pytest ; pytest
#-----------------------------------------------------------------------------
# Imports
#-----------------------------------------------------------------------------
# Module under test
import bokeh.models.filters as bmf # isort:skip
#-----------------------------------------------------------------------------
# Setup
#-----------------------------------------------------------------------------
#-----------------------------------------------------------------------------
# General API
#-----------------------------------------------------------------------------
def test_Filter_set_operators() -> None:
    f0 = ~bmf.BooleanFilter()
    assert isinstance(f0, bmf.InversionFilter)

    f1 = bmf.BooleanFilter() & bmf.IndexFilter()
    assert isinstance(f1, bmf.IntersectionFilter)
    assert len(f1.operands) == 2

    f2 = bmf.BooleanFilter() | bmf.IndexFilter()
    assert isinstance(f2, bmf.UnionFilter)
    assert len(f2.operands) == 2

    f3 = bmf.BooleanFilter() - bmf.IndexFilter()
    assert isinstance(f3, bmf.DifferenceFilter)
    assert len(f3.operands) == 2

    f4 = bmf.BooleanFilter() ^ bmf.IndexFilter()
    assert isinstance(f4, bmf.SymmetricDifferenceFilter)
    assert len(f4.operands) == 2
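The operators exercised by this test come from dunder overloads (`__and__`, `__or__`, `__invert__`, ...) on Bokeh's Filter base class. A minimal stand-alone sketch of the same pattern (illustrative, not Bokeh's actual implementation):

```python
class Filter:
    # each operator returns a composite node holding its operands
    def __and__(self, other):
        return Intersection(self, other)
    def __or__(self, other):
        return Union(self, other)
    def __invert__(self):
        return Inversion(self)

class Intersection(Filter):
    def __init__(self, *operands):
        self.operands = list(operands)

class Union(Filter):
    def __init__(self, *operands):
        self.operands = list(operands)

class Inversion(Filter):
    def __init__(self, operand):
        self.operand = operand

# Filter() & Filter() -> Intersection with two operands, just like
# bmf.BooleanFilter() & bmf.IndexFilter() above.
```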
#-----------------------------------------------------------------------------
# Dev API
#-----------------------------------------------------------------------------
#-----------------------------------------------------------------------------
# Private API
#-----------------------------------------------------------------------------
#-----------------------------------------------------------------------------
# Code
#-----------------------------------------------------------------------------
| 37.885246 | 78 | 0.320208 | 129 | 2,311 | 5.682171 | 0.51938 | 0.109141 | 0.103683 | 0.163711 | 0.251023 | 0.251023 | 0 | 0 | 0 | 0 | 0 | 0.012375 | 0.09087 | 2,311 | 60 | 79 | 38.516667 | 0.336506 | 0.64907 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.5 | 1 | 0.055556 | false | 0 | 0.166667 | 0 | 0.222222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
# === tests/test_mhk_models_no_db.py (repo: time-link/timelink-py, license: MIT) ===
"""
Test models without requiring a db connection
"""
import pytest
from timelink.mhk.models import base # noqa
from timelink.mhk.models.entity import Entity # noqa
from timelink.mhk.models.pom_som_mapper import PomSomMapper
from timelink.mhk.models.base_class import Base
from timelink.mhk.models.db import TimelinkDB

def test_entity_subclasses():
    scl = list(Entity.get_subclasses())
    sc1 = len(scl)

    class SubEntity(Entity):
        pass

    scl2 = list(Entity.get_subclasses())
    sc2 = len(scl2)
    assert sc2 == sc1 + 1, "wrong direct subclasses of Entity"

    class SubSubEntity(SubEntity):
        pass

    scl3 = list(Entity.get_subclasses())
    sc3 = len(scl3)
    assert sc3 == sc2 + 1, "wrong recursive subclasses of Entity"
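The assertions imply Entity.get_subclasses enumerates subclasses recursively. The usual pure-Python pattern for that walk looks like this (iter_subclasses and the A/B/C classes are illustrative, not timelink's code):

```python
def iter_subclasses(cls):
    """Yield every direct and indirect subclass via __subclasses__()."""
    for sub in cls.__subclasses__():
        yield sub
        yield from iter_subclasses(sub)

class A: pass
class B(A): pass
class C(B): pass

# list(iter_subclasses(A)) -> [B, C]: B directly, C through B
```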
# === twitter_problems/problem_5.py (repo: loftwah/Daily-Coding-Problem, license: MIT) ===
"""This problem was asked by Twitter.
Given a list of numbers, create an algorithm that arranges them in order to form the largest possible integer.
For example, given [10, 7, 76, 415], you should return 77641510."""
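A common solution sketch for this problem (not part of the original file) sorts the numbers as strings with a pairwise concatenation comparator, so that a comes before b whenever a+b > b+a:

```python
from functools import cmp_to_key

def largest_number(nums):
    # order strings so that concatenating in that order is maximal
    def cmp(a, b):
        return (a + b < b + a) - (a + b > b + a)  # -1 when a should lead
    strs = sorted(map(str, nums), key=cmp_to_key(cmp))
    result = "".join(strs).lstrip("0")  # collapse an all-zero input
    return int(result or "0")

# largest_number([10, 7, 76, 415]) -> 77641510
```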
# === languages/python/cp857_5x7.py (repo: ercanersoy/font-library, license: CC0-1.0) ===
# cp857_5x7.py - CP857 5x7 font file for Python
#
# Copyright (c) 2019-2022 Ercan Ersoy
# This file is written by Ercan Ersoy.
# This file is licensed under CC0-1.0 Universal License.
cp857_5x7 = [
0x00, 0x00, 0x00, 0x00, 0x00,
0x00, 0x00, 0x5F, 0x00, 0x00,
0x00, 0x03, 0x00, 0x03, 0x00,
0x14, 0x7F, 0x14, 0x7F, 0x14,
0x04, 0x2A, 0x7F, 0x2A, 0x10,
0x23, 0x13, 0x08, 0x64, 0x62,
0x36, 0x49, 0x55, 0x22, 0x50,
0x00, 0x04, 0x03, 0x00, 0x00,
0x00, 0x3E, 0x41, 0x00, 0x00,
0x00, 0x00, 0x41, 0x3E, 0x00,
0x00, 0x0A, 0x07, 0x0A, 0x00,
0x08, 0x08, 0x3E, 0x08, 0x08,
0x00, 0x20, 0x18, 0x00, 0x00,
0x00, 0x08, 0x08, 0x08, 0x00,
0x00, 0x00, 0x40, 0x00, 0x00,
0x40, 0x30, 0x08, 0x06, 0x01,
0x3E, 0x61, 0x5D, 0x43, 0x3E,
0x00, 0x44, 0x42, 0x7F, 0x40,
0x72, 0x49, 0x49, 0x49, 0x46,
0x22, 0x49, 0x49, 0x49, 0x36,
0x08, 0x0C, 0x0A, 0x7F, 0x08,
0x27, 0x49, 0x49, 0x49, 0x31,
0x3E, 0x49, 0x49, 0x49, 0x32,
0x41, 0x21, 0x11, 0x09, 0x07,
0x36, 0x49, 0x49, 0x49, 0x36,
0x26, 0x49, 0x49, 0x49, 0x3E,
0x00, 0x00, 0x14, 0x00, 0x00,
0x00, 0x40, 0x34, 0x00, 0x00,
0x08, 0x14, 0x36, 0x22, 0x41,
0x14, 0x14, 0x14, 0x14, 0x14,
0x41, 0x22, 0x36, 0x14, 0x08,
0x02, 0x01, 0x51, 0x09, 0x06,
0x3E, 0x49, 0x55, 0x59, 0x4E,
0x7E, 0x09, 0x09, 0x09, 0x7E,
0x7F, 0x49, 0x49, 0x49, 0x36,
0x3E, 0x41, 0x41, 0x41, 0x22,
0x7F, 0x41, 0x41, 0x41, 0x3E,
0x7F, 0x49, 0x49, 0x49, 0x49,
0x7F, 0x09, 0x09, 0x09, 0x09,
0x3E, 0x41, 0x49, 0x49, 0x32,
0x7F, 0x08, 0x08, 0x08, 0x7F,
0x41, 0x41, 0x7F, 0x41, 0x41,
0x20, 0x40, 0x40, 0x40, 0x3F,
0x7F, 0x08, 0x14, 0x22, 0x41,
0x7F, 0x40, 0x40, 0x40, 0x40,
0x7F, 0x02, 0x04, 0x02, 0x7F,
0x7F, 0x02, 0x1C, 0x20, 0x7F,
0x3E, 0x41, 0x41, 0x41, 0x3E,
0x7F, 0x09, 0x09, 0x09, 0x06,
0x1E, 0x21, 0x31, 0x21, 0x5E,
0x7F, 0x0D, 0x15, 0x25, 0x42,
0x26, 0x49, 0x49, 0x49, 0x32,
0x01, 0x01, 0x7F, 0x01, 0x01,
0x3F, 0x40, 0x40, 0x40, 0x3F,
0x1F, 0x20, 0x40, 0x20, 0x1F,
0x3F, 0x40, 0x3F, 0x40, 0x3F,
0x63, 0x14, 0x08, 0x14, 0x63,
0x03, 0x04, 0x78, 0x04, 0x03,
0x61, 0x51, 0x49, 0x45, 0x43,
0x00, 0x7F, 0x41, 0x00, 0x00,
0x01, 0x06, 0x08, 0x30, 0x40,
0x00, 0x00, 0x41, 0x7F, 0x00,
0x00, 0x02, 0x01, 0x02, 0x00,
0x40, 0x40, 0x40, 0x40, 0x40,
0x00, 0x00, 0x03, 0x04, 0x00,
0x20, 0x54, 0x54, 0x54, 0x78,
0x7F, 0x48, 0x48, 0x48, 0x30,
0x30, 0x48, 0x48, 0x48, 0x00,
0x30, 0x48, 0x48, 0x48, 0x7F,
0x38, 0x54, 0x54, 0x54, 0x08,
0x08, 0x7C, 0x0A, 0x0A, 0x00,
0x0C, 0x52, 0x52, 0x3E, 0x00,
0x7F, 0x08, 0x08, 0x08, 0x70,
0x00, 0x00, 0x74, 0x00, 0x00,
0x00, 0x20, 0x40, 0x3A, 0x00,
0x7F, 0x10, 0x28, 0x44, 0x00,
0x00, 0x00, 0x3F, 0x40, 0x00,
0x70, 0x08, 0x70, 0x08, 0x70,
0x00, 0x70, 0x08, 0x70, 0x00,
0x30, 0x48, 0x48, 0x30, 0x00,
0x78, 0x14, 0x14, 0x08, 0x00,
0x08, 0x14, 0x14, 0x7C, 0x00,
0x00, 0x70, 0x08, 0x08, 0x00,
0x00, 0x48, 0x54, 0x24, 0x00,
0x00, 0x08, 0x3C, 0x48, 0x00,
0x00, 0x38, 0x40, 0x78, 0x00,
0x00, 0x38, 0x40, 0x38, 0x00,
0x38, 0x40, 0x38, 0x40, 0x38,
0x44, 0x28, 0x10, 0x28, 0x44,
0x0C, 0x50, 0x50, 0x3C, 0x00,
0x48, 0x68, 0x58, 0x48, 0x00,
0x08, 0x36, 0x41, 0x41, 0x00,
0x00, 0x00, 0x7F, 0x00, 0x00,
0x00, 0x41, 0x41, 0x36, 0x08,
0x08, 0x04, 0x08, 0x10, 0x08,
0x00, 0x00, 0x00, 0x00, 0x00,
0x1E, 0x21, 0x61, 0x21, 0x12,
0x00, 0x3A, 0x40, 0x7A, 0x00,
0x38, 0x54, 0x56, 0x55, 0x08,
0x20, 0x56, 0x55, 0x56, 0x78,
0x20, 0x55, 0x54, 0x55, 0x78,
0x20, 0x55, 0x56, 0x54, 0x78,
0x20, 0x54, 0x55, 0x54, 0x78,
0x0C, 0x12, 0x52, 0x12, 0x00,
0x38, 0x56, 0x55, 0x56, 0x08,
0x38, 0x55, 0x54, 0x55, 0x08,
0x38, 0x55, 0x56, 0x54, 0x08,
0x00, 0x04, 0x70, 0x04, 0x00,
0x00, 0x04, 0x72, 0x04, 0x00,
0x00, 0x00, 0x70, 0x00, 0x00,
0x78, 0x15, 0x14, 0x15, 0x78,
0x78, 0x14, 0x15, 0x14, 0x78,
0x7C, 0x54, 0x56, 0x55, 0x54,
0x32, 0x2A, 0x1C, 0x2A, 0x24,
0x60, 0x18, 0x14, 0x7C, 0x54,
0x32, 0x49, 0x49, 0x32, 0x00,
0x32, 0x48, 0x48, 0x32, 0x00,
0x30, 0x49, 0x4A, 0x30, 0x00,
0x00, 0x3A, 0x41, 0x7A, 0x00,
0x00, 0x39, 0x42, 0x78, 0x00,
0x00, 0x48, 0x7A, 0x48, 0x00,
0x38, 0x45, 0x44, 0x45, 0x38,
0x38, 0x42, 0x40, 0x42, 0x38,
0x58, 0x34, 0x2C, 0x38, 0x04,
0x48, 0x3E, 0x49, 0x41, 0x22,
0x5C, 0x32, 0x2A, 0x26, 0x1D,
0x02, 0x15, 0x55, 0x15, 0x08,
0x00, 0x24, 0x6A, 0x12, 0x00,
0x20, 0x54, 0x56, 0x55, 0x78,
0x00, 0x48, 0x7A, 0x49, 0x00,
0x30, 0x4A, 0x49, 0x30, 0x00,
0x00, 0x38, 0x42, 0x79, 0x00,
0x02, 0x71, 0x0B, 0x72, 0x01,
0x7E, 0x09, 0x13, 0x22, 0x7D,
0x38, 0x45, 0x56, 0x55, 0x24,
0x08, 0x55, 0x56, 0x3D, 0x00,
0x30, 0x48, 0x45, 0x40, 0x20,
0x3E, 0x5D, 0x4B, 0x55, 0x3E,
0x04, 0x04, 0x04, 0x04, 0x0C,
0x4A, 0x3F, 0x4C, 0x6E, 0x51,
0x4A, 0x3F, 0x2C, 0x36, 0x79,
0x00, 0x00, 0x7D, 0x00, 0x00,
0x08, 0x14, 0x00, 0x08, 0x14,
0x14, 0x08, 0x00, 0x14, 0x08,
0x00, 0x55, 0x00, 0x2A, 0x00,
0x55, 0x2A, 0x55, 0x2A, 0x55,
0x7F, 0x2A, 0x7F, 0x55, 0x7F,
0x00, 0x00, 0x7F, 0x00, 0x00,
0x08, 0x08, 0x7F, 0x00, 0x00,
0x7A, 0x15, 0x14, 0x14, 0x78,
0x78, 0x16, 0x15, 0x16, 0x78,
0x78, 0x14, 0x14, 0x15, 0x7A,
0x3E, 0x49, 0x55, 0x41, 0x3E,
0x14, 0x77, 0x00, 0x7F, 0x00,
0x00, 0x7F, 0x00, 0x7F, 0x00,
0x14, 0x74, 0x04, 0x7C, 0x00,
0x14, 0x17, 0x10, 0x1F, 0x00,
0x1C, 0x22, 0x63, 0x22, 0x00,
0x29, 0x2A, 0x7C, 0x2A, 0x29,
0x08, 0x08, 0x78, 0x00, 0x00,
0x00, 0x00, 0x0F, 0x08, 0x08,
0x08, 0x08, 0x0F, 0x08, 0x08,
0x08, 0x08, 0x78, 0x08, 0x08,
0x00, 0x00, 0x7F, 0x08, 0x08,
0x08, 0x08, 0x08, 0x08, 0x08,
0x08, 0x08, 0x7F, 0x08, 0x08,
0x22, 0x55, 0x57, 0x56, 0x79,
0x7A, 0x15, 0x17, 0x16, 0x79,
0x00, 0x1F, 0x10, 0x17, 0x14,
0x00, 0x7C, 0x04, 0x74, 0x14,
0x14, 0x17, 0x10, 0x17, 0x14,
0x14, 0x74, 0x04, 0x74, 0x14,
0x00, 0x7F, 0x00, 0x77, 0x14,
0x14, 0x14, 0x14, 0x14, 0x14,
0x14, 0x77, 0x00, 0x77, 0x14,
0x24, 0x18, 0x24, 0x18, 0x24,
0x00, 0x12, 0x15, 0x12, 0x00,
0x00, 0x12, 0x15, 0x16, 0x00,
0x7C, 0x56, 0x55, 0x56, 0x54,
0x7C, 0x55, 0x54, 0x55, 0x54,
0x7C, 0x55, 0x56, 0x54, 0x54,
0x00, 0x00, 0x00, 0x00, 0x00,
0x00, 0x48, 0x7A, 0x49, 0x00,
0x00, 0x4A, 0x79, 0x4A, 0x00,
0x00, 0x49, 0x78, 0x49, 0x00,
0x08, 0x08, 0x0F, 0x00, 0x00,
0x00, 0x00, 0x78, 0x08, 0x08,
0x7F, 0x7F, 0x7F, 0x7F, 0x7F,
0x78, 0x78, 0x78, 0x78, 0x78,
0x00, 0x00, 0x77, 0x00, 0x00,
0x00, 0x49, 0x7A, 0x48, 0x00,
0x07, 0x07, 0x07, 0x07, 0x07,
0x38, 0x44, 0x46, 0x45, 0x38,
0x7E, 0x01, 0x49, 0x49, 0x36,
0x38, 0x46, 0x45, 0x46, 0x38,
0x38, 0x45, 0x46, 0x44, 0x38,
0x32, 0x49, 0x4A, 0x31, 0x00,
0x32, 0x49, 0x4B, 0x4A, 0x31,
0x00, 0x7C, 0x10, 0x1C, 0x00,
0x00, 0x00, 0x00, 0x00, 0x00,
0x00, 0x14, 0x08, 0x14, 0x00,
0x38, 0x40, 0x42, 0x41, 0x38,
0x38, 0x42, 0x41, 0x42, 0x38,
0x38, 0x41, 0x42, 0x40, 0x38,
0x00, 0x49, 0x7A, 0x48, 0x00,
0x0D, 0x50, 0x50, 0x3D, 0x00,
0x01, 0x01, 0x01, 0x01, 0x01,
0x00, 0x02, 0x01, 0x00, 0x00,
0x00, 0x08, 0x08, 0x08, 0x00,
0x00, 0x24, 0x2E, 0x24, 0x00,
0x00, 0x00, 0x00, 0x00, 0x00,
0x55, 0x3F, 0x2C, 0x36, 0x79,
0x0E, 0x7F, 0x01, 0x7F, 0x01,
0x2A, 0x55, 0x55, 0x28, 0x00,
0x08, 0x08, 0x2A, 0x08, 0x08,
0x20, 0x40, 0x58, 0x20, 0x00,
0x06, 0x09, 0x09, 0x06, 0x00,
0x00, 0x02, 0x00, 0x02, 0x00,
0x00, 0x00, 0x08, 0x00, 0x00,
0x00, 0x12, 0x1F, 0x10, 0x00,
0x00, 0x11, 0x15, 0x0A, 0x00,
0x00, 0x12, 0x19, 0x16, 0x00,
0x00, 0x38, 0x38, 0x38, 0x00,
0x00, 0x00, 0x00, 0x00, 0x00
]
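Each glyph above appears to be stored as five column bytes with the least significant bit as the top row, the common 5x7 LCD layout (an assumption about this font's convention). A small renderer makes a glyph readable:

```python
def render_glyph(cols, rows=7):
    """Render 5 column bytes as '#'/'.' rows; bit y of each byte = row y."""
    return "\n".join(
        "".join("#" if (c >> y) & 1 else "." for c in cols)
        for y in range(rows)
    )

# The third glyph, [0x00, 0x00, 0x5F, 0x00, 0x00], draws a '!':
# a single centre column lit in rows 0-4 and 6, dark in row 5.
```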
# === editline/tests/test_editline.py (repo: mark-nicholson/python-editline, license: BSD-3-Clause) ===
"""
Unit testing for parts of the editline and _editline modules.
"""
import os
import sys
import unittest
import subprocess
from test.support import import_module
# too bad this thing moved ...
try:
    if sys.version_info[0] >= 3 and sys.version_info[1] >= 5:
        from test.support.script_helper import assert_python_ok
    else:
        from test.script_helper import assert_python_ok
    have_assert_python_ok = True
except ImportError:
    have_assert_python_ok = False


def check_test_support():
    return have_assert_python_ok


def check_nose_runner():
    """Certain situations prevent NOSE from running tests -- it appears that
    nose does not allow access to the terminal."""
    return 'nose' not in sys.modules.keys()


class TestEditline(unittest.TestCase):

    def load_assert_python_ok(self):
        if sys.version_info[0] >= 3 and sys.version_info[1] >= 5:
            from test.support.script_helper import assert_python_ok
        else:
            from test.script_helper import assert_python_ok

    def test_001_import_pkg(self):
        _editline = import_module('editline')

    def test_002_import__el(self):
        _editline = import_module('editline._editline')

    def test_002_import_el(self):
        editline = import_module('editline.editline')

    @unittest.skipUnless(check_nose_runner(), "nose cannot run this test")
    def test_003_build_instance(self):
        editline = import_module('editline.editline')
        el = editline.EditLine("testcase",
                               sys.stdin, sys.stdout, sys.stderr)
        self.assertIsNotNone(el)

    @unittest.skipUnless(check_test_support(), "no script_helper")
    def test_100_import_pkg(self):
        self.load_assert_python_ok()
        #from test.support.script_helper import assert_python_ok
        rc, stdout, stderr = assert_python_ok('-c', 'import editline')
        self.assertEqual(stdout, b'')
        self.assertEqual(rc, 0)

    @unittest.skipUnless(check_test_support(), "no script_helper")
    def test_100_import_module(self):
        self.load_assert_python_ok()
        #from test.support.script_helper import assert_python_ok
        rc, stdout, stderr = assert_python_ok(
            '-c', 'from editline import editline')
        self.assertEqual(stdout, b'')
        self.assertEqual(rc, 0)

    @unittest.skipUnless(check_test_support(), "no script_helper")
    def test_100_import_class(self):
        self.load_assert_python_ok()
        #from test.support.script_helper import assert_python_ok
        rc, stdout, stderr = assert_python_ok(
            '-c', 'from editline.editline import EditLine')
        self.assertEqual(stdout, b'')
        self.assertEqual(rc, 0)

    @unittest.skipUnless(check_test_support(), "no script_helper")
    def test_101_init(self):
        # Issue #19884: Ensure that the ANSI sequence "\033[1034h" is not
        # written into stdout when the readline module is imported and stdout
        # is redirected to a pipe.
        self.load_assert_python_ok()
        #from test.support.script_helper import assert_python_ok
        rc, stdout, stderr = assert_python_ok(
            '-c', 'from editline.editline import EditLine',
            TERM='xterm-256color')
        self.assertEqual(stdout, b'')
        self.assertEqual(rc, 0)

    @unittest.skipUnless(check_nose_runner(), "nose cannot run this test")
    def test_200_terminal_size(self):
        rows = int(subprocess.check_output(['tput', 'lines']).decode())
        columns = int(subprocess.check_output(['tput', 'cols']).decode())
        self.assertNotEqual(columns, 0)
        editline = import_module('editline.editline')
        el = editline.EditLine("testcase",
                               sys.stdin, sys.stdout, sys.stderr)
        el_cols = el.gettc('co')
        self.assertEqual(el_cols, columns)


if __name__ == "__main__":
    unittest.main()
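test_200_terminal_size shells out to tput; for reference, the stdlib reports the same information without a subprocess (a sketch, not a change to the test above):

```python
import shutil

# Queries COLUMNS/LINES or the controlling terminal; the fallback applies
# when stdout is not attached to a terminal (e.g. under a test runner).
size = shutil.get_terminal_size(fallback=(80, 24))
# size.columns and size.lines are positive ints either way
```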
# === validator/rules_src/dict.py (repo: Varun-22/Validator, license: MIT) ===
from validator.rules_src import Rule

class Dict(Rule):
    """
    The field under validation must be a dictionary (Python map)

    Examples:
    >>> from validator import validate

    >>> reqs = {"data" : {"key1" : "val1", "key2" : "val2"} }
    >>> rule = {"data" : "dict"}
    >>> validate(reqs, rule)
    True

    >>> reqs = {"data" : ["val1", "val2", "val3", "val4"]}
    >>> rule = {"data" : "dict"}
    >>> validate(reqs, rule)
    False
    """

    def __init__(self):
        Rule.__init__(self)

    def check(self, arg):
        if isinstance(arg, dict):
            return True
        self.set_error(f"Expected type dict, Got:{type(arg)}")
        return False

    def __from_str__(self):
        pass
# === setup.py (repo: DiegoLing33/ling-simple-api, license: MIT) ===
# ██╗░░░░░██╗███╗░░██╗░██████╗░░░░██████╗░██╗░░░░░░█████╗░░█████╗░██╗░░██╗
# ██║░░░░░██║████╗░██║██╔════╝░░░░██╔══██╗██║░░░░░██╔══██╗██╔══██╗██║░██╔╝
# ██║░░░░░██║██╔██╗██║██║░░██╗░░░░██████╦╝██║░░░░░███████║██║░░╚═╝█████═╝░
# ██║░░░░░██║██║╚████║██║░░╚██╗░░░██╔══██╗██║░░░░░██╔══██║██║░░██╗██╔═██╗░
# ███████╗██║██║░╚███║╚██████╔╝░░░██████╦╝███████╗██║░░██║╚█████╔╝██║░╚██╗
# ╚══════╝╚═╝╚═╝░░╚══╝░╚═════╝░░░░╚═════╝░╚══════╝╚═╝░░╚═╝░╚════╝░╚═╝░░╚═╝
#
# Developed by Yakov V. Panov (C) Ling • Black 2020
# @site http://ling.black
import setuptools
setuptools.setup(
    name="ling-simple-api",
    version="0.0.1",
    author="Yakov V. Ling",
    author_email="diegoling33@yandex.ru",
    packages=setuptools.find_packages(),
    classifiers=[
        "Programming Language :: Python :: 3",
        "License :: OSI Approved :: MIT License",
        "Operating System :: OS Independent",
    ],
    python_requires='>=3.8',
)
# === digsby/src/util/plistutil.py (repo: ifwe/digsby, license: Python-2.0) ===
'''
Some tools to convert a .plist into python objects. Incomplete.
TODO: add load(s)/dump(s) functions to match the 'data shelving' interfaces of pickle, simplejson, pyyaml, etc.
'''
#_type_map = dict(real    = (float, 'cdata'),
#                 integer = (int, 'cdata'),
#                 true    = (lambda x: True, 'cdata'),
#                 false   = (lambda x: False, 'cdata'),
#                 data    = (_from_data, 'cdata'),
#                 array   = (_to_plist, 'children'),
#                 dict    =
#                 )


def plisttype_to_pytype(plist):
    type = plist._name
    transformer = globals().get('plist_to_%s' % type, None)
    if transformer is not None:
        return transformer(plist)
    else:
        return plist


def _from_data(cdata):
    return cdata.decode('base64')


def _to_data(data):
    return data.encode('base64')


def plist_to_real(plist):
    return float(plist._cdata)


def plist_to_string(plist):
    return unicode(plist._cdata)


def plist_to_array(plist):
    return map(plisttype_to_pytype, plist._children)


def plist_to_dict(plist):
    result = {}
    key = None
    value = None
    for child in plist._children:
        if child._name == 'key':
            key = child._cdata
        else:
            value = plisttype_to_pytype(child)
            result[key] = value
            key = value = None
    return result
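plisttype_to_pytype dispatches by looking up 'plist_to_&lt;tag&gt;' in globals(). The same trick in a self-contained form (the handle_* names here are illustrative, not the module's):

```python
def handle_int(node):
    return int(node)

def handle_str(node):
    return str(node)

def dispatch(tag, node):
    # look up a handler by naming convention; fall through unchanged if absent
    handler = globals().get('handle_%s' % tag, None)
    return handler(node) if handler is not None else node

# dispatch('int', '7') -> 7; dispatch('unknown', 'x') -> 'x'
```

Adding support for a new tag is then just defining another top-level function, which is why the commented-out _type_map above was never needed.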
# === PyTrinamic/ic/TMC2041/TMC2041_register_variant.py (repo: trinamic-AA/PyTrinamic, license: MIT) ===
'''
Created on 24.10.2019
@author: JM
'''
class TMC2041_register_variant:
    " ===== TMC2041 register variants ===== "
    "..."
import datetime
import fnmatch
import gzip
import os
import shutil
import tarfile
import time

import croniter
import hdfs
from watchdog.events import (PatternMatchingEventHandler, FileCreatedEvent,
                             FileModifiedEvent, FileMovedEvent, FileSystemEvent)

def add_log_str(log_pattern, strlog):
    """Add log in logfile

    :param log_pattern: Pattern of log
    :type log_pattern: str
    :param strlog: Log message
    :type strlog: str
    """
    logfile = log_pattern
    logfile = logfile.replace("%Y", time.strftime("%Y"))
    logfile = logfile.replace("%m", time.strftime("%m"))
    logfile = logfile.replace("%d", time.strftime("%d"))
    logfile = logfile.replace("%H", time.strftime("%H"))
    logfile = logfile.replace("%M", time.strftime("%M"))
    logfile = logfile.replace("%S", time.strftime("%S"))
    # Append in text mode: strlog is a str, so a binary mode would
    # raise TypeError on write.
    with open(logfile, "a") as myfile:
        myfile.write(strlog)

def add_log(log_pattern, event, filename, destination, event_type, event_subtype, exec_program, args, return_val):
    """Add log in filelog

    :param log_pattern: Pattern of log
    :type log_pattern: str
    :param event: Event Name
    :param filename: File name
    :param destination: Destination
    :param event_type: Event type
    :param event_subtype: Event sub type
    :param exec_program: executable
    :param args: Arguments
    :param return_val: Return Value
    """
    log = time.strftime("%Y-%m-%d %H:%M:%S")
    log += "|" + str(event)
    log += "|" + str(filename)
    log += "|" + str(destination)
    log += "|" + str(event_type)
    log += "|" + str(event_subtype)
    log += "|" + str(exec_program)
    log += "|" + str(args)
    # 0 on success, 1 on failure
    if return_val:
        log += "|" + str(0)
    else:
        log += "|" + str(1)
    log += os.linesep
    add_log_str(log_pattern, log)

class ExecHandler(PatternMatchingEventHandler):
    """Class ExecHandler
    """

    FILE_LOG = ""

    def __init__(self, event_conf):
        """
        :param event_conf: EventConf
        :type event_conf: em.event.EventConf
        """
        # Name the class explicitly: super(self.__class__, ...) recurses
        # infinitely as soon as this class is subclassed.
        super(ExecHandler, self).__init__(["*"], ["*.err"], True, False)
        self._patterns = event_conf.patterns
        self.event_conf = event_conf

    def is_scheduled(self):
        """Check if the event handler is scheduled

        :return: True if the event handler is scheduled
        :rtype: bool
        """
        return self.event_conf.is_scheduled()

    def on_created(self, event):
        """Function called when file is created

        :param event: File created event
        :type event: FileCreatedEvent
        """
        if isinstance(event, FileCreatedEvent):
            self.process(event)

    def on_modified(self, event):
        """Function called when file is modified

        :param event: File modified event
        :type event: FileModifiedEvent
        """
        if isinstance(event, FileModifiedEvent):
            self.process(event)

    def on_moved(self, event):
        """Function called when file is moved

        :param event: File moved event
        :type event: FileMovedEvent
        """
        if isinstance(event, FileMovedEvent):
            self.process(event)

    def process(self, event):
        """Function process

        :param event: Event file
        :type event: FileSystemEvent
        :return: Return value (True success, False error)
        :rtype: bool
        """
        if self.event_conf.enabled == 0:
            return False
        args = str(self.event_conf.get_context_value("execArgs"))
        # Substitute the %filename placeholder from the configuration
        args = args.replace("%filename", event.src_path)
        args = args.replace("%destination", self.event_conf.destination)
        exec_dir = os.path.dirname(self.event_conf.get_context_value("execProgram"))
        # basename, not dirname: exec_app is the program file inside exec_dir
        exec_app = os.path.basename(self.event_conf.get_context_value("execProgram"))
        ret = os.system("cd " + exec_dir + ";./" + exec_app + " " + args)
        # os.system() returns the command's exit status: 0 means success
        success = ret == 0
        add_log(self.FILE_LOG, self.event_conf.name, event.src_path, self.event_conf.destination,
                self.event_conf.type, self.event_conf.subtype,
                self.event_conf.get_context_value("execProgram"), args, success)
        if not success:
            os.rename(event.src_path, event.src_path + ".err")
        else:
            os.remove(event.src_path)
        return success

    def check_schedule(self, now):
        """Check if the event should be launched

        :param now: Actual date and time
        :type now: datetime.datetime
        :return: True if the event should be launched
        :rtype: bool
        """
        cron = croniter.croniter(self.event_conf.get_cron(), now)
        current_exec_datetime = cron.get_current(datetime.datetime)
        return (current_exec_datetime.year == now.year and current_exec_datetime.month == now.month and
                current_exec_datetime.day == now.day and current_exec_datetime.hour == now.hour and
                current_exec_datetime.minute == now.minute)

class FsHandler(PatternMatchingEventHandler):
    """File System Handler
    """

    FILE_LOG = ""
    TYPE_MOVE = 1
    TYPE_ARCHIVE = 2
    TYPE_COMPRESS = 3
    TYPE_UNARCHIVE = 4
    TYPE_UNCOMPRESS = 5
    STR_TYPE_MOVE = "move"
    STR_TYPE_ARCHIVE = "archive"
    STR_TYPE_COMPRESS = "compress"
    STR_TYPE_UNARCHIVE = "unarchive"
    STR_TYPE_UNCOMPRESS = "uncompress"

    def __init__(self, event_conf, fs_type):
        """
        :param event_conf: ExecConf
        :type event_conf: em.event.EventConf
        :param fs_type: Process type of Fs handler
        :type fs_type: str
        """
        # Name the class explicitly so subclasses do not recurse in __init__
        super(FsHandler, self).__init__(["*"], ["*.tmp", "*.err", "*.run"], True, False)
        self.event_conf = event_conf
        self.fs_type = fs_type
        self.delimiter = os.path.sep

    def is_scheduled(self):
        """Check if the event handler is scheduled

        :return: True if the event handler is scheduled
        :rtype: bool
        """
        return self.event_conf.is_scheduled()
    def on_any_event(self, event):
        if not event.src_path.endswith(".tmp"):
            for pattern in self.event_conf.patterns:
                if fnmatch.fnmatch(os.path.basename(event.src_path), pattern):
                    print(event)

    def on_created(self, event):
        """Handler listener on creation of file

        :param event: File created event
        :type event: FileCreatedEvent
        :return: True if the process ran correctly
        :rtype: bool
        """
        if isinstance(event, FileCreatedEvent):
            return self.process(event.src_path)
        return False

    def on_modified(self, event):
        """Handler listener on modification of file

        :param event: File modified event
        :type event: FileModifiedEvent
        :return: True if the process ran correctly
        :rtype: bool
        """
        if isinstance(event, FileModifiedEvent):
            return self.process(event.src_path)
        return False

    def on_moved(self, event):
        """Handler listener on move of file

        :param event: File moved event
        :type event: FileMovedEvent
        :return: True if the process ran correctly
        :rtype: bool
        """
        if isinstance(event, FileMovedEvent):
            return self.process(event.dest_path)
        return False
    def process(self, full_filename):
        """Function process

        :param full_filename: Full path of filename
        :type full_filename: str
        :return: True if the process ran correctly
        :rtype: bool
        """
        if self.event_conf.enabled == 0:
            return True
        if not os.path.exists(full_filename):
            return False
        if os.path.dirname(full_filename) != self.event_conf.directory:
            return False
        res = False
        filename = os.path.basename(full_filename)
        matched = False
        for pattern in self.event_conf.patterns:
            if fnmatch.fnmatch(filename, pattern):
                matched = True
                break
        if not matched:
            return False
        os.rename(full_filename, full_filename + ".run")
        if self.fs_type == FsHandler.TYPE_MOVE or self.fs_type == FsHandler.STR_TYPE_MOVE:
            res = self.process_move(filename, "run")
        elif self.fs_type == FsHandler.TYPE_ARCHIVE or self.fs_type == FsHandler.STR_TYPE_ARCHIVE:
            res = self.process_archive(filename, "run", "tmp")
        elif self.fs_type == FsHandler.TYPE_COMPRESS or self.fs_type == FsHandler.STR_TYPE_COMPRESS:
            res = self.process_compress(filename, "run", "tmp")
        elif self.fs_type == FsHandler.TYPE_UNCOMPRESS or self.fs_type == FsHandler.STR_TYPE_UNCOMPRESS:
            res = self.process_uncompress(filename, "run", "tmp")
        elif self.fs_type == FsHandler.TYPE_UNARCHIVE or self.fs_type == FsHandler.STR_TYPE_UNARCHIVE:
            res = self.process_unarchive(filename, "run")
        add_log(self.FILE_LOG, self.event_conf.name, full_filename, self.event_conf.destination,
                self.event_conf.type, self.event_conf.subtype, "", "", res)
        if not res:
            os.rename(full_filename + ".run", full_filename + ".err")
        return res
    def process_move(self, filename, extension):
        """Move file to destination

        directory -> File -> destination

        :param filename: Filename
        :type filename: str
        :param extension: Extension of file (run)
        :type extension: str
        :return: True if the process ran correctly
        :rtype: bool
        """
        if os.path.exists(self.event_conf.destination + self.delimiter + filename):
            os.remove(self.event_conf.destination + self.delimiter + filename)
        os.rename(self.event_conf.directory + self.delimiter + filename + "." + extension,
                  self.event_conf.destination + self.delimiter + filename)
        return os.path.exists(self.event_conf.destination + self.delimiter + filename)
    def process_archive(self, filename, extension, tmp_extension):
        """Create archive file (tar)

        :param filename: Filename
        :type filename: str
        :param extension: Extension of file (run)
        :type extension: str
        :param tmp_extension: Tmp extension
        :type tmp_extension: str
        :return: True if the process ran correctly
        :rtype: bool
        """
        if os.path.exists(self.event_conf.destination + self.delimiter + filename + ".tar" + "." + tmp_extension):
            os.remove(self.event_conf.destination + self.delimiter + filename + ".tar" + "." + tmp_extension)
        if os.path.exists(self.event_conf.destination + self.delimiter + filename + ".tar"):
            os.remove(self.event_conf.destination + self.delimiter + filename + ".tar")
        tar = tarfile.open(self.event_conf.destination + self.delimiter + filename + ".tar" + "." + tmp_extension, "w")
        tar.add(self.event_conf.directory + self.delimiter + filename + "." + extension, filename)
        tar.close()
        os.remove(self.event_conf.directory + self.delimiter + filename + "." + extension)
        os.rename(self.event_conf.destination + self.delimiter + filename + ".tar" + "." + tmp_extension,
                  self.event_conf.destination + self.delimiter + filename + ".tar")
        return os.path.exists(self.event_conf.destination + self.delimiter + filename + ".tar")
    def process_compress(self, filename, extension, tmp_extension):
        """Compress file (gzip)

        :param filename: Filename
        :type filename: str
        :param extension: Extension of file (run)
        :type extension: str
        :param tmp_extension: Tmp extension
        :type tmp_extension: str
        :return: True if the process ran correctly
        :rtype: bool
        """
        if os.path.exists(self.event_conf.destination + self.delimiter + filename + ".gz"):
            os.remove(self.event_conf.destination + self.delimiter + filename + ".gz")
        if os.path.exists(self.event_conf.destination + self.delimiter + filename + ".gz" + "." + tmp_extension):
            os.remove(self.event_conf.destination + self.delimiter + filename + ".gz" + "." + tmp_extension)
        # The with statement closes both files on exit
        with open(self.event_conf.directory + self.delimiter + filename + "." + extension, 'rb') as f_in, \
                gzip.open(self.event_conf.destination + self.delimiter + filename + ".gz" + "." + tmp_extension,
                          'wb') as f_out:
            shutil.copyfileobj(f_in, f_out)
        os.remove(self.event_conf.directory + self.delimiter + filename + "." + extension)
        os.rename(self.event_conf.destination + self.delimiter + filename + ".gz" + "." + tmp_extension,
                  self.event_conf.destination + self.delimiter + filename + ".gz")
        return os.path.exists(self.event_conf.destination + self.delimiter + filename + ".gz")
    def process_uncompress(self, filename, extension, tmp_extension):
        """Uncompress file (gunzip)

        :param filename: Filename
        :type filename: str
        :param extension: Extension of file (run)
        :type extension: str
        :param tmp_extension: Tmp extension
        :type tmp_extension: str
        :return: True if the process ran correctly
        :rtype: bool
        """
        if os.path.exists(self.event_conf.destination + self.delimiter + filename.replace(".gz", "") + "." +
                          tmp_extension):
            os.remove(self.event_conf.destination + self.delimiter + filename.replace(".gz", "") + "." + tmp_extension)
        if os.path.exists(self.event_conf.destination + self.delimiter + filename.replace(".gz", "")):
            os.remove(self.event_conf.destination + self.delimiter + filename.replace(".gz", ""))
        # gzip.open() yields bytes, so the output file must be opened in binary mode
        with gzip.open(self.event_conf.directory + self.delimiter + filename + "." + extension) as f_in, \
                open(self.event_conf.destination + self.delimiter + filename.replace(".gz", "") + "." + tmp_extension,
                     "wb") as f_out:
            f_out.write(f_in.read())
        os.rename(self.event_conf.destination + self.delimiter + filename.replace(".gz", "") + "." + tmp_extension,
                  self.event_conf.destination + self.delimiter + filename.replace(".gz", ""))
        os.remove(self.event_conf.directory + self.delimiter + filename + "." + extension)
        return os.path.exists(self.event_conf.destination + self.delimiter + filename.replace(".gz", ""))
    def process_unarchive(self, filename, extension):
        """Unarchive file (untar)

        :param filename: Filename
        :type filename: str
        :param extension: Extension of file (run)
        :type extension: str
        :return: True if the process ran correctly
        :rtype: bool
        """
        try:
            tar = tarfile.open(self.event_conf.directory + self.delimiter + filename + "." + extension)
            tar.extractall(self.event_conf.destination)
            tar.close()
            os.remove(self.event_conf.directory + self.delimiter + filename + "." + extension)
            return True
        except Exception:
            return False
    def run_schedule(self):
        """Run scheduled event

        :return: None
        """
        for filename in os.listdir(self.event_conf.directory):
            for pattern in self.event_conf.patterns:
                if fnmatch.fnmatch(filename, pattern):
                    exit(self.process(self.event_conf.directory + self.delimiter + filename))
        exit(1)
    def check_schedule(self, now):
        """Check if the event should be launched

        :param now: Actual date and time
        :type now: datetime.datetime
        :return: True if the event should be launched
        :rtype: bool
        """
        cron = croniter.croniter(self.event_conf.get_cron(), now)
        current_exec_datetime = cron.get_current(datetime.datetime)
        return (current_exec_datetime.year == now.year and current_exec_datetime.month == now.month and
                current_exec_datetime.day == now.day and current_exec_datetime.hour == now.hour and
                current_exec_datetime.minute == now.minute)

class HDFSHandler(PatternMatchingEventHandler):
    """HDFS handler
    """

    FILE_LOG = ""
    TYPE_PUT = 1
    TYPE_GET = 2
    STR_TYPE_PUT = "put"
    STR_TYPE_GET = "get"

    def __init__(self, event_conf, hdfs_type):
        """
        :param event_conf: ExecConf
        :type event_conf: em.event.EventConf
        :param hdfs_type: Process type of HDFS handler
        :type hdfs_type: str
        """
        # Name the class explicitly so subclasses do not recurse in __init__
        super(HDFSHandler, self).__init__(["*"], ["*.tmp", "*.err", "*.run"], True, False)
        self.event_conf = event_conf
        self.hdfs_type = hdfs_type
        self.delimiter = os.path.sep

    def is_scheduled(self):
        """Check if the event handler is scheduled

        :return: True if the event handler is scheduled
        :rtype: bool
        """
        if not self.event_conf.is_scheduled() and \
                (self.event_conf.subtype == self.STR_TYPE_GET or self.event_conf.subtype == self.TYPE_GET):
            raise ValueError("HDFSHandler: Subtype error - get should be scheduled!")
        return self.event_conf.is_scheduled()
    def on_any_event(self, event):
        if not event.src_path.endswith((".tmp", ".err", ".run")):
            for pattern in self.event_conf.patterns:
                if fnmatch.fnmatch(os.path.basename(event.src_path), pattern):
                    print(event)

    def on_created(self, event):
        """Handler listener on creation of file

        :param event: File created event
        :type event: FileCreatedEvent
        :return: True if the process ran correctly
        :rtype: bool
        """
        if isinstance(event, FileCreatedEvent):
            return self.process(event.src_path)
        return False

    def on_modified(self, event):
        """Handler listener on modification of file

        :param event: File modified event
        :type event: FileModifiedEvent
        :return: True if the process ran correctly
        :rtype: bool
        """
        if isinstance(event, FileModifiedEvent):
            return self.process(event.src_path)
        return False

    def on_moved(self, event):
        """Handler listener on move of file

        :param event: File moved event
        :type event: FileMovedEvent
        :return: True if the process ran correctly
        :rtype: bool
        """
        if isinstance(event, FileMovedEvent):
            return self.process(event.dest_path)
        return False
    def process(self, full_filename):
        """Function process

        :param full_filename: Full path of filename
        :type full_filename: str
        :return: True if the process ran correctly
        :rtype: bool
        """
        ret = False
        if self.event_conf.enabled == 0:
            return False
        filename = os.path.basename(full_filename)
        matched = False
        for pattern in self.event_conf.patterns:
            if fnmatch.fnmatch(filename, pattern):
                matched = True
                break
        if not matched:
            return False
        if self.event_conf.is_fs_directory():
            os.rename(full_filename, full_filename + ".run")
        if self.hdfs_type == self.TYPE_PUT or self.hdfs_type == self.STR_TYPE_PUT:
            ret = self.process_put(filename, "run", "err")
        if self.hdfs_type == self.TYPE_GET or self.hdfs_type == self.STR_TYPE_GET:
            ret = self.process_get(filename)
        add_log(self.FILE_LOG, self.event_conf.name, full_filename, self.event_conf.destination,
                self.event_conf.type, self.event_conf.subtype,
                self.event_conf.get_context_value("hdfsUrl"),
                self.event_conf.get_context_value("hdfsUser"), ret)
        return ret
    def process_put(self, filename, extension, err_extension):
        """Put file to HDFS

        :param filename: Filename
        :type filename: str
        :param extension: Extension of file (run)
        :type extension: str
        :param err_extension: Error extension (err)
        :type err_extension: str
        :return: True if the process ran correctly
        :rtype: bool
        """
        res = None
        try:
            client = hdfs.InsecureClient(self.event_conf.get_context_value("hdfsUrl"),
                                         self.event_conf.get_context_value("hdfsUser"))
            client.upload(self.event_conf.destination + "/" + filename,
                          self.event_conf.directory + self.delimiter + filename + "." + extension,
                          overwrite=True)
            res = client.status(self.event_conf.destination + "/" + filename, False)
        except Exception as e:
            print(e)
        ret = False
        if res is None:
            os.rename(self.event_conf.directory + self.delimiter + filename + "." + extension,
                      self.event_conf.directory + self.delimiter + filename + "." + extension + "." + err_extension)
        else:
            ret = True
            os.remove(self.event_conf.directory + self.delimiter + filename + "." + extension)
        return ret
    def process_get(self, filename):
        """Get file from HDFS

        :param filename: Filename
        :type filename: str
        :return: True if the process ran correctly
        :rtype: bool
        """
        ret = False
        try:
            client = hdfs.InsecureClient(self.event_conf.get_context_value("hdfsUrl"),
                                         self.event_conf.get_context_value("hdfsUser"))
            res = client.download(self.event_conf.directory + "/" + filename, self.event_conf.destination, True)
            ret = res == (self.event_conf.destination + os.path.sep +
                          os.path.basename(self.event_conf.directory + "/" + filename))
            if ret:
                client.delete(self.event_conf.directory + "/" + filename)
        except Exception as e:
            # Exception objects have no .message attribute in Python 3
            print(e)
        return ret
    def run_schedule(self):
        """Run scheduled event

        :return: None
        """
        if self.event_conf.subtype == self.STR_TYPE_GET or self.event_conf.subtype == self.TYPE_GET:
            client = hdfs.InsecureClient(self.event_conf.get_context_value("hdfsUrl"),
                                         self.event_conf.get_context_value("hdfsUser"))
            files = client.list(self.event_conf.directory)
            for filename in files:
                for pattern in self.event_conf.patterns:
                    if fnmatch.fnmatch(filename, pattern):
                        exit(self.process(self.event_conf.directory + "/" + filename))
        else:
            for filename in os.listdir(self.event_conf.directory):
                for pattern in self.event_conf.patterns:
                    if fnmatch.fnmatch(filename, pattern):
                        exit(self.process(self.event_conf.directory + self.delimiter + filename))
        exit(1)
    def check_schedule(self, now):
        """Check if the event should be launched

        :param now: Actual date and time
        :type now: datetime.datetime
        :return: True if the event should be launched
        :rtype: bool
        """
        cron = croniter.croniter(self.event_conf.get_cron(), now)
        current_exec_datetime = cron.get_current(datetime.datetime)
        return (current_exec_datetime.year == now.year and current_exec_datetime.month == now.month and
                current_exec_datetime.day == now.day and current_exec_datetime.hour == now.hour and
                current_exec_datetime.minute == now.minute)
import pytest
import tests.common as common
import nomad
import sys

@pytest.fixture
def nomad_setup():
    n = nomad.Nomad(host=common.IP, port=common.NOMAD_PORT, verify=False, token=common.NOMAD_TOKEN)
    return n


# integration tests requires nomad Vagrant VM or Binary running
def test_get_leader(nomad_setup):
    if int(sys.version[0]) == 3:
        assert isinstance(nomad_setup.status.leader.get_leader(), str) == True
    else:
        assert isinstance(
            nomad_setup.status.leader.get_leader(), unicode) == True


def test_get_peers(nomad_setup):
    assert isinstance(nomad_setup.status.peers.get_peers(), list) == True


def test_peers_dunder_getitem_exist(nomad_setup):
    n = nomad_setup.status.peers["{IP}:4647".format(IP=common.IP)]
    if int(sys.version[0]) == 3:
        assert isinstance(n, str)
    else:
        assert isinstance(n, unicode)


def test_peers_dunder_getitem_not_exist(nomad_setup):
    with pytest.raises(KeyError):
        p = nomad_setup.status.peers["{IP}:4647".format(IP="172.16.10.100")]


def test_peers_dunder_contain_exists(nomad_setup):
    assert "{IP}:4647".format(IP=common.IP) in nomad_setup.status.peers


def test_peers_dunder_contain_not_exist(nomad_setup):
    assert "{IP}:4647".format(
        IP="172.16.10.100") not in nomad_setup.status.peers


def test_leader_dunder_contain_exists(nomad_setup):
    assert "{IP}:4647".format(IP=common.IP) in nomad_setup.status.leader


def test_leader_dunder_contain_not_exist(nomad_setup):
    assert "{IP}:4647".format(
        IP="172.16.10.100") not in nomad_setup.status.leader


def test_dunder_str(nomad_setup):
    assert isinstance(str(nomad_setup.status), str)
    assert isinstance(str(nomad_setup.status.leader), str)
    assert isinstance(str(nomad_setup.status.peers), str)


def test_dunder_repr(nomad_setup):
    assert isinstance(repr(nomad_setup.status), str)
    assert isinstance(repr(nomad_setup.status.leader), str)
    assert isinstance(repr(nomad_setup.status.peers), str)


def test_dunder_getattr(nomad_setup):
    with pytest.raises(AttributeError):
        d = nomad_setup.status.does_not_exist


def test_peers_dunder_iter(nomad_setup):
    assert hasattr(nomad_setup.status.peers, '__iter__')
    for p in nomad_setup.status.peers:
        pass


def test_dunder_len(nomad_setup):
    assert len(nomad_setup.status.leader) >= 0
    assert len(nomad_setup.status.peers) >= 0
"""Module Enum:Matiere"""
from enum import Enum
class Matiere(Enum):
"""Enum : Matiere"""
HISTOIRE = 0
MATHS = 1
INFO = 2
SPORT = 3
import pytest
@pytest.fixture
def viz_plugin(qtbot):
    from otter.plugins.viz.VizPlugin import VizPlugin
    plugin = VizPlugin(None)
    yield plugin
n, m = map(int, input().split())
a_lst = list(map(int, input().split()))
b_lst = list(map(int, input().split()))
cnt = 0
for i in a_lst:
    for j in b_lst:
        if i > j:
            cnt += 1
print(cnt)
from threading import Thread
import datetime
import cv2

class FPS:
    """docstring for FPS"""

    def __init__(self):
        self._start = None
        self._end = None
        self._numFrames = 0

    def start(self):
        self._start = datetime.datetime.now()
        return self

    def stop(self):
        self._end = datetime.datetime.now()

    def update(self):
        self._numFrames += 1

    def elapsed(self):
        return (self._end - self._start).total_seconds()

    def fps(self):
        return self._numFrames / self.elapsed()

class videoStream:
    def __init__(self, src=0, name="videoStream", fps=None):
        self.stream = cv2.VideoCapture(src)
        (self.grabbed, self.frame) = self.stream.read()
        if fps:
            self.stream.set(cv2.CAP_PROP_FPS, fps)
        self.fokus = 0
        self.name = name
        self.stopped = False

    def start(self):
        t = Thread(target=self.update, name=self.name, args=())
        t.daemon = True
        t.start()
        return self

    def update(self):
        while True:
            if self.stopped:
                return
            (self.grabbed, self.frame) = self.stream.read()

    def read(self):
        return self.grabbed, self.frame

    def stop(self):
        self.stopped = True

    def getWidth(self):
        return int(self.stream.get(cv2.CAP_PROP_FRAME_WIDTH))

    def getHeight(self):
        return int(self.stream.get(cv2.CAP_PROP_FRAME_HEIGHT))

    def getFPS(self):
        return int(self.stream.get(cv2.CAP_PROP_FPS))

    def isOpen(self):
        return self.stream.isOpened()

    def setFramePosition(self, framePos):
        self.stream.set(cv2.CAP_PROP_POS_FRAMES, framePos)

    def getFramePosition(self):
        return int(self.stream.get(cv2.CAP_PROP_POS_FRAMES))

    def getFrameCount(self):
        return int(self.stream.get(cv2.CAP_PROP_FRAME_COUNT))

    def setFocus(self):
        self.stream.set(cv2.CAP_PROP_AUTOFOCUS, 0)
        self.stream.set(cv2.CAP_PROP_FOCUS, 0)

class multiCamera(videoStream):
    camAddrList = []

    def __init__(self, camAddrList, setFps=None):
        self.camAddr = camAddrList
        self._cams = []
        self._frames = []
        for cam_Addr in self.camAddr:
            cs = videoStream(src=cam_Addr, fps=setFps).start()
            self._cams.append(cs)

    def capture(self):
        self._frames = []
        for cm in self._cams:
            ret, frame = cm.read()
            if ret:
                self._frames.append(frame)
        return self._frames
from .normalizer import OnlineNormalizer, NormalizerStd, NormalizerMax, NormalizerId
from .tf_normalizer import tfNormalizer, tfNormalizerMax, tfNormalizerStd, tfNormalizerId
def create_build_nor_from_str(nor_cls_str, nor_kwargs):
    nor_cls = globals()[nor_cls_str]

    def build_nor(shape):
        return nor_cls(shape, unscale=False, unbias=False, **nor_kwargs)
    return build_nor
__author__ = 'mpetyx'
from django.contrib import admin
from .models import OpeniShop
class ShopAdmin(admin.ModelAdmin):
    pass
admin.site.register(OpeniShop, ShopAdmin)
d3314a32508d50103f80a9ed56866ebaa49e1a0f | 755 | py | Python | src/solver.py | geraldzakwan/katla-helper | c4f7d655fe00dc07bf1eee656b1e64684b5eded7 | [
"MIT"
] | null | null | null | src/solver.py | geraldzakwan/katla-helper | c4f7d655fe00dc07bf1eee656b1e64684b5eded7 | [
"MIT"
] | null | null | null | src/solver.py | geraldzakwan/katla-helper | c4f7d655fe00dc07bf1eee656b1e64684b5eded7 | [
"MIT"
] | null | null | null | from abc import ABC, abstractmethod
from src.utils import read_dictionary
class Solver(ABC):
    def __init__(self, word_dict_filepath, hist_dict_filepath):
        self.word_dict = read_dictionary(word_dict_filepath)
        self.hist_dict = read_dictionary(hist_dict_filepath)

    def is_in_dictionary(self, word):
        if word in self.word_dict:
            return True
        return False

    def is_used_previously(self, word):
        if word in self.hist_dict:
            return True
        return False

    @abstractmethod
    def set_important_consonants(self, num_important_consonants):
        pass

    @abstractmethod
    def get_starters(self):
        pass

    @abstractmethod
    def get_guesses(self, states):
        pass
| 22.205882 | 65 | 0.678146 | 94 | 755 | 5.138298 | 0.361702 | 0.082816 | 0.074534 | 0.057971 | 0.186335 | 0.082816 | 0 | 0 | 0 | 0 | 0 | 0 | 0.263576 | 755 | 33 | 66 | 22.878788 | 0.868705 | 0 | 0 | 0.434783 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.26087 | false | 0.130435 | 0.130435 | 0 | 0.608696 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 3 |
d35834e4ac51677b11ebf3a3403d9007c865a7d7 | 69 | py | Python | pymiddy/__init__.py | godvsdeity/pymiddy | 5126110a3f9e673f062e4248301796b5de10507d | [
"MIT"
] | 1 | 2021-01-15T06:17:50.000Z | 2021-01-15T06:17:50.000Z | pymiddy/__init__.py | godvsdeity/pymiddy | 5126110a3f9e673f062e4248301796b5de10507d | [
"MIT"
] | null | null | null | pymiddy/__init__.py | godvsdeity/pymiddy | 5126110a3f9e673f062e4248301796b5de10507d | [
"MIT"
] | null | null | null | __name__ = 'pymiddy'
__version__ = '0.1.0'
from .middy import Middy
| 13.8 | 24 | 0.710145 | 10 | 69 | 4.1 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.051724 | 0.15942 | 69 | 4 | 25 | 17.25 | 0.655172 | 0 | 0 | 0 | 0 | 0 | 0.173913 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
d374f59d67c6fe97e2ccc0a2b608d4637cc76e57 | 168 | py | Python | search.py | urplatshubham/arr_deal | a3b0563e78b9f1fa267bb934f3cf0ff922dc35f2 | [
"MIT"
] | null | null | null | search.py | urplatshubham/arr_deal | a3b0563e78b9f1fa267bb934f3cf0ff922dc35f2 | [
"MIT"
] | null | null | null | search.py | urplatshubham/arr_deal | a3b0563e78b9f1fa267bb934f3cf0ff922dc35f2 | [
"MIT"
] | null | null | null | class searching:
def lin_search(self, arr, num):
for i in range(len(arr)):
if arr[i] == num:
return i
return -1
| 21 | 36 | 0.458333 | 22 | 168 | 3.454545 | 0.727273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010638 | 0.440476 | 168 | 7 | 37 | 24 | 0.797872 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
d385ca7241bd26e2e1fb9eb4af0ac1bee221e31c | 1,465 | py | Python | sandbox/lib/jumpscale/JumpscaleLibs/clients/gitea/GiteaOrgForMember.py | threefoldtech/threebot_prebuilt | 1f0e1c65c14cef079cd80f73927d7c8318755c48 | [
"Apache-2.0"
] | 2 | 2019-05-09T07:21:25.000Z | 2019-08-05T06:37:53.000Z | sandbox/lib/jumpscale/JumpscaleLibs/clients/gitea/GiteaOrgForMember.py | threefoldtech/threebot_prebuilt | 1f0e1c65c14cef079cd80f73927d7c8318755c48 | [
"Apache-2.0"
] | 664 | 2018-12-19T12:43:44.000Z | 2019-08-23T04:24:42.000Z | sandbox/lib/jumpscale/JumpscaleLibs/clients/gitea/GiteaOrgForMember.py | threefoldtech/threebot_prebuilt | 1f0e1c65c14cef079cd80f73927d7c8318755c48 | [
"Apache-2.0"
] | 7 | 2019-05-03T07:14:37.000Z | 2019-08-05T12:36:52.000Z | from .GiteaOrg import GiteaOrg
class GiteaOrgForMember(GiteaOrg):
    def save(self, commit=True):
        is_valid, err = self._validate(create=True)
        if not commit or not is_valid:
            self._log_debug(err)
            return is_valid
        try:
            resp = self.user.client.api.admin.adminCreateOrg(data=self.data, username=self.user.username)
            org = resp.json()
            for k, v in org.items():
                setattr(self, k, v)
            return True
        except Exception as e:
            self._log_debug(e.response.content)
            return False

    def update(self, commit=True):
        is_valid, err = self._validate(update=True)
        if not commit or not is_valid:
            self._log_debug(err)
            return is_valid
        try:
            resp = self.user.client.api.orgs.orgEdit(data=self.data, org=self.username)
            return True
        except Exception as e:
            self._log_debug(e.response.content)
            return False

    # def delete(self, commit=True):
    #     is_valid, err = self._validate(delete=True)
    #
    #     if not commit or not is_valid:
    #         self._log_debug(err)
    #         return is_valid
    #     try:
    #         resp = self.user.client.api.admin.adminDeleteUser(username=self.username)
    #         return True
    #     except Exception as e:
    #         self._log_debug(e.response.content)
    #         return False
| 31.847826 | 105 | 0.574061 | 180 | 1,465 | 4.538889 | 0.277778 | 0.077111 | 0.088127 | 0.058752 | 0.717258 | 0.717258 | 0.717258 | 0.717258 | 0.585067 | 0.585067 | 0 | 0 | 0.335836 | 1,465 | 45 | 106 | 32.555556 | 0.839671 | 0.251877 | 0 | 0.592593 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.074074 | false | 0 | 0.037037 | 0 | 0.37037 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
d3894fa9b5b91661bd5eeb0871ee8e2e049cf34c | 1,486 | py | Python | crits/emails/urls.py | dutrow/crits | 6b357daa5c3060cf622d3a3b0c7b41a9ca69c049 | [
"MIT"
] | 738 | 2015-01-02T12:39:55.000Z | 2022-03-23T11:05:51.000Z | crits/emails/urls.py | dutrow/crits | 6b357daa5c3060cf622d3a3b0c7b41a9ca69c049 | [
"MIT"
] | 605 | 2015-01-01T01:03:39.000Z | 2021-11-17T18:51:07.000Z | crits/emails/urls.py | dutrow/crits | 6b357daa5c3060cf622d3a3b0c7b41a9ca69c049 | [
"MIT"
] | 316 | 2015-01-07T12:35:01.000Z | 2022-03-30T04:44:30.000Z | from django.conf.urls import url
from . import views
urlpatterns = [
    url(r'^search/$', views.email_search, name='crits-emails-views-email_search'),
    url(r'^delete/(?P<email_id>\w+)/$', views.email_del, name='crits-emails-views-email_del'),
    url(r'^upload/attach/(?P<email_id>\w+)/$', views.upload_attach, name='crits-emails-views-upload_attach'),
    url(r'^details/(?P<email_id>\w+)/$', views.email_detail, name='crits-emails-views-email_detail'),
    url(r'^new/fields/$', views.email_fields_add, name='crits-emails-views-email_fields_add'),
    url(r'^new/outlook/$', views.email_outlook_add, name='crits-emails-views-email_outlook_add'),
    url(r'^new/raw/$', views.email_raw_add, name='crits-emails-views-email_raw_add'),
    url(r'^new/yaml/$', views.email_yaml_add, name='crits-emails-views-email_yaml_add'),
    url(r'^new/eml/$', views.email_eml_add, name='crits-emails-views-email_eml_add'),
    url(r'^edit/(?P<email_id>\w+)/$', views.email_yaml_add, name='crits-emails-views-email_yaml_add'),
    url(r'^update_header_value/(?P<email_id>\w+)/$', views.update_header_value, name='crits-emails-views-update_header_value'),
    url(r'^indicator_from_header_field/(?P<email_id>\w+)/$', views.indicator_from_header_field, name='crits-emails-views-indicator_from_header_field'),
    url(r'^list/$', views.emails_listing, name='crits-emails-views-emails_listing'),
    url(r'^list/(?P<option>\S+)/$', views.emails_listing, name='crits-emails-views-emails_listing'),
]
| 70.761905 | 151 | 0.718708 | 234 | 1,486 | 4.320513 | 0.17094 | 0.178042 | 0.207715 | 0.276954 | 0.531157 | 0.363007 | 0.205737 | 0.205737 | 0.205737 | 0.104847 | 0 | 0 | 0.07537 | 1,486 | 20 | 152 | 74.3 | 0.735808 | 0 | 0 | 0 | 0 | 0 | 0.519515 | 0.469717 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.111111 | 0 | 0.111111 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
d39d87163c302a5224242e13bcd3a257bf4c8bac | 22,731 | py | Python | lib/extraction/XRR.py | ArnaudHemmerle/JupyLabBook | 1975eabc85e73e28432514fea2199fddd033ecfc | [
"MIT"
] | null | null | null | lib/extraction/XRR.py | ArnaudHemmerle/JupyLabBook | 1975eabc85e73e28432514fea2199fddd033ecfc | [
"MIT"
] | 20 | 2020-05-07T16:47:14.000Z | 2021-04-01T10:15:12.000Z | lib/extraction/XRR.py | ArnaudHemmerle/JupyLabBook | 1975eabc85e73e28432514fea2199fddd033ecfc | [
"MIT"
] | null | null | null | # custom libraries
from lib.extraction.common import PyNexus as PN
from lib.extraction.common import Check_dead_pixels
from lib.extraction import PilatusSum as PilatusSum
import numpy as np
import matplotlib as mpl
import matplotlib.pyplot as plt
import matplotlib.gridspec as gridspec
import matplotlib.colors as colors
import os
import sys
def Treat(nxs_filename, recording_dir, direct_nxs_filename,
          ROIx0, ROIy0, ROIsizex, ROIsizey, m4pitch0, wavelength,
          force_direct=True, fdirect=1.,
          working_dir='', plot_gains=False, plot_XRR_m4pitch=False, plot_XRR_qz=False,
          save=False, verbose=False):
    '''
    Call functions for extracting, plotting, and saving a series of XRR scans.

    Parameters
    ----------
    nxs_filename : str
        nexus filename
    recording_dir : str
        directory where the nexus file is stored
    direct_nxs_filename : str
        nexus filename of the direct scan
    ROIx0 : int
        x0 of the integrated ROI (for the direct and the XRR scans)
    ROIy0 : int
        y0 of the integrated ROI (for the direct and the XRR scans)
    ROIsizex : int
        sizex of the integrated ROI (for the direct and the XRR scans)
    ROIsizey : int
        sizey of the integrated ROI (for the direct and the XRR scans)
    m4pitch0 : float
        value of m4pitch0 (m4pitch aligned with the beam) in deg
    wavelength : float
        wavelength in nm
    force_direct : bool, optional
        enforce the value of fdirect (not extracted from the direct scan)
    fdirect : float, optional
        value of the normalisation to be used if force_direct is True
    working_dir : str, optional
        directory where the treated files will be stored
    plot_gains : bool, optional
        plot the intensity of the chamber for each gain
    plot_XRR_m4pitch : bool, optional
        plot the XRR as a function of m4pitch
    plot_XRR_qz : bool, optional
        plot the XRR as a function of qz
    save : bool, optional
        save the XRR
    verbose : bool, optional
        verbose mode

    Returns
    -------
    array_like
        m4pitch, an array containing the list of m4pitch
    array_like
        theta, an array containing the list of theta (i.e. 2.*theta/2.)
    array_like
        qz, an array containing the list of qz
    array_like
        gains, an array of arrays, each containing the list of intensities for a specific gain of the chamber
    array_like
        I0, an array containing the intensities of the chamber normalized by its gain
    array_like
        I, an array containing the values of the integrated ROI on the XRR scans
    array_like
        Inorm, an array containing the values I normalized by I0 and the direct (i.e. the reflectivity)

    Raises
    ------
    SystemExit('XRR_files.dat not found')
        when XXX_XRR_files.dat file not found
    SystemExit('Sensor not found')
        when a particular sensor is not found in the nxs
    SystemExit('direct_gains.dat not found')
        when XXX_direct_gains.dat file not found
    '''
    # Check if the direct should be extracted from a file, or is given by the user
    if force_direct:
        print('Direct is forced to the value: direct=%g'%fdirect)
        direct = fdirect
    else:
        direct = extract_direct(direct_nxs_filename, ROIx0, ROIy0, ROIsizex, ROIsizey,
                                recording_dir, verbose)
        print('Direct extracted from %s: direct=%g'%(direct_nxs_filename,direct))

    m4pitch, theta, qz, gains, I0, I, Inorm = \
        Extract(nxs_filename, ROIx0, ROIy0, ROIsizex, ROIsizey,
                m4pitch0, wavelength, direct,
                recording_dir, verbose)

    if (plot_gains or plot_XRR_m4pitch or plot_XRR_qz):
        Plot(m4pitch, theta, qz, gains, I0, Inorm,
             nxs_filename, plot_gains, plot_XRR_m4pitch, plot_XRR_qz)

    if save:
        Save(m4pitch, theta, qz, I0, I, direct, Inorm, nxs_filename, working_dir, verbose)

    return m4pitch, theta, qz, gains, I0, I, Inorm
def Extract(nxs_filename, ROIx0, ROIy0, ROIsizex, ROIsizey,
            m4pitch0, wavelength, direct,
            recording_dir, verbose):
    '''
    Extract the nexus scan and companion files and return useful quantities for XRR.

    Parameters
    ----------
    nxs_filename : str
        nexus filename
    ROIx0 : int
        x0 of the integrated ROI (for the direct and the XRR scans)
    ROIy0 : int
        y0 of the integrated ROI (for the direct and the XRR scans)
    ROIsizex : int
        sizex of the integrated ROI (for the direct and the XRR scans)
    ROIsizey : int
        sizey of the integrated ROI (for the direct and the XRR scans)
    m4pitch0 : float
        value of m4pitch0 (m4pitch aligned with the beam) in deg
    wavelength : float
        wavelength in nm
    direct : float
        value of the normalisation
    recording_dir : str
        directory where the nexus file is stored
    verbose : bool
        verbose mode

    Returns
    -------
    array_like
        m4pitch, an array containing the list of m4pitch
    array_like
        theta, an array containing the list of theta (i.e. 2.*theta/2.)
    array_like
        qz, an array containing the list of qz
    array_like
        gains, an array of arrays, each containing the list of intensities for a specific gain of the chamber
    array_like
        I0, an array containing the intensities of the chamber normalized by its gain
    array_like
        I, an array containing the values of the integrated ROI on the XRR scans
    array_like
        Inorm, an array containing the values I normalized by I0 and the direct (i.e. the reflectivity)

    Raises
    ------
    SystemExit('XRR_files.dat not found')
        when XXX_XRR_files.dat file not found
    SystemExit('Sensor not found')
        when a particular sensor is not found in the nxs
    '''
    # Extract the intensity of the ionization chamber for the different gains
    # The nxs_filename should be the first XRR, with a companion file XXX_XRR_files.dat
    files_path = recording_dir+nxs_filename[:-4]+'_XRR_files.dat'

    if not os.path.isfile(files_path):
        print(PN._RED+'The file %s seems not to exist in recording directory'%
              (nxs_filename[:-4]+'_XRR_files.dat')+PN._RESET)
        print('\t\t recording directory: %s'%recording_dir)
        sys.exit('XRR_files.dat not found')
    else:
        file_list = np.genfromtxt(recording_dir+nxs_filename[:-4]+'_XRR_files.dat', dtype='U')

    #########################
    # Extraction of the gains
    gain1 = np.array([])
    gain2 = np.array([])
    gain3 = np.array([])
    gain4 = np.array([])
    gain5 = np.array([])
    gain6 = np.array([])

    for file in file_list:
        file += '_XRR_gains.dat'
        if verbose: print('Extracting I0 for different gains from file %s'%file)

        gain1_temp = np.genfromtxt(recording_dir+file)[1]
        gain2_temp = np.genfromtxt(recording_dir+file)[2]
        gain3_temp = np.genfromtxt(recording_dir+file)[3]
        gain4_temp = np.genfromtxt(recording_dir+file)[4]
        gain5_temp = np.genfromtxt(recording_dir+file)[5]
        gain6_temp = np.genfromtxt(recording_dir+file)[6]

        gain1 = np.append(gain1, gain1_temp)
        gain2 = np.append(gain2, gain2_temp)
        gain3 = np.append(gain3, gain3_temp)
        gain4 = np.append(gain4, gain4_temp)
        gain5 = np.append(gain5, gain5_temp)
        gain6 = np.append(gain6, gain6_temp)

    gains = [gain1, gain2, gain3, gain4, gain5, gain6]

    # Identify saturated values
    g1s = np.where(gain1<9.9, gain1, -1)
    g2s = np.where(gain2<9.9, gain2, -1)
    g3s = np.where(gain3<9.9, gain3, -1)
    g4s = np.where(gain4<9.9, gain4, -1)
    g5s = np.where(gain5<9.9, gain5, -1)
    g6s = np.where(gain6<9.9, gain6, -1)

    # Construct the final curve
    g_temp = np.where(g6s<0, g5s/1e4, g6s/1e5)
    g_temp = np.where(g_temp<0, g4s/1e3, g_temp)
    g_temp = np.where(g_temp<0, g3s/1e2, g_temp)
    g_temp = np.where(g_temp<0, g2s/1e1, g_temp)
    I0 = np.where(g_temp<0, g1s, g_temp)

    I = np.array([])
    m4pitch = np.array([])

    #########################
    # Extraction of XRR
    for file in file_list:
        nxs_filename = file+'.nxs'

        # Extract the sum of all images
        image, _, _ = PilatusSum.Extract(nxs_filename, recording_dir,
                                         show_data_stamps=False, verbose=verbose)

        # Extract the ROI containing reflected beams
        # Full image: ROI = [0, 0, 981, 1043]
        ROI = [ROIx0, ROIy0, ROIsizex, ROIsizey]

        # Apply the ROI
        image_ROI = image[ROI[1]:ROI[1]+ROI[3], ROI[0]:ROI[0]+ROI[2]]

        # Extract info from nexus file
        nexus = PN.PyNexusFile(recording_dir+nxs_filename, fast=True)
        stamps0D, data0D = nexus.extractData("0D")
        nbpts = int(nexus.get_nbpts())

        sensor_list = [stamps0D[i][0] if stamps0D[i][1]== None else stamps0D[i][1] for i in range(len(stamps0D))]

        if 'm4pitch' in sensor_list:
            m4pitchArg = sensor_list.index('m4pitch')
        else:
            print(PN._RED+'\t Sensor %s is not in the sensor list'%('m4pitch')+PN._RESET)
            sys.exit('Sensor not found')

        if 'integration_time' in sensor_list:
            integration_timeArg = sensor_list.index('integration_time')
        else:
            print(PN._RED+'\t Sensor %s is not in the sensor list'%('integration_time')+PN._RESET)
            sys.exit('Sensor not found')

        m4pitch = np.append(m4pitch, np.mean(data0D[m4pitchArg]))
        integration_time = np.mean(data0D[integration_timeArg])

        # Sum the ROI and normalize with the integration time and the number of images
        integrated_ROI = image_ROI.sum(axis=0).sum(axis=0)/integration_time/nbpts
        I = np.append(I, integrated_ROI)

    # Convert to qz
    theta = np.abs(2*(m4pitch-m4pitch0)*np.pi/180.)  # Incident angle in rad
    qz = 4*np.pi/wavelength*np.sin(theta)  # qz in nm-1

    # Normalize by I0 and direct (Inorm is the reflectivity)
    Inorm = I/I0/direct

    return m4pitch, theta, qz, gains, I0, I, Inorm
def Plot(m4pitch, theta, qz, gains, I0, Inorm,
         nxs_filename, plot_gains, plot_XRR_m4pitch, plot_XRR_qz):
    '''
    Plot the intensity of the ion chamber for each gain and the XRR.

    Parameters
    ----------
    nxs_filename : str
        nexus filename
    m4pitch : array of float
        list of m4pitch
    theta : array of float
        list of theta (i.e. 2.*theta/2.)
    qz : array of float
        list of qz
    gains : array of array of float
        list of intensities for a specific gain of the chamber
    I0 : array of float
        intensities of the chamber normalized by its gain
    Inorm : array of float
        values of I normalized by I0 and the direct (i.e. the reflectivity)
    plot_gains : bool
        plot the intensity of the chamber for each gain
    plot_XRR_m4pitch : bool
        plot the XRR as a function of m4pitch
    plot_XRR_qz : bool
        plot the XRR as a function of qz
    '''
    # Unpack the gains
    [gain1, gain2, gain3, gain4, gain5, gain6] = gains

    if plot_gains:
        # Plot gains not normalized
        fig = plt.figure(figsize=(12,5))
        ax = fig.add_subplot(111)
        plt.yscale('log')
        plt.plot(m4pitch, gain1, 'ro-', label='gain 1')
        plt.plot(m4pitch, gain2, 'cx-', label='gain 2')
        plt.plot(m4pitch, gain3, 'bv-', label='gain 3')
        plt.plot(m4pitch, gain4, 'y^-', label='gain 4')
        plt.plot(m4pitch, gain5, 'ms-', label='gain 5')
        plt.plot(m4pitch, gain6, 'g*-', label='gain 6')
        plt.ylabel('I', fontsize=16)
        plt.legend()
        ax.set_title('I0 not normalized', fontsize=16)
        ax.set_xlabel('m4pitch', fontsize=16)
        ax.set_ylabel('I0', fontsize=16)
        fig.subplots_adjust(top=0.85)
        fig.suptitle('First scan: '+nxs_filename.split('\\')[-1], fontsize=16)
        ax.tick_params(labelsize=16)
        ax.yaxis.offsetText.set_fontsize(16)
        plt.show()

        # Plot the final curve
        fig = plt.figure(figsize=(12,5))
        ax = fig.add_subplot(111)
        plt.yscale('log')
        plt.plot(m4pitch, gain1, 'ro-')
        plt.plot(m4pitch, gain2/1e1, 'cx-')
        plt.plot(m4pitch, gain3/1e2, 'bv-')
        plt.plot(m4pitch, gain4/1e3, 'y^-')
        plt.plot(m4pitch, gain5/1e4, 'ms-')
        plt.plot(m4pitch, gain6/1e5, 'g*-')
        plt.plot(m4pitch, I0, 'k-', lw=3, label='Final I0')
        plt.legend()
        ax.set_title('I0 normalized', fontsize=16)
        ax.set_xlabel('m4pitch', fontsize=16)
        ax.set_ylabel('I0 norm.', fontsize=16)
        fig.subplots_adjust(top=0.85)
        ax.tick_params(labelsize=16)
        ax.yaxis.offsetText.set_fontsize(16)
        plt.show()

    if plot_XRR_m4pitch:
        fig = plt.figure(figsize=(12,5))
        ax = fig.add_subplot(111)
        plt.yscale('log')
        plt.plot(m4pitch, Inorm, 'x-k')
        ax.set_xlabel('m4pitch', fontsize=16)
        ax.set_ylabel('R', fontsize=16)
        fig.subplots_adjust(top=0.9)
        fig.suptitle('First scan: '+nxs_filename.split('\\')[-1], fontsize=16)
        ax.tick_params(labelsize=16)
        ax.yaxis.offsetText.set_fontsize(16)
        plt.show()

    if plot_XRR_qz:
        fig = plt.figure(figsize=(12,5))
        ax = fig.add_subplot(111)
        plt.yscale('log')
        plt.plot(qz, Inorm, 'x-k')
        ax.set_xlabel('qz (nm-1)', fontsize=16)
        ax.set_ylabel('R', fontsize=16)
        fig.subplots_adjust(top=0.9)
        fig.suptitle('First scan: '+nxs_filename.split('\\')[-1], fontsize=16)
        ax.tick_params(labelsize=16)
        ax.yaxis.offsetText.set_fontsize(16)
        plt.show()
def Save(m4pitch, theta, qz, I0, I, direct, Inorm, nxs_filename, working_dir, verbose):
    '''
    Save the XRR curve.

    XXX_XRR.dat: different parameters and XRR for each point of m4pitch.

    Parameters
    ----------
    m4pitch : array of float
        list of m4pitch
    theta : array of float
        list of theta (i.e. 2.*theta/2.)
    qz : array of float
        list of qz
    I0 : array of float
        intensities of the chamber normalized by its gain
    I : array of float
        values of the integrated ROI on the XRR scans
    direct : float
        value of the direct
    Inorm : array of float
        values of I normalized by I0 and the direct (i.e. the reflectivity)
    nxs_filename : str
        nexus filename
    working_dir : str
        directory where the treated files will be stored
    verbose : bool
        verbose mode
    '''
    # Create save name
    savename = working_dir+nxs_filename[:nxs_filename.rfind('.nxs')]

    # Save XRR
    tobesaved = [m4pitch, theta, qz, I0, I, len(I)*[direct], Inorm]
    np.savetxt(savename+'_XRR.dat', np.transpose(tobesaved),
               delimiter='\t', comments='',
               header='#m4pitch(deg)\t#theta(rad)\t#qz(nm-1)\t#I0\t#I\t#direct\t#R')

    if verbose:
        print('\t. XRR saved in:')
        print('\t%s_XRR.dat'%savename)
def extract_direct(nxs_filename, ROIx0, ROIy0, ROIsizex, ROIsizey,
                   recording_dir, verbose):
    '''
    Extract the nexus scan of the direct beam and companion files and return the value of the direct.

    Parameters
    ----------
    nxs_filename : str
        nexus filename
    ROIx0 : int
        x0 of the integrated ROI (for the direct and the XRR scans)
    ROIy0 : int
        y0 of the integrated ROI (for the direct and the XRR scans)
    ROIsizex : int
        sizex of the integrated ROI (for the direct and the XRR scans)
    ROIsizey : int
        sizey of the integrated ROI (for the direct and the XRR scans)
    recording_dir : str
        directory where the nexus file is stored
    verbose : bool
        verbose mode

    Returns
    -------
    float
        direct, value of the direct

    Raises
    ------
    SystemExit('direct_gains.dat not found')
        when XXX_direct_gains.dat file not found
    SystemExit('Sensor not found')
        when a particular sensor is not found in the nxs
    '''
    file_path = recording_dir+nxs_filename[:-4]+'_direct_gains.dat'

    if not os.path.isfile(file_path):
        print(PN._RED+'The file %s seems not to exist in recording directory'%
              (nxs_filename[:-4]+'_direct_gains.dat')+PN._RESET)
        print('\t\t recording directory: %s'%recording_dir)
        sys.exit('direct_gains.dat not found')
    else:
        file = nxs_filename[:-4]+'_direct_gains.dat'

    #########################
    # Extraction of the gains
    gain1 = np.array([])
    gain2 = np.array([])
    gain3 = np.array([])
    gain4 = np.array([])
    gain5 = np.array([])
    gain6 = np.array([])

    if verbose: print('Extracting I0 for different gains from file %s'%file)

    gain1_temp = np.genfromtxt(recording_dir+file)[0]
    gain2_temp = np.genfromtxt(recording_dir+file)[1]
    gain3_temp = np.genfromtxt(recording_dir+file)[2]
    gain4_temp = np.genfromtxt(recording_dir+file)[3]
    gain5_temp = np.genfromtxt(recording_dir+file)[4]
    gain6_temp = np.genfromtxt(recording_dir+file)[5]

    gain1 = np.append(gain1, gain1_temp)
    gain2 = np.append(gain2, gain2_temp)
    gain3 = np.append(gain3, gain3_temp)
    gain4 = np.append(gain4, gain4_temp)
    gain5 = np.append(gain5, gain5_temp)
    gain6 = np.append(gain6, gain6_temp)

    gains = [gain1, gain2, gain3, gain4, gain5, gain6]

    # Identify saturated values
    g1s = np.where(gain1<9.9, gain1, -1)
    g2s = np.where(gain2<9.9, gain2, -1)
    g3s = np.where(gain3<9.9, gain3, -1)
    g4s = np.where(gain4<9.9, gain4, -1)
    g5s = np.where(gain5<9.9, gain5, -1)
    g6s = np.where(gain6<9.9, gain6, -1)

    # Construct the final curve
    g_temp = np.where(g6s<0, g5s/1e4, g6s/1e5)
    g_temp = np.where(g_temp<0, g4s/1e3, g_temp)
    g_temp = np.where(g_temp<0, g3s/1e2, g_temp)
    g_temp = np.where(g_temp<0, g2s/1e1, g_temp)
    I0 = np.where(g_temp<0, g1s, g_temp)[0]

    #########################
    # Extraction of direct

    # Extract the sum of all images
    image, _, _ = PilatusSum.Extract(nxs_filename, recording_dir,
                                     show_data_stamps=False, verbose=verbose)

    # Extract the ROI containing reflected beams
    # Full image: ROI = [0, 0, 981, 1043]
    ROI = [ROIx0, ROIy0, ROIsizex, ROIsizey]

    # Apply the ROI
    image_ROI = image[ROI[1]:ROI[1]+ROI[3], ROI[0]:ROI[0]+ROI[2]]

    # Extract info from nexus file
    nexus = PN.PyNexusFile(recording_dir+nxs_filename, fast=True)
    nbpts = int(nexus.get_nbpts())
    stamps0D, data0D = nexus.extractData("0D")

    sensor_list = [stamps0D[i][0] if stamps0D[i][1]== None else stamps0D[i][1] for i in range(len(stamps0D))]

    if 'integration_time' in sensor_list:
        integration_timeArg = sensor_list.index('integration_time')
    else:
        print(PN._RED+'\t Sensor %s is not in the sensor list'%('integration_time')+PN._RESET)
        sys.exit('Sensor not found')

    integration_time = np.mean(data0D[integration_timeArg])

    # Sum the ROI and normalize with the integration time, the number of images
    image_ROI_sum = image_ROI.sum(axis=0).sum(axis=0)
    direct = image_ROI_sum/integration_time/I0/nbpts

    return direct
def Calib(calib_XRR_data, distance):
    '''
    Fit and plot the values from the calibration, to give the user the coefficients to be used in the XRR scripts.

    Parameters
    ----------
    calib_XRR_data : array
        array with the values of the calibration for m4pitch, c10tablepitch, gamma and zs
    distance : float
        the distance pilatus-trough in mm
    '''
    # c10tablepitch vs m4pitch
    fig = plt.figure(1, figsize=(10,5))
    ax = fig.add_subplot(111)
    xfit = calib_XRR_data[:,0]
    yfit = calib_XRR_data[:,1]
    ax.plot(xfit, yfit, 'bo')
    B_c10, A_c10 = np.polyfit(xfit, yfit, 1)
    ax.plot(xfit, A_c10+B_c10*xfit, 'r-', lw=2)
    ax.set_title('Calibration c10tablepitch')
    ax.set_xlabel('m4pitch (deg)', fontsize=16)
    ax.set_ylabel('c10tablepitch (step)', fontsize=16)
    ax.tick_params(labelsize=16)
    ax.yaxis.offsetText.set_fontsize(16)
    fig.text(0.15, .8, "coeff_c10tablepitch = %3.5g (steps per deg)"%(B_c10), fontsize=14)
    plt.show()

    # zs vs m4pitch
    fig = plt.figure(1, figsize=(10,5))
    ax = fig.add_subplot(111)
    xfit = calib_XRR_data[:,0]
    yfit = calib_XRR_data[:,3]
    ax.plot(xfit, yfit, 'bo')
    B_zs, A_zs = np.polyfit(xfit, yfit, 1)
    ax.plot(xfit, A_zs+B_zs*xfit, 'r-', lw=2)
    ax.set_title('Calibration zs')
    ax.set_xlabel('m4pitch (deg)', fontsize=16)
    ax.set_ylabel('zs (mm)', fontsize=16)
    ax.tick_params(labelsize=16)
    ax.yaxis.offsetText.set_fontsize(16)
    fig.text(0.15, .8, "coeff_zs = %3.5f (mm per deg)"%(B_zs), fontsize=14)
    plt.show()

    # gamma vs m4pitch
    fig = plt.figure(1, figsize=(10,5))
    ax = fig.add_subplot(111)
    xfit = calib_XRR_data[:,0]
    yfit = calib_XRR_data[:,2]
    ax.plot(xfit, yfit, 'bo')
    B_gamma, A_gamma = np.polyfit(xfit, yfit, 1)
    coeff_gamma = (B_gamma+2)*np.pi*distance/(180.*B_zs)
    ax.plot(xfit, A_gamma + B_gamma*xfit, 'r-', lw=2)
    ax.set_title('Calibration gamma')
    ax.set_xlabel('m4pitch (deg)', fontsize=16)
    ax.set_ylabel('gamma (deg)', fontsize=16)
    ax.tick_params(labelsize=16)
    ax.yaxis.offsetText.set_fontsize(16)
    #fig.text(0.2, .8, "%3.5g deg per deg"%(B_gamma), fontsize=14)
    fig.text(0.15, .8, "coeff_gamma = %3.2f "%(coeff_gamma), fontsize=14)
    plt.show()
| 36.427885 | 117 | 0.607232 | 3,163 | 22,731 | 4.251028 | 0.101802 | 0.013759 | 0.017849 | 0.021419 | 0.771233 | 0.733675 | 0.712033 | 0.670608 | 0.641306 | 0.627398 | 0 | 0.039528 | 0.284369 | 22,731 | 623 | 118 | 36.486356 | 0.787054 | 0.338524 | 0 | 0.566901 | 0 | 0.003521 | 0.094401 | 0.004197 | 0 | 0 | 0 | 0 | 0 | 1 | 0.021127 | false | 0 | 0.042254 | 0 | 0.073944 | 0.045775 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
d3a0e130e8b9d7d7c42e0164502452cf061d2603 | 499 | py | Python | app/controllers/__init__.py | s1hofmann/MrHyde2.0-Backbone | d1d95cb8ba3e2db60c75218625453b915e87e1bf | [
"MIT"
] | 3 | 2017-05-14T20:11:49.000Z | 2018-04-12T03:56:06.000Z | app/controllers/__init__.py | s1hofmann/MrHyde2.0-Backbone | d1d95cb8ba3e2db60c75218625453b915e87e1bf | [
"MIT"
] | null | null | null | app/controllers/__init__.py | s1hofmann/MrHyde2.0-Backbone | d1d95cb8ba3e2db60c75218625453b915e87e1bf | [
"MIT"
] | null | null | null | from flask import Blueprint
from app import current_config
jekyll = Blueprint('jekyll',
__name__,
static_folder=current_config.STATICDIR,
template_folder=current_config.TEMPLATEDIR)
status = Blueprint('status',
__name__,
static_folder=current_config.STATICDIR,
template_folder=current_config.TEMPLATEDIR)
from .status_controller import status
from .jekyll_controller import jekyll
| 29.352941 | 62 | 0.655311 | 47 | 499 | 6.553191 | 0.340426 | 0.211039 | 0.246753 | 0.149351 | 0.493506 | 0.493506 | 0.493506 | 0.493506 | 0.493506 | 0.493506 | 0 | 0 | 0.292585 | 499 | 16 | 63 | 31.1875 | 0.872521 | 0 | 0 | 0.5 | 0 | 0 | 0.024048 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0.25 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
d3a1dc062b30267f8b6a7cc3486fac2bf169ce7d | 1,808 | py | Python | tests/test_util.py | BrechtBa/knxpy | 9e486f4a4623f586091e72cc6472441f3efbdd72 | [
"MIT"
] | null | null | null | tests/test_util.py | BrechtBa/knxpy | 9e486f4a4623f586091e72cc6472441f3efbdd72 | [
"MIT"
] | null | null | null | tests/test_util.py | BrechtBa/knxpy | 9e486f4a4623f586091e72cc6472441f3efbdd72 | [
"MIT"
] | null | null | null | #!/usr/bin/env/ python
################################################################################
# Copyright (c) 2016 Daniel Matuschek
# This file is part of knxpy.
#
# Permission is hereby granted, free of charge, to any person obtaining a
# copy of this software and associated documentation files (the "Software"),
# to deal in the Software without restriction, including without limitation
# the rights to use, copy, modify, merge, publish, distribute, sublicense,
# and/or sell copies of the Software, and to permit persons to whom the
# Software is furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
################################################################################
import unittest
import asyncio
import knxpy
class TestUtil(unittest.TestCase):

    def test_encode_ga(self):
        self.assertEqual(knxpy.util.encode_ga('1/1/71'), 2375)

    def test_decode_ga(self):
        self.assertEqual(knxpy.util.decode_ga(2375), '1/1/71')

    def test_encode_dpt_1(self):
        self.assertEqual(knxpy.util.encode_dpt(0,'1'), [0])

    def test_decode_dpt_1(self):
        self.assertEqual(knxpy.util.decode_dpt(1,'1'), 1)

    def test_encode_dpt_5(self):
        self.assertEqual(knxpy.util.encode_dpt(140,'5'), [0,140])

    def test_decode_dpt_5(self):
        self.assertEqual(knxpy.util.decode_dpt(b'\x8c','5'), 140)

    def test_encode_dpt_9(self):
        self.assertEqual(knxpy.util.encode_dpt(22.64,'9'), b'\x00\x0cl')

    def test_decode_dpt_9(self):
        self.assertEqual(knxpy.util.decode_dpt(b'\x0cl','9'), 22.64)


if __name__ == '__main__':
    unittest.main()
| 34.769231 | 80 | 0.626659 | 244 | 1,808 | 4.487705 | 0.393443 | 0.051142 | 0.138813 | 0.175342 | 0.292237 | 0.292237 | 0.226484 | 0.069406 | 0 | 0 | 0 | 0.039946 | 0.183075 | 1,808 | 51 | 81 | 35.45098 | 0.701422 | 0.380531 | 0 | 0 | 0 | 0 | 0.046463 | 0 | 0 | 0 | 0 | 0 | 0.363636 | 1 | 0.363636 | false | 0 | 0.136364 | 0 | 0.545455 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
d3a8fa7d4f538e6d4f3b23579f159cc8cfadaa69 | 480 | py | Python | 4/oop6.py | ikramulkayes/Python_season2 | d057460d07c5d2d218ecd52e08c1d355add44df2 | [
"MIT"
] | null | null | null | 4/oop6.py | ikramulkayes/Python_season2 | d057460d07c5d2d218ecd52e08c1d355add44df2 | [
"MIT"
] | null | null | null | 4/oop6.py | ikramulkayes/Python_season2 | d057460d07c5d2d218ecd52e08c1d355add44df2 | [
"MIT"
] | null | null | null | class Vehicle:
    def __init__(self):
        self.lst = [0, 0]

    def moveUp(self):
        self.lst[1] += 1

    def moveDown(self):
        self.lst[1] -= 1

    def moveRight(self):
        self.lst[0] += 1

    def moveLeft(self):
        self.lst[0] -= 1

    def print_position(self):
        print(tuple(self.lst))


car = Vehicle()
car.print_position()
car.moveUp()
car.print_position()
car.moveLeft()
car.print_position()
car.moveDown()
car.print_position()
car.moveRight() | 20 | 30 | 0.604167 | 67 | 480 | 4.19403 | 0.238806 | 0.149466 | 0.19573 | 0.270463 | 0.227758 | 0.227758 | 0 | 0 | 0 | 0 | 0 | 0.027322 | 0.2375 | 480 | 24 | 31 | 20 | 0.740437 | 0 | 0 | 0.181818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.272727 | false | 0 | 0 | 0 | 0.318182 | 0.272727 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
6c92d27382dba0aad5c7403248ee9b4f5c933459 | 85 | py | Python | ABC_A/ABC143_A.py | ryosuke0825/atcoder_python | 185cdbe7db44ecca1aaf357858d16d31ce515ddb | [
"MIT"
] | null | null | null | ABC_A/ABC143_A.py | ryosuke0825/atcoder_python | 185cdbe7db44ecca1aaf357858d16d31ce515ddb | [
"MIT"
] | null | null | null | ABC_A/ABC143_A.py | ryosuke0825/atcoder_python | 185cdbe7db44ecca1aaf357858d16d31ce515ddb | [
"MIT"
] | null | null | null | a, b = map(int, input().split())
if b * 2 >= a:
    print(0)
else:
    print(a - b * 2)
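The branch above computes "a minus twice b, floored at zero"; the same logic collapses to a single `max` call. A small equivalent sketch (the `solve` helper name is ours, not part of the submission):

```python
def solve(a, b):
    # Remainder of a after subtracting b twice, never below zero.
    return max(0, a - b * 2)

print(solve(12, 4))   # 4
print(solve(20, 15))  # 0
```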
| 12.142857 | 32 | 0.482353 | 17 | 85 | 2.411765 | 0.647059 | 0.097561 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.048387 | 0.270588 | 85 | 6 | 33 | 14.166667 | 0.612903 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0.4 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
6c9b3c5cc62a2d5ac5dd14196042eed862b77aa0 | 858 | py | Python | recruitment_agency_api/agency_api/views.py | swingthrough/recruitment-agency-storage-api-task | ab388cdfb21cf5611e9fb00a0e7dfc20c125c5c3 | [
"MIT"
] | null | null | null | recruitment_agency_api/agency_api/views.py | swingthrough/recruitment-agency-storage-api-task | ab388cdfb21cf5611e9fb00a0e7dfc20c125c5c3 | [
"MIT"
] | null | null | null | recruitment_agency_api/agency_api/views.py | swingthrough/recruitment-agency-storage-api-task | ab388cdfb21cf5611e9fb00a0e7dfc20c125c5c3 | [
"MIT"
] | null | null | null | from rest_framework import generics
from rest_framework import mixins
from . import models
from . import serializers
# Create your views here.
# class JobCandidateView(
# mixins.CreateModelMixin,
# mixins.RetrieveModelMixin,
# mixins.ListModelMixin,
# mixins.UpdateModelMixin,
# mixins.DestroyModelMixin,
# generics.GenericAPIView,
# ):
# queryset = models.JobCandidate.objects.all()
# serializer_class = serializers.JobCandidateSerializer
# def get_queryset(self):
# return super().get_queryset()
# class JobAdView(
# mixins.CreateModelMixin,
# mixins.RetrieveModelMixin,
# mixins.ListModelMixin,
# mixins.UpdateModelMixin,
# mixins.DestroyModelMixin,
# generics.GenericAPIView,
# ):
# queryset = models.JobAd.objects.all()
# serializer_class = serializers.JobAdSerializer | 26.8125 | 59 | 0.719114 | 74 | 858 | 8.256757 | 0.459459 | 0.026187 | 0.055646 | 0.075286 | 0.599018 | 0.481178 | 0.481178 | 0.481178 | 0.481178 | 0.481178 | 0 | 0 | 0.187646 | 858 | 32 | 60 | 26.8125 | 0.876614 | 0.799534 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
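For reference, the commented stack above mixes collection behaviour (list/create) with detail behaviour (retrieve/update/destroy); in Django REST framework that combination is usually expressed as a `ModelViewSet` rather than hand-assembled mixins. A sketch, kept commented out to match the file's style and assuming the models and serializers named in the comments exist (the `viewsets` import would also be needed):

```python
# from rest_framework import viewsets
#
# class JobCandidateViewSet(viewsets.ModelViewSet):
#     queryset = models.JobCandidate.objects.all()
#     serializer_class = serializers.JobCandidateSerializer
```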
6ca3c3072f88c49c10e934740064094359f1f4b8 | 999 | py | Python | kalasearch/http_client.py | oeddyo/kalasearch-python-sdk | bbb0cd44bd19d23ee4f1a3ea0aa5599e3e1fd38b | [
"MIT"
] | 9 | 2020-08-19T06:48:25.000Z | 2022-02-05T07:24:30.000Z | kalasearch/http_client.py | oeddyo/kalasearch-python-sdk | bbb0cd44bd19d23ee4f1a3ea0aa5599e3e1fd38b | [
"MIT"
] | 1 | 2020-08-24T00:55:32.000Z | 2020-08-24T00:55:32.000Z | kalasearch/http_client.py | oeddyo/kalasearch-python-sdk | bbb0cd44bd19d23ee4f1a3ea0aa5599e3e1fd38b | [
"MIT"
] | 4 | 2021-11-09T20:41:13.000Z | 2022-03-22T09:13:54.000Z | import requests
class HttpClient():
    config = None
    headers = {}

    def __init__(self, config):
        self.config = config
        self.headers = {
            "X-Kalasearch-Id": self.config.appId,
            "X-Kalasearch-Key": self.config.apiKey,
            "Content-Type": "application/json"
        }

    def send_request(self, http_method, path, body=None):
        endpoint = self.config.domain + "/" + path
        if body is None:
            request = http_method(endpoint, headers=self.headers)
        else:
            request = http_method(endpoint, headers=self.headers, json=body)
        return request.json()

    def get(self, path):
        return self.send_request(requests.get, path)

    def post(self, path, body=None):
        return self.send_request(requests.post, path, body)

    def put(self, path, body=None):
        return self.send_request(requests.put, path, body)

    def delete(self, path):
        return self.send_request(requests.delete, path)
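The client expects a config object exposing `appId`, `apiKey`, and `domain` attributes. A quick sketch of that contract — the `SimpleNamespace` stand-in and the demo values are ours, since the real config class is not shown in this file:

```python
from types import SimpleNamespace

# Hypothetical stand-in for the real config object; HttpClient only
# reads these three attributes.
config = SimpleNamespace(appId="demo-id", apiKey="demo-key",
                         domain="https://api.example.com")

# Headers and endpoint assembled the same way __init__ and send_request do:
headers = {
    "X-Kalasearch-Id": config.appId,
    "X-Kalasearch-Key": config.apiKey,
    "Content-Type": "application/json",
}
endpoint = config.domain + "/" + "indexes/search"
print(endpoint)  # https://api.example.com/indexes/search
```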
| 28.542857 | 76 | 0.613614 | 120 | 999 | 5.008333 | 0.3 | 0.083195 | 0.093178 | 0.139767 | 0.415973 | 0.415973 | 0.415973 | 0.14975 | 0.14975 | 0 | 0 | 0 | 0.27027 | 999 | 34 | 77 | 29.382353 | 0.824417 | 0 | 0 | 0 | 0 | 0 | 0.06006 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.230769 | false | 0 | 0.038462 | 0.153846 | 0.576923 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
6ca943dcea2e6106828fb2f413dbeb83826f198e | 171 | py | Python | tests/pyb/pyb_f405.py | TG-Techie/circuitpython | 390295dd218fb705fe652de77132dea472adf1ed | [
"MIT",
"BSD-3-Clause",
"MIT-0",
"Unlicense"
] | 3 | 2020-01-09T21:50:22.000Z | 2020-01-15T08:27:48.000Z | tests/pyb/pyb_f405.py | TG-Techie/circuitpython | 390295dd218fb705fe652de77132dea472adf1ed | [
"MIT",
"BSD-3-Clause",
"MIT-0",
"Unlicense"
] | null | null | null | tests/pyb/pyb_f405.py | TG-Techie/circuitpython | 390295dd218fb705fe652de77132dea472adf1ed | [
"MIT",
"BSD-3-Clause",
"MIT-0",
"Unlicense"
] | 1 | 2020-01-11T12:42:41.000Z | 2020-01-11T12:42:41.000Z | # test pyb module on F405 MCUs
import os, pyb

if "STM32F405" not in os.uname().machine:
    print("SKIP")
    raise SystemExit

print(pyb.freq())
print(type(pyb.rng()))
| 15.545455 | 41 | 0.672515 | 27 | 171 | 4.259259 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.057143 | 0.181287 | 171 | 10 | 42 | 17.1 | 0.764286 | 0.163743 | 0 | 0 | 0 | 0 | 0.092199 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.166667 | 0 | 0.166667 | 0.5 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 3 |