hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
a1de14ec6277bfec1f83bc1158b25a9e6f73c868 | 65 | py | Python | autoprotocol/version.py | kevin-ss-kim/autoprotocol-python | f55818e31b5c49bc093291f3ecc452f2b061e0a9 | [
"BSD-3-Clause"
] | null | null | null | autoprotocol/version.py | kevin-ss-kim/autoprotocol-python | f55818e31b5c49bc093291f3ecc452f2b061e0a9 | [
"BSD-3-Clause"
] | null | null | null | autoprotocol/version.py | kevin-ss-kim/autoprotocol-python | f55818e31b5c49bc093291f3ecc452f2b061e0a9 | [
"BSD-3-Clause"
] | null | null | null | """Maintains current version of package"""
__version__ = "6.1.2"
| 21.666667 | 42 | 0.707692 | 9 | 65 | 4.666667 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.052632 | 0.123077 | 65 | 2 | 43 | 32.5 | 0.684211 | 0.553846 | 0 | 0 | 0 | 0 | 0.217391 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
a1e2e6423c6af48c84a3959d270e3cdaa9b51fa4 | 874 | py | Python | mdm/utils.py | agnihotri7/dj_mdm | 9fc68393d270d361d2a37b726282277b15121658 | [
"MIT"
] | null | null | null | mdm/utils.py | agnihotri7/dj_mdm | 9fc68393d270d361d2a37b726282277b15121658 | [
"MIT"
] | null | null | null | mdm/utils.py | agnihotri7/dj_mdm | 9fc68393d270d361d2a37b726282277b15121658 | [
"MIT"
] | null | null | null | """
"""
import sys
import uuid
import base64
import fileinput
import datetime
from django.utils import timezone
from django.conf import settings
from django.shortcuts import get_object_or_404
from urllib.parse import urlparse, parse_qs  # Python 3 (was `urlparse` in Python 2)
from APNSWrapper import *
from mdm.models import MDMDevice, DeviceCommand
def replaceAll(file, searchExp, replaceExp):
    for line in fileinput.input(file, inplace=1):
        if searchExp in line:
            line = line.replace(searchExp, replaceExp)
        sys.stdout.write(line)
def notify_device(device):
    device_token = base64.b64decode(device.device_token)
    cert = settings.APNS_CERT
    wrapper = APNSNotificationWrapper(cert, False)
    message = APNSNotification()
    message.token(device_token)
    message.appendProperty(APNSProperty('mdm', str(device.push_magic)))
    wrapper.append(message)
    wrapper.notify()
| 25.705882 | 71 | 0.751716 | 107 | 874 | 6.046729 | 0.53271 | 0.046368 | 0.05255 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013736 | 0.167048 | 874 | 33 | 72 | 26.484848 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0.00346 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.08 | false | 0 | 0.44 | 0 | 0.52 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
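The `replaceAll` helper above rewrites a file in place through `fileinput`'s `inplace` mode, where writes to stdout are redirected into the file. A minimal self-contained sketch of the same pattern, on a throwaway temporary file (the file name and contents here are made up for illustration):

```python
import fileinput
import os
import sys
import tempfile

def replace_all(path, search_exp, replace_exp):
    """Rewrite `path` in place, substituting search_exp with replace_exp."""
    # inplace=True redirects sys.stdout into a replacement copy of the file
    for line in fileinput.input(path, inplace=True):
        if search_exp in line:
            line = line.replace(search_exp, replace_exp)
        sys.stdout.write(line)

# demo on a temporary file
fd, path = tempfile.mkstemp(text=True)
with os.fdopen(fd, "w") as f:
    f.write("host=localhost\nport=8080\n")
replace_all(path, "localhost", "127.0.0.1")
with open(path) as f:
    result = f.read()
os.remove(path)
print(result)
```

Every line is written back, changed or not, so untouched lines survive the rewrite.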
a1f747225cd20292d907c35e437ba676e03d1874 | 511 | py | Python | app/core/auth.py | oxfn/owtest | f4eeae225ef67684d96edd5708c44a0fd639d037 | [
"Unlicense"
] | null | null | null | app/core/auth.py | oxfn/owtest | f4eeae225ef67684d96edd5708c44a0fd639d037 | [
"Unlicense"
] | null | null | null | app/core/auth.py | oxfn/owtest | f4eeae225ef67684d96edd5708c44a0fd639d037 | [
"Unlicense"
] | null | null | null | from fastapi import Depends
from fastapi.exceptions import HTTPException
from fastapi.security import OAuth2PasswordBearer
from app.models.users import User, UserRepository
get_token = OAuth2PasswordBearer(tokenUrl="/login")
async def get_user(
    token: str = Depends(get_token), users: UserRepository = Depends()
) -> User:
    """Get current authenticated user."""
    user = await users.get(token=token)
    if user:
        return user
    raise HTTPException(status_code=403, detail="Invalid token")
| 28.388889 | 70 | 0.749511 | 61 | 511 | 6.213115 | 0.508197 | 0.087071 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011655 | 0.16047 | 511 | 17 | 71 | 30.058824 | 0.871795 | 0 | 0 | 0 | 0 | 0 | 0.040084 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.166667 | 0.333333 | 0 | 0.416667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 2 |
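The `get_user` dependency above resolves a bearer token to a user or raises a 403. The lookup logic can be sketched without FastAPI; the in-memory repository and the `HTTPException` stand-in below are hypothetical stand-ins for illustration only:

```python
import asyncio

class HTTPException(Exception):
    """Stand-in for fastapi's HTTPException so the sketch runs without FastAPI."""
    def __init__(self, status_code, detail):
        super().__init__(detail)
        self.status_code = status_code
        self.detail = detail

class InMemoryUserRepository:
    """Hypothetical repository mapping bearer tokens to user names."""
    def __init__(self, users):
        self._users = users
    async def get(self, token):
        return self._users.get(token)

async def get_user(token, users):
    """Resolve a token to a user, or raise 403 -- mirrors the dependency above."""
    user = await users.get(token=token)
    if user:
        return user
    raise HTTPException(status_code=403, detail="Invalid token")

repo = InMemoryUserRepository({"t0k3n": "alice"})
user = asyncio.run(get_user("t0k3n", repo))
print(user)  # alice
```

In the real endpoint, FastAPI injects `token` and `users` automatically via `Depends`; here they are passed by hand.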
a1fa4d83464708be7267466fae9107d6a82954d1 | 32,249 | py | Python | modelling/model_seiihurd_matrices.py | lhunlindeion/Mathematical-and-Statistical-Modeling-of-COVID19-in-Brazil | 164f19fcf04fe391aa7515fe436c63c6534fa89c | [
"MIT"
] | 37 | 2020-03-28T16:36:56.000Z | 2021-11-16T11:34:55.000Z | modelling/model_seiihurd_matrices.py | lhunlindeion/Mathematical-and-Statistical-Modeling-of-COVID19-in-Brazil | 164f19fcf04fe391aa7515fe436c63c6534fa89c | [
"MIT"
] | 1 | 2020-05-29T16:39:03.000Z | 2020-06-01T19:29:55.000Z | modelling/model_seiihurd_matrices.py | lhunlindeion/Mathematical-and-Statistical-Modeling-of-COVID19-in-Brazil | 164f19fcf04fe391aa7515fe436c63c6534fa89c | [
"MIT"
] | 9 | 2020-03-28T00:00:16.000Z | 2021-02-19T14:41:47.000Z | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Tue May 19 18:08:01 2020
@author: Felipe A. C. Pereira
Implementação do ajuste do modelo SEIIHURD com separação de grupos. Necessita
de mais verificações e funções para simplificar o input. Baseado nas classes
disponíveis no modelos.py
"""
import numpy as np
from functools import reduce
import scipy.integrate as spi
from scipy.optimize import least_squares
from platypus import NSGAII, Problem, Real
from pyswarms.single.global_best import GlobalBestPSO
import pyswarms as ps
from pyswarms.backend.topology import Star
from pyswarms.utils.plotters import plot_cost_history
from itertools import repeat
import multiprocessing as mp
import copy
import joblib
'''
Social contact matrices from
PREM, Kiesha; COOK, Alex R.; JIT, Mark. Projecting social contact matrices in
152 countries using contact surveys and demographic data. PLoS computational
biology, v. 13, n. 9, p. e1005697, 2017.
'''
ages_Mu_min = 5 * np.arange(16)
Mu_house = np.array([[0.47868515, 0.50507561, 0.29848922, 0.15763748, 0.26276959,
0.40185462, 0.46855027, 0.42581354, 0.2150961 , 0.0856771 ,
0.08705463, 0.07551931, 0.05129175, 0.02344832, 0.00793644,
0.01072846],
[0.35580205, 0.77874482, 0.51392686, 0.21151069, 0.08597966,
0.28306027, 0.49982218, 0.52854893, 0.41220947, 0.15848728,
0.07491245, 0.07658339, 0.04772343, 0.02588962, 0.01125956,
0.01073152],
[0.25903114, 0.63488713, 1.36175618, 0.50016515, 0.11748191,
0.10264613, 0.24113458, 0.47274372, 0.54026417, 0.26708819,
0.11007723, 0.04406045, 0.02746409, 0.02825033, 0.02044872,
0.01214665],
[0.14223192, 0.24383932, 0.53761638, 1.05325205, 0.28778496,
0.10925453, 0.0651564 , 0.2432454 , 0.39011334, 0.41381277,
0.23194909, 0.07541471, 0.03428398, 0.02122257, 0.01033573,
0.00864859],
[0.27381886, 0.15430529, 0.16053062, 0.5104134 , 0.95175366,
0.3586594 , 0.09248672, 0.04774269, 0.15814197, 0.36581739,
0.25544811, 0.13338965, 0.03461345, 0.01062458, 0.00844199,
0.00868782],
[0.59409802, 0.26971847, 0.10669146, 0.18330524, 0.39561893,
0.81955947, 0.26376865, 0.06604084, 0.03824556, 0.11560004,
0.23218163, 0.15331788, 0.07336147, 0.02312255, 0.00412646,
0.01025778],
[0.63860889, 0.75760606, 0.43109156, 0.09913293, 0.13935789,
0.32056062, 0.65710277, 0.25488454, 0.1062129 , 0.0430932 ,
0.06880784, 0.09938458, 0.09010691, 0.02233902, 0.01155556,
0.00695246],
[0.56209348, 0.87334544, 0.75598244, 0.33199136, 0.07233271,
0.08674171, 0.20243583, 0.60062714, 0.17793601, 0.06307045,
0.04445926, 0.04082447, 0.06275133, 0.04051762, 0.01712777,
0.00598721],
[0.35751289, 0.66234582, 0.77180208, 0.54993616, 0.17368099,
0.07361914, 0.13016852, 0.19937327, 0.46551558, 0.15412263,
0.06123041, 0.0182514 , 0.04234381, 0.04312892, 0.01656267,
0.01175358],
[0.208131 , 0.41591452, 0.56510014, 0.67760241, 0.38146504,
0.14185001, 0.06160354, 0.12945701, 0.16470166, 0.41150841,
0.14596804, 0.04404807, 0.02395316, 0.01731295, 0.01469059,
0.02275339],
[0.30472548, 0.26744442, 0.41631962, 0.46516888, 0.41751365,
0.28520772, 0.13931619, 0.07682945, 0.11404965, 0.16122096,
0.33813266, 0.1349378 , 0.03755396, 0.01429426, 0.01356763,
0.02551792],
[0.52762004, 0.52787011, 0.33622117, 0.43037934, 0.36416323,
0.42655672, 0.33780201, 0.13492044, 0.0798784 , 0.15795568,
0.20367727, 0.33176385, 0.12256126, 0.05573807, 0.0124446 ,
0.02190564],
[0.53741472, 0.50750067, 0.3229994 , 0.30706704, 0.21340314,
0.27424513, 0.32838657, 0.26023515, 0.13222548, 0.07284901,
0.11950584, 0.16376401, 0.25560123, 0.09269703, 0.02451284,
0.00631762],
[0.37949376, 0.55324102, 0.47449156, 0.24796638, 0.19276924,
0.20675484, 0.3267867 , 0.39525729, 0.3070043 , 0.10088992,
0.10256839, 0.13016641, 0.1231421 , 0.24067708, 0.05475668,
0.01401368],
[0.16359554, 0.48536065, 0.40533723, 0.31542539, 0.06890518,
0.15670328, 0.12884062, 0.27912381, 0.25685832, 0.20143856,
0.12497647, 0.07565566, 0.10331686, 0.08830789, 0.15657321,
0.05744065],
[0.29555039, 0.39898035, 0.60257982, 0.5009724 , 0.13799378,
0.11716593, 0.14366306, 0.31602298, 0.34691652, 0.30960511,
0.31253708, 0.14557295, 0.06065554, 0.10654772, 0.06390924,
0.09827735]])
Mu_school = np.array([[3.21885854e-001, 4.31659966e-002, 7.88269419e-003,
8.09548363e-003, 5.35038146e-003, 2.18201974e-002,
4.01633514e-002, 2.99376002e-002, 1.40680283e-002,
1.66587853e-002, 9.47774696e-003, 7.41041622e-003,
1.28200661e-003, 7.79120405e-004, 8.23608272e-066,
6.37926405e-120],
[5.40133328e-002, 4.84870697e+000, 2.70046494e-001,
3.14778450e-002, 3.11206331e-002, 8.56826951e-002,
1.08251879e-001, 9.46101139e-002, 8.63528188e-002,
5.51141159e-002, 4.19385198e-002, 1.20958942e-002,
4.77242219e-003, 1.39787217e-003, 3.47452943e-004,
8.08973738e-039],
[4.56461982e-004, 1.04840235e+000, 6.09152459e+000,
1.98915822e-001, 1.99709921e-002, 6.68319525e-002,
6.58949586e-002, 9.70851505e-002, 9.54147078e-002,
6.70538232e-002, 4.24864096e-002, 1.98701346e-002,
5.11869429e-003, 7.27320438e-004, 4.93746124e-025,
1.82153965e-004],
[2.59613205e-003, 4.73315233e-002, 1.99337834e+000,
7.20040500e+000, 8.57326037e-002, 7.90668822e-002,
8.54208542e-002, 1.10816964e-001, 8.76955236e-002,
9.22975521e-002, 4.58035025e-002, 2.51130956e-002,
5.71391798e-003, 1.07818752e-003, 6.21174558e-033,
1.70710246e-070],
[7.19158720e-003, 2.48833195e-002, 9.89727235e-003,
8.76815025e-001, 4.33963352e-001, 5.05185217e-002,
3.30594492e-002, 3.81384107e-002, 2.34709676e-002,
2.67235372e-002, 1.32913985e-002, 9.00655556e-003,
6.94913059e-004, 1.25675951e-003, 1.77164197e-004,
1.21957619e-047],
[7.04119204e-003, 1.19412206e-001, 3.75016980e-002,
2.02193056e-001, 2.79822908e-001, 1.68610223e-001,
2.86939363e-002, 3.56961469e-002, 4.09234494e-002,
3.32290896e-002, 8.12074348e-003, 1.26152144e-002,
4.27869081e-003, 2.41737477e-003, 4.63116893e-004,
1.28597237e-003],
[1.41486320e-002, 3.86561429e-001, 2.55902236e-001,
1.69973534e-001, 4.98104010e-002, 8.98122446e-002,
7.95333394e-002, 5.19274611e-002, 5.46612930e-002,
2.64567137e-002, 2.03241595e-002, 2.96263220e-003,
5.42888613e-003, 4.47585970e-004, 1.65440335e-048,
3.11189454e-055],
[2.40945305e-002, 2.11030046e-001, 1.54767246e-001,
8.17929897e-002, 1.84061608e-002, 5.43009779e-002,
7.39351186e-002, 5.21677009e-002, 5.63267084e-002,
2.51807147e-002, 3.53972554e-003, 7.96646343e-003,
5.56929776e-004, 2.08530461e-003, 1.84428290e-123,
9.69555083e-067],
[7.81313905e-003, 1.14371898e-001, 9.09011945e-002,
3.80212104e-001, 8.54533192e-003, 2.62430162e-002,
2.51880009e-002, 3.22563508e-002, 6.73506045e-002,
2.24997143e-002, 2.39241043e-002, 6.50627191e-003,
5.50892674e-003, 4.78308850e-004, 4.81213215e-068,
2.40231425e-092],
[6.55265016e-002, 2.31163536e-001, 1.49970765e-001,
5.53563093e-001, 5.74032526e-003, 3.02865481e-002,
5.72506883e-002, 4.70559232e-002, 4.28736553e-002,
2.42614518e-002, 2.86665377e-002, 1.29570473e-002,
3.24362518e-003, 1.67930318e-003, 6.20916950e-134,
3.27297624e-072],
[1.72765646e-002, 3.43744913e-001, 4.30902785e-001,
4.74293073e-001, 5.39328187e-003, 1.44128740e-002,
3.95545363e-002, 3.73781860e-002, 4.56834488e-002,
5.92135906e-002, 2.91473801e-002, 1.54857502e-002,
4.53105390e-003, 8.87272668e-024, 1.23797452e-117,
5.64262349e-078],
[6.14363036e-002, 2.98367348e-001, 2.59092700e-001,
3.00800812e-001, 5.92454596e-003, 5.26458862e-002,
2.02188672e-002, 3.27897605e-002, 4.07753741e-002,
2.83422407e-002, 2.43657809e-002, 2.73993226e-002,
8.87990718e-003, 1.13279180e-031, 7.81960493e-004,
7.62467510e-004],
[3.63695643e-002, 5.96870355e-002, 3.05072624e-002,
1.45523978e-001, 1.26062984e-002, 1.69458169e-003,
1.55127292e-002, 4.22097670e-002, 9.21792425e-003,
1.42200652e-002, 1.10967529e-002, 5.77020348e-003,
2.04474044e-002, 1.11075734e-002, 4.42271199e-067,
2.12068625e-037],
[1.67937029e-003, 2.72971001e-002, 1.05886266e-002,
7.61087735e-032, 1.97191559e-003, 1.92885006e-003,
1.24343737e-002, 5.39297787e-003, 5.41684968e-003,
8.63502071e-003, 1.94554498e-003, 1.49082274e-002,
8.11781100e-003, 1.74395489e-002, 1.11239023e-002,
3.45693088e-126],
[1.28088348e-028, 5.11065200e-026, 1.93019797e-040,
7.60476035e-003, 2.63586947e-022, 1.69749024e-024,
1.25875005e-026, 7.62109877e-003, 7.84979948e-003,
2.11516023e-002, 3.52117832e-002, 2.14360383e-002,
7.73902109e-003, 8.01328325e-003, 7.91285055e-003,
2.13825814e-002],
[2.81655586e-094, 2.11305187e-002, 8.46562506e-042,
2.12592841e-002, 4.89802057e-036, 7.59232387e-003,
9.77247001e-069, 2.23108239e-060, 1.43715978e-048,
8.56015694e-060, 4.69469043e-042, 1.59822047e-046,
2.20978550e-083, 8.85861277e-107, 1.02042815e-080,
6.61413913e-113]])
Mu_work = np.array([[0.00000000e+000, 0.00000000e+000, 0.00000000e+000,
0.00000000e+000, 0.00000000e+000, 0.00000000e+000,
0.00000000e+000, 0.00000000e+000, 0.00000000e+000,
0.00000000e+000, 0.00000000e+000, 0.00000000e+000,
0.00000000e+000, 8.20604524e-092, 1.20585150e-005,
3.16436834e-125],
[0.00000000e+000, 1.16840561e-003, 9.90713236e-072,
4.42646396e-059, 2.91874286e-006, 9.98773031e-003,
2.58779981e-002, 5.66104376e-003, 2.12699812e-002,
5.72117462e-003, 1.48212306e-003, 1.23926126e-003,
1.28212945e-056, 1.34955578e-005, 7.64591325e-079,
2.38392073e-065],
[0.00000000e+000, 2.56552144e-003, 1.12756182e-001,
2.40351143e-002, 2.62981485e-002, 7.56512432e-003,
6.19587609e-002, 1.73269871e-002, 5.87405128e-002,
3.26749742e-002, 1.24709193e-002, 2.93054408e-008,
3.71596993e-017, 2.79780317e-053, 4.95800770e-006,
3.77718083e-102],
[0.00000000e+000, 1.07213881e-002, 4.28390448e-002,
7.22769090e-001, 5.93479736e-001, 3.39341952e-001,
3.17013715e-001, 2.89168861e-001, 3.11143180e-001,
2.34889238e-001, 1.32953769e-001, 6.01944097e-002,
1.47306181e-002, 8.34699602e-006, 2.85972822e-006,
1.88926122e-031],
[0.00000000e+000, 9.14252587e-003, 5.74508682e-002,
4.00000235e-001, 7.93386618e-001, 7.55975146e-001,
6.32277283e-001, 6.83601459e-001, 4.98506972e-001,
3.82309992e-001, 2.81363576e-001, 1.23338103e-001,
4.15708021e-002, 9.86113407e-006, 1.32609387e-005,
3.74318048e-006],
[0.00000000e+000, 1.04243481e-002, 7.34587492e-002,
3.49556755e-001, 7.50680101e-001, 1.25683393e+000,
9.01245714e-001, 8.63446835e-001, 7.70443641e-001,
5.17237071e-001, 4.09810981e-001, 1.80645400e-001,
5.51284783e-002, 1.60674627e-005, 1.01182608e-005,
3.01442534e-006],
[0.00000000e+000, 1.65842404e-002, 8.34076781e-002,
1.89301935e-001, 5.21246906e-001, 8.54460001e-001,
1.12054931e+000, 9.64310078e-001, 8.34675180e-001,
6.52534012e-001, 3.79383514e-001, 2.11198205e-001,
5.17285688e-002, 1.63795563e-005, 4.10100851e-006,
3.49478980e-006],
[0.00000000e+000, 1.11666639e-002, 5.03319748e-002,
3.70510313e-001, 4.24294782e-001, 7.87535547e-001,
8.45085693e-001, 1.14590365e+000, 1.07673077e+000,
7.13492115e-001, 5.00740004e-001, 1.90102207e-001,
3.59740115e-002, 1.22988530e-005, 9.13512833e-006,
6.02097416e-006],
[0.00000000e+000, 6.07792440e-003, 5.49337607e-002,
2.23499535e-001, 4.82353827e-001, 7.52291991e-001,
8.89187601e-001, 9.33765370e-001, 1.10492283e+000,
8.50124391e-001, 5.88941528e-001, 1.94947085e-001,
5.09477228e-002, 1.43626161e-005, 1.02721567e-005,
1.29503893e-005],
[0.00000000e+000, 3.31622551e-003, 7.01829848e-002,
2.67512972e-001, 3.14796392e-001, 5.41516885e-001,
6.95769048e-001, 7.50620518e-001, 7.50038547e-001,
7.00954088e-001, 4.35197983e-001, 2.11283335e-001,
3.88576200e-002, 1.62810370e-005, 1.08243610e-005,
6.09172339e-006],
[0.00000000e+000, 4.39576425e-004, 7.17737968e-002,
1.89254612e-001, 2.47832532e-001, 5.16027731e-001,
6.02783971e-001, 6.15949277e-001, 8.05581107e-001,
7.44063535e-001, 5.44855374e-001, 2.52198706e-001,
4.39235685e-002, 1.18079721e-005, 1.18226645e-005,
1.01613165e-005],
[0.00000000e+000, 4.91737561e-003, 1.08686672e-001,
1.24987806e-001, 1.64110983e-001, 3.00118829e-001,
4.18159745e-001, 3.86897613e-001, 4.77718241e-001,
3.60854250e-001, 3.22466456e-001, 1.92516925e-001,
4.07209694e-002, 1.34978304e-005, 6.58739925e-006,
6.65716756e-006],
[0.00000000e+000, 6.35447018e-004, 3.96329620e-002,
1.83072502e-002, 7.04596701e-002, 1.24861117e-001,
1.37834574e-001, 1.59845720e-001, 1.66933479e-001,
1.56084857e-001, 1.14949158e-001, 8.46570798e-002,
1.50879843e-002, 2.03019580e-005, 8.26102156e-006,
1.48398182e-005],
[7.60299521e-006, 3.36326754e-006, 7.64855296e-006,
2.27621532e-005, 3.14933351e-005, 7.89308410e-005,
7.24212842e-005, 2.91748203e-005, 6.61873732e-005,
5.95693238e-005, 7.70713500e-005, 5.30687748e-005,
4.66030117e-005, 1.41633235e-005, 2.49066205e-005,
1.19109038e-005],
[5.78863840e-055, 7.88785149e-042, 2.54830412e-006,
2.60648191e-005, 1.68036205e-005, 2.12446739e-005,
3.57267603e-005, 4.02377033e-005, 3.56401935e-005,
3.09769252e-005, 2.13053382e-005, 4.49709414e-005,
2.61368373e-005, 1.68266203e-005, 1.66514322e-005,
2.60822813e-005],
[2.35721271e-141, 9.06871674e-097, 1.18637122e-089,
9.39934076e-022, 4.66000452e-005, 4.69664011e-005,
4.69316082e-005, 8.42184044e-005, 2.77788168e-005,
1.03294378e-005, 1.06803618e-005, 7.26341826e-075,
1.10073971e-065, 1.02831671e-005, 5.16902994e-049,
8.28040509e-043]])
Mu_other = np.array([[0.95537734, 0.46860132, 0.27110607, 0.19447667, 0.32135073,
0.48782072, 0.54963024, 0.42195593, 0.27152038, 0.17864251,
0.20155642, 0.16358271, 0.1040159 , 0.0874149 , 0.05129938,
0.02153823],
[0.51023519, 2.17757364, 0.9022516 , 0.24304235, 0.20119518,
0.39689588, 0.47242431, 0.46949918, 0.37741651, 0.16843746,
0.12590504, 0.12682331, 0.11282247, 0.08222718, 0.03648526,
0.02404257],
[0.18585796, 1.11958124, 4.47729443, 0.67959759, 0.43936317,
0.36934142, 0.41566744, 0.44467286, 0.48797422, 0.28795385,
0.17659191, 0.10674831, 0.07175567, 0.07249261, 0.04815305,
0.03697862],
[0.09854482, 0.3514869 , 1.84902386, 5.38491613, 1.27425161,
0.59242579, 0.36578735, 0.39181798, 0.38131832, 0.31501028,
0.13275648, 0.06408612, 0.04499218, 0.04000664, 0.02232326,
0.01322698],
[0.13674436, 0.1973461 , 0.33264088, 2.08016394, 3.28810184,
1.29198125, 0.74642201, 0.44357051, 0.32781391, 0.35511243,
0.20132011, 0.12961 , 0.04994553, 0.03748657, 0.03841073,
0.02700581],
[0.23495203, 0.13839031, 0.14085679, 0.5347385 , 1.46021275,
1.85222022, 1.02681162, 0.61513602, 0.39086271, 0.32871844,
0.25938947, 0.13520412, 0.05101963, 0.03714278, 0.02177751,
0.00979745],
[0.23139098, 0.18634831, 0.32002214, 0.2477269 , 0.64111274,
0.93691022, 1.14560725, 0.73176025, 0.43760432, 0.31057135,
0.29406937, 0.20632155, 0.09044896, 0.06448983, 0.03041877,
0.02522842],
[0.18786196, 0.25090485, 0.21366969, 0.15358412, 0.35761286,
0.62390736, 0.76125666, 0.82975354, 0.54980593, 0.32778339,
0.20858991, 0.1607099 , 0.13218526, 0.09042909, 0.04990491,
0.01762718],
[0.12220241, 0.17968132, 0.31826246, 0.19846971, 0.34823183,
0.41563737, 0.55930999, 0.54070187, 0.5573184 , 0.31526474,
0.20194048, 0.09234293, 0.08377534, 0.05819374, 0.0414762 ,
0.01563101],
[0.03429527, 0.06388018, 0.09407867, 0.17418896, 0.23404519,
0.28879108, 0.34528852, 0.34507961, 0.31461973, 0.29954426,
0.21759668, 0.09684718, 0.06596679, 0.04274337, 0.0356891 ,
0.02459849],
[0.05092152, 0.10829561, 0.13898902, 0.2005828 , 0.35807132,
0.45181815, 0.32281821, 0.28014803, 0.30125545, 0.31260137,
0.22923948, 0.17657382, 0.10276889, 0.05555467, 0.03430327,
0.02064256],
[0.06739051, 0.06795035, 0.0826437 , 0.09522087, 0.23309189,
0.39055444, 0.39458465, 0.29290532, 0.27204846, 0.17810118,
0.24399007, 0.22146653, 0.13732849, 0.07585801, 0.03938794,
0.0190908 ],
[0.04337917, 0.05375367, 0.05230119, 0.08066901, 0.16619572,
0.25423056, 0.25580913, 0.27430323, 0.22478799, 0.16909017,
0.14284879, 0.17211604, 0.14336033, 0.10344522, 0.06797049,
0.02546014],
[0.04080687, 0.06113728, 0.04392062, 0.04488748, 0.12808591,
0.19886058, 0.24542711, 0.19678011, 0.17800136, 0.13147441,
0.13564091, 0.14280335, 0.12969805, 0.11181631, 0.05550193,
0.02956066],
[0.01432324, 0.03441212, 0.05604694, 0.10154456, 0.09204 ,
0.13341443, 0.13396901, 0.16682638, 0.18562675, 0.1299677 ,
0.09922375, 0.09634331, 0.15184583, 0.13541738, 0.1169359 ,
0.03805293],
[0.01972631, 0.02274412, 0.03797545, 0.02036785, 0.04357298,
0.05783639, 0.10706321, 0.07688271, 0.06969759, 0.08029393,
0.05466604, 0.05129046, 0.04648653, 0.06132882, 0.05004289,
0.03030569]])
def generate_reduced_matrices(age_sep, Ni):
    '''
    Receives the age separations and populations and generates the averaged
    contact matrices, returning a (4, len(age_sep)+1, len(age_sep)+1) array
    with the 4 partial contact matrices: house, school, work and other.
    Ni is the population of each population component (16 five-year age groups).
    '''
    nMat = len(age_sep) + 1
    Ms = np.empty((4, nMat, nMat))
    age_indexes = list()
    age_indexes.append(np.flatnonzero(ages_Mu_min <= age_sep[0]))
    for i in range(1, len(age_sep)):
        age_indexes.append(np.flatnonzero((ages_Mu_min > age_sep[i-1]) *
                                          (ages_Mu_min <= age_sep[i])))
    age_indexes.append(np.flatnonzero(ages_Mu_min > age_sep[-1]))
    for i in range(nMat):
        Nia = Ni[age_indexes[i]]
        Na = Nia.sum()
        for j in range(nMat):
            Ms[0,i,j] = (Nia * ((Mu_house[age_indexes[i]][:,age_indexes[j]]).sum(axis=1))).sum()/Na
            Ms[1,i,j] = (Nia * ((Mu_school[age_indexes[i]][:,age_indexes[j]]).sum(axis=1))).sum()/Na
            Ms[2,i,j] = (Nia * ((Mu_work[age_indexes[i]][:,age_indexes[j]]).sum(axis=1))).sum()/Na
            Ms[3,i,j] = (Nia * ((Mu_other[age_indexes[i]][:,age_indexes[j]]).sum(axis=1))).sum()/Na
    return Ms
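`generate_reduced_matrices` collapses the 16 five-year bands into coarser groups by summing each row over the target bands and averaging over the source bands weighted by population. The aggregation for a single matrix can be sketched in pure Python; the 4-band matrix and populations below are made-up toy numbers:

```python
def reduce_matrix(M, N, groups):
    """Population-weighted aggregation of a contact matrix over band groups.

    M[a][b]: mean contacts of an individual in band a with band b.
    N[a]: population of band a.  groups: list of lists of band indexes.
    """
    R = [[0.0] * len(groups) for _ in groups]
    for gi, gA in enumerate(groups):
        Na = sum(N[a] for a in gA)
        for gj, gB in enumerate(groups):
            # sum contacts towards the target group, weight source bands by population
            R[gi][gj] = sum(N[a] * sum(M[a][b] for b in gB) for a in gA) / Na
    return R

# toy example: 4 bands collapsed into 2 groups
M = [[1.0, 0.5, 0.2, 0.1],
     [0.5, 1.2, 0.3, 0.2],
     [0.2, 0.3, 0.8, 0.4],
     [0.1, 0.2, 0.4, 0.9]]
N = [100, 200, 150, 50]
R = reduce_matrix(M, N, [[0, 1], [2, 3]])
print(R[0][0])  # (100*(1.0+0.5) + 200*(0.5+1.2)) / 300
```

The function above does the same with NumPy fancy indexing, once per contact setting (house, school, work, other).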
class SEIIHURD_age:
    '''SEIIHURD Model'''
    def __init__(self, tamanhoPop, numeroProcessadores=None):
        self.N = tamanhoPop
        self.numeroProcessadores = numeroProcessadores
        self.pos = None

    # pars dict: betas, delta, kappa, p, gammaA, gammaS, h, epsilon, gammaH,
    # gammaU, muU, muH, wU, wH.
    # Following the notation, beta_12 means 2 infecting 1, where 1 is the row
    # and 2 the column.
    def _SEIIHURD_age_eq(self, X, t, pars):
        S, E, Ia, Is, H, U, R, D, Nw = np.split(X, 9)
        StE = S * (pars['beta'] @ ((Ia * pars['delta'] + Is).reshape((-1,1)))).flatten()
        dS = - StE
        dE = StE - pars['kappa'] * E
        dIa = (1. - pars['p']) * pars['kappa'] * E - pars['gammaA'] * Ia
        dIs = pars['p'] * pars['kappa'] * E - pars['gammaS'] * Is
        dH = pars['h'] * pars['xi'] * pars['gammaS'] * Is + (1 - pars['muU'] +
            pars['wU'] * pars['muU']) * pars['gammaU'] * U - pars['gammaH'] * H
        dU = pars['h'] * (1 - pars['xi']) * pars['gammaS'] * Is + pars['wH'] *\
            pars['gammaH'] * H - pars['gammaU'] * U
        dR = pars['gammaA'] * Ia + (1. - pars['h']) * pars['gammaS'] * Is + \
            (1 - pars['muH']) * (1 - pars['wH']) * pars['gammaH'] * H
        dD = (1 - pars['wH']) * pars['muH'] * pars['gammaH'] * H + \
            (1 - pars['wU']) * pars['muU'] * pars['gammaU'] * U
        dNw = pars['p'] * pars['kappa'] * E
        return np.r_[dS, dE, dIa, dIs, dH, dU, dR, dD, dNw]
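For a single age group the right-hand side above reduces to scalars. A pure-Python sketch (the parameter values below are illustrative, not fitted ones) showing that the eight compartments S..D conserve population, since Nw is only a cumulative case counter:

```python
def seiihurd_rhs(S, E, Ia, Is, H, U, R, D, p):
    """Single-group SEIIHURD derivatives, mirroring _SEIIHURD_age_eq."""
    StE = S * p['beta'] * (p['delta'] * Ia + Is)
    dS = -StE
    dE = StE - p['kappa'] * E
    dIa = (1. - p['p']) * p['kappa'] * E - p['gammaA'] * Ia
    dIs = p['p'] * p['kappa'] * E - p['gammaS'] * Is
    dH = (p['h'] * p['xi'] * p['gammaS'] * Is
          + (1 - p['muU'] + p['wU'] * p['muU']) * p['gammaU'] * U
          - p['gammaH'] * H)
    dU = (p['h'] * (1 - p['xi']) * p['gammaS'] * Is
          + p['wH'] * p['gammaH'] * H - p['gammaU'] * U)
    dR = (p['gammaA'] * Ia + (1. - p['h']) * p['gammaS'] * Is
          + (1 - p['muH']) * (1 - p['wH']) * p['gammaH'] * H)
    dD = ((1 - p['wH']) * p['muH'] * p['gammaH'] * H
          + (1 - p['wU']) * p['muU'] * p['gammaU'] * U)
    return dS, dE, dIa, dIs, dH, dU, dR, dD

# illustrative parameter values only
p = dict(beta=1.0, delta=0.6, kappa=0.25, p=0.2, gammaA=0.3, gammaS=0.25,
         h=0.05, xi=0.5, gammaH=0.14, gammaU=0.14, muH=0.15, muU=0.4,
         wH=0.14, wU=0.29)
d = seiihurd_rhs(0.9, 0.05, 0.02, 0.02, 0.005, 0.003, 0.001, 0.001, p)
print(abs(sum(d)))  # ~0: flows between compartments cancel
```

Every outflow term reappears as an inflow elsewhere (e.g. `kappa*E` leaves E and splits into Ia and Is), so the derivatives sum to zero.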
    def _call_ODE(self, ts, ppars):
        betas = ppars['beta'].copy()
        pars = copy.deepcopy(ppars)
        if 'tcut' not in ppars.keys():
            tcorte = None
        else:
            tcorte = pars['tcut']
        if type(ts) in [int, float]:
            ts = np.arange(ts)
        if tcorte is None:
            tcorte = [ts[-1]]
        if type(betas) != list:
            betas = [betas]
        if tcorte[-1] < ts[-1]:
            tcorte.append(ts[-1])
        tcorte = [ts[0]] + tcorte
        tcorte.sort()
        Is0 = pars['x0'].reshape((3,-1)).sum(axis=0)
        x0 = np.r_[1. - Is0, pars['x0'], np.zeros(4*len(Is0)), pars['x0'][2*len(Is0):]]
        saida = x0.reshape((1,-1))
        Y = saida.copy()
        for i in range(1, len(tcorte)):
            cut_last = False
            pars['beta'] = betas[i-1]
            t = ts[(ts >= tcorte[i-1]) * (ts <= tcorte[i])]
            if len(t) > 0:
                if t[0] > tcorte[i-1]:
                    t = np.r_[tcorte[i-1], t]
                if t[-1] < tcorte[i]:
                    t = np.r_[t, tcorte[i]]
                    cut_last = True
                Y = spi.odeint(self._SEIIHURD_age_eq, Y[-1], t, args=(pars,))
                if cut_last:
                    saida = np.r_[saida, Y[1:-1]]
                else:
                    saida = np.r_[saida, Y[1:]]
            else:
                Y = spi.odeint(self._SEIIHURD_age_eq, Y[-1], tcorte[i-1:i+1], args=(pars,))
        return ts, saida
    def _fill_paramPSO(self, paramPSO):
        if 'options' not in paramPSO.keys():
            paramPSO['options'] = {'c1': 0.1, 'c2': 0.3, 'w': 0.9, 'k': 5, 'p': 2}
        if 'n_particles' not in paramPSO.keys():
            paramPSO['n_particles'] = 300
        if 'iter' not in paramPSO.keys():
            paramPSO['iter'] = 1000
        return paramPSO
    def _prepare_input(self, data):
        list_states = ['S', 'E', 'Ia', 'Is', 'H', 'U', 'R', 'D', 'Nw']
        i_integ = list()
        Y = list()
        for ke in data.keys():
            if ke == 't':
                t = data[ke]
            else:
                Y.append(data[ke])
                simb, num = ke.split("_")
                n0 = self.nages * list_states.index(simb)
                if '_ALL' in ke:
                    i_integ.append(list(range(n0, n0 + self.nages)))
                else:
                    i_integ.append(int(num) + n0)
        return i_integ, Y, t
    def _prepare_conversor(self, p2f, pothers, bound):
        padjus = list()
        if bound is not None:
            bound_new = [[], []]
        for i, par in enumerate(p2f):
            if 'beta' in par:
                if '_ALL' in par:
                    for l in range(len(pothers['beta'])):
                        for j in range(pothers['beta'][i].shape[0]):
                            for k in range(pothers['beta'][i].shape[1]):
                                padjus.append('beta_{}_{}_{}'.format(l, j, k))
                                if bound is not None:
                                    bound_new[0].append(bound[0][i])
                                    bound_new[1].append(bound[1][i])
                else:
                    padjus.append(par)
                    if bound is not None:
                        bound_new[0].append(bound[0][i])
                        bound_new[1].append(bound[1][i])
            elif '_ALL' in par:
                name = par.split('_')[0]
                for j in range(len(pothers[name])):
                    padjus.append('{}_{}'.format(name, j))
                    if bound is not None:
                        bound_new[0].append(bound[0][i])
                        bound_new[1].append(bound[1][i])
            else:
                padjus.append(par)
                if bound is not None:
                    bound_new[0].append(bound[0][i])
                    bound_new[1].append(bound[1][i])
        if bound is not None:
            bound_new[0] = np.array(bound_new[0])
            bound_new[1] = np.array(bound_new[1])
        return bound_new, padjus
    def _conversor(self, coefs, pars0, padjus):
        pars = copy.deepcopy(pars0)
        for i, coef in enumerate(coefs):
            if 'beta' in padjus[i]:
                if '_M_' in padjus[i]:
                    indx = int(padjus[i].split('_')[-1])
                    pars['beta'][indx] = coef * pars['beta'][indx]
                else:
                    indx = padjus[i].split('_')
                    pars['beta'][int(indx[1])][int(indx[2]), int(indx[3])] = coef
            elif '_' in padjus[i]:
                name, indx = padjus[i].split('_')
                pars[name][int(indx)] = coef
            else:
                pars[padjus[i]] = coef
        return pars
    def objectiveFunction(self, coefs_list, stand_error=False, weights=None):
        errsq = np.zeros(coefs_list.shape[0])
        for i, coefs in enumerate(coefs_list):
            errs = self._residuals(coefs, stand_error, weights)
            errsq[i] = (errs * errs).mean()
        return errsq
    def _residuals(self, coefs, stand_error=False, weights=None):
        if weights is None:
            weights = np.ones(len(self.Y))
        error_func = (lambda x: np.sqrt(x + 1)) if stand_error else (lambda x: np.ones_like(x))
        errs = np.empty((0,))
        ts, mY = self._call_ODE(self.t, self._conversor(coefs, self.pars_init, self.padjus))
        for indY, indODE in enumerate(self.i_integ):
            if type(indODE) == list:
                temp = (self.N.reshape((1,-1)) * mY[:,indODE]).sum(axis=1)
                errs = np.r_[errs, weights[indY] * ((self.Y[indY] - temp) / error_func(temp))]
            else:
                try:
                    errs = np.r_[errs, weights[indY] * ((self.Y[indY] - self.N[indODE % self.nages] * mY[:,indODE]) / error_func(mY[:,indODE]))]
                except:
                    print(self.t, self._conversor(coefs, self.pars_init, self.padjus))
                    raise
        errs = errs[~np.isnan(errs)]
        return errs
def prepare_to_fit(self, data, pars, pars_to_fit, bound=None, nages=1, stand_error=False):
    self.pars_init = copy.deepcopy(pars)
    self.nages = nages
    self.i_integ, self.Y, self.t = self._prepare_input(data)
    self.bound, self.padjus = self._prepare_conversor(pars_to_fit, pars, bound)
    self.n_to_fit = len(self.padjus)
def fit(self, data, pars, pars_to_fit, bound=None, nages=2, paramPSO=dict(), stand_error=False):
    '''
    data: dictionary:
        t -> times
        X_N -> variable:
            X is the symbol of the compartment: S, E, Ia, Is, H, U, R, D, Nw
            N is the index of the age group, starting at 0
    pars: dictionary, with the variable names as keys.
    pars_to_fit: the names of the parameters to fit. If the parameter is a list,
        append _N with the index you want to fit, or _ALL to fit all entries.
        The 'beta' parameter has 3 indexes: beta_I_J_K, with I indicating
        which tcut it belongs to and J_K indicating the position in the matrix.
        beta also has an option 'beta_M_I' that fits a multiplicative
        constant of the infection matrix, without changing the relative weights
        (the _M_ and _ALL_ options are incompatible for now, and _M_ requires
        testing).
    bound: search interval for each parameter, where None = no limit.
        bound => (list_of_min_bounds, list_of_max_bounds)
    '''
    paramPSO = self._fill_paramPSO(paramPSO)
    self.prepare_to_fit(data, pars, pars_to_fit, bound=bound, nages=nages, stand_error=stand_error)
    optimizer = ps.single.LocalBestPSO(n_particles=paramPSO['n_particles'], dimensions=self.n_to_fit,
                                       options=paramPSO['options'], bounds=self.bound)
    cost, pos = optimizer.optimize(self.objectiveFunction, paramPSO['iter'], stand_error=stand_error,
                                   n_processes=self.numeroProcessadores)
    self.pos = pos
    self.pars_opt = self._conversor(pos, self.pars_init, self.padjus)
    self.rmse = cost
    self.optimize = optimizer
def fit_lsquares(self, data, pars, pars_to_fit, bound=None, nages=2, stand_error=False, init=None, nrand=10):
    self.prepare_to_fit(data, pars, pars_to_fit, bound=bound, nages=nages, stand_error=stand_error)
    if init is None:
        cost_best = np.inf
        res_best = None
        # BUG: the parallel code does not work if the PSO code has run previously
        if self.pos is not None or self.numeroProcessadores is None or self.numeroProcessadores <= 1:
            for i in range(nrand):
                print("{} / {}".format(i, nrand))
                par0 = np.random.rand(self.n_to_fit)
                par0 = self.bound[0] + par0 * (self.bound[1] - self.bound[0])
                res = least_squares(self._residuals, par0, bounds=self.bound)
                if res.cost < cost_best:
                    cost_best = res.cost
                    res_best = res
        else:
            par0 = np.random.rand(nrand, self.n_to_fit)
            par0 = self.bound[0].reshape((1, -1)) + par0 * (self.bound[1] - self.bound[0]).reshape((1, -1))
            f = lambda p0: least_squares(self._residuals, p0, bounds=self.bound)
            all_res = joblib.Parallel(n_jobs=self.numeroProcessadores)(joblib.delayed(f)(p0) for p0 in par0)
            costs = np.array([res.cost for res in all_res])
            cost_best = all_res[costs.argmin()].cost
            res_best = all_res[costs.argmin()]
    else:
        # use the prepared per-coefficient bounds, matching the length of init
        res_best = least_squares(self._residuals, init, bounds=self.bound)
    self.pos_ls = res_best.x
    self.pars_opt_ls = self._conversor(res_best.x, self.pars_init, self.padjus)
    self.rmse_ls = (res_best.fun ** 2).mean()
    self.result_ls = res_best
def predict(self, t=None, coefs=None, model_output=False):
    if t is None:
        t = self.t
    if coefs is None:
        coefs = self.pos
    elif isinstance(coefs, str) and coefs == 'LS':
        coefs = self.pos_ls
    ts, mY = self._call_ODE(t, self._conversor(coefs, self.pars_init, self.padjus))
    saida = np.zeros((len(ts), 0))
    for i in self.i_integ:
        if isinstance(i, list):
            ytemp = (mY[:, i] * self.N.reshape((1, -1))).sum(axis=1)
        else:
            ytemp = mY[:, i] * self.N[i % self.nages]
        saida = np.c_[saida, ytemp.reshape((-1, 1))]
    if model_output:
        return ts, saida, mY
    else:
        return ts, saida
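The `padjus` naming scheme used above (`'name'`, `'name_N'`, `'beta_I_J_K'`) can be illustrated with a standalone sketch. This is a simplified version of the `_conversor` mapping (it omits the `_M_` multiplicative option), with hypothetical parameter values:

```python
import copy

import numpy as np


def apply_coefs(coefs, pars0, padjus):
    # Route each optimizer coefficient to the parameter slot named by padjus.
    pars = copy.deepcopy(pars0)
    for coef, name in zip(coefs, padjus):
        if name.startswith('beta_'):
            # 'beta_I_J_K': tcut index I, matrix position (J, K)
            _, tcut, j, k = name.split('_')
            pars['beta'][int(tcut)][int(j), int(k)] = coef
        elif '_' in name:
            # 'name_N': entry N of a list-valued parameter
            base, idx = name.split('_')
            pars[base][int(idx)] = coef
        else:
            # scalar parameter
            pars[name] = coef
    return pars


pars0 = {'gamma': [0.1, 0.2], 'delta': 0.5, 'beta': [np.zeros((2, 2))]}
pars = apply_coefs([0.3, 0.7, 1.5], pars0, ['gamma_1', 'delta', 'beta_0_0_1'])
print(pars['gamma'], pars['delta'], pars['beta'][0][0, 1])
```

Because the input dictionary is deep-copied, the initial parameters survive repeated calls during optimization.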
#ts, X = call_ODE(X0, tmax, betas, param, tcorte=tcorte)
#plt.plot(ts, X[:,:2], '.-')
| 48.936267 | 152 | 0.593135 | 4,564 | 32,249 | 4.136284 | 0.317923 | 0.00678 | 0.017216 | 0.008899 | 0.118392 | 0.087721 | 0.077074 | 0.06881 | 0.066268 | 0.055832 | 0 | 0.456549 | 0.25852 | 32,249 | 658 | 153 | 49.010638 | 0.332929 | 0.054265 | 0 | 0.064348 | 0 | 0 | 0.010984 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.024348 | false | 0 | 0.022609 | 0 | 0.067826 | 0.003478 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
a1fedb42ea7da198208259c1cf29d8481af7dd8f | 3,202 | py | Python | exarl/agents/agent_vault/_prioritized_replay.py | schr476/EXARL | 7f4596bd8b3d7960aaf52bc677ceac4f37029834 | [
"BSD-3-Clause"
] | 2 | 2022-02-03T20:33:17.000Z | 2022-02-10T22:43:32.000Z | exarl/agents/agent_vault/_prioritized_replay.py | schr476/EXARL | 7f4596bd8b3d7960aaf52bc677ceac4f37029834 | [
"BSD-3-Clause"
] | 40 | 2022-01-25T18:03:12.000Z | 2022-03-31T21:43:32.000Z | exarl/agents/agent_vault/_prioritized_replay.py | schr476/EXARL | 7f4596bd8b3d7960aaf52bc677ceac4f37029834 | [
"BSD-3-Clause"
] | 1 | 2022-02-10T14:33:30.000Z | 2022-02-10T14:33:30.000Z | import random
import numpy as np
import tensorflow as tf
from collections import deque
class PrioritizedReplayBuffer():
    """ Class implements Prioritized Experience Replay (PER)
    """

    def __init__(self, maxlen):
        """ PER constructor

        Args:
            maxlen (int): buffer length
        """
        self.maxlen = None if maxlen == "none" else maxlen
        self.buffer = deque(maxlen=self.maxlen)
        self.priorities = deque(maxlen=self.maxlen)

    def add(self, experience):
        """ Add experience to buffer

        Args:
            experience (list): state, action, reward, next_state, done

        Returns:
            full_buffer (bool): True if buffer is full
        """
        full_buffer = len(self.buffer) == self.maxlen
        self.buffer.append(experience)
        self.priorities.append(max(self.priorities, default=1))
        return full_buffer

    def get_probabilities(self, priority_scale):
        """ Get sampling probabilities for experiences

        Args:
            priority_scale (float64): range [0, 1]

        Returns:
            sample_probabilities (numpy array): probabilities assigned to experiences based on weighting factor (scale)
        """
        scaled_priorities = np.array(self.priorities) ** priority_scale
        sample_probabilities = scaled_priorities / sum(scaled_priorities)
        return sample_probabilities

    def get_importance(self, probabilities):
        """ Compute importance

        Args:
            probabilities (numpy array): experience probabilities

        Returns:
            importance_normalized (numpy array): normalized importance
        """
        importance = 1 / len(self.buffer) * 1 / probabilities
        importance_normalized = importance / max(importance)
        return importance_normalized

    def sample(self, batch_size, priority_scale=1.0):
        """ Sample experiences

        Args:
            batch_size (int): size of batch
            priority_scale (float, optional): range = [0, 1]. Defaults to 1.0.

        Returns:
            samples (list): sampled based on probabilities
            importance (numpy array): importance of samples
            sample_indices (array): indices of samples
        """
        sample_size = min(len(self.buffer), batch_size)
        sample_probs = self.get_probabilities(priority_scale)
        sample_indices = random.choices(range(len(self.buffer)), k=sample_size, weights=sample_probs)
        samples = np.array(self.buffer, dtype=object)[sample_indices]
        importance = self.get_importance(sample_probs[sample_indices])
        return samples, importance, sample_indices

    def set_priorities(self, indices, errors, offset=0.1):
        """ Set priorities of experiences

        Args:
            indices (array): sample indices
            errors (array): corresponding losses
            offset (float, optional): small offset. Defaults to 0.1.
        """
        for i, e in zip(indices, errors):
            self.priorities[int(i)] = abs(e) + offset

    def get_buffer_length(self):
        """ Get buffer length

        Returns:
            (int): buffer length
        """
        return len(self.buffer)
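The probability and importance arithmetic used by `get_probabilities` and `get_importance` can be checked with a small numpy example on toy priorities (not tied to the class above):

```python
import numpy as np

priorities = np.array([1.0, 2.0, 4.0])
priority_scale = 1.0

# Sampling probabilities: priorities raised to the scale, then normalized.
scaled = priorities ** priority_scale
probs = scaled / scaled.sum()

# Importance-sampling weights: inverse of (N * p), normalized to a max of 1.
importance = (1.0 / len(priorities)) * (1.0 / probs)
importance /= importance.max()

print(probs)       # proportional to 1 : 2 : 4
print(importance)  # the rarest sample gets the largest weight
```

With `priority_scale = 0` the probabilities collapse to uniform sampling, which is why the scale interpolates between plain and fully prioritized replay.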
| 32.673469 | 119 | 0.628045 | 345 | 3,202 | 5.704348 | 0.278261 | 0.04065 | 0.033028 | 0.021341 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007404 | 0.282948 | 3,202 | 97 | 120 | 33.010309 | 0.849739 | 0.358838 | 0 | 0 | 0 | 0 | 0.002319 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.205882 | false | 0 | 0.294118 | 0 | 0.676471 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
b807feaa7b46fd15709c8ce5d95d9ec7f33de619 | 446 | py | Python | utilities/readProperties.py | harry-100/qa-automation-framework | 5fbe03e930820537e53f2d26b1c2b2bd2b222bf5 | [
"MIT"
] | null | null | null | utilities/readProperties.py | harry-100/qa-automation-framework | 5fbe03e930820537e53f2d26b1c2b2bd2b222bf5 | [
"MIT"
] | null | null | null | utilities/readProperties.py | harry-100/qa-automation-framework | 5fbe03e930820537e53f2d26b1c2b2bd2b222bf5 | [
"MIT"
] | null | null | null | from configparser import RawConfigParser
config = RawConfigParser()
config.read("configuration/config.ini")
class ReadConfig():
    @staticmethod
    def getApplicationURL():
        url = config.get('common info', 'baseURL')
        return url

    @staticmethod
    def getUserName():
        username = config.get('common info', 'username')
        return username

    @staticmethod
    def getPassword():
        password = config.get('common info', 'password')
        return password
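The pattern above (a module-level parser plus static getters) can be exercised without a `config.ini` on disk by feeding `RawConfigParser` a string. The section and values below are hypothetical:

```python
from configparser import RawConfigParser

sample = """
[common info]
baseURL = https://example.test/login
username = qa_user
password = s3cret
"""

parser = RawConfigParser()
parser.read_string(sample)

# Option names are case-insensitive by default (optionxform lowercases them),
# so 'baseURL' and 'baseurl' refer to the same option.
base_url = parser.get('common info', 'baseURL')
username = parser.get('common info', 'username')
```

`read_string` is handy in tests, while `read("configuration/config.ini")` silently returns an empty list of parsed files if the path does not exist, so a missing file surfaces later as a `NoSectionError`.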
| 20.272727 | 52 | 0.7287 | 46 | 446 | 7.065217 | 0.5 | 0.138462 | 0.138462 | 0.175385 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.141256 | 446 | 21 | 53 | 21.238095 | 0.848564 | 0 | 0 | 0.1875 | 0 | 0 | 0.179372 | 0.053812 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1875 | false | 0.1875 | 0.0625 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
b80bab1732354a9bf5c8b8066aa6d633362ec4a1 | 181 | py | Python | tinyq/__init__.py | mozillazg/tinyq | fd9ecc593931c9b315c4aeb9150389b3e4ae670e | [
"MIT"
] | 14 | 2017-08-02T23:30:16.000Z | 2021-05-31T19:58:29.000Z | tinyq/__init__.py | mozillazg/tinyq | fd9ecc593931c9b315c4aeb9150389b3e4ae670e | [
"MIT"
] | null | null | null | tinyq/__init__.py | mozillazg/tinyq | fd9ecc593931c9b315c4aeb9150389b3e4ae670e | [
"MIT"
] | 2 | 2017-03-13T09:36:05.000Z | 2017-10-27T14:33:48.000Z | # -*- coding: utf-8 -*-
from tinyq.app import Application # noqa
__version__ = '0.3.0'
__author__ = 'mozillazg'
__license__ = 'MIT'
__copyright__ = 'Copyright (c) 2017 mozillazg'
| 22.625 | 46 | 0.696133 | 22 | 181 | 5 | 0.863636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.052288 | 0.154696 | 181 | 7 | 47 | 25.857143 | 0.666667 | 0.143646 | 0 | 0 | 0 | 0 | 0.296053 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b81231fb69c94c906db0d3069a6a4df0633be007 | 174 | py | Python | python/find_country/city.py | lukasjoc/scritps | ebcffef0a3977ab8bb1bebf20383c350bd7baa37 | [
"0BSD"
] | 1 | 2020-11-09T19:32:43.000Z | 2020-11-09T19:32:43.000Z | python/find_country/city.py | lukasjoc/scritps | ebcffef0a3977ab8bb1bebf20383c350bd7baa37 | [
"0BSD"
] | null | null | null | python/find_country/city.py | lukasjoc/scritps | ebcffef0a3977ab8bb1bebf20383c350bd7baa37 | [
"0BSD"
] | null | null | null | #!/usr/bin/env python3
from geopy.geocoders import Nominatim
locator = Nominatim(user_agent="getcity")
loc = locator.geocode("Munich")
print(loc.latitude, loc.longitude)
| 17.4 | 41 | 0.764368 | 23 | 174 | 5.73913 | 0.826087 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00641 | 0.103448 | 174 | 9 | 42 | 19.333333 | 0.839744 | 0.12069 | 0 | 0 | 0 | 0 | 0.086093 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0.25 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b814083d787036eed69c0998c2575b86f722e9ca | 3,172 | py | Python | src/cocoannot/annotpreferred/models.py | coco-tasks/annotation-tool | ebd2e77ec8aeddedb9f87f457b6d5d8989b602db | [
"MIT"
] | 9 | 2019-04-18T15:35:38.000Z | 2021-06-07T08:01:27.000Z | src/cocoannot/annotpreferred/models.py | coco-tasks/annotation-tool | ebd2e77ec8aeddedb9f87f457b6d5d8989b602db | [
"MIT"
] | 1 | 2019-07-16T10:07:09.000Z | 2019-07-16T10:07:09.000Z | src/cocoannot/annotpreferred/models.py | coco-tasks/annotation-tool | ebd2e77ec8aeddedb9f87f457b6d5d8989b602db | [
"MIT"
] | 3 | 2020-05-20T12:06:59.000Z | 2020-12-12T06:45:26.000Z | from django.contrib.auth.models import User
from django.db import models
from markdownx.models import MarkdownxField
class Category(models.Model):
    """
    Represents a COCO category
    """
    coco_id = models.IntegerField(unique=True, db_index=True)
    name = models.CharField(max_length=50)
    supercategory = models.CharField(max_length=50)

    def __str__(self):
        return "Category {}: {} ({})".format(self.coco_id, self.name, self.supercategory)


class Task(models.Model):
    """
    Represents a Task
    """
    number = models.IntegerField(unique=True, db_index=True)
    name = models.CharField(max_length=50)
    desc = models.TextField(blank=True, null=True)
    desc_image = models.ImageField(upload_to='task_images', blank=True, default=None, null=True)

    def __str__(self):
        return "Task {}: {}".format(self.number, self.name)


class Image(models.Model):
    """
    Represents an image in the dataset
    """
    coco_id = models.IntegerField(unique=True, db_index=True)
    path = models.CharField(max_length=200)
    set_name = models.CharField(max_length=10)
    width = models.IntegerField()
    height = models.IntegerField()
    related_tasks = models.ManyToManyField(Task)

    def __str__(self):
        return "Image {}".format(self.coco_id)


class Annot(models.Model):
    """
    Represents a COCO annotation for instances.
    """
    coco_id = models.IntegerField(unique=True, db_index=True)
    image = models.ForeignKey(Image, on_delete=models.CASCADE)
    category = models.ForeignKey(Category, on_delete=models.CASCADE)
    area = models.FloatField()
    iscrowd = models.BooleanField()
    bbox_x = models.FloatField()
    bbox_y = models.FloatField()
    bbox_w = models.FloatField()
    bbox_h = models.FloatField()
    # The segmentation is stored as a text field and converted to/from json on demand.
    segmentation = models.TextField()

    def __str__(self):
        return "Annot {} ({})".format(self.coco_id, self.category)

    def get_bbox(self):
        return [self.bbox_x, self.bbox_y, self.bbox_w, self.bbox_h]

    def set_bbox(self, bbox):
        self.bbox_x, self.bbox_y, self.bbox_w, self.bbox_h = tuple(bbox)


class Job(models.Model):
    """
    Represents a job (an annotation of the preferred objects) for an image by a user.
    """
    task = models.ForeignKey(Task, on_delete=models.CASCADE, db_index=True)
    image = models.ForeignKey(Image, on_delete=models.CASCADE, db_index=True)
    user = models.ForeignKey(User, on_delete=models.CASCADE, db_index=True)
    is_example = models.BooleanField(default=False, db_index=True)
    is_done = models.BooleanField(default=False, db_index=True)
    date_created = models.DateTimeField(auto_now_add=True)

    def __str__(self):
        return "Job[task={}, image={}, user={}]".format(self.task.name, self.image_id, self.user.first_name)


class PreferredAnnot(models.Model):
    job = models.ForeignKey(Job, on_delete=models.CASCADE, db_index=True)
    annot = models.ForeignKey(Annot, on_delete=models.CASCADE, db_index=True)


class AnnotationPolicy(models.Model):
    policy = MarkdownxField()
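`Annot` stores boxes in the COCO `[x, y, w, h]` convention (see `get_bbox`/`set_bbox`). A small helper, separate from the models and purely illustrative, shows the usual conversion to corner coordinates that many drawing tools expect:

```python
def bbox_xywh_to_corners(bbox):
    # COCO stores [x, y, width, height]; convert to [x1, y1, x2, y2].
    x, y, w, h = bbox
    return [x, y, x + w, y + h]


def bbox_area(bbox):
    # Area of an axis-aligned box in the [x, y, w, h] convention.
    _, _, w, h = bbox
    return w * h
```

Keeping the stored form identical to COCO's JSON avoids any conversion when importing or exporting annotations.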
| 33.041667 | 108 | 0.698298 | 420 | 3,172 | 5.095238 | 0.261905 | 0.035981 | 0.056542 | 0.068692 | 0.377103 | 0.290187 | 0.290187 | 0.192056 | 0.192056 | 0.152336 | 0 | 0.004229 | 0.180013 | 3,172 | 95 | 109 | 33.389474 | 0.818531 | 0.094262 | 0 | 0.175439 | 0 | 0 | 0.033619 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.122807 | false | 0 | 0.052632 | 0.105263 | 0.964912 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
629f16a424f010c4c41e887a5a673cd1324c487c | 820 | py | Python | hadoop/hadoop/node.py | DropletProbe/shellscripts | d070eef24cd6003694d81a3bdc38f2097452c076 | [
"MIT"
] | null | null | null | hadoop/hadoop/node.py | DropletProbe/shellscripts | d070eef24cd6003694d81a3bdc38f2097452c076 | [
"MIT"
] | null | null | null | hadoop/hadoop/node.py | DropletProbe/shellscripts | d070eef24cd6003694d81a3bdc38f2097452c076 | [
"MIT"
] | null | null | null | import re
class Node:
def __init__(self, id, ip, hostname, type):
self.id = id
self.ip = ip
self.hostname = hostname
self.type = type
self.validate()
def validate(self):
self.illegal = False
if re.match("^(\d{1,3}\.){3}\d{1,3}$", self.ip):
self.illegal = reduce(lambda x, y : x and y, map(lambda x : True if int(x) <= 255 else False, self.ip.split(".")), True)
if self.illegal == False:
raise Exception("IP Format Error, " + self.ip + " is illegal.")
def __repr__(self):
return str(self)
def __str__(self):
return "<IP: %s, id: %s, hostname: %s, type: %s>" % (self.ip, self.id, self.hostname, self.type)
# if __name__ == "__main__":
# a = Node(1, "192.168.1.300", 1, 1)
# a.validate()
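The check in `validate` (a regex shape test followed by a per-octet `<= 255` reduce) can be expressed as a standalone predicate, shown here for illustration only:

```python
import re
from functools import reduce


def is_valid_ipv4(ip):
    # Step 1: overall shape — four groups of 1-3 digits separated by dots.
    if not re.match(r"^(\d{1,3}\.){3}\d{1,3}$", ip):
        return False
    # Step 2: every octet must be in range; the regex alone would accept 300.
    return reduce(lambda acc, octet: acc and int(octet) <= 255, ip.split("."), True)
```

The two-step split matters: the regex cannot enforce the numeric range on its own, which is exactly why `Node.validate` applies the reduce after the match.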
| 28.275862 | 132 | 0.540244 | 119 | 820 | 3.554622 | 0.378151 | 0.070922 | 0.07565 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.035959 | 0.287805 | 820 | 28 | 133 | 29.285714 | 0.688356 | 0.1 | 0 | 0 | 0 | 0.055556 | 0.126703 | 0.031335 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.055556 | 0.111111 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
62ac5880cfcb73a7f5f41808ba14ed348ca4e208 | 607 | py | Python | net_utils.py | mfatihaktas/edge-load-balance | b866ca47ba37a605eeba05658b1d302f6855a23f | [
"MIT"
] | null | null | null | net_utils.py | mfatihaktas/edge-load-balance | b866ca47ba37a605eeba05658b1d302f6855a23f | [
"MIT"
] | null | null | null | net_utils.py | mfatihaktas/edge-load-balance | b866ca47ba37a605eeba05658b1d302f6855a23f | [
"MIT"
] | null | null | null | from debug_utils import *
def run(node_l, cmd_l):
popens = {}
for i, n in enumerate(node_l):
popens[n] = n.popen(cmd_l[i])
log(DEBUG, "Started {}".format(n))
def run_masters(m_l):
run(m_l, ['./run.sh m {}'.format(i) for i in range(len(m_l))])
log(DEBUG, "done")
def run_workers(w_l):
run(w_l, ['./run.sh w {}'.format(i) for i in range(len(w_l))])
log(DEBUG, "done")
def run_dashboard_server(d):
run([d], ['./run.sh d'])
log(DEBUG, "done")
# TODO: does not work
def pkill():
os.system('pkill -f client.py; pkill -f master.py; pkill -f worker.py; pkill -f dashboard.py')
log(DEBUG, "done")
| 24.28 | 95 | 0.634267 | 116 | 607 | 3.189655 | 0.362069 | 0.108108 | 0.12973 | 0.059459 | 0.216216 | 0.216216 | 0.113514 | 0 | 0 | 0 | 0 | 0 | 0.151565 | 607 | 24 | 96 | 25.291667 | 0.718447 | 0.031301 | 0 | 0.222222 | 0 | 0.055556 | 0.244027 | 0 | 0 | 0 | 0 | 0.041667 | 0 | 1 | 0.277778 | false | 0 | 0.055556 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
62b9b66788e4870e77759cfd4f12b782254dda87 | 102 | py | Python | python/src/pdef/version.py | pdef/pdef-python | 09c6e6424ad141b40310eeea53c1f8b6e79be560 | [
"Apache-2.0"
] | 2 | 2020-03-15T03:22:59.000Z | 2020-03-15T04:37:23.000Z | python/src/pdef/version.py | pdef/pdef-python | 09c6e6424ad141b40310eeea53c1f8b6e79be560 | [
"Apache-2.0"
] | null | null | null | python/src/pdef/version.py | pdef/pdef-python | 09c6e6424ad141b40310eeea53c1f8b6e79be560 | [
"Apache-2.0"
] | null | null | null | # encoding: utf-8
'''Pdef version in a separate module to simplify setup.py.'''
__version__ = '1.2.0'
| 25.5 | 61 | 0.696078 | 17 | 102 | 3.941176 | 0.941176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.045977 | 0.147059 | 102 | 3 | 62 | 34 | 0.724138 | 0.705882 | 0 | 0 | 0 | 0 | 0.208333 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
62d38abd8cc901db862694dad66e79677fe1126b | 620 | py | Python | drift/tests/test_soakit.py | dgnorth/drift | d4f52726dad1e8a1aa25d9295dd898c5514f729f | [
"MIT"
] | 6 | 2016-09-24T13:40:12.000Z | 2020-04-15T18:53:47.000Z | drift/tests/test_soakit.py | dgnorth/drift | d4f52726dad1e8a1aa25d9295dd898c5514f729f | [
"MIT"
] | 4 | 2016-11-15T10:40:04.000Z | 2020-11-26T09:48:37.000Z | drift/tests/test_soakit.py | dgnorth/drift | d4f52726dad1e8a1aa25d9295dd898c5514f729f | [
"MIT"
] | 3 | 2016-10-31T09:48:02.000Z | 2021-05-25T09:22:07.000Z | import unittest
import logging
from flask import Flask
@unittest.skip("needs refactoring")
class driftTestCase(unittest.TestCase):
def setUp(self):
self.app = Flask(__name__)
logging.basicConfig(level="ERROR")
self.app.testing = True
self.test_client = self.app.test_client()
def tearDown(self):
pass
def test_flasksetup(self):
# Run minimal setup
# flasksetup(self.app, options=[])
pass
def test_all(self):
# Run with all options
# flasksetup(self.app)
pass
if __name__ == "__main__":
unittest.main()
| 18.787879 | 49 | 0.624194 | 71 | 620 | 5.225352 | 0.464789 | 0.09434 | 0.059299 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.274194 | 620 | 32 | 50 | 19.375 | 0.824444 | 0.148387 | 0 | 0.166667 | 0 | 0 | 0.057361 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0.166667 | 0.166667 | 0 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
62d7ffd0472b4eb45907da6224fc3b2b392b8416 | 238 | py | Python | python/7kyu/even_numbers_in_an_array.py | Sigmanificient/codewars | b34df4bf55460d312b7ddf121b46a707b549387a | [
"MIT"
] | 3 | 2021-06-08T01:57:13.000Z | 2021-06-26T10:52:47.000Z | python/7kyu/even_numbers_in_an_array.py | Sigmanificient/codewars | b34df4bf55460d312b7ddf121b46a707b549387a | [
"MIT"
] | null | null | null | python/7kyu/even_numbers_in_an_array.py | Sigmanificient/codewars | b34df4bf55460d312b7ddf121b46a707b549387a | [
"MIT"
] | 2 | 2021-06-10T21:20:13.000Z | 2021-06-30T10:13:26.000Z | """Kata url: https://www.codewars.com/kata/5a431c0de1ce0ec33a00000c."""
from typing import List
def even_numbers(arr: List[int], n: int) -> List[int]:
odds: List[int] = [x for x in arr if not x % 2]
return odds[len(odds) - n:]
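A quick check of the kata's behavior (keep the even values, return the last `n` of them); the function is repeated so the example is self-contained:

```python
from typing import List


def even_numbers(arr: List[int], n: int) -> List[int]:
    # x % 2 is falsy exactly when x is even, so this keeps the even values.
    evens = [x for x in arr if not x % 2]
    # Slice off everything but the last n even values.
    return evens[len(evens) - n:]


print(even_numbers([1, 2, 3, 4, 5, 6, 7, 8, 9], 3))  # last three even values
```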
| 26.444444 | 71 | 0.655462 | 39 | 238 | 3.974359 | 0.666667 | 0.135484 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.076531 | 0.176471 | 238 | 8 | 72 | 29.75 | 0.714286 | 0.273109 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
62e0e93747dae752fc1a23adaf41a5a5edb9094b | 1,912 | py | Python | pypy/module/oracle/test/test_objectvar.py | kantai/passe-pypy-taint-tracking | b60a3663f8fe89892dc182c8497aab97e2e75d69 | [
"MIT"
] | 2 | 2016-07-06T23:30:20.000Z | 2017-05-30T15:59:31.000Z | pypy/module/oracle/test/test_objectvar.py | benoitc/pypy | a3e1b12d1d01dc29056b7badc051ffc034297658 | [
"MIT"
] | null | null | null | pypy/module/oracle/test/test_objectvar.py | benoitc/pypy | a3e1b12d1d01dc29056b7badc051ffc034297658 | [
"MIT"
] | 2 | 2020-07-09T08:14:22.000Z | 2021-01-15T18:01:25.000Z | from pypy.module.oracle.test.test_connect import OracleTestBase
class AppTestObjectVar(OracleTestBase):
def test_fetch_object(self):
import datetime
cur = self.cnx.cursor()
try:
cur.execute("drop table pypy_test_objtable")
except oracle.DatabaseError:
pass
try:
cur.execute("drop type pypy_test_objtype")
except oracle.DatabaseError:
pass
try:
cur.execute("drop type pypy_test_arraytype")
except oracle.DatabaseError:
pass
cur.execute("""\
create type pypy_test_objtype as object (
numbercol number,
stringcol varchar2(60),
datecol date);
""")
cur.execute("""\
create type pypy_test_arraytype as varray(10) of number;
""")
cur.execute("""\
create table pypy_test_objtable (
objcol pypy_test_objtype,
arraycol pypy_test_arraytype)
""")
cur.execute("""\
insert into pypy_test_objtable values (
pypy_test_objtype(1, 'someText',
to_date(20070306, 'YYYYMMDD')),
pypy_test_arraytype(5, 10, null, 20))
""")
cur.execute("select objcol, arraycol from pypy_test_objtable")
objValue, arrayValue = cur.fetchone()
assert objValue.type.schema == self.cnx.username.upper()
assert objValue.type.name == "PYPY_TEST_OBJTYPE"
assert objValue.type.attributes[0].name == "NUMBERCOL"
assert isinstance(arrayValue, list)
assert arrayValue == [5, 10, None, 20]
assert objValue.NUMBERCOL == 1
assert objValue.STRINGCOL == "someText"
assert objValue.DATECOL == datetime.datetime(2007, 03, 06)
raises(AttributeError, getattr, objValue, 'OTHER')
| 37.490196 | 70 | 0.579498 | 192 | 1,912 | 5.614583 | 0.390625 | 0.096475 | 0.069573 | 0.04731 | 0.159555 | 0.159555 | 0.107607 | 0.107607 | 0.107607 | 0.107607 | 0 | 0.026418 | 0.326883 | 1,912 | 50 | 71 | 38.24 | 0.811189 | 0 | 0 | 0.354167 | 0 | 0 | 0.41841 | 0.011506 | 0 | 0 | 0 | 0 | 0.166667 | 0 | null | null | 0.0625 | 0.041667 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
62f5cef50adba84125aceb4b7bcd641b085ef856 | 76,410 | py | Python | python/pb/pomerium/pb/users_pb2.py | adriangb/enterprise-client | 5d50b457425b0c6d08415b0d986fa9151b792151 | [
"Apache-2.0"
] | null | null | null | python/pb/pomerium/pb/users_pb2.py | adriangb/enterprise-client | 5d50b457425b0c6d08415b0d986fa9151b792151 | [
"Apache-2.0"
] | null | null | null | python/pb/pomerium/pb/users_pb2.py | adriangb/enterprise-client | 5d50b457425b0c6d08415b0d986fa9151b792151 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: users.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from google.protobuf import timestamp_pb2 as google_dot_protobuf_dot_timestamp__pb2
from google.protobuf import struct_pb2 as google_dot_protobuf_dot_struct__pb2
DESCRIPTOR = _descriptor.FileDescriptor(
name='users.proto',
package='pomerium.dashboard',
syntax='proto3',
serialized_options=b'Z+github.com/pomerium/pomerium-console/pkg/pb',
create_key=_descriptor._internal_create_key,
serialized_pb=b'\n\x0busers.proto\x12\x12pomerium.dashboard\x1a\x1fgoogle/protobuf/timestamp.proto\x1a\x1cgoogle/protobuf/struct.proto\"\xd3\x01\n\rRecoveryToken\x12\n\n\x02id\x18\x01 \x01(\t\x12\x11\n\tnamespace\x18\x02 \x01(\t\x12.\n\ncreated_at\x18\x03 \x01(\x0b\x32\x1a.google.protobuf.Timestamp\x12/\n\x0bmodified_at\x18\x04 \x01(\x0b\x32\x1a.google.protobuf.Timestamp\x12.\n\nexpires_at\x18\x05 \x01(\x0b\x32\x1a.google.protobuf.Timestamp\x12\x12\n\npublic_key\x18\x06 \x01(\t\"%\n\tGroupInfo\x12\n\n\x02id\x18\x01 \x01(\t\x12\x0c\n\x04name\x18\x02 \x01(\t\"\xf3\x01\n\x08UserInfo\x12\n\n\x02id\x18\x01 \x01(\t\x12\x0c\n\x04name\x18\x02 \x01(\t\x12\r\n\x05\x65mail\x18\x03 \x01(\t\x12\x0e\n\x06groups\x18\x04 \x03(\t\x12I\n\x0fnamespace_roles\x18\x05 \x03(\x0b\x32\x30.pomerium.dashboard.UserInfo.NamespaceRolesEntry\x12\x13\n\x0bpicture_url\x18\x06 \x01(\t\x12\x17\n\x0fis_impersonated\x18\x07 \x01(\x08\x1a\x35\n\x13NamespaceRolesEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\r\n\x05value\x18\x02 \x01(\t:\x02\x38\x01\"6\n\x12GetUserInfoRequest\x12\x14\n\x07user_id\x18\x01 \x01(\tH\x00\x88\x01\x01\x42\n\n\x08_user_id\"F\n\x13GetUserInfoResponse\x12/\n\tuser_info\x18\x01 \x01(\x0b\x32\x1c.pomerium.dashboard.UserInfo\"B\n\x12QueryGroupsRequest\x12\r\n\x05query\x18\x01 \x01(\t\x12\x0e\n\x06offset\x18\x02 \x01(\x03\x12\r\n\x05limit\x18\x03 \x01(\x03\"Y\n\x13QueryGroupsResponse\x12-\n\x06groups\x18\x01 \x03(\x0b\x32\x1d.pomerium.dashboard.GroupInfo\x12\x13\n\x0btotal_count\x18\x02 \x01(\x03\"A\n\x11QueryUsersRequest\x12\r\n\x05query\x18\x01 \x01(\t\x12\x0e\n\x06offset\x18\x02 \x01(\x03\x12\r\n\x05limit\x18\x03 \x01(\x03\"V\n\x12QueryUsersResponse\x12+\n\x05users\x18\x01 \x03(\x0b\x32\x1c.pomerium.dashboard.UserInfo\x12\x13\n\x0btotal_count\x18\x02 \x01(\x03\"\xc0\x01\n\x16PomeriumServiceAccount\x12\n\n\x02id\x18\x01 \x01(\t\x12\x19\n\x0cnamespace_id\x18\x08 \x01(\tH\x00\x88\x01\x01\x12\x0f\n\x07user_id\x18\x02 \x01(\t\x12.\n\nexpires_at\x18\x03 
\x01(\x0b\x32\x1a.google.protobuf.Timestamp\x12-\n\tissued_at\x18\x04 \x01(\x0b\x32\x1a.google.protobuf.TimestampB\x0f\n\r_namespace_id\"g\n AddPomeriumServiceAccountRequest\x12\x43\n\x0fservice_account\x18\x01 \x01(\x0b\x32*.pomerium.dashboard.PomeriumServiceAccount\"u\n!AddPomeriumServiceAccountResponse\x12\x43\n\x0fservice_account\x18\x01 \x01(\x0b\x32*.pomerium.dashboard.PomeriumServiceAccount\x12\x0b\n\x03JWT\x18\x02 \x01(\t\"1\n#DeletePomeriumServiceAccountRequest\x12\n\n\x02id\x18\x01 \x01(\t\"&\n$DeletePomeriumServiceAccountResponse\".\n GetPomeriumServiceAccountRequest\x12\n\n\x02id\x18\x01 \x01(\t\"h\n!GetPomeriumServiceAccountResponse\x12\x43\n\x0fservice_account\x18\x01 \x01(\x0b\x32*.pomerium.dashboard.PomeriumServiceAccount\"7\n\"ListPomeriumServiceAccountsRequest\x12\x11\n\tnamespace\x18\x01 \x01(\t\"k\n#ListPomeriumServiceAccountsResponse\x12\x44\n\x10service_accounts\x18\x01 \x03(\x0b\x32*.pomerium.dashboard.PomeriumServiceAccount\"\x80\x04\n\x0fPomeriumSession\x12\n\n\x02id\x18\x01 \x01(\t\x12\x36\n\x04user\x18\x02 \x01(\x0b\x32(.pomerium.dashboard.PomeriumSession.User\x12\x39\n\x06groups\x18\x03 \x03(\x0b\x32).pomerium.dashboard.PomeriumSession.Group\x12\x0e\n\x06issuer\x18\x04 \x01(\t\x12-\n\tissued_at\x18\x05 \x01(\x0b\x32\x1a.google.protobuf.Timestamp\x12.\n\nexpires_at\x18\x06 \x01(\x0b\x32\x1a.google.protobuf.Timestamp\x12\x10\n\x08\x61udience\x18\x07 \x03(\t\x12?\n\x06\x63laims\x18\x08 \x03(\x0b\x32/.pomerium.dashboard.PomeriumSession.ClaimsEntry\x1a\x30\n\x05Group\x12\n\n\x02id\x18\x01 \x01(\t\x12\x0c\n\x04name\x18\x02 \x01(\t\x12\r\n\x05\x65mail\x18\x03 \x01(\t\x1a/\n\x04User\x12\n\n\x02id\x18\x01 \x01(\t\x12\x0c\n\x04name\x18\x02 \x01(\t\x12\r\n\x05\x65mail\x18\x03 \x01(\t\x1aI\n\x0b\x43laimsEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12)\n\x05value\x18\x02 \x01(\x0b\x32\x1a.google.protobuf.ListValue:\x02\x38\x01\"*\n\x1c\x44\x65letePomeriumSessionRequest\x12\n\n\x02id\x18\x01 
\x01(\t\"\x1f\n\x1d\x44\x65letePomeriumSessionResponse\"\'\n\x19GetPomeriumSessionRequest\x12\n\n\x02id\x18\x01 \x01(\t\"R\n\x1aGetPomeriumSessionResponse\x12\x34\n\x07session\x18\x01 \x01(\x0b\x32#.pomerium.dashboard.PomeriumSession\"\xbf\x01\n\x1bListPomeriumSessionsRequest\x12\x12\n\x05query\x18\x01 \x01(\tH\x00\x88\x01\x01\x12\x13\n\x06offset\x18\x02 \x01(\x03H\x01\x88\x01\x01\x12\x12\n\x05limit\x18\x03 \x01(\x03H\x02\x88\x01\x01\x12\x15\n\x08order_by\x18\x04 \x01(\tH\x03\x88\x01\x01\x12\x14\n\x07user_id\x18\x05 \x01(\tH\x04\x88\x01\x01\x42\x08\n\x06_queryB\t\n\x07_offsetB\x08\n\x06_limitB\x0b\n\t_order_byB\n\n\x08_user_id\"j\n\x1cListPomeriumSessionsResponse\x12\x35\n\x08sessions\x18\x01 \x03(\x0b\x32#.pomerium.dashboard.PomeriumSession\x12\x13\n\x0btotal_count\x18\x02 \x01(\x03\"(\n\x12ImpersonateRequest\x12\x12\n\nsession_id\x18\x01 \x01(\t\"\x15\n\x13ImpersonateResponse2\xaa\x02\n\x0bUserService\x12^\n\x0bGetUserInfo\x12&.pomerium.dashboard.GetUserInfoRequest\x1a\'.pomerium.dashboard.GetUserInfoResponse\x12^\n\x0bQueryGroups\x12&.pomerium.dashboard.QueryGroupsRequest\x1a\'.pomerium.dashboard.QueryGroupsResponse\x12[\n\nQueryUsers\x12%.pomerium.dashboard.QueryUsersRequest\x1a&.pomerium.dashboard.QueryUsersResponse2\xda\x04\n\x1dPomeriumServiceAccountService\x12\x88\x01\n\x19\x41\x64\x64PomeriumServiceAccount\x12\x34.pomerium.dashboard.AddPomeriumServiceAccountRequest\x1a\x35.pomerium.dashboard.AddPomeriumServiceAccountResponse\x12\x91\x01\n\x1c\x44\x65letePomeriumServiceAccount\x12\x37.pomerium.dashboard.DeletePomeriumServiceAccountRequest\x1a\x38.pomerium.dashboard.DeletePomeriumServiceAccountResponse\x12\x88\x01\n\x19GetPomeriumServiceAccount\x12\x34.pomerium.dashboard.GetPomeriumServiceAccountRequest\x1a\x35.pomerium.dashboard.GetPomeriumServiceAccountResponse\x12\x8e\x01\n\x1bListPomeriumServiceAccounts\x12\x36.pomerium.dashboard.ListPomeriumServiceAccountsRequest\x1a\x37.pomerium.dashboard.ListPomeriumServiceAccountsResponse2\xe6\x03\n\x16PomeriumSession
Service\x12|\n\x15\x44\x65letePomeriumSession\x12\x30.pomerium.dashboard.DeletePomeriumSessionRequest\x1a\x31.pomerium.dashboard.DeletePomeriumSessionResponse\x12s\n\x12GetPomeriumSession\x12-.pomerium.dashboard.GetPomeriumSessionRequest\x1a..pomerium.dashboard.GetPomeriumSessionResponse\x12^\n\x0bImpersonate\x12&.pomerium.dashboard.ImpersonateRequest\x1a\'.pomerium.dashboard.ImpersonateResponse\x12y\n\x14ListPomeriumSessions\x12/.pomerium.dashboard.ListPomeriumSessionsRequest\x1a\x30.pomerium.dashboard.ListPomeriumSessionsResponseB-Z+github.com/pomerium/pomerium-console/pkg/pbb\x06proto3'
,
dependencies=[google_dot_protobuf_dot_timestamp__pb2.DESCRIPTOR,google_dot_protobuf_dot_struct__pb2.DESCRIPTOR,])
_RECOVERYTOKEN = _descriptor.Descriptor(
name='RecoveryToken',
full_name='pomerium.dashboard.RecoveryToken',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='pomerium.dashboard.RecoveryToken.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='namespace', full_name='pomerium.dashboard.RecoveryToken.namespace', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='created_at', full_name='pomerium.dashboard.RecoveryToken.created_at', index=2,
number=3, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='modified_at', full_name='pomerium.dashboard.RecoveryToken.modified_at', index=3,
number=4, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='expires_at', full_name='pomerium.dashboard.RecoveryToken.expires_at', index=4,
number=5, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='public_key', full_name='pomerium.dashboard.RecoveryToken.public_key', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=99,
serialized_end=310,
)
_GROUPINFO = _descriptor.Descriptor(
name='GroupInfo',
full_name='pomerium.dashboard.GroupInfo',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='pomerium.dashboard.GroupInfo.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='name', full_name='pomerium.dashboard.GroupInfo.name', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=312,
serialized_end=349,
)
_USERINFO_NAMESPACEROLESENTRY = _descriptor.Descriptor(
name='NamespaceRolesEntry',
full_name='pomerium.dashboard.UserInfo.NamespaceRolesEntry',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='key', full_name='pomerium.dashboard.UserInfo.NamespaceRolesEntry.key', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='value', full_name='pomerium.dashboard.UserInfo.NamespaceRolesEntry.value', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=b'8\001',
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=542,
serialized_end=595,
)
_USERINFO = _descriptor.Descriptor(
name='UserInfo',
full_name='pomerium.dashboard.UserInfo',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='pomerium.dashboard.UserInfo.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='name', full_name='pomerium.dashboard.UserInfo.name', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='email', full_name='pomerium.dashboard.UserInfo.email', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='groups', full_name='pomerium.dashboard.UserInfo.groups', index=3,
number=4, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='namespace_roles', full_name='pomerium.dashboard.UserInfo.namespace_roles', index=4,
number=5, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='picture_url', full_name='pomerium.dashboard.UserInfo.picture_url', index=5,
number=6, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='is_impersonated', full_name='pomerium.dashboard.UserInfo.is_impersonated', index=6,
number=7, type=8, cpp_type=7, label=1,
has_default_value=False, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[_USERINFO_NAMESPACEROLESENTRY, ],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=352,
serialized_end=595,
)
_GETUSERINFOREQUEST = _descriptor.Descriptor(
name='GetUserInfoRequest',
full_name='pomerium.dashboard.GetUserInfoRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='user_id', full_name='pomerium.dashboard.GetUserInfoRequest.user_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name='_user_id', full_name='pomerium.dashboard.GetUserInfoRequest._user_id',
index=0, containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[]),
],
serialized_start=597,
serialized_end=651,
)
_GETUSERINFORESPONSE = _descriptor.Descriptor(
name='GetUserInfoResponse',
full_name='pomerium.dashboard.GetUserInfoResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='user_info', full_name='pomerium.dashboard.GetUserInfoResponse.user_info', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=653,
serialized_end=723,
)
_QUERYGROUPSREQUEST = _descriptor.Descriptor(
name='QueryGroupsRequest',
full_name='pomerium.dashboard.QueryGroupsRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='query', full_name='pomerium.dashboard.QueryGroupsRequest.query', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='offset', full_name='pomerium.dashboard.QueryGroupsRequest.offset', index=1,
number=2, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='limit', full_name='pomerium.dashboard.QueryGroupsRequest.limit', index=2,
number=3, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=725,
serialized_end=791,
)
_QUERYGROUPSRESPONSE = _descriptor.Descriptor(
name='QueryGroupsResponse',
full_name='pomerium.dashboard.QueryGroupsResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='groups', full_name='pomerium.dashboard.QueryGroupsResponse.groups', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='total_count', full_name='pomerium.dashboard.QueryGroupsResponse.total_count', index=1,
number=2, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=793,
serialized_end=882,
)
_QUERYUSERSREQUEST = _descriptor.Descriptor(
name='QueryUsersRequest',
full_name='pomerium.dashboard.QueryUsersRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='query', full_name='pomerium.dashboard.QueryUsersRequest.query', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='offset', full_name='pomerium.dashboard.QueryUsersRequest.offset', index=1,
number=2, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='limit', full_name='pomerium.dashboard.QueryUsersRequest.limit', index=2,
number=3, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=884,
serialized_end=949,
)
_QUERYUSERSRESPONSE = _descriptor.Descriptor(
name='QueryUsersResponse',
full_name='pomerium.dashboard.QueryUsersResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='users', full_name='pomerium.dashboard.QueryUsersResponse.users', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='total_count', full_name='pomerium.dashboard.QueryUsersResponse.total_count', index=1,
number=2, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=951,
serialized_end=1037,
)
_POMERIUMSERVICEACCOUNT = _descriptor.Descriptor(
name='PomeriumServiceAccount',
full_name='pomerium.dashboard.PomeriumServiceAccount',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='pomerium.dashboard.PomeriumServiceAccount.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='namespace_id', full_name='pomerium.dashboard.PomeriumServiceAccount.namespace_id', index=1,
number=8, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='user_id', full_name='pomerium.dashboard.PomeriumServiceAccount.user_id', index=2,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='expires_at', full_name='pomerium.dashboard.PomeriumServiceAccount.expires_at', index=3,
number=3, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='issued_at', full_name='pomerium.dashboard.PomeriumServiceAccount.issued_at', index=4,
number=4, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name='_namespace_id', full_name='pomerium.dashboard.PomeriumServiceAccount._namespace_id',
index=0, containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[]),
],
serialized_start=1040,
serialized_end=1232,
)
_ADDPOMERIUMSERVICEACCOUNTREQUEST = _descriptor.Descriptor(
name='AddPomeriumServiceAccountRequest',
full_name='pomerium.dashboard.AddPomeriumServiceAccountRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='service_account', full_name='pomerium.dashboard.AddPomeriumServiceAccountRequest.service_account', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1234,
serialized_end=1337,
)
_ADDPOMERIUMSERVICEACCOUNTRESPONSE = _descriptor.Descriptor(
name='AddPomeriumServiceAccountResponse',
full_name='pomerium.dashboard.AddPomeriumServiceAccountResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='service_account', full_name='pomerium.dashboard.AddPomeriumServiceAccountResponse.service_account', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='JWT', full_name='pomerium.dashboard.AddPomeriumServiceAccountResponse.JWT', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1339,
serialized_end=1456,
)
_DELETEPOMERIUMSERVICEACCOUNTREQUEST = _descriptor.Descriptor(
name='DeletePomeriumServiceAccountRequest',
full_name='pomerium.dashboard.DeletePomeriumServiceAccountRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='pomerium.dashboard.DeletePomeriumServiceAccountRequest.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1458,
serialized_end=1507,
)
_DELETEPOMERIUMSERVICEACCOUNTRESPONSE = _descriptor.Descriptor(
name='DeletePomeriumServiceAccountResponse',
full_name='pomerium.dashboard.DeletePomeriumServiceAccountResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1509,
serialized_end=1547,
)
_GETPOMERIUMSERVICEACCOUNTREQUEST = _descriptor.Descriptor(
name='GetPomeriumServiceAccountRequest',
full_name='pomerium.dashboard.GetPomeriumServiceAccountRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='pomerium.dashboard.GetPomeriumServiceAccountRequest.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1549,
serialized_end=1595,
)
_GETPOMERIUMSERVICEACCOUNTRESPONSE = _descriptor.Descriptor(
name='GetPomeriumServiceAccountResponse',
full_name='pomerium.dashboard.GetPomeriumServiceAccountResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='service_account', full_name='pomerium.dashboard.GetPomeriumServiceAccountResponse.service_account', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1597,
serialized_end=1701,
)
_LISTPOMERIUMSERVICEACCOUNTSREQUEST = _descriptor.Descriptor(
name='ListPomeriumServiceAccountsRequest',
full_name='pomerium.dashboard.ListPomeriumServiceAccountsRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='namespace', full_name='pomerium.dashboard.ListPomeriumServiceAccountsRequest.namespace', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1703,
serialized_end=1758,
)
_LISTPOMERIUMSERVICEACCOUNTSRESPONSE = _descriptor.Descriptor(
name='ListPomeriumServiceAccountsResponse',
full_name='pomerium.dashboard.ListPomeriumServiceAccountsResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='service_accounts', full_name='pomerium.dashboard.ListPomeriumServiceAccountsResponse.service_accounts', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1760,
serialized_end=1867,
)
_POMERIUMSESSION_GROUP = _descriptor.Descriptor(
name='Group',
full_name='pomerium.dashboard.PomeriumSession.Group',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='pomerium.dashboard.PomeriumSession.Group.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='name', full_name='pomerium.dashboard.PomeriumSession.Group.name', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='email', full_name='pomerium.dashboard.PomeriumSession.Group.email', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=2210,
serialized_end=2258,
)
_POMERIUMSESSION_USER = _descriptor.Descriptor(
name='User',
full_name='pomerium.dashboard.PomeriumSession.User',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='pomerium.dashboard.PomeriumSession.User.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='name', full_name='pomerium.dashboard.PomeriumSession.User.name', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='email', full_name='pomerium.dashboard.PomeriumSession.User.email', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=2260,
serialized_end=2307,
)
_POMERIUMSESSION_CLAIMSENTRY = _descriptor.Descriptor(
name='ClaimsEntry',
full_name='pomerium.dashboard.PomeriumSession.ClaimsEntry',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='key', full_name='pomerium.dashboard.PomeriumSession.ClaimsEntry.key', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='value', full_name='pomerium.dashboard.PomeriumSession.ClaimsEntry.value', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=b'8\001',
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=2309,
serialized_end=2382,
)
_POMERIUMSESSION = _descriptor.Descriptor(
name='PomeriumSession',
full_name='pomerium.dashboard.PomeriumSession',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='pomerium.dashboard.PomeriumSession.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='user', full_name='pomerium.dashboard.PomeriumSession.user', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='groups', full_name='pomerium.dashboard.PomeriumSession.groups', index=2,
number=3, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='issuer', full_name='pomerium.dashboard.PomeriumSession.issuer', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='issued_at', full_name='pomerium.dashboard.PomeriumSession.issued_at', index=4,
number=5, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='expires_at', full_name='pomerium.dashboard.PomeriumSession.expires_at', index=5,
number=6, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='audience', full_name='pomerium.dashboard.PomeriumSession.audience', index=6,
number=7, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='claims', full_name='pomerium.dashboard.PomeriumSession.claims', index=7,
number=8, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[_POMERIUMSESSION_GROUP, _POMERIUMSESSION_USER, _POMERIUMSESSION_CLAIMSENTRY, ],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=1870,
serialized_end=2382,
)
_DELETEPOMERIUMSESSIONREQUEST = _descriptor.Descriptor(
name='DeletePomeriumSessionRequest',
full_name='pomerium.dashboard.DeletePomeriumSessionRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='pomerium.dashboard.DeletePomeriumSessionRequest.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=2384,
serialized_end=2426,
)
_DELETEPOMERIUMSESSIONRESPONSE = _descriptor.Descriptor(
name='DeletePomeriumSessionResponse',
full_name='pomerium.dashboard.DeletePomeriumSessionResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=2428,
serialized_end=2459,
)
_GETPOMERIUMSESSIONREQUEST = _descriptor.Descriptor(
name='GetPomeriumSessionRequest',
full_name='pomerium.dashboard.GetPomeriumSessionRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='id', full_name='pomerium.dashboard.GetPomeriumSessionRequest.id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=2461,
serialized_end=2500,
)
_GETPOMERIUMSESSIONRESPONSE = _descriptor.Descriptor(
name='GetPomeriumSessionResponse',
full_name='pomerium.dashboard.GetPomeriumSessionResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='session', full_name='pomerium.dashboard.GetPomeriumSessionResponse.session', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=2502,
serialized_end=2584,
)
_LISTPOMERIUMSESSIONSREQUEST = _descriptor.Descriptor(
name='ListPomeriumSessionsRequest',
full_name='pomerium.dashboard.ListPomeriumSessionsRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='query', full_name='pomerium.dashboard.ListPomeriumSessionsRequest.query', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='offset', full_name='pomerium.dashboard.ListPomeriumSessionsRequest.offset', index=1,
number=2, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='limit', full_name='pomerium.dashboard.ListPomeriumSessionsRequest.limit', index=2,
number=3, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='order_by', full_name='pomerium.dashboard.ListPomeriumSessionsRequest.order_by', index=3,
number=4, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='user_id', full_name='pomerium.dashboard.ListPomeriumSessionsRequest.user_id', index=4,
number=5, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
_descriptor.OneofDescriptor(
name='_query', full_name='pomerium.dashboard.ListPomeriumSessionsRequest._query',
index=0, containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[]),
_descriptor.OneofDescriptor(
name='_offset', full_name='pomerium.dashboard.ListPomeriumSessionsRequest._offset',
index=1, containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[]),
_descriptor.OneofDescriptor(
name='_limit', full_name='pomerium.dashboard.ListPomeriumSessionsRequest._limit',
index=2, containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[]),
_descriptor.OneofDescriptor(
name='_order_by', full_name='pomerium.dashboard.ListPomeriumSessionsRequest._order_by',
index=3, containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[]),
_descriptor.OneofDescriptor(
name='_user_id', full_name='pomerium.dashboard.ListPomeriumSessionsRequest._user_id',
index=4, containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[]),
],
serialized_start=2587,
serialized_end=2778,
)
_LISTPOMERIUMSESSIONSRESPONSE = _descriptor.Descriptor(
name='ListPomeriumSessionsResponse',
full_name='pomerium.dashboard.ListPomeriumSessionsResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='sessions', full_name='pomerium.dashboard.ListPomeriumSessionsResponse.sessions', index=0,
number=1, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
name='total_count', full_name='pomerium.dashboard.ListPomeriumSessionsResponse.total_count', index=1,
number=2, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=2780,
serialized_end=2886,
)
_IMPERSONATEREQUEST = _descriptor.Descriptor(
name='ImpersonateRequest',
full_name='pomerium.dashboard.ImpersonateRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
_descriptor.FieldDescriptor(
name='session_id', full_name='pomerium.dashboard.ImpersonateRequest.session_id', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR, create_key=_descriptor._internal_create_key),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=2888,
serialized_end=2928,
)
_IMPERSONATERESPONSE = _descriptor.Descriptor(
name='ImpersonateResponse',
full_name='pomerium.dashboard.ImpersonateResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
create_key=_descriptor._internal_create_key,
fields=[
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=2930,
serialized_end=2951,
)
_RECOVERYTOKEN.fields_by_name['created_at'].message_type = google_dot_protobuf_dot_timestamp__pb2._TIMESTAMP
_RECOVERYTOKEN.fields_by_name['modified_at'].message_type = google_dot_protobuf_dot_timestamp__pb2._TIMESTAMP
_RECOVERYTOKEN.fields_by_name['expires_at'].message_type = google_dot_protobuf_dot_timestamp__pb2._TIMESTAMP
_USERINFO_NAMESPACEROLESENTRY.containing_type = _USERINFO
_USERINFO.fields_by_name['namespace_roles'].message_type = _USERINFO_NAMESPACEROLESENTRY
_GETUSERINFOREQUEST.oneofs_by_name['_user_id'].fields.append(
_GETUSERINFOREQUEST.fields_by_name['user_id'])
_GETUSERINFOREQUEST.fields_by_name['user_id'].containing_oneof = _GETUSERINFOREQUEST.oneofs_by_name['_user_id']
_GETUSERINFORESPONSE.fields_by_name['user_info'].message_type = _USERINFO
_QUERYGROUPSRESPONSE.fields_by_name['groups'].message_type = _GROUPINFO
_QUERYUSERSRESPONSE.fields_by_name['users'].message_type = _USERINFO
_POMERIUMSERVICEACCOUNT.fields_by_name['expires_at'].message_type = google_dot_protobuf_dot_timestamp__pb2._TIMESTAMP
_POMERIUMSERVICEACCOUNT.fields_by_name['issued_at'].message_type = google_dot_protobuf_dot_timestamp__pb2._TIMESTAMP
_POMERIUMSERVICEACCOUNT.oneofs_by_name['_namespace_id'].fields.append(
_POMERIUMSERVICEACCOUNT.fields_by_name['namespace_id'])
_POMERIUMSERVICEACCOUNT.fields_by_name['namespace_id'].containing_oneof = _POMERIUMSERVICEACCOUNT.oneofs_by_name['_namespace_id']
_ADDPOMERIUMSERVICEACCOUNTREQUEST.fields_by_name['service_account'].message_type = _POMERIUMSERVICEACCOUNT
_ADDPOMERIUMSERVICEACCOUNTRESPONSE.fields_by_name['service_account'].message_type = _POMERIUMSERVICEACCOUNT
_GETPOMERIUMSERVICEACCOUNTRESPONSE.fields_by_name['service_account'].message_type = _POMERIUMSERVICEACCOUNT
_LISTPOMERIUMSERVICEACCOUNTSRESPONSE.fields_by_name['service_accounts'].message_type = _POMERIUMSERVICEACCOUNT
_POMERIUMSESSION_GROUP.containing_type = _POMERIUMSESSION
_POMERIUMSESSION_USER.containing_type = _POMERIUMSESSION
_POMERIUMSESSION_CLAIMSENTRY.fields_by_name['value'].message_type = google_dot_protobuf_dot_struct__pb2._LISTVALUE
_POMERIUMSESSION_CLAIMSENTRY.containing_type = _POMERIUMSESSION
_POMERIUMSESSION.fields_by_name['user'].message_type = _POMERIUMSESSION_USER
_POMERIUMSESSION.fields_by_name['groups'].message_type = _POMERIUMSESSION_GROUP
_POMERIUMSESSION.fields_by_name['issued_at'].message_type = google_dot_protobuf_dot_timestamp__pb2._TIMESTAMP
_POMERIUMSESSION.fields_by_name['expires_at'].message_type = google_dot_protobuf_dot_timestamp__pb2._TIMESTAMP
_POMERIUMSESSION.fields_by_name['claims'].message_type = _POMERIUMSESSION_CLAIMSENTRY
_GETPOMERIUMSESSIONRESPONSE.fields_by_name['session'].message_type = _POMERIUMSESSION
_LISTPOMERIUMSESSIONSREQUEST.oneofs_by_name['_query'].fields.append(
_LISTPOMERIUMSESSIONSREQUEST.fields_by_name['query'])
_LISTPOMERIUMSESSIONSREQUEST.fields_by_name['query'].containing_oneof = _LISTPOMERIUMSESSIONSREQUEST.oneofs_by_name['_query']
_LISTPOMERIUMSESSIONSREQUEST.oneofs_by_name['_offset'].fields.append(
_LISTPOMERIUMSESSIONSREQUEST.fields_by_name['offset'])
_LISTPOMERIUMSESSIONSREQUEST.fields_by_name['offset'].containing_oneof = _LISTPOMERIUMSESSIONSREQUEST.oneofs_by_name['_offset']
_LISTPOMERIUMSESSIONSREQUEST.oneofs_by_name['_limit'].fields.append(
_LISTPOMERIUMSESSIONSREQUEST.fields_by_name['limit'])
_LISTPOMERIUMSESSIONSREQUEST.fields_by_name['limit'].containing_oneof = _LISTPOMERIUMSESSIONSREQUEST.oneofs_by_name['_limit']
_LISTPOMERIUMSESSIONSREQUEST.oneofs_by_name['_order_by'].fields.append(
_LISTPOMERIUMSESSIONSREQUEST.fields_by_name['order_by'])
_LISTPOMERIUMSESSIONSREQUEST.fields_by_name['order_by'].containing_oneof = _LISTPOMERIUMSESSIONSREQUEST.oneofs_by_name['_order_by']
_LISTPOMERIUMSESSIONSREQUEST.oneofs_by_name['_user_id'].fields.append(
_LISTPOMERIUMSESSIONSREQUEST.fields_by_name['user_id'])
_LISTPOMERIUMSESSIONSREQUEST.fields_by_name['user_id'].containing_oneof = _LISTPOMERIUMSESSIONSREQUEST.oneofs_by_name['_user_id']
_LISTPOMERIUMSESSIONSRESPONSE.fields_by_name['sessions'].message_type = _POMERIUMSESSION
DESCRIPTOR.message_types_by_name['RecoveryToken'] = _RECOVERYTOKEN
DESCRIPTOR.message_types_by_name['GroupInfo'] = _GROUPINFO
DESCRIPTOR.message_types_by_name['UserInfo'] = _USERINFO
DESCRIPTOR.message_types_by_name['GetUserInfoRequest'] = _GETUSERINFOREQUEST
DESCRIPTOR.message_types_by_name['GetUserInfoResponse'] = _GETUSERINFORESPONSE
DESCRIPTOR.message_types_by_name['QueryGroupsRequest'] = _QUERYGROUPSREQUEST
DESCRIPTOR.message_types_by_name['QueryGroupsResponse'] = _QUERYGROUPSRESPONSE
DESCRIPTOR.message_types_by_name['QueryUsersRequest'] = _QUERYUSERSREQUEST
DESCRIPTOR.message_types_by_name['QueryUsersResponse'] = _QUERYUSERSRESPONSE
DESCRIPTOR.message_types_by_name['PomeriumServiceAccount'] = _POMERIUMSERVICEACCOUNT
DESCRIPTOR.message_types_by_name['AddPomeriumServiceAccountRequest'] = _ADDPOMERIUMSERVICEACCOUNTREQUEST
DESCRIPTOR.message_types_by_name['AddPomeriumServiceAccountResponse'] = _ADDPOMERIUMSERVICEACCOUNTRESPONSE
DESCRIPTOR.message_types_by_name['DeletePomeriumServiceAccountRequest'] = _DELETEPOMERIUMSERVICEACCOUNTREQUEST
DESCRIPTOR.message_types_by_name['DeletePomeriumServiceAccountResponse'] = _DELETEPOMERIUMSERVICEACCOUNTRESPONSE
DESCRIPTOR.message_types_by_name['GetPomeriumServiceAccountRequest'] = _GETPOMERIUMSERVICEACCOUNTREQUEST
DESCRIPTOR.message_types_by_name['GetPomeriumServiceAccountResponse'] = _GETPOMERIUMSERVICEACCOUNTRESPONSE
DESCRIPTOR.message_types_by_name['ListPomeriumServiceAccountsRequest'] = _LISTPOMERIUMSERVICEACCOUNTSREQUEST
DESCRIPTOR.message_types_by_name['ListPomeriumServiceAccountsResponse'] = _LISTPOMERIUMSERVICEACCOUNTSRESPONSE
DESCRIPTOR.message_types_by_name['PomeriumSession'] = _POMERIUMSESSION
DESCRIPTOR.message_types_by_name['DeletePomeriumSessionRequest'] = _DELETEPOMERIUMSESSIONREQUEST
DESCRIPTOR.message_types_by_name['DeletePomeriumSessionResponse'] = _DELETEPOMERIUMSESSIONRESPONSE
DESCRIPTOR.message_types_by_name['GetPomeriumSessionRequest'] = _GETPOMERIUMSESSIONREQUEST
DESCRIPTOR.message_types_by_name['GetPomeriumSessionResponse'] = _GETPOMERIUMSESSIONRESPONSE
DESCRIPTOR.message_types_by_name['ListPomeriumSessionsRequest'] = _LISTPOMERIUMSESSIONSREQUEST
DESCRIPTOR.message_types_by_name['ListPomeriumSessionsResponse'] = _LISTPOMERIUMSESSIONSRESPONSE
DESCRIPTOR.message_types_by_name['ImpersonateRequest'] = _IMPERSONATEREQUEST
DESCRIPTOR.message_types_by_name['ImpersonateResponse'] = _IMPERSONATERESPONSE
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
RecoveryToken = _reflection.GeneratedProtocolMessageType('RecoveryToken', (_message.Message,), {
'DESCRIPTOR' : _RECOVERYTOKEN,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.RecoveryToken)
})
_sym_db.RegisterMessage(RecoveryToken)
GroupInfo = _reflection.GeneratedProtocolMessageType('GroupInfo', (_message.Message,), {
'DESCRIPTOR' : _GROUPINFO,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.GroupInfo)
})
_sym_db.RegisterMessage(GroupInfo)
UserInfo = _reflection.GeneratedProtocolMessageType('UserInfo', (_message.Message,), {
'NamespaceRolesEntry' : _reflection.GeneratedProtocolMessageType('NamespaceRolesEntry', (_message.Message,), {
'DESCRIPTOR' : _USERINFO_NAMESPACEROLESENTRY,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.UserInfo.NamespaceRolesEntry)
})
,
'DESCRIPTOR' : _USERINFO,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.UserInfo)
})
_sym_db.RegisterMessage(UserInfo)
_sym_db.RegisterMessage(UserInfo.NamespaceRolesEntry)
GetUserInfoRequest = _reflection.GeneratedProtocolMessageType('GetUserInfoRequest', (_message.Message,), {
'DESCRIPTOR' : _GETUSERINFOREQUEST,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.GetUserInfoRequest)
})
_sym_db.RegisterMessage(GetUserInfoRequest)
GetUserInfoResponse = _reflection.GeneratedProtocolMessageType('GetUserInfoResponse', (_message.Message,), {
'DESCRIPTOR' : _GETUSERINFORESPONSE,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.GetUserInfoResponse)
})
_sym_db.RegisterMessage(GetUserInfoResponse)
QueryGroupsRequest = _reflection.GeneratedProtocolMessageType('QueryGroupsRequest', (_message.Message,), {
'DESCRIPTOR' : _QUERYGROUPSREQUEST,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.QueryGroupsRequest)
})
_sym_db.RegisterMessage(QueryGroupsRequest)
QueryGroupsResponse = _reflection.GeneratedProtocolMessageType('QueryGroupsResponse', (_message.Message,), {
'DESCRIPTOR' : _QUERYGROUPSRESPONSE,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.QueryGroupsResponse)
})
_sym_db.RegisterMessage(QueryGroupsResponse)
QueryUsersRequest = _reflection.GeneratedProtocolMessageType('QueryUsersRequest', (_message.Message,), {
'DESCRIPTOR' : _QUERYUSERSREQUEST,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.QueryUsersRequest)
})
_sym_db.RegisterMessage(QueryUsersRequest)
QueryUsersResponse = _reflection.GeneratedProtocolMessageType('QueryUsersResponse', (_message.Message,), {
'DESCRIPTOR' : _QUERYUSERSRESPONSE,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.QueryUsersResponse)
})
_sym_db.RegisterMessage(QueryUsersResponse)
PomeriumServiceAccount = _reflection.GeneratedProtocolMessageType('PomeriumServiceAccount', (_message.Message,), {
'DESCRIPTOR' : _POMERIUMSERVICEACCOUNT,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.PomeriumServiceAccount)
})
_sym_db.RegisterMessage(PomeriumServiceAccount)
AddPomeriumServiceAccountRequest = _reflection.GeneratedProtocolMessageType('AddPomeriumServiceAccountRequest', (_message.Message,), {
'DESCRIPTOR' : _ADDPOMERIUMSERVICEACCOUNTREQUEST,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.AddPomeriumServiceAccountRequest)
})
_sym_db.RegisterMessage(AddPomeriumServiceAccountRequest)
AddPomeriumServiceAccountResponse = _reflection.GeneratedProtocolMessageType('AddPomeriumServiceAccountResponse', (_message.Message,), {
'DESCRIPTOR' : _ADDPOMERIUMSERVICEACCOUNTRESPONSE,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.AddPomeriumServiceAccountResponse)
})
_sym_db.RegisterMessage(AddPomeriumServiceAccountResponse)
DeletePomeriumServiceAccountRequest = _reflection.GeneratedProtocolMessageType('DeletePomeriumServiceAccountRequest', (_message.Message,), {
'DESCRIPTOR' : _DELETEPOMERIUMSERVICEACCOUNTREQUEST,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.DeletePomeriumServiceAccountRequest)
})
_sym_db.RegisterMessage(DeletePomeriumServiceAccountRequest)
DeletePomeriumServiceAccountResponse = _reflection.GeneratedProtocolMessageType('DeletePomeriumServiceAccountResponse', (_message.Message,), {
'DESCRIPTOR' : _DELETEPOMERIUMSERVICEACCOUNTRESPONSE,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.DeletePomeriumServiceAccountResponse)
})
_sym_db.RegisterMessage(DeletePomeriumServiceAccountResponse)
GetPomeriumServiceAccountRequest = _reflection.GeneratedProtocolMessageType('GetPomeriumServiceAccountRequest', (_message.Message,), {
'DESCRIPTOR' : _GETPOMERIUMSERVICEACCOUNTREQUEST,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.GetPomeriumServiceAccountRequest)
})
_sym_db.RegisterMessage(GetPomeriumServiceAccountRequest)
GetPomeriumServiceAccountResponse = _reflection.GeneratedProtocolMessageType('GetPomeriumServiceAccountResponse', (_message.Message,), {
'DESCRIPTOR' : _GETPOMERIUMSERVICEACCOUNTRESPONSE,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.GetPomeriumServiceAccountResponse)
})
_sym_db.RegisterMessage(GetPomeriumServiceAccountResponse)
ListPomeriumServiceAccountsRequest = _reflection.GeneratedProtocolMessageType('ListPomeriumServiceAccountsRequest', (_message.Message,), {
'DESCRIPTOR' : _LISTPOMERIUMSERVICEACCOUNTSREQUEST,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.ListPomeriumServiceAccountsRequest)
})
_sym_db.RegisterMessage(ListPomeriumServiceAccountsRequest)
ListPomeriumServiceAccountsResponse = _reflection.GeneratedProtocolMessageType('ListPomeriumServiceAccountsResponse', (_message.Message,), {
'DESCRIPTOR' : _LISTPOMERIUMSERVICEACCOUNTSRESPONSE,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.ListPomeriumServiceAccountsResponse)
})
_sym_db.RegisterMessage(ListPomeriumServiceAccountsResponse)
PomeriumSession = _reflection.GeneratedProtocolMessageType('PomeriumSession', (_message.Message,), {
'Group' : _reflection.GeneratedProtocolMessageType('Group', (_message.Message,), {
'DESCRIPTOR' : _POMERIUMSESSION_GROUP,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.PomeriumSession.Group)
})
,
'User' : _reflection.GeneratedProtocolMessageType('User', (_message.Message,), {
'DESCRIPTOR' : _POMERIUMSESSION_USER,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.PomeriumSession.User)
})
,
'ClaimsEntry' : _reflection.GeneratedProtocolMessageType('ClaimsEntry', (_message.Message,), {
'DESCRIPTOR' : _POMERIUMSESSION_CLAIMSENTRY,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.PomeriumSession.ClaimsEntry)
})
,
'DESCRIPTOR' : _POMERIUMSESSION,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.PomeriumSession)
})
_sym_db.RegisterMessage(PomeriumSession)
_sym_db.RegisterMessage(PomeriumSession.Group)
_sym_db.RegisterMessage(PomeriumSession.User)
_sym_db.RegisterMessage(PomeriumSession.ClaimsEntry)
DeletePomeriumSessionRequest = _reflection.GeneratedProtocolMessageType('DeletePomeriumSessionRequest', (_message.Message,), {
'DESCRIPTOR' : _DELETEPOMERIUMSESSIONREQUEST,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.DeletePomeriumSessionRequest)
})
_sym_db.RegisterMessage(DeletePomeriumSessionRequest)
DeletePomeriumSessionResponse = _reflection.GeneratedProtocolMessageType('DeletePomeriumSessionResponse', (_message.Message,), {
'DESCRIPTOR' : _DELETEPOMERIUMSESSIONRESPONSE,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.DeletePomeriumSessionResponse)
})
_sym_db.RegisterMessage(DeletePomeriumSessionResponse)
GetPomeriumSessionRequest = _reflection.GeneratedProtocolMessageType('GetPomeriumSessionRequest', (_message.Message,), {
'DESCRIPTOR' : _GETPOMERIUMSESSIONREQUEST,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.GetPomeriumSessionRequest)
})
_sym_db.RegisterMessage(GetPomeriumSessionRequest)
GetPomeriumSessionResponse = _reflection.GeneratedProtocolMessageType('GetPomeriumSessionResponse', (_message.Message,), {
'DESCRIPTOR' : _GETPOMERIUMSESSIONRESPONSE,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.GetPomeriumSessionResponse)
})
_sym_db.RegisterMessage(GetPomeriumSessionResponse)
ListPomeriumSessionsRequest = _reflection.GeneratedProtocolMessageType('ListPomeriumSessionsRequest', (_message.Message,), {
'DESCRIPTOR' : _LISTPOMERIUMSESSIONSREQUEST,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.ListPomeriumSessionsRequest)
})
_sym_db.RegisterMessage(ListPomeriumSessionsRequest)
ListPomeriumSessionsResponse = _reflection.GeneratedProtocolMessageType('ListPomeriumSessionsResponse', (_message.Message,), {
'DESCRIPTOR' : _LISTPOMERIUMSESSIONSRESPONSE,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.ListPomeriumSessionsResponse)
})
_sym_db.RegisterMessage(ListPomeriumSessionsResponse)
ImpersonateRequest = _reflection.GeneratedProtocolMessageType('ImpersonateRequest', (_message.Message,), {
'DESCRIPTOR' : _IMPERSONATEREQUEST,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.ImpersonateRequest)
})
_sym_db.RegisterMessage(ImpersonateRequest)
ImpersonateResponse = _reflection.GeneratedProtocolMessageType('ImpersonateResponse', (_message.Message,), {
'DESCRIPTOR' : _IMPERSONATERESPONSE,
'__module__' : 'users_pb2'
# @@protoc_insertion_point(class_scope:pomerium.dashboard.ImpersonateResponse)
})
_sym_db.RegisterMessage(ImpersonateResponse)
DESCRIPTOR._options = None
_USERINFO_NAMESPACEROLESENTRY._options = None
_POMERIUMSESSION_CLAIMSENTRY._options = None
_USERSERVICE = _descriptor.ServiceDescriptor(
name='UserService',
full_name='pomerium.dashboard.UserService',
file=DESCRIPTOR,
index=0,
serialized_options=None,
create_key=_descriptor._internal_create_key,
serialized_start=2954,
serialized_end=3252,
methods=[
_descriptor.MethodDescriptor(
name='GetUserInfo',
full_name='pomerium.dashboard.UserService.GetUserInfo',
index=0,
containing_service=None,
input_type=_GETUSERINFOREQUEST,
output_type=_GETUSERINFORESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='QueryGroups',
full_name='pomerium.dashboard.UserService.QueryGroups',
index=1,
containing_service=None,
input_type=_QUERYGROUPSREQUEST,
output_type=_QUERYGROUPSRESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='QueryUsers',
full_name='pomerium.dashboard.UserService.QueryUsers',
index=2,
containing_service=None,
input_type=_QUERYUSERSREQUEST,
output_type=_QUERYUSERSRESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
])
_sym_db.RegisterServiceDescriptor(_USERSERVICE)
DESCRIPTOR.services_by_name['UserService'] = _USERSERVICE
_POMERIUMSERVICEACCOUNTSERVICE = _descriptor.ServiceDescriptor(
name='PomeriumServiceAccountService',
full_name='pomerium.dashboard.PomeriumServiceAccountService',
file=DESCRIPTOR,
index=1,
serialized_options=None,
create_key=_descriptor._internal_create_key,
serialized_start=3255,
serialized_end=3857,
methods=[
_descriptor.MethodDescriptor(
name='AddPomeriumServiceAccount',
full_name='pomerium.dashboard.PomeriumServiceAccountService.AddPomeriumServiceAccount',
index=0,
containing_service=None,
input_type=_ADDPOMERIUMSERVICEACCOUNTREQUEST,
output_type=_ADDPOMERIUMSERVICEACCOUNTRESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='DeletePomeriumServiceAccount',
full_name='pomerium.dashboard.PomeriumServiceAccountService.DeletePomeriumServiceAccount',
index=1,
containing_service=None,
input_type=_DELETEPOMERIUMSERVICEACCOUNTREQUEST,
output_type=_DELETEPOMERIUMSERVICEACCOUNTRESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='GetPomeriumServiceAccount',
full_name='pomerium.dashboard.PomeriumServiceAccountService.GetPomeriumServiceAccount',
index=2,
containing_service=None,
input_type=_GETPOMERIUMSERVICEACCOUNTREQUEST,
output_type=_GETPOMERIUMSERVICEACCOUNTRESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='ListPomeriumServiceAccounts',
full_name='pomerium.dashboard.PomeriumServiceAccountService.ListPomeriumServiceAccounts',
index=3,
containing_service=None,
input_type=_LISTPOMERIUMSERVICEACCOUNTSREQUEST,
output_type=_LISTPOMERIUMSERVICEACCOUNTSRESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
])
_sym_db.RegisterServiceDescriptor(_POMERIUMSERVICEACCOUNTSERVICE)
DESCRIPTOR.services_by_name['PomeriumServiceAccountService'] = _POMERIUMSERVICEACCOUNTSERVICE
_POMERIUMSESSIONSERVICE = _descriptor.ServiceDescriptor(
name='PomeriumSessionService',
full_name='pomerium.dashboard.PomeriumSessionService',
file=DESCRIPTOR,
index=2,
serialized_options=None,
create_key=_descriptor._internal_create_key,
serialized_start=3860,
serialized_end=4346,
methods=[
_descriptor.MethodDescriptor(
name='DeletePomeriumSession',
full_name='pomerium.dashboard.PomeriumSessionService.DeletePomeriumSession',
index=0,
containing_service=None,
input_type=_DELETEPOMERIUMSESSIONREQUEST,
output_type=_DELETEPOMERIUMSESSIONRESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='GetPomeriumSession',
full_name='pomerium.dashboard.PomeriumSessionService.GetPomeriumSession',
index=1,
containing_service=None,
input_type=_GETPOMERIUMSESSIONREQUEST,
output_type=_GETPOMERIUMSESSIONRESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='Impersonate',
full_name='pomerium.dashboard.PomeriumSessionService.Impersonate',
index=2,
containing_service=None,
input_type=_IMPERSONATEREQUEST,
output_type=_IMPERSONATERESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
_descriptor.MethodDescriptor(
name='ListPomeriumSessions',
full_name='pomerium.dashboard.PomeriumSessionService.ListPomeriumSessions',
index=3,
containing_service=None,
input_type=_LISTPOMERIUMSESSIONSREQUEST,
output_type=_LISTPOMERIUMSESSIONSRESPONSE,
serialized_options=None,
create_key=_descriptor._internal_create_key,
),
])
_sym_db.RegisterServiceDescriptor(_POMERIUMSESSIONSERVICE)
DESCRIPTOR.services_by_name['PomeriumSessionService'] = _POMERIUMSESSIONSERVICE
# @@protoc_insertion_point(module_scope)
| 42.975253 | 6,495 | 0.778471 | 8,465 | 76,410 | 6.681985 | 0.04808 | 0.034652 | 0.05744 | 0.058236 | 0.693368 | 0.615809 | 0.580415 | 0.565617 | 0.555628 | 0.551049 | 0 | 0.028241 | 0.107918 | 76,410 | 1,777 | 6,496 | 42.999437 | 0.801567 | 0.035912 | 0 | 0.677083 | 1 | 0.002451 | 0.203301 | 0.160883 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.003676 | 0 | 0.003676 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
1a0b60342365dfb5d7137cd8463e182aeaeff08e | 8,063 | py | Python | src/oci/database_management/models/sql_tuning_advisor_task_summary_finding_counts.py | ezequielramos/oci-python-sdk | cc4235cf217beaf9feed75760e9ce82610222762 | [
"Apache-2.0",
"BSD-3-Clause"
] | 3 | 2020-09-10T22:09:45.000Z | 2021-12-24T17:00:07.000Z | src/oci/database_management/models/sql_tuning_advisor_task_summary_finding_counts.py | ezequielramos/oci-python-sdk | cc4235cf217beaf9feed75760e9ce82610222762 | [
"Apache-2.0",
"BSD-3-Clause"
] | null | null | null | src/oci/database_management/models/sql_tuning_advisor_task_summary_finding_counts.py | ezequielramos/oci-python-sdk | cc4235cf217beaf9feed75760e9ce82610222762 | [
"Apache-2.0",
"BSD-3-Clause"
] | null | null | null | # coding: utf-8
# Copyright (c) 2016, 2021, Oracle and/or its affiliates. All rights reserved.
# This software is dual-licensed to you under the Universal Permissive License (UPL) 1.0 as shown at https://oss.oracle.com/licenses/upl or Apache License 2.0 as shown at http://www.apache.org/licenses/LICENSE-2.0. You may choose either license.
from oci.util import formatted_flat_dict, NONE_SENTINEL, value_allowed_none_or_none_sentinel # noqa: F401
from oci.decorators import init_model_state_from_kwargs
@init_model_state_from_kwargs
class SqlTuningAdvisorTaskSummaryFindingCounts(object):
"""
The finding counts data for the SQL Tuning Advisor summary report.
"""
def __init__(self, **kwargs):
"""
Initializes a new SqlTuningAdvisorTaskSummaryFindingCounts object with values from keyword arguments.
The following keyword arguments are supported (corresponding to the getters/setters of this class):
:param recommended_sql_profile:
The value to assign to the recommended_sql_profile property of this SqlTuningAdvisorTaskSummaryFindingCounts.
:type recommended_sql_profile: int
:param implemented_sql_profile:
The value to assign to the implemented_sql_profile property of this SqlTuningAdvisorTaskSummaryFindingCounts.
:type implemented_sql_profile: int
:param index:
The value to assign to the index property of this SqlTuningAdvisorTaskSummaryFindingCounts.
:type index: int
:param restructure:
The value to assign to the restructure property of this SqlTuningAdvisorTaskSummaryFindingCounts.
:type restructure: int
:param statistics:
The value to assign to the statistics property of this SqlTuningAdvisorTaskSummaryFindingCounts.
:type statistics: int
:param alternate_plan:
The value to assign to the alternate_plan property of this SqlTuningAdvisorTaskSummaryFindingCounts.
:type alternate_plan: int
"""
self.swagger_types = {
'recommended_sql_profile': 'int',
'implemented_sql_profile': 'int',
'index': 'int',
'restructure': 'int',
'statistics': 'int',
'alternate_plan': 'int'
}
self.attribute_map = {
'recommended_sql_profile': 'recommendedSqlProfile',
'implemented_sql_profile': 'implementedSqlProfile',
'index': 'index',
'restructure': 'restructure',
'statistics': 'statistics',
'alternate_plan': 'alternatePlan'
}
self._recommended_sql_profile = None
self._implemented_sql_profile = None
self._index = None
self._restructure = None
self._statistics = None
self._alternate_plan = None
@property
def recommended_sql_profile(self):
"""
**[Required]** Gets the recommended_sql_profile of this SqlTuningAdvisorTaskSummaryFindingCounts.
The count of distinct SQL statements with recommended SQL profiles.
:return: The recommended_sql_profile of this SqlTuningAdvisorTaskSummaryFindingCounts.
:rtype: int
"""
return self._recommended_sql_profile
@recommended_sql_profile.setter
def recommended_sql_profile(self, recommended_sql_profile):
"""
Sets the recommended_sql_profile of this SqlTuningAdvisorTaskSummaryFindingCounts.
The count of distinct SQL statements with recommended SQL profiles.
:param recommended_sql_profile: The recommended_sql_profile of this SqlTuningAdvisorTaskSummaryFindingCounts.
:type: int
"""
self._recommended_sql_profile = recommended_sql_profile
@property
def implemented_sql_profile(self):
"""
**[Required]** Gets the implemented_sql_profile of this SqlTuningAdvisorTaskSummaryFindingCounts.
The count of distinct SQL statements with implemented SQL profiles.
:return: The implemented_sql_profile of this SqlTuningAdvisorTaskSummaryFindingCounts.
:rtype: int
"""
return self._implemented_sql_profile
@implemented_sql_profile.setter
def implemented_sql_profile(self, implemented_sql_profile):
"""
Sets the implemented_sql_profile of this SqlTuningAdvisorTaskSummaryFindingCounts.
The count of distinct SQL statements with implemented SQL profiles.
:param implemented_sql_profile: The implemented_sql_profile of this SqlTuningAdvisorTaskSummaryFindingCounts.
:type: int
"""
self._implemented_sql_profile = implemented_sql_profile
@property
def index(self):
"""
**[Required]** Gets the index of this SqlTuningAdvisorTaskSummaryFindingCounts.
The count of distinct SQL statements with index recommendations.
:return: The index of this SqlTuningAdvisorTaskSummaryFindingCounts.
:rtype: int
"""
return self._index
@index.setter
def index(self, index):
"""
Sets the index of this SqlTuningAdvisorTaskSummaryFindingCounts.
The count of distinct SQL statements with index recommendations.
:param index: The index of this SqlTuningAdvisorTaskSummaryFindingCounts.
:type: int
"""
self._index = index
@property
def restructure(self):
"""
**[Required]** Gets the restructure of this SqlTuningAdvisorTaskSummaryFindingCounts.
The count of distinct SQL statements with restructure SQL recommendations.
:return: The restructure of this SqlTuningAdvisorTaskSummaryFindingCounts.
:rtype: int
"""
return self._restructure
@restructure.setter
def restructure(self, restructure):
"""
Sets the restructure of this SqlTuningAdvisorTaskSummaryFindingCounts.
The count of distinct SQL statements with restructure SQL recommendations.
:param restructure: The restructure of this SqlTuningAdvisorTaskSummaryFindingCounts.
:type: int
"""
self._restructure = restructure
@property
def statistics(self):
"""
**[Required]** Gets the statistics of this SqlTuningAdvisorTaskSummaryFindingCounts.
The count of distinct SQL statements with stale/missing optimizer statistics recommendations.
:return: The statistics of this SqlTuningAdvisorTaskSummaryFindingCounts.
:rtype: int
"""
return self._statistics
@statistics.setter
def statistics(self, statistics):
"""
Sets the statistics of this SqlTuningAdvisorTaskSummaryFindingCounts.
The count of distinct SQL statements with stale/missing optimizer statistics recommendations.
:param statistics: The statistics of this SqlTuningAdvisorTaskSummaryFindingCounts.
:type: int
"""
self._statistics = statistics
@property
def alternate_plan(self):
"""
**[Required]** Gets the alternate_plan of this SqlTuningAdvisorTaskSummaryFindingCounts.
The count of distinct SQL statements with alternative plan recommendations.
:return: The alternate_plan of this SqlTuningAdvisorTaskSummaryFindingCounts.
:rtype: int
"""
return self._alternate_plan
@alternate_plan.setter
def alternate_plan(self, alternate_plan):
"""
Sets the alternate_plan of this SqlTuningAdvisorTaskSummaryFindingCounts.
The count of distinct SQL statements with alternative plan recommendations.
:param alternate_plan: The alternate_plan of this SqlTuningAdvisorTaskSummaryFindingCounts.
:type: int
"""
self._alternate_plan = alternate_plan
def __repr__(self):
return formatted_flat_dict(self)
def __eq__(self, other):
if other is None:
return False
return self.__dict__ == other.__dict__
def __ne__(self, other):
return not self == other
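The class above follows the standard OCI SDK model pattern: `swagger_types` and `attribute_map` drive (de)serialization, while the `@init_model_state_from_kwargs` decorator populates the private attributes from keyword arguments through the property setters. A minimal, self-contained sketch of that decorator pattern (illustrative names only, not the real SDK internals):

```python
def init_from_kwargs(cls):
    """Wrap __init__ so supported kwargs are assigned via the property setters."""
    original_init = cls.__init__

    def __init__(self, **kwargs):
        original_init(self)
        for name in self.swagger_types:
            if name in kwargs:
                setattr(self, name, kwargs[name])

    cls.__init__ = __init__
    return cls


@init_from_kwargs
class FindingCounts:
    def __init__(self, **kwargs):
        self.swagger_types = {'index': 'int', 'restructure': 'int'}
        self._index = None
        self._restructure = None

    @property
    def index(self):
        return self._index

    @index.setter
    def index(self, value):
        self._index = value

    @property
    def restructure(self):
        return self._restructure

    @restructure.setter
    def restructure(self, value):
        self._restructure = value


counts = FindingCounts(index=3, restructure=1)
```

Unsupplied keywords leave their attributes as `None`, matching the SDK's optional-field behavior.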
| 35.676991 | 245 | 0.693538 | 796 | 8,063 | 6.834171 | 0.167085 | 0.066176 | 0.253676 | 0.110294 | 0.63125 | 0.526471 | 0.453676 | 0.340441 | 0.304412 | 0.277206 | 0 | 0.002964 | 0.246806 | 8,063 | 225 | 246 | 35.835556 | 0.892804 | 0.572244 | 0 | 0.084507 | 0 | 0 | 0.100743 | 0.049814 | 0 | 0 | 0 | 0 | 0 | 1 | 0.225352 | false | 0 | 0.028169 | 0.028169 | 0.408451 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
1a150533d8cad7a2aba7a53cd1cb833f76eb2499 | 3,078 | py | Python | tests/lib/test_otping.py | reputage/py-didery | 2d54a9e39fb01a81d4d6f7814ca7a611a7418a47 | [
"Apache-2.0"
] | null | null | null | tests/lib/test_otping.py | reputage/py-didery | 2d54a9e39fb01a81d4d6f7814ca7a611a7418a47 | [
"Apache-2.0"
] | 15 | 2018-05-24T23:30:21.000Z | 2018-05-25T17:39:51.000Z | tests/lib/test_otping.py | reputage/py-didery | 2d54a9e39fb01a81d4d6f7814ca7a611a7418a47 | [
"Apache-2.0"
] | null | null | null | import pytest
try:
import simplejson as json
except ImportError:
import json
from ioflo.aio.http import Valet
# import didery.routing
from diderypy.lib import generating as gen
from diderypy.lib import otping as otp
vk, sk, did = gen.keyGen()
otpData = {
"id": did,
"blob": "AeYbsHot0pmdWAcgTo5sD8iAuSQAfnH5U6wiIGpVNJQQoYKBYrPPxAoIc1i5SHCIDS8KFFgf8i0tDq8XGizaCgo9yjuKHHNJZFi0QD9K"
"6Vpt6fP0XgXlj8z_4D-7s3CcYmuoWAh6NVtYaf_GWw_2sCrHBAA2mAEsml3thLmu50Dw"
}
url1, url2 = "http://localhost:8080/blob", "http://localhost:8000/blob"
urls = ["http://localhost:8080", "http://localhost:8000"]
def testPostOtpBlob():
result = otp.postOtpBlob(otpData, sk, urls)
assert result[url1].status == 201
assert result[url2].status == 201
def testPostOtpBlobNoUrls():
with pytest.raises(ValueError) as ex:
otp.postOtpBlob(otpData, sk, None)
def testPostOtpBlobEmptyUrls():
with pytest.raises(ValueError) as ex:
otp.postOtpBlob(otpData, sk, [])
def testPostOtpBlobNoSk():
with pytest.raises(ValueError) as ex:
otp.postOtpBlob(otpData, None, urls)
def testPostOtpBlobEmptySk():
with pytest.raises(ValueError) as ex:
otp.postOtpBlob(otpData, "", urls)
def testGetOtpBlob():
data, result = otp.getOtpBlob(did, urls)
assert data['otp_data'] == otpData
def testGetOtpBlobNoUrls():
with pytest.raises(ValueError) as ex:
otp.getOtpBlob(did, None)
def testGetOtpBlobEmptyUrls():
with pytest.raises(ValueError) as ex:
otp.getOtpBlob(did, [])
def testPutOtpBlob():
result = otp.putOtpBlob(otpData, sk, urls)
assert result[url1+"/"+otpData["id"]].status == 200
assert result[url1+"/"+otpData["id"]].response.body == otpData
assert result[url2+"/"+otpData["id"]].status == 200
assert result[url2+"/"+otpData["id"]].response.body == otpData
def testPutOtpBlobNoUrls():
with pytest.raises(ValueError) as ex:
otp.putOtpBlob(otpData, sk, None)
def testPutOtpBlobEmptyUrls():
with pytest.raises(ValueError) as ex:
otp.putOtpBlob(otpData, sk, [])
def testPutOtpBlobNoSk():
with pytest.raises(ValueError) as ex:
otp.putOtpBlob(otpData, None, urls)
def testPutOtpBlobEmptySk():
with pytest.raises(ValueError) as ex:
otp.putOtpBlob(otpData, "", urls)
def testRemoveOtpBlob():
result = otp.removeOtpBlob(did, sk, urls)
assert result[url1+"/"+did].status == 200
assert result[url1+"/"+did].response.body == otpData
assert result[url2+"/"+did].status == 200
assert result[url2+"/"+did].response.body == otpData
def testRemoveOtpBlobNoUrls():
with pytest.raises(ValueError) as ex:
otp.removeOtpBlob(did, sk, None)
def testRemoveOtpBlobEmptyUrls():
with pytest.raises(ValueError) as ex:
otp.removeOtpBlob(did, sk, [])
def testRemoveOtpBlobNoSk():
with pytest.raises(ValueError) as ex:
otp.removeOtpBlob(did, None, urls)
def testRemoveOtpBlobEmptySk():
with pytest.raises(ValueError) as ex:
otp.removeOtpBlob(did, "", urls)
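Every negative test above follows one pattern: call the API with a missing or empty `urls`/`sk` argument and expect a `ValueError` via `pytest.raises`. A stdlib-only sketch of that assertion style (hedged; `postOtpBlob` here is a stand-in for illustration, not the real diderypy function):

```python
def postOtpBlob(data, sk, urls):
    # Stand-in validation mirroring what the tests above expect.
    if not sk:
        raise ValueError("signing key is required")
    if not urls:
        raise ValueError("at least one url is required")
    return "posted"


def assert_raises(exc_type, func, *args):
    """Minimal equivalent of pytest.raises for a single call."""
    try:
        func(*args)
    except exc_type:
        return True
    raise AssertionError("%s not raised" % exc_type.__name__)


ok = assert_raises(ValueError, postOtpBlob,
                   {"id": "did:dad:..."}, None, ["http://localhost:8080"])
```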
| 24.624 | 118 | 0.692982 | 342 | 3,078 | 6.225146 | 0.222222 | 0.065759 | 0.105214 | 0.170972 | 0.47675 | 0.418976 | 0.330672 | 0.330672 | 0.330672 | 0.146548 | 0 | 0.028888 | 0.179012 | 3,078 | 124 | 119 | 24.822581 | 0.813613 | 0.006823 | 0 | 0.181818 | 0 | 0 | 0.09689 | 0.056301 | 0 | 0 | 0 | 0 | 0.142857 | 1 | 0.233766 | false | 0 | 0.090909 | 0 | 0.324675 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
1a209ab2fb009b89d259657281d619b4962c46e2 | 64 | py | Python | code/sample_1-2-16.py | KoyanagiHitoshi/AtCoder-Python-Introduction | 6d014e333a873f545b4d32d438e57cf428b10b96 | [
"MIT"
] | 1 | 2022-03-29T13:50:12.000Z | 2022-03-29T13:50:12.000Z | code/sample_1-2-16.py | KoyanagiHitoshi/AtCoder-Python-Introduction | 6d014e333a873f545b4d32d438e57cf428b10b96 | [
"MIT"
] | null | null | null | code/sample_1-2-16.py | KoyanagiHitoshi/AtCoder-Python-Introduction | 6d014e333a873f545b4d32d438e57cf428b10b96 | [
"MIT"
] | null | null | null | rows = int(input())
x = [input() for i in range(rows)]
print(x)
| 16 | 34 | 0.609375 | 12 | 64 | 3.25 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.171875 | 64 | 3 | 35 | 21.333333 | 0.735849 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
a7e08ada30043433441727fc2f8b4036acae9399 | 5,023 | py | Python | src/GUI/menuTypes.py | Vidhu007/Cloud-Encryption | ec9ccd76a71e98740d937b34a7734f821448fae0 | [
"MIT"
] | 7 | 2021-05-10T13:30:51.000Z | 2022-03-20T17:49:59.000Z | src/GUI/menuTypes.py | Vidhu007/Cloud-Encryption | ec9ccd76a71e98740d937b34a7734f821448fae0 | [
"MIT"
] | null | null | null | src/GUI/menuTypes.py | Vidhu007/Cloud-Encryption | ec9ccd76a71e98740d937b34a7734f821448fae0 | [
"MIT"
] | 8 | 2019-04-05T10:40:49.000Z | 2022-03-20T06:00:43.000Z | import users
import sys
import encryption
import googleDriveAPI
import getpass
u= users
e= encryption
g= googleDriveAPI
#Function to generate menu for privileged (admin) user
def privilegeMenu():
try:
while True:
#Menu system used to navigate the management console
user_In = input("|U - Upload file | D - Download file | S - User settings | L - Log out | E - Exit|\n").lower()
if(user_In == "s"):
while True:
#Cases for creating or deleting users and creating a new key
user_In = input("|A - Add User | D - Delete User | N - Generate New Key | E - Exit to menu|\n").lower()
if (user_In == "a"):
createLogin = input("Create login name: ")
createPassword = getpass.getpass("Create password: ")
u.newUser(createLogin,createPassword)
elif(user_In == "d"):
userName = input("\nEnter username you wish to delete: ")
u.deleteUser(userName)
                        elif(user_In == "n"):
e.keyGen()
break
elif(user_In == "e"):
break
else:
print("Wrong input try again\n")
elif(user_In == "u"):
while True:
#Upload file screen
print("\nEnsure that files you wish to upload are in the 'Files' folder.\nEnter m to return to main menu.\n")
user_In = input("Enter file name: ")
if(user_In == "m"):
break
else:
e.encrypt(user_In, e.keyRead())
g.uploadFile(user_In)
elif(user_In == "e"):
print("Exiting.")
sys.exit()
elif(user_In == "l"):
#Logging out of admin account and setting admin bool to false
u.privilege = False
print("Logging out.")
break
elif(user_In == "d"):
#Download file screen
while True:
user_In = input("\n|S - Search for file | D - Download File | E - Exit to menu|\n").lower()
if(user_In == "s"):
#Incase you forgot the file name you can double check here
user_In = input("Enter file name: ")
g.searchFile(user_In)
elif(user_In == "d"):
#download file via file name
user_In = input("Enter file name: ")
fileID= g.fileID(user_In)
g.downloadFile(fileID, user_In)
e.decrypt(user_In, e.keyRead())
elif(user_In == "e"):
break
else:
print("Wrong input try again\n")
else:
print("Wrong input try again\n")
except Exception:
pass
#Function which generates menu for standard non-privileged users
def standardMenu():
try:
while True:
#Same as above minus the settings option
user_In = input("|U - Upload file | D - Download file | L - Log Out | E - Exit|\n").lower()
if(user_In == "u"):
while True:
print("\nEnsure that files you wish to upload are in the 'Files' folder.\nEnter m to return to main menu.\n")
user_In = input("Enter file name: ")
if(user_In == "m"):
break
else:
e.encrypt(user_In, e.keyRead())
g.uploadFile(user_In)
elif(user_In == "e"):
print("Exiting.")
sys.exit()
elif(user_In == "l"):
print("Logging out.")
break
elif(user_In == "d"):
while True:
user_In = input("\n|S - Search for file | D - Download File | E - Exit to menu|\n").lower()
if(user_In == "s"):
user_In = input("Enter file name: ")
g.searchFile(user_In)
elif(user_In == "d"):
user_In = input("Enter file name: ")
fileID= g.fileID(user_In)
g.downloadFile(fileID, user_In)
e.decrypt(user_In, e.keyRead())
elif(user_In == "e"):
break
else:
print("Wrong input try again\n")
else:
print("Wrong input try again\n")
except Exception:
pass | 38.937984 | 129 | 0.428827 | 515 | 5,023 | 4.097087 | 0.229126 | 0.125118 | 0.066351 | 0.045498 | 0.627488 | 0.615166 | 0.603318 | 0.603318 | 0.573934 | 0.52891 | 0 | 0 | 0.476807 | 5,023 | 129 | 130 | 38.937985 | 0.802892 | 0.088991 | 0 | 0.772277 | 1 | 0.069307 | 0.197723 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.019802 | false | 0.039604 | 0.039604 | 0 | 0.059406 | 0.108911 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c503e7668b1bca9c8fa3e9b2fad69b66aea6dd54 | 660 | py | Python | tests/test_path.py | Infinidat/infi.gevent-utils | 7aef923fb19c2ea7abfe9f8341d2dfcb7b7eebdd | [
"BSD-3-Clause"
] | null | null | null | tests/test_path.py | Infinidat/infi.gevent-utils | 7aef923fb19c2ea7abfe9f8341d2dfcb7b7eebdd | [
"BSD-3-Clause"
] | null | null | null | tests/test_path.py | Infinidat/infi.gevent-utils | 7aef923fb19c2ea7abfe9f8341d2dfcb7b7eebdd | [
"BSD-3-Clause"
] | null | null | null | from __future__ import absolute_import
from infi.gevent_utils.os import path
import sys
import os
sys.path.append(os.path.dirname(__file__))
from utils import GreenletCalledValidatorTestCase
class PathTestCase(GreenletCalledValidatorTestCase):
def test_exists(self):
self.switch_validator.assert_called(0)
self.assertFalse(path.exists("/this_path_probably_doesnt_exist_or_else_the_test_will_fail"))
self.switch_validator.assert_called(1)
def test_basename(self):
self.switch_validator.assert_called(0)
self.assertEqual("a.text", path.basename("/a/b/c/a.text"))
self.switch_validator.assert_called(0)
| 33 | 100 | 0.771212 | 88 | 660 | 5.443182 | 0.477273 | 0.083507 | 0.158664 | 0.208768 | 0.298539 | 0.23382 | 0.167015 | 0.167015 | 0 | 0 | 0 | 0.007018 | 0.136364 | 660 | 19 | 101 | 34.736842 | 0.833333 | 0 | 0 | 0.2 | 0 | 0 | 0.118182 | 0.089394 | 0 | 0 | 0 | 0 | 0.4 | 1 | 0.133333 | false | 0 | 0.333333 | 0 | 0.533333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
c509490417a8c93598f13380d18986bf96b33fd7 | 200 | py | Python | feedbacks/urls.py | mpyatishev/djfeedback | fc1ebf0646d4449371ed80560db7cbb3f7996156 | [
"MIT"
] | null | null | null | feedbacks/urls.py | mpyatishev/djfeedback | fc1ebf0646d4449371ed80560db7cbb3f7996156 | [
"MIT"
] | null | null | null | feedbacks/urls.py | mpyatishev/djfeedback | fc1ebf0646d4449371ed80560db7cbb3f7996156 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from django.conf.urls import patterns, url
from . import views
urlpatterns = patterns(
'',
url(r'feedback$', views.FeedbackView.as_view(), name='feedback-post')
)
| 15.384615 | 73 | 0.655 | 25 | 200 | 5.2 | 0.76 | 0.169231 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006061 | 0.175 | 200 | 12 | 74 | 16.666667 | 0.781818 | 0.105 | 0 | 0 | 0 | 0 | 0.124294 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
c51b5610a93a01c7edaae445a44f41f8aa36b738 | 626 | py | Python | get_proc_users.py | dangtrinhnt/gem | bc53cf19d3541542e4c14c24b5fb186432e91c45 | [
"Apache-2.0"
] | null | null | null | get_proc_users.py | dangtrinhnt/gem | bc53cf19d3541542e4c14c24b5fb186432e91c45 | [
"Apache-2.0"
] | 44 | 2019-11-18T20:15:35.000Z | 2021-07-27T20:26:38.000Z | get_proc_users.py | dangtrinhnt/gem | bc53cf19d3541542e4c14c24b5fb186432e91c45 | [
"Apache-2.0"
] | null | null | null | #! /usr/bin/env python
import sys
from commons import *
def print_proc_users(csv_path, condition_number):
csv_dat = get_dict_data_from_csv_file(csv_path)
if csv_dat:
print "Processing user with condition %s\n" % condition_number
for email in csv_dat:
num = str_to_num(email['src']) % 10
if num in condition_number or condition_number[0]==-1:
print "%s to %s\n" % (email['src'], email['dest'])
if __name__ == "__main__":
csv_path = sys.argv[1]
if sys.argv[2] == 'all':
condition_number = [-1]
else:
        condition_number = list(map(int, sys.argv[2].split(',')))
print_proc_users(csv_path, condition_number)
| 24.076923 | 64 | 0.699681 | 102 | 626 | 3.970588 | 0.470588 | 0.259259 | 0.069136 | 0.083951 | 0.177778 | 0.177778 | 0.177778 | 0 | 0 | 0 | 0 | 0.01518 | 0.158147 | 626 | 25 | 65 | 25.04 | 0.753321 | 0.033546 | 0 | 0 | 0 | 0 | 0.110927 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.117647 | null | null | 0.235294 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c51deae5ca13775e3c2ef4b77c14c0ca5e33d193 | 1,184 | py | Python | 01-Lesson-Plans/03-Python-Pandas/1/Activities/12-functions-02/Unsolved/functions-02.py | tatianegercina/FinTech | b40687aa362d78674e223eb15ecf14bc59f90b62 | [
"ADSL"
] | 1 | 2021-04-13T07:14:34.000Z | 2021-04-13T07:14:34.000Z | 01-Lesson-Plans/03-Python-Pandas/1/Activities/12-functions-02/Unsolved/functions-02.py | tatianegercina/FinTech | b40687aa362d78674e223eb15ecf14bc59f90b62 | [
"ADSL"
] | 2 | 2021-06-02T03:14:19.000Z | 2022-02-11T23:21:24.000Z | 01-Lesson-Plans/03-Python-Pandas/1/Activities/12-functions-02/Unsolved/functions-02.py | tatianegercina/FinTech | b40687aa362d78674e223eb15ecf14bc59f90b62 | [
"ADSL"
] | 1 | 2021-05-07T13:26:50.000Z | 2021-05-07T13:26:50.000Z | # Define a function "warble" that takes in a string as an argument, adds " arglebargle" to the end of it, and returns the result.
# Print the result of calling your "warble" function with the argument "hello".
# Define a function "wibble" that takes a string as an argument, prints the argument, prepends "wibbly " to the argument, and returns the result
# Print the result of calling your "wibble" function with the argument "bibbly"
# Define a function "print_sum" that takes in two numbers as arguments and prints the sum of those two numbers.
# Define a function "return_sum" that takes in two numbers as arguments and returns the sum of those two numbers
# Using either "return_sum" and no mathematical operators, define a function "triple_sum" that takes in 3 arguments and returns the sum of those 3 numbers.
# Define a function "dance_party" that takes in a string as an argument, that prints "dance!", updates the string from calling "wibble" function with that argument, updates the string from calling "warble" function with that argument, returns the updated string
# Print the result of calling your "dance_party" function with your name as the argument
| 43.851852 | 261 | 0.771115 | 191 | 1,184 | 4.748691 | 0.26178 | 0.046307 | 0.099228 | 0.036384 | 0.455347 | 0.374862 | 0.30871 | 0.251378 | 0.185226 | 0.101433 | 0 | 0.00206 | 0.179899 | 1,184 | 26 | 262 | 45.538462 | 0.932029 | 0.969595 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c51e6be205213ab9c3f0f822b11808c56b8e2982 | 1,003 | py | Python | Section 6 - Modular Programming/Green eggs and ham v4.py | gitjot/python-for-lccs | a8a4ae8847abbc33361f80183c06d57b20523382 | [
"CC0-1.0"
] | 10 | 2020-02-14T14:28:15.000Z | 2022-02-02T18:44:11.000Z | Section 6 - Modular Programming/Green eggs and ham v4.py | gitjot/python-for-lccs | a8a4ae8847abbc33361f80183c06d57b20523382 | [
"CC0-1.0"
] | null | null | null | Section 6 - Modular Programming/Green eggs and ham v4.py | gitjot/python-for-lccs | a8a4ae8847abbc33361f80183c06d57b20523382 | [
"CC0-1.0"
] | 8 | 2020-03-25T09:27:42.000Z | 2021-11-03T15:24:38.000Z | # Event: LCCS Python Fundamental Skills Workshop
# Date: Dec 2018
# Author: Joe English, PDST
# eMail: computerscience@pdst.ie
# Purpose: To find (and fix) two syntax errors
# A program to display Green Eggs and Ham (v4)
def showChorus():
print()
print("I do not like green eggs and ham.")
print("I do not like them Sam-I-am.")
print()
def showVerse1():
print("I do not like them here or there.")
print("I do not like them anywhere.")
print("I do not like them in a house")
print("I do not like them with a mouse")
def displayVerse2():
print("I do not like them in a box")
print("I do not like them with a fox")
print("I will not eat them in the rain.")
print("I will not eat them on a train")
# Program execution starts here
showChorus()
showVerse1() # SYNTAX ERROR 1 fixed - 'displayVerse1' does not exist; the defined function is 'showVerse1'
showChorus()
displayVerse2() # SYNTAX ERROR 2 fixed - 'showVerse2' does not exist; the defined function is 'displayVerse2'
showChorus()
| 30.393939 | 75 | 0.658026 | 155 | 1,003 | 4.258065 | 0.445161 | 0.090909 | 0.09697 | 0.133333 | 0.309091 | 0.286364 | 0.139394 | 0.139394 | 0 | 0 | 0 | 0.017173 | 0.245264 | 1,003 | 32 | 76 | 31.34375 | 0.85469 | 0.347956 | 0 | 0.25 | 0 | 0 | 0.49183 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.15 | true | 0 | 0 | 0 | 0.15 | 0.6 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
c51f3d08b27846ef7d07616f6d207a8d88638159 | 1,316 | py | Python | flask_youku/__init__.py | xiaoyh121/program | 6826f024cce7a4250a1dab8dba145c1f0d713286 | [
"Apache-2.0"
] | 176 | 2016-12-11T03:24:41.000Z | 2021-12-10T11:44:37.000Z | flask_youku/__init__.py | xiaoyh121/program | 6826f024cce7a4250a1dab8dba145c1f0d713286 | [
"Apache-2.0"
] | 4 | 2018-02-07T03:31:13.000Z | 2021-12-25T13:03:49.000Z | flask_youku/__init__.py | xiaoyh121/program | 6826f024cce7a4250a1dab8dba145c1f0d713286 | [
"Apache-2.0"
] | 76 | 2016-11-13T08:57:38.000Z | 2021-12-25T12:02:05.000Z | from flask import Blueprint, Markup
from flask import render_template
class Youku(object):
"""Flask-Youku extents."""
def __init__(self, app=None, **kwargs):
"""Init Flask-Youku's instance via app object"""
if app:
self.init_app(app)
def init_app(self, app):
"""Init Flask-Youku's instance via app object"""
self.register_blueprint(app)
# Create the Jinja function `youku`
app.add_template_global(youku)
def register_blueprint(self, app):
"""Register the youku blueprint into app object."""
module = Blueprint(
'youku',
__name__,
template_folder='templates')
app.register_blueprint(module)
return module
class Video(object):
"""Receive the youku_id to rendering the video.html"""
def __init__(self, video_id, cls='youku'):
self.video_id = video_id
self.cls = cls
def render(self, *args, **kwargs):
return render_template(*args, **kwargs)
@property
def html(self):
"""Tag the HTML as security string."""
return Markup(
self.render('youku/video.html', video=self))
def youku(*args, **kwargs):
"""Define the Jinja function."""
video = Video(*args, **kwargs)
return video.html
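The `Markup(...)` wrapper above tells Jinja the rendered snippet is already safe HTML, so template autoescaping will not mangle the embedded player markup. A stdlib-only illustration of the difference between escaped and trusted output (hedged; Flask actually uses `markupsafe.Markup` for this):

```python
import html

snippet = '<div class="youku">video</div>'

escaped = html.escape(snippet)   # what autoescaping would emit for a plain string
trusted = snippet                # what wrapping in Markup() preserves as-is
```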
| 24.830189 | 59 | 0.609422 | 159 | 1,316 | 4.880503 | 0.301887 | 0.051546 | 0.03866 | 0.03866 | 0.090206 | 0.090206 | 0.090206 | 0.090206 | 0 | 0 | 0 | 0 | 0.271277 | 1,316 | 52 | 60 | 25.307692 | 0.809176 | 0.224924 | 0 | 0 | 0 | 0 | 0.035569 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.241379 | false | 0 | 0.068966 | 0.034483 | 0.517241 | 0.172414 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
c523effb8f36813f8d45730c0dbdd83679d7448e | 16,256 | py | Python | pyvmodule/expr.py | tanhongze/pyvmodule | b88cd35e57893024071306d238ce601341ce3bb4 | [
"MIT"
] | null | null | null | pyvmodule/expr.py | tanhongze/pyvmodule | b88cd35e57893024071306d238ce601341ce3bb4 | [
"MIT"
] | null | null | null | pyvmodule/expr.py | tanhongze/pyvmodule | b88cd35e57893024071306d238ce601341ce3bb4 | [
"MIT"
] | 1 | 2020-01-20T07:25:40.000Z | 2020-01-20T07:25:40.000Z | #-- coding:utf-8
from .ast import ASTNode
from .compute.value import expr_value_calc_funcs,expr_value_prop_funcs
from .compute.width import expr_width_calc_funcs,expr_width_fix_funcs
from .compute.width import expr_match_width,expr_calc_width
from .tools.utility import count_one
import warnings
__all__ = ['Mux','Concatenate','Expr','wrap_expr',
'BinaryOperator',
'ConstExpr','Hexadecimal','Decimal','Octal','Binary']
def wrap_expr(expr):
if isinstance(expr,Expr):return expr
elif isinstance(expr,int):return Hexadecimal(expr)
else:raise TypeError('Cannot convert "%s" object into "Expr".'%type(expr))
def propagated(func):
def propagated_func(*args):
res = func(*args)
return res if res is None else res._prop_value()
return propagated_func
class Expr(ASTNode):
    max_cols_per_line = 120
    @classmethod
    def _need_split_line(cls,codes):
        for lines in codes:
            if len(lines)>1:return True
        length = 0
        for lines in codes:
            for line in lines:
                for word in line:length+=len(word)
        return length>cls.max_cols_per_line
    def _generate(self,indent=0,p_precedence=99):return self._expr_generate_funcs[self.typename](self,indent,p_precedence)
    def _calc_width(self):self._width_calc_func(self)
    def _fix_width(self,expr):return self._width_fix_func(self,expr)
    def _prop_value(self):return self._value_prop_func(self)
    @property
    def typename(self):return self._typename
    @typename.setter
    def typename(self,typename):
        assert isinstance(typename,str)
        self._typename = typename
        self._width_calc_func = expr_width_calc_funcs[typename]
        self._width_fix_func = expr_width_fix_funcs [typename]
        self._value_calc_func = expr_value_calc_funcs[typename]
        self._value_prop_func = expr_value_prop_funcs[typename]
    @property
    def lhs(self):return self.childs[1]
    @lhs.setter
    def lhs(self,subexpr):self.childs[1] = wrap_expr(subexpr)
    @property
    def rhs(self):return self.childs[0]
    @rhs.setter
    def rhs(self,subexpr):self.childs[0] = wrap_expr(subexpr)
    @property
    def cond(self):return self.childs[2]
    @cond.setter
    def cond(self,cond):
        cond = wrap_expr(cond)
        cond._fix_width(1)
        self.childs[2] = cond
    def __init__(self):raise NotImplementedError()
    def __int__(self):return self._value_calc_func(self)
    def __len__(self):
        if self.width is None:raise ValueError('Getting width of width-free expr "%s".'%str(self))
        if self.width <= 0 :raise ValueError('Found non-positive width in "%s".'%str(self))
        return self.width
    @property
    def _is_constant(self):return False
    @property
    def length(self):return 1
    def _wrap_constant(self,value):return Hexadecimal(value,width=self.width)
    @staticmethod
    def _hex_value(*args,**kwargs):return Hexadecimal(*args,**kwargs)
    @staticmethod
    def _is_constant_value(expr,value):return isinstance(expr,(int,ConstExpr)) and int(expr)==value
    @staticmethod
    def _is_expr_typename(obj,typename):return isinstance(obj,Expr) and obj._typename == typename
    @propagated
    def __mul__(self,rhs):return self if rhs is None else Concatenate(self,rhs)
    @propagated
    def __pos__(self):return self
    @propagated
    def __pow__(self,rhs):
        if rhs ==0:return None
        else:return Replicate(self,rhs)
    @propagated
    def __lt__ (self,rhs):return BinaryOperator('<',self,rhs)
    @propagated
    def __gt__ (self,rhs):return BinaryOperator('>',self,rhs)
    @propagated
    def __le__ (self,rhs):return BinaryOperator('<=',self,rhs)
    @propagated
    def __ge__ (self,rhs):return BinaryOperator('>=',self,rhs)
    @propagated
    def __add__(self,rhs):return AddOperator(self,rhs)
    @propagated
    def __sub__(self,rhs):return BinaryOperator('-',self,rhs)
    @propagated
    def __and__(self,rhs):return AndOperator(self,rhs)
    @propagated
    def __or__ (self,rhs):return OrOperator(self,rhs)
    @propagated
    def __xor__(self,rhs):return XorOperator(self,rhs)
    @propagated
    def __invert__(self):return UnaryOperator('~',self)
    @propagated
    def __neg__(self):return UnaryOperator(' -',self)
    @propagated
    def __lshift__(self,rhs):return BinaryOperator('<<',self,rhs)
    @propagated
    def __rshift__(self,rhs):return BinaryOperator('>>',self,rhs)
    @propagated
    def __floordiv__(self,rhs):return BinaryOperator('==',self,rhs)
    @propagated
    def validif(self,cond):return ValidIf(cond,self)
    @propagated
    def mux(self,lhs,rhs):return Mux(self,lhs,rhs)
    @propagated
    def multiply_operate(self,rhs):return MulOperator(self,rhs)
    @propagated
    def divide_operate(self,rhs):return DivOperator(self,rhs)
    @propagated
    def module_operate(self,rhs):return ModOperator(self,rhs)
    @propagated
    def equal_to(self,rhs):return BinaryOperator('==',self,rhs)
    @propagated
    def not_equal_to(self,rhs):return BinaryOperator('!=',self,rhs)
    @propagated
    def reduce_or(self):return UnaryOperator(' |',self)
    @propagated
    def reduce_and(self):return UnaryOperator(' &',self)
    @propagated
    def reduce_xor(self):return UnaryOperator(' ^',self)
    def __getitem__(self,key):raise SyntaxError('Invalid fetch "[%s]" from expr "%s".'%(str(key),str(self)))
    def __rpow__(self,lhs):return wrap_expr(lhs)**self
    def __rmul__(self,lhs):return self if lhs is None else wrap_expr(lhs)*self
    def __radd__(self,lhs):return wrap_expr(lhs)+self
    def __rsub__(self,lhs):return wrap_expr(lhs)-self
    def __rand__(self,lhs):return wrap_expr(lhs)&self
    def __ror__(self,lhs) :return wrap_expr(lhs)|self
    def __rxor__(self,lhs):return wrap_expr(lhs)^self
    def __rfloordiv__(self,lhs):return wrap_expr(lhs)//self
    @staticmethod
    def full_adder_c(a,b,c):
        return a&b|a&c|b&c
    @staticmethod
    def full_adder_s(a,b,c):
        return a^b^c
    def _set_default(self,typename,n_childs=0):
        self.comments = []
        self.typename = typename
        self.childs = [None]*ASTNode._expr_n_childs[typename]
        self.value = None
        self.width = None
    def _connect_port(self,m,p):
        if p.io!='input':raise KeyError('Assigning "%s" to %s port "%s"'%(str(self),p.io,str(p)))
        self._fix_width(p)
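`Expr.full_adder_c` and `Expr.full_adder_s` build the carry (majority) and sum (parity) expressions of a full adder out of the overloaded `&`, `|` and `^` operators. Because Python ints support the same operators, the two formulas can be sanity-checked on plain bits in a standalone sketch (this snippet is illustrative only and does not use the library classes):

```python
def full_adder_c(a, b, c):
    # carry-out: majority of the three input bits
    return a & b | a & c | b & c

def full_adder_s(a, b, c):
    # sum: parity (XOR) of the three input bits
    return a ^ b ^ c

# exhaustive check against ordinary integer addition of single bits
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            total = a + b + c
            assert full_adder_s(a, b, c) == total & 1        # low bit of the sum
            assert full_adder_c(a, b, c) == (total >> 1) & 1  # carry bit
```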
class UnaryOperator(Expr):
    def __init__(self,typename,rhs):
        self._set_default(typename)
        self.rhs = rhs
        self._calc_width()
class BinaryOperator(Expr):
    def __init__(self,typename,lhs,rhs):
        self._set_default(typename)
        self.rhs = rhs
        self.lhs = lhs
        self._calc_width()
class Mux(Expr):
    def __init__(self,cond,lhs,rhs):
        self._set_default('?:')
        self.rhs = rhs
        self.lhs = lhs
        self.cond = cond
        self._calc_width()
class MultilineAlignOperator(Expr):
    @property
    def _display_as_long(self):return self._display_as_long_val
    @_display_as_long.setter
    def _display_as_long(self,as_long):
        if not isinstance(as_long,bool):raise TypeError('Type of "long" should be bool')
        self._display_as_long_val = as_long
class AssociativeOperator(MultilineAlignOperator):
    def _merge_childs(self,other):
        other = wrap_expr(other)
        if other._typename==self._typename:
            self._display_as_long|=other._display_as_long
            self.childs.extend(other.childs)
        else:self.childs.append(other)
    def __init__(self,typename,lhs,rhs,long=False):
        self._display_as_long = long
        self._set_default(typename)
        lhs = wrap_expr(lhs)
        rhs = wrap_expr(rhs)
        expr_calc_width(lhs,rhs)
        self._merge_childs(lhs)
        self._merge_childs(rhs)
        self._calc_width()
class OrOperator(AssociativeOperator):
    def __init__(self,lhs,rhs,long=False):AssociativeOperator.__init__(self,'|',lhs,rhs,long)
    def __ior__(self,other):
        expr_calc_width(self,other)
        self._merge_childs(other)
        return self
class AndOperator(AssociativeOperator):
    def __init__(self,lhs,rhs,long=False):AssociativeOperator.__init__(self,'&',lhs,rhs,long)
    def __iand__(self,other):
        expr_calc_width(self,other)
        self._merge_childs(other)
        return self
class AddOperator(AssociativeOperator):
    def __init__(self,lhs,rhs,long=False):AssociativeOperator.__init__(self,'+',lhs,rhs,long)
    def __iadd__(self,other):
        expr_calc_width(self,other)
        self._merge_childs(other)
        return self
class XorOperator(AssociativeOperator):
    def __init__(self,lhs,rhs,long=False):AssociativeOperator.__init__(self,'^',lhs,rhs,long)
    def __ixor__(self,other):
        expr_calc_width(self,other)
        self._merge_childs(other)
        return self
def fix_slice(key,width):
    # In this DSL the slice's step field carries the bit-width, so whichever of
    # start/stop is omitted can be recovered from the other one plus the width.
    start = (0 if key.step is None else key.stop-key.step) if key.start is None else key.start
    stop = (width if key.step is None else key.start+key.step) if key.stop is None else key.stop
    width = stop - start
    return start,stop,width
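A quick standalone check of the slice normalization above, re-declared locally with the step field interpreted as a bit-width (which appears to be the convention intended by `fix_slice` and by `ConstExpr.__getitem__` below):

```python
def fix_slice(key, width):
    # step holds the bit-width; fill in whichever of start/stop is missing
    start = (0 if key.step is None else key.stop - key.step) if key.start is None else key.start
    stop = (width if key.step is None else key.start + key.step) if key.stop is None else key.stop
    return start, stop, stop - start

assert fix_slice(slice(4, 12, None), 16) == (4, 12, 8)        # start and stop given
assert fix_slice(slice(None, None, None), 16) == (0, 16, 16)  # whole vector
assert fix_slice(slice(4, None, 8), 16) == (4, 12, 8)         # start plus width
assert fix_slice(slice(None, 12, 8), 16) == (4, 12, 8)        # stop plus width
```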
class Concatenate(AssociativeOperator):
    def _extract_childs(self,args):
        for arg in args:
            if isinstance(arg,(tuple,list)):self._extract_childs(arg)
            else:self._merge_childs(arg)
    def __init__(self,*args,long=False):
        self._display_as_long = long
        self._set_default('{}')
        self._extract_childs(args)
        self._calc_width()
    def __setitem__(self,key,val):
        if not isinstance(key,slice):raise TypeError(type(key))
        expr_match_width(val,len(self))
        start,stop,width = fix_slice(key,len(self))
        base = 0
        for expr in self.childs:
            if stop < base or start > base + len(expr):continue
            if start > base:
                expr[start-base:] = val[:base-start+len(expr)]
            elif stop<base +len(expr):
                expr[:stop-base] = val[-(stop-base):]
            else:
                expr[:] = val[base-start::len(expr)]
            base += len(expr)
class ValidIf(BinaryOperator):
    def __init__(self,lhs,rhs):
        self._set_default('validif')
        self.rhs = rhs
        self.lhs = lhs
        self._calc_width()
class MulOperator(BinaryOperator):
    def __init__(self,lhs,rhs):
        self._set_default('*')
        self.rhs = rhs
        self.lhs = lhs
        self._calc_width()
class DivOperator(BinaryOperator):
    def __init__(self,lhs,rhs):
        self._set_default('/')
        self.rhs = rhs
        self.lhs = lhs
        self._calc_width()
class ModOperator(BinaryOperator):
    def __init__(self,lhs,rhs):
        self._set_default('%')
        self.rhs = rhs
        self.lhs = lhs
        self._calc_width()
class Replicate(UnaryOperator):
    @property
    def count(self):return self._count
    @count.setter
    def count(self,count):
        count=int(count)
        self._count = count
        if count<=0:raise ValueError('Invalid replicate "%s".'%self)
    def __init__(self,rhs,count):
        self._set_default('{{}}')
        self.rhs = rhs
        self.count = count
        self._calc_width()
class ConstExpr(Expr):
    _radix_fmtstrs = {
        16:lambda width,value:("%d'h{:0>%dx}"%(width,(width+3)//4)).format(value),
        10:lambda width,value:("%d'd{:0>d}" % width ).format(value),
        8 :lambda width,value:("%d'o{:0>%do}"%(width,(width+2)//3)).format(value),
        2 :lambda width,value:("%d'b{:0>%db}"%(width, width )).format(value)}
    @property
    def radix(self):return self._radix
    @radix.setter
    def radix(self,radix):
        if radix not in {2,8,10,16}:raise ValueError('Invalid radix.')
        self._radix = radix
        self._radix_fmtstr = self._radix_fmtstrs[radix]
    @staticmethod
    def _convert_str(value):
        # Pack the characters of the string into an integer, 8 bits per character.
        y = 0
        for ch in value:
            c = ord(ch)
            if c>=256 or c<0:raise RuntimeError('Charset Error')
            y<<=8
            y|=c
        return y
    @property
    def value(self):return self._value
    @value.setter
    def value(self,value):
        if value is None:self._value = value
        else:
            if isinstance(value,str):self._value = self._convert_str(value)
            else:self._value = int(value)
            if not self._width is None:self._value&=(1<<self._width)-1
    @property
    def width(self):return self._width
    @width.setter
    def width(self,width):
        if not isinstance(width,int):raise TypeError(type(width),width)
        if width<=0:raise ValueError('Constant value with non-positive width.')
        self._width = width
        if not self._value is None:self._value&=(1<<self._width)-1
    @property
    def _driven(self):return 0 if self._value is None else self._constant
    @property
    def _constant(self):return -1 if self._width is None else (1<<self._width)-1
    @property
    def _is_constant(self):return True
    def _set_default(self,typename,n_childs=0):
        self.comments = []
        self.childs = [None]*n_childs
        self.value = None
        self.width = None
    def __init__(self,value,width=None,radix=10):
        self.typename = 'const'
        self._value = 0
        self._width = None
        if not width is None:self.width = width
        self.radix = radix
        self.value = value
    def __getitem__(self,key):
        if isinstance(key,slice):
            for a in {'start','stop','step'}:
                if not isinstance(getattr(key,a),(int,type(None))):
                    raise SyntaxError('Invalid fetch format from constant expression.')
            start = 0 if key.start is None else key.start
            if not key.step is None:return Hexadecimal(int(self)>>start,width=key.step)
            elif not key.stop is None:return Hexadecimal(int(self)>>start,width=key.stop-start)
            return Hexadecimal(int(self)>>start,width=self.width-start)
        elif isinstance(key,(int,ConstExpr)):
            loc = int(key)
            if loc<0:loc += len(self)
            return Binary((self.value>>loc)&1,width=1)
        elif isinstance(key,Expr):
            n = 1<<len(key)
            v = self.value
            m = count_one(v)
            if m==0:return Binary(0,width=1)
            elif m==n:return Binary(1,width=1)
            else:
                if m<=(n>>1):
                    expr = 0
                    for i in range(n):
                        if ((v>>i)&1)==1:expr|=key//i
                else:
                    expr = 1
                    for i in range(n):
                        if ((v>>i)&1)==0:expr&=~(key//i)
                return expr
        else:raise TypeError(type(key))
    def __str__(self):
        width = self.width
        value = self.value
        if value is None:
            if width is None:return "'bz"
            else:return "%d'bz"%width
        if width is None:
            if value<0:warnings.warn('Negative value without width declared.')
            return str(value)
        result = self._radix_fmtstr(width,value)
        return result
    def __int__(self):return self.value
    def __eq__(self,other):
        if isinstance(other,ConstExpr):return self.width==other.width and int(self)==int(other)
        elif isinstance(other,int):return self.width is None and int(self)==other
        else:return False
    def __hash__(self):return int(self)+(0 if self.width is None else self.width)
def Hexadecimal(x,width=None):
    if width==0:return None
    else:return ConstExpr(x,width=width,radix=16)
def Binary (x,width=None):
    if width==0:return None
    else:return ConstExpr(x,width=width,radix=2 )
def Octal (x,width=None):
    if width==0:return None
    else:return ConstExpr(x,width=width,radix=8 )
def Decimal (x,width=None):
    if width==0:return None
    else:return ConstExpr(x,width=width,radix=10)
| 40.237624 | 123 | 0.62906 | 2,152 | 16,256 | 4.508829 | 0.109201 | 0.033907 | 0.032979 | 0.039163 | 0.36463 | 0.324539 | 0.282284 | 0.250953 | 0.187262 | 0.163661 | 0 | 0.007474 | 0.251046 | 16,256 | 403 | 124 | 40.337469 | 0.789487 | 0.000923 | 0 | 0.278607 | 0 | 0 | 0.036497 | 0 | 0 | 0 | 0 | 0 | 0.002488 | 1 | 0.278607 | false | 0 | 0.014925 | 0.149254 | 0.383085 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
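The `ConstExpr._radix_fmtstrs` table renders a constant as a Verilog-style sized literal (`8'hff`, `4'b0101`, ...), and `_convert_str` packs a string into an integer 8 bits per character. Both behaviours can be reproduced in a standalone sketch that mirrors those format strings without using the library itself:

```python
# Verilog-style sized literals, mirroring ConstExpr._radix_fmtstrs:
# the digit count is the width rounded up to whole hex/octal/binary digits.
radix_fmtstrs = {
    16: lambda width, value: ("%d'h{:0>%dx}" % (width, (width + 3) // 4)).format(value),
    2:  lambda width, value: ("%d'b{:0>%db}" % (width, width)).format(value),
}
assert radix_fmtstrs[16](8, 255) == "8'hff"
assert radix_fmtstrs[2](4, 5) == "4'b0101"

# String packing, 8 bits per character, as in ConstExpr._convert_str.
def pack_str(value):
    y = 0
    for ch in value:
        y = (y << 8) | ord(ch)
    return y

assert pack_str('AB') == 0x4142  # 'A' = 0x41 in the high byte, 'B' = 0x42 below it
```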
c538cf5b43e938d74b89e921d97d1ef0493292ec | 317 | py | Python | solutions/binarysearch.io/hard/collecting-coins/main.py | zwliew/ctci | 871f4fc957be96c6d0749d205549b7b35dc53d9e | [
"MIT"
] | 4 | 2020-11-07T14:38:02.000Z | 2022-01-03T19:02:36.000Z | solutions/binarysearch.io/hard/collecting-coins/main.py | zwliew/ctci | 871f4fc957be96c6d0749d205549b7b35dc53d9e | [
"MIT"
] | 1 | 2019-04-17T06:55:14.000Z | 2019-04-17T06:55:14.000Z | solutions/binarysearch.io/hard/collecting-coins/main.py | zwliew/ctci | 871f4fc957be96c6d0749d205549b7b35dc53d9e | [
"MIT"
] | null | null | null | class Solution:
    def solve(self, matrix):
        from functools import lru_cache

        @lru_cache(None)
        def dp(i, j):
            if i < 0 or j < 0:
                return 0
            return max(dp(i - 1, j), dp(i, j - 1)) + matrix[i][j]
        return dp(len(matrix) - 1, len(matrix[0]) - 1)
| 31.7 | 66 | 0.498423 | 56 | 317 | 2.785714 | 0.535714 | 0.057692 | 0.051282 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.08 | 0.369085 | 317 | 9 | 67 | 35.222222 | 0.7 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.111111 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
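The memoized recursion in `solve` is the standard top-down formulation of the "collect the most coins moving only down/right" DP: each cell's best total is its own value plus the better of the cell above or the cell to the left. A standalone check (the grid is an invented example):

```python
from functools import lru_cache

def solve(matrix):
    @lru_cache(None)
    def dp(i, j):
        if i < 0 or j < 0:
            return 0  # off-grid cells contribute nothing
        return max(dp(i - 1, j), dp(i, j - 1)) + matrix[i][j]
    return dp(len(matrix) - 1, len(matrix[0]) - 1)

# best path: 0 -> 2 -> 1 -> 5 -> 3 -> 1, which sums to 12
assert solve(((0, 3, 1, 1),
              (2, 0, 0, 4),
              (1, 5, 3, 1))) == 12
```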
c53d83148b42eaa02961efd8a515c82ec643034c | 813 | py | Python | examples/dialogs.py | tgolsson/appJar | 5e2f8bff44e927e7c2bae17fccddc6dbf79952f0 | [
"Apache-2.0"
] | 666 | 2016-11-14T18:17:40.000Z | 2022-03-29T03:53:22.000Z | examples/dialogs.py | tgolsson/appJar | 5e2f8bff44e927e7c2bae17fccddc6dbf79952f0 | [
"Apache-2.0"
] | 598 | 2016-10-20T21:04:09.000Z | 2022-03-15T22:44:49.000Z | examples/dialogs.py | tgolsson/appJar | 5e2f8bff44e927e7c2bae17fccddc6dbf79952f0 | [
"Apache-2.0"
] | 95 | 2017-01-19T12:23:58.000Z | 2022-03-06T18:16:21.000Z | from appJar import gui
def press(btn):
    if btn == "info": app.infoBox("Title Here", "Message here...")
    if btn == "error": app.errorBox("Title Here", "Message here...")
    if btn == "warning": app.warningBox("Title Here", "Message here...")
    if btn == "yesno": app.yesNoBox("Title Here", "Message here...")
    if btn == "question": app.questionBox("Title Here", "Message here...")
    if btn == "ok": app.okBox("Title Here", "Message here...")
    if btn == "retry": app.retryBox("Title Here", "Message here...")
    if btn == "text": app.textBox("Title Here", "Message here...")
    if btn == "number": app.numberBox("Title Here", "Message here...")
app=gui()
app.addButtons(["info", "error", "warning", "yesno", "question"], press)
app.addButtons(["ok", "retry", "text", "number"], press)
app.go()
| 45.166667 | 74 | 0.607626 | 107 | 813 | 4.616822 | 0.299065 | 0.091093 | 0.291498 | 0.364372 | 0.404858 | 0.404858 | 0 | 0 | 0 | 0 | 0 | 0 | 0.158672 | 813 | 17 | 75 | 47.823529 | 0.722222 | 0 | 0 | 0 | 0 | 0 | 0.389914 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0.066667 | 0 | 0.133333 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c54a392610a02b36eccf6f7a462a2e02a2aa190a | 1,681 | py | Python | src/ggrc_risks/models/risk.py | Killswitchz/ggrc-core | 2460df94daf66727af248ad821462692917c97a9 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | src/ggrc_risks/models/risk.py | Killswitchz/ggrc-core | 2460df94daf66727af248ad821462692917c97a9 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | src/ggrc_risks/models/risk.py | Killswitchz/ggrc-core | 2460df94daf66727af248ad821462692917c97a9 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | # Copyright (C) 2017 Google Inc.
# Licensed under http://www.apache.org/licenses/LICENSE-2.0 <see LICENSE file>
from sqlalchemy.ext.declarative import declared_attr
from ggrc import db
from ggrc.access_control.roleable import Roleable
from ggrc.fulltext.mixin import Indexed
from ggrc.models.associationproxy import association_proxy
from ggrc.models import mixins
from ggrc.models.deferred import deferred
from ggrc.models.object_document import PublicDocumentable
from ggrc.models.object_person import Personable
from ggrc.models import reflection
from ggrc.models.relationship import Relatable
from ggrc.models.track_object_state import HasObjectState
class Risk(Roleable, HasObjectState, mixins.CustomAttributable, Relatable,
           Personable, PublicDocumentable,
           mixins.LastDeprecatedTimeboxed, mixins.BusinessObject,
           Indexed, db.Model):
    __tablename__ = 'risks'

    # Overriding mixin to make mandatory
    @declared_attr
    def description(cls):  # pylint: disable=no-self-argument
        return deferred(db.Column(db.Text, nullable=False), cls.__name__)

    risk_objects = db.relationship(
        'RiskObject', backref='risk', cascade='all, delete-orphan')
    objects = association_proxy('risk_objects', 'object', 'RiskObject')

    _api_attrs = reflection.ApiAttributes(
        'risk_objects',
        reflection.Attribute('objects', create=False, update=False),
    )

    _aliases = {
        "document_url": None,
        "document_evidence": None,
        "status": {
            "display_name": "State",
            "mandatory": False,
            "description": "Options are: \n {}".format('\n'.join(
                mixins.BusinessObject.VALID_STATES))
        }
    }
| 33.62 | 78 | 0.732302 | 191 | 1,681 | 6.303665 | 0.528796 | 0.07309 | 0.093023 | 0.033223 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004298 | 0.169542 | 1,681 | 49 | 79 | 34.306122 | 0.858166 | 0.104105 | 0 | 0 | 0 | 0 | 0.117255 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.027027 | false | 0 | 0.324324 | 0.027027 | 0.540541 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
c5508e61b45a9bd59041d4ba0c8bea652aa09b89 | 2,033 | py | Python | cfnbootstrap/construction_errors.py | roberthutto/aws-cfn-bootstrap | 801a16802a931fa4dae0eba4898fe1ccdb304924 | [
"Apache-2.0"
] | null | null | null | cfnbootstrap/construction_errors.py | roberthutto/aws-cfn-bootstrap | 801a16802a931fa4dae0eba4898fe1ccdb304924 | [
"Apache-2.0"
] | null | null | null | cfnbootstrap/construction_errors.py | roberthutto/aws-cfn-bootstrap | 801a16802a931fa4dae0eba4898fe1ccdb304924 | [
"Apache-2.0"
] | 3 | 2017-02-10T13:14:38.000Z | 2018-09-20T01:04:20.000Z | #==============================================================================
# Copyright 2011 Amazon.com, Inc. or its affiliates. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#==============================================================================
class BuildError(Exception):
    """
    Base exception for errors raised while building
    """
    pass

class NoSuchConfigSetError(BuildError):
    """
    Exception signifying that no configSet with the specified name exists
    """

    def __init__(self, msg):
        self.msg = msg

    def __str__(self):
        return self.msg

class NoSuchConfigurationError(BuildError):
    """
    Exception signifying that no configuration with the specified name exists
    """

    def __init__(self, msg):
        self.msg = msg

    def __str__(self):
        return self.msg

class CircularConfigSetDependencyError(BuildError):
    """
    Exception signifying a circular dependency in configSets
    """

    def __init__(self, msg):
        self.msg = msg

    def __str__(self):
        return self.msg

class ToolError(BuildError):
    """
    Exception raised by Tools when they cannot successfully change reality

    Attributes:
    msg - a human-readable error message
    code - an error code, if applicable
    """

    def __init__(self, msg, code=None):
        self.msg = msg
        self.code = code

    def __str__(self):
        if self.code is not None:
            return '%s (return code %s)' % (self.msg, self.code)
        else:
            return self.msg
| 28.236111 | 79 | 0.619774 | 237 | 2,033 | 5.181435 | 0.481013 | 0.074104 | 0.035831 | 0.045603 | 0.232899 | 0.232899 | 0.232899 | 0.232899 | 0.232899 | 0.232899 | 0 | 0.005092 | 0.22725 | 2,033 | 71 | 80 | 28.633803 | 0.776575 | 0.559764 | 0 | 0.576923 | 0 | 0 | 0.02396 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.307692 | false | 0.038462 | 0 | 0.115385 | 0.692308 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
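A quick standalone illustration of how a `ToolError` renders with and without a return code, re-declaring a minimal copy of the class so the snippet runs on its own (the error messages are invented for the example):

```python
class ToolError(Exception):
    """Minimal copy of the ToolError above: message plus optional return code."""
    def __init__(self, msg, code=None):
        self.msg = msg
        self.code = code

    def __str__(self):
        # Compare against None so a legitimate return code of 0 is still reported.
        if self.code is not None:
            return '%s (return code %s)' % (self.msg, self.code)
        return self.msg

assert str(ToolError('command failed', 1)) == 'command failed (return code 1)'
assert str(ToolError('unknown failure')) == 'unknown failure'
```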
c5610fccc549a5c9d69f6c3b166e598fbe0653b9 | 6,172 | py | Python | mangabee_parsers.py | ta-dachi/mangaget | 4ef39df0a6cceb2817d3bd0ad4d8290b8f576341 | [
"MIT"
] | null | null | null | mangabee_parsers.py | ta-dachi/mangaget | 4ef39df0a6cceb2817d3bd0ad4d8290b8f576341 | [
"MIT"
] | null | null | null | mangabee_parsers.py | ta-dachi/mangaget | 4ef39df0a6cceb2817d3bd0ad4d8290b8f576341 | [
"MIT"
] | null | null | null | from html.parser import HTMLParser
class mangabeeSearchParser(HTMLParser):
    def __init__(self):
        HTMLParser.__init__(self)
        self.inLink = False
        self.lastTag = None
        self.lastClass = None
        self.urls = [] # Where we store our results
    def handle_starttag(self, tag, attrs):
        if (tag == 'div'):
            attrs = dict(attrs)
            self.lastTag = 'div'
            if (attrs.get('class') == 'nde'):
                self.inLink = True
                self.lastClass = 'nde'
        if (self.lastClass == 'nde'):
            if (tag == 'div'):
                attrs = dict(attrs)
                if (attrs.get('class') == 'cvr'):
                    self.lastClass = 'cvr'
        if (self.lastTag == 'div' and tag == 'a' and self.lastClass == 'cvr'):
            self.lastTag = 'a'
            attrs = dict(attrs) # example output: {'href': 'http://www.mangabee.com/Tokyo_Ghoul/'}
            self.urls.append(attrs.get('href')) # ['http://www.mangabee.com/Tokyo_Ghoul', ...]
    def handle_endtag(self, tag):
        if (tag == 'div'):
            self.inLink = False
            self.lastTag = None
            self.lastClass = None
    def handle_data(self, data):
        pass
class mangabeeSetupParser(HTMLParser):
    def __init__(self):
        HTMLParser.__init__(self)
        self.inLink = False
        self.lastTag = None
        self.lastClass = None
        self.pages = []
        self.chapters = []
        self.src = []
        self.first_occurrence_chapters = False
        self.first_occurrence_pages = False
    def handle_starttag(self, tag, attrs):
        if (tag == 'select'): # The tag with pages data.
            self.inLink = True
            attrs = dict(attrs)
            self.lastTag = 'select'
            if (attrs.get('class') == 'cbo_wpm_pag'):
                self.lastClass = 'cbo_wpm_pag'
        if (tag == 'option' and self.lastClass == 'cbo_wpm_pag'):
            self.inLink = True
            self.lastTag = 'option'
        if (tag == 'select'): # The tag with chapter data.
            self.inLink = True
            attrs = dict(attrs)
            self.lastTag = 'select'
            if (attrs.get('class') == 'cbo_wpm_chp'):
                self.lastClass = 'cbo_wpm_chp'
        if (tag == 'img'): # Wade through html to find img tag.
            self.inLink = True
            attrs = dict(attrs) # The tag with image data and location.
            self.lastTag = 'img'
            if (attrs.get('class') == 'manga-page'): # Found tag with manga image.
                self.lastClass = 'manga-page'
                self.src.append(attrs.get('src')) # Add example src. Need lots of string manipulation to generate image urls
    def handle_endtag(self, tag):
        if (tag == 'select' and self.lastClass == 'cbo_wpm_chp'): # The tag with chapter data.
            self.inLink = False
            self.lastTag = None
            self.lastClass = None
            self.first_occurrence_chapters = True # Chapter selection occurs twice, so only add chapters once.
        if (tag == 'select' and self.lastClass == 'cbo_wpm_pag'): # The tag with page data.
            self.inLink = False
            self.lastTag = None
            self.lastClass = None
            self.first_occurrence_pages = True # Page selection occurs twice, so only add pages once.
        if (tag == 'img'): # The tag with image data and location.
            self.inLink = False
            self.lastTag = None
            self.lastClass = None
    def handle_data(self, data):
        if (self.lastClass == 'cbo_wpm_chp' and self.first_occurrence_chapters == False):
            self.chapters.append(data)
        if (self.lastClass == 'cbo_wpm_pag' and self.lastTag == 'option' and self.first_occurrence_pages == False):
            self.pages.append(data)
class mangabeeHTMLGetImageUrls(HTMLParser):
    def __init__(self):
        HTMLParser.__init__(self)
        self.inLink = False
        self.lastTag = None
        self.lastClass = None
        self.page_numbers = []
        self.second_occurrence_pages = False
    def handle_starttag(self, tag, attrs):
        if (tag == 'select'): # The tag with pages data.
            self.inLink = True
            attrs = dict(attrs)
            self.lastTag = 'select'
            if (attrs.get('class') == 'cbo_wpm_pag'):
                self.lastClass = 'cbo_wpm_pag'
        if (tag == 'option' and self.lastClass == 'cbo_wpm_pag' and self.second_occurrence_pages == False):
            self.inLink = True
            self.lastTag = 'option'
            attrs = dict(attrs)
            self.page_numbers.append(attrs.get('value'))
        if (tag == 'div'):
            self.inLink = True
            attrs = dict(attrs)
            self.lastTag = 'div'
            if (attrs.get('class') == 'clr'):
                self.lastClass = 'clr'
    def handle_endtag(self, tag):
        if (tag == 'select'): # The tag with page data.
            self.inLink = False
            self.lastTag = None
            self.lastClass = None
        if (self.lastClass == 'clr'):
            self.second_occurrence_pages = True
    def handle_data(self, data):
        pass
class mangabeeHTMLGetImageSrcs(HTMLParser):
    def __init__(self):
        HTMLParser.__init__(self)
        self.inLink = False
        self.lastTag = None
        self.lastClass = None
        self.src = None
    def handle_starttag(self, tag, attrs):
        if (tag == 'img'): # The tag with the manga page image.
            self.inLink = True
            attrs = dict(attrs)
            self.lastTag = 'section'
            if (attrs.get('class') == 'manga-page'):
                self.lastClass = 'manga-page'
        if (tag == 'img' and self.lastClass == 'manga-page'):
            self.inLink = True
            self.lastTag = 'img'
            attrs = dict(attrs)
            self.src = attrs.get('src')
    def handle_endtag(self, tag):
        if (tag == 'img'):
            self.inLink = False
            self.lastTag = None
            self.lastClass = None
    def handle_data(self, data):
        pass
| 35.883721 | 124 | 0.545366 | 694 | 6,172 | 4.720461 | 0.132565 | 0.111111 | 0.047009 | 0.057998 | 0.758852 | 0.703297 | 0.620574 | 0.562882 | 0.486264 | 0.486264 | 0 | 0 | 0.33733 | 6,172 | 171 | 125 | 36.093567 | 0.800978 | 0.10499 | 0 | 0.705479 | 0 | 0 | 0.072466 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.109589 | false | 0.020548 | 0.006849 | 0 | 0.143836 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
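All four parsers above share one idiom: `handle_starttag` records the current tag and `class` attribute as parser state, and a later tag is only harvested when that state matches. A minimal self-contained sketch of the same idiom (class names, the HTML fragment, and the URLs are invented for illustration):

```python
from html.parser import HTMLParser

class CoverLinkParser(HTMLParser):
    """Collect <a href> values that appear inside <div class="cvr">."""
    def __init__(self):
        HTMLParser.__init__(self)
        self.in_cover = False  # state set by the enclosing div, checked by <a>
        self.urls = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == 'div' and attrs.get('class') == 'cvr':
            self.in_cover = True
        if tag == 'a' and self.in_cover:
            self.urls.append(attrs.get('href'))

    def handle_endtag(self, tag):
        if tag == 'div':
            self.in_cover = False

p = CoverLinkParser()
p.feed('<div class="cvr"><a href="http://example.com/Tokyo_Ghoul/"></a></div>'
       '<div class="nav"><a href="http://example.com/skip"></a></div>')
assert p.urls == ['http://example.com/Tokyo_Ghoul/']
```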
c565a1b2f5a20a17f2045f38225c4233abda30b9 | 456 | py | Python | tests/web/backend_pytest.py | brumar/eel-for-transcrypt | 28cf5e0aa55a3c885b63d79d1ffae1370be644d2 | [
"MIT"
] | 1 | 2019-12-31T13:53:05.000Z | 2019-12-31T13:53:05.000Z | tests/web/backend_pytest.py | brumar/eel-for-transcrypt | 28cf5e0aa55a3c885b63d79d1ffae1370be644d2 | [
"MIT"
] | 1 | 2021-11-15T17:48:03.000Z | 2021-11-15T17:48:03.000Z | tests/web/backend_pytest.py | brumar/eel-for-transcrypt | 28cf5e0aa55a3c885b63d79d1ffae1370be644d2 | [
"MIT"
] | null | null | null | import eel_for_transcrypt as eel
from web.common import InventoryItem
@eel.expose
def i_return_the_same(anything):
    return anything

@eel.expose
def a_generator(mn, mx):
    yield from range(mn, mx)

@eel.expose
@eel.apply_factories(inputfactory=InventoryItem)
def return_a_dataclass(datac: InventoryItem):
    assert isinstance(datac, InventoryItem)
    return datac

@eel.expose
def sluggish(timeout):
    eel.sleep(timeout)
    return True
| 16.888889 | 48 | 0.756579 | 62 | 456 | 5.419355 | 0.532258 | 0.107143 | 0.107143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.162281 | 456 | 26 | 49 | 17.538462 | 0.879581 | 0 | 0 | 0.235294 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.058824 | 1 | 0.235294 | false | 0 | 0.117647 | 0.058824 | 0.529412 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
c574d6290b0c40bcbc5696cd5ebb36152641b976 | 215 | py | Python | func_one.py | FoxProklya/Step-Python | 67514509655e552fc5adcc7963b971ef6f0bb46a | [
"MIT"
] | null | null | null | func_one.py | FoxProklya/Step-Python | 67514509655e552fc5adcc7963b971ef6f0bb46a | [
"MIT"
] | null | null | null | func_one.py | FoxProklya/Step-Python | 67514509655e552fc5adcc7963b971ef6f0bb46a | [
"MIT"
] | null | null | null | def f(x):
    if x <= -2:
        f = 1 - (x + 2)**2
        return f
    if -2 < x <= 2:
        f = -(x/2)
        return f
    if 2 < x:
        f = (x - 2)**2 + 1
        return f
x = int(input())
print(f(x))
| 14.333333 | 26 | 0.316279 | 38 | 215 | 1.789474 | 0.263158 | 0.147059 | 0.088235 | 0.294118 | 0.352941 | 0.352941 | 0 | 0 | 0 | 0 | 0 | 0.1 | 0.488372 | 215 | 14 | 27 | 15.357143 | 0.518182 | 0 | 0 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0 | 0 | 0 | 0.333333 | 0.083333 | 0 | 0 | 1 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
3d71bcc45f53747aca6197878307201d4f4b2564 | 506 | py | Python | tags/models.py | yuyuyuhaoshi/Blog-BE | a485d5159076d619d4fd6019fe9b96ac04020d4d | [
"Apache-2.0"
] | null | null | null | tags/models.py | yuyuyuhaoshi/Blog-BE | a485d5159076d619d4fd6019fe9b96ac04020d4d | [
"Apache-2.0"
] | null | null | null | tags/models.py | yuyuyuhaoshi/Blog-BE | a485d5159076d619d4fd6019fe9b96ac04020d4d | [
"Apache-2.0"
] | null | null | null | from django.db import models
from django.utils.timezone import now
from django.contrib.auth.models import User
from utils.base_model import SoftDeletionModel
class Tag(SoftDeletionModel):
    name = models.CharField('标题名', max_length=100, unique=True, blank=False, null=False)
    created_time = models.DateTimeField('创建时间', default=now)

    class Meta:
        ordering = ['name']
        verbose_name = "标签"
        verbose_name_plural = verbose_name

    def __str__(self):
        return self.name
| 26.631579 | 88 | 0.717391 | 65 | 506 | 5.415385 | 0.630769 | 0.085227 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007335 | 0.1917 | 506 | 18 | 89 | 28.111111 | 0.853301 | 0 | 0 | 0 | 0 | 0 | 0.025692 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0.307692 | 0.076923 | 0.769231 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
3d7ca16d1d0cb0fd5ce512de12142e0f598017a2 | 572 | py | Python | app/models/link.py | aries-zhang/flask-template | 369d77f2910f653f46668dd9bda735954b6c145e | [
"MIT"
] | null | null | null | app/models/link.py | aries-zhang/flask-template | 369d77f2910f653f46668dd9bda735954b6c145e | [
"MIT"
] | null | null | null | app/models/link.py | aries-zhang/flask-template | 369d77f2910f653f46668dd9bda735954b6c145e | [
"MIT"
] | null | null | null | import time # NOQA
from app import db
class Link(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    title = db.Column(db.String)
    url = db.Column(db.String)
    description = db.Column(db.String)
    type = db.Column(db.Integer)
    enabled = db.Column(db.Boolean)
    createtime = db.Column(db.DateTime)

    def __init__(self, title, url, description, type, enabled):
        from datetime import datetime  # db.DateTime expects a datetime, not the float from time.time()
        self.title = title
        self.url = url
        self.description = description
        self.type = type
        self.enabled = enabled
        self.createtime = datetime.now()
| 27.238095 | 63 | 0.63986 | 76 | 572 | 4.75 | 0.355263 | 0.155125 | 0.193906 | 0.132964 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.246504 | 572 | 20 | 64 | 28.6 | 0.837587 | 0.006993 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.058824 | false | 0 | 0.117647 | 0 | 0.647059 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
3d8734a866fbee3cba78ae6db665c5cbc41ba2ea | 440 | py | Python | assessment/seeders/base_seeder.py | kenware/Assessment | 69f5e3fbf18dfa2c59eaf3b083ebdba7ca66c9b7 | [
"MIT"
] | null | null | null | assessment/seeders/base_seeder.py | kenware/Assessment | 69f5e3fbf18dfa2c59eaf3b083ebdba7ca66c9b7 | [
"MIT"
] | 3 | 2020-02-11T23:31:01.000Z | 2021-06-10T21:04:34.000Z | assessment/seeders/base_seeder.py | kenware/Assessment | 69f5e3fbf18dfa2c59eaf3b083ebdba7ca66c9b7 | [
"MIT"
] | null | null | null |
from .seed_assessment_type import seed_assessment
from .seed_question import seed_question
from .seed_answer import seed_answer
from .seed_user import seed_user
from .seed_score import seed_score
from .seed_assessment_name import seed_assessment_name
class Seeder(object):
    def seed_all(self):
        seed_assessment_name()
        seed_assessment()
        seed_question()
        seed_answer()
        seed_user()
        seed_score()
| 24.444444 | 54 | 0.752273 | 59 | 440 | 5.220339 | 0.271186 | 0.155844 | 0.175325 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.193182 | 440 | 17 | 55 | 25.882353 | 0.867606 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.071429 | false | 0 | 0.428571 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
3d89564a5d0fa853d134b34b86a84b5003e24ceb | 328 | py | Python | contek_tusk/metric_data.py | contek-io/contek-tusk | 74dc73388367adb958848819b29fe24316c4f6f4 | [
"MIT"
] | null | null | null | contek_tusk/metric_data.py | contek-io/contek-tusk | 74dc73388367adb958848819b29fe24316c4f6f4 | [
"MIT"
] | null | null | null | contek_tusk/metric_data.py | contek-io/contek-tusk | 74dc73388367adb958848819b29fe24316c4f6f4 | [
"MIT"
] | null | null | null | from pandas import DataFrame
from contek_tusk.table import Table


class MetricData:
    def __init__(self, table: Table, df: DataFrame) -> None:
        self._table = table
        self._df = df

    def get_table(self) -> Table:
        return self._table

    def get_data_frame(self) -> DataFrame:
        return self._df
| 19.294118 | 60 | 0.655488 | 43 | 328 | 4.72093 | 0.418605 | 0.17734 | 0.137931 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.262195 | 328 | 16 | 61 | 20.5 | 0.838843 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.3 | false | 0 | 0.2 | 0.2 | 0.8 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
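The `MetricData` class above is a plain immutable holder. The same value-object pattern can be sketched with a dataclass — `MetricDataSketch` is a hypothetical name, and `Any` stands in for the real dependencies (a contek_tusk `Table` and a pandas `DataFrame`):

```python
from dataclasses import dataclass
from typing import Any


@dataclass(frozen=True)
class MetricDataSketch:
    # Stand-in fields: the real MetricData holds a contek_tusk Table
    # and a pandas DataFrame; Any avoids those dependencies here.
    table: Any
    df: Any


md = MetricDataSketch(table="metrics", df=[1, 2, 3])
```

`frozen=True` gives the same read-only behavior the hand-written getters imply, plus `__eq__` and `__repr__` for free.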
3d92ede6e5d24bbbfeb9c757cc08cd7affa9cd34 | 268 | py | Python | src/pyons/setup.py | larioandr/thesis-models | ecbc8c01aaeaa69034d6fe1d8577ab655968ea5f | [
"MIT"
] | 1 | 2021-01-17T15:49:03.000Z | 2021-01-17T15:49:03.000Z | src/pyons/setup.py | larioandr/thesis-models | ecbc8c01aaeaa69034d6fe1d8577ab655968ea5f | [
"MIT"
] | null | null | null | src/pyons/setup.py | larioandr/thesis-models | ecbc8c01aaeaa69034d6fe1d8577ab655968ea5f | [
"MIT"
] | 1 | 2021-03-07T15:31:06.000Z | 2021-03-07T15:31:06.000Z | from setuptools import setup

setup(
    name='pyons',
    version='1.0',
    author="Andrey Larionov",
    author_email="larioandr@gmail.com",
    license="MIT",
    py_modules=['pyons'],
    install_requires=[],
    # setuptools expects "tests_require"; the original "tests_requires"
    # keyword would be silently ignored.
    tests_require=[
        'pytest',
    ],
)
| 15.764706 | 39 | 0.589552 | 28 | 268 | 5.5 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.01 | 0.253731 | 268 | 16 | 40 | 16.75 | 0.76 | 0 | 0 | 0.142857 | 0 | 0 | 0.208955 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.071429 | 0 | 0.071429 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
3d95e63a148b7fb62965e71316967e479358de64 | 2,262 | py | Python | html2markdown.py | DeusFigendi/fefebot | 935338c7b082502f25f97ae4874b4e896a04972e | [
"MIT"
] | 4 | 2016-09-19T03:54:31.000Z | 2021-03-27T23:06:34.000Z | html2markdown.py | DeusFigendi/fefebot | 935338c7b082502f25f97ae4874b4e896a04972e | [
"MIT"
] | 1 | 2017-08-01T15:04:57.000Z | 2017-08-08T22:02:46.000Z | html2markdown.py | DeusFigendi/fefebot | 935338c7b082502f25f97ae4874b4e896a04972e | [
"MIT"
] | 6 | 2015-08-24T09:37:41.000Z | 2018-12-26T19:40:42.000Z | #! /usr/bin/env python3.2
import re


def _subpre(text):
    parts = re.split(r'(<pre>|</pre>)', text)
    for i in range(len(parts)):
        # begin of pre
        if i % 4 == 1:
            parts[i] = '\n\n '
        # in pre
        elif i % 4 == 2:
            parts[i] = re.sub(r'<p>|<br>|\n\n', '\n\n ', parts[i])
        # end of pre
        elif i % 4 == 3:
            parts[i] = '\n\n'
    return ''.join(parts)


def _subblock(text):
    parts = re.split(r'(<blockquote>|</blockquote>)', text)
    for i in range(len(parts)):
        # begin of blockquote
        if i % 4 == 1:
            parts[i] = '\n\n> '
        # in blockquote
        elif i % 4 == 2:
            parts[i] = re.sub(r'<p>|<br>|\n\n', '\n\n> ', parts[i])
        # end of blockquote
        elif i % 4 == 3:
            parts[i] = '\n\n'
    return ''.join(parts)


def _sublinks(text):
    return re.sub(
        r'<a href="(?P<link>.*?)">(?P<linktext>.*?)</a>',
        lambda m: '[' + _markdownify_linktext(m.group('linktext')) +
                  '](' + _fefe_linksintern(m.group('link')) + ')',
        text)


def _markdownify(text):
    parts = re.split(r'(\[.*\]\(.*\))', text)
    # only change when not a link
    for i in range(0, len(parts), 2):
        # Escape literal markdown metacharacters first, then map the
        # HTML tags onto their markdown equivalents. Plain str.replace
        # is used for these fixed strings; re.sub is only needed for
        # the actual patterns above.
        parts[i] = parts[i].replace('*', '\\*')
        parts[i] = parts[i].replace('_', '\\_')
        parts[i] = parts[i].replace('<b>', '**').replace('</b>', '**')
        parts[i] = parts[i].replace('<i>', '_').replace('</i>', '_')
        parts[i] = parts[i].replace('<u>', '\n').replace('</u>', '\n')
        parts[i] = parts[i].replace('<li>', '\n - ').replace('</li>', '\n')
        parts[i] = parts[i].replace('<p>', '\n\n').replace('</p>', '\n\n')
        parts[i] = parts[i].replace('<br>', '\n\n')
    return ''.join(parts)


def _markdownify_linktext(text):
    parts = re.split(r'(\[.*\]\(.*\))', text)
    # only change when not a link
    for i in range(0, len(parts), 2):
        parts[i] = parts[i].replace('*', '\\*')
        parts[i] = parts[i].replace('_', '\\_')
        parts[i] = parts[i].replace('<b>', '**').replace('</b>', '**')
        parts[i] = parts[i].replace('<i>', '_').replace('</i>', '_')
    return ''.join(parts)


def _fefe_linksintern(text):
    text = re.sub(r'^/\?ts=', 'https://blog.fefe.de/?ts=', text)
    text = re.sub(r'^/\?q=', 'https://blog.fefe.de/?q=', text)
    return text


def html2md(html):
    html = _subpre(html)
    html = _subblock(html)
    html = _sublinks(html)
    html = _markdownify(html)
    return html
| 29 | 183 | 0.517683 | 380 | 2,262 | 3.018421 | 0.157895 | 0.200523 | 0.12816 | 0.183086 | 0.593723 | 0.593723 | 0.559721 | 0.559721 | 0.559721 | 0.484743 | 0 | 0.010096 | 0.167993 | 2,262 | 77 | 184 | 29.376623 | 0.599362 | 0.07206 | 0 | 0.516667 | 0 | 0 | 0.175598 | 0.03445 | 0 | 0 | 0 | 0 | 0 | 1 | 0.116667 | false | 0 | 0.016667 | 0.016667 | 0.25 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
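The converters in html2markdown.py lean on one regex trick: `re.split` with a capturing group keeps the matched delimiters in the result list, so text and tags alternate at predictable indices. A minimal self-contained sketch of that technique:

```python
import re

# With a capturing group, the matched tags stay in the output list,
# alternating with the text between them.
parts = re.split(r'(<b>|</b>)', 'plain <b>bold</b> tail')
# parts alternates text, open tag, inner text, close tag, text.
converted = ''.join('**' if p in ('<b>', '</b>') else p for p in parts)
```

This alternation is why `_subpre` and `_subblock` can address "opening tag", "inner text", and "closing tag" simply as `i % 4 == 1`, `2`, and `3`.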
3db26a9a64ef3907fd6d3bfdd43c6b7c844f6a0f | 303 | py | Python | mood_sense/serializers.py | D-Denysenko/health-app | 18d1e9c492fb00694e1987a6cdaa2197ff4efa11 | [
"MIT"
] | null | null | null | mood_sense/serializers.py | D-Denysenko/health-app | 18d1e9c492fb00694e1987a6cdaa2197ff4efa11 | [
"MIT"
] | 9 | 2021-03-19T08:05:00.000Z | 2022-03-12T00:15:53.000Z | mood_sense/serializers.py | D-Denysenko/health-app | 18d1e9c492fb00694e1987a6cdaa2197ff4efa11 | [
"MIT"
] | null | null | null | from rest_framework import serializers

from .models import Mood


class MoodSerializer(serializers.ModelSerializer):
    class Meta:
        model = Mood
        fields = ['profile', 'characteristic', 'latitude', 'longitude', 'image', 'location']
        read_only_fields = ['latitude', 'longitude']
| 23.307692 | 92 | 0.686469 | 29 | 303 | 7.068966 | 0.724138 | 0.165854 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.19802 | 303 | 12 | 93 | 25.25 | 0.843621 | 0 | 0 | 0 | 0 | 0 | 0.225166 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.285714 | 0 | 0.571429 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
3dbaf6caeb51e514bda230b2abe9f5f3e8537dce | 974 | py | Python | tests/test_address_book.py | kibernick/pycontacts | 9ec7653cdea582b242a6d5f314b4d0c4bb92dd39 | [
"MIT"
] | null | null | null | tests/test_address_book.py | kibernick/pycontacts | 9ec7653cdea582b242a6d5f314b4d0c4bb92dd39 | [
"MIT"
] | null | null | null | tests/test_address_book.py | kibernick/pycontacts | 9ec7653cdea582b242a6d5f314b4d0c4bb92dd39 | [
"MIT"
] | null | null | null | from pycontacts import AddressBook
from pycontacts.models import Person
from pycontacts.managers import (
    EmailAddressManager,
    GroupManager,
    PhoneNumberManager,
    PersonManager,
    StreetAddressManager,
)


def test_create_book():
    book = AddressBook()
    assert book._store is not None
    assert isinstance(book._store, dict)


def test_create_person_model_class():
    book = AddressBook()
    p = book.persons.create()
    assert isinstance(p, Person)
    assert p.book is not None
    assert isinstance(p.book, AddressBook)
    assert p.book._store is book._store


def test_create_book_with_managers(address_book):
    assert isinstance(address_book.email_addresses, EmailAddressManager)
    assert isinstance(address_book.groups, GroupManager)
    assert isinstance(address_book.phone_numbers, PhoneNumberManager)
    assert isinstance(address_book.persons, PersonManager)
    assert isinstance(address_book.street_addresses, StreetAddressManager)
| 29.515152 | 74 | 0.776181 | 110 | 974 | 6.663636 | 0.318182 | 0.174625 | 0.15689 | 0.184175 | 0.068213 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.157084 | 974 | 32 | 75 | 30.4375 | 0.892814 | 0 | 0 | 0.076923 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.423077 | 1 | 0.115385 | false | 0 | 0.115385 | 0 | 0.230769 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
3dc696f09fb0ebe8bc4f7011c19473f98ca4f506 | 335 | py | Python | tango_with_django_project/rango/admin.py | DADDYKIKI/tango_with_django_project | da2bbb0b7fd2d587c9af4c7ac14068678b2c38cf | [
"MIT"
] | null | null | null | tango_with_django_project/rango/admin.py | DADDYKIKI/tango_with_django_project | da2bbb0b7fd2d587c9af4c7ac14068678b2c38cf | [
"MIT"
] | null | null | null | tango_with_django_project/rango/admin.py | DADDYKIKI/tango_with_django_project | da2bbb0b7fd2d587c9af4c7ac14068678b2c38cf | [
"MIT"
] | null | null | null | from django.contrib import admin

from rango.models import Category, Page


class CategoryAdmin(admin.ModelAdmin):
    prepopulated_fields = {'slug': ('name',)}


class PageAdmin(admin.ModelAdmin):
    list_display = ('title', 'category', 'url')


# Register the models together with their admin classes; the original
# registered them bare, which left CategoryAdmin and PageAdmin unused.
admin.site.register(Page, PageAdmin)
admin.site.register(Category, CategoryAdmin)
| 22.333333 | 45 | 0.668657 | 36 | 335 | 6.166667 | 0.638889 | 0.081081 | 0.117117 | 0.189189 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.208955 | 335 | 14 | 46 | 23.928571 | 0.837736 | 0 | 0 | 0 | 0 | 0 | 0.074766 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.222222 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
3dcde3d12d8ff748623472b864c1c6d69f5873ea | 1,462 | py | Python | plugins/playbook/deploy_cluster/decapod_plugin_playbook_deploy_cluster/monitor_secret.py | angry-tony/ceph-lcm-decapod | 535944d3ee384c3a7c4af82f74041b0a7792433f | [
"Apache-2.0"
] | 41 | 2016-11-03T16:40:17.000Z | 2019-05-23T08:39:17.000Z | plugins/playbook/deploy_cluster/decapod_plugin_playbook_deploy_cluster/monitor_secret.py | Mirantis/ceph-lcm | fad9bad0b94f2ef608362953583b10a54a841d24 | [
"Apache-2.0"
] | 30 | 2016-10-14T10:54:46.000Z | 2017-10-20T15:58:01.000Z | plugins/playbook/deploy_cluster/decapod_plugin_playbook_deploy_cluster/monitor_secret.py | angry-tony/ceph-lcm-decapod | 535944d3ee384c3a7c4af82f74041b0a7792433f | [
"Apache-2.0"
] | 28 | 2016-09-17T01:17:36.000Z | 2019-07-05T03:32:54.000Z | # -*- coding: utf-8 -*-
# Copyright (c) 2016 Mirantis Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Specified KV model for storing monitor secrets."""
import base64
import os
import struct
import time
from decapod_common.models import kv
class MonitorSecret(kv.KV):
NAMESPACE = "monitor_secret"
@classmethod
def upsert(cls, key, value):
return super().upsert(cls.NAMESPACE, key, value)
@classmethod
def find(cls, keys):
return super().find(cls.NAMESPACE, keys)
@classmethod
def find_one(cls, key):
models = cls.find([key])
if models:
return models[0]
@classmethod
def remove(cls, keys):
return super().remove(cls.NAMESPACE, keys)
def generate_monitor_secret():
key = os.urandom(16)
header = struct.pack("<hiih", 1, int(time.time()), 0, len(key))
secret = base64.b64encode(header + key)
secret = secret.decode("utf-8")
return secret
| 25.649123 | 69 | 0.685363 | 203 | 1,462 | 4.91133 | 0.541872 | 0.060181 | 0.026078 | 0.032096 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018166 | 0.209302 | 1,462 | 56 | 70 | 26.107143 | 0.844291 | 0.426813 | 0 | 0.148148 | 0 | 0 | 0.02934 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.185185 | false | 0 | 0.185185 | 0.111111 | 0.62963 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
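`generate_monitor_secret` packs a little-endian header (`<hiih`: a 2-byte version, 4-byte timestamp, a 4-byte zero field, and the 2-byte key length — my reading of the format, not documented in the source) in front of 16 random bytes and base64-encodes the result. A sketch that builds the same layout with a fixed key and decodes it back:

```python
import base64
import struct
import time

key = b'\x00' * 16  # fixed stand-in for os.urandom(16)
header = struct.pack("<hiih", 1, int(time.time()), 0, len(key))
secret = base64.b64encode(header + key).decode("utf-8")

# Decoding reverses the steps and recovers the header fields.
raw = base64.b64decode(secret)
version, _ts, _zero, key_len = struct.unpack("<hiih", raw[:struct.calcsize("<hiih")])
```

With the `<` prefix, `struct` uses little-endian order and no padding, so the header is always `struct.calcsize("<hiih")` == 12 bytes and the raw key follows directly after it.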
3de9f24b49937335e24db781a7e382e77643515c | 568 | py | Python | zip_files.py | VladimirsHisamutdinovs/Advanced_Python_Operations | 509c219f70adcbe9b3dedd71bff819494bab9c83 | [
"Apache-2.0"
] | null | null | null | zip_files.py | VladimirsHisamutdinovs/Advanced_Python_Operations | 509c219f70adcbe9b3dedd71bff819494bab9c83 | [
"Apache-2.0"
] | null | null | null | zip_files.py | VladimirsHisamutdinovs/Advanced_Python_Operations | 509c219f70adcbe9b3dedd71bff819494bab9c83 | [
"Apache-2.0"
] | null | null | null | import zipfile
zip_file = zipfile.ZipFile("zip_archive.zip", "w")
zip_file.write("textfile_for_zip_01")
zip_file.write("textfile_for_zip_02")
zip_file.write("textfile_for_zip_03")
# print(zipfile.is_zipfile("zip_archive.zip"))
# zip_file = zipfile.ZipFile("zip_archive.zip")
# print(zip_file.namelist())
# print(zip_file.infolist())
# zip_info = zip_file.getinfo("textfile_for_zip_02")
# print(zip_info.file_size)
# print(zip_file.read("textfile_for_zip_01"))
zip_file.extract("textfile_for_zip_02")
zip_file.extractall()
zip_file.close() | 24.695652 | 53 | 0.748239 | 88 | 568 | 4.409091 | 0.261364 | 0.216495 | 0.216495 | 0.154639 | 0.505155 | 0.505155 | 0.175258 | 0 | 0 | 0 | 0 | 0.023576 | 0.103873 | 568 | 23 | 54 | 24.695652 | 0.738703 | 0.466549 | 0 | 0 | 0 | 0 | 0.335766 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
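The script above assumes the `textfile_for_zip_*` files already exist on disk. The same `zipfile` calls can be exercised against an in-memory buffer instead, which makes the write/read round trip easy to verify without touching the filesystem:

```python
import io
import zipfile

buf = io.BytesIO()
# Write an archive into the buffer instead of a file on disk.
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("hello.txt", "hello zip")

# Reopen the buffer for reading, mirroring the namelist()/read()
# calls shown (commented out) in the script above.
with zipfile.ZipFile(buf) as zf:
    names = zf.namelist()
    content = zf.read("hello.txt")
```

Using `with` also replaces the explicit `zip_file.close()` — the archive is finalized when the block exits.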
3deeb28e7a4a40609c5fe55751360abc1b88afba | 1,603 | py | Python | komposisjon/komposisjon/rektangler_kvadrater.py | knutsenfiksdal/Oving_8 | 4e5d3a358cfb9127509a86a61c9499f22da9eabc | [
"MIT"
] | null | null | null | komposisjon/komposisjon/rektangler_kvadrater.py | knutsenfiksdal/Oving_8 | 4e5d3a358cfb9127509a86a61c9499f22da9eabc | [
"MIT"
] | null | null | null | komposisjon/komposisjon/rektangler_kvadrater.py | knutsenfiksdal/Oving_8 | 4e5d3a358cfb9127509a86a61c9499f22da9eabc | [
"MIT"
] | null | null | null | class Rektangel:
    def __init__(self, start_x, start_y, bredde, hoyde):
        self.start_x = start_x
        self.start_y = start_y
        self.hoyde = hoyde
        self.bredde = bredde

    def areal(self):
        return self.bredde * self.hoyde

    # Changes width and height in such a way that the area stays the same
    def strekk(self, multiplikator):
        self.bredde *= multiplikator
        self.hoyde /= multiplikator

    def __str__(self):
        return f"Rektangel fra ({self.start_x}, {self.start_y}), " \
               f"bredde {self.bredde}, høyde {self.hoyde}"


# A square that uses composition and delegation
class Kvadrat:
    def __init__(self, start_x, start_y, storrelse):
        self.rektanglet = Rektangel(start_x, start_y, storrelse, storrelse)

    @property
    def bredde(self):
        return self.rektanglet.bredde

    @property
    def hoyde(self):
        return self.rektanglet.hoyde

    @bredde.setter
    def bredde(self, ny_bredde):
        self.rektanglet.bredde = ny_bredde
        self.rektanglet.hoyde = ny_bredde

    @hoyde.setter
    def hoyde(self, ny_hoyde):
        self.rektanglet.bredde = ny_hoyde
        self.rektanglet.hoyde = ny_hoyde

    def areal(self):
        return self.rektanglet.areal()

    # Delegate strekk as well; without this, kvadrat.strekk(6) below
    # raises AttributeError.
    def strekk(self, multiplikator):
        self.rektanglet.strekk(multiplikator)


if __name__ == "__main__":
    rektanglet = Rektangel(5, 5, 10, 5)
    print(rektanglet)
    print(rektanglet.areal())
    rektanglet.strekk(0.5)
    print(rektanglet)
    print(rektanglet.areal())

    kvadrat = Kvadrat(2, 2, 6)
    print(kvadrat)
    print(kvadrat.areal())
    kvadrat.strekk(6)
    print(kvadrat)
    print(kvadrat.areal())
| 26.278689 | 75 | 0.646288 | 201 | 1,603 | 4.965174 | 0.233831 | 0.112224 | 0.04008 | 0.04509 | 0.284569 | 0.178357 | 0.046092 | 0 | 0 | 0 | 0 | 0.009144 | 0.249532 | 1,603 | 60 | 76 | 26.716667 | 0.820449 | 0.068621 | 0 | 0.26087 | 0 | 0 | 0.065101 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.217391 | false | 0 | 0 | 0.108696 | 0.369565 | 0.173913 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
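`Kvadrat` forwards each attribute to its wrapped `Rektangel` through an explicit property pair. An alternative sketch (English class names, hypothetical, not from the exercise) delegates every unknown attribute in one place with `__getattr__`:

```python
class Rectangle:
    def __init__(self, x, y, width, height):
        self.x, self.y, self.width, self.height = x, y, width, height

    def area(self):
        return self.width * self.height


class Square:
    def __init__(self, x, y, size):
        self._rect = Rectangle(x, y, size, size)

    def __getattr__(self, name):
        # Called only when normal attribute lookup fails, so area()
        # and the coordinate fields all fall through to the Rectangle.
        return getattr(self._rect, name)


sq = Square(0, 0, 4)
```

Unlike the property pair in `Kvadrat`, this only covers reads; keeping width and height in sync on assignment still needs explicit setters (or a custom `__setattr__`).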
3df076848f2032b90ec31c8b5ee8c64134fd5e5c | 1,579 | py | Python | lunch/admin.py | KrzysztofSakowski/lunch-crawler | 6a93d6cfad634fb98f89bc22d68547801865c9ae | [
"Apache-2.0"
] | 1 | 2020-02-17T13:40:08.000Z | 2020-02-17T13:40:08.000Z | lunch/admin.py | KrzysztofSakowski/lunch-crawler | 6a93d6cfad634fb98f89bc22d68547801865c9ae | [
"Apache-2.0"
] | 4 | 2020-02-11T23:06:14.000Z | 2021-06-10T18:07:30.000Z | lunch/admin.py | KrzysztofSakowski/lunch-crawler | 6a93d6cfad634fb98f89bc22d68547801865c9ae | [
"Apache-2.0"
] | null | null | null | from django.contrib import admin

from .models import MenuFacebook, MenuEmail, UserProfile, Occupation, FacebookRestaurant, EmailRestaurant


class MenuBaseAdmin(admin.ModelAdmin):
    list_display = ('id', 'format_date', 'is_lunch', 'message')
    list_filter = ('created_date', 'is_lunch')
    list_editable = ('is_lunch',)
    ordering = ['-created_date']

    def format_date(self, obj):
        return obj.created_date.strftime('%Y-%m-%d, %R')


class RestaurantAdmin(admin.ModelAdmin):
    list_display = ('id', 'name')


class UserProfileInline(admin.StackedInline):
    model = UserProfile.restaurants.through


class UserProfileAdmin(admin.ModelAdmin):
    inlines = (UserProfileInline,)
    list_display = ('name', 'restaurants_list',)

    def get_inline_instances(self, request, obj=None):
        if not obj:
            return []
        return super(UserProfileAdmin, self).get_inline_instances(request, obj)

    def name(self, obj):
        return obj.user.username

    def restaurants_list(self, obj):
        return "\n".join([a.name for a in obj.restaurants.all()])


class SeatAdmin(admin.ModelAdmin):
    list_display = ('id', 'restaurant', 'seats_taken', 'seats_total', 'date_declared')

    def restaurant(self, obj):
        return obj.restaurant.name


admin.site.register(FacebookRestaurant, RestaurantAdmin)
admin.site.register(EmailRestaurant, RestaurantAdmin)
admin.site.register(MenuFacebook, MenuBaseAdmin)
admin.site.register(MenuEmail, MenuBaseAdmin)
admin.site.register(UserProfile, UserProfileAdmin)
admin.site.register(Occupation, SeatAdmin)
| 29.240741 | 105 | 0.723243 | 177 | 1,579 | 6.322034 | 0.389831 | 0.048257 | 0.091153 | 0.069705 | 0.075067 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.155161 | 1,579 | 53 | 106 | 29.792453 | 0.838831 | 0 | 0 | 0 | 0 | 0 | 0.098797 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.147059 | false | 0 | 0.058824 | 0.117647 | 0.794118 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 |
3df45b763adea0ed603bc91664b6febfe07b4afe | 1,920 | py | Python | src/yafowil/tests/__init__.py | 2silver/yafowil | b9776503f98f145f7aaaa4f61b73e238c92c534c | [
"BSD-3-Clause"
] | 8 | 2015-12-15T21:14:00.000Z | 2019-11-11T22:13:18.000Z | src/yafowil/tests/__init__.py | 2silver/yafowil | b9776503f98f145f7aaaa4f61b73e238c92c534c | [
"BSD-3-Clause"
] | 21 | 2015-11-21T10:12:12.000Z | 2021-06-03T06:51:53.000Z | src/yafowil/tests/__init__.py | 2silver/yafowil | b9776503f98f145f7aaaa4f61b73e238c92c534c | [
"BSD-3-Clause"
] | 5 | 2016-11-23T13:41:52.000Z | 2020-06-08T18:21:00.000Z | from __future__ import print_function

from node.tests import NodeTestCase
from yafowil.base import factory
from yafowil.compat import IS_PY2

import lxml.etree as etree
import sys
import unittest
import yafowil.common
import yafowil.compound
import yafowil.persistence
import yafowil.table


if not IS_PY2:
    from importlib import reload


class YafowilTestCase(NodeTestCase):

    def setUp(self):
        super(YafowilTestCase, self).setUp()
        factory.clear()
        reload(yafowil.persistence)
        reload(yafowil.common)
        reload(yafowil.compound)
        reload(yafowil.table)


def fxml(xml):
    et = etree.fromstring(xml)
    return etree.tostring(et, pretty_print=True).decode('utf-8')


def pxml(xml):
    print(fxml(xml))


def test_suite():
    from yafowil.tests import test_base
    from yafowil.tests import test_common
    from yafowil.tests import test_compound
    from yafowil.tests import test_controller
    from yafowil.tests import test_persistence
    from yafowil.tests import test_resources
    from yafowil.tests import test_table
    from yafowil.tests import test_tsf
    from yafowil.tests import test_utils

    suite = unittest.TestSuite()
    suite.addTest(unittest.findTestCases(test_base))
    suite.addTest(unittest.findTestCases(test_common))
    suite.addTest(unittest.findTestCases(test_compound))
    suite.addTest(unittest.findTestCases(test_controller))
    suite.addTest(unittest.findTestCases(test_persistence))
    suite.addTest(unittest.findTestCases(test_resources))
    suite.addTest(unittest.findTestCases(test_table))
    suite.addTest(unittest.findTestCases(test_tsf))
    suite.addTest(unittest.findTestCases(test_utils))
    return suite


def run_tests():
    from zope.testrunner.runner import Runner

    runner = Runner(found_suites=[test_suite()])
    runner.run()
    sys.exit(int(runner.failed))


if __name__ == '__main__':
    run_tests()
| 25.945946 | 64 | 0.752083 | 240 | 1,920 | 5.854167 | 0.275 | 0.086121 | 0.102491 | 0.140925 | 0.403559 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00187 | 0.164583 | 1,920 | 73 | 65 | 26.30137 | 0.874065 | 0 | 0 | 0 | 0 | 0 | 0.006771 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.092593 | false | 0 | 0.407407 | 0 | 0.555556 | 0.055556 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
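`test_suite` above aggregates per-module cases with `unittest.findTestCases`, a helper that is deprecated in recent Python versions. A self-contained sketch of the same aggregation using the `TestLoader` API instead:

```python
import unittest


class DemoTest(unittest.TestCase):
    def test_truth(self):
        self.assertTrue(True)


def demo_suite():
    # Collect cases into one suite, as test_suite() does per module;
    # loadTestsFromTestCase replaces the deprecated findTestCases.
    suite = unittest.TestSuite()
    suite.addTest(unittest.TestLoader().loadTestsFromTestCase(DemoTest))
    return suite


result = unittest.TextTestRunner().run(demo_suite())
```

For whole modules, `TestLoader().loadTestsFromModule(module)` is the corresponding replacement.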
9a7cfcbc63f3c97c82737bfbbfa13e26624618e7 | 214 | py | Python | src/librhc/cost/__init__.py | arnavthareja/mushr_pixelart_mpc | db6ee6cae9b4cb1d3b213fed06690074372c824b | [
"BSD-3-Clause"
] | 5 | 2019-08-30T08:20:27.000Z | 2021-08-01T17:16:16.000Z | src/librhc/cost/__init__.py | arnavthareja/mushr_pixelart_mpc | db6ee6cae9b4cb1d3b213fed06690074372c824b | [
"BSD-3-Clause"
] | 1 | 2020-09-09T13:38:08.000Z | 2020-12-15T12:20:26.000Z | src/librhc/cost/__init__.py | arnavthareja/mushr_pixelart_mpc | db6ee6cae9b4cb1d3b213fed06690074372c824b | [
"BSD-3-Clause"
] | 4 | 2019-09-14T21:26:09.000Z | 2021-08-27T23:01:41.000Z | # Copyright (c) 2019, The Personal Robotics Lab, The MuSHR Team, The Contributors of MuSHR
# License: BSD 3-Clause. See LICENSE.md file in root directory.
from .waypoints import Waypoints
__all__ = ["Waypoints"]
| 30.571429 | 90 | 0.757009 | 31 | 214 | 5.096774 | 0.806452 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.027778 | 0.158879 | 214 | 6 | 91 | 35.666667 | 0.85 | 0.700935 | 0 | 0 | 0 | 0 | 0.147541 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
9a9f85fc451de9881426ccefc8e13f03669bb8d6 | 491 | py | Python | cosmogrb/utils/fits_file.py | wematthias/cosmogrb | 09852eb4e6e7315bbede507e19a2d57f1b927c3f | [
"BSD-2-Clause"
] | 3 | 2020-03-08T18:20:32.000Z | 2022-03-10T17:27:26.000Z | cosmogrb/utils/fits_file.py | wematthias/cosmogrb | 09852eb4e6e7315bbede507e19a2d57f1b927c3f | [
"BSD-2-Clause"
] | 11 | 2020-03-04T17:21:15.000Z | 2020-06-09T12:20:00.000Z | cosmogrb/utils/fits_file.py | wematthias/cosmogrb | 09852eb4e6e7315bbede507e19a2d57f1b927c3f | [
"BSD-2-Clause"
] | 5 | 2020-03-18T18:05:05.000Z | 2022-03-21T16:06:38.000Z | from responsum.utils.fits_file import FITSFile, FITSExtension as FE
import pkg_resources


class FITSExtension(FE):
    # I use __new__ instead of __init__ because I need to use the classmethod .from_columns instead of the
    # constructor of fits.BinTableHDU
    def __init__(self, data_tuple, header_tuple):
        creator = "COSMOGRB v.%s" % (pkg_resources.get_distribution("cosmogrb").version)
        super(FITSExtension, self).__init__(data_tuple, header_tuple, creator=creator)
| 32.733333 | 106 | 0.757637 | 65 | 491 | 5.338462 | 0.615385 | 0.069164 | 0.086455 | 0.115274 | 0.15562 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.164969 | 491 | 14 | 107 | 35.071429 | 0.846341 | 0.268839 | 0 | 0 | 0 | 0 | 0.058989 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.333333 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
9aa920f4f30751f1feef1f340c733399005558c4 | 1,235 | py | Python | venv/lib/python3.9/site-packages/py2app/recipes/PIL/prescript.py | dequeb/asmbattle | 27e8b209de5787836e288a2f2f9b7644ce07563e | [
"MIT"
] | 193 | 2020-01-15T09:34:20.000Z | 2022-03-18T19:14:16.000Z | venv/lib/python3.9/site-packages/py2app/recipes/PIL/prescript.py | dequeb/asmbattle | 27e8b209de5787836e288a2f2f9b7644ce07563e | [
"MIT"
] | 185 | 2020-01-15T08:38:27.000Z | 2022-03-27T17:29:29.000Z | venv/lib/python3.9/site-packages/py2app/recipes/PIL/prescript.py | dequeb/asmbattle | 27e8b209de5787836e288a2f2f9b7644ce07563e | [
"MIT"
] | 23 | 2020-01-24T14:47:18.000Z | 2022-02-22T17:19:47.000Z | def _recipes_pil_prescript(plugins):
    try:
        import Image
        have_PIL = False
    except ImportError:
        from PIL import Image
        have_PIL = True

    import sys

    def init():
        if Image._initialized >= 2:
            return
        if have_PIL:
            try:
                import PIL.JpegPresets
                sys.modules["JpegPresets"] = PIL.JpegPresets
            except ImportError:
                pass
        for plugin in plugins:
            try:
                if have_PIL:
                    try:
                        # First try absolute import through PIL (for
                        # Pillow support) only then try relative imports
                        m = __import__("PIL." + plugin, globals(), locals(), [])
                        m = getattr(m, plugin)
                        sys.modules[plugin] = m
                        continue
                    except ImportError:
                        pass
                __import__(plugin, globals(), locals(), [])
            except ImportError:
                print("Image: failed to import")
        if Image.OPEN or Image.SAVE:
            Image._initialized = 2
            return 1

    Image.init = init
| 26.276596 | 80 | 0.460729 | 110 | 1,235 | 5.018182 | 0.427273 | 0.050725 | 0.054348 | 0.065217 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004608 | 0.472874 | 1,235 | 46 | 81 | 26.847826 | 0.843318 | 0.072065 | 0 | 0.352941 | 0 | 0 | 0.033246 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.058824 | false | 0.058824 | 0.323529 | 0 | 0.441176 | 0.029412 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 2 |
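The recipe's loop depends on a subtlety of `__import__`: with an empty `fromlist`, importing `"PIL." + plugin` returns the top-level `PIL` package, which is why the submodule is then fetched with `getattr`. The same behavior can be shown with a stdlib package:

```python
import os

# With an empty fromlist, __import__ returns the top-level package,
# not the submodule that was named.
m = __import__("os.path", globals(), locals(), [])

# The submodule has to be pulled off the package explicitly, which is
# exactly what the recipe's getattr(m, plugin) does.
submodule = getattr(m, "path")
```

Passing a non-empty `fromlist` (e.g. `["path"]`) would make `__import__` return the submodule directly, but the recipe keeps the two-step form.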
9acbf669f84ad525253b32c114c4e395b93adc19 | 3,488 | py | Python | open-hackathon-tempUI/src/hackathon/config-sample.py | SpAiNiOr/LABOSS | 32ad341821e9f30fecfa338b5669f574d32dd0fa | [
"Apache-2.0"
] | null | null | null | open-hackathon-tempUI/src/hackathon/config-sample.py | SpAiNiOr/LABOSS | 32ad341821e9f30fecfa338b5669f574d32dd0fa | [
"Apache-2.0"
] | null | null | null | open-hackathon-tempUI/src/hackathon/config-sample.py | SpAiNiOr/LABOSS | 32ad341821e9f30fecfa338b5669f574d32dd0fa | [
"Apache-2.0"
] | null | null | null | # "javascript" section for javascript. see @app.route('/config.js') in app/views.py
# oauth constants
HOSTNAME = "http://hackathon.chinacloudapp.cn" # host name of the UI site
QQ_OAUTH_STATE = "openhackathon" # todo: the state should not be a constant; it should be unguessable to prevent CSRF
HACkATHON_API_ENDPOINT = "http://hackathon.chinacloudapp.cn:15000"
Config = {
    "environment": "local",
    "login": {
        "github": {
            "access_token_url": 'https://github.com/login/oauth/access_token?client_id=a10e2290ed907918d5ab&client_secret=5b240a2a1bed6a6cf806fc2f34eb38a33ce03d75&redirect_uri=%s/github&code=' % HOSTNAME,
            "user_info_url": 'https://api.github.com/user?access_token=',
            "emails_info_url": 'https://api.github.com/user/emails?access_token='
        },
        "qq": {
            "access_token_url": 'https://graph.qq.com/oauth2.0/token?grant_type=authorization_code&client_id=101192358&client_secret=d94f8e7baee4f03371f52d21c4400cab&redirect_uri=%s/qq&code=' % HOSTNAME,
            "openid_url": 'https://graph.qq.com/oauth2.0/me?access_token=',
            "user_info_url": 'https://graph.qq.com/user/get_user_info?access_token=%s&oauth_consumer_key=%s&openid=%s'
        },
        "gitcafe": {
            "access_token_url": 'https://api.gitcafe.com/oauth/token?client_id=25ba4f6f90603bd2f3d310d11c0665d937db8971c8a5db00f6c9b9852547d6b8&client_secret=e3d821e82d15096054abbc7fbf41727d3650cab6404a242373f5c446c0918634&redirect_uri=%s/gitcafe&grant_type=authorization_code&code=' % HOSTNAME
        },
        "provider_enabled": ["github", "qq", "gitcafe"],
        "session_minutes": 60,
        "token_expiration_minutes": 60 * 24
    },
    "hackathon-api": {
        "endpoint": HACkATHON_API_ENDPOINT
    },
    "javascript": {
        "renren": {
            "clientID": "client_id=7e0932f4c5b34176b0ca1881f5e88562",
            "redirect_url": "redirect_uri=%s/renren" % HOSTNAME,
            "scope": "scope=read_user_message+read_user_feed+read_user_photo",
            "response_type": "response_type=token",
        },
        "github": {
            "clientID": "client_id=a10e2290ed907918d5ab",
            "redirect_uri": "redirect_uri=%s/github" % HOSTNAME,
            "scope": "scope=user",
        },
        "google": {
            "clientID": "client_id=304944766846-7jt8jbm39f1sj4kf4gtsqspsvtogdmem.apps.googleusercontent.com",
            "redirect_url": "redirect_uri=%s/google" % HOSTNAME,
            "scope": "scope=https://www.googleapis.com/auth/userinfo.profile+https://www.googleapis.com/auth/userinfo.email",
            "response_type": "response_type=token",
        },
        "qq": {
            "clientID": "client_id=101192358",
            "redirect_uri": "redirect_uri=%s/qq" % HOSTNAME,
            "scope": "scope=get_user_info",
            "state": "state=%s" % QQ_OAUTH_STATE,
            "response_type": "response_type=code",
        },
        "gitcafe": {
            "clientID": "client_id=25ba4f6f90603bd2f3d310d11c0665d937db8971c8a5db00f6c9b9852547d6b8",
            "clientSecret": "client_secret=e3d821e82d15096054abbc7fbf41727d3650cab6404a242373f5c446c0918634",
            "redirect_uri": "redirect_uri=http://hackathon.chinacloudapp.cn/gitcafe",
            "response_type": "response_type=code",
            "scope": "scope=public"
        },
        "hackathon": {
            "name": "open-xml-sdk",
            "endpoint": HACkATHON_API_ENDPOINT
        }
    }
}
| 48.444444 | 294 | 0.648222 | 346 | 3,488 | 6.300578 | 0.315029 | 0.055505 | 0.038532 | 0.044037 | 0.261468 | 0.078899 | 0.048624 | 0 | 0 | 0 | 0 | 0.114754 | 0.213016 | 3,488 | 71 | 295 | 49.126761 | 0.679417 | 0.0582 | 0 | 0.1875 | 0 | 0.078125 | 0.64144 | 0.137321 | 0 | 0 | 0 | 0.014085 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
9ad97cd25d6ffe7ca83c1fced680d4dc39e56290 | 1,642 | py | Python | api/serializers.py | mariomtzjr/podemos_test | 5efaf02a19aa8c4849e3ad0108546e95af524126 | [
"MIT"
] | null | null | null | api/serializers.py | mariomtzjr/podemos_test | 5efaf02a19aa8c4849e3ad0108546e95af524126 | [
"MIT"
] | null | null | null | api/serializers.py | mariomtzjr/podemos_test | 5efaf02a19aa8c4849e3ad0108546e95af524126 | [
"MIT"
] | null | null | null | from rest_framework import serializers

from apps.calendarioPago.models import CalendarioPago
from apps.cliente.models import Cliente
from apps.cuenta.models import Cuenta
from apps.grupo.models import Grupo
from apps.miembro.models import Miembro
from apps.transaccion.models import Transaccion


# Serializers define the API representation.
class CalendarioPagoSerializer(serializers.ModelSerializer):
    class Meta:
        model = CalendarioPago
        fields = ['id', 'cuenta_id', 'num_pago', 'monto', 'fecha_pago', 'estatus', ]


class ClienteSerializer(serializers.ModelSerializer):
    class Meta:
        model = Cliente
        fields = ['id', 'nombre', ]


class MiembroSerializer(serializers.ModelSerializer):
    cliente = ClienteSerializer(source='cliente_id', read_only=True)

    class Meta:
        model = Miembro
        fields = ['cliente']


class MiembrosSerializer(serializers.ModelSerializer):
    class Meta:
        model = Miembro
        fields = ['id', 'grupo_id', 'cliente_id']


class GrupoSerializer(serializers.ModelSerializer):
    miembros = MiembroSerializer(many=True)

    class Meta:
        model = Grupo
        fields = ['id', 'nombre', 'miembros']


class GruposSerializer(serializers.ModelSerializer):
    class Meta:
        model = Grupo
        fields = ['id', 'nombre', ]


class TransaccionSerializer(serializers.ModelSerializer):
    class Meta:
        model = Transaccion
        fields = ['id', 'cuenta_id', 'fecha', 'monto', ]


class CuentaSerializer(serializers.ModelSerializer):
    class Meta:
        model = Cuenta
        fields = ['id', 'grupo_id', 'estatus', 'monto', 'saldo']
| 26.483871 | 84 | 0.694275 | 163 | 1,642 | 6.932515 | 0.269939 | 0.184071 | 0.099115 | 0.185841 | 0.293805 | 0.058407 | 0.058407 | 0 | 0 | 0 | 0 | 0 | 0.200974 | 1,642 | 61 | 85 | 26.918033 | 0.86128 | 0.025579 | 0 | 0.341463 | 0 | 0 | 0.098874 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.170732 | 0 | 0.609756 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
9ae66ae64bed27a4c419e21d360710c58e9c3114 | 1,589 | py | Python | turbinia/workers/fsstat.py | dfjxs/turbinia | 23a97d9d826cbcc51e6b5dfd50d85251506bf242 | [
"Apache-2.0"
] | 1 | 2021-05-31T19:44:50.000Z | 2021-05-31T19:44:50.000Z | turbinia/workers/fsstat.py | dfjxs/turbinia | 23a97d9d826cbcc51e6b5dfd50d85251506bf242 | [
"Apache-2.0"
] | null | null | null | turbinia/workers/fsstat.py | dfjxs/turbinia | 23a97d9d826cbcc51e6b5dfd50d85251506bf242 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
# Copyright 2021 Google Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Task to run fsstat on disk partitions."""
from __future__ import unicode_literals
import os
from turbinia import TurbiniaException
from turbinia.workers import TurbiniaTask
from turbinia.evidence import EvidenceState as state
from turbinia.evidence import ReportText
class FsstatTask(TurbiniaTask):

  REQUIRED_STATES = [state.ATTACHED]

  def run(self, evidence, result):
    """Task to execute fsstat.

    Args:
      evidence (Evidence object): The evidence we will process.
      result (TurbiniaTaskResult): The object to place task results into.

    Returns:
      TurbiniaTaskResult object.
    """
    fsstat_output = os.path.join(self.output_dir, 'fsstat.txt')
    output_evidence = ReportText(source_path=fsstat_output)
    cmd = ['sudo', 'fsstat', evidence.device_path]
    result.log('Running fsstat as [{0!s}]'.format(cmd))
    self.execute(
        cmd, result, stdout_file=fsstat_output, new_evidence=[output_evidence],
        close=True)
    return result | 33.104167 | 79 | 0.733166 | 213 | 1,589 | 5.394366 | 0.568075 | 0.052219 | 0.022628 | 0.02785 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007686 | 0.181246 | 1,589 | 48 | 80 | 33.104167 | 0.87548 | 0.514789 | 0 | 0 | 0 | 0 | 0.062241 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.058824 | false | 0 | 0.352941 | 0 | 0.588235 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
9aedf1a23d553278d5b929adc837502da68eda10 | 356 | py | Python | mayan/apps/mimetype/apps.py | eshbeata/open-paperless | 6b9ed1f21908116ad2795b3785b2dbd66713d66e | [
"Apache-2.0"
] | 2,743 | 2017-12-18T07:12:30.000Z | 2022-03-27T17:21:25.000Z | mayan/apps/mimetype/apps.py | eshbeata/open-paperless | 6b9ed1f21908116ad2795b3785b2dbd66713d66e | [
"Apache-2.0"
] | 15 | 2017-12-18T14:58:07.000Z | 2021-03-01T20:05:05.000Z | mayan/apps/mimetype/apps.py | eshbeata/open-paperless | 6b9ed1f21908116ad2795b3785b2dbd66713d66e | [
"Apache-2.0"
] | 257 | 2017-12-18T03:12:58.000Z | 2022-03-25T08:59:10.000Z | from __future__ import unicode_literals
from django.utils.translation import ugettext_lazy as _
from common import MayanAppConfig
from .licenses import * # NOQA
class MIMETypesApp(MayanAppConfig):
    name = 'mimetype'
    verbose_name = _('MIME types')

    def ready(self, *args, **kwargs):
        super(MIMETypesApp, self).ready(*args, **kwargs)
| 22.25 | 56 | 0.727528 | 41 | 356 | 6.097561 | 0.682927 | 0.08 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.176966 | 356 | 15 | 57 | 23.733333 | 0.853242 | 0.011236 | 0 | 0 | 0 | 0 | 0.051429 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.444444 | 0 | 0.888889 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
9aefb8bc9120b71f8727047442cac13c02b21950 | 388 | py | Python | test/level.py | Matt-London/command-line-tutorial | 5b6afeedb4075de114e8c91756ecf3a03645fde7 | [
"MIT"
] | 1 | 2020-07-11T06:29:25.000Z | 2020-07-11T06:29:25.000Z | test/level.py | Matt-London/Command-Line-Tutorial | 5b6afeedb4075de114e8c91756ecf3a03645fde7 | [
"MIT"
] | 15 | 2020-07-10T20:01:51.000Z | 2020-08-10T05:23:47.000Z | test/level.py | Matt-London/command-line-tutorial | 5b6afeedb4075de114e8c91756ecf3a03645fde7 | [
"MIT"
] | null | null | null | from packages.levels.Level import Level
import packages.levels.levels as Levels
import packages.resources.functions as function
import packages.resources.variables as var
from packages.filesystem.Directory import Directory
from packages.filesystem.File import File
var.bash_history = ("Check")
test = Level("Instruct", "Help", ("Check"))
test.instruct()
test.help()
print(test.check()) | 27.714286 | 51 | 0.796392 | 52 | 388 | 5.923077 | 0.403846 | 0.116883 | 0.149351 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.095361 | 388 | 14 | 52 | 27.714286 | 0.877493 | 0 | 0 | 0 | 0 | 0 | 0.056555 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.545455 | 0 | 0.545455 | 0.090909 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
9af36b234d70f262e1618ab3933e4d7b9aedd9f4 | 2,760 | py | Python | scraper/models.py | mrcnc/assessor-scraper | b502ebb157048d20294ca44ab0d30e3a44d86c08 | [
"MIT"
] | null | null | null | scraper/models.py | mrcnc/assessor-scraper | b502ebb157048d20294ca44ab0d30e3a44d86c08 | [
"MIT"
] | null | null | null | scraper/models.py | mrcnc/assessor-scraper | b502ebb157048d20294ca44ab0d30e3a44d86c08 | [
"MIT"
] | 1 | 2019-02-14T04:01:40.000Z | 2019-02-14T04:01:40.000Z | # -*- coding: utf-8 -*-
import os
import logging
from sqlalchemy import create_engine, Column, Integer, String, ForeignKey
from sqlalchemy.engine.url import URL
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import relationship
from scraper import settings
Base = declarative_base()
def db_connect():
    """
    Returns sqlalchemy engine instance
    """
    if 'DATABASE_URL' in os.environ:
        DATABASE_URL = os.environ['DATABASE_URL']
        logging.debug("Connecting to %s", DATABASE_URL)
    else:
        DATABASE_URL = URL(**settings.DATABASE)
        logging.debug("Connecting with settings %s", DATABASE_URL)
    return create_engine(DATABASE_URL)


def create_tables(engine):
    Base.metadata.create_all(engine)


class Property(Base):
    __tablename__ = 'properties'

    id = Column(Integer, primary_key=True)
    property_key = Column(String, nullable=False)
    todays_date = Column(String)
    location = Column(String)
    owner_name = Column(String)
    mailing_address = Column(String)
    municipal_district = Column(String)
    location_address = Column(String)
    tax_bill_number = Column(String)
    property_class = Column(String)
    special_tax_district = Column(String)
    subdivision_name = Column(String)
    land_area_sq_ft = Column(String)
    zoning_district = Column(String)
    building_area_sq_ft = Column(String)
    square = Column(String)
    lot = Column(String)
    book = Column(String)
    folio = Column(String)
    line = Column(String)
    parcel_map = Column(String)
    legal_description = Column(String)
    assessment_area = Column(String)
    values = relationship('PropertyValue')
    transfers = relationship('PropertyTransfer')


class PropertyValue(Base):
    __tablename__ = 'property_values'

    id = Column(Integer, primary_key=True)
    property_id = Column(Integer, ForeignKey('properties.id'))
    year = Column(String)
    land_value = Column(String)
    building_value = Column(String)
    total_value = Column(String)
    assessed_land_value = Column(String)
    assessed_building_value = Column(String)
    total_assessed_value = Column(String)
    homestead_exemption_value = Column(String)
    taxable_assessment = Column(String)
    age_freeze = Column(String)
    disability_freeze = Column(String)
    assmnt_change = Column(String)
    tax_contract = Column(String)


class PropertyTransfer(Base):
    __tablename__ = 'property_transfers'

    id = Column(Integer, primary_key=True)
    property_id = Column(Integer, ForeignKey('properties.id'))
    sale_transfer_date = Column(String)
    price = Column(String)
    grantor = Column(String)
    grantee = Column(String)
    notarial_archive_number = Column(String)
    instrument_number = Column(String)
| 29.677419 | 73 | 0.721014 | 317 | 2,760 | 6.041009 | 0.334385 | 0.256919 | 0.062141 | 0.034465 | 0.148825 | 0.096606 | 0.096606 | 0.077285 | 0.077285 | 0.077285 | 0 | 0.000445 | 0.186232 | 2,760 | 92 | 74 | 30 | 0.852182 | 0.020652 | 0 | 0.069444 | 0 | 0 | 0.061407 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.027778 | false | 0 | 0.097222 | 0 | 0.888889 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
9afbc58c35485195590c0111ab875fa7190d1ec1 | 621 | py | Python | kesko_webapp/models.py | kounelisagis/kesko-food-waste-hackathon | 6b66806aeaf4fc72ea96e47f152cd4bbd8b5a43d | [
"MIT"
] | 1 | 2019-12-29T16:16:54.000Z | 2019-12-29T16:16:54.000Z | kesko_webapp/models.py | kounelisagis/kesko-food-waste-hackathon | 6b66806aeaf4fc72ea96e47f152cd4bbd8b5a43d | [
"MIT"
] | 14 | 2019-11-16T18:27:51.000Z | 2022-02-26T20:17:01.000Z | kesko_webapp/models.py | kounelisagis/kesko-food-waste-hackathon | 6b66806aeaf4fc72ea96e47f152cd4bbd8b5a43d | [
"MIT"
] | 8 | 2019-11-15T20:27:32.000Z | 2020-08-26T16:21:48.000Z | from django.conf import settings
from django.contrib.auth.models import AbstractUser
from django.db import models
class User(AbstractUser):
    pass


class Article(models.Model):
    """
    Models available articles
    """
    name = models.CharField(max_length=200)
    serial_id = models.CharField(max_length=200)


class ArticlePurchase(models.Model):
    """
    Models users purchases.
    """
    article = models.ForeignKey(Article, on_delete=models.CASCADE)
    user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
    amount = models.IntegerField()
    date = models.DateTimeField()
| 23.884615 | 80 | 0.727858 | 73 | 621 | 6.09589 | 0.479452 | 0.067416 | 0.076404 | 0.107865 | 0.121348 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011673 | 0.172303 | 621 | 25 | 81 | 24.84 | 0.854086 | 0.078905 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.076923 | 0.230769 | 0 | 0.923077 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
b103796b9eb62b2e02e96ca3c1828f5ebc3886b8 | 3,137 | py | Python | example/06-modules/modules.py | iten-engineering/python | 97a79973c7727cd881974462db99a99d612b55f9 | [
"MIT"
] | null | null | null | example/06-modules/modules.py | iten-engineering/python | 97a79973c7727cd881974462db99a99d612b55f9 | [
"MIT"
] | null | null | null | example/06-modules/modules.py | iten-engineering/python | 97a79973c7727cd881974462db99a99d612b55f9 | [
"MIT"
] | null | null | null | # =============================================================================
# Python examples - modules
# =============================================================================
# -----------------------------------------------------------------------------
# Module
# -----------------------------------------------------------------------------
# Module
# - In Python, definitions (functions, classes) can be moved out into a separate file (a module).
# - A module's definitions can be imported into other modules or into the main program and used there.
# - The file name corresponds to the module name plus the suffix ".py".
# - Inside the module, the module name is available via the internal variable "__name__".
import fibo
print("Fibo sample:")
fibo.print_fib(100)
result = fibo.fib(100)
print(result)

print("Show module details:")
print(dir(fibo))
# -----------------------------------------------------------------------------
# Import
# -----------------------------------------------------------------------------
# Sample: `import module `
# - imports everything and keeps it in the module's namespace
# - module.func()
# - module.className.func()
# Sample: `from module import *`
# - imports everything under the current namespace
# - func()
# - className.func()
# > not recommended
# Sample: `from module import className`
# - selectively imports under the current namespace
# - className.func()
# - like standard modules: math, os, sys
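# As a quick, self-contained illustration of the first and third styles
# (using the standard math module here rather than a project module):

```python
# Style 1: import module - access everything through the module's namespace
import math
print(math.sqrt(16))   # 4.0

# Style 3: from module import name - selectively import into the current namespace
from math import sqrt, pi
print(sqrt(25))        # 5.0
print(round(pi, 2))    # 3.14

# Style 2 (from math import *) would dump every public name into the current
# namespace and can silently shadow existing names - hence "not recommended".
```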
# -----------------------------------------------------------------------------
# Import with custom name
# -----------------------------------------------------------------------------
# game.py
# import the draw module
# if visual_mode:
# # in visual mode, we draw using graphics
# import draw_visual as draw
# else:
# # in textual mode, we print out text
# import draw_textual as draw
#
# def main():
# result = play_game()
# # this can either be visual or textual depending on visual_mode
# draw.draw_game(result)
# -----------------------------------------------------------------------------
# Executing modules as scripts
# -----------------------------------------------------------------------------
# When you run a Python module with: python fibo.py <arguments>
# - the code in the module will be executed, just as if you imported it,
# - but with the __name__ set to "__main__".
# That means that by adding this code at the end of your module:
# if __name__ == "__main__":
# import sys
# fib(int(sys.argv[1]))
# you can make the file usable as a script as well as an importable module,
# because the code that parses the command line only runs if the module is executed as the “main” file!
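# A minimal sketch of the dual-use pattern described above (the mymod/greet
# names are illustrative only, not part of fibo.py):

```python
# mymod.py - usable both as a script and as an importable module
def greet(name):
    """Return a greeting; importers call this directly."""
    return "Hello, " + name

if __name__ == "__main__":
    # Runs only when executed directly: python mymod.py
    print(greet("world"))
# When imported (import mymod), __name__ == "mymod", so nothing is printed.
```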
# -----------------------------------------------------------------------------
# Further details
# -----------------------------------------------------------------------------
# Links:
# - https://docs.python.org/3/tutorial/modules.html
# - https://realpython.com/python-modules-packages/
# =============================================================================
# The end.
| 34.855556 | 115 | 0.476889 | 302 | 3,137 | 4.864238 | 0.5 | 0.01838 | 0.014976 | 0.029952 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002956 | 0.137392 | 3,137 | 89 | 116 | 35.247191 | 0.539911 | 0.907555 | 0 | 0 | 0 | 0 | 0.143498 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.142857 | 0.714286 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
b10a7ba9df13f93730fafc42256936a0555a720d | 7,034 | py | Python | autoelective/util.py | apomeloYM/PKUAutoElective | 21b4ab000919f68080e7a942ddff4ca070cf41e7 | [
"MIT"
] | null | null | null | autoelective/util.py | apomeloYM/PKUAutoElective | 21b4ab000919f68080e7a942ddff4ca070cf41e7 | [
"MIT"
] | null | null | null | autoelective/util.py | apomeloYM/PKUAutoElective | 21b4ab000919f68080e7a942ddff4ca070cf41e7 | [
"MIT"
] | 2 | 2020-02-07T04:02:14.000Z | 2020-02-16T23:34:16.000Z | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
# filename: util.py
import os
import csv
from copy import deepcopy
import hashlib
from functools import wraps
from itertools import repeat
from collections import OrderedDict

from ._compat import json, JSONDecodeError
from .exceptions import NoInstanceError, ImmutableTypeError, ReadonlyPropertyError
__Util_Funcs__ = ["mkdir","json_load","json_dump","read_csv","to_bytes","to_utf8","MD5","SHA1",]
__Util_Class__ = ["ImmutableAttrsMixin",]
__Util_Decorator__ = ["singleton","noinstance","ReadonlyProperty",]
__Util_MetaClass__ = ["Singleton","NoInstance",]
__all__ = __Util_Funcs__ + __Util_Class__ + __Util_Decorator__ + __Util_MetaClass__
def to_bytes(s):
    if isinstance(s, (str, int, float)):
        return str(s).encode("utf-8")
    elif isinstance(s, bytes):
        return s
    else:
        raise TypeError


def to_utf8(s):
    if isinstance(s, bytes):
        return s.decode("utf-8")
    elif isinstance(s, (str, int, float)):
        return str(s)
    else:
        raise TypeError
def MD5(data):
    return hashlib.md5(to_bytes(data)).hexdigest()


def SHA1(data):
    return hashlib.sha1(to_bytes(data)).hexdigest()


def mkdir(path):
    if not os.path.exists(path):
        os.mkdir(path)
def json_load(file, *args, **kwargs):
    if not file:
        return None
    elif not os.path.exists(file):
        return None
    else:
        with open(file, "r", encoding="utf-8-sig") as fp:
            try:
                return json.load(fp, *args, **kwargs)
            except JSONDecodeError:
                return None


def json_dump(obj, file, *args, **kwargs):
    with open(file, "w", encoding="utf-8") as fp:
        json.dump(obj, fp, *args, **kwargs)


def read_csv(file, encoding="utf-8-sig"):
    with open(file, "r", encoding=encoding, newline="") as fp:
        reader = csv.DictReader(fp)
        return [_ for _ in reader]
def noinstance(cls):
    """ The decorated class cannot be subclassed! """
    @wraps(cls)
    def wrapper(*args, **kwargs):
        raise NoInstanceError("class %s cannot be instantiated" % cls.__name__)
    return wrapper


def singleton(cls):
    """ The decorated class cannot be subclassed! """
    _inst = {}

    @wraps(cls)
    def get_inst(*args, **kwargs):
        if cls not in _inst:
            _inst[cls] = cls(*args, **kwargs)
        return _inst[cls]
    return get_inst
class Singleton(type):
    """
    Singleton Metaclass
    @link https://github.com/jhao104/proxy_pool/blob/428359c8dada998481f038dbdc8d3923e5850c0e/Util/utilClass.py
    """
    _inst = {}

    def __call__(cls, *args, **kwargs):
        if cls not in cls._inst:
            cls._inst[cls] = super(Singleton, cls).__call__(*args)
        return cls._inst[cls]


class NoInstance(type):

    def __call__(cls, *args, **kwargs):
        raise NoInstanceError("class %s cannot be instantiated" % cls.__name__)
def _is_readonly(obj, key):
    raise ReadonlyPropertyError("'%s.%s' property is read-only" % (obj.__class__.__name__, key))


class ReadonlyProperty(property):
    """ Read-only property """

    def __init__(self, fget=None, fset=None, fdel=None, doc=None):
        super().__init__(fget, None, None, doc)  # discard fset and fdel

    def __set__(self, obj, value):
        _is_readonly(obj, self.fget.__name__)

    def __delete__(self, obj):
        _is_readonly(obj, self.fget.__name__)


def _is_immutable(self):
    raise ImmutableTypeError('%r objects are immutable' % self.__class__.__name__)
class ImmutableDictMixin(object):
    """
    @link https://github.com/pallets/werkzeug/blob/master/werkzeug/datastructures.py
    Makes a :class:`dict` immutable.
    .. versionadded:: 0.5
    :private:
    """
    _hash_cache = None

    @classmethod
    def fromkeys(cls, keys, value=None):
        instance = super(cls, cls).__new__(cls)
        instance.__init__(zip(keys, repeat(value)))
        return instance

    def __reduce_ex__(self, protocol):
        return type(self), (dict(self),)

    def _iter_hashitems(self):
        return iter(self.items())

    def __hash__(self):
        if self._hash_cache is not None:
            return self._hash_cache
        rv = self._hash_cache = hash(frozenset(self._iter_hashitems()))
        return rv

    def setdefault(self, key, default=None):
        _is_immutable(self)

    def update(self, *args, **kwargs):
        _is_immutable(self)

    def pop(self, key, default=None):
        _is_immutable(self)

    def popitem(self):
        _is_immutable(self)

    def __setitem__(self, key, value):
        _is_immutable(self)

    def __delitem__(self, key):
        _is_immutable(self)

    def clear(self):
        _is_immutable(self)
class ImmutableDict(ImmutableDictMixin, dict):
    """
    @link https://github.com/pallets/werkzeug/blob/master/werkzeug/datastructures.py
    An immutable :class:`dict`.
    .. versionadded:: 0.5
    """

    def __repr__(self):
        return '%s(%s)' % (
            self.__class__.__name__,
            dict.__repr__(self),
        )

    def copy(self):
        """Return a shallow mutable copy of this object. Keep in mind that
        the standard library's :func:`copy` function is a no-op for this class
        like for any other python immutable type (eg: :class:`tuple`).
        """
        return dict(self)

    def __copy__(self):
        return self
class ImmutableAttrsMixin(object):
    """
    very ugly !!!
    An implementation of an immutable object.
    Attributes may only be assigned at creation time and cannot be modified afterwards:
    - modification via __setattr__/__delattr__ is blocked
    - modification via __dict__.__setitem__/__delitem__ is blocked
    Copying:
    - copy.copy returns the object itself
    - copy.deepcopy returns a new object
    """
    __cache_init = {}

    def __init__(self, *args, **kwargs):
        ### Restore the original __init__ ###
        # print("ImmutableAttrsMixin.__init__")
        self.__class__.__init__ = __class__.__cache_init[self.__class__.__name__]

    def __new__(cls, *args, **kwargs):
        # print("ImmutableAttrsMixin.__new__ Start")
        ### Temporarily allow modification ###
        cls.__setattr__ = object.__setattr__
        cls.__delattr__ = object.__delattr__
        ### Initialize the object ###
        obj = object.__new__(cls)
        cls.__init__(obj, *args, **kwargs)
        ### Forbid modifying the object through __dict__ ###
        obj.__dict__ = ImmutableDict(obj.__dict__)
        ### Disallow modification again and drop the __init__ function ###
        cls.__setattr__ = __class__.__setattr__
        cls.__delattr__ = __class__.__delattr__
        ### Temporarily replace __init__ to avoid errors during the automatic __init__ call ###
        __class__.__cache_init[cls.__name__] = cls.__init__
        cls.__init__ = __class__.__init__
        # print("ImmutableAttrsMixin.__new__ End")
        return obj

    def __setattr__(self, key, value):
        # print("ImmutableAttrsMixin.__setitem__")
        _is_immutable(self)

    def __delattr__(self, key):
        # print("ImmutableAttrsMixin.__delitem__")
        _is_immutable(self)

    def __copy__(self):
        return self

    def __deepcopy__(self, memo=None):
        """
        @link https://github.com/pallets/werkzeug/blob/master/werkzeug/datastructures.py --> MultiDict
        """
        return self.__class__(**deepcopy(self.__dict__, memo=memo))
| 27.802372 | 111 | 0.633068 | 819 | 7,034 | 4.929182 | 0.263736 | 0.032202 | 0.037156 | 0.03567 | 0.208571 | 0.150855 | 0.128561 | 0.114689 | 0.081001 | 0.081001 | 0 | 0.008985 | 0.240546 | 7,034 | 252 | 112 | 27.912698 | 0.746724 | 0.186807 | 0 | 0.209459 | 0 | 0 | 0.05153 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.263514 | false | 0 | 0.054054 | 0.047297 | 0.554054 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
b112b2802063ecfa7ce3db6c16ab4326c7eda2fb | 1,746 | py | Python | nsm.py | svepe/neural-stack | c48e6b94f00e77cedd9d692bdc2a6715bb007db5 | [
"MIT"
] | null | null | null | nsm.py | svepe/neural-stack | c48e6b94f00e77cedd9d692bdc2a6715bb007db5 | [
"MIT"
] | 1 | 2017-07-26T07:18:42.000Z | 2017-07-26T07:18:42.000Z | nsm.py | svepe/neural-stack | c48e6b94f00e77cedd9d692bdc2a6715bb007db5 | [
"MIT"
] | null | null | null | import numpy as np
import chainer.functions as F
from chainer import Variable
def neural_stack(V, s, d, u, v):
    # strengths
    s_new = d
    for t in reversed(range(s.shape[1])):
        x = s[:, t].reshape(-1, 1) - u
        s_new = F.concat((s_new, F.maximum(Variable(np.zeros_like(x.data)), x)))
        u = F.maximum(Variable(np.zeros_like(x.data)), -x)
    s = F.fliplr(s_new)

    # memory
    V = F.concat((V, F.expand_dims(v, 1)))

    # result
    r = Variable(np.zeros_like(v.data))
    ur = Variable(np.ones_like(u.data))
    for t in reversed(range(s_new.shape[1])):
        w = F.minimum(s[:, t].reshape(-1, 1), ur)
        r += V[:, t] * F.broadcast_to(w, V[:, t].shape)
        x = ur - s[:, t].reshape(-1, 1)
        ur = F.maximum(Variable(np.zeros_like(x.data)), x)
    return V, s, r


batch_size = 3
stack_element_size = 2

V = Variable(np.zeros((batch_size, 1, stack_element_size)))
s = Variable(np.zeros((batch_size, 1)))

d = Variable(np.ones((batch_size, 1)) * 0.4)
u = Variable(np.ones((batch_size, 1)) * 0.)
v = Variable(np.ones((batch_size, stack_element_size)))
V, s, r = neural_stack(V, s, d, u, v)

d = Variable(np.ones((batch_size, 1)) * 0.8)
u = Variable(np.ones((batch_size, 1)) * 0.)
v = Variable(np.ones((batch_size, stack_element_size)) * 2.)
V, s, r = neural_stack(V, s, d, u, v)

d = Variable(np.ones((batch_size, 1)) * 0.9)
u = Variable(np.ones((batch_size, 1)) * 0.9)
v = Variable(np.ones((batch_size, stack_element_size)) * 3.)
V, s, r = neural_stack(V, s, d, u, v)

d = Variable(np.ones((batch_size, 1)) * 0.1)
u = Variable(np.ones((batch_size, 1)) * 0.1)
v = Variable(np.ones((batch_size, stack_element_size)) * 3.)
V, s, r = neural_stack(V, s, d, u, v)

print(V.data)
print(s.data)
print(r.data)
| 29.59322 | 80 | 0.613402 | 326 | 1,746 | 3.150307 | 0.168712 | 0.185005 | 0.177215 | 0.222006 | 0.694255 | 0.665044 | 0.550146 | 0.534567 | 0.454722 | 0.358325 | 0 | 0.026912 | 0.191294 | 1,746 | 58 | 81 | 30.103448 | 0.700425 | 0.013173 | 0 | 0.190476 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.071429 | null | null | 0.071429 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b11a302f53a38192c5dd68e4767ae96d3e146ef3 | 301 | py | Python | run.py | Prakash2403/ultron | 7d1067eb98ef52f6a88299534ea204e7ae45d7a7 | [
"MIT"
] | 13 | 2017-08-15T15:50:13.000Z | 2019-06-03T10:24:50.000Z | run.py | Prakash2403/ultron | 7d1067eb98ef52f6a88299534ea204e7ae45d7a7 | [
"MIT"
] | 3 | 2017-08-29T16:35:04.000Z | 2021-06-01T23:49:16.000Z | run.py | Prakash2403/ultron | 7d1067eb98ef52f6a88299534ea204e7ae45d7a7 | [
"MIT"
] | 4 | 2017-08-16T09:33:59.000Z | 2019-06-05T07:25:30.000Z | #! /usr/bin/python3
from default_settings import default_settings
from ultron_cli import UltronCLI
if __name__ == '__main__':
    default_settings()
    try:
        UltronCLI().cmdloop()
    except KeyboardInterrupt:
        print("\nInterrupted by user.")
        print("Goodbye")
        exit(0)
| 23.153846 | 45 | 0.664452 | 32 | 301 | 5.875 | 0.75 | 0.239362 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008658 | 0.232558 | 301 | 12 | 46 | 25.083333 | 0.805195 | 0.063123 | 0 | 0 | 0 | 0 | 0.131673 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.2 | 0 | 0.2 | 0.2 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b11f6265d46fdca364a4dd3bf4dcf5a12d2f410f | 2,871 | py | Python | praetorian_ssh_proxy/hanlers/menu_handler.py | Praetorian-Defence/praetorian-ssh-proxy | 068141bf0cee9fcf10434fab2dc5c16cfdd35f5a | [
"MIT"
] | null | null | null | praetorian_ssh_proxy/hanlers/menu_handler.py | Praetorian-Defence/praetorian-ssh-proxy | 068141bf0cee9fcf10434fab2dc5c16cfdd35f5a | [
"MIT"
] | null | null | null | praetorian_ssh_proxy/hanlers/menu_handler.py | Praetorian-Defence/praetorian-ssh-proxy | 068141bf0cee9fcf10434fab2dc5c16cfdd35f5a | [
"MIT"
] | null | null | null | import sys
class MenuHandler(object):
    def __init__(self, client, client_channel, remote_checker):
        self._client = client
        self._client_channel = client_channel
        self._buffer = ''
        self._remote_checker = remote_checker

    @staticmethod
    def create_from_channel(client, client_channel, remote_checker) -> 'MenuHandler':
        return MenuHandler(client, client_channel, remote_checker)

    def serve_remote_menu(self):
        if not self._remote_checker.is_remote_set:
            self._client_channel.send('-------------------------------------------------\r\n')
            self._client_channel.send('| Welcome to Praetorian SSH proxy |\r\n')
            self._client_channel.send('-------------------------------------------------\r\n')
            for counter, remote in enumerate(self._remote_checker.remote, 1):
                self._client_channel.send('| ({:10}) {:30} {} |\r\n'.format(remote.project['name'], remote.name, counter))
            self._client_channel.send('-------------------------------------------------\r\n')
            self._client_channel.send('| {:43} {} |\r\n'.format('exit', len(self._remote_checker.remote) + 1))
            self._client_channel.send('-------------------------------------------------\r\n')
            self._client_channel.send('Choose your remote: ')

            while True:
                data = self._client_channel.recv(1024)
                if not data:
                    continue
                # BACKSPACE
                if data == b'\x7f':
                    self._buffer = self._buffer[:-1]
                    self._client_channel.send('\b \b')
                # EXIT (CTRL+C)
                elif data == b'\x03':
                    self._client_channel.send(f'\n\rExiting ...\r\n')
                    self._client.close()
                    sys.exit(0)
                # ENTER
                elif data == b'\r':
                    if self._buffer in map(str, range(1, len(self._remote_checker.remote) + 1)):
                        self._remote_checker.set_remote(self._remote_checker.remote[int(self._buffer) - 1])
                        self._client_channel.send(f'\n\rChosen remote {self._remote_checker.remote.name}.\r\n')
                        return
                    elif self._buffer == str(len(self._remote_checker.remote) + 1):
                        self._client_channel.send(f'\n\rExiting ...\r\n')
                        self._client.close()
                        sys.exit(0)
                    else:
                        self._client_channel.send('\n\rWrong option.\r\n')
                        self._client_channel.send('Choose your option: ')
                    self._buffer = ''
                else:
                    self._client_channel.send(data)
                    self._buffer += data.decode()
| 45.571429 | 122 | 0.492163 | 293 | 2,871 | 4.542662 | 0.249147 | 0.157776 | 0.21713 | 0.236664 | 0.534936 | 0.381668 | 0.362885 | 0.290008 | 0.257701 | 0.223892 | 0 | 0.011381 | 0.326715 | 2,871 | 62 | 123 | 46.306452 | 0.677186 | 0.010101 | 0 | 0.291667 | 0 | 0 | 0.174419 | 0.088443 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0 | 0.020833 | 0.020833 | 0.145833 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b123989fc301ccc896657660002120b9f5336e64 | 6,451 | py | Python | xenavalkyrie/xena_object.py | xenadevel/PyXenaValkyrie | 9bb1d0b058c45dc94a778fd674a679b53f03a34c | [
"Apache-2.0"
] | 4 | 2018-07-13T08:09:38.000Z | 2022-02-09T01:36:13.000Z | xenavalkyrie/xena_object.py | xenadevel/PyXenaValkyrie | 9bb1d0b058c45dc94a778fd674a679b53f03a34c | [
"Apache-2.0"
] | 1 | 2019-07-31T04:56:43.000Z | 2019-08-01T07:11:21.000Z | xenavalkyrie/xena_object.py | xenadevel/PyXenaValkyrie | 9bb1d0b058c45dc94a778fd674a679b53f03a34c | [
"Apache-2.0"
] | 3 | 2019-05-30T23:47:02.000Z | 2022-02-04T12:32:14.000Z | """
Base classes and utilities for all Xena Manager (Xena) objects.
:author: yoram@ignissoft.com
"""
import time
import re
import logging
from collections import OrderedDict
from trafficgenerator.tgn_utils import TgnError
from trafficgenerator.tgn_object import TgnObject, TgnObjectsDict
logger = logging.getLogger(__name__)
class XenaAttributeError(TgnError):
    pass
class XenaObjectsDict(TgnObjectsDict):

    def __getitem__(self, key):
        """ Override default implementation and allow access with index as well. """
        if TgnObjectsDict.__getitem__(self, key) is not None:
            return TgnObjectsDict.__getitem__(self, key)
        else:
            for obj in self:
                if obj.index == key:
                    return OrderedDict.__getitem__(self, obj)
class XenaObject(TgnObject):
""" Base class for all Xena objects. """
def __init__(self, **data):
if data['parent']:
self.session = data['parent'].session
self.chassis = data['parent'].chassis
if 'objRef' not in data:
data['objRef'] = '{}/{}/{}'.format(data['parent'].ref, data['objType'], data['index'].split('/')[-1])
if 'name' not in data:
data['name'] = data['index']
super(XenaObject, self).__init__(**data)
def obj_index(self):
"""
:return: object index.
"""
return str(self._data['index'])
index = property(obj_index)
def obj_id(self):
"""
:return: object ID.
"""
return int(self.index.split('/')[-1]) if self.index else None
id = property(obj_id)
def _create(self):
self.api.create(self)
def reserve(self, force=False):
""" Reserve object.
XenaManager-2G -> [Relinquish]/Reserve Chassis/Module/Port.
:param force: True - take forcefully, False - fail if port is reserved by other user
"""
reservation = self.get_attribute(self.cli_prefix + '_reservation')
if reservation == 'RESERVED_BY_YOU':
return
elif reservation == 'RESERVED_BY_OTHER' and not force:
reservedby = self.get_attribute(self.cli_prefix + '_reservedby')
raise TgnError('Resource {} reserved by {}'.format(self, reservedby))
self.relinquish()
self.send_command(self.cli_prefix + '_reservation', 'reserve')
def relinquish(self):
""" Relinquish object.
XenaManager-2G -> Relinquish Chassis/Module/Port.
"""
if self.get_attribute(self.cli_prefix + '_reservation') != 'RELEASED':
self.send_command(self.cli_prefix + '_reservation relinquish')
def release(self):
""" Release object.
XenaManager-2G -> Release Chassis/Module/Port.
"""
if self.get_attribute(self.cli_prefix + '_reservation') == 'RESERVED_BY_YOU':
self.send_command(self.cli_prefix + '_reservation release')
def send_command(self, command, *arguments):
""" Send command with no output.
:param command: command to send.
:param arguments: list of command arguments.
"""
self.api.send_command(self, command, *arguments)
def send_command_return(self, command, *arguments):
""" Send command and wait for single line output. """
return self.api.send_command_return(self, command, *arguments)
def send_command_return_multilines(self, command, *arguments):
""" Send command and wait for multiple lines output. """
return self.api.send_command_return_multilines(self, command, *arguments)
def set_attributes(self, **attributes):
""" Sets list of attributes.
:param attributes: dictionary of {attribute: value} to set.
"""
try:
self.api.set_attributes(self, **attributes)
except Exception as e:
if '<notwritable>' in repr(e).lower() or '<badvalue>' in repr(e).lower():
raise XenaAttributeError(e)
else:
raise e
def get_attribute(self, attribute):
""" Returns single object attribute.
:param attribute: requested attribute to query.
:returns: returned value.
:rtype: str
"""
try:
return self.api.get_attribute(self, attribute)
except Exception as e:
if '#syntax error' in repr(e).lower() or 'keyerror' in repr(e).lower():
raise XenaAttributeError(e)
else:
raise e
def get_attributes(self):
""" Returns all object's attributes.
:returns: dictionary of <name, value> of all attributes.
:rtype: dict of (str, str)
"""
return self.api.get_attributes(self)
def wait_for_states(self, attribute, timeout=40, *states):
for _ in range(timeout):
if self.get_attribute(attribute).lower() in [s.lower() for s in states]:
return
time.sleep(1)
raise TgnError('{} failed to reach state {}, state is {} after {} seconds'.
format(attribute, states, self.get_attribute(attribute), timeout))
def read_stat(self, captions, stat_name):
return dict(zip(captions, self.api.get_stats(self, stat_name)))
#
# Private methods.
#
def _build_index_command(self, command, *arguments):
return ('{} {}' + len(arguments) * ' {}').format(self.index, command, *arguments)
def _extract_return(self, command, index_command_value):
        return re.sub(r'{}\s*{}\s*'.format(self.index, command.upper()), '', index_command_value)
def _get_index_len(self):
return len(self.index.split())
def _get_command_len(self):
return len(self.index.split())
class XenaObject21(XenaObject):
""" Base class for all Xena objects with index_len = 2 and command_len = 1. """
#
# Private methods.
#
def _build_index_command(self, command, *arguments):
module, port, sid = self.index.split('/')
return ('{}/{} {} [{}]' + len(arguments) * ' {}').format(module, port, command, sid, *arguments)
def _extract_return(self, command, index_command_value):
module, port, sid = self.index.split('/')
        return re.sub(r'{}/{}\s*{}\s*\[{}\]\s*'.format(module, port, command.upper(), sid), '', index_command_value)
def _get_index_len(self):
return 2
def _get_command_len(self):
return 1
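The polling loop in `wait_for_states` above is a general retry-until-state pattern. A self-contained sketch of the same idea (the `get_state` callable is a hypothetical stand-in, not part of the Xena API):

```python
import time


def wait_for_state(get_state, wanted, timeout=5, poll_interval=0.01):
    # Poll until get_state() returns one of the wanted values (case-insensitive),
    # mirroring wait_for_states; raise if the timeout elapses first.
    wanted = [w.lower() for w in wanted]
    for _ in range(int(timeout / poll_interval)):
        state = get_state()
        if state.lower() in wanted:
            return state
        time.sleep(poll_interval)
    raise TimeoutError('never reached any of {}'.format(wanted))


states = iter(["down", "down", "up"])
print(wait_for_state(lambda: next(states), ["UP"]))  # prints: up
```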


# ---- file: app/models/news_article_test.py (repo: engineer237/News-application, license: MIT) ----

import unittest  # module for testing
from models import news_article
class TestNewsArticles(unittest.TestCase):
'''
TestNewsArticles for testing class news_articles
'''
def setUp(self):
'''
Set up method to run before each test cases.
'''
        # id, title, poster, url_link, description, published_date, content
self.new_article = news_article.News_Article(
'123',
'BBc News',
"https://www.independent.co.uk/news/world/americas/miss-usa-2019-cheslie-kryst-death-b2003891.html",
"A woman who jumped to her death from a New York City high rise apartment building has been identified as former Miss USA Cheslie Kryst.\r\nThe 2019 pageant winner who had a ninth floor apartment in Man… [+2506 chars]",
"2022-01-31T05:07:00Z",
"With a calm that belied their underdog status, the Bengals intercepted Patrick Mahomes and completed a field-goal drive in overtime to end Kansas City’s streak of Super Bowl appearances.")
if __name__ == "__main__":
    unittest.main()


# ---- file: tests/test_utility.py (repo: ericbdaniels/pygeostat, license: MIT) ----

#!/usr/bin/env python
# -*- coding: utf-8 -*-
from __future__ import absolute_import, division, print_function
__author__ = 'pygeostat development team'
__date__ = '2020-01-04'
__version__ = '1.0.0'
import os, sys
try:
import pygeostat as gs
except ImportError:
    sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), "..")))
import pygeostat as gs
import unittest
import warnings
import subprocess
class GetExecutableTest(unittest.TestCase):
'''
Test suite for loading executable files from a protected repository
'''
def setUp(self):
self.wrong_token = 'wrong_token'
def test_token_error(self):
'''
Test if the proper error handleing is in place for wrong access token
'''
with self.assertRaises(Exception):
gs.get_executable(access_token=self.wrong_token)
if __name__ == '__main__':
subprocess.call([sys.executable, '-m', 'unittest', str(__file__), '-v']) | 22.204545 | 86 | 0.684749 | 125 | 977 | 5.024 | 0.624 | 0.028662 | 0.05414 | 0.06051 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015345 | 0.199591 | 977 | 44 | 87 | 22.204545 | 0.787724 | 0.184237 | 0 | 0.095238 | 0 | 0 | 0.097625 | 0 | 0 | 0 | 0 | 0 | 0.047619 | 1 | 0.095238 | false | 0 | 0.380952 | 0 | 0.52381 | 0.047619 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
b14a3e4e999395aab5aac5de3e1df984c03e66f4 | 690 | py | Python | casepro/translation.py | praekelt/helpdesk | 69a7242679c30d2f7cb30a433809e738b9756a3c | [
"BSD-3-Clause"
] | 5 | 2015-07-21T15:58:31.000Z | 2019-09-14T22:34:00.000Z | casepro/translation.py | praekelt/helpdesk | 69a7242679c30d2f7cb30a433809e738b9756a3c | [
"BSD-3-Clause"
] | 197 | 2015-03-24T15:26:04.000Z | 2017-11-28T19:24:37.000Z | casepro/translation.py | praekelt/helpdesk | 69a7242679c30d2f7cb30a433809e738b9756a3c | [
"BSD-3-Clause"
] | 10 | 2015-03-24T12:26:36.000Z | 2017-02-21T13:08:57.000Z | from __future__ import unicode_literals
from django.utils.translation import ugettext as _
from django.utils.translation import get_language as _get_language
from modeltranslation.translator import translator, TranslationOptions
from modeltranslation import utils
from nsms.text.models import Text
class TextTranslationOptions(TranslationOptions):
fields = ('text',)
translator.register(Text, TextTranslationOptions)
# need to translate something for django translations to kick in
_("Something to trigger localizations")
# monkey patch a version of get_language that isn't broken
def get_language():
lang = _get_language()
return lang
utils.get_language = get_language
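Rebinding `utils.get_language` like this is plain attribute assignment on a module object. A self-contained illustration of the same monkey-patching move, using a throwaway stand-in module rather than modeltranslation itself:

```python
import types

# Stand-in for a third-party module with a broken function.
fake_utils = types.ModuleType("fake_utils")
fake_utils.get_language = lambda: None

def patched_get_language():
    return "en"

# The patch: later callers of fake_utils.get_language get the fixed version.
fake_utils.get_language = patched_get_language
print(fake_utils.get_language())  # prints: en
```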

# ---- file: pytools/mpiwrap.py (repo: nchristensen/pytools, license: MIT) ----

"""See pytools.prefork for this module's reason for being."""
import mpi4py.rc # pylint:disable=import-error
mpi4py.rc.initialize = False
from mpi4py.MPI import * # noqa pylint:disable=wildcard-import,wrong-import-position
import pytools.prefork # pylint:disable=wrong-import-position
pytools.prefork.enable_prefork()
if Is_initialized(): # noqa pylint:disable=undefined-variable
raise RuntimeError("MPI already initialized before MPI wrapper import")
def InitWithAutoFinalize(*args, **kwargs): # noqa
result = Init(*args, **kwargs) # noqa pylint:disable=undefined-variable
import atexit
atexit.register(Finalize) # noqa pylint:disable=undefined-variable
return result
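The `Init`-then-`atexit.register(Finalize)` pattern above generalizes to any resource that must be torn down at interpreter exit. A minimal sketch with no MPI involved (the resource names here are purely illustrative):

```python
import atexit

_released = []

def acquire_resource(name):
    # Register the matching cleanup as soon as the resource exists,
    # mirroring how InitWithAutoFinalize registers Finalize right after Init.
    atexit.register(_released.append, name)
    return name

handle = acquire_resource("demo")
print(handle)  # prints: demo
```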

# ---- file: make/requirements.py (repo: Fizzadar/Kanmail, license: OpenSSL) ----

from distutils.spawn import find_executable
from os import path
import click
from .settings import (
BASE_DEVELOPMENT_REQUIREMENTS_FILENAME,
BASE_REQUIREMENTS_FILENAME,
DEVELOPMENT_REQUIREMENTS_FILENAME,
REQUIREMENTS_FILENAME,
)
from .util import print_and_run
def _ensure_pip_tools_installed():
if not find_executable('pip-sync'):
click.echo('Installing pip-tools')
print_and_run(('pip', 'install', 'pip-tools'))
@click.group()
def cli():
pass
@cli.command()
@click.option('--dev', is_flag=True, default=False)
def install(dev):
_ensure_pip_tools_installed()
requirements_file = (
DEVELOPMENT_REQUIREMENTS_FILENAME
if dev
else REQUIREMENTS_FILENAME
)
print_and_run(('pip-sync', requirements_file))
click.echo('Requirements setup complete!')
@cli.command()
def update():
_ensure_pip_tools_installed()
print_and_run((
'pip-compile',
'-q',
f'--output-file={path.relpath(REQUIREMENTS_FILENAME)}',
path.relpath(BASE_REQUIREMENTS_FILENAME),
))
    click.echo(f'Requirements file updated: {path.relpath(REQUIREMENTS_FILENAME)}')
@cli.command()
def update_dev():
_ensure_pip_tools_installed()
print_and_run((
'pip-compile',
'-q',
f'--output-file={path.relpath(DEVELOPMENT_REQUIREMENTS_FILENAME)}',
path.relpath(BASE_DEVELOPMENT_REQUIREMENTS_FILENAME),
))
click.echo(f'Development requirements file updated: {DEVELOPMENT_REQUIREMENTS_FILENAME}')
if __name__ == '__main__':
cli()

# ---- file: integration/python/src/helper/hosts.py (repo: ArpitShukla007/planetmint, license: Apache-2.0) ----

# Copyright © 2020 Interplanetary Database Association e.V.,
# Planetmint and IPDB software contributors.
# SPDX-License-Identifier: (Apache-2.0 AND CC-BY-4.0)
# Code is Apache-2.0 and docs are CC-BY-4.0
from typing import List
from planetmint_driver import Planetmint
class Hosts:
hostnames = []
connections = []
def __init__(self, filepath):
self.set_hostnames(filepath=filepath)
self.set_connections()
    def set_hostnames(self, filepath) -> None:
        with open(filepath) as f:
            # Strip trailing newlines so each entry is a clean hostname/URL.
            self.hostnames = [line.strip() for line in f if line.strip()]
def set_connections(self) -> None:
self.connections = list(map(lambda h: Planetmint(h), self.hostnames))
def get_connection(self, index=0) -> Planetmint:
return self.connections[index]
def get_transactions(self, tx_id) -> List:
return list(map(lambda connection: connection.transactions.retrieve(tx_id), self.connections))
def assert_transaction(self, tx_id) -> None:
txs = self.get_transactions(tx_id)
for tx in txs:
assert txs[0] == tx, \
'Cannot find transaction {}'.format(tx_id)
| 30.675676 | 102 | 0.667841 | 148 | 1,135 | 5.006757 | 0.439189 | 0.026991 | 0.021592 | 0.02969 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015891 | 0.223789 | 1,135 | 36 | 103 | 31.527778 | 0.824064 | 0.171806 | 0 | 0 | 0 | 0 | 0.027807 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 1 | 0.272727 | false | 0 | 0.090909 | 0.090909 | 0.590909 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
b15405b5c4a9b35dd5bdc84b62d31229a91e7265 | 17,228 | py | Python | example_snippets.py | kimberscott/ffmpeg-stimuli-generation | 54bce134a3236d9e7d2fefe4538378d76f2db798 | [
"MIT"
] | null | null | null | example_snippets.py | kimberscott/ffmpeg-stimuli-generation | 54bce134a3236d9e7d2fefe4538378d76f2db798 | [
"MIT"
] | null | null | null | example_snippets.py | kimberscott/ffmpeg-stimuli-generation | 54bce134a3236d9e7d2fefe4538378d76f2db798 | [
"MIT"
] | 1 | 2020-08-14T17:15:29.000Z | 2020-08-14T17:15:29.000Z | """
Examples of using the functions in videotools.py to generate videos.
This file will not run as-is - it is just intended to provide reference commands you might copy and edit.
"""
import os
import warnings  # used by cropVideos below

from videotools import *
this_path = os.path.dirname(os.path.abspath(__file__))
input_path = os.path.join(this_path, "example_input")
output_path = os.path.join(this_path, "example_output")
# Put two videos side-by-side
makeSideBySide(os.path.join(input_path, "cropped_book.mp4"), os.path.join(input_path, "cropped_box.mp4"), "right", os.path.join(output_path, "side_by_side.mp4"))
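For context, a hypothetical sketch of the kind of ffmpeg argv a side-by-side helper typically builds (the real logic lives in `videotools.makeSideBySide`; `hstack` is the ffmpeg filter that joins two inputs horizontally):

```python
def side_by_side_argv(left, right, out):
    # hstack=inputs=2 places the second input to the right of the first.
    return [
        "ffmpeg",
        "-i", left,
        "-i", right,
        "-filter_complex", "hstack=inputs=2",
        out,
    ]

print(" ".join(side_by_side_argv("cropped_book.mp4", "cropped_box.mp4", "side_by_side.mp4")))
```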
# Make a collage of the object-introduction videos
vids = [
"apple",
"cup",
"lotion",
"spray",
"whiteball",
"orangeball",
"train",
"toycar",
"sunglasses",
"marker",
"flashlight",
"block",
]
vids = ["cropped_" + v + ".mp4" for v in vids]
make_collage(input_path, vids, 4, os.path.join(output_path, "0_introsA"), True, 1920, vidHeight=640)
# Replace the audio in VIDEO_1 with a different mp3 file NEW_AUDIO
sp.call([
"ffmpeg",
"-i",
VIDEO_1,
"-i",
NEW_AUDIO,
"-map",
"0:v",
"-map",
"1:a",
"-shortest",
OUTPUT_VIDEO_NAME,
])
# Make a video where the input video plays backwards then forwards
sp.call(
[
"ffmpeg",
"-i",
INPUT_VIDEO,
"-i",
INPUT_VIDEO,
"-filter_complex",
"[1:v]reverse[secondhalf];[0:v][secondhalf]concat[out]",
"-map",
"""[out]""",
"-loglevel",
"error",
OUTPUT_VIDEO,
]
)
# The following are included for reference about potentially useful ffmpeg commands only - they are very specialized
# for particular stimuli!
def combineVideos(croppedVideoDir, sidebysideDir, regularOrderDict, whichVersions, minimal=False):
'''Generate all versions of side-by-side videos needed for Lookit physics study.
i.e. A / B, flippedA / B, A / flippedB, flippedA / flippedB.'''
make_sure_path_exists(sidebysideDir)
commands = ["""[0:v]setpts=PTS-STARTPTS,pad=iw*3:ih:color=white[a];[1:v]setpts=PTS-STARTPTS[z];[a][z]overlay=x=2*w:repeatlast=1:shortest=1:eof_action=repeat[out]""", \
"""[0:v]setpts=PTS-STARTPTS,hflip,pad=iw*3:ih:color=white[b];[1:v]setpts=PTS-STARTPTS[z];[b][z]overlay=x=2*w:repeatlast=1:shortest=1:eof_action=repeat[out]""", \
"""[0:v]setpts=PTS-STARTPTS,pad=iw*3:ih:color=white[b];[1:v]setpts=PTS-STARTPTS[z];[z]hflip[c];[b][c]overlay=x=2*w:repeatlast=1:shortest=1:eof_action=repeat[out]""", \
"""[0:v]setpts=PTS-STARTPTS,hflip,pad=iw*3:ih:color=white[b];[1:v]setpts=PTS-STARTPTS[z];[z]hflip[c];[b][c]overlay=x=2*w:repeatlast=1:shortest=1:eof_action=repeat[out]"""]
suffixes = ['NN', 'RN', 'NR', 'RR']
allfiles = os.listdir(croppedVideoDir)
for iVid1, video1 in enumerate(allfiles):
(shortname1, ext1) = os.path.splitext(video1)
if not(os.path.isdir(os.path.join(croppedVideoDir, video1))) and ext1 == VIDEXT:
for iVid2 in range(len(allfiles)):
if iVid2 == iVid1:
continue
if minimal and iVid2 <= iVid1:
continue
else:
video2 = allfiles[iVid2]
(shortname2, ext2) = os.path.splitext(video2)
if not(os.path.isdir(os.path.join(croppedVideoDir, video2))) and ext2 == VIDEXT:
labels = [parse_video_filename(v, regularOrderDict) for v in [video1, video2]]
if labels[0][0] == labels[1][0] and \
labels[0][2] == labels[1][2] and \
labels[0][3] == labels[1][3] and \
labels[0][4] == labels[1][4]:
outfilenameBase = 'sbs_' + labels[0][0] + '_' + labels[0][1] + '_' + labels[1][1] + '_' + \
labels[0][2] + '_' + labels[0][3] + '_' + labels[0][4] + '_'
for iVid in range(len(commands)):
if suffixes[iVid] in whichVersions:
sp.call(["ffmpeg", "-i", os.path.join(croppedVideoDir, video1), \
"-i", os.path.join(croppedVideoDir, video2), \
"-filter_complex", \
commands[iVid], \
"-map", """[out]""", "-loglevel", "error", \
os.path.join(sidebysideDir, outfilenameBase + suffixes[iVid] + '.mp4')])
def flipVideos(rawVideoDir, origVideoDir, unflippedOrderDict):
vidLengths = {
"apple": 66,
"cup": 86,
"lotion": 68,
"orangeball": 76,
"spray": 90,
"whiteball": 64,
}
fadeFrames = 10
make_sure_path_exists(origVideoDir)
for f in os.listdir(rawVideoDir):
if not (os.path.isdir(os.path.join(rawVideoDir, f))):
(shortname, ext) = os.path.splitext(f)
if ext in ORIGEXT:
(event, outcome, object, camera,
background) = parse_video_filename(
shortname, unflippedOrderDict
)
if outcome == "up":
continue
sp.call(
[
"ffmpeg",
"-i",
os.path.join(rawVideoDir, f),
"-vf",
"""vflip,fade=type=in:start_frame=1:nb_frames={}:color=0x009EFC,fade=type=out:start_frame={}:color=0x009EFC""".format(
fadeFrames, vidLengths[object] - fadeFrames
),
"-loglevel",
"error",
os.path.join(
origVideoDir,
event
+ "_up_"
+ object
+ "_"
+ background
+ "_"
+ camera
+ ".mp4",
),
]
)
sp.call(
[
"ffmpeg",
"-i",
os.path.join(rawVideoDir, f),
"-vf",
"""fade=type=in:start_frame=1:nb_frames={}:color=0x009EFC,fade=type=out:start_frame={}:color=0x009EFC""".format(
fadeFrames, vidLengths[object] - fadeFrames
),
"-loglevel",
"error",
os.path.join(
origVideoDir,
event
+ "_down_"
+ object
+ "_"
+ background
+ "_"
+ camera
+ ".mp4",
),
]
)
return 0
### Crops and rescales 640px wide.
def cropVideos(
origVideoDir,
croppedVideoDir,
regularOrderDict,
originalSizes=[],
cropStrings=[],
which=[],
cropByName=[],
timecrop=[],
fadeParams=[],
doCrossFade=False,
):
"""TODO: docstring
timecrop: list of (ID, start, stop, padStart, padStop) tuples.
ID: dict containing any keys in ['object', 'event', 'outcome', 'camera', 'background'] and values.
This time cropping will be applied to any videos that match the values for all
the specified keys.
start, stop: start and stop times in s.
padStart, padStop: amount of time to extend first and last frames by, in s.
fadeParams: (fadeFrames, fadeColor)
"""
make_sure_path_exists(croppedVideoDir)
for f in os.listdir(origVideoDir):
if not (os.path.isdir(os.path.join(origVideoDir, f))):
(shortname, ext) = os.path.splitext(f)
if ext in ORIGEXT:
if regularOrderDict:
(event, outcome, object, camera, background) = parse_video_filename(
shortname, regularOrderDict
)
thisID = {
"event": event,
"outcome": outcome,
"object": object,
"camera": camera,
"background": background,
}
if len(which) == 2 and not (object, event) == which:
continue
if len(which) == 3 and not (object, event, outcome) == which:
continue
timecropCommand = []
doTimeCrop = False
if timecrop:
for (ID, s, e, pS, pE) in timecrop:
if all([thisID[key] == val for (key, val) in ID.items()]):
startTime = s
endTime = e
padStart = pS
padEnd = pE
doTimeCrop = True
if doTimeCrop:
if not startTime == -1:
timecropCommand = ["-ss", str(startTime)]
if not endTime == -1:
timecropCommand = timecropCommand + [
"-t",
str(endTime - startTime),
]
else:
warnings.warn("No time cropping for this video")
if cropByName:
for (vidNames, cropStrForNames) in cropByName:
if f in vidNames:
cropStr = cropStrForNames
else:
if originalSizes == "*":
cropStr = cropStrings[0]
else:
res = findVideoResolution(os.path.join(origVideoDir, f))
if res in originalSizes:
cropStr = cropStrings[originalSizes.index(res)]
else:
cropStr = """scale=640:-2"""
cropStr = cropStr + ",setpts=PTS-STARTPTS"
if doTimeCrop:
croppedVid = os.path.join(
croppedVideoDir, shortname + "_middle.mp4"
)
croppedVidFinal = os.path.join(croppedVideoDir, shortname + ".mp4")
else:
croppedVid = os.path.join(croppedVideoDir, shortname + ".mp4")
croppedVidFinal = croppedVid
command = (
["ffmpeg", "-i", os.path.join(origVideoDir, f), "-vf", cropStr]
+ timecropCommand
+ ["-loglevel", "error", croppedVid]
)
sp.call(command)
if doTimeCrop:
firstImg = os.path.join(croppedVideoDir, shortname + "_first.png")
lastImg = os.path.join(croppedVideoDir, shortname + "_last.png")
firstVid = os.path.join(croppedVideoDir, shortname + "_first.mp4")
lastVid = os.path.join(croppedVideoDir, shortname + "_last.mp4")
sp.call(
[
"ffmpeg",
"-i",
croppedVid,
"-vframes",
"1",
"-f",
"image2",
firstImg,
"-loglevel",
"error",
]
)
[nF, dur, x, y] = get_video_details(
croppedVid, ["nframes", "vidduration", "width", "height"]
)
sp.call(
[
"ffmpeg",
"-i",
croppedVid,
"-vf",
"select='eq(n,{})'".format(nF - 1),
"-vframes",
"1",
"-f",
"image2",
lastImg,
"-loglevel",
"error",
]
)
sp.call(
[
"ffmpeg",
"-loop",
"1",
"-i",
firstImg,
"-t",
str(padStart),
firstVid,
"-loglevel",
"error",
]
)
sp.call(
[
"ffmpeg",
"-loop",
"1",
"-i",
lastImg,
"-t",
str(padEnd),
lastVid,
"-loglevel",
"error",
]
)
if not doCrossFade:
concat_mp4s(croppedVidFinal, [firstVid, croppedVid, lastVid])
else:
unfaded = os.path.join(
croppedVideoDir, shortname + "_beforecrossfade.mp4"
)
concat_mp4s(unfaded, [croppedVid, lastVid])
# see crossfade advice at http://superuser.com/a/778967
sp.call(
[
"ffmpeg",
"-i",
unfaded,
"-i",
firstVid,
"-f",
"lavfi",
"-i",
"color=white:s={}x{}".format(int(x), int(y)),
"-filter_complex",
"[0:v]format=pix_fmts=yuva420p,fade=t=out:st={}:d={}:alpha=1,setpts=PTS-STARTPTS[va0];\
[1:v]format=pix_fmts=yuva420p,fade=t=in:st=0:d={}:alpha=1,setpts=PTS-STARTPTS+{}/TB[va1];\
[2:v]scale={}x{},trim=duration={}[over];\
[over][va0]overlay=format=yuv420[over1];\
[over1][va1]overlay=format=yuv420[outv]".format(
dur + padEnd,
padEnd,
padEnd,
dur,
int(x),
int(y),
dur + padStart + padEnd,
),
"-vcodec",
"libx264",
"-map",
"[outv]",
croppedVidFinal,
"-loglevel",
"error",
]
)
os.remove(unfaded)
os.remove(firstImg)
os.remove(lastImg)
os.remove(firstVid)
os.remove(lastVid)
os.remove(croppedVid)
if fadeParams:
(fadeFrames, fadeColor) = fadeParams
nF = get_video_details(croppedVidFinal, "nframes")
unfaded = os.path.join(croppedVideoDir, shortname + "_unfaded.mp4")
os.rename(croppedVidFinal, unfaded)
sp.call(
[
"ffmpeg",
"-i",
unfaded,
"-vf",
"""fade=type=in:start_frame=1:nb_frames={}:color={},fade=type=out:start_frame={}:color={}""".format(
fadeFrames, fadeColor, nF - fadeFrames, fadeColor
),
"-loglevel",
"error",
croppedVidFinal,
]
)
                    os.remove(unfaded)


# ---- file: logbook/auth.py (repo: nicola-zanardi/personal-logbook, license: Unlicense) ----

from flask import Blueprint, flash, redirect, render_template, request, url_for
from werkzeug.security import check_password_hash, generate_password_hash
from flask_login import login_required, login_user, logout_user
from logbook.models import User, db
from peewee import fn
auth = Blueprint("auth", __name__)
@auth.route("/login")
def login():
return render_template("login.html")
@auth.route("/login", methods=["POST"])
def login_post():
username = request.form.get("username")
password = request.form.get("password")
remember = True if request.form.get("remember") else False
user = User.get_or_none(fn.Lower(User.username) == username.lower())
# inform the user if the username/password is wrong
if user is None or not check_password_hash(user.password, password):
flash("Please check your login details and try again.")
return redirect(url_for("logbook.index_next_pages"))
login_user(user, remember=remember)
return redirect(url_for("logbook.index_next_pages"))
@auth.route("/signup")
def signup():
return render_template("signup.html")
@auth.route("/signup", methods=["POST"])
def signup_post():
username = request.form.get("username").lower()
password = request.form.get("password")
user = User.get_or_none(
fn.Lower(User.username) == username
) # if this returns a user, then the email already exists in database
if user is not None:
flash("Username already exists")
return redirect(url_for("auth.signup"))
# create new user with the form data. Hash the password so plaintext version isn't saved.
new_user = User(
username=username, password=generate_password_hash(password, method="sha256")
)
new_user.save()
return redirect(url_for("auth.login"))
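The hash/verify round trip that werkzeug's `generate_password_hash`/`check_password_hash` provide can be sketched with the standard library alone (an illustrative stand-in, not the werkzeug implementation):

```python
import hashlib
import hmac
import os

def hash_password(password: str, iterations: int = 100_000) -> tuple:
    # Salted PBKDF2: the same idea as werkzeug's default pbkdf2:sha256 scheme.
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes, iterations: int = 100_000) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    # Constant-time comparison avoids leaking where the digests diverge.
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("s3cret")
print(verify_password("s3cret", salt, digest))  # prints: True
print(verify_password("wrong", salt, digest))   # prints: False
```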
@auth.route("/logout", methods=["POST"])
@login_required
def logout():
logout_user()
return redirect(url_for("auth.login"))

# ---- file: Medium/valid-ip-addresses.py (repo: SaumyaRai2010/algoexpert-data-structures-algorithms, license: MIT) ----

# O(1) time and space
def validIPAddresses(string):
# Write your code here.
validIPAddresses = []
if len(string) < 4:
return []
for i in range(3):
if not isValidPart(string[:i+1]):
continue
for j in range(i+1, i+4):
if not isValidPart(string[i+1:j+1]):
continue
for k in range(j+1, j+4):
if not isValidPart(string[j+1:k+1]) or not isValidPart(string[k+1:]):
continue
validIP = string[:i+1] + "." + string[i+1:j+1] + "." + string[j+1:k+1] + "." + string[k+1:]
validIPAddresses.append(validIP)
return validIPAddresses
def isValidPart(string):
if len(string) == 1:
return True
if not 0 < len(string) < 4 or string[0] == "0":
return False
return 0 <= int(string) <= 255
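A brute-force cross-check of the same result, using `itertools.combinations` to place the three dots (a sketch for testing, not part of the original solution; it assumes a digits-only input, like the function above):

```python
from itertools import combinations

def brute_force_ips(string):
    results = []
    # Choose the three dot positions; every slice must be a valid octet:
    # "0" itself, or a number up to 255 with no leading zero.
    for i, j, k in combinations(range(1, len(string)), 3):
        parts = [string[:i], string[i:j], string[j:k], string[k:]]
        if all(p == "0" or (not p.startswith("0") and int(p) <= 255) for p in parts):
            results.append(".".join(parts))
    return results

print(brute_force_ips("255255255255"))  # prints: ['255.255.255.255']
```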

# ---- file: passive_capture/reporter/admin.py (repo: Sillson/passive_capture_py, license: MIT) ----

from django.contrib import admin
from django.contrib.gis import admin as geo_model_admin
from leaflet.admin import LeafletGeoAdmin
from .models import Forecasts, Dam, Species
# Forecast Model
class ForecastsAdmin(admin.ModelAdmin):
list_display = ('dam', 'species', 'forecast_range')
admin.site.register(Forecasts, ForecastsAdmin)
# Species Model
class SpeciesAdmin(admin.ModelAdmin):
list_display = ('name', 'reference_name')
admin.site.register(Species, SpeciesAdmin)
# Dam Model - requires GeoAdmin privelages
class DamAdmin(LeafletGeoAdmin):
list_display = ('name', 'abbr', 'location')
admin.site.register(Dam, DamAdmin)

# ---- file: 5_Pham_Ngo_Tien_Dung/3.1.py (repo: lpython2006e/exercies, license: MIT) ----

"""Write a program that allows the user to enter a file name (path) and content, then saves it."""
filename = input("Please input filename")
f = open(filename, "w+")
content = input("Please input content")
f.write(content)
f.close() | 37.666667 | 99 | 0.712389 | 35 | 226 | 4.6 | 0.6 | 0.111801 | 0.198758 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.159292 | 226 | 6 | 100 | 37.666667 | 0.847368 | 0.40708 | 0 | 0 | 0 | 0 | 0.346774 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
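The exercise above never flushes or closes its file handle explicitly. The same program written with a context manager closes the file even if the write fails (a sketch; the prompt strings are the exercise's own):

```python
def save_user_file(filename, content):
    """Write content to filename; the with-block closes the file automatically."""
    with open(filename, "w+") as f:
        f.write(content)

# Interactive use, as in the exercise:
# save_user_file(input("Please input filename"), input("Please input content"))
```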
b16cd2c50420d1e6d132def2948468675ae9b60d | 720 | py | Python | tests/test_DataAugmenterExternally.py | AlexKay28/zarnitsa | c7e93423dcc1f000849f8c1e1f685e8a91b90f9c | [
"Apache-2.0"
] | 8 | 2021-07-19T18:25:03.000Z | 2021-10-05T15:25:20.000Z | tests/test_DataAugmenterExternally.py | AlexKay28/zarnitsa | c7e93423dcc1f000849f8c1e1f685e8a91b90f9c | [
"Apache-2.0"
] | 22 | 2021-07-26T19:13:32.000Z | 2021-10-09T18:56:07.000Z | tests/test_DataAugmenterExternally.py | AlexKay28/zarnitsa | c7e93423dcc1f000849f8c1e1f685e8a91b90f9c | [
"Apache-2.0"
] | 1 | 2021-08-10T12:24:00.000Z | 2021-08-10T12:24:00.000Z | import os
import sys
import pytest
import numpy as np
import pandas as pd
from scipy.stats import ks_2samp
sys.path.append("zarnitsa/")
from zarnitsa.stats import DataAugmenterExternally
N_TO_CHECK = 500
SIG = 0.5
@pytest.fixture
def dae():
    return DataAugmenterExternally()


@pytest.fixture
def normal_data():
    return pd.Series(np.random.normal(0, SIG * 3, size=N_TO_CHECK), dtype="float64")


def test_augment_column_permute(dae, normal_data):
    """
    Augment column with normal distribution
    """
    normal_data_aug = dae.augment_distrib_random(
        aug_type="normal", size=N_TO_CHECK, loc=0, scale=SIG * 3
    )
    assert ks_2samp(normal_data, normal_data_aug).pvalue > 0.01, "KS criteria"
| 20 | 84 | 0.730556 | 107 | 720 | 4.719626 | 0.485981 | 0.09901 | 0.047525 | 0.047525 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.026756 | 0.169444 | 720 | 35 | 85 | 20.571429 | 0.817726 | 0.054167 | 0 | 0.095238 | 0 | 0 | 0.049624 | 0 | 0 | 0 | 0 | 0 | 0.047619 | 1 | 0.142857 | false | 0 | 0.333333 | 0.095238 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
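The acceptance criterion above rests on the two-sample Kolmogorov-Smirnov test: if the augmented sample comes from the same distribution as the original, `ks_2samp(...).pvalue` stays large. A pure-stdlib sketch of the underlying statistic (this naive `ks_statistic` only illustrates the idea; it is not scipy's implementation, which also computes the p-value):

```python
import random

def ks_statistic(a, b):
    """Two-sample KS statistic: the largest gap between the empirical CDFs."""
    a, b = sorted(a), sorted(b)

    def ecdf(sample, x):
        # Fraction of (sorted) sample values <= x
        count = 0
        for v in sample:
            if v <= x:
                count += 1
            else:
                break
        return count / len(sample)

    points = sorted(set(a) | set(b))
    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in points)

random.seed(0)
same = [random.gauss(0, 1.5) for _ in range(500)]
other = [random.gauss(0, 1.5) for _ in range(500)]
shifted = [random.gauss(5, 1.5) for _ in range(500)]

assert ks_statistic(same, other) < 0.2    # same distribution: small gap
assert ks_statistic(same, shifted) > 0.5  # well-separated distributions: large gap
```

A small statistic maps to a large p-value, which is what the test's `pvalue > 0.01` assertion checks.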
b172a5ff4bd5c2830f5d2332f4e30cc2a061bc37 | 306 | py | Python | run2.py | akuz/deep-gen-mnist | 13d4d350a0dc9dc7f0111c839fb7158654f048c4 | [
"MIT"
] | null | null | null | run2.py | akuz/deep-gen-mnist | 13d4d350a0dc9dc7f0111c839fb7158654f048c4 | [
"MIT"
] | null | null | null | run2.py | akuz/deep-gen-mnist | 13d4d350a0dc9dc7f0111c839fb7158654f048c4 | [
"MIT"
] | null | null | null |
import numpy as np
import tensorflow as tf
import model
if __name__ == "__main__":
    print("Making level configs...")
    level_configs = model.default_level_configs()
    print("Making filter variables...")
    filters = model.make_filters(tf.get_default_graph(), level_configs)
    print("Done")
| 19.125 | 71 | 0.70915 | 39 | 306 | 5.179487 | 0.564103 | 0.237624 | 0.168317 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.176471 | 306 | 15 | 72 | 20.4 | 0.801587 | 0 | 0 | 0 | 0 | 0 | 0.2 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0.333333 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
b183550bc53fd30c394fa716585596aa04c10f32 | 99 | py | Python | tests/__init__.py | Fokko/example-library-python | b20b69c6dae93c32cd3d2c86a644abbf6b85199b | [
"Apache-2.0"
] | null | null | null | tests/__init__.py | Fokko/example-library-python | b20b69c6dae93c32cd3d2c86a644abbf6b85199b | [
"Apache-2.0"
] | null | null | null | tests/__init__.py | Fokko/example-library-python | b20b69c6dae93c32cd3d2c86a644abbf6b85199b | [
"Apache-2.0"
] | null | null | null | import sys, os
path = os.path.dirname(__file__)
if path not in sys.path:
    sys.path.append(path)
| 19.8 | 32 | 0.717172 | 18 | 99 | 3.722222 | 0.555556 | 0.179104 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.161616 | 99 | 4 | 33 | 24.75 | 0.807229 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b18dfbe911fad785c5c6176e1eec4c5f85de7b49 | 938 | py | Python | rabbitai/tasks/celery_app.py | psbsgic/rabbitai | 769e120ba605d56ac076f810a549c38dac410c8e | [
"Apache-2.0"
] | null | null | null | rabbitai/tasks/celery_app.py | psbsgic/rabbitai | 769e120ba605d56ac076f810a549c38dac410c8e | [
"Apache-2.0"
] | null | null | null | rabbitai/tasks/celery_app.py | psbsgic/rabbitai | 769e120ba605d56ac076f810a549c38dac410c8e | [
"Apache-2.0"
] | 1 | 2021-07-09T16:29:50.000Z | 2021-07-09T16:29:50.000Z | """
This is the main entrypoint used by Celery workers. As such,
it needs to call create_app() in order to initialize things properly
"""
from typing import Any
from celery.signals import worker_process_init
# Rabbitai framework imports
from rabbitai import create_app
from rabbitai.extensions import celery_app, db
# Init the Flask app / configure everything
flask_app = create_app()
# Need to import late, as the celery_app will have been setup by "create_app()"
# pylint: disable=wrong-import-position, unused-import
from . import cache, schedules, scheduler # isort:skip
# Export the celery app globally for Celery (as run on the cmd line) to find
app = celery_app
@worker_process_init.connect
def reset_db_connection_pool(**kwargs: Any) -> None:  # pylint: disable=unused-argument
    with flask_app.app_context():
        # https://docs.sqlalchemy.org/en/14/core/connections.html#engine-disposal
        db.engine.dispose()
| 32.344828 | 87 | 0.765458 | 141 | 938 | 4.971631 | 0.595745 | 0.051355 | 0.048502 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002519 | 0.153518 | 938 | 28 | 88 | 33.5 | 0.880353 | 0.553305 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.454545 | 0 | 0.545455 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
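Each forked Celery worker must not reuse the parent process's SQLAlchemy connection pool, which is why the module hooks `worker_process_init`. The connect-and-dispatch mechanics can be sketched with a toy signal (this `Signal` class is illustrative only, not Celery's implementation, which dispatches by sender and more):

```python
class Signal:
    """Minimal stand-in for a celery.signals-style connect/send pair."""

    def __init__(self):
        self._receivers = []

    def connect(self, fn):
        # Used as a decorator: registers fn and returns it unchanged
        self._receivers.append(fn)
        return fn

    def send(self, **kwargs):
        # Call every registered receiver with the signal's keyword arguments
        return [fn(**kwargs) for fn in self._receivers]


worker_process_init = Signal()

@worker_process_init.connect
def reset_pool(**kwargs):
    return "pool disposed"
```

When the (toy) worker fires the signal, every connected handler runs, mirroring how `reset_db_connection_pool` is invoked once per worker process.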
b18f8ac4ca91a60fabe49e7603be45706caf3334 | 52 | py | Python | chatbot/component/__init__.py | zgj0607/ChatBot | 3c6126754b9d037a04bd80d13874e2ae16b2c421 | [
"Apache-2.0"
] | null | null | null | chatbot/component/__init__.py | zgj0607/ChatBot | 3c6126754b9d037a04bd80d13874e2ae16b2c421 | [
"Apache-2.0"
] | null | null | null | chatbot/component/__init__.py | zgj0607/ChatBot | 3c6126754b9d037a04bd80d13874e2ae16b2c421 | [
"Apache-2.0"
] | null | null | null | __all__ = (
    'readonly_admin',
    'singleton'
)
| 10.4 | 21 | 0.576923 | 4 | 52 | 6.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.269231 | 52 | 4 | 22 | 13 | 0.657895 | 0 | 0 | 0 | 0 | 0 | 0.442308 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
b1a2e9e876bf7788f4968b9eb3b29a91a90c21c3 | 9,585 | py | Python | umich_daily.py | mpars0ns/scansio-sonar-es | ea7b1928277317b97c84443812da01af99ef0feb | [
"BSD-3-Clause"
] | 36 | 2015-10-14T21:17:16.000Z | 2022-01-21T16:34:24.000Z | umich_daily.py | mpars0ns/scansio-sonar-es | ea7b1928277317b97c84443812da01af99ef0feb | [
"BSD-3-Clause"
] | 5 | 2015-10-19T13:47:55.000Z | 2017-06-21T07:12:41.000Z | umich_daily.py | mpars0ns/scansio-sonar-es | ea7b1928277317b97c84443812da01af99ef0feb | [
"BSD-3-Clause"
] | 8 | 2016-04-28T09:34:20.000Z | 2022-01-21T16:34:23.000Z | import argparse
import sys
from multiprocessing import cpu_count, Process, Queue
import json
import logging
from datetime import datetime
from elasticsearch import Elasticsearch
from elasticsearch.helpers import bulk, scan
import hashlib
from helpers.certparser import process_cert
from helpers.hostparser import proccess_host
logger = logging.getLogger('SSLImporter')
logger_format = logging.Formatter('\033[1;32m%(levelname)-5s %(module)s:%(funcName)s():%(lineno)d %(asctime)s\033[0m| '
                                  '%(message)s')
stream_handler = logging.StreamHandler(sys.stdout)
stream_handler.setFormatter(logger_format)
logger.addHandler(stream_handler)
elastic_logger = logging.getLogger('elasticsearch')
elastic_logger.addHandler(stream_handler)
DEFAULT_SERVER = u'localhost'
DEFAULT_PORT = 9200
def process_scan_certs(q, es):
    """
    :param q: The Queue object that certs should be pulled off of
    :param es: An Elasticsearch connection. This way each worker has its own connection and you don't have to share it
               across multiple workers/processes
    :return:
    """
    bulk_certs = []
    while True:
        certs = q.get()
        if certs == "DONE":
            bulk(es, bulk_certs)
            return True
        for cert in certs['certs']:
            newcert = process_cert(cert)
            if newcert:
                newcert['import_date'] = certs['time']
                newcert['source'] = 'umich'
                newcert_action = {"_index": "passive-ssl-certs-umich", "_type": "cert", '_id': newcert['hash_id'],
                                  '_source': newcert}
                bulk_certs.append(newcert_action)
                if len(bulk_certs) == 500:
                    bulk(es, bulk_certs)
                    bulk_certs = []
def process_hosts(q, es, initial):
    """
    :param q: The Queue object that hosts should be pulled off of
    :param es: An Elasticsearch connection. This way each worker has its own connection and you don't have to share it
               across multiple workers/processes
    :param initial: If this is the initial upload then we set the first_seen = last_seen. Otherwise first_seen is left
                    blank and will be cleaned up later
    :return:
    """
    bulk_hosts = []
    while True:
        line = q.get()
        if line == "DONE":
            bulk(es, bulk_hosts)
            return True
        host = proccess_host(line)
        cert_hash = hashlib.sha1(host['host']+host['hash']+host['source'])
        cert_hash = cert_hash.hexdigest()
        if initial:
            host['first_seen'] = host['last_seen']
        action = {"_op_type": "update", "_index": 'passive-ssl-hosts-umich', "_type": "host", "_id": cert_hash,
                  "doc": host, "doc_as_upsert": "true"}
        bulk_hosts.append(action)
        if len(bulk_hosts) == 500:
            bulk(es, bulk_hosts)
            bulk_hosts = []
def parse_scanfile(f, host_queue, cert_queue):
    """
    :param f: json file from University of Michigan that has been lz4 decompressed.
    :param host_queue: Queue to send host info to
    :param cert_queue: Queue to send cert info to
    :return:
    """
    certs_set = set()
    with open(f) as scan_file:
        for line in scan_file:
            item = json.loads(line)
            item['log'].pop(0)
            for entry in item['log']:
                if entry['data']:
                    if 'server_certificates' in entry['data'] and entry['data']['server_certificates'] is not None:
                        if entry['data']['server_certificates']['certificate'] is not None:
                            if 'fingerprint_sha1' in entry['data']['server_certificates']['certificate']:
                                server_cert = entry['data']['server_certificates']['certificate']['fingerprint_sha1']
                                doc = {'host': item['host'], 'source': 'umich', 'last_seen': item['time'],
                                       'hash': server_cert}
                                host_queue.put(doc)
                                if server_cert in certs_set:
                                    pass  # We already have this sha1 and we don't need to attempt parsing it
                                else:
                                    if entry['data']['server_certificates']['certificate'] is not None:
                                        if 'raw' in entry['data']['server_certificates']:
                                            raw_cert = dict()
                                            raw_cert['time'] = item['time']
                                            raw_cert['certs'] = entry['data']['server_certificates']['raw']
                                        else:
                                            raw_cert = None
                                        if raw_cert:
                                            cert_queue.put(raw_cert)
                                            # We have added this hash to be processed so we
                                            # don't need to process it again
                                            certs_set.add(server_cert)
    print("Finished processing file....now printing the length of the certs set")
    print(len(certs_set))
def main(argv):
    parser = argparse.ArgumentParser()
    parser.add_argument('--server', default=DEFAULT_SERVER,
                        help=u'Elasticsearch hostname or IP (default {0})'.format(DEFAULT_SERVER))
    parser.add_argument('--port', default=DEFAULT_PORT,
                        help=u'Elasticsearch port (default {0})'.format(DEFAULT_PORT))
    parser.add_argument('--scanfile', help=u'Path to umich scan file you are ingesting. '
                                           u'Please make sure to decompress it')
    parser.add_argument('--initial', help=u'If this is the first file you are importing please use this flag',
                        action='store_true')
    args = parser.parse_args(argv[1:])

    if args.scanfile is None:
        logger.error("Please include a scanfile")
        sys.exit(1)

    workers = cpu_count()
    process_hosts_queue = Queue(maxsize=20000)
    process_certs_queue = Queue(maxsize=20000)

    for w in xrange(workers/2):
        # Establish elasticsearch connection for each process
        es = Elasticsearch([{u'host': args.server, u'port': args.port}], timeout=30)
        p = Process(target=process_hosts, args=(process_hosts_queue, es, args.initial))
        p.daemon = True
        p.start()

    for w in xrange(workers/2):
        # Establish elasticsearch connection for each process
        es = Elasticsearch([{u'host': args.server, u'port': args.port}], timeout=30)
        p = Process(target=process_scan_certs, args=(process_certs_queue, es))
        p.daemon = True
        p.start()

    logger.warning("Starting processing of {file} at {date}".format(file=args.scanfile, date=datetime.now()))
    # This is the bottleneck of the process but it works for now
    parse_scanfile(args.scanfile, process_hosts_queue, process_certs_queue)

    # Once all the json lines have been put onto the queue, add DONE so the queue workers know when to quit.
    for w in xrange(workers):
        process_hosts_queue.put("DONE")
        process_certs_queue.put("DONE")

    # Close out the queues; we are done
    process_hosts_queue.close()
    process_hosts_queue.join_thread()
    process_certs_queue.close()
    process_certs_queue.join_thread()

    # this is kinda dirty but without looking up everything at insert time (slow) I don't know of a better way to do
    # this based on the number of documents we will have
    refresh_es = Elasticsearch([{u'host': args.server, u'port': args.port}], timeout=30)

    # construct an elasticsearch query where the filter is looking for any entry that is missing the field first_seen
    q = {'size': 500, "query": {"match_all": {}}, "filter": {"missing": {"field": "first_seen"}}}
    new_updates = refresh_es.search(index='passive-ssl-hosts-umich', body=q)
    logger.warning("Number of hosts to update is {count}".format(count=new_updates['hits']['total']))

    # Scan across all the documents missing the first_seen field and bulk update them
    missing_first_seen = scan(refresh_es, query=q, scroll='30m', index='passive-ssl-hosts-umich')
    bulk_miss = []
    for miss in missing_first_seen:
        last_seen = miss['_source']['last_seen']
        first_seen = last_seen
        action = {"_op_type": "update", "_index": "passive-ssl-hosts-umich", "_type": "host", "_id": miss['_id'],
                  "doc": {'first_seen': first_seen}}
        bulk_miss.append(action)
        if len(bulk_miss) == 500:
            bulk(refresh_es, bulk_miss)
            bulk_miss = []

    # Get the remaining ones that are less than 500 after the loop has ended
    bulk(refresh_es, bulk_miss)
    logger.warning("{file} import finished at {date}".format(file=args.scanfile, date=datetime.now()))

    # Now we should optimize each index to max num segments of 1 to help with searching/sizing and just over all
    # es happiness
    logger.warning("Optimizing index: {index} at {date}".format(index='passive-ssl-hosts-umich', date=datetime.now()))
    refresh_es.indices.optimize(index='passive-ssl-hosts-umich', max_num_segments=1, request_timeout=7500)
    logger.warning("Optimizing index: {index} at {date}".format(index='passive-ssl-certs-umich', date=datetime.now()))
    refresh_es.indices.optimize(index='passive-ssl-certs-umich', max_num_segments=1, request_timeout=7500)


if __name__ == "__main__":
    main(sys.argv)
| 45.212264 | 119 | 0.605842 | 1,202 | 9,585 | 4.682196 | 0.242928 | 0.017591 | 0.023987 | 0.033582 | 0.298152 | 0.22779 | 0.219261 | 0.219261 | 0.205757 | 0.190476 | 0 | 0.010377 | 0.286176 | 9,585 | 211 | 120 | 45.42654 | 0.812189 | 0.103495 | 0 | 0.191781 | 0 | 0.006849 | 0.195601 | 0.034878 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.068493 | 0.10274 | null | null | 0.027397 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
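The script above builds deterministic Elasticsearch `_id` values by hashing `host + hash + source`, so re-importing the same observation updates the existing document instead of creating a duplicate. A Python 3 sketch of that id scheme (the original script is Python 2 and hashes the concatenated `str` directly; the `.encode()` step here is the py3 adaptation):

```python
import hashlib

def host_doc_id(host, cert_hash, source):
    """Stable document _id: sha1 of host + certificate hash + source."""
    return hashlib.sha1((host + cert_hash + source).encode("utf-8")).hexdigest()

doc_id = host_doc_id("1.2.3.4", "da39a3ee", "umich")
assert len(doc_id) == 40  # sha1 hex digests are 40 characters
```

The same (host, cert, source) triple always yields the same id, which is what makes the `doc_as_upsert` bulk actions idempotent.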
b1a435a669f2409d097f7f74a5d9ca3c12d7e85f | 1,944 | py | Python | isaactest/tests/recieve_verify_emails.py | jsharkey13/isaac-selenium-testing | fc57ec57179cf7d9f0bb5ef46d759792b2af3bc8 | [
"MIT"
] | null | null | null | isaactest/tests/recieve_verify_emails.py | jsharkey13/isaac-selenium-testing | fc57ec57179cf7d9f0bb5ef46d759792b2af3bc8 | [
"MIT"
] | 1 | 2016-01-15T11:28:06.000Z | 2016-01-25T17:09:18.000Z | isaactest/tests/recieve_verify_emails.py | jsharkey13/isaac-selenium-testing | fc57ec57179cf7d9f0bb5ef46d759792b2af3bc8 | [
"MIT"
] | 1 | 2019-05-14T16:53:49.000Z | 2019-05-14T16:53:49.000Z | from ..utils.log import log, INFO, ERROR, PASS
from ..utils.i_selenium import assert_tab, image_div
from ..tests import TestWithDependency
__all__ = ["recieve_verify_emails"]
#####
# Test : Recieve Verification Emails
#####
@TestWithDependency("RECIEVE_VERIFY_EMAILS", ["REQ_VERIFY_EMAILS"])
def recieve_verify_emails(driver, inbox, GUERRILLAMAIL, WAIT_DUR, **kwargs):
"""Test if the new verification emails are recieved.
- 'driver' should be a Selenium WebDriver.
- 'inbox' should be a GuerrillaInbox object.
- 'GUERRILLAMAIL' is the string URL of GuerrillaMail.
"""
verification_email_request_limit = 4
assert_tab(driver, GUERRILLAMAIL)
inbox.wait_for_email(WAIT_DUR, expected=verification_email_request_limit)
try:
verification_emails_recived = 0
log(INFO, "Checking if verification emails recieved.")
verification_emails = inbox.get_by_subject("Verify your email")
verification_emails_recived = len(verification_emails)
assert verification_emails_recived > 0
if verification_emails_recived != verification_email_request_limit:
log(ERROR, "Expected %s verification emails, received %s!" % (verification_email_request_limit, verification_emails_recived))
log(ERROR, "Will have to use last received verification email, not the last requested!")
else:
log(INFO, "Received all %s verification emails." % verification_email_request_limit)
for email in verification_emails:
email.image()
email.save_html_body()
log(PASS, "%s verification emails recieved." % verification_emails_recived)
return True
except AssertionError:
image_div(driver, "ERROR_recieve_verification")
log(ERROR, "Expected %s verification emails, no emails received! See 'ERROR_recieve_verification.png'!" % (verification_email_request_limit))
return False
| 44.181818 | 149 | 0.718107 | 224 | 1,944 | 5.955357 | 0.361607 | 0.215892 | 0.107946 | 0.130435 | 0.118441 | 0.052474 | 0 | 0 | 0 | 0 | 0 | 0.001926 | 0.19856 | 1,944 | 43 | 150 | 45.209302 | 0.8543 | 0.117284 | 0 | 0 | 0 | 0 | 0.251347 | 0.060443 | 0 | 0 | 0 | 0 | 0.137931 | 1 | 0.034483 | false | 0.068966 | 0.103448 | 0 | 0.206897 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
b1a911035784142a39959873000505c8b7d79b40 | 2,455 | py | Python | openshift/helper/openshift.py | flaper87/openshift-restclient-python | 13d5d86ca89035b9f596032e7a34f3cc33bf8f18 | [
"Apache-2.0"
] | null | null | null | openshift/helper/openshift.py | flaper87/openshift-restclient-python | 13d5d86ca89035b9f596032e7a34f3cc33bf8f18 | [
"Apache-2.0"
] | null | null | null | openshift/helper/openshift.py | flaper87/openshift-restclient-python | 13d5d86ca89035b9f596032e7a34f3cc33bf8f18 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
from __future__ import absolute_import
import json
from kubernetes.client import models as k8s_models
from kubernetes.client import apis as k8s_apis
from kubernetes.client.rest import ApiException
from urllib3.exceptions import MaxRetryError
from . import VERSION_RX
from .. import config
from ..client import models as openshift_models
from ..client import apis as openshift_apis
from ..client import ApiClient, ConfigurationObject
from .base import BaseObjectHelper
from .exceptions import OpenShiftException
class OpenShiftObjectHelper(BaseObjectHelper):
    @staticmethod
    def client_from_config(config_file, context):
        if not config_file:
            return ApiClient(config=ConfigurationObject())
        return config.new_client_from_config(config_file, context)

    @classmethod
    def available_apis(cls):
        apis = ['OapiApi']
        apis.extend([x for x in dir(openshift_apis) if VERSION_RX.search(x)])
        apis.extend([x for x in dir(k8s_apis) if VERSION_RX.search(x)])
        return apis

    @staticmethod
    def get_exception_class():
        return OpenShiftException

    @staticmethod
    def model_class_from_name(model_name):
        try:
            return getattr(openshift_models, model_name)
        except AttributeError:
            return getattr(k8s_models, model_name)

    @staticmethod
    def api_class_from_name(api_name):
        try:
            return getattr(openshift_apis, api_name)
        except AttributeError:
            return getattr(k8s_apis, api_name)

    def create_project(self, metadata, display_name=None, description=None):
        """ Creating a project requires using the project_request endpoint. """
        # TODO: handle admin-level project creation
        w, stream = self._create_stream(None)
        try:
            proj_req = openshift_models.V1ProjectRequest(metadata=metadata, display_name=display_name, description=description)
            openshift_apis.OapiApi(self.api_client).create_project_request(proj_req)
        except ApiException as exc:
            msg = json.loads(exc.body).get('message', exc.reason) if exc.body.startswith('{') else exc.body
            raise OpenShiftException(msg, status=exc.status)
        except MaxRetryError as ex:
            raise OpenShiftException(str(ex.reason))
        self._read_stream(w, stream, metadata.name)
        return self._wait_for_response(metadata.name, None, 'create')
| 35.071429 | 127 | 0.712424 | 296 | 2,455 | 5.706081 | 0.324324 | 0.035524 | 0.035524 | 0.030787 | 0.170515 | 0.136175 | 0.023683 | 0 | 0 | 0 | 0 | 0.004132 | 0.211405 | 2,455 | 69 | 128 | 35.57971 | 0.868285 | 0.052546 | 0 | 0.173077 | 0 | 0 | 0.00906 | 0 | 0 | 0 | 0 | 0.014493 | 0 | 1 | 0.115385 | false | 0 | 0.25 | 0.019231 | 0.557692 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
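`create_project`'s first `except` clause distills a reusable pattern: an API error body may be JSON with a `message` field, JSON without one, or plain text, and each case needs a different fallback. Pulled out as a standalone helper (`api_error_message` is a hypothetical name for illustration, not part of the client):

```python
import json

def api_error_message(body, reason):
    """Extract a human-readable message from an API error body.

    Falls back to the HTTP reason phrase when the JSON body has no
    'message' field, and returns non-JSON bodies verbatim.
    """
    if body.startswith('{'):
        return json.loads(body).get('message', reason)
    return body

assert api_error_message('{"message": "forbidden"}', "Bad Request") == "forbidden"
assert api_error_message('{"code": 403}', "Bad Request") == "Bad Request"
assert api_error_message('plain text error', "Bad Request") == "plain text error"
```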
490c3a09e90ac7741bc5df730d26dac2764368fc | 40,373 | py | Python | TrainingExtensions/tensorflow/src/python/aimet_tensorflow/utils/op/fusedbatchnorm.py | quic-ykota/aimet | c897bd4c360e3a0fb7a329c6bb98b569f66bace1 | [
"BSD-3-Clause"
] | 945 | 2020-04-30T02:23:55.000Z | 2022-03-31T08:44:32.000Z | TrainingExtensions/tensorflow/src/python/aimet_tensorflow/utils/op/fusedbatchnorm.py | seaun163/aimet | de94e5522e0c9250fb422d064b77ef9ecc70f239 | [
"BSD-3-Clause"
] | 563 | 2020-05-01T03:07:22.000Z | 2022-03-30T05:35:58.000Z | TrainingExtensions/tensorflow/src/python/aimet_tensorflow/utils/op/fusedbatchnorm.py | seaun163/aimet | de94e5522e0c9250fb422d064b77ef9ecc70f239 | [
"BSD-3-Clause"
] | 186 | 2020-04-30T00:55:26.000Z | 2022-03-30T09:54:51.000Z | # /usr/bin/env python3.5
# -*- mode: python -*-
# =============================================================================
# @@-COPYRIGHT-START-@@
#
# Copyright (c) 2019-2020, Qualcomm Innovation Center, Inc. All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
# 1. Redistributions of source code must retain the above copyright notice,
# this list of conditions and the following disclaimer.
#
# 2. Redistributions in binary form must reproduce the above copyright notice,
# this list of conditions and the following disclaimer in the documentation
# and/or other materials provided with the distribution.
#
# 3. Neither the name of the copyright holder nor the names of its contributors
# may be used to endorse or promote products derived from this software
# without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
# SPDX-License-Identifier: BSD-3-Clause
#
# @@-COPYRIGHT-END-@@
# =============================================================================
""" utilities for fused batchnorm op """
from typing import Union
import numpy as np
import tensorflow as tf
from tensorflow.contrib import graph_editor as ge
from aimet_common.utils import AimetLogger
from aimet_tensorflow.utils import constants
logger = AimetLogger.get_area_logger(AimetLogger.LogAreas.Utils)
_BN_STRUCTURE_ERROR_MSG = "BN op doesn't have the expected structure"
class BNUtils:
    """ Batch Norm/ fused Batch Norm op related utils"""

    @staticmethod
    def skip_bn_op(sess: tf.compat.v1.Session, bn_op: tf.Operation, in_tensor: tf.Tensor, out_tensor: tf.Tensor):
        """
        Skip given bn op specified (fused batch norm op).
        Note: supports only Fused bn op types.

        :param sess: Tensorflow session
        :param bn_op: Batchnorm op to be skipped
        :param in_tensor: Input tensor to the batchnorm op
        :param out_tensor: Output tensor of the batchnorm op
        """
        if in_tensor is None or out_tensor is None:
            logger.error("Error, input and output tensors must be provided for skipping the op")
            assert False
        else:
            with sess.graph.as_default():
                if bn_op.type in ['FusedBatchNormV3', 'FusedBatchNorm']:
                    ge.detach_outputs(in_tensor.op)
                    ge.reroute_ts(in_tensor, out_tensor)
                    BNUtils.remove_bn_op_from_update_ops(sess, bn_op)
                else:
                    logger.error("Error, Unknown BN op")
                    assert False
    @staticmethod
    def _get_tensor_read_var_op_trainable_bn_op(input_tensor: tf.Tensor) -> tf.Tensor:
        """
        Generic helper to find a read op tensor associated with input tensor that can be evaluated, when the bn op is
        marked trainable.

        :param input_tensor: Input tensor to find corresponding read op tensor that can be evaluated
        :return: read var op type tensor as tf.Tensor type.
        """
        logger.debug('Fetching params from trainable BN op type')
        assert input_tensor.op.inputs[0].op.inputs is not None
        # input 0 is the beta tensor; get the readVarOp associated with it
        var_tensor = input_tensor.op.inputs[0].op.inputs[0]
        assert var_tensor.op.outputs is not None
        assert len(var_tensor.consumers()) >= 3

        tensor_consumers = var_tensor.consumers()

        var_read_tensor = None
        # get the read variable op tensor from these consumers
        # do not pick the one with _1; it is not fetch-able
        for consumer in tensor_consumers:
            if consumer.type == 'ReadVariableOp' and 'ReadVariableOp_1' not in consumer.name:
                assert consumer.outputs is not None
                var_read_tensor = consumer.outputs[0]
                break

        assert var_read_tensor is not None
        return var_read_tensor
    @staticmethod
    def get_beta_read_op(bn_op: tf.Operation) -> tf.Operation:
        """
        Get beta read op from BN op specified.

        :param bn_op: bn_op obtained from connected graph using get_modules (is mul_1 op inside BN scope)
        :return: beta read op
        """
        if bn_op.type in ['Mul']:
            # For regular BN
            # mul_1 -> add_1 <-- sub <-- beta_read
            assert len(bn_op.outputs) >= 1, _BN_STRUCTURE_ERROR_MSG
            add_1 = bn_op.outputs[0].consumers()[0]
            assert len(add_1.inputs) >= 2, _BN_STRUCTURE_ERROR_MSG
            sub = add_1.inputs[1].op
            assert len(sub.inputs) >= 1, _BN_STRUCTURE_ERROR_MSG
            beta_read = sub.inputs[0].op
        elif bn_op.type in ['FusedBatchNormV3', 'FusedBatchNorm']:
            assert len(bn_op.inputs) == 5
            beta_read = bn_op.inputs[constants.BN_OP_PARAM_INDICES['beta']].op
            if beta_read.type == 'Switch':      # tf slim bn using training tensor form
                beta_read = beta_read.inputs[0].op
                assert 'read' in beta_read.name
        else:
            logger.error("Error, unknown BN op")
            assert False

        assert beta_read.type in ['ReadVariableOp', 'Identity']     # Will be identity for tf slim BNs
        return beta_read

    @staticmethod
    def _get_beta_read_var_op_tensor_using_structure(bn_op: tf.Operation) -> tf.Tensor:
        """
        Get beta readVariableOp tensor from BN op specified.

        :param bn_op: FusedBatchNorm as tf.Operation
        :return: tensor associated with bn op beta readVariableOp type, as tf.Tensor
        """
        assert bn_op.type in ['FusedBatchNormV3', 'FusedBatchNorm', 'Mul']
        beta_read_tensor = BNUtils.get_beta_read_op(bn_op).outputs[0]

        assert beta_read_tensor is not None

        if beta_read_tensor.op.inputs[0].op.type == 'Switch':
            logger.debug('Fetching params from trainable BN op type')
            beta_read_tensor = BNUtils._get_tensor_read_var_op_trainable_bn_op(beta_read_tensor)

        return beta_read_tensor

    @staticmethod
    def get_beta_read_var_op_tensor(graph: tf.Graph, bn_op: tf.Operation) -> tf.Tensor:
        """
        Get beta readVariableOp tensor from BN op specified.

        :param graph: TensorFlow graph
        :param bn_op: FusedBatchNorm as tf.Operation
        :return: tensor associated with bn op beta readVariableOp type, as tf.Tensor
        """
        try:
            # try name based tensor look up for Keras layers
            beta_read_tensor = BNUtils._get_bn_param_tensor_using_name(graph, bn_op,
                                                                       constants.BNOpParamType.beta)
        except KeyError:
            # if we can't find the tensor name, use structure match
            # to figure out the read tensor for param
            beta_read_tensor = BNUtils._get_beta_read_var_op_tensor_using_structure(bn_op)

        return beta_read_tensor

    @staticmethod
    def get_beta_as_numpy_data(sess: tf.compat.v1.Session, bn_op: tf.Operation) -> np.ndarray:
        """
        Get beta param from BN op specified.

        :param sess: tensorflow session
        :param bn_op: bn_op as tf.Operation
        :return: beta tensor as numpy data
        """
        beta_tensor = BNUtils.get_beta_read_var_op_tensor(sess.graph, bn_op)

        with sess.graph.as_default():
            numpy_data = sess.run(beta_tensor)
        return numpy_data
@staticmethod
def get_gamma_as_read_op(bn_op: tf.Operation) -> tf.Operation:
"""
Get gamma read op from BN op specified.
:param bn_op: bn_op obtained from connected graph using get_modules (is mul_1 op inside BN scope)
:return: gamma read op
"""
if bn_op.type in ['Mul']:
# For regular BN
# mul_1 <-- mul <-- gamma_read <-- gamma_tensor
assert len(bn_op.inputs) >= 2, _BN_STRUCTURE_ERROR_MSG
mul = bn_op.inputs[1].op
assert len(mul.inputs) >= 2, _BN_STRUCTURE_ERROR_MSG
gamma_read = mul.inputs[1].op
elif bn_op.type in ['FusedBatchNormV3', 'FusedBatchNorm']:
assert len(bn_op.inputs) == 5
gamma_read = bn_op.inputs[constants.BN_OP_PARAM_INDICES['gamma']].op
if gamma_read.type == 'Switch': # tf slim bn using training tensor form
gamma_read = gamma_read.inputs[0].op
assert 'read' in gamma_read.name or gamma_read.type == 'Const'
else:
logger.error("Error, unknown BN op")
assert False
assert gamma_read.type in ['ReadVariableOp', 'Identity', 'Const'] # Will be identity for tf slim BNs
return gamma_read
@staticmethod
def _get_gamma_read_var_op_tensor_using_structure(bn_op: tf.Operation) -> tf.Tensor:
"""
Get the gamma read var op tensor associated with the batchnorm op.
:param bn_op: Batchnorm op to get gamma read var op tensor from
:return: Gamma read var op tensor associated with bn_op
"""
assert bn_op.type in ['FusedBatchNormV3', 'FusedBatchNorm', 'Mul']
gamma_read_tensor = BNUtils.get_gamma_as_read_op(bn_op).outputs[0]
assert gamma_read_tensor is not None
if gamma_read_tensor.op.inputs and gamma_read_tensor.op.inputs[0].op.type == 'Switch':
logger.debug('Fetching params from trainable BN op type')
gamma_read_tensor = BNUtils._get_tensor_read_var_op_trainable_bn_op(gamma_read_tensor)
return gamma_read_tensor
@staticmethod
def get_gamma_read_var_op_tensor(graph: tf.Graph, bn_op: tf.Operation) -> tf.Tensor:
"""
Get the gamma read var op tensor associated with the batchnorm op.
:param graph: TensorFlow graph
:param bn_op: Batchnorm op to get gamma read var op tensor from
:return: Gamma read var op tensor associated with bn_op
"""
try:
# try name based tensor look up for Keras layers
gamma_read_tensor = BNUtils._get_bn_param_tensor_using_name(graph, bn_op,
constants.BNOpParamType.gamma)
except KeyError:
# if we can't find the tensor name, use structure match
# to figure out the read tensor for param
gamma_read_tensor = BNUtils._get_gamma_read_var_op_tensor_using_structure(bn_op)
return gamma_read_tensor
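The lookup above follows a simple pattern: try a fast name-based lookup first, and fall back to structure matching only when the name is not found (KeyError). A TensorFlow-free sketch of that control flow; all names below (`lookup_gamma`, `tensors`, `fallback`) are hypothetical stand-ins for the BNUtils helpers:

```python
def lookup_gamma(tensors_by_name, op_name, structure_fallback):
    """Name-based lookup first; fall back to structure matching on KeyError."""
    try:
        # Fast path: derive the tensor name from the op name (Keras layers).
        return tensors_by_name[op_name + "/gamma/Read/ReadVariableOp:0"]
    except KeyError:
        # Slow path: walk the graph structure to find the read tensor.
        return structure_fallback(op_name)

# Hypothetical graph contents; only bn1 has a Keras-style tensor name.
tensors = {"bn1/gamma/Read/ReadVariableOp:0": "gamma_read_tensor"}
fallback = lambda name: "matched-by-structure:" + name

print(lookup_gamma(tensors, "bn1", fallback))  # -> gamma_read_tensor
print(lookup_gamma(tensors, "bn2", fallback))  # -> matched-by-structure:bn2
```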
@staticmethod
def get_gamma_as_numpy_data(sess: tf.compat.v1.Session, bn_op: tf.Operation) -> np.ndarray:
"""
Get gamma param from BN op specified.
:param sess: tensorflow session
:param bn_op: bn_op obtained from connected graph using get_modules (a mul_1 op inside BN scope)
:return: gamma as numpy data
"""
gamma_tensor = BNUtils.get_gamma_read_var_op_tensor(sess.graph, bn_op)
with sess.graph.as_default():
numpy_data = sess.run(gamma_tensor)
return numpy_data
@staticmethod
def _bn_op_var_struct_1(bn_op: tf.Operation) -> Union[tf.Operation, None]:
"""
Return moving_variance op corresponding to batchnorm with training tensor.
:param bn_op: bn_op obtained from connected graph using get_modules (a mul_1 op inside BN scope)
:return: Read operation for moving_variance
"""
try:
mul_op = bn_op.inputs[1].op
assert mul_op.type == 'Mul'
rsqrt_op = mul_op.inputs[0].op
assert rsqrt_op.type == 'Rsqrt'
add_op = rsqrt_op.inputs[0].op
assert add_op.type == 'AddV2'
merge_op = add_op.inputs[0].op
assert merge_op.type == 'Merge'
read_op = merge_op.inputs[0].op
assert read_op.type in ['ReadVariableOp']
return read_op
except: # pylint: disable=bare-except
return None
@staticmethod
def _bn_op_var_struct_2(bn_op: tf.Operation) -> Union[tf.Operation, None]:
"""
Return moving_variance op corresponding to batchnorm with training=True.
:param bn_op: bn_op obtained from connected graph using get_modules (a mul_1 op inside BN scope)
:return: Read operation for moving_variance
"""
try:
mul_op = bn_op.inputs[1].op
assert mul_op.type == 'Mul'
rsqrt_op = mul_op.inputs[0].op
assert rsqrt_op.type == 'Rsqrt'
add_op = rsqrt_op.inputs[0].op
assert add_op.type == 'AddV2'
squeeze_1_op = add_op.inputs[0].op
assert squeeze_1_op.type == 'Squeeze'
sub_op = squeeze_1_op.outputs[0].consumers()[0]
assert sub_op.type == 'Sub'
read_op = sub_op.inputs[0].op
assert read_op.type in ['ReadVariableOp']
return read_op
except: # pylint: disable=bare-except
return None
@staticmethod
def _bn_op_var_struct_3(bn_op: tf.Operation) -> Union[tf.Operation, None]:
"""
Return moving_variance op corresponding to batchnorm with training=False.
:param bn_op: bn_op obtained from connected graph using get_modules (a mul_1 op inside BN scope)
:return: Read operation for moving_variance
"""
try:
mul_op = bn_op.inputs[1].op
assert mul_op.type == 'Mul'
rsqrt_op = mul_op.inputs[0].op
assert rsqrt_op.type == 'Rsqrt'
add_op = rsqrt_op.inputs[0].op
assert add_op.type == 'AddV2'
read_op = add_op.inputs[0].op
assert read_op.type in ['ReadVariableOp']
return read_op
except: # pylint: disable=bare-except
return None
@staticmethod
def get_moving_variance_as_read_op(bn_op: tf.Operation) -> Union[tf.Operation, None]:
"""
Get moving variance read op from BN op specified.
:param bn_op: bn_op obtained from connected graph using get_modules (a mul_1 op inside BN scope)
:return: moving variance as read op
"""
# register handlers for different structures
bn_op_struct_for_variance_handlers = [BNUtils._bn_op_var_struct_1,
BNUtils._bn_op_var_struct_2,
BNUtils._bn_op_var_struct_3]
if bn_op.type in ['FusedBatchNormV3', 'FusedBatchNorm']:
assert len(bn_op.inputs) == 5
moving_var_read = bn_op.inputs[constants.BN_OP_PARAM_INDICES['movingvariance']].op
if moving_var_read.type == 'Switch': # tf slim bn using training tensor form
moving_var_read = moving_var_read.inputs[0].op
assert 'read' in moving_var_read.name
elif bn_op.type in ['Mul']:
# For regular BN
moving_var_read = None
# try all handlers available
for handler in bn_op_struct_for_variance_handlers:
if moving_var_read is None:
moving_var_read = handler(bn_op)
else:
break
assert moving_var_read is not None, _BN_STRUCTURE_ERROR_MSG
else:
logger.error("Error, unknown BN op")
assert False
if moving_var_read.type == 'Identity':
assert len(moving_var_read.inputs) == 1, _BN_STRUCTURE_ERROR_MSG
assert moving_var_read.type in ['ReadVariableOp', 'Const', 'Identity']
return moving_var_read
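The handler list above is a small chain of responsibility: each `_bn_op_var_struct_*` matcher returns either a read op or None, and the first non-None result wins. A minimal TensorFlow-free sketch of the same loop; the handler names and return values are illustrative only:

```python
def match_struct_1(op):
    """Pretend structure matcher for a training-tensor BN."""
    return "read_op_1" if op == "training_tensor_bn" else None

def match_struct_2(op):
    """Pretend structure matcher for a training=True BN."""
    return "read_op_2" if op == "training_true_bn" else None

def first_match(op, handlers):
    """Try handlers in order; return the first result that is not None."""
    result = None
    for handler in handlers:
        if result is None:
            result = handler(op)
        else:
            break
    assert result is not None, "No handler matched the BN structure"
    return result

handlers = [match_struct_1, match_struct_2]
print(first_match("training_true_bn", handlers))  # -> read_op_2
```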
@staticmethod
def _get_moving_variance_read_var_op_tensor_using_structure(bn_op: tf.Operation) -> tf.Tensor:
"""
Get moving variance readVariableOp tensor from BN op specified.
:param bn_op: FusedBatchNorm as tf.Operation
:return: tensor associated with bn op moving variance readVariableOp type, as tf.Tensor
"""
# supports fused BN as well as the non-fused (Mul) form
assert bn_op.type in ['FusedBatchNormV3', 'FusedBatchNorm', 'Mul']
moving_var_read_tensor = BNUtils.get_moving_variance_as_read_op(bn_op).outputs[0]
assert moving_var_read_tensor is not None
if moving_var_read_tensor.op.type == 'Const':
logger.debug("BN op has const type op for moving variance")
# get the sub_1 op associated with moving variance read op
assert len(bn_op.outputs) >= 3  # outputs[2] is accessed below
moving_avg_1_sub_1 = bn_op.outputs[2].consumers()[0]
all_inputs = moving_avg_1_sub_1.inputs
# among inputs figure out the read var op type that can be "evaluated"
for input_t in all_inputs:
if input_t.op.type == 'ReadVariableOp':
moving_var_read_tensor = input_t
elif input_t.op.type == 'Identity' and 'read:0' in input_t.name: # tf slim form
moving_var_read_tensor = input_t
elif moving_var_read_tensor.op.inputs[0].op.type == 'Switch':
logger.debug("Fetch moving var from a trainable BN op structure")
moving_var_read_tensor = BNUtils._get_tensor_read_var_op_trainable_bn_op(moving_var_read_tensor)
return moving_var_read_tensor
@staticmethod
def get_moving_variance_read_var_op_tensor(graph: tf.Graph, bn_op: tf.Operation) -> tf.Tensor:
"""
Get moving variance readVariableOp tensor from BN op specified.
:param graph: TensorFlow graph
:param bn_op: FusedBatchNorm as tf.Operation
:return: tensor associated with bn op moving variance readVariableOp type, as tf.Tensor
"""
try:
# try name based tensor look up for Keras layers
moving_var_read_tensor = BNUtils._get_bn_param_tensor_using_name(graph, bn_op,
constants.BNOpParamType.moving_variance)
except KeyError:
# if we can't find the tensor name, use structure match
# to figure out the read tensor for param
moving_var_read_tensor = BNUtils._get_moving_variance_read_var_op_tensor_using_structure(bn_op)
return moving_var_read_tensor
@staticmethod
def get_moving_variance_as_numpy_data(sess: tf.compat.v1.Session, bn_op: tf.Operation) -> np.ndarray:
"""
Get moving variance param from BN op specified.
:param sess: tensorflow session
:param bn_op: bn_op obtained from connected graph using get_modules (a mul_1 op inside BN scope)
:return: moving variance as numpy data
"""
moving_var_tensor = BNUtils.get_moving_variance_read_var_op_tensor(sess.graph, bn_op)
with sess.graph.as_default():
numpy_data = sess.run(moving_var_tensor)
return numpy_data
@staticmethod
def _bn_op_mean_struct_1(bn_op: tf.Operation) -> Union[tf.Operation, None]:
"""
Return moving_mean op corresponding to batchnorm with training tensor.
:param bn_op: bn_op obtained from connected graph using get_modules (a mul_1 op inside BN scope)
:return: Read operation for moving_mean
"""
try:
mul_op = bn_op.inputs[1].op
assert mul_op.type == 'Mul'
mul_2_op = mul_op.outputs[0].consumers()[1]
assert mul_2_op.type == 'Mul'
merge_op = mul_2_op.inputs[0].op
assert merge_op.type == 'Merge'
read_op = merge_op.inputs[0].op
assert read_op.type in ['ReadVariableOp']
return read_op
except: # pylint: disable=bare-except
return None
@staticmethod
def _bn_op_mean_struct_2(bn_op: tf.Operation) -> Union[tf.Operation, None]:
"""
Return moving_mean op corresponding to batchnorm with training=True.
:param bn_op: bn_op obtained from connected graph using get_modules (a mul_1 op inside BN scope)
:return: Read operation for moving_mean
"""
try:
mul_op = bn_op.inputs[1].op
assert mul_op.type == 'Mul'
mul_2_op = mul_op.outputs[0].consumers()[1]
assert mul_2_op.type == 'Mul'
squeeze_op = mul_2_op.inputs[0].op
assert squeeze_op.type == 'Squeeze'
sub_op = squeeze_op.outputs[0].consumers()[0]
assert sub_op.type == 'Sub'
read_op = sub_op.inputs[0].op
assert read_op.type in ['ReadVariableOp']
return read_op
except: # pylint: disable=bare-except
return None
@staticmethod
def _bn_op_mean_struct_3(bn_op: tf.Operation) -> Union[tf.Operation, None]:
"""
Return moving_mean op corresponding to batchnorm with training=False.
:param bn_op: bn_op obtained from connected graph using get_modules (a mul_1 op inside BN scope)
:return: Read operation for moving_mean
"""
try:
mul_op = bn_op.inputs[1].op
assert mul_op.type == 'Mul'
mul_2_op = mul_op.outputs[0].consumers()[1]
assert mul_2_op.type == 'Mul'
read_op = mul_2_op.inputs[0].op
assert read_op.type in ['ReadVariableOp']
return read_op
except: # pylint: disable=bare-except
return None
@staticmethod
def get_moving_mean_as_read_op(bn_op: tf.Operation) -> Union[tf.Operation, None]:
"""
Get moving mean read op from BN op specified.
:param bn_op: bn_op obtained from connected graph using get_modules (a mul_1 op inside BN scope)
:return: moving mean read op
"""
if bn_op.type in ['FusedBatchNormV3', 'FusedBatchNorm']:
assert len(bn_op.inputs) == 5
moving_mean_read = bn_op.inputs[constants.BN_OP_PARAM_INDICES['movingmean']].op
if moving_mean_read.type == 'Switch': # tf slim bn using training tensor form
moving_mean_read = moving_mean_read.inputs[0].op
assert 'read' in moving_mean_read.name
elif bn_op.type in ['Mul']:
# For regular BN
# mul_1 <-- mul --> mul_2 <-- cond/Merge <-- Switch_2 <-- moving_mean read <-- moving_mean tensor
# inputs[1] is mul; mul.inputs[1] is the gamma read op, whose input is the gamma tensor (a VariableV2)
# register handlers for different structures
bn_op_struct_for_mean_handlers = [BNUtils._bn_op_mean_struct_1,
BNUtils._bn_op_mean_struct_2,
BNUtils._bn_op_mean_struct_3]
moving_mean_read = None
# try all handlers available
for handler in bn_op_struct_for_mean_handlers:
if moving_mean_read is None:
moving_mean_read = handler(bn_op)
else:
break
assert moving_mean_read is not None, _BN_STRUCTURE_ERROR_MSG
else:
logger.error("Error, unknown BN op")
assert False
if moving_mean_read.type == 'Identity':
assert len(moving_mean_read.inputs) == 1, _BN_STRUCTURE_ERROR_MSG
assert moving_mean_read.type in ['ReadVariableOp', 'Const', 'Identity']
return moving_mean_read
@staticmethod
def _get_moving_mean_read_var_op_tensor_using_structure(bn_op: tf.Operation) -> tf.Tensor:
"""
Get moving mean readVariableOp tensor from BN op specified.
:param bn_op: FusedBatchNorm as tf.Operation
:return: tensor associated with bn op moving mean readVariableOp type, as tf.Tensor
"""
# supports fused BN as well as the non-fused (Mul) form
assert bn_op.type in ['FusedBatchNormV3', 'FusedBatchNorm', 'Mul']
moving_mean_read_tensor = BNUtils.get_moving_mean_as_read_op(bn_op).outputs[0]
assert moving_mean_read_tensor is not None
if moving_mean_read_tensor.op.type == 'Const':
logger.debug("BN op has const type op for moving mean")
# get the read var type from bn op
# get the sub_1 op associated with moving mean read op
assert len(bn_op.outputs) > 1
moving_avg_sub_1 = bn_op.outputs[1].consumers()[0]
all_inputs = moving_avg_sub_1.inputs
# among inputs figure out the read var op type that can be "evaluated"
for input_t in all_inputs:
if input_t.op.type == 'ReadVariableOp':
moving_mean_read_tensor = input_t
elif input_t.op.type == 'Identity' and 'read:0' in input_t.name: # tf slim form
moving_mean_read_tensor = input_t
elif moving_mean_read_tensor.op.inputs[0].op.type == 'Switch':
logger.debug("Fetch moving mean from a trainable BN op structure")
moving_mean_read_tensor = BNUtils._get_tensor_read_var_op_trainable_bn_op(moving_mean_read_tensor)
return moving_mean_read_tensor
@staticmethod
def get_moving_mean_read_var_op_tensor(graph: tf.Graph, bn_op: tf.Operation) -> tf.Tensor:
"""
Get moving mean readVariableOp tensor from BN op specified.
:param graph: TensorFlow graph
:param bn_op: FusedBatchNorm as tf.Operation
:return: tensor associated with bn op moving mean readVariableOp type, as tf.Tensor
"""
try:
# try name based tensor look up for Keras layers
moving_mean_read_tensor = BNUtils._get_bn_param_tensor_using_name(graph, bn_op,
constants.BNOpParamType.moving_mean)
except KeyError:
# if we can't find the tensor name, use structure match
# to figure out the read tensor for param
moving_mean_read_tensor = BNUtils._get_moving_mean_read_var_op_tensor_using_structure(bn_op)
return moving_mean_read_tensor
@staticmethod
def get_moving_mean_as_numpy_data(sess: tf.compat.v1.Session, bn_op: tf.Operation) -> np.ndarray:
"""
Get moving mean param from BN op specified.
:param sess: tensorflow session
:param bn_op: bn_op obtained from connected graph using get_modules (a mul_1 op inside BN scope)
:return: moving mean as numpy data
"""
moving_mean_tensor = BNUtils.get_moving_mean_read_var_op_tensor(sess.graph, bn_op)
with sess.graph.as_default():
numpy_data = sess.run(moving_mean_tensor)
return numpy_data
@staticmethod
def get_epsilon(bn_op: tf.Operation) -> float:
"""
Returns epsilon extracted from given bn op.
:param bn_op: bn_op obtained from connected graph using get_modules (a mul_1 op inside BN scope)
:return: epsilon value
"""
if bn_op.type in ['Mul']:
assert len(bn_op.inputs) >= 2, _BN_STRUCTURE_ERROR_MSG
mul = bn_op.inputs[1].op
assert len(mul.inputs) >= 1, _BN_STRUCTURE_ERROR_MSG
rsqrt = mul.inputs[0].op
assert len(rsqrt.inputs) >= 1, _BN_STRUCTURE_ERROR_MSG
add = rsqrt.inputs[0].op
assert len(add.inputs) >= 2, _BN_STRUCTURE_ERROR_MSG
epsilon = add.inputs[1].op
numpy_epsilon = epsilon.get_attr('value').float_val[0]
elif bn_op.type in ['FusedBatchNormV3', 'FusedBatchNorm']:
# epsilon can be derived as attribute value
numpy_epsilon = bn_op.get_attr("epsilon")
else:
logger.error("Error, unknown BN op")
assert False
return numpy_epsilon
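The epsilon extracted here is the small constant added to the variance before the reciprocal square root, which is exactly the `Mul <- Rsqrt <- AddV2` structure walked above. A plain-Python sketch of the batchnorm inference formula, with made-up parameter values:

```python
import math

def batchnorm_inference(x, mean, variance, gamma, beta, epsilon=1e-3):
    """y = gamma * (x - mean) / sqrt(variance + epsilon) + beta"""
    return gamma * (x - mean) / math.sqrt(variance + epsilon) + beta

# With variance = 0, epsilon alone keeps the denominator nonzero.
y = batchnorm_inference(x=2.0, mean=2.0, variance=0.0, gamma=1.0, beta=0.5)
print(y)  # -> 0.5
```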
@staticmethod
def get_assign_moving_avg_op(bn_op: tf.Operation) -> Union[tf.Operation, None]:
"""
Get assign_moving_avg op corresponding with the bn_op, if it exists.
:param bn_op: Batchnorm op to search for corresponding assign_moving_avg op
:return: assign_moving_op corresponding with the bn op, or None if it does not exist.
"""
assert bn_op.type in ['FusedBatchNormV3', 'FusedBatchNorm']
assert len(bn_op.outputs) == 6 or len(bn_op.outputs) == 5
if bn_op.outputs[1].consumers():
child_op = bn_op.outputs[1].consumers()[0]
if child_op.type == 'Merge':
sub_op = child_op.outputs[0].consumers()[0]
else:
sub_op = child_op
assert sub_op.type == 'Sub'
mul_op = sub_op.outputs[0].consumers()[0]
assert mul_op.type == 'Mul'
assign_moving_avg_op = mul_op.outputs[0].consumers()[0]
assert assign_moving_avg_op.type in ['AssignSub', 'AssignSubVariableOp']
return assign_moving_avg_op
return None
@staticmethod
def get_assign_moving_avg_1_op(bn_op: tf.Operation) -> Union[tf.Operation, None]:
"""
Get assign_moving_avg_1 op corresponding with the bn_op, if it exists.
:param bn_op: Batchnorm op to search for corresponding assign_moving_avg_1 op
:return: assign_moving_avg_1 corresponding with the bn op, or None if it does not exist.
"""
assert bn_op.type in ['FusedBatchNormV3', 'FusedBatchNorm']
assert len(bn_op.outputs) == 6 or len(bn_op.outputs) == 5
if bn_op.outputs[2].consumers():
child_op = bn_op.outputs[2].consumers()[0]
if child_op.type == 'Merge':
sub_op = child_op.outputs[0].consumers()[0]
else:
sub_op = child_op
assert sub_op.type == 'Sub'
mul_op = sub_op.outputs[0].consumers()[0]
assert mul_op.type == 'Mul'
assign_moving_avg_op = mul_op.outputs[0].consumers()[0]
assert assign_moving_avg_op.type in ['AssignSub', 'AssignSubVariableOp']
return assign_moving_avg_op
return None
@staticmethod
def remove_bn_op_from_update_ops(sess: tf.compat.v1.Session, bn_op: tf.Operation):
"""
Remove batchnorm assign_moving_avg and assign_moving_avg_1 ops from update ops.
:param sess: tf.compat.v1.Session
:param bn_op: BatchNorm operation whose assign_moving_avg and assign_moving_avg_1 ops should be removed.
"""
with sess.graph.as_default():
update_ops = tf.compat.v1.get_collection_ref(tf.compat.v1.GraphKeys.UPDATE_OPS)
assign_moving_avg_op = BNUtils.get_assign_moving_avg_op(bn_op)
assign_moving_avg_op_1 = BNUtils.get_assign_moving_avg_1_op(bn_op)
if assign_moving_avg_op and assign_moving_avg_op in update_ops:
update_ops.remove(assign_moving_avg_op)
logger.debug('Removed %s from update ops', assign_moving_avg_op.name)
if assign_moving_avg_op_1 and assign_moving_avg_op_1 in update_ops:
update_ops.remove(assign_moving_avg_op_1)
logger.debug('Removed %s from update ops', assign_moving_avg_op_1.name)
@staticmethod
def _get_bn_param_tensor_using_name(graph: tf.Graph, bn_op: tf.Operation, param_type: constants.BNOpParamType):
"""
Helper to get BN op param read tensor.
:param graph: TensorFlow graph
:param bn_op: BN op from which param read tensor is to be extracted
:param param_type: param type for which param tensor is to be extracted, as constants.BNOpParamType (supported
types are beta, gamma, moving_mean or moving_variance)
:return: param read tensor
"""
if param_type not in vars(constants.BNOpParamType).values():
assert 0, 'Error, _get_bn_param_tensor_using_name() invalid param type requested'
# name of the fused bn contains bn_name/FusedBatchNormV3 or
# bn_name/cond/FusedBatchNormV3_1
# we need only the bn_name to make param tensor names
op_name = bn_op.name.split('/')[0]
param_tensor_name = op_name + constants.BN_OP_PARAM_NAME_SUFFIX[param_type]
param_tensor = graph.get_tensor_by_name(param_tensor_name)
return param_tensor
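The name-based lookup relies on the convention that a fused BN op is named `bn_name/FusedBatchNormV3` (or `bn_name/cond/FusedBatchNormV3_1`), so splitting on `/` recovers the layer name, to which a fixed parameter suffix is appended. A standalone sketch; the suffix table below is an assumed stand-in for `constants.BN_OP_PARAM_NAME_SUFFIX`:

```python
# Assumed suffix table; the real values live in constants.BN_OP_PARAM_NAME_SUFFIX.
PARAM_NAME_SUFFIX = {
    "gamma": "/gamma/Read/ReadVariableOp:0",
    "beta": "/beta/Read/ReadVariableOp:0",
}

def param_tensor_name(bn_op_name, param_type):
    """Derive the parameter read-tensor name from the fused BN op name."""
    layer_name = bn_op_name.split("/")[0]
    return layer_name + PARAM_NAME_SUFFIX[param_type]

print(param_tensor_name("bn_1/cond/FusedBatchNormV3_1", "gamma"))
# -> bn_1/gamma/Read/ReadVariableOp:0
```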
@staticmethod
def _bn_op_momentum_struct_1(bn_op: tf.Operation) -> Union[float, None]:
"""
Return momentum value corresponding to batchnorm with training tensor.
:param bn_op: bn_op obtained from connected graph using get_modules (a mul_1 op inside BN scope)
:return: momentum value
"""
try:
mul_op = bn_op.inputs[1].op
assert mul_op.type == 'Mul'
mul_2_op = mul_op.outputs[0].consumers()[1]
assert mul_2_op.type == 'Mul'
merge_op = mul_2_op.inputs[0].op
assert merge_op.type == 'Merge'
switch_1_op = merge_op.outputs[0].consumers()[0]
assert switch_1_op.type == 'Switch'
sub_op = switch_1_op.outputs[1].consumers()[0]
assert sub_op.type == 'Sub'
assign_moving_avg_mul_op = sub_op.outputs[0].consumers()[0]
assert assign_moving_avg_mul_op.type == 'Mul'
decay_op = assign_moving_avg_mul_op.inputs[1].op
assert decay_op.type == 'Const'
decay = decay_op.get_attr('value').float_val[0]
return 1 - decay
except: # pylint: disable=bare-except
return None
@staticmethod
def _bn_op_momentum_struct_2(bn_op: tf.Operation) -> Union[float, None]:
"""
Return momentum value corresponding to batchnorm with training=True.
:param bn_op: bn_op obtained from connected graph using get_modules (a mul_1 op inside BN scope)
:return: momentum value
"""
try:
mul_op = bn_op.inputs[1].op
assert mul_op.type == 'Mul'
mul_2_op = mul_op.outputs[0].consumers()[1]
assert mul_2_op.type == 'Mul'
squeeze_op = mul_2_op.inputs[0].op
assert squeeze_op.type == 'Squeeze'
sub_op = squeeze_op.outputs[0].consumers()[0]
assert sub_op.type == 'Sub'
assign_moving_avg_mul_op = sub_op.outputs[0].consumers()[0]
assert assign_moving_avg_mul_op.type == 'Mul'
decay_op = assign_moving_avg_mul_op.inputs[1].op
assert decay_op.type == 'Const'
decay = decay_op.get_attr('value').float_val[0]
return 1 - decay
except: # pylint: disable=bare-except
return None
@staticmethod
def _fused_bn_op_momentum_struct_1(bn_op: tf.Operation) -> Union[float, None]:
"""
Return momentum value corresponding to fused batchnorm with training tensor.
:param bn_op: bn_op obtained from connected graph using get_modules (a mul_1 op inside BN scope)
:return: momentum value
"""
try:
merge_1_op = bn_op.outputs[1].consumers()[0]
assert merge_1_op.type == 'Merge'
sub_op = merge_1_op.outputs[0].consumers()[0]
assert sub_op.type == 'Sub'
mul_op = sub_op.outputs[0].consumers()[0]
assert mul_op.type == 'Mul'
sub_2_op = mul_op.inputs[1].op
assert sub_2_op.type == 'Sub'
merge_op = sub_2_op.inputs[1].op
assert merge_op.type == 'Merge'
decay_op = merge_op.inputs[1].op
assert decay_op.type == 'Const'
decay = decay_op.get_attr('value').float_val[0]
return decay
except: # pylint: disable=bare-except
return None
@staticmethod
def _fused_bn_op_momentum_struct_2(bn_op: tf.Operation) -> Union[float, None]:
"""
Return momentum value corresponding to fused batchnorm with training=True.
:param bn_op: bn_op obtained from connected graph using get_modules (a mul_1 op inside BN scope)
:return: momentum value
"""
try:
sub_op = bn_op.outputs[1].consumers()[0]
assert sub_op.type == 'Sub'
mul_op = sub_op.outputs[0].consumers()[0]
assert mul_op.type == 'Mul'
sub_2_op = mul_op.inputs[1].op
assert sub_2_op.type == 'Sub'
decay_op = sub_2_op.inputs[1].op
assert decay_op.type == 'Const'
decay = decay_op.get_attr('value').float_val[0]
return decay
except: # pylint: disable=bare-except
return None
@staticmethod
def get_momentum(bn_op: tf.Operation) -> Union[float, None]:
"""
Returns momentum extracted from given bn op. If the bn op is in training=False mode, momentum will be None.
:param bn_op: bn_op obtained from connected graph using get_modules (a mul_1 op inside BN scope)
:return: momentum value, or None if it could not be extracted
"""
# register handlers for different structures
bn_op_struct_for_momentum_handlers = [BNUtils._bn_op_momentum_struct_1,
BNUtils._bn_op_momentum_struct_2]
fused_bn_op_struct_for_momentum_handlers = [BNUtils._fused_bn_op_momentum_struct_1,
BNUtils._fused_bn_op_momentum_struct_2]
decay = None
if bn_op.type in ['Mul']:
# try all handlers available
for handler in bn_op_struct_for_momentum_handlers:
if decay is None:
decay = handler(bn_op)
else:
break
elif bn_op.type in ['FusedBatchNormV3', 'FusedBatchNorm']:
# try all handlers available
for handler in fused_bn_op_struct_for_momentum_handlers:
if decay is None:
decay = handler(bn_op)
else:
break
else:
logger.error("Error, unknown BN op")
assert False
return decay
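The handlers above recover momentum from the moving-average subgraph: Keras updates the statistic as `moving -= (moving - batch) * (1 - momentum)`, so the Const feeding the AssignSub multiply holds `1 - momentum` in the non-fused form, which is why those handlers return `1 - decay`. A TensorFlow-free check that this assign-sub form equals the familiar exponential-moving-average form:

```python
def assign_moving_avg(moving, batch, momentum):
    """Keras-style update: moving -= (moving - batch) * decay, with decay = 1 - momentum."""
    decay = 1.0 - momentum
    return moving - (moving - batch) * decay

def ema(moving, batch, momentum):
    """Equivalent exponential-moving-average form."""
    return momentum * moving + (1.0 - momentum) * batch

m = assign_moving_avg(moving=10.0, batch=2.0, momentum=0.9)
print(m)  # -> 9.2, identical to ema(10.0, 2.0, 0.9)
```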
@staticmethod
def get_training(bn_op: tf.Operation) -> Union[None, bool, tf.Tensor]:
"""
Returns either a boolean of whether the BN op training mode is True or False, or the is_training tensor
feeding into the BN op if it is using a tensor to determine the mode dynamically.
:param bn_op: bn_op obtained in the connected graph
:return: True or False for training mode, or tf.Tensor that determines the mode dynamically.
"""
assert bn_op.type in ['FusedBatchNormV3', 'FusedBatchNorm', 'Mul']
if bn_op.type == 'FusedBatchNormV3' or bn_op.type == 'FusedBatchNorm':
if 'FusedBatchNormV3_1' in bn_op.name:
switch_op = bn_op.inputs[0].op
pred_id_op = switch_op.inputs[1].op
training = pred_id_op.inputs[0]
else:
training = bn_op.get_attr('is_training')
return training
# Non fused batchnorm case
mul_op = bn_op.inputs[1].op
assert mul_op.type == 'Mul'
rsqrt_op = mul_op.inputs[0].op
assert rsqrt_op.type == 'Rsqrt'
add_op = rsqrt_op.inputs[0].op
assert add_op.type == 'AddV2'
add_input_op = add_op.inputs[0].op
if add_input_op.type == 'Squeeze':
return True
if add_input_op.type == 'ReadVariableOp':
return False
if add_input_op.type == 'Merge':
switch_op = add_input_op.inputs[1].op
assert switch_op.type == 'Switch'
pred_id_op = switch_op.inputs[1].op
assert pred_id_op.type == 'Identity'
return pred_id_op.inputs[0]
logger.error('Error, unknown BN structure')
return None
| 43.552319 | 118 | 0.626112 | 5,458 | 40,373 | 4.377794 | 0.065958 | 0.045032 | 0.012053 | 0.021344 | 0.792124 | 0.749435 | 0.709258 | 0.668662 | 0.620281 | 0.583243 | 0 | 0.010818 | 0.292497 | 40,373 | 926 | 119 | 43.599352 | 0.82569 | 0.28851 | 0 | 0.573585 | 0 | 0 | 0.071916 | 0.000922 | 0 | 0 | 0 | 0 | 0.216981 | 1 | 0.066038 | false | 0 | 0.011321 | 0 | 0.171698 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
# SPDX-License-Identifier: MIT
# Copyright 2022 hirmiura (https://github.com/hirmiura)
#
# Script that generates TamaTou
#
# Usage
# 1. Open the fontforge console (fontforge-console.bat)
# 2. Change into this directory
# 3. Run: ffpython TamaTou.py
# 4. Wait
#
# Orbitron → "orbitron" → orb → tama (Japanese for "ball") → Tama!
# Noto → "No Tofu" → and otherwise it is all tofu boxes → Tou!
# →→ TamaTou
#
import fontforge
for weight in ['Regular', 'Bold']:
print(f'Modifying Orbitron (Tama) {weight}')
fn = fontforge.open(f'Orbitron-{weight}.ttf')
fn.encoding = 'UnicodeFull'
fn.save(f'tmp1-{weight}.sfd')
fn.close()
print(f'Modifying Noto (Tou) {weight}')
fn = fontforge.open(f'NotoSansJP-{weight}.otf')
fn.encoding = 'UnicodeFull'
fn.cidFlatten()
# fn.ascent = 800
# fn.descent = 200
# fn.upos = -125
# fn.em = 1000
fn.save(f'tmp2-{weight}.sfd')
fn.close()
print('Generating')
name = 'TamaTou'
copyright = 'Copyright (c) 2022, Hiroshi Miura (https://github.com/hirmiura) with Reserved Font Name TamaTou.'
version = '1.0.0'
license = 'Open Font License'
fn = fontforge.open(f'tmp1-{weight}.sfd')
fn.fontname = name
fn.familyname = name
fn.fullname = name
fn.weight = weight
fn.version = version
fn.sfntRevision = None
fn.copyright = copyright
fn.appendSFNTName(0x411, 0, copyright)
fn.appendSFNTName(0x411, 1, name)
fn.appendSFNTName(0x411, 2, '')
fn.appendSFNTName(0x411, 3, '')
fn.appendSFNTName(0x411, 4, name)
fn.appendSFNTName(0x411, 5, version)
fn.appendSFNTName(0x411, 6, name + '-' + weight)
fn.appendSFNTName(0x411, 7, '')
fn.appendSFNTName(0x411, 8, '')
fn.appendSFNTName(0x411, 9, '')
fn.appendSFNTName(0x411, 10, '')
fn.appendSFNTName(0x411, 11, '')
fn.appendSFNTName(0x411, 12, '')
fn.appendSFNTName(0x411, 13, license)
fn.appendSFNTName(0x411, 14, '')
fn.appendSFNTName(0x411, 15, '')
fn.appendSFNTName(0x411, 16, name)
fn.appendSFNTName(0x411, 17, '')
fn.appendSFNTName(0x409, 0, copyright)
fn.appendSFNTName(0x409, 1, name)
fn.appendSFNTName(0x409, 2, '')
fn.appendSFNTName(0x409, 3, '')
fn.appendSFNTName(0x409, 4, name)
fn.appendSFNTName(0x409, 5, version)
fn.appendSFNTName(0x409, 6, name + '-' + weight)
fn.appendSFNTName(0x409, 7, '')
fn.appendSFNTName(0x409, 8, '')
fn.appendSFNTName(0x409, 9, '')
fn.appendSFNTName(0x409, 10, '')
fn.appendSFNTName(0x409, 11, '')
fn.appendSFNTName(0x409, 12, '')
fn.appendSFNTName(0x409, 13, license)
fn.appendSFNTName(0x409, 14, '')
fn.appendSFNTName(0x409, 15, '')
fn.appendSFNTName(0x409, 16, name)
fn.appendSFNTName(0x409, 17, '')
# fn.mergeFonts(f'tmp1-{weight}.sfd')
fn.mergeFonts(f'tmp2-{weight}.sfd')
fn.save(f'tmp3-{weight}.sfd')
fn.generate(f'TamaTou-{weight}.otf')
fn.close()
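The repeated appendSFNTName calls fill the OpenType name table: the first argument is a Windows language ID (0x411 = Japanese, 0x409 = US English) and the second is the name ID (0 = copyright, 1 = family, 4 = full name, 5 = version, 6 = PostScript name, 13 = license description, per the OpenType spec). A small lookup table makes the pattern explicit; this is a documentation sketch, not fontforge API:

```python
# Name-table IDs used above (per the OpenType spec).
LANG_IDS = {0x411: "Japanese", 0x409: "English (US)"}
NAME_IDS = {0: "Copyright", 1: "Family", 4: "Full name",
            5: "Version", 6: "PostScript name", 13: "License"}

def describe(lang_id, name_id):
    """Human-readable label for one appendSFNTName(lang, id, value) slot."""
    return f"{LANG_IDS[lang_id]} {NAME_IDS[name_id]}"

print(describe(0x409, 6))  # -> English (US) PostScript name
```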
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from __future__ import print_function
"""
bambu
------
pandas RDF functionality
Installation
--------------
::
# pip install pandas
pip install rdflib
"""
import sys
import pandas as pd
import rdflib
def bambu():
"""
mainfunc
"""
pass
def to_rdf(df):
"""
Args:
df (DataFrame): pandas DataFrame to serialize to RDF
kwargs (dict): TODO
Returns:
rdflib.Graph: a serializable RDFLib Graph
"""
def read_rdf(path, **kwargs):
"""
Args:
path (str): path to an RDF source
kwargs (dict): TODO
Returns:
DataFrame: pandas DataFrame
"""
def to_rdfa(df, **kwargs):
"""
Args:
df (DataFrame): pandas DataFrame to serialize to RDF
kwargs (dict): TODO
Returns:
(list, StringIO): namespaces, RDFa table
"""
def read_rdfa(path, **kwargs):
"""
Args:
path (str): path to an RDF source
kwargs (dict): TODO
Returns:
DataFrame: pandas DataFrame
"""
def to_jsonld(df, **kwargs):
"""
Args:
df (DataFrame): pandas DataFrame to serialize to RDF
kwargs (dict): TODO
Returns:
(context, StringIO): JSONLD context, JSONLD data
"""
def read_jsonld(path, **kwargs):
"""
Args:
path (str): path to a JSONLD source
kwargs (dict): TODO
Returns:
DataFrame: pandas DataFrame
"""
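The stubs above leave the DataFrame-to-RDF mapping unspecified (the bodies are TODO). One plausible row-to-triples scheme (subject = row index, predicate = column name, object = cell value) can be sketched without pandas or rdflib; the `base` namespace and function name below are assumptions, not the module's real API:

```python
def rows_to_triples(columns, rows, base="http://example.org/"):
    """Emit one (subject, predicate, object) triple per cell.

    `base` is an assumed namespace; the real to_rdf() mapping is still TODO.
    """
    triples = []
    for i, row in enumerate(rows):
        subject = f"{base}row/{i}"
        for col, value in zip(columns, row):
            triples.append((subject, base + col, value))
    return triples

triples = rows_to_triples(["A", "B"], [[1, 2], [3, 4]])
print(len(triples))  # -> 4
```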
import unittest
class Test_bambu(unittest.TestCase):
def test_bambu(self):
pass
def test_10_to_rdf(self):
df = pd.DataFrame([[1, 2], [3, 4]], columns=['A', 'B'])
output = to_rdf(df)
print(output)
self.assertTrue(output)
def main(*args):
import optparse
import logging
prs = optparse.OptionParser(usage="%prog: [args]")
prs.add_option('-v', '--verbose',
dest='verbose',
action='store_true',)
prs.add_option('-q', '--quiet',
dest='quiet',
action='store_true',)
prs.add_option('-t', '--test',
dest='run_tests',
action='store_true',)
args = args and list(args) or sys.argv[1:]
(opts, args) = prs.parse_args(args)
if not opts.quiet:
logging.basicConfig()
if opts.verbose:
logging.getLogger().setLevel(logging.DEBUG)
if opts.run_tests:
sys.argv = [sys.argv[0]] + args
import unittest
sys.exit(unittest.main())
bambu()
return 0
if __name__ == "__main__":
sys.exit(main())
4924d0f3858273f23eb72e262ac3af691158f5e6 | 835 | py | Python | invenio_app_ils/ill/loaders/jsonschemas/borrowing_request.py | equadon/invenio-app-ils | 42ba282968d0aa28fb1bfc71d0709685165aaec4 | [
"MIT"
] | null | null | null | invenio_app_ils/ill/loaders/jsonschemas/borrowing_request.py | equadon/invenio-app-ils | 42ba282968d0aa28fb1bfc71d0709685165aaec4 | [
"MIT"
] | null | null | null | invenio_app_ils/ill/loaders/jsonschemas/borrowing_request.py | equadon/invenio-app-ils | 42ba282968d0aa28fb1bfc71d0709685165aaec4 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
#
# Copyright (C) 2019-2020 CERN.
#
# invenio-app-ils is free software; you can redistribute it and/or modify it
# under the terms of the MIT License; see LICENSE file for more details.
"""BorrowingRequest schema for marshmallow loader."""
from invenio_records_rest.schemas import RecordMetadataSchemaJSONV1
from marshmallow import EXCLUDE, fields
class BorrowingRequestSchemaV1(RecordMetadataSchemaJSONV1):
"""BorrowingRequest schema."""
class Meta:
"""Meta attributes for the schema."""
unknown = EXCLUDE
cancel_reason = fields.Str()
document_pid = fields.Str(required=True)
name = fields.Str(required=True)
notes = fields.Str()
library_pid = fields.Str(required=True) # TODO: validate
status = fields.Str(required=True) # TODO: this should be an enum
493827951fe9c01069f538a18ee56a8c22b8b962 | 1,355 | py | Python | screenpy/questions/body_of_the_last_response.py | perrygoy/screenpy | 862c0d7e5ff9f1265e520ab383c04ddbd4d060eb | [
"MIT"
] | 39 | 2019-03-22T15:18:23.000Z | 2022-02-23T17:32:03.000Z | screenpy/questions/body_of_the_last_response.py | perrygoy/screenpy | 862c0d7e5ff9f1265e520ab383c04ddbd4d060eb | [
"MIT"
] | 63 | 2019-07-17T06:25:19.000Z | 2022-01-13T07:03:53.000Z | screenpy/questions/body_of_the_last_response.py | bandophahita/screenpy | db0f3ef91a891b9d095016d83fa4b589620808ce | [
"MIT"
] | 15 | 2019-07-09T11:02:56.000Z | 2021-12-24T07:43:56.000Z | """
Investigate the body of the last API response received by the Actor.
"""
from json.decoder import JSONDecodeError
from typing import Union

from screenpy import Actor
from screenpy.abilities import MakeAPIRequests
from screenpy.exceptions import UnableToAnswer
from screenpy.pacing import beat


class BodyOfTheLastResponse:
    """Ask about the body of the last API response received by the Actor.

    Abilities Required:
        |MakeAPIRequests|

    Examples::

        the_actor.should(
            See.the(BodyOfTheLastResponse(), ContainsTheEntry(play="Hamlet"))
        )

        the_actor.should(
            See.the(BodyOfTheLastResponse(), ReadsExactly("To be, or not to be"))
        )
    """

    def describe(self) -> str:
        """Describe the Question."""
        return "The body of the last response."

    @beat("{} examines the body of the last response they received.")
    def answered_by(self, the_actor: Actor) -> Union[dict, str]:
        """Direct the Actor to investigate the body of the last response."""
        responses = the_actor.ability_to(MakeAPIRequests).responses
        if len(responses) < 1:
            raise UnableToAnswer(f"{the_actor} has not yet received any API responses.")
        try:
            return responses[-1].json()
        except JSONDecodeError:
            return responses[-1].text
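The try/except at the end implements a JSON-or-text fallback: parse the body as JSON if possible, otherwise hand back the raw text. A self-contained sketch of the same pattern, with a made-up `FakeResponse` standing in for the real HTTP response object:

```python
import json
from json.decoder import JSONDecodeError


class FakeResponse:
    """Hypothetical stand-in for an HTTP response (not screenpy's real class)."""

    def __init__(self, text):
        self.text = text

    def json(self):
        return json.loads(self.text)


def body_of(response):
    """Return the parsed JSON body, falling back to the raw text."""
    try:
        return response.json()
    except JSONDecodeError:
        return response.text


print(body_of(FakeResponse('{"play": "Hamlet"}')))
print(body_of(FakeResponse("To be, or not to be")))
```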
| 30.111111 | 88 | 0.667159 | 160 | 1,355 | 5.60625 | 0.4 | 0.071349 | 0.050167 | 0.06689 | 0.296544 | 0.296544 | 0.100334 | 0.100334 | 0.100334 | 0.100334 | 0 | 0.002927 | 0.243542 | 1,355 | 44 | 89 | 30.795455 | 0.872195 | 0.35941 | 0 | 0 | 0 | 0 | 0.170398 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.333333 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
4941a07b8598fcd71acf4d8decca54a679038504 | 1,257 | py | Python | urlbrevity/test_urlconf.py | kezabelle/django-urlbrevity | a8b779587986c60c4e0597aead908d954480f0f9 | [
"BSD-2-Clause-FreeBSD"
] | 4 | 2015-02-13T16:20:41.000Z | 2020-07-02T18:45:50.000Z | urlbrevity/test_urlconf.py | kezabelle/django-urlbrevity | a8b779587986c60c4e0597aead908d954480f0f9 | [
"BSD-2-Clause-FreeBSD"
] | null | null | null | urlbrevity/test_urlconf.py | kezabelle/django-urlbrevity | a8b779587986c60c4e0597aead908d954480f0f9 | [
"BSD-2-Clause-FreeBSD"
] | null | null | null | # -*- coding: utf-8 -*-
from pytest import raises

from django.contrib import admin
from django.core.urlresolvers import reverse
from django.core.urlresolvers import resolve
from django.core.urlresolvers import NoReverseMatch
from django.core.urlresolvers import Resolver404
from django.http import HttpResponse

import urlbrevity

try:
    from django.conf.urls import patterns, url, include
except ImportError:  # pragma: no cover
    from django.conf.urls.defaults import patterns, url, include
finally:
    def just_a_view(request, pk):
        return HttpResponse(str(pk))

urlpatterns = patterns("",
    url(regex=r'^test_user/(?P<pk>\d+)/?$',
        view=just_a_view),
    url(r'redirect/', include(urlbrevity.redirects)),
    url(r'admin/', include(admin.site.urls)),
)


def test_reversing():
    assert (reverse('urlbrevity:short', kwargs={'encoded_value': 'rQuX'})
            == '/redirect/rQuX')


def test_reversing_badchars():
    with raises(NoReverseMatch):
        reverse('urlbrevity:short', kwargs={'encoded_value': 'rQu1'})


def test_resolving():
    assert resolve('/redirect/rQuX').func == urlbrevity.do_redirect


def test_resolving_badchars():
    with raises(Resolver404):
        resolve('/redirect/rQu1')
| 28.568182 | 73 | 0.703262 | 152 | 1,257 | 5.723684 | 0.414474 | 0.091954 | 0.064368 | 0.11954 | 0.23908 | 0.091954 | 0 | 0 | 0 | 0 | 0 | 0.008637 | 0.171042 | 1,257 | 43 | 74 | 29.232558 | 0.826296 | 0.030231 | 0 | 0 | 0 | 0 | 0.121711 | 0.020559 | 0 | 0 | 0 | 0 | 0.0625 | 1 | 0.15625 | false | 0 | 0.34375 | 0.03125 | 0.53125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
49503dbeb658d944f139ed75ff92cfc671b7acd3 | 86 | py | Python | day5.py | GuiltyD/Python_code | db03c491824b66d842a7b4ff8aa45644233526a6 | [
"MIT"
] | null | null | null | day5.py | GuiltyD/Python_code | db03c491824b66d842a7b4ff8aa45644233526a6 | [
"MIT"
] | null | null | null | day5.py | GuiltyD/Python_code | db03c491824b66d842a7b4ff8aa45644233526a6 | [
"MIT"
] | null | null | null | with open('./day4.py') as f:
    # Read in fixed-size chunks until read() returns the '' sentinel.
    for chunk in iter(lambda: f.read(10), ''):
        print(chunk)
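The two-argument form of `iter()` keeps calling the function until it returns the sentinel (here the empty string that `read()` yields at end of file), which is what chunks the file. A self-contained demonstration using an in-memory buffer instead of `day4.py`:

```python
import io

# iter(callable, sentinel) calls the callable until it returns the sentinel.
buf = io.StringIO("abcdefghij" * 3)
chunks = list(iter(lambda: buf.read(10), ''))
print(chunks)
```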
| 17.2 | 41 | 0.55814 | 14 | 86 | 3.428571 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.044118 | 0.209302 | 86 | 4 | 42 | 21.5 | 0.661765 | 0 | 0 | 0 | 0 | 0 | 0.105882 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
4958f00172a7990bcba76c17970e13446ea6dcfc | 8,498 | py | Python | backend/src/contaxy/operations/deployment.py | ml-tooling/contaxy | 3317a866c2ef641667a2d318885c8b0f5096b56a | [
"MIT"
] | 3 | 2021-10-17T23:25:05.000Z | 2022-02-03T21:40:59.000Z | backend/src/contaxy/operations/deployment.py | ml-tooling/contaxy | 3317a866c2ef641667a2d318885c8b0f5096b56a | [
"MIT"
] | 14 | 2021-11-09T15:24:29.000Z | 2022-03-11T13:26:04.000Z | backend/src/contaxy/operations/deployment.py | ml-tooling/contaxy | 3317a866c2ef641667a2d318885c8b0f5096b56a | [
"MIT"
] | 3 | 2022-01-27T08:31:57.000Z | 2022-02-11T13:38:00.000Z | from abc import ABC, abstractmethod
from datetime import datetime
from typing import Any, List, Literal, Optional

from contaxy.schema import Job, JobInput, ResourceAction, Service, ServiceInput
from contaxy.schema.deployment import DeploymentType


# TODO: update_service functionality
class ServiceOperations(ABC):
    @abstractmethod
    def list_services(
        self,
        project_id: str,
        deployment_type: Literal[
            DeploymentType.SERVICE, DeploymentType.EXTENSION
        ] = DeploymentType.SERVICE,
    ) -> List[Service]:
        """Lists all services associated with the given project.

        Args:
            project_id (str): The project ID to filter the services.
            deployment_type (One of [DeploymentType.SERVICE, DeploymentType.EXTENSION]): The deployment type of either Service or Extension (which is a subtype of Service).

        Returns:
            List[Service]: The list of services associated with the project.
        """
        pass

    @abstractmethod
    def deploy_service(
        self,
        project_id: str,
        service: ServiceInput,
        action_id: Optional[str] = None,
        deployment_type: Literal[
            DeploymentType.SERVICE, DeploymentType.EXTENSION
        ] = DeploymentType.SERVICE,
    ) -> Service:
        """Deploys a service for the specified project.

        If no `action_id` is provided, the system will automatically select the best deployment option.

        Available deployment options (actions) can be requested via the [list_deploy_service_actions](#services/list_deploy_service_actions) operation.
        If the action is from an extension, the `action_id` must be a composite ID with the following format: `{extension_id}~{action_id}`.

        The action mechanism is further explained in the description of the [list_deploy_service_actions](#services/list_deploy_service_actions).

        Args:
            project_id (str): The ID of the project the service is deployed into.
            service (ServiceInput): The service input configuration.
            action_id (Optional[str], optional): The ID of the selected action. Defaults to `None`.
            deployment_type (One of [DeploymentType.SERVICE, DeploymentType.EXTENSION]): The deployment type of either Service or Extension (which is a subtype of Service).

        Returns:
            Service: The metadata of the deployed service.
        """
        pass

    @abstractmethod
    def list_deploy_service_actions(
        self,
        project_id: str,
        service: ServiceInput,
    ) -> List[ResourceAction]:
        """Lists all available service deployment options (actions).

        Args:
            project_id (str): The project ID associated with the service.
            service (ServiceInput): The service input configuration.

        Returns:
            List[ResourceAction]: Available deployment actions.
        """
        pass

    @abstractmethod
    def get_service_metadata(
        self,
        project_id: str,
        service_id: str,
    ) -> Service:
        """Returns the metadata of a single service.

        Args:
            project_id (str): The project ID associated with the service.
            service_id (str): The ID of the service.

        Returns:
            Service: The service metadata.
        """
        pass

    @abstractmethod
    def delete_service(
        self,
        project_id: str,
        service_id: str,
        delete_volumes: bool = False,
    ) -> None:
        """Deletes a service.

        Args:
            project_id (str): The project ID associated with the service.
            service_id (str): The ID of the service.
            delete_volumes (bool, optional): If `True`, all attached volumes will be deleted. Defaults to `False`.

        Raises:
            RuntimeError: If an error occurs during the deletion of the service.
        """
        pass

    @abstractmethod
    def delete_services(
        self,
        project_id: str,
    ) -> None:
        """Deletes all services associated with a project.

        Args:
            project_id (str): The project ID.
        """
        pass

    @abstractmethod
    def get_service_logs(
        self,
        project_id: str,
        service_id: str,
        lines: Optional[int],
        since: Optional[datetime],
    ) -> str:
        """Returns the logs of a service.

        Args:
            project_id (str): The ID of the project into which the service is deployed.
            service_id (str): The ID of the service.
            lines (Optional[int]): If provided, just the last `n` lines are returned from the log. Defaults to `None`.
            since (Optional[datetime]): If provided, just the logs since the given timestamp are returned. Defaults to `None`.

        Raises:
            NotImplementedError: If the operation is not supported by the deployment manager.
            RuntimeError: If reading the logs of the given service fails.

        Returns:
            str: The logs of the service.
        """
        pass

    @abstractmethod
    def suggest_service_config(
        self,
        project_id: str,
        container_image: str,
    ) -> ServiceInput:
        """Suggests an input configuration based on the provided `container_image`.

        The suggestion is based on metadata extracted from the container image (e.g. labels)
        as well as suggestions based on previous project deployments with the same image.

        Args:
            project_id (str): The project ID associated with the service.
            container_image (str): The container image to use as context for the suggestion.

        Returns:
            ServiceInput: The suggested service configuration.
        """
        pass

    @abstractmethod
    def list_service_actions(
        self,
        project_id: str,
        service_id: str,
    ) -> List[ResourceAction]:
        """Lists all actions available for the specified service.

        See the endpoint documentation for more information on the action mechanism.

        Args:
            project_id (str): The project ID associated with the service.
            service_id (str): The ID of the service.

        Returns:
            List[ResourceAction]: Available actions for the given service.
        """
        pass

    @abstractmethod
    def execute_service_action(
        self,
        project_id: str,
        service_id: str,
        action_id: str,
    ) -> Any:
        """Executes the selected service action.

        The actions need to be first requested from the list_service_actions operation.
        If the action is from an extension, the `action_id` must be a composite ID with the following format: `{extension_id}~{action_id}`.

        Args:
            project_id (str): The project ID associated with the service.
            service_id (str): The ID of the service.
            action_id (str): The ID of the selected action.

        Returns:
            `None` or a redirect response to another URL.
        """
        pass


class JobOperations(ABC):
    @abstractmethod
    def list_jobs(self, project_id: str) -> List[Job]:
        pass

    @abstractmethod
    def deploy_job(
        self,
        project_id: str,
        job: JobInput,
        action_id: Optional[str] = None,
    ) -> Job:
        pass

    @abstractmethod
    def list_deploy_job_actions(
        self,
        project_id: str,
        job: JobInput,
    ) -> List[ResourceAction]:
        pass

    @abstractmethod
    def suggest_job_config(self, project_id: str, container_image: str) -> JobInput:
        pass

    @abstractmethod
    def get_job_metadata(self, project_id: str, job_id: str) -> Job:
        pass

    @abstractmethod
    def delete_job(self, project_id: str, job_id: str) -> None:
        pass

    @abstractmethod
    def delete_jobs(
        self,
        project_id: str,
    ) -> None:
        """Deletes all jobs associated with a project.

        Args:
            project_id (str): The project ID.
        """
        pass

    @abstractmethod
    def get_job_logs(
        self,
        project_id: str,
        job_id: str,
        lines: Optional[int] = None,
        since: Optional[datetime] = None,
    ) -> str:
        pass

    @abstractmethod
    def list_job_actions(
        self,
        project_id: str,
        job_id: str,
    ) -> List[ResourceAction]:
        pass

    @abstractmethod
    def execute_job_action(self, project_id: str, job_id: str, action_id: str) -> Any:
        pass


class DeploymentOperations(ServiceOperations, JobOperations, ABC):
    pass
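The file above composes two abstract interfaces into a single `DeploymentOperations` type via multiple inheritance. A minimal standalone sketch of the same pattern (toy class and method names, not contaxy's real API):

```python
from abc import ABC, abstractmethod


class ServiceOps(ABC):
    @abstractmethod
    def list_services(self):
        ...


class JobOps(ABC):
    @abstractmethod
    def list_jobs(self):
        ...


# The combined interface stays abstract until a subclass implements everything.
class DeploymentOps(ServiceOps, JobOps, ABC):
    pass


class InMemoryManager(DeploymentOps):
    def list_services(self):
        return []

    def list_jobs(self):
        return []


manager = InMemoryManager()
print(manager.list_services(), manager.list_jobs())

# Instantiating the abstract base directly fails with a TypeError.
try:
    DeploymentOps()
except TypeError:
    print("DeploymentOps cannot be instantiated directly")
```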
| 29.922535 | 166 | 0.619087 | 967 | 8,498 | 5.317477 | 0.163392 | 0.048619 | 0.072345 | 0.062233 | 0.527227 | 0.449242 | 0.399455 | 0.314275 | 0.286659 | 0.253209 | 0 | 0 | 0.307249 | 8,498 | 283 | 167 | 30.028269 | 0.87345 | 0.485879 | 0 | 0.721429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003534 | 0 | 1 | 0.142857 | false | 0.15 | 0.035714 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
4960c8757588886d5cee0b290cbf124fc76beb18 | 552 | py | Python | config/ConfigSices.py | atosborges00/sereno_bot | 06bedb02847eff050adeadb6bcc5440bcd2283c3 | [
"FSFAP"
] | null | null | null | config/ConfigSices.py | atosborges00/sereno_bot | 06bedb02847eff050adeadb6bcc5440bcd2283c3 | [
"FSFAP"
] | null | null | null | config/ConfigSices.py | atosborges00/sereno_bot | 06bedb02847eff050adeadb6bcc5440bcd2283c3 | [
"FSFAP"
] | null | null | null | from os import path

from config.ConfigPaths import ConfigPaths


class ConfigSices:
    """ Sices platform useful URLs """
    LOGIN_URL = 'https://monitoramento.sicessolar.com.br/login'
    ANALYTICS_PAGE = 'https://monitoramento.sicessolar.com.br/analytics?und={code}'

    """ Sices platform default directories """
    RAW_DATA_PATH = path.join(ConfigPaths.RAW_DATA_PATH, "sices")

    """ Sices platform options settings """
    PREFERENCES = {
        'download.default_directory': RAW_DATA_PATH,
        'safebrowsing.enabled': 'false'}
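The `{code}` placeholder in `ANALYTICS_PAGE` is meant to be filled with `str.format` before the URL is requested. A quick sketch (the unit code `'1234'` is a made-up value for illustration):

```python
ANALYTICS_PAGE = 'https://monitoramento.sicessolar.com.br/analytics?und={code}'

# Fill the placeholder with a plant/unit code before navigating to the page.
url = ANALYTICS_PAGE.format(code='1234')  # '1234' is a hypothetical unit code
print(url)
```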
| 26.285714 | 83 | 0.697464 | 61 | 552 | 6.163934 | 0.590164 | 0.103723 | 0.087766 | 0.164894 | 0.175532 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.182971 | 552 | 20 | 84 | 27.6 | 0.833703 | 0.047101 | 0 | 0 | 0 | 0 | 0.368421 | 0.059497 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.222222 | 0 | 0.777778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
496d668dab143daad188848fbd26c751e580633a | 357 | py | Python | contentcuration/contentcuration/migrations/0059_merge.py | Tlazypanda/studio | cd1c2f169c705027cdd808cbbcae907d0a9b21d2 | [
"MIT"
] | 1 | 2019-03-30T18:14:25.000Z | 2019-03-30T18:14:25.000Z | contentcuration/contentcuration/migrations/0059_merge.py | Tlazypanda/studio | cd1c2f169c705027cdd808cbbcae907d0a9b21d2 | [
"MIT"
] | 4 | 2016-05-06T17:19:30.000Z | 2019-03-15T01:51:24.000Z | contentcuration/contentcuration/migrations/0059_merge.py | Tlazypanda/studio | cd1c2f169c705027cdd808cbbcae907d0a9b21d2 | [
"MIT"
] | 4 | 2016-10-18T22:49:08.000Z | 2019-09-17T11:20:51.000Z | # -*- coding: utf-8 -*-
# Generated by Django 1.9.7 on 2017-03-29 19:12
from __future__ import unicode_literals

from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ('contentcuration', '0058_auto_20170223_1636'),
        ('contentcuration', '0057_assessmentitem_deleted'),
    ]

    operations = [
    ]
| 21 | 59 | 0.680672 | 40 | 357 | 5.825 | 0.85 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.126316 | 0.201681 | 357 | 16 | 60 | 22.3125 | 0.691228 | 0.187675 | 0 | 0 | 1 | 0 | 0.278746 | 0.174216 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.222222 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
4974d677e63b39744893c4f6fa71c6ce00ac7913 | 2,240 | py | Python | ckanext/scheming/logic.py | vrk-kpa/ckanext-scheming | b82e20e04acdc4a71163675f843ac9be74f29d41 | [
"MIT"
] | null | null | null | ckanext/scheming/logic.py | vrk-kpa/ckanext-scheming | b82e20e04acdc4a71163675f843ac9be74f29d41 | [
"MIT"
] | null | null | null | ckanext/scheming/logic.py | vrk-kpa/ckanext-scheming | b82e20e04acdc4a71163675f843ac9be74f29d41 | [
"MIT"
] | 1 | 2021-12-15T12:50:40.000Z | 2021-12-15T12:50:40.000Z | from ckantoolkit import get_or_bust, side_effect_free, ObjectNotFound

from ckanext.scheming.helpers import (
    scheming_dataset_schemas, scheming_get_dataset_schema,
    scheming_group_schemas, scheming_get_group_schema,
    scheming_organization_schemas, scheming_get_organization_schema,
)


@side_effect_free
def scheming_dataset_schema_list(context, data_dict):
    '''
    Return a list of dataset types customized with the scheming extension
    '''
    return list(scheming_dataset_schemas())


@side_effect_free
def scheming_dataset_schema_show(context, data_dict):
    '''
    Return the scheming schema for a given dataset type

    :param type: the dataset type
    :param expanded: True to expand presets (default)
    '''
    t = get_or_bust(data_dict, 'type')
    expanded = data_dict.get('expanded', True)
    s = scheming_get_dataset_schema(t, expanded)
    if s is None:
        raise ObjectNotFound()
    return s


@side_effect_free
def scheming_group_schema_list(context, data_dict):
    '''
    Return a list of group types customized with the scheming extension
    '''
    return list(scheming_group_schemas())


@side_effect_free
def scheming_group_schema_show(context, data_dict):
    '''
    Return the scheming schema for a given group type

    :param type: the group type
    :param expanded: True to expand presets (default)
    '''
    t = get_or_bust(data_dict, 'type')
    expanded = data_dict.get('expanded', True)
    s = scheming_get_group_schema(t, expanded)
    if s is None:
        raise ObjectNotFound()
    return s


@side_effect_free
def scheming_organization_schema_list(context, data_dict):
    '''
    Return a list of organization types customized with the scheming extension
    '''
    return list(scheming_organization_schemas())


@side_effect_free
def scheming_organization_schema_show(context, data_dict):
    '''
    Return the scheming schema for a given organization type

    :param type: the organization type
    :param expanded: True to expand presets (default)
    '''
    t = get_or_bust(data_dict, 'type')
    expanded = data_dict.get('expanded', True)
    s = scheming_get_organization_schema(t, expanded)
    if s is None:
        raise ObjectNotFound()
    return s
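Each `*_schema_show` action follows the same shape: pull a required key out of `data_dict`, look up the schema, and raise if nothing is found. A stdlib-only sketch of that lookup pattern (the registry dict and the `get_or_bust` stand-in are made up here; CKAN's real `get_or_bust` raises its own validation error, not `KeyError`):

```python
SCHEMAS = {"dataset": {"fields": ["title"]}}  # stand-in schema registry


def get_or_bust(data_dict, key):
    """Return data_dict[key] or raise; a simplified stand-in for ckantoolkit's helper."""
    if key not in data_dict:
        raise KeyError(f"Missing required key: {key}")
    return data_dict[key]


def schema_show(data_dict):
    """Mirror the show-action shape: required key, lookup, raise on miss."""
    t = get_or_bust(data_dict, "type")
    schema = SCHEMAS.get(t)
    if schema is None:
        raise LookupError(f"No schema for type: {t}")
    return schema


print(schema_show({"type": "dataset"}))
```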
| 28 | 78 | 0.729018 | 297 | 2,240 | 5.222222 | 0.161616 | 0.061896 | 0.063185 | 0.065764 | 0.740812 | 0.740812 | 0.727273 | 0.63185 | 0.63185 | 0.448098 | 0 | 0 | 0.195982 | 2,240 | 79 | 79 | 28.35443 | 0.861188 | 0.275446 | 0 | 0.538462 | 0 | 0 | 0.023873 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.153846 | false | 0 | 0.051282 | 0 | 0.358974 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
4980cf418b1fec3383b451b2c9e98a8148676569 | 1,671 | py | Python | fitbenchmarking/parsing/base_parser.py | arm61/fitbenchmarking | c745c684e3ca4895a666eb863426746d8f06636c | [
"BSD-3-Clause"
] | null | null | null | fitbenchmarking/parsing/base_parser.py | arm61/fitbenchmarking | c745c684e3ca4895a666eb863426746d8f06636c | [
"BSD-3-Clause"
] | null | null | null | fitbenchmarking/parsing/base_parser.py | arm61/fitbenchmarking | c745c684e3ca4895a666eb863426746d8f06636c | [
"BSD-3-Clause"
] | null | null | null | """
Implements the base Parser as a Context Manager.
"""
from abc import ABCMeta, abstractmethod


class Parser(metaclass=ABCMeta):
    """
    Base abstract class for a parser.

    Further parsers should inherit from this and override the abstract parse()
    method.
    """

    def __init__(self, filename):
        """
        Store the filename for use by __enter__.

        :param filename: The path to the file to be parsed
        :type filename: string
        """
        self._filename = filename
        self.file = None
        self.fitting_problem = None

    def __enter__(self):
        """
        Called when used as a context manager.
        Opens the file ready for parsing.
        """
        self.file = open(self._filename, 'r')
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        """
        Called when used as a context manager.
        Closes the file.

        :param exc_type: Used if an exception occurs. Contains the
                         exception type.
        :type exc_type: type
        :param exc_value: Used if an exception occurs. Contains the exception
                          value.
        :type exc_value: Exception
        :param traceback: Used if an exception occurs. Contains the exception
                          traceback.
        :type traceback: traceback
        """
        try:
            self.file.close()
        except AttributeError:
            pass

    @abstractmethod
    def parse(self):
        """
        Parse the file into a FittingProblem.

        :returns: The parsed problem
        :rtype: FittingProblem
        """
        raise NotImplementedError
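A concrete subclass then only needs `parse()`, and the base class handles opening and closing the file. A self-contained sketch of the same context-manager pattern (a toy line-counting parser, not part of fitbenchmarking):

```python
import os
import tempfile


class LineCountParser:
    """Toy parser: file opened on __enter__, closed on __exit__."""

    def __init__(self, filename):
        self._filename = filename
        self.file = None

    def __enter__(self):
        self.file = open(self._filename, 'r')
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        if self.file is not None:
            self.file.close()

    def parse(self):
        # "Parsing" here just counts the lines of the open file.
        return sum(1 for _ in self.file)


# Write a small temporary file, then parse it inside the with-block.
fd, path = tempfile.mkstemp(text=True)
with os.fdopen(fd, 'w') as f:
    f.write("a\nb\nc\n")

with LineCountParser(path) as parser:
    count = parser.parse()
print(count)
os.remove(path)
```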
| 26.109375 | 78 | 0.581089 | 182 | 1,671 | 5.197802 | 0.412088 | 0.029598 | 0.031712 | 0.053911 | 0.201903 | 0.201903 | 0.201903 | 0.136364 | 0 | 0 | 0 | 0 | 0.351287 | 1,671 | 63 | 79 | 26.52381 | 0.872694 | 0.51167 | 0 | 0 | 0 | 0 | 0.001745 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0.055556 | 0.055556 | 0 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
498ebed60829fc81050f096acf226151f138af86 | 525 | py | Python | oTree/consent/__init__.py | jleutgeb/privilege | 2a4f15c98d94d9f1dbf1c4685c5e96d018d58abc | [
"MIT"
] | null | null | null | oTree/consent/__init__.py | jleutgeb/privilege | 2a4f15c98d94d9f1dbf1c4685c5e96d018d58abc | [
"MIT"
] | 11 | 2021-05-06T09:45:30.000Z | 2022-03-01T17:48:35.000Z | oTree/consent/__init__.py | jleutgeb/privilege | 2a4f15c98d94d9f1dbf1c4685c5e96d018d58abc | [
"MIT"
] | null | null | null | from otree.api import *

c = Currency

doc = """
Simple Consent App

Players may only continue after clicking the consent button.
"""


class Constants(BaseConstants):
    name_in_url = 'consent'
    players_per_group = None
    num_rounds = 1


class Subsession(BaseSubsession):
    pass


class Group(BaseGroup):
    pass


class Player(BasePlayer):
    consent = models.BooleanField(choices=[[True, 'Ja']])


# PAGES
class Consent(Page):
    form_model = "player"
    form_fields = ["consent"]


page_sequence = [Consent]
| 14.583333 | 61 | 0.693333 | 63 | 525 | 5.650794 | 0.746032 | 0.050562 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002387 | 0.201905 | 525 | 35 | 62 | 15 | 0.847255 | 0.009524 | 0 | 0.1 | 0 | 0 | 0.200772 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.1 | 0.05 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
4994b9856023b95cccc4144927c2909950d9bad5 | 383 | gyp | Python | binding.gyp | mceSystems/node-windows-pac-resolver | a1eaaa6b74d4e82218e6d975582aab121e12da6f | [
"MIT"
] | 1 | 2021-11-14T01:26:45.000Z | 2021-11-14T01:26:45.000Z | binding.gyp | mceSystems/node-windows-pac-resolver | a1eaaa6b74d4e82218e6d975582aab121e12da6f | [
"MIT"
] | 1 | 2021-08-31T21:38:42.000Z | 2021-08-31T21:38:42.000Z | binding.gyp | mceSystems/node-windows-pac-resolver | a1eaaa6b74d4e82218e6d975582aab121e12da6f | [
"MIT"
] | 1 | 2021-11-14T01:26:12.000Z | 2021-11-14T01:26:12.000Z | {
"targets": [
{
"target_name": "binding",
"sources": [
"native\\winhttpBindings.cpp"
],
"include_dirs": [
"<!@(node -p \"require('node-addon-api').include\")"
],
"libraries": [
"WinHTTP.lib",
"-DelayLoad:node.exe"
],
"msbuild_settings": {
"ClCompile": {
"RuntimeLibrary": "MultiThreaded"
}
}
}
]
}
| 16.652174 | 68 | 0.48564 | 28 | 383 | 6.535714 | 0.892857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.302872 | 383 | 22 | 69 | 17.409091 | 0.685393 | 0 | 0 | 0.136364 | 0 | 0 | 0.496084 | 0.070496 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
499a41cfbffd9bf9473869aaf693707dd595ba03 | 6,671 | py | Python | tests/test_formDef.py | swhume/odmlib | 597f71c60f4c6bd8639c92e9fc0ae71b8a5416a7 | [
"MIT"
] | 9 | 2021-09-15T12:26:30.000Z | 2022-03-30T10:14:14.000Z | tests/test_formDef.py | swhume/odmlib | 597f71c60f4c6bd8639c92e9fc0ae71b8a5416a7 | [
"MIT"
] | 1 | 2021-09-28T09:05:01.000Z | 2021-09-28T09:05:01.000Z | tests/test_formDef.py | swhume/odmlib | 597f71c60f4c6bd8639c92e9fc0ae71b8a5416a7 | [
"MIT"
] | 1 | 2021-09-29T04:50:23.000Z | 2021-09-29T04:50:23.000Z | from unittest import TestCase
import json
import odmlib.odm_1_3_2.model as ODM
class TestFormDef(TestCase):
def setUp(self) -> None:
attrs = self.set_formdef_attributes()
self.formdef = ODM.FormDef(**attrs)
def test_add_description(self):
tt1 = ODM.TranslatedText(_content="this is the first test description", lang="en")
tt2 = ODM.TranslatedText(_content="this is the second test description", lang="en")
self.formdef.Description = ODM.Description()
self.formdef.Description.TranslatedText = [tt1, tt2]
self.assertEqual(len(self.formdef.Description.TranslatedText), 2)
self.assertEqual(self.formdef.Description.TranslatedText[1]._content, 'this is the second test description')
def test_add_item_group_ref(self):
igr = ODM.ItemGroupRef(ItemGroupOID="ODM.IG.COMMON", Mandatory="Yes", OrderNumber=1)
self.formdef.ItemGroupRef.append(igr)
igr = ODM.ItemGroupRef(ItemGroupOID="ODM.IG.VS_GENERAL", Mandatory="Yes", OrderNumber=2)
self.formdef.ItemGroupRef.append(igr)
igr = ODM.ItemGroupRef(ItemGroupOID="ODM.IG.VS", Mandatory="Yes", OrderNumber=3)
self.formdef.ItemGroupRef.append(igr)
self.assertEqual(self.formdef.ItemGroupRef[0].ItemGroupOID, "ODM.IG.COMMON")
self.assertEqual(self.formdef.ItemGroupRef[2].OrderNumber, 3)
def test_append_item_group_ref(self):
fd = ODM.FormDef(OID="ODM.F.VS", Name="Vital Signs Form", Repeating="Yes")
fd.Description = ODM.Description()
fd.Description.TranslatedText.append(ODM.TranslatedText(_content="this is the first test description", lang="en"))
fd.ItemGroupRef.append(ODM.ItemGroupRef(ItemGroupOID="ODM.IG.COMMON", Mandatory="Yes", OrderNumber=1))
fd.ItemGroupRef.append(ODM.ItemGroupRef(ItemGroupOID="ODM.IG.VS_GENERAL", Mandatory="Yes", OrderNumber=2))
fd.ItemGroupRef.append(ODM.ItemGroupRef(ItemGroupOID="ODM.IG.VS", Mandatory="Yes", OrderNumber=3))
self.assertEqual(fd.ItemGroupRef[0].ItemGroupOID, "ODM.IG.COMMON")
self.assertEqual(fd.ItemGroupRef[2].OrderNumber, 3)
self.assertEqual(fd.Description.TranslatedText[0]._content, "this is the first test description")
def test_add_alias(self):
self.formdef.Alias.append(ODM.Alias(Context="SDTMIG", Name="VS"))
self.formdef.Alias.append(ODM.Alias(Context="CDASHIG", Name="VS"))
self.assertEqual(len(self.formdef.Alias), 2)
self.assertEqual(self.formdef.Alias[1].Context, "CDASHIG")
def test_add_not_alias(self):
item = ODM.ItemDef(OID="ODM.IT.VSPOS", Name="VS Position", DataType="text")
with self.assertRaises(TypeError):
self.formdef.Alias = [item]
self.formdef.Alias.append(ODM.Alias(Context="SDTMIG", Name="VS"))
# list accepts invalid objects
self.formdef.Alias.append(ODM.ItemDef(OID="ODM.IT.VSDT", Name="VS Date", DataType="text"))
self.assertEqual(len(self.formdef.Alias), 2)
self.assertEqual(self.formdef.Alias[0].Context, "SDTMIG")
def test_to_json(self):
attrs = self.set_formdef_attributes()
fd = ODM.FormDef(**attrs)
tt = ODM.TranslatedText(_content="this is the first test description", lang="en")
fd.Description = ODM.Description()
fd.Description.TranslatedText = [tt]
fd.ItemGroupRef.append(ODM.ItemGroupRef(ItemGroupOID="ODM.IG.COMMON", Mandatory="Yes", OrderNumber=1))
fd.ItemGroupRef.append(ODM.ItemGroupRef(ItemGroupOID="ODM.IG.VS_GENERAL", Mandatory="Yes", OrderNumber=2))
fd.ItemGroupRef.append(ODM.ItemGroupRef(ItemGroupOID="ODM.IG.VS", Mandatory="Yes", OrderNumber=3))
fd.Alias.append(ODM.Alias(Context="SDTMIG", Name="VS"))
fd_json = fd.to_json()
fd_dict = json.loads(fd_json)
print(fd_dict)
self.assertEqual(fd_dict["OID"], "ODM.F.VS")
self.assertDictEqual(fd_dict, self.expected_dict())
def test_to_dict(self):
attrs = self.set_formdef_attributes()
fd = ODM.FormDef(**attrs)
fd.Description = ODM.Description()
fd.Description.TranslatedText.append(ODM.TranslatedText(_content="this is the first test description", lang="en"))
fd.ItemGroupRef.append(ODM.ItemGroupRef(ItemGroupOID="ODM.IG.COMMON", Mandatory="Yes", OrderNumber=1))
fd.ItemGroupRef.append(ODM.ItemGroupRef(ItemGroupOID="ODM.IG.VS_GENERAL", Mandatory="Yes", OrderNumber=2))
fd.ItemGroupRef.append(ODM.ItemGroupRef(ItemGroupOID="ODM.IG.VS", Mandatory="Yes", OrderNumber=3))
fd.Alias.append(ODM.Alias(Context="SDTMIG", Name="VS"))
fd_dict = fd.to_dict()
self.assertEqual(fd_dict["OID"], "ODM.F.VS")
self.assertDictEqual(fd_dict, self.expected_dict())
def test_to_xml(self):
attrs = self.set_formdef_attributes()
fd = ODM.FormDef(**attrs)
fd.Description = ODM.Description()
fd.Description.TranslatedText.append(ODM.TranslatedText(_content="this is the first test description", lang="en"))
fd.ItemGroupRef.append(ODM.ItemGroupRef(ItemGroupOID="ODM.IG.COMMON", Mandatory="Yes", OrderNumber=1))
fd.ItemGroupRef.append(ODM.ItemGroupRef(ItemGroupOID="ODM.IG.VS_GENERAL", Mandatory="Yes", OrderNumber=2))
fd.ItemGroupRef.append(ODM.ItemGroupRef(ItemGroupOID="ODM.IG.VS", Mandatory="Yes", OrderNumber=3))
fd.Alias.append(ODM.Alias(Context="SDTMIG", Name="VS"))
fd_xml = fd.to_xml()
self.assertEqual(fd_xml.attrib["OID"], "ODM.F.VS")
igr = fd_xml.findall("ItemGroupRef")
self.assertEqual(len(igr), 3)
self.assertEqual(igr[0].attrib, {"ItemGroupOID": "ODM.IG.COMMON", "Mandatory": "Yes", "OrderNumber": "1"})
@staticmethod
def set_formdef_attributes():
"""
set some FormDef element attributes using test data
:return: dictionary with FormDef attribute information
"""
return {"OID": "ODM.F.VS", "Name": "Vital Signs Form", "Repeating": "Yes"}
@staticmethod
def expected_dict():
return {'OID': 'ODM.F.VS', 'Name': 'Vital Signs Form', 'Repeating': 'Yes',
'ItemGroupRef': [{'ItemGroupOID': 'ODM.IG.COMMON', 'Mandatory': 'Yes', 'OrderNumber': 1},
{'ItemGroupOID': 'ODM.IG.VS_GENERAL', 'Mandatory': 'Yes', 'OrderNumber': 2},
{'ItemGroupOID': 'ODM.IG.VS', 'Mandatory': 'Yes', 'OrderNumber': 3}],
'Description': {'TranslatedText':
[{'_content': 'this is the first test description', 'lang': 'en'}]},
'Alias': [{'Context': 'SDTMIG', 'Name': 'VS'}]}
| 57.017094 | 122 | 0.665867 | 806 | 6,671 | 5.42804 | 0.124069 | 0.045257 | 0.0816 | 0.106057 | 0.778286 | 0.708343 | 0.703771 | 0.658286 | 0.6144 | 0.589486 | 0 | 0.007526 | 0.183331 | 6,671 | 116 | 123 | 57.508621 | 0.795521 | 0.020387 | 0 | 0.424242 | 0 | 0 | 0.17269 | 0 | 0 | 0 | 0 | 0 | 0.191919 | 1 | 0.111111 | false | 0 | 0.030303 | 0.010101 | 0.171717 | 0.010101 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
499d165572daf46e08305c7a946da82bbf43582f | 767 | py | Python | broadcasts/managers.py | foolwealth/django-site-broadcasts | f870fbf96cde7ea29fc8179e71ab738d2192628f | [
"MIT"
] | 5 | 2016-08-08T07:31:53.000Z | 2020-01-21T00:10:22.000Z | broadcasts/managers.py | foolwealth/django-site-broadcasts | f870fbf96cde7ea29fc8179e71ab738d2192628f | [
"MIT"
] | 2 | 2015-05-22T00:47:14.000Z | 2018-08-15T19:07:21.000Z | broadcasts/managers.py | bennylope/django-site-broadcasts | 0c7556462e7aa09a48ccce4ca8d0b4827a2ce190 | [
"MIT"
] | 2 | 2015-05-21T23:23:16.000Z | 2018-08-15T17:03:51.000Z | from django.db import models
from django.db.models import Q
from django.utils import timezone


class BroadcastManager(models.Manager):
    """
    Manager class to show only active broadcast messages
    """

    def active(self):
        """Return only active messages"""
        return super(BroadcastManager, self).filter(is_published=True)

    def current(self):
        """Return only current and active messages"""
        return self.active().filter(end_time__gte=timezone.now()).filter(
            Q(Q(start_time__lte=timezone.now()) | Q(start_time=None)))

    def latest(self):
        """Return the broadcast message to display"""
        try:
            return self.current().order_by("end_time")[0]
        except IndexError:
            return None
| 29.5 | 73 | 0.65189 | 95 | 767 | 5.157895 | 0.463158 | 0.061224 | 0.04898 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001709 | 0.237288 | 767 | 25 | 74 | 30.68 | 0.835897 | 0.208605 | 0 | 0 | 0 | 0 | 0.013889 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.214286 | false | 0 | 0.214286 | 0 | 0.785714 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
b8d180754d7fc90d954cb1d916a92cd2b5b1aea1 | 589 | py | Python | dribdat/decorators.py | gonzalocasas/dribdat | f8c326c96e851be199eb9f61daed6c8780e3bc27 | [
"MIT"
] | 21 | 2015-10-25T23:22:04.000Z | 2019-04-01T06:42:54.000Z | dribdat/decorators.py | gonzalocasas/dribdat | f8c326c96e851be199eb9f61daed6c8780e3bc27 | [
"MIT"
] | 108 | 2020-02-11T10:07:53.000Z | 2021-06-19T20:30:03.000Z | dribdat/decorators.py | OpendataCH/dribdat | 90d95a12c782dea7d284a4c454a06481e67c1e37 | [
"MIT"
] | 12 | 2016-09-02T03:12:28.000Z | 2021-06-02T07:58:48.000Z | # -*- coding: utf-8 -*-
from functools import wraps
from flask import abort, jsonify
from flask_login import current_user
def admin_required(f):
    @wraps(f)
    def decorated_function(*args, **kwargs):
        if not current_user.active or not current_user.is_admin:
            abort(403)
        return f(*args, **kwargs)
    return decorated_function


def requires_auth(f):
    @wraps(f)
    def decorated(*args, **kwargs):
        if not current_user.is_allowed:
            return jsonify(flag='fail', msg='Login required')
        return f(*args, **kwargs)
    return decorated
| 25.608696 | 64 | 0.657046 | 78 | 589 | 4.820513 | 0.435897 | 0.117021 | 0.111702 | 0.053191 | 0.409574 | 0.308511 | 0 | 0 | 0 | 0 | 0 | 0.008869 | 0.234295 | 589 | 22 | 65 | 26.772727 | 0.824834 | 0.035654 | 0 | 0.235294 | 0 | 0 | 0.031802 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.235294 | false | 0 | 0.176471 | 0 | 0.705882 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |