hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
fe0ff5fe7b70112082aca5dca0ee41e68d1aef81 | 1,335 | py | Python | Module 2/Chapter06/my_app/auth/models.py | nikitabrazhnik/flask2 | fd7a7e26f00402da09ee2159253cc1958df54814 | [
"MIT"
] | 40 | 2017-02-11T18:28:53.000Z | 2021-10-04T04:52:49.000Z | Module 2/Chapter06/my_app/auth/models.py | nikitabrazhnik/flask2 | fd7a7e26f00402da09ee2159253cc1958df54814 | [
"MIT"
] | 1 | 2018-06-15T13:46:47.000Z | 2018-06-15T13:46:47.000Z | Module 2/Chapter06/my_app/auth/models.py | nikitabrazhnik/flask2 | fd7a7e26f00402da09ee2159253cc1958df54814 | [
"MIT"
] | 40 | 2017-04-11T12:01:22.000Z | 2021-05-30T18:05:27.000Z | from werkzeug.security import generate_password_hash, check_password_hash
from flask_wtf import Form
from wtforms import TextField, PasswordField
from wtforms.validators import InputRequired, EqualTo
from my_app import db
class User(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    username = db.Column(db.String(100))
    pwdhash = db.Column(db.String())

    def __init__(self, username, password):
        self.username = username
        self.pwdhash = generate_password_hash(password)

    def check_password(self, password):
        return check_password_hash(self.pwdhash, password)

    def is_authenticated(self):
        return True

    def is_active(self):
        return True

    def is_anonymous(self):
        return False

    def get_id(self):
        return unicode(self.id)


class RegistrationForm(Form):
    username = TextField('Username', [InputRequired()])
    password = PasswordField(
        'Password', [
            InputRequired(), EqualTo('confirm', message='Passwords must match')
        ]
    )
    confirm = PasswordField('Confirm Password', [InputRequired()])


class LoginForm(Form):
    username = TextField('Username', [InputRequired()])
    password = PasswordField('Password', [InputRequired()])


class OpenIDForm(Form):
    openid = TextField('OpenID', [InputRequired()])
| 26.7 | 79 | 0.691386 | 146 | 1,335 | 6.184932 | 0.363014 | 0.053156 | 0.033223 | 0.035437 | 0.228128 | 0.186047 | 0.186047 | 0.186047 | 0.186047 | 0 | 0 | 0.002817 | 0.202247 | 1,335 | 49 | 80 | 27.244898 | 0.84507 | 0 | 0 | 0.114286 | 1 | 0 | 0.060674 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.171429 | false | 0.314286 | 0.142857 | 0.142857 | 0.828571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 4 |
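The `User` model above delegates password storage to Werkzeug's `generate_password_hash`/`check_password_hash`. As a rough, stdlib-only illustration of the salted-hash pattern those helpers implement (this is a simplified sketch, not Werkzeug's actual algorithm or hash format):

```python
import hashlib
import hmac
import os


def make_pwdhash(password, salt=None):
    # Store the salt alongside the digest so verification can re-derive it.
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt.hex() + "$" + digest.hex()


def check_pwdhash(pwdhash, password):
    salt_hex, digest_hex = pwdhash.split("$")
    salt = bytes.fromhex(salt_hex)
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(candidate.hex(), digest_hex)
```

The key property mirrored here is that the plaintext password is never stored; only a salted, iterated digest is kept in `pwdhash`.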
fe12ad06b35d2d54c70f3169192c6e64ab3f95e8 | 9,096 | py | Python | ensemble_model/main_ensemble_1.py | achm6174/kaggle-physics-tau | 46f2922aa5cefd69b03ca7eaf62fabd24799c53b | [
"MIT"
] | 1 | 2015-10-20T13:47:08.000Z | 2015-10-20T13:47:08.000Z | ensemble_model/main_ensemble_1.py | achm6174/kaggle-physics-tau | 46f2922aa5cefd69b03ca7eaf62fabd24799c53b | [
"MIT"
] | null | null | null | ensemble_model/main_ensemble_1.py | achm6174/kaggle-physics-tau | 46f2922aa5cefd69b03ca7eaf62fabd24799c53b | [
"MIT"
] | null | null | null | """
@author: achm
Final ensemble of strong and weak model, weight based on ../ensemble_weight
"""
import numpy as np
import pandas as pd
import sys
sys.path.append('../input')
import evaluation
import cPickle
import copy
import glob
import os
print("Load test data using pandas")
test = pd.read_csv("../input/test.csv")
################################################################################
# Load weak model submission
print("Load weak model")
ep = 1500
# Two final submission path
search_dir = "../weak_model/output/weak_prediction_ep_%i_a1_0_a2_10" %ep
#search_dir = "../weak_model/output/weak_prediction_ep_%i_a1_5_a2_5" %ep
temp_file = filter(os.path.isfile, glob.glob(search_dir + "*"))
temp_file.sort(key=lambda x: os.path.getmtime(x))
weak_model_sub = pd.read_csv(temp_file[-1])
# Double Check
if weak_model_sub.shape[0] != test.shape[0]:
    print "ERROR!"
    sys.exit(0)
if not all(weak_model_sub['id'] == test['id']):
    print "ERROR!"
    sys.exit(0)
if not all(weak_model_sub['prediction'] <= 1):
    print "ERROR!"
    sys.exit(0)
if not all(weak_model_sub['prediction'] >= 0):
    print "ERROR!"
    sys.exit(0)
################################################################################
# Load Semi-Strong model submission
print("Load Semi-strong model")
semi_strong_model_sub = pd.read_csv("../semi_strong_model/output/semi_strong_ensemble_submission.csv")
# Double Check
if semi_strong_model_sub.shape[0] != test.shape[0]:
    print "ERROR!"
    sys.exit(0)
if not all(semi_strong_model_sub['id'] == test['id']):
    print "ERROR!"
    sys.exit(0)
if not all(semi_strong_model_sub['prediction'] <= 1):
    print "ERROR!"
    sys.exit(0)
if not all(semi_strong_model_sub['prediction'] >= 0):
    print "ERROR!"
    sys.exit(0)
################################################################################
# Load strong model submission
print("Load strong model")
strong_model_sub = pd.read_csv("../strong_model/output/strong_ensemble_submission.csv")
# Double Check
if strong_model_sub.shape[0] != test.shape[0]:
    print "ERROR!"
    sys.exit(0)
if not all(strong_model_sub['id'] == test['id']):
    print "ERROR!"
    sys.exit(0)
if not all(strong_model_sub['prediction'] <= 1):
    print "ERROR!"
    sys.exit(0)
if not all(strong_model_sub['prediction'] >= 0):
    print "ERROR!"
    sys.exit(0)
################################################################################
# Load ensemble weighting
print("Load ensemble weighting")
temp_epoch=300
ensemble = pd.read_csv("../ensemble_weight/output/ensemble_weight_epoch_%i.csv" %temp_epoch)
# Double Check
if ensemble.shape[0] != test.shape[0]:
    print "ERROR!"
    sys.exit(0)
if not all(ensemble['id'] == test['id']):
    print "ERROR!"
    sys.exit(0)
################################################################################
# Setting
option = 7
rho = 0
def sigmoid_norm(x, norm):
    return 1/(1+np.exp(-(x*(2*norm)-norm)))

def sigmoid_shift_slope(x, target, slope):
    return 1/(1+np.exp(-(x-target)*slope))
# Ensemble Signal processing
if option == 0:  # simple weighting
    ensemble['strong_pred'] = np.array(strong_model_sub['prediction'])
    ensemble['semi_strong_pred'] = np.array(semi_strong_model_sub['prediction'])
    ensemble['weak_pred'] = np.array(weak_model_sub['prediction'])
    ensemble['prediction'] = ((1-rho)*ensemble['weight']*ensemble['strong_pred'] +
                              rho*ensemble['weight']*ensemble['semi_strong_pred'] +
                              (1-ensemble['weight'])*ensemble['weak_pred'])
elif option == 1:  # absolute cutoff
    threshold = 0.15
    ensemble['strong_pred'] = np.array(strong_model_sub['prediction'])
    ensemble['semi_strong_pred'] = np.array(semi_strong_model_sub['prediction'])
    ensemble['weak_pred'] = np.array(weak_model_sub['prediction'])
    ensemble['prediction'] = ((1.0*(ensemble['weight']>threshold)) * ensemble['strong_pred'] * (1-rho) +
                              (1.0*(ensemble['weight']>threshold)) * ensemble['semi_strong_pred'] * rho +
                              (1.0*(ensemble['weight']<=threshold)) * ensemble['weak_pred'])
elif option == 2:  # absolute cutoff stacking
    threshold = 0.15
    ensemble['strong_pred'] = np.array(strong_model_sub['prediction'])
    ensemble['semi_strong_pred'] = np.array(semi_strong_model_sub['prediction'])
    ensemble['weak_pred'] = np.array(weak_model_sub['prediction'])
    ensemble['prediction'] = ((1.0*(ensemble['weight']>threshold)) * (ensemble['strong_pred']*(1-rho) + ensemble['semi_strong_pred']*rho + ensemble['weak_pred'])/2. +
                              (1.0*(ensemble['weight']<=threshold)) * ensemble['weak_pred'])
elif option == 3:  # cutoff with weighting
    threshold = 0.15
    ensemble['strong_pred'] = np.array(strong_model_sub['prediction'])
    ensemble['semi_strong_pred'] = np.array(semi_strong_model_sub['prediction'])
    ensemble['weak_pred'] = np.array(weak_model_sub['prediction'])
    ensemble['prediction'] = ((1.0*(ensemble['weight']>threshold)) * (ensemble['weight']*ensemble['strong_pred']*(1-rho) +
                              ensemble['semi_strong_pred']*rho + (1-ensemble['weight'])*ensemble['weak_pred']) +
                              (1.0*(ensemble['weight']<=threshold)) * ensemble['weak_pred'])
elif option == 4:  # sigmoid norm
    s_norm = 10
    ensemble['strong_pred'] = np.array(strong_model_sub['prediction'])
    ensemble['semi_strong_pred'] = np.array(semi_strong_model_sub['prediction'])
    ensemble['weak_pred'] = np.array(weak_model_sub['prediction'])
    ensemble['prediction'] = (sigmoid_norm(ensemble['weight'], norm=s_norm)*(ensemble['strong_pred']*(1-rho) + ensemble['semi_strong_pred']*rho) +
                              (1-sigmoid_norm(ensemble['weight'], norm=s_norm))*ensemble['weak_pred'])
elif option == 5:  # tanh
    ensemble['strong_pred'] = np.array(strong_model_sub['prediction'])
    ensemble['semi_strong_pred'] = np.array(semi_strong_model_sub['prediction'])
    ensemble['weak_pred'] = np.array(weak_model_sub['prediction'])
    ensemble['prediction'] = (np.tanh(ensemble['weight'])*(ensemble['strong_pred']*(1-rho) + ensemble['semi_strong_pred']*rho) +
                              (1-np.tanh(ensemble['weight']))*ensemble['weak_pred'])
elif option == 6:  # cutoff with sigmoid
    threshold = 0.15
    s_norm = 10
    ensemble['strong_pred'] = np.array(strong_model_sub['prediction'])
    ensemble['semi_strong_pred'] = np.array(semi_strong_model_sub['prediction'])
    ensemble['weak_pred'] = np.array(weak_model_sub['prediction'])
    # Note: the original called an undefined sigmoid(); sigmoid_norm() is the
    # function actually defined above.
    ensemble['prediction'] = ((1.0*(ensemble['weight']>threshold)) *
                              (sigmoid_norm(ensemble['weight'], norm=s_norm)*(ensemble['strong_pred']*(1-rho) + ensemble['semi_strong_pred']*rho) +
                              (1-sigmoid_norm(ensemble['weight'], norm=s_norm))*ensemble['weak_pred']) +
                              (1.0*(ensemble['weight']<=threshold)) * ensemble['weak_pred'])
elif option == 7:  # shifted sigmoid with slope control
    target = 0.092
    slope = 50
    ensemble['strong_pred'] = np.array(strong_model_sub['prediction'])
    ensemble['semi_strong_pred'] = np.array(semi_strong_model_sub['prediction'])
    ensemble['weak_pred'] = np.array(weak_model_sub['prediction'])
    ensemble['prediction'] = (sigmoid_shift_slope(ensemble['weight'], target, slope) * (ensemble['strong_pred']*(1-rho) + ensemble['semi_strong_pred']*rho) +
                              (1-sigmoid_shift_slope(ensemble['weight'], target, slope)) * ensemble['weak_pred'])
if not all(ensemble['prediction'] <= 1):
    print "ERROR!"
    sys.exit(0)
if not all(ensemble['prediction'] >= 0):
    print "ERROR!"
    sys.exit(0)
# Write final output
print "Generate submission"
if option == 4:
    with open('./output/ensemblesw_ep_%i_option_%i_norm_%i.csv' %(ep, option, s_norm), 'w') as f:
        f.write('id,prediction\n')
        for ID, p in zip(ensemble['id'], ensemble['prediction']):
            f.write('%s,%.8f\n' % (ID, p))
elif option == 6:
    with open('./output/ensemblesw_ep_%i_option_%i_tshold_%i_norm_%i.csv' %(ep, option, threshold*1000, s_norm), 'w') as f:
        f.write('id,prediction\n')
        for ID, p in zip(ensemble['id'], ensemble['prediction']):
            f.write('%s,%.8f\n' % (ID, p))
elif option == 7:
    with open('./output/ensemblesw_ep_%i_option_%i_ep2_%i_param_%i_%i.csv' %(ep, option, temp_epoch, target*1000, slope), 'w') as f:
        f.write('id,prediction\n')
        for ID, p in zip(ensemble['id'], ensemble['prediction']):
            f.write('%s,%.8f\n' % (ID, p))
else:
    with open('./output/ensemblesw_ep_%i_option_%i_tshold_%i_ep2_%i.csv' %(ep, option, threshold*1000, temp_epoch), 'w') as f:
        f.write('id,prediction\n')
        for ID, p in zip(ensemble['id'], ensemble['prediction']):
            f.write('%s,%.8f\n' % (ID, p))
| 45.029703 | 167 | 0.6073 | 1,175 | 9,096 | 4.488511 | 0.104681 | 0.059158 | 0.102389 | 0.118316 | 0.786879 | 0.759006 | 0.726204 | 0.682215 | 0.636329 | 0.633295 | 0 | 0.018893 | 0.185356 | 9,096 | 201 | 168 | 45.253731 | 0.692848 | 0.051341 | 0 | 0.50625 | 0 | 0 | 0.251012 | 0.055766 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.05 | null | null | 0.1375 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
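The ensemble script above blends strong- and weak-model predictions with sigmoid-shaped weights; with `rho = 0`, the semi-strong term drops out. A minimal, dependency-free sketch of the two shaping functions and the option-7 blend, using scalars in place of pandas columns (the constants `target=0.092` and `slope=50` are taken from the script):

```python
import math


def sigmoid_norm(x, norm):
    # Maps [0, 1] onto a sigmoid centred at 0.5 with steepness 2*norm.
    return 1.0 / (1.0 + math.exp(-(x * (2 * norm) - norm)))


def sigmoid_shift_slope(x, target, slope):
    # Sigmoid centred at `target`, with `slope` controlling the steepness
    # of the handover around that point.
    return 1.0 / (1.0 + math.exp(-(x - target) * slope))


def blend(weight, strong_pred, weak_pred, target=0.092, slope=50):
    # Option 7: smoothly hand over from the weak model to the strong model
    # as the per-row ensemble weight crosses `target`.
    w = sigmoid_shift_slope(weight, target, slope)
    return w * strong_pred + (1 - w) * weak_pred
```

At `weight == target` the two models contribute equally; well above the target the strong model dominates, and well below it the weak model does.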
430a657b957501807ccf91597f8bfedc08ceead7 | 99 | py | Python | Projects/Naver_map_halte_navigator.py | wooooooogi/SAlgorithm | bf76bb721785a52b6abf158077b554b0626ee1f7 | [
"MIT"
] | null | null | null | Projects/Naver_map_halte_navigator.py | wooooooogi/SAlgorithm | bf76bb721785a52b6abf158077b554b0626ee1f7 | [
"MIT"
] | null | null | null | Projects/Naver_map_halte_navigator.py | wooooooogi/SAlgorithm | bf76bb721785a52b6abf158077b554b0626ee1f7 | [
"MIT"
] | null | null | null | # Traveling salesperson problem using the Naver Map application. (The Map app can only use up to 5 haltes/stops.) | 99 | 99 | 0.777778 | 15 | 99 | 5.133333 | 0.933333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012195 | 0.171717 | 99 | 1 | 99 | 99 | 0.926829 | 0.959596 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4
430cf8803539e615bed4fdad3888afda70ae5e8c | 162 | py | Python | src/ppe_match/__init__.py | samorani/MatchingPPE | c684c75afc6671f90644d1a2622a9972f689eadc | [
"MIT"
] | null | null | null | src/ppe_match/__init__.py | samorani/MatchingPPE | c684c75afc6671f90644d1a2622a9972f689eadc | [
"MIT"
] | null | null | null | src/ppe_match/__init__.py | samorani/MatchingPPE | c684c75afc6671f90644d1a2622a9972f689eadc | [
"MIT"
] | null | null | null | from . import testing_framework
from . import strategies
from .testing_framework import TestingFramework
__all__ = [
    'testing_framework',
    'strategies'
]
| 18 | 47 | 0.765432 | 16 | 162 | 7.3125 | 0.4375 | 0.410256 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 162 | 8 | 48 | 20.25 | 0.866667 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.428571 | 0 | 0.428571 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
432fd0ba964910848bc9543840be0cf1c7bacb91 | 455 | py | Python | motionvation/forms/change_password_form.py | v-like-engine/plane-planer-web | 0e1fb546b6b780697e45d259c07a99ac634390c7 | [
"MIT"
] | 1 | 2020-06-26T17:45:06.000Z | 2020-06-26T17:45:06.000Z | motionvation/forms/change_password_form.py | v-like-engine/plane-planer-web | 0e1fb546b6b780697e45d259c07a99ac634390c7 | [
"MIT"
] | null | null | null | motionvation/forms/change_password_form.py | v-like-engine/plane-planer-web | 0e1fb546b6b780697e45d259c07a99ac634390c7 | [
"MIT"
] | 9 | 2020-04-18T11:04:29.000Z | 2020-05-08T08:53:45.000Z | from flask_wtf import FlaskForm
from wtforms import StringField, SubmitField, PasswordField
from wtforms.validators import DataRequired
class ChangePasswordForm(FlaskForm):
    old_password = PasswordField('Old password', validators=[DataRequired()])
    new_password = PasswordField('New password', validators=[DataRequired()])
    new_password_again = PasswordField('New password again', validators=[DataRequired()])
submit = SubmitField('Change') | 45.5 | 89 | 0.789011 | 45 | 455 | 7.866667 | 0.422222 | 0.124294 | 0.169492 | 0.186441 | 0.231638 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114286 | 455 | 10 | 90 | 45.5 | 0.878412 | 0 | 0 | 0 | 0 | 0 | 0.105263 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.625 | 0.375 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 4 |
4333cde494f86894d049863806f7f90f5bb0ae57 | 136 | py | Python | 1. Estrutura Sequencial/Lista 01/programa 08.py | enosteteo/Introducao-a-Programacao-P1 | bd8f90c96fd16c88e7dc7136942fe29a828f86fa | [
"MIT"
] | null | null | null | 1. Estrutura Sequencial/Lista 01/programa 08.py | enosteteo/Introducao-a-Programacao-P1 | bd8f90c96fd16c88e7dc7136942fe29a828f86fa | [
"MIT"
] | null | null | null | 1. Estrutura Sequencial/Lista 01/programa 08.py | enosteteo/Introducao-a-Programacao-P1 | bd8f90c96fd16c88e7dc7136942fe29a828f86fa | [
"MIT"
] | null | null | null | #Programa 08
c = 49
print(c)
a = (c - 7) / 6
print(c, a)
b = c % 5
print(c, a, b)
c = c / a + 2
print(c, a, b)
a = b ** 3
print(c, a, b) | 12.363636 | 15 | 0.470588 | 35 | 136 | 1.828571 | 0.342857 | 0.1875 | 0.546875 | 0.5 | 0.28125 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09375 | 0.294118 | 136 | 11 | 16 | 12.363636 | 0.572917 | 0.080882 | 0 | 0.3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 0 | 0 | 1 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 4 |
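The sequential exercise above can be traced step by step; under Python 3, `/` always yields a float, which determines the final values printed:

```python
c = 49
a = (c - 7) / 6   # 42 / 6 -> 7.0
b = c % 5         # 49 % 5 -> 4
c = c / a + 2     # 49 / 7.0 + 2 -> 9.0
a = b ** 3        # 4 ** 3 -> 64
```

The last `print(c, a, b)` therefore shows `9.0 64 4`.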
4a2a71d4405f02e59da471819cd9a907763a3496 | 346 | py | Python | tests/asp/AllAnswerSets/nontight/example.hamiltonian.6.asp.gringo.test.py | bernardocuteri/wasp | 05c8f961776dbdbf7afbf905ee00fc262eba51ad | [
"Apache-2.0"
] | 19 | 2015-12-03T08:53:45.000Z | 2022-03-31T02:09:43.000Z | tests/asp/AllAnswerSets/nontight/example.hamiltonian.6.asp.gringo.test.py | bernardocuteri/wasp | 05c8f961776dbdbf7afbf905ee00fc262eba51ad | [
"Apache-2.0"
] | 80 | 2017-11-25T07:57:32.000Z | 2018-06-10T19:03:30.000Z | tests/asp/AllAnswerSets/nontight/example.hamiltonian.6.asp.gringo.test.py | bernardocuteri/wasp | 05c8f961776dbdbf7afbf905ee00fc262eba51ad | [
"Apache-2.0"
] | 6 | 2015-01-15T07:51:48.000Z | 2020-06-18T14:47:48.000Z | input = """
1 2 1 1 3
1 3 1 1 2
1 4 1 1 5
1 6 1 1 7
1 5 1 1 4
1 7 1 1 6
1 8 1 0 5
1 9 2 0 8 2
1 8 2 0 9 7
1 1 2 0 5 7
1 1 1 1 8
1 1 1 1 9
0
8 reached(2)
9 reached(4)
3 out_hm(2,4)
4 out_hm(3,2)
6 out_hm(4,2)
2 in_hm(2,4)
5 in_hm(3,2)
7 in_hm(4,2)
0
B+
0
B-
1
0
1
"""
output = """
{in_hm(2,4), in_hm(3,2), out_hm(4,2), reached(2), reached(4)}
"""
| 10.176471 | 61 | 0.546243 | 125 | 346 | 1.44 | 0.136 | 0.144444 | 0.066667 | 0.044444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.402439 | 0.289017 | 346 | 33 | 62 | 10.484848 | 0.329268 | 0 | 0 | 0.060606 | 0 | 0.030303 | 0.910405 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
4a3da80a455b810bf83bb3c556de21cc1904b534 | 179 | py | Python | setup.py | jacopoabbate/watchilst-api-python-client | 1d0b25ab778a34e36612d1629f31d9c07d017b9e | [
"MIT"
] | null | null | null | setup.py | jacopoabbate/watchilst-api-python-client | 1d0b25ab778a34e36612d1629f31d9c07d017b9e | [
"MIT"
] | null | null | null | setup.py | jacopoabbate/watchilst-api-python-client | 1d0b25ab778a34e36612d1629f31d9c07d017b9e | [
"MIT"
] | null | null | null | #!/usr/bin/env python
import setuptools
# this shim allows for editable installs when setup.cfg and pyproject.toml are in use.
if __name__ == "__main__":
    setuptools.setup()
| 22.375 | 86 | 0.743017 | 26 | 179 | 4.807692 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.162011 | 179 | 7 | 87 | 25.571429 | 0.833333 | 0.586592 | 0 | 0 | 0 | 0 | 0.111111 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
4a400f27dabb214dace08cbe711ca989e936e3e9 | 321 | py | Python | stats/config.example.py | harkovs/x4stats | 86a1c3ffb487aa9cd82938e68647aa045c28eb50 | [
"MIT"
] | 1 | 2021-06-07T19:02:06.000Z | 2021-06-07T19:02:06.000Z | stats/config.example.py | harkovs/x4stats | 86a1c3ffb487aa9cd82938e68647aa045c28eb50 | [
"MIT"
] | null | null | null | stats/config.example.py | harkovs/x4stats | 86a1c3ffb487aa9cd82938e68647aa045c28eb50 | [
"MIT"
] | null | null | null | # SAVE_LOCATION can be a specific file or a directory. When the update button on the dashboard is clicked, the program
# will check whether a file in the directory is newer than the current file or if the single file has been updated.
SAVE_LOCATION = r"C:\Users\<your_username>\Documents\Egosoft\X4\<random_number>\save"
| 80.25 | 118 | 0.791277 | 56 | 321 | 4.464286 | 0.714286 | 0.096 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00365 | 0.146417 | 321 | 3 | 119 | 107 | 0.908759 | 0.716511 | 0 | 0 | 0 | 0 | 0.75 | 0.75 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
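The comment above says the update check accepts either a single save file or a directory in which the newest file wins. A minimal stdlib sketch of that "newest file" check (the function name `newest_save` is illustrative, not part of the project):

```python
import glob
import os


def newest_save(location):
    # A single file path is returned as-is; for a directory, return the
    # most recently modified regular file inside it (or None if empty).
    if os.path.isfile(location):
        return location
    candidates = [p for p in glob.glob(os.path.join(location, "*"))
                  if os.path.isfile(p)]
    if not candidates:
        return None
    return max(candidates, key=os.path.getmtime)
```

Comparing `os.path.getmtime(newest_save(SAVE_LOCATION))` against the mtime of the currently loaded file then tells the dashboard whether an update is needed.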
4a59941f3c6365dceecccbedc3dc83c5830f391d | 214 | py | Python | back/ai/models.py | soulgchoi/bonsildoongsil | 6f596ab86339b6b62cc5d76ab11cb1b0e343b5f3 | [
"MIT"
] | 1 | 2020-08-18T01:10:17.000Z | 2020-08-18T01:10:17.000Z | back/ai/models.py | soulgchoi/bonsildoongsil | 6f596ab86339b6b62cc5d76ab11cb1b0e343b5f3 | [
"MIT"
] | 14 | 2021-03-19T08:51:18.000Z | 2022-02-27T07:29:26.000Z | back/ai/models.py | sigk218/BSDS | 2976a9da57609b47e3b8ce06d2c05d1e2eae5f71 | [
"MIT"
] | 1 | 2020-08-18T01:10:18.000Z | 2020-08-18T01:10:18.000Z | from django.db import models
class Color(models.Model):
color = models.CharField(max_length=20, unique=True)
class Category(models.Model):
category = models.CharField(max_length=20, unique=True)
| 12.588235 | 59 | 0.728972 | 29 | 214 | 5.310345 | 0.517241 | 0.142857 | 0.233766 | 0.311688 | 0.467532 | 0.467532 | 0.467532 | 0 | 0 | 0 | 0 | 0.022346 | 0.163551 | 214 | 16 | 60 | 13.375 | 0.837989 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 4 |
4a7d716c8d9154f4413c1ffbca9ddb80e138c12a | 150 | py | Python | modules/invite.py | xfnw/kim | 847d5ae7e9a842c724c928c629016e724dc08620 | [
"MIT"
] | 1 | 2021-07-20T08:16:13.000Z | 2021-07-20T08:16:13.000Z | modules/invite.py | xfnw/kim | 847d5ae7e9a842c724c928c629016e724dc08620 | [
"MIT"
] | 2 | 2021-08-01T00:52:57.000Z | 2022-03-24T23:52:33.000Z | modules/invite.py | xfnw/kim | 847d5ae7e9a842c724c928c629016e724dc08620 | [
"MIT"
] | 2 | 2021-07-20T08:16:17.000Z | 2021-09-26T03:22:48.000Z |
from bot import *
@listener('INVITE')
async def on_invite(self,line):
self.send(build("JOIN",[line.params[1]]))
async def init(self):
pass
| 15 | 45 | 0.666667 | 23 | 150 | 4.304348 | 0.73913 | 0.161616 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007937 | 0.16 | 150 | 9 | 46 | 16.666667 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0.067114 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.166667 | 0.166667 | 0 | 0.166667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 4 |
4abb2b569afde4d5e19fbe7e245cc01998de21be | 285 | py | Python | ncov/parser/__init__.py | fmaguire/ncov-parser | bf3ba6e609f04e0a33c3779973bc4c9066e498b0 | [
"MIT"
] | 1 | 2021-03-17T16:12:18.000Z | 2021-03-17T16:12:18.000Z | ncov/parser/__init__.py | fmaguire/ncov-parser | bf3ba6e609f04e0a33c3779973bc4c9066e498b0 | [
"MIT"
] | null | null | null | ncov/parser/__init__.py | fmaguire/ncov-parser | bf3ba6e609f04e0a33c3779973bc4c9066e498b0 | [
"MIT"
] | 2 | 2021-02-18T18:07:57.000Z | 2021-04-05T22:56:38.000Z | from .qc import *
from .Meta import *
from .Consensus import *
from .Alleles import *
from .PerBaseCoverage import *
from .Variants import *
from .Vcf import *
from .NegativeControl import *
from .primers import *
from .Lineage import *
from .Snpeff import *
from .WatchList import *
| 20.357143 | 30 | 0.74386 | 36 | 285 | 5.888889 | 0.388889 | 0.518868 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.17193 | 285 | 13 | 31 | 21.923077 | 0.898305 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
4ac797afc1ca02224eecd8a5c246f287a0d66554 | 145 | py | Python | thundera.py | maxhbr/thunderabsa-cli | c0a426720cc225fd742921a7bbd4b5307860e17d | [
"Apache-2.0"
] | 5 | 2021-08-16T23:26:11.000Z | 2022-01-03T16:11:02.000Z | thundera.py | maxhbr/thunderabsa-cli | c0a426720cc225fd742921a7bbd4b5307860e17d | [
"Apache-2.0"
] | 11 | 2021-08-21T05:12:29.000Z | 2021-12-17T07:22:15.000Z | thundera.py | oscarvalenzuelab/thunderabsa-cli | 3eff9e2582e5f634beb06a73b13cacba913966c3 | [
"Apache-2.0"
] | 2 | 2021-08-16T22:15:05.000Z | 2022-01-02T10:11:04.000Z | #!/usr/bin/env python3
import os
from thundera import thundera
os.environ['LANG'] = 'C.UTF-8'
os.environ['LC_ALL'] = 'C.UTF-8'
thundera.cli()
| 14.5 | 32 | 0.682759 | 25 | 145 | 3.92 | 0.64 | 0.183673 | 0.102041 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023622 | 0.124138 | 145 | 9 | 33 | 16.111111 | 0.748032 | 0.144828 | 0 | 0 | 0 | 0 | 0.195122 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
4ac941aeb682deb1d10afc01b0cbb280275fda76 | 1,294 | py | Python | test/test_ast.py | gulan/jsdtools | 1707f7c1571dcde6eac456caadb625f691a16bba | [
"0BSD"
] | null | null | null | test/test_ast.py | gulan/jsdtools | 1707f7c1571dcde6eac456caadb625f691a16bba | [
"0BSD"
] | 4 | 2018-09-04T14:40:24.000Z | 2018-09-04T19:36:27.000Z | test/test_ast.py | gulan/jsdtools | 1707f7c1571dcde6eac456caadb625f691a16bba | [
"0BSD"
] | null | null | null | #!python
from jsdtools.ast import (Lit, Seq, Alt, Rep)
# def test_repr_lit():
# assert repr(Lit('alpha')) == '(lit alpha)'
# def test_repr_seq():
# s = Seq('seq1')
# s.add_child(Lit('alpha'))
# s.add_child(Lit('beta'))
# s.add_child(Lit('gamma'))
# assert repr(s) == '(seq seq1 [(lit alpha) (lit beta) (lit gamma)])'
# def test_repr_alt():
# s = Alt('alt1')
# s.add_child(Lit('alpha'))
# s.add_child(Lit('beta'))
# s.add_child(Lit('gamma'))
# assert repr(s) == '(alt alt1 [(lit alpha) (lit beta) (lit gamma)])'
# def test_repr_rep():
# s = Rep('repX')
# s.add_child(Lit('alpha'))
# assert repr(s) == '(rep repX (lit alpha))'
# def test_eq():
# s = Lit('s')
# t = Lit('s')
# u = Lit('u')
# assert s == t
# assert id(s) != id(t)
# assert s != u
# ss0 = Seq('ss')
# [ss0.add_child(Lit(n)) for n in "abcdef"]
# ss1 = Seq('ss')
# [ss1.add_child(Lit(n)) for n in "abcdef"]
# assert ss0 == ss1
# aa0 = Alt('aa')
# aa0.add_child(ss0)
# aa0.add_child(s)
# aa1 = Alt('aa')
# aa1.add_child(ss1)
# aa1.add_child(t)
# assert aa0 == aa1
# rr0 = Rep('rr')
# rr0.add_child(aa0)
# rr1 = Rep('rr')
# rr1.add_child(aa1)
# assert rr0 == rr1
| 23.962963 | 73 | 0.513138 | 196 | 1,294 | 3.265306 | 0.19898 | 0.1875 | 0.154688 | 0.13125 | 0.398438 | 0.371875 | 0.371875 | 0.371875 | 0.296875 | 0.190625 | 0 | 0.029661 | 0.270479 | 1,294 | 53 | 74 | 24.415094 | 0.648305 | 0.878671 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
435928e4b6dd7e694fef39e0c053e8f7a9dd350b | 27,561 | py | Python | geosoft/gxapi/GXMATH.py | fearaschiarrai/gxpy | 4c5e7594b24e530a8cd94df1eef562c5c6ce3e92 | [
"BSD-2-Clause"
] | 25 | 2017-07-14T06:39:37.000Z | 2022-03-09T21:39:51.000Z | geosoft/gxapi/GXMATH.py | fearaschiarrai/gxpy | 4c5e7594b24e530a8cd94df1eef562c5c6ce3e92 | [
"BSD-2-Clause"
] | 100 | 2016-12-13T17:30:41.000Z | 2021-08-01T20:21:13.000Z | geosoft/gxapi/GXMATH.py | fearaschiarrai/gxpy | 4c5e7594b24e530a8cd94df1eef562c5c6ce3e92 | [
"BSD-2-Clause"
] | 28 | 2016-12-12T17:34:40.000Z | 2022-03-16T15:39:39.000Z | ### extends 'class_empty.py'
### block ClassImports
# NOTICE: Do not edit anything here, it is generated code
from . import gxapi_cy
from geosoft.gxapi import GXContext, float_ref, int_ref, str_ref
### endblock ClassImports
### block Header
# NOTICE: The code generator will not replace the code in this block
### endblock Header
### block ClassImplementation
# NOTICE: Do not edit anything here, it is generated code
class GXMATH(gxapi_cy.WrapMATH):
"""
GXMATH class.
This is not a class. This is a collection of standard
mathematical functions, including most of the common
logarithmic and geometric functions.
"""
def __init__(self, handle=0):
super(GXMATH, self).__init__(GXContext._get_tls_geo(), handle)
@classmethod
def null(cls):
"""
A null (undefined) instance of `GXMATH <geosoft.gxapi.GXMATH>`
:returns: A null `GXMATH <geosoft.gxapi.GXMATH>`
:rtype: GXMATH
"""
return GXMATH()
def is_null(self):
"""
Check if this is a null (undefined) instance
:returns: True if this is a null (undefined) instance, False otherwise.
:rtype: bool
"""
return self._internal_handle() == 0
# Miscellaneous
@classmethod
def cross_product_(cls, x1, y1, z1, x2, y2, z2, x3, y3, z3):
"""
Cross product of two vectors.
:param x1: X1 component
:param y1: Y1 component
:param z1: Z1 component
:param x2: X2 component
:param y2: Y2 component
:param z2: Z2 component
:param x3: X3 component (output)
:param y3: Y3 component (output)
:param z3: Z3 component (output)
:type x1: float
:type y1: float
:type z1: float
:type x2: float
:type y2: float
:type z2: float
:type x3: float_ref
:type y3: float_ref
:type z3: float_ref
.. versionadded:: 6.0
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
"""
x3.value, y3.value, z3.value = gxapi_cy.WrapMATH._cross_product_(GXContext._get_tls_geo(), x1, y1, z1, x2, y2, z2, x3.value, y3.value, z3.value)
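For reference, the cross product this wrapper delegates to the compiled gxapi library can be sketched in plain Python (an independent illustration, not the library's implementation):

```python
def cross_product(x1, y1, z1, x2, y2, z2):
    """Cross product (x1, y1, z1) x (x2, y2, z2), returned as a 3-tuple."""
    return (y1 * z2 - z1 * y2,
            z1 * x2 - x1 * z2,
            x1 * y2 - y1 * x2)

# The result is perpendicular to both inputs; e.g. x cross y = z:
print(cross_product(1, 0, 0, 0, 1, 0))  # (0, 0, 1)
```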
@classmethod
def abs_int_(cls, n):
"""
Calculate absolute value
:param n: Integer
:type n: int
:returns: Integer
:rtype: int
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** Dummy values return dummy
"""
ret_val = gxapi_cy.WrapMATH._abs_int_(GXContext._get_tls_geo(), n)
return ret_val
@classmethod
def and_(cls, pi_val1, pi_val2):
"""
Return the binary operation result of A & B
Returns an integer number
If A or B is a dummy, returns dummy.
:param pi_val1: A
:param pi_val2: B
:type pi_val1: int
:type pi_val2: int
:rtype: int
.. versionadded:: 6.3
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
"""
ret_val = gxapi_cy.WrapMATH._and_(GXContext._get_tls_geo(), pi_val1, pi_val2)
return ret_val
@classmethod
def mod_int_(cls, a, b):
"""
Calculates the modulus of two integers
:param a: A
:param b: B (must not be zero)
:type a: int
:type b: int
:returns: Int
:rtype: int
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** If A or B is a dummy, returns dummy.
"""
ret_val = gxapi_cy.WrapMATH._mod_int_(GXContext._get_tls_geo(), a, b)
return ret_val
@classmethod
def or_(cls, pi_val1, pi_val2):
"""
Return the binary operation result of A | B
Returns an integer number
If A or B is a dummy, returns dummy.
:param pi_val1: A
:param pi_val2: B
:type pi_val1: int
:type pi_val2: int
:rtype: int
.. versionadded:: 6.3
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
"""
ret_val = gxapi_cy.WrapMATH._or_(GXContext._get_tls_geo(), pi_val1, pi_val2)
return ret_val
@classmethod
def round_int_(cls, z):
"""
Round to the nearest whole number
:param z: Round
:type z: float
:returns: Integer
:rtype: int
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** Negative values with decimal parts of .5 or larger round down (-1.5 -> -2.0)
Positive values with decimal parts of .5 or larger round up (1.5 -> 2.0)
Dummy values return dummy
"""
ret_val = gxapi_cy.WrapMATH._round_int_(GXContext._get_tls_geo(), z)
return ret_val
@classmethod
def xor_(cls, pi_val1, pi_val2):
"""
Return the binary operation result of A ^ B
Returns an integer number
If A or B is a dummy, returns dummy.
:param pi_val1: A
:param pi_val2: B
:type pi_val1: int
:type pi_val2: int
:rtype: int
.. versionadded:: 6.3
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
"""
ret_val = gxapi_cy.WrapMATH._xor_(GXContext._get_tls_geo(), pi_val1, pi_val2)
return ret_val
@classmethod
def nicer_log_scale_(cls, min, max, fine):
"""
Finds nicer min, max values for logarithmic plot scales.
:param min: Min value (changed)
:param max: Max value (changed)
:param fine: Fine flag
:type min: float_ref
:type max: float_ref
:type fine: int
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** Will fail if the input upper bound is less than the lower
bound, but will work if the two values are equal.
The input bounds are overwritten.
Takes lower and upper bounds as input and returns "nicer" values.
If the Fine flag is set to TRUE, the values will have the
form N x 10^Y, where N is a value from 1 to 9, and 10^Y
is an integral power of 10. If the Fine flag is set to
FALSE, the scaling is coarse, and the bounding exact
powers of 10 are returned.
For example, the values (.034, 23) return (.03, 30) for
fine scaling, and (0.01, 100) for coarse scaling.
"""
min.value, max.value = gxapi_cy.WrapMATH._nicer_log_scale_(GXContext._get_tls_geo(), min.value, max.value, fine)
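The fine/coarse behavior described in the note can be reproduced with a short sketch (my own reconstruction from the docstring, not the library code):

```python
import math

def nicer_log_scale(lo, hi, fine):
    """Round (lo, hi) outward to nicer log-scale bounds, as described above."""
    if fine:
        # N x 10^Y form: round each mantissa outward to a single digit.
        lo_exp = math.floor(math.log10(lo))
        hi_exp = math.floor(math.log10(hi))
        return (math.floor(lo / 10 ** lo_exp) * 10 ** lo_exp,
                math.ceil(hi / 10 ** hi_exp) * 10 ** hi_exp)
    # Coarse: the bounding exact powers of 10.
    return 10 ** math.floor(math.log10(lo)), 10 ** math.ceil(math.log10(hi))

# The documented example: (.034, 23) -> (0.03, 30) fine, (0.01, 100) coarse.
```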
@classmethod
def nicer_scale_(cls, min, max, inc, pow):
"""
Compute a nicer scale for a given min and max.
:param min: Min value (changed)
:param max: Max value (changed)
:param inc: Inc value (returned)
:param pow: Power value (returned)
:type min: float_ref
:type max: float_ref
:type inc: float_ref
:type pow: int_ref
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
"""
min.value, max.value, inc.value, pow.value = gxapi_cy.WrapMATH._nicer_scale_(GXContext._get_tls_geo(), min.value, max.value, inc.value, pow.value)
@classmethod
def normalise_3d_(cls, x, y, z):
"""
Scale a vector to unit length.
:param x: X component (altered)
:param y: Y component (altered)
:param z: Z component (altered)
:type x: float_ref
:type y: float_ref
:type z: float_ref
.. versionadded:: 6.0
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** Divides each component by the vector
magnitude.
"""
x.value, y.value, z.value = gxapi_cy.WrapMATH._normalise_3d_(GXContext._get_tls_geo(), x.value, y.value, z.value)
@classmethod
def abs_double_(cls, z):
"""
Calculate absolute value
:param z: Real
:type z: float
:returns: Real
:rtype: float
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** Dummy values return dummy
"""
ret_val = gxapi_cy.WrapMATH._abs_double_(GXContext._get_tls_geo(), z)
return ret_val
@classmethod
def arc_cos_(cls, val):
"""
Calculate the arccosine
:param val: Real
:type val: float
:returns: Real
:rtype: float
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** Dummy values or values < -1 or > 1 return dummy
"""
ret_val = gxapi_cy.WrapMATH._arc_cos_(GXContext._get_tls_geo(), val)
return ret_val
@classmethod
def arc_sin_(cls, val):
"""
Calculate the arcsin
:param val: Real
:type val: float
:returns: Real
:rtype: float
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** Dummy values or values < -1 or > 1 return dummy
"""
ret_val = gxapi_cy.WrapMATH._arc_sin_(GXContext._get_tls_geo(), val)
return ret_val
@classmethod
def arc_tan_(cls, val):
"""
Calculate the arctan
:param val: Real
:type val: float
:returns: Real
:rtype: float
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** Dummy values return dummy
"""
ret_val = gxapi_cy.WrapMATH._arc_tan_(GXContext._get_tls_geo(), val)
return ret_val
@classmethod
def arc_tan2_(cls, y, x):
"""
Calculate ArcTan(Y/X)
:param y: Y
:param x: X
:type y: float
:type x: float
:returns: Real
:rtype: float
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** If either X or Y is a dummy, returns dummy
"""
ret_val = gxapi_cy.WrapMATH._arc_tan2_(GXContext._get_tls_geo(), y, x)
return ret_val
@classmethod
def ceil_(cls, z):
"""
Calculates the ceiling of the value
:param z: Real
:type z: float
:returns: Real
:rtype: float
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** Dummy values return dummy
"""
ret_val = gxapi_cy.WrapMATH._ceil_(GXContext._get_tls_geo(), z)
return ret_val
@classmethod
def cos_(cls, val):
"""
Calculate the cosine
:param val: Angle in radians
:type val: float
:returns: Real
:rtype: float
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** Dummy values return dummy
"""
ret_val = gxapi_cy.WrapMATH._cos_(GXContext._get_tls_geo(), val)
return ret_val
@classmethod
def dot_product_3d_(cls, x1, y1, z1, x2, y2, z2):
"""
Compute Dot product of two vectors.
:param x1: X1 component
:param y1: Y1 component
:param z1: Z1 component
:param x2: X2 component
:param y2: Y2 component
:param z2: Z2 component
:type x1: float
:type y1: float
:type z1: float
:type x2: float
:type y2: float
:type z2: float
:returns: Dot product
:rtype: float
.. versionadded:: 6.0
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
"""
ret_val = gxapi_cy.WrapMATH._dot_product_3d_(GXContext._get_tls_geo(), x1, y1, z1, x2, y2, z2)
return ret_val
@classmethod
def exp_(cls, val):
"""
Calculate e raised to the power of X
:param val: X
:type val: float
:returns: Real
:rtype: float
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** Dummy values return dummy
"""
ret_val = gxapi_cy.WrapMATH._exp_(GXContext._get_tls_geo(), val)
return ret_val
@classmethod
def floor_(cls, z):
"""
Calculates the floor of the value
:param z: Real
:type z: float
:returns: Real
:rtype: float
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** Dummy values return dummy
"""
ret_val = gxapi_cy.WrapMATH._floor_(GXContext._get_tls_geo(), z)
return ret_val
@classmethod
def hypot_(cls, x, y):
"""
sqrt(X*X + Y*Y)
:param x: X
:param y: Y
:type x: float
:type y: float
:returns: Real
:rtype: float
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** If either X or Y is a dummy, the returned value is dummy
"""
ret_val = gxapi_cy.WrapMATH._hypot_(GXContext._get_tls_geo(), x, y)
return ret_val
@classmethod
def lambda_trans_(cls, z, lda):
"""
Performs lambda transform on a value.
:param z: Z Value
:param lda: Lambda value
:type z: float
:type lda: float
:returns: The lambda transformed value
:rtype: float
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** Returns 0 for input Z = 0.
returns log10(Z) for lambda = 0.
returns (Z^lambda - 1)/lambda for Z > 0.
returns dummy for Z = dummy.
.. seealso::
`lambda_trans_rev_ <geosoft.gxapi.GXMATH.lambda_trans_rev_>`
"""
ret_val = gxapi_cy.WrapMATH._lambda_trans_(GXContext._get_tls_geo(), z, lda)
return ret_val
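The piecewise definition in the note can be sketched directly (dummy handling and negative Z are omitted; this illustrates the documented formulas, not the gxapi code):

```python
import math

def lambda_trans(z, lda):
    """Lambda (Box-Cox-style) transform per the docstring above."""
    if z == 0:
        return 0.0
    if lda == 0:
        return math.log10(z)
    return (z ** lda - 1.0) / lda
```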
@classmethod
def lambda_trans_rev_(cls, z, lda):
"""
Performs a reverse lambda transform on a value.
:param z: Lambda transformed Z Value
:param lda: Lambda value
:type z: float
:type lda: float
:returns: The original non-lambda transformed value
:rtype: float
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** See rLambdaTrans.
.. seealso::
`lambda_trans_ <geosoft.gxapi.GXMATH.lambda_trans_>`
"""
ret_val = gxapi_cy.WrapMATH._lambda_trans_rev_(GXContext._get_tls_geo(), z, lda)
return ret_val
@classmethod
def log_(cls, val):
"""
Calculate the natural log
:param val: Real
:type val: float
:returns: Real
:rtype: float
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** Dummy values return dummy
"""
ret_val = gxapi_cy.WrapMATH._log_(GXContext._get_tls_geo(), val)
return ret_val
@classmethod
def log10_(cls, val):
"""
Calculate the base 10 log
:param val: Real
:type val: float
:returns: Real
:rtype: float
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** Dummy values return dummy
"""
ret_val = gxapi_cy.WrapMATH._log10_(GXContext._get_tls_geo(), val)
return ret_val
@classmethod
def log_z_(cls, z, mode, min):
"""
Given a Z value, the log mode, and the log minimum, this
function returns the log value.
:param z: Z Value
:param mode: Log Mode (0 - Log, 1 - LogLinearLog)
:param min: Log Minimum (must be greater than 0)
:type z: float
:type mode: int
:type min: float
:returns: The Log Value.
:rtype: float
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** Mode = 0 (regular log mode) returns:
::
Log10(Z) for Z > minimum
Log10(minimum) for Z <= minimum
Mode = 1 (log / linear / negative log mode) returns:
::
minimum * ( log10( |Z| / minimum) + 1 ) for Z > minimum
Z for |Z| <= minimum (the linear part of the range)
-minimum * ( log10( |Z| / minimum) + 1 ) for Z < -minimum
.. seealso::
`un_log_z_ <geosoft.gxapi.GXMATH.un_log_z_>`
"""
ret_val = gxapi_cy.WrapMATH._log_z_(GXContext._get_tls_geo(), z, mode, min)
return ret_val
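The two modes in the note can be written out as a short sketch (dummy handling omitted; an illustration of the documented formulas, not the library code):

```python
import math

def log_z(z, mode, minimum):
    """Log (mode 0) or log/linear/negative-log (mode 1) transform of z."""
    if mode == 0:
        return math.log10(max(z, minimum))      # clamp at the log minimum
    if abs(z) <= minimum:
        return z                                # linear segment around zero
    scaled = minimum * (math.log10(abs(z) / minimum) + 1.0)
    return scaled if z > 0 else -scaled
```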
@classmethod
def mod_double_(cls, a, b):
"""
Calculates the modulus of two reals (A mod B)
:param a: A
:param b: B (must not be zero)
:type a: float
:type b: float
:returns: Real
:rtype: float
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** The modulus of A with respect to B is defined
as the difference of A with the largest integral multiple of B
smaller than or equal to A.
e.g. A mod B
20 mod 10 = 0
20 mod 9 = 2
If A or B is a dummy, returns dummy.
"""
ret_val = gxapi_cy.WrapMATH._mod_double_(GXContext._get_tls_geo(), a, b)
return ret_val
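The definition in the note (A minus the largest integral multiple of B that does not exceed A) is a floored modulus; a stand-alone sketch:

```python
import math

def mod_double(a, b):
    """Floored modulus: a minus the largest integral multiple of b <= a."""
    return a - b * math.floor(a / b)

# Matches the documented examples: 20 mod 10 = 0 and 20 mod 9 = 2.
# Under the same definition, negative values wrap: -7 mod 3 = 2.
```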
@classmethod
def rotate_vector_(cls, x1, y1, z1, angle, x2, y2, z2, x3, y3, z3):
"""
Rotate a vector about an axis.
:param x1: X1 component (vector to rotate)
:param y1: Y1 component
:param z1: Z1 component
:param angle: Angle to rotate, CW in radians
:param x2: X2 component (axis of rotation)
:param y2: Y2 component
:param z2: Z2 component
:param x3: X3 component (rotated vector; may be the same as the input)
:param y3: Y3 component
:param z3: Z3 component
:type x1: float
:type y1: float
:type z1: float
:type angle: float
:type x2: float
:type y2: float
:type z2: float
:type x3: float_ref
:type y3: float_ref
:type z3: float_ref
.. versionadded:: 6.0
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** Rotates a vector by the input angle around an arbitrary axis.
Angles are measured clockwise looking along the axis (away from the origin).
Assumes a right hand coordinate system.
"""
x3.value, y3.value, z3.value = gxapi_cy.WrapMATH._rotate_vector_(GXContext._get_tls_geo(), x1, y1, z1, angle, x2, y2, z2, x3.value, y3.value, z3.value)
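Rotation about an arbitrary axis is usually written via Rodrigues' formula, sketched below. Note the library measures angles clockwise looking along the axis, while this standard form rotates counterclockwise for positive angles in a right-handed system, so the sign convention may be opposite:

```python
import math

def rotate_vector(v, axis, angle):
    """Rodrigues' rotation of v about a (non-zero) axis by angle radians (CCW)."""
    mag = math.sqrt(sum(c * c for c in axis))
    k = tuple(c / mag for c in axis)                      # unit rotation axis
    cos_a, sin_a = math.cos(angle), math.sin(angle)
    k_dot_v = sum(ki * vi for ki, vi in zip(k, v))
    k_cross_v = (k[1] * v[2] - k[2] * v[1],
                 k[2] * v[0] - k[0] * v[2],
                 k[0] * v[1] - k[1] * v[0])
    return tuple(vi * cos_a + ci * sin_a + ki * k_dot_v * (1.0 - cos_a)
                 for vi, ci, ki in zip(v, k_cross_v, k))

# Rotating the x unit vector 90 degrees about z yields (approximately) y.
rotated = rotate_vector((1.0, 0.0, 0.0), (0.0, 0.0, 1.0), math.pi / 2)
```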
@classmethod
def pow_(cls, x, y):
"""
Calculate X raised to the power of Y
:param x: X
:param y: Y
:type x: float
:type y: float
:returns: Real
:rtype: float
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** If either X or Y is a dummy, returns dummy
"""
ret_val = gxapi_cy.WrapMATH._pow_(GXContext._get_tls_geo(), x, y)
return ret_val
@classmethod
def rand_(cls):
"""
Get a random number between 0 and 1
:returns: A real number
:rtype: float
.. versionadded:: 6.3
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** Use `s_rand_ <geosoft.gxapi.GXMATH.s_rand_>` to seed the random number generator before a series of
calls to this function are made.
The standard "C" function rand() is called.
"""
ret_val = gxapi_cy.WrapMATH._rand_(GXContext._get_tls_geo())
return ret_val
@classmethod
def round_double_(cls, z, n):
"""
Round to n significant digits
:param z: Real
:param n: Number of significant digits to round to
:type z: float
:type n: int
:returns: Real
:rtype: float
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** Negative values ending in 5XXX at n significant digits round down (away from zero)
Positive values ending in 5XXX at n significant digits round up
Dummy values return dummy
"""
ret_val = gxapi_cy.WrapMATH._round_double_(GXContext._get_tls_geo(), z, n)
return ret_val
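One plausible reading of the significant-digit rounding described above, sketched stand-alone (the library's exact tie handling may differ; this is not the gxapi code):

```python
import math

def round_significant(z, n):
    """Round z to n significant digits, with halves rounding away from zero."""
    if z == 0:
        return 0.0
    lead_exp = math.floor(math.log10(abs(z)))     # exponent of the leading digit
    factor = 10.0 ** (lead_exp - (n - 1))
    rounded = math.floor(abs(z) / factor + 0.5)   # half rounds away from zero
    return math.copysign(rounded * factor, z)
```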
@classmethod
def sign_(cls, z_sign, z_val):
"""
Determine return value based on value of Z1
:param z_sign: Z1
:param z_val: Z2
:type z_sign: float
:type z_val: float
:returns: ``|Z2| if Z1 > 0, -|Z2| if Z1 < 0, 0 if Z1 = 0, and Z2 if Z1 = Dummy``
:rtype: float
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** Dummy values return dummy
"""
ret_val = gxapi_cy.WrapMATH._sign_(GXContext._get_tls_geo(), z_sign, z_val)
return ret_val
@classmethod
def sin_(cls, val):
"""
Calculate the sin
:param val: Angle in radians
:type val: float
:returns: Real
:rtype: float
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** Dummy values return dummy
"""
ret_val = gxapi_cy.WrapMATH._sin_(GXContext._get_tls_geo(), val)
return ret_val
@classmethod
def sqrt_(cls, val):
"""
Calculate the square root
:param val: Real
:type val: float
:returns: Real
:rtype: float
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** Dummy values return dummy
"""
ret_val = gxapi_cy.WrapMATH._sqrt_(GXContext._get_tls_geo(), val)
return ret_val
@classmethod
def tan_(cls, val):
"""
Calculate the tangent
:param val: Angle in radians
:type val: float
:returns: Real
:rtype: float
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** Dummy values return dummy
"""
ret_val = gxapi_cy.WrapMATH._tan_(GXContext._get_tls_geo(), val)
return ret_val
@classmethod
def un_log_z_(cls, z, mode, min):
"""
Inverse of rLogZ
:param z: Log value
:param mode: Log Mode (0 - Log, 1 - LogLinearLog)
:param min: Log Minimum (must be greater than 0)
:type z: float
:type mode: int
:type min: float
:returns: The original value
:rtype: float
.. versionadded:: 6.0.1
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** See Notes for rLogZ.
.. seealso::
`log_z_ <geosoft.gxapi.GXMATH.log_z_>`
"""
ret_val = gxapi_cy.WrapMATH._un_log_z_(GXContext._get_tls_geo(), z, mode, min)
return ret_val
@classmethod
def s_rand_(cls):
"""
Seed the random-number generator with current time
.. versionadded:: 6.3
**License:** `Geosoft Open License <https://geosoftgxdev.atlassian.net/wiki/spaces/GD/pages/2359406/License#License-open-lic>`_
**Note:** Use the `rand_ <geosoft.gxapi.GXMATH.rand_>` function to create a random number between 0 and 1.
The standard "C" function srand() is called.
"""
gxapi_cy.WrapMATH._s_rand_(GXContext._get_tls_geo())
### endblock ClassImplementation
### block ClassExtend
# NOTICE: The code generator will not replace the code in this block
### endblock ClassExtend
### block Footer
# NOTICE: The code generator will not replace the code in this block
### endblock Footer | 27.533467 | 159 | 0.573056 | 3,440 | 27,561 | 4.44564 | 0.09593 | 0.024325 | 0.037272 | 0.044726 | 0.75407 | 0.71621 | 0.698097 | 0.672138 | 0.654156 | 0.633035 | 0 | 0.032948 | 0.319437 | 27,561 | 1,001 | 160 | 27.533467 | 0.782375 | 0.602264 | 0 | 0.453947 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.263158 | false | 0 | 0.013158 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
437734c9309ad9fa7431aabce1558f62ec39c6ad | 117 | py | Python | COVID19Map/PublicFun/config/pgsqlconfig.py | xtawfnhdx/COVID19Map | 3e5dbb610d2f569137b1331736cd50e070a080c2 | [
"Apache-2.0"
] | null | null | null | COVID19Map/PublicFun/config/pgsqlconfig.py | xtawfnhdx/COVID19Map | 3e5dbb610d2f569137b1331736cd50e070a080c2 | [
"Apache-2.0"
] | null | null | null | COVID19Map/PublicFun/config/pgsqlconfig.py | xtawfnhdx/COVID19Map | 3e5dbb610d2f569137b1331736cd50e070a080c2 | [
"Apache-2.0"
] | null | null | null | import psycopg2
conn = psycopg2.connect(host="10.100.157.168", user="postgres", password="postgres", database="testdb")
437aa83f1e9c28624ccdd32ee4dfce6e67bae358 | 364 | py | Python | vframe_cli/vframe/utils/misc_utils.py | ngi-nix/vframe | 60469e25203136f9d6a5ecaabe2423695ee9a0f2 | [
"MIT"
] | null | null | null | vframe_cli/vframe/utils/misc_utils.py | ngi-nix/vframe | 60469e25203136f9d6a5ecaabe2423695ee9a0f2 | [
"MIT"
] | null | null | null | vframe_cli/vframe/utils/misc_utils.py | ngi-nix/vframe | 60469e25203136f9d6a5ecaabe2423695ee9a0f2 | [
"MIT"
] | null | null | null | #############################################################################
#
# VFRAME
# MIT License
# Copyright (c) 2019 Adam Harvey and VFRAME
# https://vframe.io
#
#############################################################################
def even(n, ceil=True):
    """Return n if it is even, else the nearest even number (up when ceil, else down)."""
    return n if n % 2 == 0 else (n + 1 if ceil else n - 1)

def odd(n, ceil=True):
    """Return n if it is odd, else the nearest odd number (up when ceil, else down)."""
    return n if n % 2 else (n + 1 if ceil else n - 1)
43881796961dee7be2afe06468b9be207332aa37 | 219 | py | Python | order/urls.py | Amwata-Albert/Fast-Food | 1a6667efb10652852877422616ff80f69c3b066a | [
"MIT"
] | null | null | null | order/urls.py | Amwata-Albert/Fast-Food | 1a6667efb10652852877422616ff80f69c3b066a | [
"MIT"
] | 2 | 2020-11-13T06:51:26.000Z | 2020-11-17T07:22:36.000Z | order/urls.py | Amwata-Albert/Fast-Food | 1a6667efb10652852877422616ff80f69c3b066a | [
"MIT"
] | null | null | null | # from django.urls import path
# from .api import CurrentOrders, CurrentOrdersApi
# urlpatterns = [
# path('api/order', CurrentOrdersApi.as_view()),
# path('api/order/<int:pk>', CurrentOrdersApi.as_view()),
# ] | 31.285714 | 61 | 0.694064 | 25 | 219 | 6 | 0.56 | 0.093333 | 0.16 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.141553 | 219 | 7 | 62 | 31.285714 | 0.797872 | 0.940639 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
43be9a7500733dca7fbc0856013a261c72cafd5f | 202 | py | Python | src/u012_input_laukums.py | armandspucs/python-testi-1 | d23a483a08640226bf3f0933dd72d9a354ffe695 | [
"MIT"
] | null | null | null | src/u012_input_laukums.py | armandspucs/python-testi-1 | d23a483a08640226bf3f0933dd72d9a354ffe695 | [
"MIT"
] | null | null | null | src/u012_input_laukums.py | armandspucs/python-testi-1 | d23a483a08640226bf3f0933dd72d9a354ffe695 | [
"MIT"
] | null | null | null | """
Write a program that reads two numbers (as float) -
the height and the base of a triangle -
and prints the triangle's area to the screen (print).
Test the program's behavior with various input data.
"""
| 28.857143 | 60 | 0.782178 | 25 | 202 | 6.32 | 0.92 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148515 | 202 | 6 | 61 | 33.666667 | 0.918605 | 0.950495 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
43d717d8e28bb30a38dba8d7c7ab1ca5563d82f8 | 329 | py | Python | api/namex/services/payment/models/abstract/serializable.py | sumesh-aot/namex | 53e11aed5ea550b71b7b983f1b57b65db5a06766 | [
"Apache-2.0"
] | 4 | 2018-10-05T23:41:05.000Z | 2019-06-19T16:17:50.000Z | api/namex/services/payment/models/abstract/serializable.py | sumesh-aot/namex | 53e11aed5ea550b71b7b983f1b57b65db5a06766 | [
"Apache-2.0"
] | 635 | 2018-05-31T04:12:46.000Z | 2022-03-31T18:45:42.000Z | api/namex/services/payment/models/abstract/serializable.py | sumesh-aot/namex | 53e11aed5ea550b71b7b983f1b57b65db5a06766 | [
"Apache-2.0"
] | 71 | 2018-05-14T20:47:55.000Z | 2022-03-31T23:08:30.000Z | import jsonpickle
class Serializable(dict):
def as_dict(self):
return self.__dict__
def to_json_test(self):
# Allows us to unwrap the response when we're running pytests
return jsonpickle.encode(self)
def to_json(self):
return jsonpickle.encode(self, unpicklable=False, warn=True)
| 23.5 | 69 | 0.68997 | 44 | 329 | 4.977273 | 0.613636 | 0.063927 | 0.082192 | 0.237443 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.234043 | 329 | 13 | 70 | 25.307692 | 0.869048 | 0.179331 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.375 | false | 0 | 0.125 | 0.375 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 4 |
78e26433f28a188c1a876f7198bcba676508ecfa | 336 | py | Python | src/test.py | decafjoe/clik-shell | ccacf02a49efc53cf063e56484aaf53e95ecc368 | [
"BSD-3-Clause"
] | null | null | null | src/test.py | decafjoe/clik-shell | ccacf02a49efc53cf063e56484aaf53e95ecc368 | [
"BSD-3-Clause"
] | null | null | null | src/test.py | decafjoe/clik-shell | ccacf02a49efc53cf063e56484aaf53e95ecc368 | [
"BSD-3-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Tests for :mod:`clik_shell`.
:author: Joe Joyce <joe@decafjoe.com>
:copyright: Copyright (c) Joe Joyce and contributors, 2017-2019.
:license: BSD
"""
import clik_shell # noqa: F401 (required for coverage to generate report)
def test():
"""Assert that true is in fact true."""
assert True is True
| 22.4 | 74 | 0.675595 | 49 | 336 | 4.591837 | 0.755102 | 0.08 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.043478 | 0.178571 | 336 | 14 | 75 | 24 | 0.771739 | 0.764881 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
78e75d2fb467e6ec2a72bb4b9a79d05ee11a384b | 39 | py | Python | cvat/settings/staging.py | syonekura/cvat | c579ba2319f967f09ab2bf080e357b2bde7180dd | [
"MIT"
] | 1 | 2022-01-18T16:08:34.000Z | 2022-01-18T16:08:34.000Z | cvat/settings/staging.py | syonekura/cvat | c579ba2319f967f09ab2bf080e357b2bde7180dd | [
"MIT"
] | null | null | null | cvat/settings/staging.py | syonekura/cvat | c579ba2319f967f09ab2bf080e357b2bde7180dd | [
"MIT"
] | null | null | null | from .production import *
DEBUG = True | 13 | 25 | 0.74359 | 5 | 39 | 5.8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.179487 | 39 | 3 | 26 | 13 | 0.90625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
78ff4295f87a0b99286385582b5b22bfd8168f15 | 126 | py | Python | account/tokens.py | saarikabhasi/postIT | 8a877eea5df0a44e9b6a1868776172fff78a6d47 | [
"MIT"
] | null | null | null | account/tokens.py | saarikabhasi/postIT | 8a877eea5df0a44e9b6a1868776172fff78a6d47 | [
"MIT"
] | null | null | null | account/tokens.py | saarikabhasi/postIT | 8a877eea5df0a44e9b6a1868776172fff78a6d47 | [
"MIT"
] | null | null | null | from django.contrib.auth.tokens import PasswordResetTokenGenerator
account_activation_token = PasswordResetTokenGenerator() | 31.5 | 66 | 0.880952 | 11 | 126 | 9.909091 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.071429 | 126 | 4 | 67 | 31.5 | 0.931624 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 1 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 4 |
6021a1a6677b884e62aa7ee5698e0e480a7bc05c | 930 | py | Python | tmtrader/controller/order_history_controller.py | reouno/tomatorader | b781206051129fa59439a0f314f4f1ed647a6852 | [
"MIT"
] | null | null | null | tmtrader/controller/order_history_controller.py | reouno/tomatorader | b781206051129fa59439a0f314f4f1ed647a6852 | [
"MIT"
] | null | null | null | tmtrader/controller/order_history_controller.py | reouno/tomatorader | b781206051129fa59439a0f314f4f1ed647a6852 | [
"MIT"
] | null | null | null | from abc import ABC, abstractmethod
from typing import List
from tmtrader.entity.order import BasicOrder
from tmtrader.entity.trade import Trade
class OrderManagerClient(ABC):
@abstractmethod
def list_open_orders(self) -> List[BasicOrder]:
pass
@abstractmethod
def list_trade_history(self) -> List[Trade]:
pass
@abstractmethod
def list_filled_orders(self) -> List[BasicOrder]:
pass
@abstractmethod
def list_cancelled_orders(self) -> List[BasicOrder]:
pass
class OrderHistoryController(ABC):
@abstractmethod
def list_open_orders(self) -> List[BasicOrder]:
pass
class BTOrderHistoryController(OrderHistoryController):
def __init__(self, order_mng_client: OrderManagerClient):
self.__order_mng_client = order_mng_client
def list_open_orders(self) -> List[BasicOrder]:
return self.__order_mng_client.list_open_orders()
| 24.473684 | 61 | 0.727957 | 104 | 930 | 6.221154 | 0.259615 | 0.064915 | 0.162287 | 0.185471 | 0.394127 | 0.394127 | 0.335394 | 0.281298 | 0.173107 | 0.173107 | 0 | 0 | 0.194624 | 930 | 37 | 62 | 25.135135 | 0.863818 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.269231 | false | 0.192308 | 0.153846 | 0.038462 | 0.576923 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 4 |
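The controller/client split above is a plain abstract-base-class delegation pattern: the controller owns a client interface and forwards calls to it. A minimal self-contained sketch of the same idea (with a hypothetical `FakeClient` standing in for a real order manager):

```python
from abc import ABC, abstractmethod
from typing import List


class Client(ABC):
    """Abstract data source, analogous to OrderManagerClient."""

    @abstractmethod
    def list_open_orders(self) -> List[str]:
        ...


class Controller:
    """Delegates to whichever Client it is constructed with."""

    def __init__(self, client: Client):
        self.__client = client

    def list_open_orders(self) -> List[str]:
        return self.__client.list_open_orders()


class FakeClient(Client):
    def list_open_orders(self) -> List[str]:
        return ["order-1", "order-2"]
```

Because the controller only sees the abstract interface, a backtesting client or a live-broker client can be swapped in without touching controller code.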
6035d66cca8c91654d59c1afb2a9f3fb4ecc93ae | 22 | py | Python | test/main/__init__.py | crockmitnic/question-paper-generator | 3f5339226aedd4332c562913945a08cdb45983b0 | [
"MIT"
] | 6 | 2020-08-02T20:58:34.000Z | 2022-03-23T20:33:20.000Z | test/main/__init__.py | crockmitnic/question-paper-generator | 3f5339226aedd4332c562913945a08cdb45983b0 | [
"MIT"
] | 209 | 2020-02-12T17:09:15.000Z | 2021-06-03T20:34:35.000Z | test/main/__init__.py | crockmitnic/question-paper-generator | 3f5339226aedd4332c562913945a08cdb45983b0 | [
"MIT"
] | 54 | 2020-02-18T14:54:35.000Z | 2021-09-05T06:31:12.000Z | """main test suite"""
| 11 | 21 | 0.590909 | 3 | 22 | 4.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.136364 | 22 | 1 | 22 | 22 | 0.684211 | 0.681818 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
6036e90c4b417cb5ce7ac329528a9894730ec3b0 | 125 | py | Python | compilador/vm/router_solver.py | Nombre-Pendiente/Super-Compi | 3f2a8e0219b04863fbf78d03aba782d235ccb11a | [
"MIT"
] | 6 | 2021-05-20T16:01:45.000Z | 2021-05-27T18:48:57.000Z | compilador/vm/router_solver.py | Nombre-Pendiente/Super-Compi | 3f2a8e0219b04863fbf78d03aba782d235ccb11a | [
"MIT"
] | 1 | 2021-05-18T14:44:04.000Z | 2021-05-18T14:44:04.000Z | compilador/vm/router_solver.py | Nombre-Pendiente/Super-Compi | 3f2a8e0219b04863fbf78d03aba782d235ccb11a | [
"MIT"
] | null | null | null | import sys
import os
route = os.getcwd().replace("/compilador/vm", "").replace("/compilador", "")
sys.path.insert(1, route)
| 20.833333 | 76 | 0.68 | 17 | 125 | 5 | 0.647059 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00885 | 0.096 | 125 | 5 | 77 | 25 | 0.743363 | 0 | 0 | 0 | 0 | 0 | 0.2 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
60452360f87c92bc5eb0bd208b6009773287c134 | 18 | py | Python | kallisticore/version.py | jpmorganchase/kallisti-core | d9dfcaa2ec3c9cd26dd37b5f2c39c3788a3d05aa | [
"Apache-2.0"
] | 1 | 2022-03-03T14:27:25.000Z | 2022-03-03T14:27:25.000Z | kallisticore/version.py | jpmorganchase/kallisti-core | d9dfcaa2ec3c9cd26dd37b5f2c39c3788a3d05aa | [
"Apache-2.0"
] | null | null | null | kallisticore/version.py | jpmorganchase/kallisti-core | d9dfcaa2ec3c9cd26dd37b5f2c39c3788a3d05aa | [
"Apache-2.0"
] | 1 | 2022-03-09T05:57:55.000Z | 2022-03-09T05:57:55.000Z | VERSION = '1.3.5'
| 9 | 17 | 0.555556 | 4 | 18 | 2.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2 | 0.166667 | 18 | 1 | 18 | 18 | 0.466667 | 0 | 0 | 0 | 0 | 0 | 0.277778 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
605055397842ed97ab324194b1d9191f3cea91d4 | 145 | py | Python | books_management/publisher/api.py | blackriddle/books-management | ba485a362a8bc50052dd6f4fc3884e639ca762b0 | [
"MIT"
] | null | null | null | books_management/publisher/api.py | blackriddle/books-management | ba485a362a8bc50052dd6f4fc3884e639ca762b0 | [
"MIT"
] | null | null | null | books_management/publisher/api.py | blackriddle/books-management | ba485a362a8bc50052dd6f4fc3884e639ca762b0 | [
"MIT"
] | null | null | null | from books_management import api
from resource import PublisherResource
api.add_resource(PublisherResource, '/publisher/', '/publisher/<isbn>')
| 29 | 71 | 0.813793 | 16 | 145 | 7.25 | 0.625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.082759 | 145 | 4 | 72 | 36.25 | 0.87218 | 0 | 0 | 0 | 0 | 0 | 0.193103 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
60543ede8fd901b81627543ce6607434a401af48 | 1,705 | py | Python | codewars/8kyu/dinamuh/keepHydrated/test_bench.py | dinamuh/Training_one | d18e8fb12608ce1753162c20252ca928c4df97ab | [
"MIT"
] | null | null | null | codewars/8kyu/dinamuh/keepHydrated/test_bench.py | dinamuh/Training_one | d18e8fb12608ce1753162c20252ca928c4df97ab | [
"MIT"
] | 2 | 2019-01-22T10:53:42.000Z | 2019-01-31T08:02:48.000Z | codewars/8kyu/dinamuh/keepHydrated/test_bench.py | dinamuh/Training_one | d18e8fb12608ce1753162c20252ca928c4df97ab | [
"MIT"
] | 13 | 2019-01-22T10:37:42.000Z | 2019-01-25T13:30:43.000Z | from main import litres
from main import litres2
def test(benchmark):
assert benchmark(litres, 2) == 1
def test2(benchmark):
assert benchmark(litres2, 2) == 1
'''
--------------------------------------------------------------------------------------------- benchmark: 2 tests --------------------------------------------------------------------------------------------
Name (time in ns) Min Max Mean StdDev Median IQR Outliers OPS (Kops/s) Rounds Iterations
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
test2 413.0750 (1.0) 2,886.0750 (1.0) 499.1945 (1.0) 173.1229 (1.0) 424.1000 (1.0) 6.0249 (1.0) 8874;10001 2,003.2272 (1.0) 56304 40
test 1,137.9998 (2.75) 34,955.9996 (12.11) 1,487.1526 (2.98) 674.8714 (3.90) 1,192.9988 (2.81) 115.0001 (19.09) 18538;21133 672.4259 (0.34) 92679 1
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Legend:
Outliers: 1 Standard Deviation from Mean; 1.5 IQR (InterQuartile Range) from 1st Quartile and 3rd Quartile.
OPS: Operations Per Second, computed as 1 / Mean
============================================================================ 2 passed in 5.16 seconds
'''
| 65.576923 | 205 | 0.313196 | 147 | 1,705 | 3.632653 | 0.62585 | 0.026217 | 0.052434 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.144587 | 0.22522 | 1,705 | 25 | 206 | 68.2 | 0.259652 | 0 | 0 | 0.111111 | 0 | 0.222222 | 0.890524 | 0.394938 | 0 | 0 | 0 | 0 | 0.111111 | 1 | 0.111111 | false | 0.055556 | 0.111111 | 0 | 0.222222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 4 |
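The benchmark table compares two solutions to the same kata, but the implementations themselves are not shown in the file. A hedged reconstruction consistent with the assertions (`litres(2) == 1`) and with the timing gap, where the float-based version pays for multiplication plus an `int()` call:

```python
def litres(time):
    # hypothetical float-based version: multiply, then truncate
    return int(time * 0.5)


def litres2(time):
    # hypothetical integer version: pure floor division, typically faster
    return time // 2
```

Both return the litres of water for `time` hours of cycling at half a litre per hour, rounded down.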
605f4445cfe53f0b128df6bf7ab40b7151f7072d | 2,852 | py | Python | app/models.py | theonilahtash/pitches | 2bc26a3f87078572b745cc961e67a3f56edaad76 | [
"MIT",
"Unlicense"
] | null | null | null | app/models.py | theonilahtash/pitches | 2bc26a3f87078572b745cc961e67a3f56edaad76 | [
"MIT",
"Unlicense"
] | null | null | null | app/models.py | theonilahtash/pitches | 2bc26a3f87078572b745cc961e67a3f56edaad76 | [
"MIT",
"Unlicense"
] | null | null | null | from . import db
from werkzeug.security import generate_password_hash, check_password_hash
from flask_login import UserMixin
from . import login_manager
@login_manager.user_loader
def load_user(user_id):
return User.query.get(int(user_id))
class User(UserMixin,db.Model):
__tablename__ = 'users'
id = db.Column(db.Integer, primary_key = True)
username = db.Column(db.String(255),index = True)
email = db.Column(db.String(255), unique = True, index = True)
password = db.Column(db.String(255))
# Defining the One to many relationship between a user and a pitch
pitch = db.relationship('Pitch', backref="user", lazy='dynamic')
# Defining the One to many relationship between a user and a comment
comment = db.relationship('Comment', backref='main_user', lazy='dynamic')
pass_secure = db.Column(db.String(255))
@property
def password(self):
raise AttributeError('You cannot read the password attribute')
@password.setter
def password(self, password):
self.pass_secure = generate_password_hash(password)
def verify_password(self, password):
return check_password_hash(self.pass_secure,password)
def __repr__(self):
return f'User {self.username}'
class Category(db.Model):
__tablename__ = 'categories'
id = db.Column(db.Integer, primary_key = True)
name = db.Column(db.String(255))
# Defining a one to many relationship between a category and a pitch
pitch = db.relationship('Pitch', backref='parent_category', lazy='dynamic')
def __repr__(self):
return f'Category {self.name}'
class Pitch(db.Model):
__tablename__ = 'pitches'
id = db.Column(db.Integer, primary_key = True)
title = db.Column(db.String(255))
body = db.Column(db.String)
# Defining the foreign key from the relationship between a user and a pitch
user_id = db.Column(db.Integer, db.ForeignKey("users.id"))
# Defining the foreign key from the relationship between a category and a pitch
category_id = db.Column(db.Integer, db.ForeignKey("categories.id"))
# Defining a one to many relationship between a pitch and a comment
comments = db.relationship('Comment', backref="main_pitch", lazy="dynamic")
def __repr__(self):
return f'Pitch {self.title}'
class Comment(db.Model):
__tablename__ = 'comments'
id = db.Column(db.Integer, primary_key = True)
author = db.Column(db.String(255))
comment = db.Column(db.String)
# Defining the foreign key from the relationship between a pitch and a comment
pitch_id = db.Column(db.Integer, db.ForeignKey("pitches.id"))
# Defining the foreign key from the relationship between a user and a comment
user_id = db.Column(db.Integer, db.ForeignKey("users.id"))
def __repr__(self):
return f'Comment {self.comment}' | 35.209877 | 83 | 0.704418 | 399 | 2,852 | 4.879699 | 0.192982 | 0.069851 | 0.087314 | 0.07396 | 0.582948 | 0.482794 | 0.463277 | 0.362609 | 0.216744 | 0.216744 | 0 | 0.009067 | 0.187938 | 2,852 | 81 | 84 | 35.209877 | 0.831606 | 0.199509 | 0 | 0.192308 | 1 | 0 | 0.121758 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.153846 | false | 0.192308 | 0.076923 | 0.115385 | 0.903846 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 4 |
6079713f9988d3ab6b671e0ef49d5c5a39410f2b | 148 | py | Python | python/use_C_fortran_in_python/fortran/f2py-hw/setup.py | qingkaikong/useful_script | 2547931dd11dbff7438e323ff4cd168427ff92ce | [
"BSD-3-Clause"
] | 4 | 2016-03-16T17:06:42.000Z | 2021-07-26T15:43:42.000Z | python/use_C_fortran_in_python/fortran/f2py-hw/setup.py | qingkaikong/useful_script | 2547931dd11dbff7438e323ff4cd168427ff92ce | [
"BSD-3-Clause"
] | null | null | null | python/use_C_fortran_in_python/fortran/f2py-hw/setup.py | qingkaikong/useful_script | 2547931dd11dbff7438e323ff4cd168427ff92ce | [
"BSD-3-Clause"
] | 3 | 2015-12-01T20:38:19.000Z | 2020-12-15T20:10:34.000Z | from numpy.distutils.core import Extension, setup
setup(name='hw',
ext_modules=[Extension(name='hw', sources=['../useFortran77.f'])],) | 37 | 80 | 0.668919 | 18 | 148 | 5.444444 | 0.777778 | 0.122449 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015748 | 0.141892 | 148 | 4 | 80 | 37 | 0.755906 | 0 | 0 | 0 | 0 | 0 | 0.14094 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
607f2d1ab92403f8ec8e60f13e4e3c23fe59fc6a | 141 | py | Python | biggestno.py | sushmitajaiswal/PythonPrograms | d4fb1b36953185e2f8dd866798ca6965a52563a9 | [
"MIT"
] | null | null | null | biggestno.py | sushmitajaiswal/PythonPrograms | d4fb1b36953185e2f8dd866798ca6965a52563a9 | [
"MIT"
] | null | null | null | biggestno.py | sushmitajaiswal/PythonPrograms | d4fb1b36953185e2f8dd866798ca6965a52563a9 | [
"MIT"
] | null | null | null | n1=int(input("enter first no."))
n2=int(input("enter second no."))
if n1>n2:
print("biggest no. is:",n1)
else:
print("biggest no. is:", n2)
| 20.142857 | 33 | 0.64539 | 26 | 141 | 3.5 | 0.5 | 0.175824 | 0.285714 | 0.351648 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.04878 | 0.12766 | 141 | 6 | 34 | 23.5 | 0.691057 | 0 | 0 | 0 | 0 | 0 | 0.425532 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
60b4acaebe160845a0943a04023efc4315033d9e | 84 | py | Python | Copyright.novaextension/Tests/file.py | tommasongr/nova-copyright | d0162688d57aba21d8579c2808bf42c285e347e1 | [
"MIT"
] | 1 | 2021-04-06T18:24:20.000Z | 2021-04-06T18:24:20.000Z | Copyright.novaextension/Tests/file.py | tommasongr/nova-copyright | d0162688d57aba21d8579c2808bf42c285e347e1 | [
"MIT"
] | 1 | 2021-08-30T17:47:03.000Z | 2021-09-04T10:53:31.000Z | Copyright.novaextension/Tests/file.py | tommasongr/nova-copyright | d0162688d57aba21d8579c2808bf42c285e347e1 | [
"MIT"
] | null | null | null | #
# file.py
# Copyright.novaextension
#
# Created by Tommaso Negri on 07/11/20.
#
| 9.333333 | 39 | 0.678571 | 12 | 84 | 4.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.088235 | 0.190476 | 84 | 8 | 40 | 10.5 | 0.75 | 0.821429 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
60d2c0626b861acf4ef800b084fa2fc97499ef57 | 639 | py | Python | napalm_dellos6/dellos6_constants.py | danimo/napalm-dellos6 | 859a24cad8e9d9ef8b60309dd207d5e320696c2b | [
"Apache-2.0"
] | 4 | 2021-04-01T13:47:26.000Z | 2021-12-09T03:35:49.000Z | napalm_dellos6/dellos6_constants.py | danimo/napalm-dellos6 | 859a24cad8e9d9ef8b60309dd207d5e320696c2b | [
"Apache-2.0"
] | 22 | 2020-06-10T23:49:21.000Z | 2020-06-24T18:59:12.000Z | napalm_dellos6/dellos6_constants.py | danimo/napalm-dellos6 | 859a24cad8e9d9ef8b60309dd207d5e320696c2b | [
"Apache-2.0"
] | 1 | 2020-06-14T19:13:28.000Z | 2020-06-14T19:13:28.000Z | """Constants to be used with Dell OS6 driver."""
DELLOS6_SANITIZE_FILTERS = {
r"^(username\s+\S+\s+password)\s+(\S+)(\s+privilege\s+\d+(\s+encrypted)?)?$": r"\1 <removed>\3",
r"^(key)\s+\S+$": r"\1 <removed>",
r"^(snmp-server engineid local).*$": r"\1 <removed>",
r"^(snmp-server community)\s+\S+(\s*(ro|rw))?$": r"\1 <removed>\2",
r"^(snmp-server host \S+ traps version (1|2)) \S+(\s+(filter \S+)"
r"?(udp-port \d+)?)?$": r"\1 <removed>\3",
r"^(snmp-server host \S+ informs\s*(timeout \d+)?\s*(retries \d+)?)\s*\S+$": r"\1 <removed>",
r"^(enable\s+password)\s+(\S+)(\s+encrypted)?$": r"\1 <removed>\3",
}
| 49.153846 | 100 | 0.539906 | 107 | 639 | 3.205607 | 0.373832 | 0.06414 | 0.183673 | 0.087464 | 0.466472 | 0.274052 | 0 | 0 | 0 | 0 | 0 | 0.026978 | 0.12989 | 639 | 12 | 101 | 53.25 | 0.589928 | 0.065728 | 0 | 0 | 0 | 0.3 | 0.764805 | 0.248731 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 4 |
60d3019c19bc9a3a70db6005bec862cfc966d266 | 305 | py | Python | test_excercise2.py | LukasHoste/Algorithms_Vives_exercises_test | 9bc17450352ed1adf57833f739ba9e58bd2deb80 | [
"Apache-2.0"
] | null | null | null | test_excercise2.py | LukasHoste/Algorithms_Vives_exercises_test | 9bc17450352ed1adf57833f739ba9e58bd2deb80 | [
"Apache-2.0"
] | null | null | null | test_excercise2.py | LukasHoste/Algorithms_Vives_exercises_test | 9bc17450352ed1adf57833f739ba9e58bd2deb80 | [
"Apache-2.0"
] | 1 | 2021-10-14T07:30:44.000Z | 2021-10-14T07:30:44.000Z | from exercise2 import luc, lucasRow, sumOfEvenLucasNumbers
def test_generateLucas():
assert(3 == luc(2))
assert(4 == luc(3))
def test_sumOfLucas():
assert(2 == lucasRow(3))
def test_generateSumOfEvenLucasNumbers():
assert(6 == sumOfEvenLucasNumbers(4))
assert(24 == sumOfEvenLucasNumbers(7))
| 23.461538 | 58 | 0.734426 | 35 | 305 | 6.314286 | 0.514286 | 0.095023 | 0.072398 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.045283 | 0.131148 | 305 | 12 | 59 | 25.416667 | 0.788679 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.555556 | 1 | 0.333333 | true | 0 | 0.111111 | 0 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
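The assertions pin down the expected Lucas-number behaviour (`luc(2) == 3`, `luc(3) == 4`, even-value sums of 6 and 24). The original `exercise2` module is not shown, so the following is a reconstruction that satisfies those assertions; `lucasRow` is omitted because its contract (`lucasRow(3) == 2`) is not clear from the test alone:

```python
def luc(n):
    # Lucas numbers: 2, 1, 3, 4, 7, 11, 18, ...
    a, b = 2, 1
    for _ in range(n):
        a, b = b, a + b
    return a


def sumOfEvenLucasNumbers(n):
    # Sum of the even values among the first n Lucas numbers.
    return sum(x for x in (luc(i) for i in range(n)) if x % 2 == 0)
```

For example, the first seven Lucas numbers are 2, 1, 3, 4, 7, 11, 18, whose even members 2, 4 and 18 sum to 24.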
60e5c961108a703bedf7bcdf680b917f3992bf64 | 6,534 | py | Python | octavia/tests/unit/cmd/test_status.py | buty4649/octavia | a4aa03d3bc98eb27cc353140cd998a623baa505f | [
"Apache-2.0"
] | null | null | null | octavia/tests/unit/cmd/test_status.py | buty4649/octavia | a4aa03d3bc98eb27cc353140cd998a623baa505f | [
"Apache-2.0"
] | 2 | 2018-09-28T08:41:14.000Z | 2019-08-01T11:20:37.000Z | octavia/tests/unit/cmd/test_status.py | buty4649/octavia | a4aa03d3bc98eb27cc353140cd998a623baa505f | [
"Apache-2.0"
] | null | null | null | # Copyright (c) 2018 NEC, Corp.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from unittest import mock
from oslo_config import cfg
from oslo_config import fixture as oslo_fixture
from oslo_upgradecheck.upgradecheck import Code
from octavia.cmd import status
from octavia.common import constants
from octavia.common import policy
from octavia.tests.unit import base
class TestUpgradeChecks(base.TestCase):
def setUp(self):
super().setUp()
self.cmd = status.Checks()
def test__check_amphorav2_not_enabled(self):
self.conf = self.useFixture(oslo_fixture.Config(cfg.CONF))
self.conf.config(group='api_settings',
default_provider_driver=constants.AMPHORA,
enabled_provider_drivers={constants.AMPHORA: "Test"})
check_result = self.cmd._check_amphorav2()
self.assertEqual(
Code.SUCCESS, check_result.code)
def test__check_persistence_sqlite(self):
check_result = self.cmd._check_persistence()
self.assertEqual(
Code.WARNING, check_result.code)
@mock.patch('octavia.controller.worker.v2.taskflow_jobboard_driver.'
'MysqlPersistenceDriver')
def test__check_persistence_error(self, mysql_driver):
mysql_driver().get_persistence.side_effect = Exception
check_result = self.cmd._check_persistence()
self.assertEqual(
Code.FAILURE, check_result.code)
@mock.patch('octavia.controller.worker.v2.taskflow_jobboard_driver.'
'MysqlPersistenceDriver')
def test__check_persistence(self, mysql_driver):
pers_mock = mock.MagicMock()
mysql_driver().get_persistence().__enter__.return_value = pers_mock
check_result = self.cmd._check_persistence()
self.assertEqual(pers_mock, check_result)
@mock.patch('octavia.controller.worker.v2.taskflow_jobboard_driver.'
'RedisTaskFlowDriver')
def test__check_jobboard_error(self, redis_driver):
pers_mock = mock.MagicMock()
redis_driver().job_board.side_effect = Exception
check_result = self.cmd._check_jobboard(pers_mock)
self.assertEqual(Code.FAILURE, check_result.code)
@mock.patch('octavia.controller.worker.v2.taskflow_jobboard_driver.'
'RedisTaskFlowDriver')
def test__check_jobboard_not_connected(self, redis_driver):
jb_connected = mock.Mock(connected=False)
redis_driver().job_board().__enter__.return_value = jb_connected
check_result = self.cmd._check_jobboard(mock.MagicMock())
self.assertEqual(Code.FAILURE, check_result.code)
@mock.patch('octavia.controller.worker.v2.taskflow_jobboard_driver.'
'RedisTaskFlowDriver')
def test__check_jobboard(self, redis_driver):
jb_connected = mock.Mock(connected=True)
redis_driver().job_board().__enter__.return_value = jb_connected
check_result = self.cmd._check_jobboard(mock.MagicMock())
self.assertEqual(Code.SUCCESS, check_result.code)
@mock.patch('octavia.controller.worker.v2.taskflow_jobboard_driver.'
'RedisTaskFlowDriver')
@mock.patch('octavia.controller.worker.v2.taskflow_jobboard_driver.'
'MysqlPersistenceDriver')
def test__check_amphorav2_success(self, mysql_driver, redis_driver):
self.conf = self.useFixture(oslo_fixture.Config(cfg.CONF))
self.conf.config(group='api_settings',
default_provider_driver=constants.AMPHORA,
enabled_provider_drivers={constants.AMPHORAV2:
"Test"})
jb_connected = mock.Mock(connected=True)
redis_driver().job_board().__enter__.return_value = jb_connected
check_result = self.cmd._check_amphorav2()
self.assertEqual(
Code.SUCCESS, check_result.code)
@mock.patch('octavia.controller.worker.v2.taskflow_jobboard_driver.'
'RedisTaskFlowDriver')
def test__check_amphorav2_warning(self, redis_driver):
self.conf = self.useFixture(oslo_fixture.Config(cfg.CONF))
self.conf.config(group='api_settings',
default_provider_driver=constants.AMPHORA,
enabled_provider_drivers={constants.AMPHORAV2:
"Test"})
check_result = self.cmd._check_amphorav2()
self.assertEqual(
Code.WARNING, check_result.code)
@mock.patch('octavia.controller.worker.v2.taskflow_jobboard_driver.'
'RedisTaskFlowDriver')
@mock.patch('octavia.controller.worker.v2.taskflow_jobboard_driver.'
'MysqlPersistenceDriver')
def test__check_amphorav2_failure(self, mysql_driver, redis_driver):
self.conf = self.useFixture(oslo_fixture.Config(cfg.CONF))
self.conf.config(group='api_settings',
default_provider_driver=constants.AMPHORAV2,
enabled_provider_drivers={constants.AMPHORA: "Test"})
jb_connected = mock.Mock(connected=False)
redis_driver().job_board().__enter__.return_value = jb_connected
check_result = self.cmd._check_amphorav2()
self.assertEqual(
Code.FAILURE, check_result.code)
def test__check_yaml_policy(self):
policy.Policy()
self.conf = self.useFixture(oslo_fixture.Config(cfg.CONF))
self.conf.config(group='oslo_policy', policy_file='test.yaml')
check_result = self.cmd._check_yaml_policy()
self.assertEqual(Code.SUCCESS, check_result.code)
self.conf.config(group='oslo_policy', policy_file='test.json')
check_result = self.cmd._check_yaml_policy()
self.assertEqual(Code.WARNING, check_result.code)
self.conf.config(group='oslo_policy', policy_file='test')
check_result = self.cmd._check_yaml_policy()
self.assertEqual(Code.FAILURE, check_result.code)
| 45.375 | 78 | 0.684726 | 746 | 6,534 | 5.705094 | 0.187668 | 0.067199 | 0.045818 | 0.054981 | 0.75 | 0.737312 | 0.71828 | 0.704652 | 0.671758 | 0.650141 | 0 | 0.005696 | 0.220845 | 6,534 | 143 | 79 | 45.692308 | 0.830289 | 0.088613 | 0 | 0.693694 | 0 | 0 | 0.144925 | 0.105706 | 0 | 0 | 0 | 0 | 0.117117 | 1 | 0.108108 | false | 0 | 0.072072 | 0 | 0.189189 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
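These upgrade-check tests lean on `unittest.mock.patch` as a decorator, configuring return values through call chains such as `driver().job_board().__enter__`. A minimal standalone sketch of that pattern (`Driver` and `check` are illustrative names, not octavia code):

```python
from unittest import mock


class Driver:
    def job_board(self):
        raise RuntimeError("would hit Redis")


@mock.patch(f"{__name__}.Driver")
def check(driver_cls):
    # driver_cls() returns the same MagicMock on every call, so behaviour
    # configured here is what the code under test observes.
    driver_cls().job_board().__enter__.return_value = mock.Mock(connected=True)
    with Driver().job_board() as jb:
        return jb.connected
```

The patch decorator injects the replacement class as the first argument, which is how the octavia tests receive `redis_driver` and `mysql_driver`.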
60e70b07afdde00648cc8638f1db07d0d68bba25 | 300 | py | Python | testapp/testdata/__init__.py | shockflash/reviews | f6cf2727e56f190e48f08d5da7932ff9d7b12936 | [
"BSD-3-Clause"
] | 1 | 2015-03-01T10:39:22.000Z | 2015-03-01T10:39:22.000Z | testapp/testdata/__init__.py | shockflash/reviews | f6cf2727e56f190e48f08d5da7932ff9d7b12936 | [
"BSD-3-Clause"
] | null | null | null | testapp/testdata/__init__.py | shockflash/reviews | f6cf2727e56f190e48f08d5da7932ff9d7b12936 | [
"BSD-3-Clause"
] | null | null | null | from models import TestReview, TestReviewSegment
from forms import TestReviewForm, TestReviewSegmentForm
def get_model():
return TestReview
def get_segment_model():
return TestReviewSegment
def get_form():
return TestReviewForm
def get_segment_form():
return TestReviewSegmentForm | 21.428571 | 55 | 0.803333 | 32 | 300 | 7.34375 | 0.4375 | 0.102128 | 0.110638 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15 | 300 | 14 | 56 | 21.428571 | 0.921569 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | true | 0 | 0.2 | 0.4 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 4 |
88096622ada8c477c0aea93172dbd699d2932deb | 308 | py | Python | BitTornado/tests/__init__.py | crossbrowsertesting/BitTornado | 30b8da65ec620573351838e0281b3c9c9dc0982b | [
"MIT"
] | 1 | 2020-06-01T06:16:03.000Z | 2020-06-01T06:16:03.000Z | BitTornado/tests/__init__.py | crossbrowsertesting/BitTornado | 30b8da65ec620573351838e0281b3c9c9dc0982b | [
"MIT"
] | null | null | null | BitTornado/tests/__init__.py | crossbrowsertesting/BitTornado | 30b8da65ec620573351838e0281b3c9c9dc0982b | [
"MIT"
] | null | null | null | from .test_bencode import CodecTests
from .test_bitfield import BitfieldTests
from .test_networkaddress import AddressTests, AddressRangeTests, \
SubnetTests, TestAddrList
from .test_parseargs import ParseArgsTest
from .test_piecebuffer import PieceBufferTests
from .test_selectpoll import PollListTests
| 38.5 | 67 | 0.863636 | 33 | 308 | 7.878788 | 0.545455 | 0.184615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.103896 | 308 | 7 | 68 | 44 | 0.942029 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.857143 | 0 | 0.857143 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
716d9f0ca71afa0e524c1033e8d9607870977570 | 160 | py | Python | files/data/ChunLua.py | ArezalGame89/ChungOSB | 8fbf7d875af911eb9fe7faeb97bf802e974e3b38 | [
"MIT"
] | null | null | null | files/data/ChunLua.py | ArezalGame89/ChungOSB | 8fbf7d875af911eb9fe7faeb97bf802e974e3b38 | [
"MIT"
] | null | null | null | files/data/ChunLua.py | ArezalGame89/ChungOSB | 8fbf7d875af911eb9fe7faeb97bf802e974e3b38 | [
"MIT"
] | null | null | null | import lupa
from lupa import LuaRuntime
from ply import lex, yacc
l = LuaRuntime()
tokens = {"52554e", "52454144", "4E554C4C"} #! RUNNER #! READ #! NUL
| 20 | 74 | 0.6625 | 20 | 160 | 5.3 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.139535 | 0.19375 | 160 | 7 | 75 | 22.857143 | 0.682171 | 0.14375 | 0 | 0 | 0 | 0 | 0.164179 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.6 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
71a2bed1085d9a2a0942251f741829af9f8a00a4 | 26 | py | Python | modules/utils/__init__.py | hasandiwan/ZppixBot-Source | 8364ace985897b84433244450ab22d4ed4ad8191 | [
"EFL-2.0"
] | 1 | 2019-11-27T16:56:51.000Z | 2019-11-27T16:56:51.000Z | modules/utils/__init__.py | RhinosF1/ZppixBot-Source | 32f45c85d70a1961aafe7a4ce9b95c34e0133aa7 | [
"EFL-2.0"
] | null | null | null | modules/utils/__init__.py | RhinosF1/ZppixBot-Source | 32f45c85d70a1961aafe7a4ce9b95c34e0133aa7 | [
"EFL-2.0"
] | null | null | null | """Utils for ZppixBot."""
| 13 | 25 | 0.615385 | 3 | 26 | 5.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115385 | 26 | 1 | 26 | 26 | 0.695652 | 0.730769 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
71e46d16e1692d8b0e822e2973b98a5888007258 | 132 | py | Python | notification/parameters.py | EhsanSaZ/send_message_api_bale_bot | 803e9b91d1eea477d3060b5dcc4e0099641876c9 | [
"MIT"
] | 1 | 2018-11-12T17:00:35.000Z | 2018-11-12T17:00:35.000Z | notification/parameters.py | EhsanSaZ/send_message_api_bale_bot | 803e9b91d1eea477d3060b5dcc4e0099641876c9 | [
"MIT"
] | null | null | null | notification/parameters.py | EhsanSaZ/send_message_api_bale_bot | 803e9b91d1eea477d3060b5dcc4e0099641876c9 | [
"MIT"
] | null | null | null | class Parameters:
def __init__(self, **kwargs):
for key, value in kwargs.items():
setattr(self, key, value)
| 26.4 | 41 | 0.598485 | 16 | 132 | 4.6875 | 0.75 | 0.213333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.280303 | 132 | 4 | 42 | 33 | 0.789474 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
e07a4e1adf9fd9c67a7dfa28e3ca0cf9758a88ec | 76 | py | Python | src/integrations/datadog/__init__.py | augustuswm/flagsmith-api | 6f37947fe3791726a92b4df2cdbded11e77387d3 | [
"BSD-3-Clause"
] | 1,259 | 2021-06-10T11:24:09.000Z | 2022-03-31T10:30:44.000Z | src/integrations/datadog/__init__.py | augustuswm/flagsmith-api | 6f37947fe3791726a92b4df2cdbded11e77387d3 | [
"BSD-3-Clause"
] | 392 | 2021-06-10T11:12:29.000Z | 2022-03-31T10:13:53.000Z | src/integrations/datadog/__init__.py | augustuswm/flagsmith-api | 6f37947fe3791726a92b4df2cdbded11e77387d3 | [
"BSD-3-Clause"
] | 58 | 2021-06-11T03:18:07.000Z | 2022-03-31T14:39:10.000Z | default_app_config = "integrations.datadog.apps.DataDogConfigurationConfig"
| 38 | 75 | 0.881579 | 7 | 76 | 9.285714 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.039474 | 76 | 1 | 76 | 76 | 0.890411 | 0 | 0 | 0 | 0 | 0 | 0.684211 | 0.684211 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
e088633c1599193e94b7da771979b83b31597e30 | 725 | py | Python | finwatch/app/serializers.py | bgunebakan/finwatch | 28e70b58a1ba1de55146262eeb94d20a8e2ac909 | [
"MIT"
] | null | null | null | finwatch/app/serializers.py | bgunebakan/finwatch | 28e70b58a1ba1de55146262eeb94d20a8e2ac909 | [
"MIT"
] | null | null | null | finwatch/app/serializers.py | bgunebakan/finwatch | 28e70b58a1ba1de55146262eeb94d20a8e2ac909 | [
"MIT"
] | null | null | null | from app.models import News, Stock
from django.contrib.auth.models import User, Group
from rest_framework import serializers
class UserSerializer(serializers.HyperlinkedModelSerializer):
class Meta:
model = User
fields = ['url', 'username', 'email', 'groups']
class GroupSerializer(serializers.HyperlinkedModelSerializer):
class Meta:
model = Group
fields = ['url', 'name']
class NewsSerializer(serializers.HyperlinkedModelSerializer):
class Meta:
model = News
fields = ['stock', 'description', 'link', 'published_at']
class StockSerializer(serializers.HyperlinkedModelSerializer):
class Meta:
model = Stock
fields = ['symbol', 'name']
| 25 | 65 | 0.689655 | 68 | 725 | 7.323529 | 0.485294 | 0.297189 | 0.337349 | 0.369478 | 0.409639 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.208276 | 725 | 28 | 66 | 25.892857 | 0.867596 | 0 | 0 | 0.210526 | 0 | 0 | 0.097931 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.157895 | 0 | 0.578947 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 4 |
e090b4c7243d288b8cefe0af06ba46317ce8b8ad | 9,897 | py | Python | brainlit/algorithms/generate_fragments/tests/test_tube_seg.py | neurodata/brainl | 2de7b5b161000d4d0957de4e836c9e72f7b62ec0 | [
"Apache-2.0"
] | null | null | null | brainlit/algorithms/generate_fragments/tests/test_tube_seg.py | neurodata/brainl | 2de7b5b161000d4d0957de4e836c9e72f7b62ec0 | [
"Apache-2.0"
] | null | null | null | brainlit/algorithms/generate_fragments/tests/test_tube_seg.py | neurodata/brainl | 2de7b5b161000d4d0957de4e836c9e72f7b62ec0 | [
"Apache-2.0"
] | null | null | null | import pytest
import brainlit
from brainlit.algorithms.generate_fragments import tube_seg
import numpy as np
from brainlit.utils.session import NeuroglancerSession
from brainlit.utils.Neuron_trace import NeuronTrace
from skimage import draw
from pathlib import Path
top_level = Path(__file__).parents[4] / "data"
input = (top_level / "data_octree").as_posix()
url = (top_level / "test_upload").as_uri()
url_seg = url + "_segments"
url = url + "/serial"
def test_pairwise():
"""
For a given iterable array [A1,A2,...,An], test if the function can return a zipped list [(A1,A2),(A2,A3),...,(An-1,An)]
    The list should contain n-1 pairs (2-element tuples)
    The first elements of all the tuples are [A1,A2,...,An-1]
    The second elements of all the tuples are [A2,A3,...,An]
"""
n = np.random.randint(4, 9)
iterable = np.random.randint(10, size=(n, 3))
pair = list(tube_seg.pairwise(iterable))
"""
Verify:
I. the number of pairs
II. the first element of each pair
    III. the second element of each pair
"""
assert len(pair) == n - 1
assert (iterable[:-1, :] == [a[0] for a in pair]).all()
assert (iterable[1:, :] == [b[1] for b in pair]).all()
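The behaviour this test verifies is the classic overlapping-pairs idiom; a minimal sketch of it using itertools.tee (a stand-in, not tube_seg's actual implementation):

```python
from itertools import tee

def pairwise(iterable):
    # [A1, A2, ..., An] -> (A1, A2), (A2, A3), ..., (An-1, An)
    a, b = tee(iterable)
    next(b, None)  # shift the second iterator forward by one
    return zip(a, b)

pairs = list(pairwise([10, 20, 30, 40]))
# n elements produce n-1 overlapping pairs
```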
def test_draw_sphere():
"""
Test if the function maps all the points located within the given radius of the given center to 1, otherwise 0
    The output array should have the same size as the input image and binary values
Distance between a point and the given center:
<= radius (if the point has value 1)
> radius (if the point has value 0)
"""
ngl_session = NeuroglancerSession(url=url, url_segments=url_seg)
img, _, _ = ngl_session.pull_vertex_list(2, [4], expand=True)
shape = img.shape
center = [
np.random.randint(shape[0]),
np.random.randint(shape[1]),
np.random.randint(shape[2]),
]
radius = np.random.randint(1, 4)
sphere = tube_seg.draw_sphere(shape, center, radius)
coords = np.where(sphere < 1)
d_bg = min(np.sum((np.array(coords).T - center) ** 2, axis=1))
coords = np.where(sphere > 0)
d_s = max(np.sum((np.array(coords).T - center) ** 2, axis=1))
"""
Verify:
I. the size of output array
II. if the output is binary-valued
III. minimum distance between 0-valued points and the center is greater than radius
IV. maximum distance between 1-valued points and the center is less than or equal to radius
"""
assert sphere.shape == shape
assert np.unique(sphere).all() in [0, 1]
assert d_bg > radius ** 2
assert d_s <= radius ** 2
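The invariant these asserts check — squared distance to the center at most radius**2 for 1-valued voxels, strictly greater for 0-valued ones — can be reproduced with a small NumPy mask builder (an illustrative sketch, not tube_seg.draw_sphere itself):

```python
import numpy as np

def sphere_mask(shape, center, radius):
    # 1 where squared distance to center <= radius**2, 0 elsewhere.
    grid = np.indices(shape)
    dist2 = sum((axis - c) ** 2 for axis, c in zip(grid, center))
    return (dist2 <= radius ** 2).astype(np.uint8)

mask = sphere_mask((7, 7, 7), (3, 3, 3), 2)
```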
def test_draw_tube_spheres():
"""
Test if the function maps all the points within the radius of a segment line (defined by 2 given points) to 1, otherwise 0
    The output array should have the same size as the input image and binary values
Distance between a point and the segment line:
<= radius (if the point has value 1)
> radius (if the point has value 0)
"""
ngl_session = NeuroglancerSession(url=url, url_segments=url_seg)
img, _, _ = ngl_session.pull_vertex_list(2, [4], expand=True)
shape = img.shape
vertex0 = [
np.random.randint(shape[0] / 2),
np.random.randint(shape[1]),
np.random.randint(shape[2]),
]
vertex1 = [
np.random.randint(shape[0] / 2, shape[0]),
np.random.randint(shape[1]),
np.random.randint(shape[2]),
]
radius = np.random.randint(1, 4)
labels = tube_seg.draw_tube_from_spheres(img, vertex0, vertex1, radius)
line = draw.line_nd(vertex0, vertex1, endpoint=True)
coords = np.where(labels < 1)
d_bg = max(shape)
for pt in np.array(coords).T:
distance_min = min(np.sum((np.array(line).T - pt) ** 2, axis=1))
d_bg = min(distance_min, d_bg)
coords = np.where(labels > 0)
d_tube = 0
for pt in np.array(coords).T:
distance_min = min(np.sum((np.array(line).T - pt) ** 2, axis=1))
d_tube = max(distance_min, d_tube)
"""
Verify:
I. the size of output array
II. if the output is binary-valued
III. minimum distance between 0-valued points and the segment line is greater than radius
IV. maximum distance between 1-valued points and the segment line is less than or equal to radius
"""
assert labels.shape == shape
assert np.unique(labels).all() in [0, 1]
assert d_bg > radius ** 2
assert d_tube <= radius ** 2
def test_draw_tube_edt():
"""
Test if the function maps all the points within the radius of a segment line (defined by 2 given points) to 1, otherwise 0
    The output array should have the same size as the input image and binary values
Distance between a point and the segment line:
<= radius (if the point has value 1)
> radius (if the point has value 0)
"""
ngl_session = NeuroglancerSession(url=url, url_segments=url_seg)
img, _, _ = ngl_session.pull_vertex_list(2, [4], expand=True)
shape = img.shape
vertex0 = [
np.random.randint(shape[0] / 2),
np.random.randint(shape[1]),
np.random.randint(shape[2]),
]
vertex1 = [
np.random.randint(shape[0] / 2, shape[0]),
np.random.randint(shape[1]),
np.random.randint(shape[2]),
]
radius = np.random.randint(1, 4)
labels = tube_seg.draw_tube_from_edt(img, vertex0, vertex1, radius)
line = draw.line_nd(vertex0, vertex1, endpoint=True)
coords = np.where(labels < 1)
d_bg = max(shape)
for pt in np.array(coords).T:
distance_min = min(np.sum((np.array(line).T - pt) ** 2, axis=1))
d_bg = min(distance_min, d_bg)
coords = np.where(labels > 0)
d_tube = 0
for pt in np.array(coords).T:
distance_min = min(np.sum((np.array(line).T - pt) ** 2, axis=1))
d_tube = max(distance_min, d_tube)
"""
Verify:
I. the size of output array
II. if the output is binary-valued
III. minimum distance between 0-valued points and the segment line is greater than radius
IV. maximum distance between 1-valued points and the segment line is less than or equal to radius
"""
assert labels.shape == shape
assert np.unique(labels).all() in [0, 1]
assert d_bg > radius ** 2
assert d_tube <= radius ** 2
def test_tubes_seg():
"""
    Test if the function maps all the points within the radius of a polyline (defined by given vertices) to 1, otherwise 0
    The output array should have the same size as the input image and binary values
Distance between a point and the polyline:
<= radius (if the point has value 1)
> radius (if the point has value 0)
"""
ngl_session = NeuroglancerSession(url=url, url_segments=url_seg)
img, _, _ = ngl_session.pull_vertex_list(2, [4], expand=True)
shape = img.shape
vertices = np.random.randint(min(shape), size=(4, 3))
radius = np.random.randint(1, 4)
labels = tube_seg.tubes_seg(img, vertices, radius)
point = np.empty((3, 0), dtype=int)
for i in range(3):
lines = draw.line_nd(vertices[i], vertices[i + 1], endpoint=True)
point = np.concatenate((point, np.array(lines)), axis=1)
coords = np.where(labels < 1)
d_bg = max(shape)
for pt in np.array(coords).T:
distance_min = min(np.sum((point.T - pt) ** 2, axis=1))
d_bg = min(distance_min, d_bg)
coords = np.where(labels > 0)
d_tube = 0
for pt in np.array(coords).T:
distance_min = min(np.sum((point.T - pt) ** 2, axis=1))
d_tube = max(distance_min, d_tube)
"""
Verify:
I. the size of output array
II. if the output is binary-valued
III. minimum distance between 0-valued points and the polyline is greater than radius
IV. maximum distance between 1-valued points and the polyline is less than or equal to radius
"""
assert labels.shape == shape
assert np.unique(labels).all() in [0, 1]
assert d_bg > radius ** 2
assert d_tube <= radius ** 2
def test_tubes_from_paths_bad_inputs():
"""Tests that the tubes_from_paths method raises errors when given bad inputs."""
sess = NeuroglancerSession(url, 0, url_seg)
img, bbox, verts = sess.pull_voxel(2, 300, radius=5) # A valid bbox with data.
G_paths = sess.get_segments(2, bbox)
G = G_paths[0]
paths = G_paths[1] # valid paths
bbox = bbox.to_list()
size = np.subtract(bbox[3:], bbox[:3])
with pytest.raises(TypeError):
tube_seg.tubes_from_paths("asdf", paths)
with pytest.raises(ValueError):
tube_seg.tubes_from_paths((-1, -1, -1), paths)
with pytest.raises(TypeError):
tube_seg.tubes_from_paths(size, "asdf")
with pytest.raises(TypeError):
tube_seg.tubes_from_paths(size, [[0, 0, "asdf"]])
with pytest.raises(TypeError):
tube_seg.tubes_from_paths(size, paths, radius="asdf")
with pytest.raises(ValueError):
tube_seg.tubes_from_paths(size, paths, radius=-1)
def test_tubes_from_paths():
"""Tests that, given valid paths, valid tubes are created."""
sess = NeuroglancerSession(url, 0, url_seg)
img, bbox, verts = sess.pull_voxel(2, 300, radius=5) # A valid bbox with data.
G_paths = sess.get_segments(2, bbox)
bbox = bbox.to_list()
paths = G_paths[1] # valid paths
size = np.subtract(bbox[3:], bbox[:3])
tubes = tube_seg.tubes_from_paths(size, paths)
assert (tubes != 0).any()
def test_tubes_exact():
"""Tests that exact pixels are filled in."""
img = np.zeros((10, 10, 10))
verts = [[5, 5, 0], [5, 5, 10]]
tubes = tube_seg.tubes_from_paths(img.shape, [verts])
assert tubes.shape == img.shape
assert (tubes[5, 5, :] == 1).all()
for i in range(10): # set middle column to zero
tubes[5, 5, i] = 0
assert (tubes == 0).all() # now everything should be zero
| 35.473118 | 126 | 0.640598 | 1,535 | 9,897 | 4.023453 | 0.127036 | 0.028497 | 0.053433 | 0.048575 | 0.742228 | 0.727655 | 0.70612 | 0.693491 | 0.678918 | 0.650421 | 0 | 0.026543 | 0.238658 | 9,897 | 278 | 127 | 35.600719 | 0.793099 | 0.194503 | 0 | 0.591463 | 1 | 0 | 0.008913 | 0 | 0 | 0 | 0 | 0 | 0.140244 | 1 | 0.04878 | false | 0 | 0.04878 | 0 | 0.097561 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
e0afaf31235f5b87510cddc561581440be006f0d | 162 | py | Python | basic/conditionals.py | tonper19/PythonDemos | 633a40e282049e511fd965c0afe104e775a2f526 | [
"MIT"
] | null | null | null | basic/conditionals.py | tonper19/PythonDemos | 633a40e282049e511fd965c0afe104e775a2f526 | [
"MIT"
] | null | null | null | basic/conditionals.py | tonper19/PythonDemos | 633a40e282049e511fd965c0afe104e775a2f526 | [
"MIT"
] | null | null | null | name = 'jon Snow'
if name == 'Arya Stark':
print('Valar Morghulis')
elif name == 'Jon Snow':
print('You know nothing')
else:
print('carry on') | 23.142857 | 29 | 0.58642 | 22 | 162 | 4.318182 | 0.727273 | 0.147368 | 0.231579 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.253086 | 162 | 7 | 30 | 23.142857 | 0.785124 | 0 | 0 | 0 | 0 | 0 | 0.398773 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.428571 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 4 |
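Note that the chain above is case-sensitive: 'jon Snow' matches neither branch and falls through to else. A sketch of a case-insensitive variant (hypothetical greeting helper, not part of the original file):

```python
def greeting(name):
    # Normalize case so 'jon Snow' and 'Jon Snow' hit the same branch.
    key = name.strip().lower()
    if key == 'arya stark':
        return 'Valar Morghulis'
    elif key == 'jon snow':
        return 'You know nothing'
    else:
        return 'carry on'

print(greeting('jon Snow'))  # -> You know nothing
```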
e0b2c8db6308779748279a600cddaea18642cdf7 | 260 | py | Python | app/api/v1/api.py | carpe-astra/jarvis-client | 26f14fee1a971dce67e28ddd5192fa0688361b66 | [
"MIT"
] | null | null | null | app/api/v1/api.py | carpe-astra/jarvis-client | 26f14fee1a971dce67e28ddd5192fa0688361b66 | [
"MIT"
] | null | null | null | app/api/v1/api.py | carpe-astra/jarvis-client | 26f14fee1a971dce67e28ddd5192fa0688361b66 | [
"MIT"
] | null | null | null | from fastapi import APIRouter
from app.api.v1.endpoints import modules, tasks
router = APIRouter()
router.include_router(router=modules.router, prefix="/modules", tags=["Modules"])
router.include_router(router=tasks.router, prefix="/tasks", tags=["Tasks"])
| 28.888889 | 81 | 0.769231 | 34 | 260 | 5.823529 | 0.411765 | 0.111111 | 0.191919 | 0.252525 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004184 | 0.080769 | 260 | 8 | 82 | 32.5 | 0.824268 | 0 | 0 | 0 | 0 | 0 | 0.1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.4 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
e0bfc1da540cd1ab90513f95fecb4d06f621acad | 89 | py | Python | pagework/page404/apps.py | Grizzy-bear/Virtual-experiment-WEB | 9fc1fdfc1bb369addfc52e39f713200a3724c8e8 | [
"MIT"
] | null | null | null | pagework/page404/apps.py | Grizzy-bear/Virtual-experiment-WEB | 9fc1fdfc1bb369addfc52e39f713200a3724c8e8 | [
"MIT"
] | null | null | null | pagework/page404/apps.py | Grizzy-bear/Virtual-experiment-WEB | 9fc1fdfc1bb369addfc52e39f713200a3724c8e8 | [
"MIT"
] | null | null | null | from django.apps import AppConfig
class Page404Config(AppConfig):
name = 'page404'
| 14.833333 | 33 | 0.752809 | 10 | 89 | 6.7 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.081081 | 0.168539 | 89 | 5 | 34 | 17.8 | 0.824324 | 0 | 0 | 0 | 0 | 0 | 0.078652 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
4601a58dcbd32a630ac20a2afe9b6da031888991 | 205 | py | Python | minigest/docfisc/urls/fatture_acquisto.py | ctrlmaniac/minigest | 2bfceb57e41c872e4112e24d0e6991164846888b | [
"MIT"
] | null | null | null | minigest/docfisc/urls/fatture_acquisto.py | ctrlmaniac/minigest | 2bfceb57e41c872e4112e24d0e6991164846888b | [
"MIT"
] | 1 | 2021-09-22T19:10:20.000Z | 2021-09-22T19:10:20.000Z | minigest/docfisc/urls/fatture_acquisto.py | ctrlmaniac/minigest | 2bfceb57e41c872e4112e24d0e6991164846888b | [
"MIT"
] | null | null | null | from django.urls import path
from ..views import FattureAcquisto
urlpatterns = [
path("<int:azienda>/", FattureAcquisto.as_view()),
path("<int:azienda>/<periodo>/", FattureAcquisto.as_view()),
]
| 22.777778 | 64 | 0.702439 | 23 | 205 | 6.173913 | 0.565217 | 0.098592 | 0.197183 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.126829 | 205 | 8 | 65 | 25.625 | 0.793296 | 0 | 0 | 0 | 0 | 0 | 0.185366 | 0.117073 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
460bbe71fbd897862154794e0d743d49c1ef07a2 | 225 | py | Python | src/modules/role_selectors/__init__.py | NagisaZj/RODE | f7f6831fee58a7910e1d7c3a8ae19cef82ab8d03 | [
"Apache-2.0"
] | null | null | null | src/modules/role_selectors/__init__.py | NagisaZj/RODE | f7f6831fee58a7910e1d7c3a8ae19cef82ab8d03 | [
"Apache-2.0"
] | null | null | null | src/modules/role_selectors/__init__.py | NagisaZj/RODE | f7f6831fee58a7910e1d7c3a8ae19cef82ab8d03 | [
"Apache-2.0"
] | null | null | null | REGISTRY = {}
from .dot_selector import DotSelector
from .q_selector import QSelector
from .dot_rnn_selector import DotRNNSelector
REGISTRY['dot'] = DotSelector
REGISTRY['dot_rnn'] = DotRNNSelector
REGISTRY['q'] = QSelector
| 25 | 44 | 0.795556 | 27 | 225 | 6.444444 | 0.37037 | 0.241379 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 225 | 8 | 45 | 28.125 | 0.87 | 0 | 0 | 0 | 0 | 0 | 0.048889 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.428571 | 0 | 0.428571 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
461a6c0bcd3910c6d56d11a3f32a21aea6ae0a6f | 107 | py | Python | keycloak_admin_aio/_resources/users/__init__.py | V-Mann-Nick/keycloak-admin-aio | 83ac1af910e492a5864eb369aacfc0512e5c8c45 | [
"Apache-2.0"
] | 12 | 2021-11-08T18:03:09.000Z | 2022-03-17T16:34:06.000Z | keycloak_admin_aio/_resources/users/__init__.py | V-Mann-Nick/keycloak-admin-aio | 83ac1af910e492a5864eb369aacfc0512e5c8c45 | [
"Apache-2.0"
] | null | null | null | keycloak_admin_aio/_resources/users/__init__.py | V-Mann-Nick/keycloak-admin-aio | 83ac1af910e492a5864eb369aacfc0512e5c8c45 | [
"Apache-2.0"
] | 1 | 2021-11-14T13:55:30.000Z | 2021-11-14T13:55:30.000Z | """https://www.keycloak.org/docs-api/15.0/rest-api/index.html#_users_resource"""
from .users import Users
| 26.75 | 80 | 0.747664 | 18 | 107 | 4.333333 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.029703 | 0.056075 | 107 | 3 | 81 | 35.666667 | 0.742574 | 0.691589 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
1cf51434bf855fe1b93df31729c23e30823dc8cb | 98 | py | Python | click_web/exceptions.py | Whatang/click-web | c07068f4fa39e3900da6883f9a2ee7e85fe02700 | [
"MIT"
] | 204 | 2019-01-27T22:37:33.000Z | 2022-03-24T15:59:11.000Z | click_web/exceptions.py | Whatang/click-web | c07068f4fa39e3900da6883f9a2ee7e85fe02700 | [
"MIT"
] | 10 | 2019-02-02T22:33:23.000Z | 2021-12-09T00:33:15.000Z | click_web/exceptions.py | Whatang/click-web | c07068f4fa39e3900da6883f9a2ee7e85fe02700 | [
"MIT"
] | 7 | 2019-01-28T15:14:57.000Z | 2021-08-25T10:32:42.000Z | class ClickWebException(Exception):
pass
class CommandNotFound(ClickWebException):
pass
| 14 | 41 | 0.77551 | 8 | 98 | 9.5 | 0.625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.163265 | 98 | 6 | 42 | 16.333333 | 0.926829 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 4 |
e80315ca43687ef9dc3abae40de83e4e2ed434af | 286 | py | Python | vueSuit/vuePacks/layout/__init__.py | Ryuchen/django-vue-su | 28c08a157cd243b475673ca7486aedb1719759ea | [
"BSD-3-Clause"
] | 23 | 2019-08-16T07:53:51.000Z | 2021-06-15T14:50:19.000Z | vueSuit/vuePacks/layout/__init__.py | Ryuchen/django-vue-su | 28c08a157cd243b475673ca7486aedb1719759ea | [
"BSD-3-Clause"
] | null | null | null | vueSuit/vuePacks/layout/__init__.py | Ryuchen/django-vue-su | 28c08a157cd243b475673ca7486aedb1719759ea | [
"BSD-3-Clause"
] | 3 | 2019-11-14T08:04:32.000Z | 2021-02-18T07:05:02.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
# ==================================================
# @Author : Copyright@Ryuchen
# ==================================================
from .base import (
LayoutA, LayoutB, LayoutC
)
__all__ = [
"LayoutA", "LayoutB", "LayoutC"
]
| 20.428571 | 52 | 0.384615 | 20 | 286 | 5.3 | 0.85 | 0.264151 | 0.396226 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004065 | 0.13986 | 286 | 13 | 53 | 22 | 0.426829 | 0.601399 | 0 | 0 | 0 | 0 | 0.192661 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.166667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
e82310e03600fd283adcf4eee8fa5f8f2dac23a9 | 3,025 | py | Python | untypy/impl/generic.py | skogsbaer/untypy | 514b57cef05cdc7a2a11e9e4a7f648d073b304b5 | [
"MIT"
] | null | null | null | untypy/impl/generic.py | skogsbaer/untypy | 514b57cef05cdc7a2a11e9e4a7f648d073b304b5 | [
"MIT"
] | null | null | null | untypy/impl/generic.py | skogsbaer/untypy | 514b57cef05cdc7a2a11e9e4a7f648d073b304b5 | [
"MIT"
] | null | null | null | import typing
from typing import Optional, TypeVar, Any
from untypy.error import UntypyTypeError
from untypy.impl.protocol import ProtocolChecker
from untypy.interfaces import TypeCheckerFactory, CreationContext, TypeChecker, ExecutionContext
class GenericProtocolChecker(ProtocolChecker):
def protocol_type(self) -> str:
return "Generic"
def check_and_wrap(self, arg: Any, ctx: ExecutionContext) -> Any:
if not isinstance(arg, self.proto):
raise ctx.wrap(UntypyTypeError(
expected=self.describe(),
given=arg
)).with_note(f"Type '{type(arg).__name__}' does not inherit from '{self.proto.__name__}'")
return super().check_and_wrap(arg, ctx)
class GenericFactory(TypeCheckerFactory):
def create_from(self, annotation: Any, ctx: CreationContext) -> Optional[TypeChecker]:
# TODO: Support other typevar features
if type(annotation) is TypeVar:
(found, replacement_annotation) = ctx.resolve_typevar(annotation)
if found:
inner = ctx.find_checker(replacement_annotation)
if inner is not None:
return BoundTypeVar(inner, annotation)
else:
return None
else:
return UnboundTypeVar(annotation)
elif hasattr(annotation, '__args__') and hasattr(annotation.__origin__,
'__mro__') and typing.Generic in annotation.__origin__.__mro__:
return GenericProtocolChecker(annotation, ctx)
else:
return None
class BoundTypeVar(TypeChecker):
def __init__(self, inner: TypeChecker, typevar: TypeVar):
self.inner = inner
self.typevar = typevar
def describe(self) -> str:
return f"{self.typevar}={self.inner.describe()}"
def may_be_wrapped(self) -> bool:
return self.inner.may_be_wrapped()
def base_type(self) -> list[Any]:
return self.inner.base_type()
def base_type_priority(self) -> int:
return self.inner.base_type_priority()
def check_and_wrap(self, arg: Any, ctx: ExecutionContext) -> Any:
return self.inner.check_and_wrap(arg, BoundTypeVarCtx(self, ctx))
class BoundTypeVarCtx(ExecutionContext):
def __init__(self, bv: BoundTypeVar, ctx: ExecutionContext):
self.bv = bv
self.upper = ctx
def wrap(self, err: UntypyTypeError) -> UntypyTypeError:
(nt, ni) = err.next_type_and_indicator()
if nt == err.expected and nt == self.bv.inner.describe():
err.expected = self.bv.describe()
return self.upper.wrap(err)
class UnboundTypeVar(TypeChecker):
def __init__(self, typevar: TypeVar):
self.typevar = typevar
def check_and_wrap(self, arg: Any, ctx: ExecutionContext) -> Any:
return arg
def describe(self) -> str:
return str(self.typevar)
def base_type(self) -> list[Any]:
return [self.typevar]
| 32.880435 | 120 | 0.642314 | 331 | 3,025 | 5.655589 | 0.259819 | 0.033654 | 0.032051 | 0.024038 | 0.160791 | 0.115919 | 0.115919 | 0.115919 | 0.081731 | 0.081731 | 0 | 0 | 0.262149 | 3,025 | 91 | 121 | 33.241758 | 0.83871 | 0.011901 | 0 | 0.21875 | 0 | 0 | 0.044526 | 0.027787 | 0 | 0 | 0 | 0.010989 | 0 | 1 | 0.234375 | false | 0 | 0.078125 | 0.140625 | 0.640625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 4 |
e830db137baa4c6770d62def6834e370a78390d3 | 198 | py | Python | src/cobra/apps/home/config.py | lyoniionly/django-cobra | 2427e5cf74b7739115b1224da3306986b3ee345c | [
"Apache-2.0"
] | 1 | 2015-01-27T08:56:46.000Z | 2015-01-27T08:56:46.000Z | src/cobra/apps/home/config.py | lyoniionly/django-cobra | 2427e5cf74b7739115b1224da3306986b3ee345c | [
"Apache-2.0"
] | null | null | null | src/cobra/apps/home/config.py | lyoniionly/django-cobra | 2427e5cf74b7739115b1224da3306986b3ee345c | [
"Apache-2.0"
] | null | null | null | from django.apps import AppConfig
from django.utils.translation import ugettext_lazy as _
class HomeConfig(AppConfig):
label = 'home'
name = 'cobra.apps.home'
verbose_name = _('Home')
| 22 | 55 | 0.727273 | 25 | 198 | 5.6 | 0.68 | 0.142857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.176768 | 198 | 8 | 56 | 24.75 | 0.858896 | 0 | 0 | 0 | 0 | 0 | 0.116162 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
1c0345c3cc2ca1f8a8242224dc9e4d0a06b335b8 | 89 | py | Python | api/projeto/__main__.py | ldynczuki/iris-model-api | 0519f5d3d8158f7560326219bd2263a234ad0cbe | [
"MIT"
] | null | null | null | api/projeto/__main__.py | ldynczuki/iris-model-api | 0519f5d3d8158f7560326219bd2263a234ad0cbe | [
"MIT"
] | null | null | null | api/projeto/__main__.py | ldynczuki/iris-model-api | 0519f5d3d8158f7560326219bd2263a234ad0cbe | [
"MIT"
] | null | null | null | def start():
print(__package__, ' started.')
if __name__ == '__main__':
start() | 14.833333 | 35 | 0.606742 | 9 | 89 | 4.666667 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.213483 | 89 | 6 | 36 | 14.833333 | 0.6 | 0 | 0 | 0 | 0 | 0 | 0.188889 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0 | 0 | 0 | 0.25 | 0.25 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
1c0b8fed01a4879f94e908cb37d6468936fdbca2 | 64 | py | Python | statsmanager/main/settings.py | bugg86/CSC490-League-Stat-Manager | 243236432dd4addf5dc08bfc6a6edd41add1f0c1 | [
"MIT"
] | null | null | null | statsmanager/main/settings.py | bugg86/CSC490-League-Stat-Manager | 243236432dd4addf5dc08bfc6a6edd41add1f0c1 | [
"MIT"
] | null | null | null | statsmanager/main/settings.py | bugg86/CSC490-League-Stat-Manager | 243236432dd4addf5dc08bfc6a6edd41add1f0c1 | [
"MIT"
] | null | null | null | EMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend' | 64 | 64 | 0.84375 | 8 | 64 | 6.625 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.03125 | 64 | 1 | 64 | 64 | 0.854839 | 0 | 0 | 0 | 0 | 0 | 0.707692 | 0.707692 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
1c0f2c585b2baa70d6ec7d59aab5fb6fa95ad0b4 | 275 | py | Python | exercise 2.py | lucasdasneves/python_puc_minas | aae6f9407e9709130b0f38a8b227cd72b168b80a | [
"MIT"
] | null | null | null | exercise 2.py | lucasdasneves/python_puc_minas | aae6f9407e9709130b0f38a8b227cd72b168b80a | [
"MIT"
] | null | null | null | exercise 2.py | lucasdasneves/python_puc_minas | aae6f9407e9709130b0f38a8b227cd72b168b80a | [
"MIT"
] | null | null | null | #exercise 2
#Faça um programa para calcular e exibir o valor de 𝑃, onde 𝑥, 𝑎
#e 𝑁 são informados pelo usuário, sendo 𝑁 o número de
#termos da sequência:
#Ao final, você deverá armazenar e imprimir todos os termos da
#sequência e o valor de 𝑃.
#𝑃 = 𝑎𝑥^1 + 𝑎𝑥^2 + ... + 𝑎𝑥^n
| 27.5 | 64 | 0.705455 | 54 | 275 | 3.592593 | 0.685185 | 0.061856 | 0.082474 | 0.092784 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013825 | 0.210909 | 275 | 9 | 65 | 30.555556 | 0.880184 | 0.941818 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0.111111 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
1c24b83ea9d6cd0a7c5ac91a4ae8af09f41e347d | 188 | py | Python | conbench/tests/test_version.py | jonkeane/conbench | f096cc2f8b7a85d8e9aea32d8310127cf1923212 | [
"MIT"
] | 48 | 2020-03-02T16:55:46.000Z | 2022-02-26T00:35:57.000Z | conbench/tests/test_version.py | jonkeane/conbench | f096cc2f8b7a85d8e9aea32d8310127cf1923212 | [
"MIT"
] | 103 | 2020-03-23T00:22:46.000Z | 2022-03-31T22:34:40.000Z | conbench/tests/test_version.py | jonkeane/conbench | f096cc2f8b7a85d8e9aea32d8310127cf1923212 | [
"MIT"
] | 6 | 2020-03-04T17:52:35.000Z | 2022-03-30T11:53:40.000Z | import importlib.metadata as importlib_metadata
import conbench
__version__ = importlib_metadata.version("conbench")
def test_version():
assert __version__ == conbench.__version__
| 18.8 | 52 | 0.808511 | 20 | 188 | 6.85 | 0.45 | 0.372263 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.12234 | 188 | 9 | 53 | 20.888889 | 0.830303 | 0 | 0 | 0 | 0 | 0 | 0.042553 | 0 | 0 | 0 | 0 | 0 | 0.2 | 1 | 0.2 | false | 0 | 0.6 | 0 | 0.8 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
1c62f9a5a3f6c0b2f24f9613fbd98d0843c50ba1 | 86,586 | py | Python | src/openprocurement/tender/pricequotation/tests/tender_blanks.py | ProzorroUKR/openprocurement.api | 2855a99aa8738fb832ee0dbad4e9590bd3643511 | [
"Apache-2.0"
] | 10 | 2020-02-18T01:56:21.000Z | 2022-03-28T00:32:57.000Z | src/openprocurement/tender/pricequotation/tests/tender_blanks.py | quintagroup/openprocurement.api | 2855a99aa8738fb832ee0dbad4e9590bd3643511 | [
"Apache-2.0"
] | 26 | 2018-07-16T09:30:44.000Z | 2021-02-02T17:51:30.000Z | src/openprocurement/tender/pricequotation/tests/tender_blanks.py | ProzorroUKR/openprocurement.api | 2855a99aa8738fb832ee0dbad4e9590bd3643511 | [
"Apache-2.0"
] | 15 | 2019-08-08T10:50:47.000Z | 2022-02-05T14:13:36.000Z | import mock
from uuid import uuid4
from copy import deepcopy
from datetime import timedelta
from openprocurement.api.utils import get_now
from openprocurement.api.constants import (
ROUTE_PREFIX,
CPV_BLOCK_FROM,
NOT_REQUIRED_ADDITIONAL_CLASSIFICATION_FROM,
SANDBOX_MODE,
CPV_ITEMS_CLASS_FROM,
PQ_MULTI_PROFILE_FROM,
)
from openprocurement.tender.pricequotation.tests.base import (
test_organization,
test_cancellation,
test_shortlisted_firms,
test_short_profile,
test_requirement_response,
test_tender_data_before_multiprofile,
test_tender_data_after_multiprofile,
test_item_before_multiprofile,
test_item_after_multiprofile,
)
from openprocurement.tender.pricequotation.tests.data import test_milestones
# TenderTest
from openprocurement.tender.core.tests.base import change_auth
from openprocurement.tender.pricequotation.constants import PMT, PQ_KINDS
def listing(self):
response = self.app.get("/tenders")
self.assertEqual(response.status, "200 OK")
self.assertEqual(len(response.json["data"]), 0)
tenders = []
for i in range(3):
offset = get_now().isoformat()
response = self.app.post_json("/tenders", {"data": self.initial_data})
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
self.tender_id = response.json['data']['id']
self.set_status('active.tendering')
tender = self.app.get("/tenders/{}".format(self.tender_id)).json['data']
tenders.append(tender)
ids = ",".join([i["id"] for i in tenders])
while True:
response = self.app.get("/tenders")
self.assertTrue(ids.startswith(",".join([i["id"] for i in response.json["data"]])))
if len(response.json["data"]) == 3:
break
self.assertEqual(len(response.json["data"]), 3)
self.assertEqual(set(response.json["data"][0]), set(["id", "dateModified"]))
self.assertEqual(set([i["id"] for i in response.json["data"]]), set([i["id"] for i in tenders]))
self.assertEqual(set([i["dateModified"] for i in response.json["data"]]), set([i["dateModified"] for i in tenders]))
self.assertEqual([i["dateModified"] for i in response.json["data"]], sorted([i["dateModified"] for i in tenders]))
response = self.app.get("/tenders?limit=2")
self.assertEqual(response.status, "200 OK")
self.assertNotIn("prev_page", response.json)
self.assertEqual(len(response.json["data"]), 2)
response = self.app.get(response.json["next_page"]["path"].replace(ROUTE_PREFIX, ""))
self.assertEqual(response.status, "200 OK")
self.assertIn("descending=1", response.json["prev_page"]["uri"])
self.assertEqual(len(response.json["data"]), 1)
response = self.app.get(response.json["next_page"]["path"].replace(ROUTE_PREFIX, ""))
self.assertEqual(response.status, "200 OK")
self.assertIn("descending=1", response.json["prev_page"]["uri"])
self.assertEqual(len(response.json["data"]), 0)
response = self.app.get("/tenders", params=[("opt_fields", "status")])
self.assertEqual(response.status, "200 OK")
self.assertEqual(len(response.json["data"]), 3)
self.assertEqual(set(response.json["data"][0]), set(["id", "dateModified", "status"]))
self.assertIn("opt_fields=status", response.json["next_page"]["uri"])
response = self.app.get("/tenders", params=[("opt_fields", "status")])
self.assertEqual(response.status, "200 OK")
self.assertEqual(len(response.json["data"]), 3)
self.assertEqual(set(response.json["data"][0]), set(["id", "dateModified", "status"]))
self.assertIn("opt_fields=status", response.json["next_page"]["uri"])
        response = self.app.get("/tenders?descending=1")
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(len(response.json["data"]), 3)
        self.assertEqual(set(response.json["data"][0]), set(["id", "dateModified"]))
        self.assertEqual(set([i["id"] for i in response.json["data"]]), set([i["id"] for i in tenders]))
        self.assertEqual(
            [i["dateModified"] for i in response.json["data"]],
            sorted([i["dateModified"] for i in tenders], reverse=True),
        )

        response = self.app.get("/tenders?descending=1&limit=2")
        self.assertEqual(response.status, "200 OK")
        self.assertNotIn("descending=1", response.json["prev_page"]["uri"])
        self.assertEqual(len(response.json["data"]), 2)

        response = self.app.get(response.json["next_page"]["path"].replace(ROUTE_PREFIX, ""))
        self.assertEqual(response.status, "200 OK")
        self.assertNotIn("descending=1", response.json["prev_page"]["uri"])
        self.assertEqual(len(response.json["data"]), 1)

        response = self.app.get(response.json["next_page"]["path"].replace(ROUTE_PREFIX, ""))
        self.assertEqual(response.status, "200 OK")
        self.assertNotIn("descending=1", response.json["prev_page"]["uri"])
        self.assertEqual(len(response.json["data"]), 0)

        test_tender_data2 = self.initial_data.copy()
        test_tender_data2["mode"] = "test"
        response = self.app.post_json("/tenders", {"data": test_tender_data2})
        self.assertEqual(response.status, "201 Created")
        self.assertEqual(response.content_type, "application/json")

    def listing_changes(self):
        response = self.app.get("/tenders?feed=changes")
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(len(response.json["data"]), 0)

        tenders = []
        for i in range(3):
            response = self.app.post_json("/tenders", {"data": self.initial_data})
            self.assertEqual(response.status, "201 Created")
            self.assertEqual(response.content_type, "application/json")
            self.tender_id = response.json["data"]["id"]
            self.set_status("active.tendering")
            tender = self.app.get("/tenders/{}".format(self.tender_id)).json["data"]
            tenders.append(tender)

        ids = ",".join([i["id"] for i in tenders])
        while True:
            response = self.app.get("/tenders?feed=changes")
            self.assertTrue(ids.startswith(",".join([i["id"] for i in response.json["data"]])))
            if len(response.json["data"]) == 3:
                break

        self.assertEqual(",".join([i["id"] for i in response.json["data"]]), ids)
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(len(response.json["data"]), 3)
        self.assertEqual(set(response.json["data"][0]), set(["id", "dateModified"]))
        self.assertEqual(set([i["id"] for i in response.json["data"]]), set([i["id"] for i in tenders]))
        self.assertEqual(
            set([i["dateModified"] for i in response.json["data"]]),
            set([i["dateModified"] for i in tenders]),
        )
        self.assertEqual(
            [i["dateModified"] for i in response.json["data"]],
            sorted([i["dateModified"] for i in tenders]),
        )

        response = self.app.get("/tenders?feed=changes&limit=2")
        self.assertEqual(response.status, "200 OK")
        self.assertNotIn("prev_page", response.json)
        self.assertEqual(len(response.json["data"]), 2)

        response = self.app.get(response.json["next_page"]["path"].replace(ROUTE_PREFIX, ""))
        self.assertEqual(response.status, "200 OK")
        self.assertIn("descending=1", response.json["prev_page"]["uri"])
        self.assertEqual(len(response.json["data"]), 1)

        response = self.app.get(response.json["next_page"]["path"].replace(ROUTE_PREFIX, ""))
        self.assertEqual(response.status, "200 OK")
        self.assertIn("descending=1", response.json["prev_page"]["uri"])
        self.assertEqual(len(response.json["data"]), 0)

        response = self.app.get("/tenders?feed=changes", params=[("opt_fields", "status")])
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(len(response.json["data"]), 3)
        self.assertEqual(set(response.json["data"][0]), set(["id", "dateModified", "status"]))
        self.assertIn("opt_fields=status", response.json["next_page"]["uri"])
        response = self.app.get("/tenders?feed=changes&descending=1")
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(len(response.json["data"]), 3)
        self.assertEqual(set(response.json["data"][0]), set(["id", "dateModified"]))
        self.assertEqual(set([i["id"] for i in response.json["data"]]), set([i["id"] for i in tenders]))
        self.assertEqual(
            [i["dateModified"] for i in response.json["data"]],
            sorted([i["dateModified"] for i in tenders], reverse=True),
        )

        response = self.app.get("/tenders?feed=changes&descending=1&limit=2")
        self.assertEqual(response.status, "200 OK")
        self.assertNotIn("descending=1", response.json["prev_page"]["uri"])
        self.assertEqual(len(response.json["data"]), 2)

        response = self.app.get(response.json["next_page"]["path"].replace(ROUTE_PREFIX, ""))
        self.assertEqual(response.status, "200 OK")
        self.assertNotIn("descending=1", response.json["prev_page"]["uri"])
        self.assertEqual(len(response.json["data"]), 1)

        response = self.app.get(response.json["next_page"]["path"].replace(ROUTE_PREFIX, ""))
        self.assertEqual(response.status, "200 OK")
        self.assertNotIn("descending=1", response.json["prev_page"]["uri"])
        self.assertEqual(len(response.json["data"]), 0)

        test_tender_data2 = self.initial_data.copy()
        test_tender_data2["mode"] = "test"
        response = self.app.post_json("/tenders", {"data": test_tender_data2})
        self.assertEqual(response.status, "201 Created")
        self.assertEqual(response.content_type, "application/json")

    def listing_draft(self):
        response = self.app.get("/tenders")
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(len(response.json["data"]), 0)

        tenders = []
        data = self.initial_data.copy()
        data.update({"status": "draft"})
        for i in range(3):
            response = self.app.post_json("/tenders", {"data": self.initial_data})
            self.assertEqual(response.status, "201 Created")
            self.assertEqual(response.content_type, "application/json")
            self.tender_id = response.json["data"]["id"]
            self.set_status("active.tendering")
            tender = self.app.get("/tenders/{}".format(self.tender_id)).json["data"]
            tenders.append(tender)
            # draft tenders must not show up in the listing below
            response = self.app.post_json("/tenders", {"data": data})
            self.assertEqual(response.status, "201 Created")
            self.assertEqual(response.content_type, "application/json")

        ids = ",".join([i["id"] for i in tenders])
        while True:
            response = self.app.get("/tenders")
            self.assertTrue(ids.startswith(",".join([i["id"] for i in response.json["data"]])))
            if len(response.json["data"]) == 3:
                break

        self.assertEqual(len(response.json["data"]), 3)
        self.assertEqual(set(response.json["data"][0]), set(["id", "dateModified"]))
        self.assertEqual(set([i["id"] for i in response.json["data"]]), set([i["id"] for i in tenders]))
        self.assertEqual(
            set([i["dateModified"] for i in response.json["data"]]),
            set([i["dateModified"] for i in tenders]),
        )
        self.assertEqual(
            [i["dateModified"] for i in response.json["data"]],
            sorted([i["dateModified"] for i in tenders]),
        )

    def create_tender_invalid(self):
        request_path = "/tenders"
        response = self.app.post(request_path, "data", status=415)
        self.assertEqual(response.status, "415 Unsupported Media Type")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["status"], "error")
        self.assertEqual(
            response.json["errors"],
            [
                {
                    "description": "Content-Type header should be one of ['application/json']",
                    "location": "header",
                    "name": "Content-Type",
                }
            ],
        )

        response = self.app.post(request_path, "data", content_type="application/json", status=422)
        self.assertEqual(response.status, "422 Unprocessable Entity")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["status"], "error")
        self.assertEqual(
            response.json["errors"],
            [{"description": "Expecting value: line 1 column 1 (char 0)", "location": "body", "name": "data"}],
        )

        response = self.app.post_json(request_path, "data", status=422)
        self.assertEqual(response.status, "422 Unprocessable Entity")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["status"], "error")
        self.assertEqual(
            response.json["errors"], [{"description": "Data not available", "location": "body", "name": "data"}]
        )

        response = self.app.post_json(request_path, {"not_data": {}}, status=422)
        self.assertEqual(response.status, "422 Unprocessable Entity")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["status"], "error")
        self.assertEqual(
            response.json["errors"], [{"description": "Data not available", "location": "body", "name": "data"}]
        )

        response = self.app.post_json(request_path, {"data": []}, status=422)
        self.assertEqual(response.status, "422 Unprocessable Entity")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["status"], "error")
        self.assertEqual(
            response.json["errors"], [{"description": "Data not available", "location": "body", "name": "data"}]
        )

        response = self.app.post_json(request_path, {"data": {"procurementMethodType": "invalid_value"}}, status=415)
        self.assertEqual(response.status, "415 Unsupported Media Type")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["status"], "error")
        self.assertEqual(
            response.json["errors"],
            [{"description": "Not implemented", "location": "body", "name": "procurementMethodType"}],
        )
        response = self.app.post_json(
            request_path,
            {"data": {"invalid_field": "invalid_value", "procurementMethodType": PMT}},
            status=422,
        )
        self.assertEqual(response.status, "422 Unprocessable Entity")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["status"], "error")
        self.assertEqual(
            response.json["errors"], [{"description": "Rogue field", "location": "body", "name": "invalid_field"}]
        )

        response = self.app.post_json(
            request_path,
            {"data": {"value": "invalid_value", "procurementMethodType": PMT}},
            status=422,
        )
        self.assertEqual(response.status, "422 Unprocessable Entity")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["status"], "error")
        self.assertEqual(
            response.json["errors"],
            [
                {
                    "description": ["Please use a mapping for this field or Value instance instead of str."],
                    "location": "body",
                    "name": "value",
                }
            ],
        )

        response = self.app.post_json(
            request_path,
            {"data": {"procurementMethod": "invalid_value", "procurementMethodType": PMT}},
            status=422,
        )
        self.assertEqual(response.status, "422 Unprocessable Entity")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["status"], "error")
        self.assertIn(
            {
                "description": ["Value must be one of ['selective']."],
                "location": "body",
                "name": "procurementMethod",
            },
            response.json["errors"],
        )
        self.assertIn(
            {"description": ["This field is required."], "location": "body", "name": "tenderPeriod"},
            response.json["errors"],
        )
        self.assertIn(
            {"description": ["This field is required."], "location": "body", "name": "items"}, response.json["errors"]
        )
        data = self.initial_data["tenderPeriod"]
        self.initial_data["tenderPeriod"] = {"startDate": "2014-10-31T00:00:00", "endDate": "2014-10-01T00:00:00"}
        response = self.app.post_json(request_path, {"data": self.initial_data}, status=422)
        self.initial_data["tenderPeriod"] = data
        self.assertEqual(response.status, "422 Unprocessable Entity")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["status"], "error")
        self.assertEqual(
            response.json["errors"],
            [
                {
                    "description": {"startDate": ["period should begin before its end"]},
                    "location": "body",
                    "name": "tenderPeriod",
                }
            ],
        )

        now = get_now()
        self.initial_data["awardPeriod"] = {"startDate": now.isoformat(), "endDate": now.isoformat()}
        response = self.app.post_json(request_path, {"data": self.initial_data}, status=422)
        del self.initial_data["awardPeriod"]
        self.assertEqual(response.status, "422 Unprocessable Entity")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["status"], "error")
        self.assertEqual(
            response.json["errors"],
            [{"description": ["period should begin after tenderPeriod"], "location": "body", "name": "awardPeriod"}],
        )

        data = self.initial_data["items"][0].pop("additionalClassifications")
        if get_now() > CPV_ITEMS_CLASS_FROM:
            cpv_code = self.initial_data["items"][0]["classification"]["id"]
            self.initial_data["items"][0]["classification"]["id"] = "99999999-9"
        status = 422 if get_now() < NOT_REQUIRED_ADDITIONAL_CLASSIFICATION_FROM else 201
        response = self.app.post_json(request_path, {"data": self.initial_data}, status=status)
        self.initial_data["items"][0]["additionalClassifications"] = data
        if get_now() > CPV_ITEMS_CLASS_FROM:
            self.initial_data["items"][0]["classification"]["id"] = cpv_code
        if status == 201:
            self.assertEqual(response.content_type, "application/json")
            self.assertEqual(response.status, "201 Created")
        else:
            self.assertEqual(response.status, "422 Unprocessable Entity")
            self.assertEqual(response.content_type, "application/json")
            self.assertEqual(response.json["status"], "error")
            self.assertEqual(
                response.json["errors"],
                [
                    {
                        "description": [{"additionalClassifications": ["This field is required."]}],
                        "location": "body",
                        "name": "items",
                    }
                ],
            )
        invalid_data = deepcopy(self.initial_data)
        del invalid_data["procuringEntity"]["contactPoint"]["telephone"]
        response = self.app.post_json(request_path, {"data": invalid_data}, status=422)
        self.assertEqual(response.status, "422 Unprocessable Entity")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["status"], "error")
        self.assertEqual(
            response.json["errors"],
            [
                {
                    "description": {"contactPoint": {"email": ["telephone or email should be present"]}},
                    "location": "body",
                    "name": "procuringEntity",
                }
            ],
        )

        correct_phone = self.initial_data["procuringEntity"]["contactPoint"]["telephone"]
        self.initial_data["procuringEntity"]["contactPoint"]["telephone"] = "++223"
        response = self.app.post_json(request_path, {"data": self.initial_data}, status=422)
        self.initial_data["procuringEntity"]["contactPoint"]["telephone"] = correct_phone
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["status"], "error")
        self.assertEqual(
            response.json["errors"],
            [
                {
                    "description": {"contactPoint": {"telephone": ["wrong telephone format (could be missed +)"]}},
                    "location": "body",
                    "name": "procuringEntity",
                }
            ],
        )
        cpv = self.initial_data["items"][0]["classification"]["id"]
        self.initial_data["items"][0]["classification"]["id"] = "160173000-1"
        response = self.app.post_json(request_path, {"data": self.initial_data}, status=422)
        self.initial_data["items"][0]["classification"]["id"] = cpv
        self.assertEqual(response.status, "422 Unprocessable Entity")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["status"], "error")
        self.assertIn("classification", response.json["errors"][0]["description"][0])
        self.assertIn("id", response.json["errors"][0]["description"][0]["classification"])
        self.assertIn("Value must be one of [", response.json["errors"][0]["description"][0]["classification"]["id"][0])
        cpv = self.initial_data["items"][0]["classification"]["id"]
        if get_now() < CPV_BLOCK_FROM:
            self.initial_data["items"][0]["classification"]["scheme"] = "CPV"
        self.initial_data["items"][0]["classification"]["id"] = "00000000-0"
        response = self.app.post_json(request_path, {"data": self.initial_data}, status=422)
        # restore the original scheme and id
        if get_now() < CPV_BLOCK_FROM:
            self.initial_data["items"][0]["classification"]["scheme"] = "ДК021"
        self.initial_data["items"][0]["classification"]["id"] = cpv
        self.assertEqual(response.status, "422 Unprocessable Entity")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["status"], "error")
        self.assertIn("classification", response.json["errors"][0]["description"][0])
        self.assertIn("id", response.json["errors"][0]["description"][0]["classification"])
        self.assertIn("Value must be one of [", response.json["errors"][0]["description"][0]["classification"]["id"][0])
        procuringEntity = self.initial_data["procuringEntity"]
        data = self.initial_data["procuringEntity"].copy()
        del data["kind"]
        self.initial_data["procuringEntity"] = data
        response = self.app.post_json(request_path, {"data": self.initial_data}, status=403)
        self.initial_data["procuringEntity"] = procuringEntity
        self.assertEqual(response.status, "403 Forbidden")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["status"], "error")
        self.assertEqual(
            response.json["errors"],
            [
                {
                    "description": "'' procuringEntity cannot publish this type of procedure. "
                    "Only general, special, defense, other, social, authority are allowed.",
                    "location": "body",
                    "name": "kind",
                }
            ],
        )

        data = deepcopy(self.initial_data)
        data["milestones"] = test_milestones
        response = self.app.post_json(request_path, {"data": data}, status=422)
        self.assertEqual(response.status, "422 Unprocessable Entity")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["status"], "error")
        self.assertEqual(
            response.json["errors"],
            [
                {
                    "description": ["Milestones are not applicable to pricequotation"],
                    "location": "body",
                    "name": "milestones",
                }
            ],
        )

        data = deepcopy(self.initial_data)
        data["procuringEntity"]["kind"] = "central"
        response = self.app.post_json(request_path, {"data": data}, status=403)
        self.assertEqual(response.status, "403 Forbidden")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["status"], "error")
        self.assertEqual(
            response.json["errors"],
            [
                {
                    "description": "'central' procuringEntity cannot publish this type of procedure. "
                    "Only general, special, defense, other, social, authority are allowed.",
                    "location": "body",
                    "name": "kind",
                }
            ],
        )
        with mock.patch(
            "openprocurement.tender.pricequotation.models.tender.PQ_MULTI_PROFILE_FROM",
            get_now() + timedelta(days=1),
        ):
            data = deepcopy(test_tender_data_before_multiprofile)
            data["agreement"] = {"id": self.agreement_id}
            response = self.app.post_json(request_path, {"data": data}, status=422)
            self.assertEqual(response.status, "422 Unprocessable Entity")
            self.assertEqual(response.content_type, "application/json")
            self.assertEqual(response.json["status"], "error")
            self.assertEqual(
                response.json["errors"],
                [{"description": ["Rogue field."], "location": "body", "name": "agreement"}],
            )

            data = deepcopy(test_tender_data_before_multiprofile)
            data["items"] = [test_item_after_multiprofile]
            response = self.app.post_json(request_path, {"data": data}, status=422)
            self.assertEqual(response.status, "422 Unprocessable Entity")
            self.assertEqual(response.content_type, "application/json")
            self.assertEqual(response.json["status"], "error")
            self.assertEqual(
                response.json["errors"],
                [{"description": [{"profile": ["Rogue field."]}], "location": "body", "name": "items"}],
            )
        with mock.patch(
            "openprocurement.tender.pricequotation.models.tender.PQ_MULTI_PROFILE_FROM",
            get_now() - timedelta(days=1),
        ):
            data = deepcopy(test_tender_data_after_multiprofile)
            data["agreement"]["id"] = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaab"
            response = self.app.post_json(request_path, {"data": data}, status=422)
            self.assertEqual(response.status, "422 Unprocessable Entity")
            self.assertEqual(response.content_type, "application/json")
            self.assertEqual(response.json["status"], "error")
            self.assertEqual(
                response.json["errors"],
                [{"description": {"id": ["Hash value is wrong length."]}, "location": "body", "name": "agreement"}],
            )

            data["agreement"]["id"] = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
            response = self.app.post_json(request_path, {"data": data}, status=422)
            self.assertEqual(response.status, "422 Unprocessable Entity")
            self.assertEqual(response.content_type, "application/json")
            self.assertEqual(response.json["status"], "error")
            self.assertEqual(
                response.json["errors"],
                [{"description": {"id": ["id must be one of exists agreement"]}, "location": "body", "name": "agreement"}],
            )

            del data["agreement"]["id"]
            response = self.app.post_json(request_path, {"data": data}, status=422)
            self.assertEqual(response.status, "422 Unprocessable Entity")
            self.assertEqual(response.content_type, "application/json")
            self.assertEqual(response.json["status"], "error")
            self.assertEqual(
                response.json["errors"],
                [{"description": {"id": ["This field is required."]}, "location": "body", "name": "agreement"}],
            )

            del data["agreement"]
            response = self.app.post_json(request_path, {"data": data}, status=422)
            self.assertEqual(response.status, "422 Unprocessable Entity")
            self.assertEqual(response.content_type, "application/json")
            self.assertEqual(response.json["status"], "error")
            self.assertEqual(
                response.json["errors"],
                [{"description": ["This field is required."], "location": "body", "name": "agreement"}],
            )

            data = deepcopy(test_tender_data_after_multiprofile)
            data["items"] = [test_item_before_multiprofile]
            response = self.app.post_json(request_path, {"data": data}, status=422)
            self.assertEqual(response.status, "422 Unprocessable Entity")
            self.assertEqual(response.content_type, "application/json")
            self.assertEqual(response.json["status"], "error")
            self.assertEqual(
                response.json["errors"],
                [{"description": [{"profile": ["This field is required."]}], "location": "body", "name": "items"}],
            )

    def create_tender_with_inn(self):
        request_path = "/tenders"
        addit_classif = [
            {"scheme": "INN", "id": "17.21.1", "description": "папір і картон гофровані, паперова й картонна тара"}
        ]
        data = self.initial_data["items"][0]["classification"]["id"]
        self.initial_data["items"][0]["classification"]["id"] = "33611000-6"
        orig_addit_classif = self.initial_data["items"][0]["additionalClassifications"]
        self.initial_data["items"][0]["additionalClassifications"] = addit_classif
        response = self.app.post_json(request_path, {"data": self.initial_data})
        self.initial_data["items"][0]["additionalClassifications"] = orig_addit_classif
        self.initial_data["items"][0]["classification"]["id"] = data
        self.assertEqual(response.status, "201 Created")

        addit_classif = [
            {"scheme": "NotINN", "id": "17.21.1", "description": "папір і картон гофровані, паперова й картонна тара"},
            {"scheme": "NotINN", "id": "17.21.1", "description": "папір і картон гофровані, паперова й картонна тара"},
        ]
        data = self.initial_data["items"][0]["classification"]["id"]
        self.initial_data["items"][0]["classification"]["id"] = "33652000-5"
        orig_addit_classif = self.initial_data["items"][0]["additionalClassifications"]
        self.initial_data["items"][0]["additionalClassifications"] = addit_classif
        response = self.app.post_json(request_path, {"data": self.initial_data})
        self.initial_data["items"][0]["additionalClassifications"] = orig_addit_classif
        self.initial_data["items"][0]["classification"]["id"] = data
        self.assertEqual(response.status, "201 Created")

    def create_tender_generated(self):
        data = self.initial_data.copy()
        # these identifiers must be regenerated by the server, not taken from the request
        data.update({"id": "hash", "doc_id": "hash2", "tenderID": "hash3"})
        response = self.app.post_json("/tenders", {"data": data})
        self.assertEqual(response.status, "201 Created")
        self.assertEqual(response.content_type, "application/json")
        tender = response.json["data"]
        if "procurementMethodDetails" in tender:
            tender.pop("procurementMethodDetails")
        tender_keys = [
            "procurementMethodType",
            "id",
            "date",
            "dateModified",
            "tenderID",
            "status",
            "tenderPeriod",
            "items",
            "procuringEntity",
            "procurementMethod",
            "awardCriteria",
            "submissionMethod",
            "title",
            "owner",
            "mainProcurementCategory",
            "value",
        ]
        if get_now() < PQ_MULTI_PROFILE_FROM:
            tender_keys.append("profile")
        else:
            tender_keys.append("agreement")
        self.assertEqual(set(tender), set(tender_keys))
        self.assertNotEqual(data["id"], tender["id"])
        self.assertNotEqual(data["doc_id"], tender["id"])
        self.assertNotEqual(data["tenderID"], tender["tenderID"])

    def create_tender_draft(self):
        data = self.initial_data.copy()
        data.update({"status": "draft"})
        response = self.app.post_json("/tenders", {"data": data})
        self.assertEqual(response.status, "201 Created")
        self.assertEqual(response.content_type, "application/json")
        tender = response.json["data"]
        token = response.json["access"]["token"]
        self.assertEqual(tender["status"], "draft")
        self.assertNotIn("noticePublicationDate", tender)
        self.assertNotIn("unsuccessfulReason", tender)

        if SANDBOX_MODE:
            period = {"endDate": (get_now() + timedelta(minutes=1)).isoformat()}
        else:
            period = {"endDate": (get_now() + timedelta(days=1)).isoformat()}
        response = self.app.patch_json(
            "/tenders/{}?acc_token={}".format(tender["id"], token),
            {"data": {"status": self.primary_tender_status, "tenderPeriod": period}},
            status=422,
        )
        self.assertEqual(
            response.json["errors"],
            [
                {
                    "description": ["tenderPeriod must be at least 2 full business days long"],
                    "location": "body",
                    "name": "tenderPeriod",
                }
            ],
        )

        forbidden_statuses = (
            "draft.unsuccessful",
            "active.tendering",
            "active.qualification",
            "active.awarded",
            "complete",
            "cancelled",
            "unsuccessful",
        )
        current_status = tender["status"]
        for forbidden_status in forbidden_statuses:
            response = self.app.patch_json(
                "/tenders/{}?acc_token={}".format(tender["id"], token),
                {"data": {"status": forbidden_status}},
                status=403,
            )
            self.assertEqual(response.status, "403 Forbidden")
            self.assertEqual(response.json["status"], "error")
            self.assertEqual(
                response.json["errors"],
                [
                    {
                        "description": "tender_owner can't switch tender from status ({}) to ({})".format(
                            current_status, forbidden_status
                        ),
                        "location": "body",
                        "name": "data",
                    }
                ],
            )

        response = self.app.patch_json(
            "/tenders/{}?acc_token={}".format(tender["id"], token),
            {"data": {"procuringEntity": {"kind": "central"}}},
            status=403,
        )
        self.assertEqual(response.status, "403 Forbidden")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["status"], "error")
        self.assertEqual(
            response.json["errors"],
            [
                {
                    "description": "'central' procuringEntity cannot publish this type of procedure. "
                    "Only general, special, defense, other, social, authority are allowed.",
                    "location": "body",
                    "name": "kind",
                }
            ],
        )

        response = self.app.patch_json(
            "/tenders/{}?acc_token={}".format(tender["id"], token),
            {"data": {"status": self.primary_tender_status, "unsuccessfulReason": ["some value from buyer"]}},
        )
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(response.content_type, "application/json")
        tender = response.json["data"]
        self.assertEqual(tender["status"], self.primary_tender_status)
        self.assertEqual(tender["noticePublicationDate"], tender["tenderPeriod"]["startDate"])
        self.assertNotIn("unsuccessfulReason", tender)

        response = self.app.get("/tenders/{}".format(tender["id"]))
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(response.content_type, "application/json")
        tender = response.json["data"]
        self.assertEqual(tender["status"], self.primary_tender_status)

    def create_tender_draft_with_criteria(self):
        data = self.initial_data.copy()
        data["criteria"] = self.test_criteria_1
        data["status"] = "draft"
        response = self.app.post_json("/tenders", {"data": data})
        self.assertEqual(response.status, "201 Created")
        self.assertEqual(response.content_type, "application/json")
        tender = response.json["data"]
        tender_id = tender["id"]
        token = response.json["access"]["token"]
        self.assertEqual(
            set(e["description"] for e in tender["criteria"]),
            set(e["description"] for e in data["criteria"]),
        )

        # try updating criteria ids
        patch_criteria = deepcopy(tender["criteria"])
        for c in patch_criteria:
            c["id"] = uuid4().hex
        response = self.app.patch_json(
            f"/tenders/{tender_id}?acc_token={token}",
            {"data": {"criteria": patch_criteria}},
        )
        patch_result = response.json["data"]
        self.assertEqual(
            set(e["id"] for e in patch_result["criteria"]),
            set(e["id"] for e in patch_criteria),
        )

        # try adding new criteria
        patch_criteria = patch_criteria + deepcopy(patch_criteria)
        from openprocurement.tender.pricequotation.models.requirement import PQ_CRITERIA_ID_FROM
        if get_now() > PQ_CRITERIA_ID_FROM:
            response = self.app.patch_json(
                f"/tenders/{tender_id}?acc_token={token}",
                {"data": {"criteria": patch_criteria}},
                status=422,
            )
            self.assertEqual(
                response.json["errors"],
                [{"location": "body", "name": "criteria", "description": ["Criteria id should be uniq"]}],
            )

            # fix criteria ids
            for c in patch_criteria:
                c["id"] = uuid4().hex
            response = self.app.patch_json(
                f"/tenders/{tender_id}?acc_token={token}",
                {"data": {"criteria": patch_criteria}},
                status=422,
            )
            self.assertEqual(
                response.json["errors"],
                [{"location": "body", "name": "criteria",
                  "description": ["Requirement group id should be uniq in tender"]}],
            )

            # fix group ids
            for c in patch_criteria:
                for g in c["requirementGroups"]:
                    g["id"] = uuid4().hex
            response = self.app.patch_json(
                f"/tenders/{tender_id}?acc_token={token}",
                {"data": {"criteria": patch_criteria}},
                status=422,
            )
            self.assertEqual(
                response.json["errors"],
                [{"location": "body", "name": "criteria",
                  "description": ["Requirement id should be uniq for all requirements in tender"]}],
            )

            # fix requirement ids
            for c in patch_criteria:
                for g in c["requirementGroups"]:
                    for r in g["requirements"]:
                        r["id"] = uuid4().hex
            response = self.app.patch_json(
                f"/tenders/{tender_id}?acc_token={token}",
                {"data": {"criteria": patch_criteria}},
            )
            patch_result = response.json["data"]
            # old object ids haven't been changed
            self.assertEqual(len(patch_result["criteria"]), 4)
            self.assertEqual(
                [e["id"] for e in patch_result["criteria"]],
                [e["id"] for e in patch_criteria],
            )

    def create_tender_in_not_draft_status(self):
        data = self.initial_data.copy()
        forbidden_statuses = (
            "draft.unsuccessful",
            "active.tendering",
            "active.qualification",
            "active.awarded",
            "complete",
            "cancelled",
            "unsuccessful",
        )
        for forbidden_status in forbidden_statuses:
            data.update({"status": forbidden_status})
            response = self.app.post_json("/tenders", {"data": data})
            self.assertEqual(response.status, "201 Created")
            self.assertEqual(response.content_type, "application/json")
            tender = response.json["data"]
            token = response.json["access"]["token"]
            # the requested status is ignored: the tender is always created as a draft
            self.assertEqual(tender["status"], "draft")

    def tender_owner_can_change_in_draft(self):
        data = self.initial_data.copy()
        data.update({"status": "draft"})
        response = self.app.post_json("/tenders", {"data": data})
        self.assertEqual(response.status, "201 Created")
        self.assertEqual(response.content_type, "application/json")
        tender = response.json["data"]
        token = response.json["access"]["token"]
        self.assertEqual(tender["status"], "draft")

        general = {
            "numberOfBidders": 1,
            "tenderPeriod": {"endDate": (get_now() + timedelta(days=14)).isoformat()},
            "procuringEntity": {"name": "Національне управління справами"},
            "mainProcurementCategory": "services",
            "guarantee": {"amount": 50},
        }
        descriptions = {
            "description": "Some text 1",
            "description_en": "Some text 2",
            "description_ru": "Some text 3",
            "procurementMethodRationale": "Some text 4",
            "procurementMethodRationale_en": "Some text 5",
            "procurementMethodRationale_ru": "Some text 6",
            "submissionMethodDetails": "Some text 7",
            "submissionMethodDetails_en": "Some text 8",
            "submissionMethodDetails_ru": "Some text 9",
        }
        titles = {
            "title": "Test title 1",
            "title_en": "Test title 2",
            "title_ru": "Test title 3",
        }
        criterias = {
            "eligibilityCriteria": "Test criteria 1",
            "eligibilityCriteria_en": "Test criteria 2",
            "eligibilityCriteria_ru": "Test criteria 3",
            "awardCriteriaDetails": "Test criteria 4",
            "awardCriteriaDetails_en": "Test criteria 5",
            "awardCriteriaDetails_ru": "Test criteria 6",
        }
        buyer_id = uuid4().hex
        lists = {
            "buyers": [
                {
                    "id": buyer_id,
                    "name": "John Doe",
                    "identifier": {"scheme": "AE-DCCI", "id": "AE1"},
                }
            ],
            "funders": [
                {
                    "name": "First funder",
                    "identifier": {"scheme": "XM-DAC", "id": "44000"},
                    "address": {"countryName": "Японія"},
                    "contactPoint": {"name": "Funder name", "email": "fake_japan_email@gmail.net"},
                }
            ],
            "items": [
                {"description": "New description"}
            ],
        }
        status = {"status": "draft.publishing"}

        # general
        response = self.app.patch_json(
            "/tenders/{}?acc_token={}".format(tender["id"], token), {"data": general}
        )
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(response.content_type, "application/json")
        tender = response.json["data"]
        self.assertEqual(tender["numberOfBidders"], general["numberOfBidders"])
        self.assertNotEqual(tender["numberOfBidders"], data.get("numberOfBidders"))
        self.assertEqual(tender["mainProcurementCategory"], general["mainProcurementCategory"])
        self.assertNotEqual(tender["mainProcurementCategory"], data.get("mainProcurementCategory"))
        self.assertEqual(tender["tenderPeriod"]["endDate"], general["tenderPeriod"]["endDate"])
        self.assertNotEqual(tender["tenderPeriod"]["endDate"], data.get("tenderPeriod", {}).get("endDate"))
        self.assertEqual(tender["procuringEntity"]["name"], general["procuringEntity"]["name"])
        self.assertNotEqual(tender["procuringEntity"]["name"], data.get("procuringEntity", {}).get("name"))
        self.assertEqual(tender["guarantee"]["amount"], general["guarantee"]["amount"])
        self.assertNotEqual(tender["guarantee"]["amount"], data.get("guarantee", {}).get("amount"))

        # descriptions
        response = self.app.patch_json(
            "/tenders/{}?acc_token={}".format(tender["id"], token), {"data": descriptions}
        )
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(response.content_type, "application/json")
        tender = response.json["data"]
        self.assertEqual(tender["description"], descriptions["description"])
        self.assertNotEqual(tender["description"], data.get("description"))
        self.assertEqual(tender["description_en"], descriptions["description_en"])
        self.assertNotEqual(tender["description_en"], data.get("description_en"))
        self.assertEqual(tender["description_ru"], descriptions["description_ru"])
        self.assertNotEqual(tender["description_ru"], data.get("description_ru"))
        self.assertEqual(tender["procurementMethodRationale"], descriptions["procurementMethodRationale"])
        self.assertNotEqual(tender["procurementMethodRationale"], data.get("procurementMethodRationale"))
        self.assertEqual(tender["procurementMethodRationale_en"], descriptions["procurementMethodRationale_en"])
        self.assertNotEqual(tender["procurementMethodRationale_en"], data.get("procurementMethodRationale_en"))
        self.assertEqual(tender["procurementMethodRationale_ru"], descriptions["procurementMethodRationale_ru"])
self.assertNotEqual(tender["procurementMethodRationale_ru"], data.get("procurementMethodRationale_ru"))
self.assertEqual(tender["submissionMethodDetails"], descriptions["submissionMethodDetails"])
self.assertNotEqual(tender["submissionMethodDetails"], data.get("submissionMethodDetails"))
self.assertEqual(tender["submissionMethodDetails_en"], descriptions["submissionMethodDetails_en"])
self.assertNotEqual(tender["submissionMethodDetails_en"], data.get("submissionMethodDetails_en"))
self.assertEqual(tender["submissionMethodDetails_ru"], descriptions["submissionMethodDetails_ru"])
self.assertNotEqual(tender["submissionMethodDetails_ru"], data.get("submissionMethodDetails_ru"))
# titles
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender["id"], token), {"data": titles}
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
tender = response.json["data"]
self.assertEqual(tender["title"], titles["title"])
self.assertNotEqual(tender["title"], data.get("title"))
self.assertEqual(tender["title_en"], titles["title_en"])
self.assertNotEqual(tender["title_en"], data.get("title_en"))
self.assertEqual(tender["title_ru"], titles["title_ru"])
self.assertNotEqual(tender["title_ru"], data.get("title_ru"))
# criterias
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender["id"], token), {"data": criterias}
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
tender = response.json["data"]
self.assertEqual(tender["eligibilityCriteria"], criterias["eligibilityCriteria"])
self.assertNotEqual(tender["eligibilityCriteria"], data.get("eligibilityCriteria"))
self.assertEqual(tender["eligibilityCriteria_en"], criterias["eligibilityCriteria_en"])
self.assertNotEqual(tender["eligibilityCriteria_en"], data.get("eligibilityCriteria_en"))
self.assertEqual(tender["eligibilityCriteria_ru"], criterias["eligibilityCriteria_ru"])
self.assertNotEqual(tender["eligibilityCriteria_ru"], data.get("eligibilityCriteria_ru"))
self.assertEqual(tender["awardCriteriaDetails"], criterias["awardCriteriaDetails"])
self.assertNotEqual(tender["awardCriteriaDetails"], data.get("awardCriteriaDetails"))
self.assertEqual(tender["awardCriteriaDetails_en"], criterias["awardCriteriaDetails_en"])
self.assertNotEqual(tender["awardCriteriaDetails_en"], data.get("awardCriteriaDetails_en"))
self.assertEqual(tender["awardCriteriaDetails_ru"], criterias["awardCriteriaDetails_ru"])
self.assertNotEqual(tender["awardCriteriaDetails_ru"], data.get("awardCriteriaDetails_ru"))
# lists
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender["id"], token), {"data": lists}
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
tender = response.json["data"]
self.assertEqual(tender["funders"], lists["funders"])
buyer_id = tender["buyers"][0]["id"]  # pick up the buyer id as stored by the server before comparing
lists["buyers"][0]["id"] = buyer_id
self.assertEqual(tender["buyers"], lists["buyers"])
self.assertEqual(tender["items"][0]["description"], lists["items"][0]["description"])
# status
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender["id"], token), {"data": status},
status=422
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(
response.json["errors"],
[
{
"description": [{"relatedBuyer": ["This field is required."]}],
"location": "body",
"name": "items",
}
],
)
patch_data = {"items": [{"relatedBuyer": buyer_id}]}
patch_data.update(status)
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender["id"], token),
{
"data": patch_data
},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
tender = response.json["data"]
self.assertEqual(tender["status"], status["status"])
self.assertNotEqual(tender["status"], data["status"])
def tender_owner_cannot_change_in_draft(self):
data = self.initial_data.copy()
data.update({"status": "draft"})
response = self.app.post_json("/tenders", {"data": data})
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
tender = response.json["data"]
token = response.json["access"]["token"]
self.assertEqual(tender["status"], "draft")
general = {
"tenderID": "Some id",
"procurementMethodType": "belowThreshold",
"procurementMethod": "selective",
"submissionMethod": "written",
"mode": "test"
}
owner = {
"owner": "Test owner",
"transfer_token": "17bc682ec79245bca7d9cdbabbfce8f8",
"owner_token": "17bc682ec79245bca7d9cdbabbfce8f7"
}
time = {
"awardPeriod": {"endDate": (get_now() + timedelta(days=14)).isoformat()},
"date": (get_now() + timedelta(days=1)).isoformat(),
"dateModified": (get_now() + timedelta(days=1)).isoformat(),
}
lists = {
"revisions": [{"author": "Some author"}],
"plans": [{"id": uuid4().hex}],
"cancellations": [
{
"reason": "Some reason",
"reasonType": "noDemand"
}
],
}
# general
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender["id"], token), {"data": general}
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
tender = response.json["data"]
self.assertNotEqual(tender.get("tenderID"), general["tenderID"])
self.assertNotEqual(tender.get("procurementMethodType"), general["procurementMethodType"])
self.assertEqual(tender.get("procurementMethod"), general["procurementMethod"])  # "selective" is already the default, so the value matches
self.assertNotEqual(tender.get("submissionMethod"), general["submissionMethod"])
self.assertNotEqual(tender.get("mode"), general["mode"])
# owner
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender["id"], token), {"data": owner}
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
tender = response.json["data"]
self.assertNotEqual(tender.get("owner"), owner["owner"])
self.assertNotEqual(tender.get("transfer_token"), owner["transfer_token"])
self.assertNotEqual(tender.get("owner_token"), owner["owner_token"])
# time
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender["id"], token), {"data": time}
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
tender = response.json["data"]
self.assertNotEqual(tender.get("awardPeriod", {}).get("endDate"), time["awardPeriod"]["endDate"])
self.assertNotEqual(tender.get("date"), time["date"])
self.assertNotEqual(tender.get("dateModified"), time["dateModified"])
# lists
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender["id"], token), {"data": lists}
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
tender = response.json["data"]
self.assertEqual(tender.get("revisions", []), [])
self.assertEqual(tender.get("plans", []), [])
self.assertEqual(tender.get("cancellations", []), [])
def create_tender(self):
response = self.app.get("/tenders")
self.assertEqual(response.status, "200 OK")
self.assertEqual(len(response.json["data"]), 0)
response = self.app.post_json("/tenders", {"data": self.initial_data})
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
tender = response.json["data"]
response = self.app.get("/tenders/{}".format(tender["id"]))
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(set(response.json["data"]), set(tender))
self.assertEqual(response.json["data"], tender)
response = self.app.post_json("/tenders?opt_jsonp=callback", {"data": self.initial_data})
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/javascript")
self.assertIn('callback({"', response.body.decode())
response = self.app.post_json("/tenders?opt_pretty=1", {"data": self.initial_data})
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
self.assertIn('{\n "', response.body.decode())
response = self.app.post_json("/tenders", {"data": self.initial_data, "options": {"pretty": True}})
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
self.assertIn('{\n "', response.body.decode())
tender_data = deepcopy(self.initial_data)
tender_data["guarantee"] = {"amount": 100500, "currency": "USD"}
response = self.app.post_json("/tenders", {"data": tender_data})
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
data = response.json["data"]
self.assertIn("guarantee", data)
self.assertEqual(data["guarantee"]["amount"], 100500)
self.assertEqual(data["guarantee"]["currency"], "USD")
data = deepcopy(self.initial_data)
del data["items"][0]["deliveryAddress"]["postalCode"]
del data["items"][0]["deliveryAddress"]["locality"]
del data["items"][0]["deliveryAddress"]["streetAddress"]
del data["items"][0]["deliveryAddress"]["region"]
response = self.app.post_json("/tenders", {"data": data})
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
self.assertNotIn("postalCode", response.json["data"]["items"][0]["deliveryAddress"])
self.assertNotIn("locality", response.json["data"]["items"][0]["deliveryAddress"])
self.assertNotIn("streetAddress", response.json["data"]["items"][0]["deliveryAddress"])
self.assertNotIn("region", response.json["data"]["items"][0]["deliveryAddress"])
for kind in PQ_KINDS:
data = deepcopy(self.initial_data)
data['procuringEntity']['kind'] = kind
response = self.app.post_json("/tenders", {"data": data})
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(
response.json['data']['procuringEntity']['kind'],
kind
)
def tender_fields(self):
response = self.app.post_json("/tenders", {"data": self.initial_data})
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
tender = response.json["data"]
self.assertEqual(
set(tender) - set(self.initial_data),
set(
[
"id",
"dateModified",
"tenderID",
"date",
"status",
"awardCriteria",
"submissionMethod",
"owner",
]
),
)
self.assertIn(tender["id"], response.headers["Location"])
def patch_tender(self):
data = deepcopy(self.initial_data)  # deep copy: the nested fixture data must not be mutated between tests
data["procuringEntity"]["contactPoint"]["faxNumber"] = "+0440000000"
response = self.app.get("/tenders")
self.assertEqual(response.status, "200 OK")
self.assertEqual(len(response.json["data"]), 0)
response = self.app.post_json("/tenders", {"data": data})
self.assertEqual(response.status, "201 Created")
tender = response.json["data"]
owner_token = response.json["access"]["token"]
dateModified = tender.pop("dateModified")
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender["id"], owner_token),
{"data": {"milestones": test_milestones}},
status=422
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"], [{
"description": ["Milestones are not applicable to pricequotation"],
"location": "body",
"name": "milestones"
}],
)
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender["id"], owner_token),
{"data": {"procuringEntity": {"kind": "defense"}}},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["data"]["procuringEntity"]["kind"], "defense")
tender["procuringEntity"]['kind'] = "defense"
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender["id"], owner_token),
{"data": {"procuringEntity": {"contactPoint": {"faxNumber": None}}}},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
self.assertNotIn("faxNumber", response.json["data"]["procuringEntity"]["contactPoint"])
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender["id"], owner_token),
{"data": {"procuringEntity": {"contactPoint": {"faxNumber": "+0440000000"}}}},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
self.assertIn("startDate", response.json["data"]["tenderPeriod"])
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender["id"], owner_token), {"data": {"procurementMethodRationale": "Open"}}
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
new_tender = response.json["data"]
new_dateModified = new_tender.pop("dateModified")
tender["procurementMethodRationale"] = "Open"
self.assertEqual(tender, new_tender)
self.assertNotEqual(dateModified, new_dateModified)
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender["id"], owner_token), {"data": {"dateModified": new_dateModified}}
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
new_tender2 = response.json["data"]
new_dateModified2 = new_tender2.pop("dateModified")
self.assertEqual(new_tender, new_tender2)
self.assertEqual(new_dateModified, new_dateModified2)
revisions = self.db.get(tender["id"]).get("revisions")
self.assertEqual(revisions[-1]["changes"][0]["op"], "remove")
self.assertEqual(revisions[-1]["changes"][0]["path"], "/procurementMethodRationale")
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender["id"], owner_token), {"data": {"items": [data["items"][0]]}}
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender["id"], owner_token), {"data": {"items": [{}, data["items"][0]]}}
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
item0 = response.json["data"]["items"][0]
item1 = response.json["data"]["items"][1]
self.assertNotEqual(item0.pop("id"), item1.pop("id"))
self.assertEqual(item0, item1)
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender["id"], owner_token), {"data": {"items": [{}]}}
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(len(response.json["data"]["items"]), 1)
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender["id"], owner_token),
{
"data": {
"items": [
{
"classification": {
"scheme": "ДК021",
"id": "55523100-3",
"description": "Послуги з харчування у школах",
}
}
]
}
},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender["id"], owner_token),
{"data": {"guarantee": {"amount": 12, "valueAddedTaxIncluded": True}}},
status=422,
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(
response.json["errors"][0],
{"description": {"valueAddedTaxIncluded": "Rogue field"}, "location": "body", "name": "guarantee"},
)
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender["id"], owner_token), {"data": {"guarantee": {"amount": 12}}}
)
self.assertEqual(response.status, "200 OK")
self.assertIn("guarantee", response.json["data"])
self.assertEqual(response.json["data"]["guarantee"]["amount"], 12)
self.assertEqual(response.json["data"]["guarantee"]["currency"], "UAH")
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender["id"], owner_token), {"data": {"guarantee": {"currency": "USD"}}}
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.json["data"]["guarantee"]["currency"], "USD")
tender_data = self.db.get(tender["id"])
tender_data["status"] = "complete"
self.db.save(tender_data)
@mock.patch("openprocurement.tender.core.models.CANT_DELETE_PERIOD_START_DATE_FROM", get_now() - timedelta(days=1))
def required_field_deletion(self):
response = self.app.post_json("/tenders", {"data": self.initial_data})
self.assertEqual(response.status, "201 Created")
tender = response.json["data"]
token = response.json["access"]["token"]
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender["id"], token),
{"data": {"tenderPeriod": {"startDate": None}}},
status=422,
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[
{
"description": {"startDate": ["This field cannot be deleted"]},
"location": "body",
"name": "tenderPeriod",
}
],
)
def tender_Administrator_change(self):
self.create_tender()
self.set_status('active.tendering')
cancellation = deepcopy(test_cancellation)
cancellation.update({
"reasonType": "noDemand",
"status": "active",
})
response = self.app.post_json(
"/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
{"data": cancellation},
)
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
self.app.authorization = ("Basic", ("administrator", ""))
response = self.app.patch_json("/tenders/{}".format(self.tender_id), {"data": {"mode": "test"}})
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["data"]["mode"], "test")
def patch_tender_status(self):
cur_status = "draft.publishing"
patch_status = "cancelled"
self.create_tender()
self.set_status(cur_status)
data = {"data": {"status": patch_status}}
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(self.tender_id, self.tender_token), data, status=403
)
self.assertEqual(response.status, "403 Forbidden")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[
{
"location": "body",
"name": "data",
"description": f"You can't switch tender from status ({cur_status}) to ({patch_status})"
}
],
)
@mock.patch("openprocurement.tender.pricequotation.models.tender.PQ_MULTI_PROFILE_FROM", get_now() + timedelta(days=1))
def patch_tender_by_pq_bot_before_multiprofile(self):
response = self.app.post_json("/tenders", {"data": deepcopy(test_tender_data_before_multiprofile)})
self.assertEqual(response.status, "201 Created")
tender_id = response.json["data"]["id"]
owner_token = response.json["access"]["token"]
tender = response.json["data"]
self.assertEqual(tender["status"], "draft")
self.assertEqual(len(tender["items"]), 1)
self.assertNotIn("shortlistedFirms", tender)
self.assertNotIn("classification", tender["items"][0])
data = {"data": {"status": "draft.publishing", "profile": test_short_profile["id"]}}
response = self.app.patch_json("/tenders/{}?acc_token={}".format(tender_id, owner_token), data)
self.assertEqual(response.status, "200 OK")
tender = response.json["data"]
self.assertEqual(tender["status"], "draft.publishing")
self.assertEqual(tender["profile"], test_short_profile["id"])
items = deepcopy(tender["items"])
items[0]["classification"] = test_short_profile["classification"]
items[0]["unit"] = test_short_profile["unit"]
amount = sum(item["quantity"] for item in items) * test_short_profile["value"]["amount"]
value = deepcopy(test_short_profile["value"])
value["amount"] = amount
criteria = deepcopy(test_short_profile["criteria"])
data = {
"data": {
"status": "active.tendering",
"items": items,
"shortlistedFirms": test_shortlisted_firms,
"criteria": criteria,
"value": value
}
}
# try to switch the status as the tender owner (forbidden)
for patch in ({"data": {"status": "active.tendering"}}, data):
with change_auth(self.app, ("Basic", ("broker", ""))) as app:
resp = app.patch_json("/tenders/{}?acc_token={}".format(tender_id, owner_token), patch, status=403)
self.assertEqual(resp.status, "403 Forbidden")
self.assertEqual(resp.json["status"], "error")
self.assertEqual(resp.json["errors"], [
{
"description": "tender_owner can't switch tender from status (draft.publishing) to (active.tendering)",
"location": "body",
"name": "data",
}
])
# patch by the pricequotation bot
with change_auth(self.app, ("Basic", ("pricequotation", ""))) as app:
resp = app.patch_json("/tenders/{}".format(tender_id), data)
response = self.app.get("/tenders/{}".format(tender_id))
self.assertEqual(response.status, "200 OK")
tender = response.json["data"]
self.assertEqual(tender["status"], data["data"]["status"])
self.assertIn("classification", tender["items"][0])
self.assertIn("unit", tender["items"][0])
self.assertEqual(len(tender["shortlistedFirms"]), len(test_shortlisted_firms))
self.assertEqual(len(tender["criteria"]), len(test_short_profile["criteria"]))
self.assertEqual(tender["value"], value)
# switch tender to `draft.unsuccessful`
response = self.app.post_json("/tenders", {"data": deepcopy(test_tender_data_before_multiprofile)})
self.assertEqual(response.status, "201 Created")
tender_id = response.json["data"]["id"]
owner_token = response.json["access"]["token"]
tender = response.json["data"]
self.assertEqual(tender["status"], "draft")
self.assertEqual(len(tender["items"]), 1)
self.assertNotIn("shortlistedFirms", tender)
self.assertNotIn("classification", tender["items"][0])
data = {"data": {"status": "draft.publishing", "profile": "a1b2c3-a1b2c3e4-f1g2i3-h1g2k3l4"}}
response = self.app.patch_json("/tenders/{}?acc_token={}".format(tender_id, owner_token), data, status=422)
self.assertEqual(
response.json["errors"],
[{"location": "body", "name": "profile", "description": ["The profile value doesn't match id pattern"]}]
)
# set a non-existent profile id
data["data"]["profile"] = "123456-12345678-123456-12345678"
response = self.app.patch_json("/tenders/{}?acc_token={}".format(tender_id, owner_token), data)
self.assertEqual(response.status, "200 OK")
tender = response.json["data"]
self.assertEqual(tender["status"], "draft.publishing")
self.assertEqual(tender["profile"], "123456-12345678-123456-12345678")
with change_auth(self.app, ("Basic", ("pricequotation", ""))) as app:
app.patch_json(
"/tenders/{}".format(tender_id),
{"data": {"status": "draft.unsuccessful", "unsuccessfulReason": ["Profile not found in catalogue"]}}
)
response = self.app.get("/tenders/{}".format(tender_id))
self.assertEqual(response.status, "200 OK")
tender = response.json["data"]
self.assertEqual(tender["status"], "draft.unsuccessful")
self.assertEqual(tender["unsuccessfulReason"], ["Profile not found in catalogue"])
self.assertNotIn("classification", tender["items"][0])
self.assertNotIn("shortlistedFirms", tender)
@mock.patch("openprocurement.tender.pricequotation.models.tender.PQ_MULTI_PROFILE_FROM", get_now() - timedelta(days=1))
def patch_tender_by_pq_bot_after_multiprofile(self):
response = self.app.post_json("/tenders", {"data": deepcopy(test_tender_data_after_multiprofile)})
self.assertEqual(response.status, "201 Created")
tender_id = response.json["data"]["id"]
owner_token = response.json["access"]["token"]
tender = response.json["data"]
self.assertEqual(tender["status"], "draft")
self.assertEqual(len(tender["items"]), 1)
self.assertNotIn("shortlistedFirms", tender)
self.assertIn("classification", tender["items"][0])
self.assertIn("additionalClassifications", tender["items"][0])
test_agreement = {
"id": self.agreement_id,
}
data = {
"data": {
"status": "draft.publishing",
"agreement": test_agreement,
"criteria": test_short_profile["criteria"]
}
}
response = self.app.patch_json("/tenders/{}?acc_token={}".format(tender_id, owner_token), data)
self.assertEqual(response.status, "200 OK")
tender = response.json["data"]
self.assertEqual(tender["status"], "draft.publishing")
self.assertEqual(tender["agreement"], test_agreement)
expected_criteria = deepcopy(test_short_profile["criteria"])
for c in expected_criteria:
c.update(id=mock.ANY)
for g in c.get("requirementGroups"):
g.update(id=mock.ANY)
for r in g.get("requirements"):
r.update(id=mock.ANY)
self.assertEqual(tender["criteria"], expected_criteria)
amount = sum(item["quantity"] for item in tender["items"]) * test_short_profile["value"]["amount"]
value = deepcopy(test_short_profile["value"])
value["amount"] = amount
data = {
"data": {
"value": value,
"status": "active.tendering",
"shortlistedFirms": test_shortlisted_firms,
}
}
# try to switch the status as the tender owner (forbidden)
for patch in ({"data": {"status": "active.tendering"}}, data):
with change_auth(self.app, ("Basic", ("broker", ""))) as app:
resp = app.patch_json("/tenders/{}?acc_token={}".format(tender_id, owner_token), patch, status=403)
self.assertEqual(resp.status, "403 Forbidden")
self.assertEqual(resp.json["status"], "error")
self.assertEqual(resp.json["errors"], [
{
"description": "tender_owner can't switch tender from status (draft.publishing) to (active.tendering)",
"location": "body",
"name": "data",
}
])
# patch by the pricequotation bot
with change_auth(self.app, ("Basic", ("pricequotation", ""))) as app:
resp = app.patch_json("/tenders/{}".format(tender_id), data)
response = self.app.get("/tenders/{}".format(tender_id))
self.assertEqual(response.status, "200 OK")
tender = response.json["data"]
self.assertEqual(tender["status"], data["data"]["status"])
self.assertEqual(len(tender["shortlistedFirms"]), len(test_shortlisted_firms))
self.assertEqual(tender["value"], value)
# switch tender to `draft.unsuccessful`
response = self.app.post_json("/tenders", {"data": deepcopy(test_tender_data_after_multiprofile)})
self.assertEqual(response.status, "201 Created")
tender_id = response.json["data"]["id"]
owner_token = response.json["access"]["token"]
tender = response.json["data"]
self.assertEqual(tender["status"], "draft")
self.assertEqual(len(tender["items"]), 1)
self.assertNotIn("shortlistedFirms", tender)
data = {"data": {"status": "draft.publishing", "items": [{"profile": "a1b2c3-a1b2c3e4-f1g2i3-h1g2k3l4"}]}}
response = self.app.patch_json("/tenders/{}?acc_token={}".format(tender_id, owner_token), data, status=422)
self.assertEqual(
response.json["errors"],
[{"location": "body", "name": "items", "description": [{"profile": ["The profile value doesn't match id pattern"]}]}]
)
# set a non-existent profile id
data["data"]["items"][0]["profile"] = "123456-12345678-123456-12345678"
response = self.app.patch_json("/tenders/{}?acc_token={}".format(tender_id, owner_token), data)
self.assertEqual(response.status, "200 OK")
tender = response.json["data"]
self.assertEqual(tender["status"], "draft.publishing")
self.assertEqual(tender["items"][0]["profile"], "123456-12345678-123456-12345678")
with change_auth(self.app, ("Basic", ("pricequotation", ""))) as app:
app.patch_json(
"/tenders/{}".format(tender_id),
{"data": {"status": "draft.unsuccessful", "unsuccessfulReason": ["Profile not found in catalogue"]}}
)
response = self.app.get("/tenders/{}".format(tender_id))
self.assertEqual(response.status, "200 OK")
tender = response.json["data"]
self.assertEqual(tender["status"], "draft.unsuccessful")
self.assertEqual(tender["unsuccessfulReason"], ["Profile not found in catalogue"])
self.assertNotIn("shortlistedFirms", tender)
def invalid_tender_conditions(self):
# create tender
response = self.app.post_json("/tenders", {"data": self.initial_data})
tender_id = self.tender_id = response.json["data"]["id"]
owner_token = response.json["access"]["token"]
# switch to active.tendering
self.set_status("active.tendering")
# create a cancellation, upload its document and activate it
cancellation = deepcopy(test_cancellation)
cancellation.update({
"reason": "invalid conditions",
"reasonType": "noDemand",
})
response = self.app.post_json(
"/tenders/{}/cancellations?acc_token={}".format(tender_id, owner_token),
{"data": cancellation},
)
cancellation_id = response.json["data"]["id"]
response = self.app.post(
"/tenders/{}/cancellations/{}/documents?acc_token={}".format(
tender_id, cancellation_id, owner_token
),
upload_files=[("file", "name.doc", b"content")],
)
response = self.app.patch_json(
"/tenders/{}/cancellations/{}?acc_token={}".format(tender_id, cancellation_id, owner_token),
{"data": {"status": "active"}},
)
# check status
response = self.app.get("/tenders/{}".format(self.tender_id))
self.assertEqual(response.json["data"]["status"], "cancelled")
def one_valid_bid_tender(self):
tender_id = self.tender_id
owner_token = self.tender_token
# create bid
self.app.authorization = ("Basic", ("broker", ""))
resp = self.app.post_json(
"/tenders/{}/bids".format(tender_id), {"data": {
"tenderers": [test_organization],
"value": {"amount": 500},
"requirementResponses": test_requirement_response,
}}
)
token = resp.json['access']['token']
# switch to active.qualification
self.set_status("active.qualification")
response = self.check_chronograph()
# get awards
self.app.authorization = ("Basic", ("broker", ""))
response = self.app.get("/tenders/{}/awards?acc_token={}".format(tender_id, owner_token))
# get pending award
pending_award = [i for i in response.json["data"] if i["status"] == "pending"][0]
award_id = pending_award["id"]
award_date = pending_award["date"]
# set award as active
response = self.app.patch_json(
"/tenders/{}/awards/{}?acc_token={}".format(tender_id, award_id, token),
{"data": {"status": "active"}}
)
self.assertNotEqual(response.json["data"]["date"], award_date)
# get contract id
response = self.app.get("/tenders/{}".format(tender_id))
contract_id = response.json["data"]["contracts"][-1]["id"]
# after the standstill period
self.set_status("active.awarded", 'end')
# sign contract
self.app.authorization = ("Basic", ("broker", ""))
self.app.patch_json(
"/tenders/{}/contracts/{}?acc_token={}".format(tender_id, contract_id, owner_token),
{"data": {"status": "active", "value": {"valueAddedTaxIncluded": False}}},
)
# check status
self.app.authorization = ("Basic", ("broker", ""))
response = self.app.get("/tenders/{}".format(tender_id))
self.assertEqual(response.json["data"]["status"], "complete")
def one_invalid_bid_tender(self):
    tender_id = self.tender_id
    owner_token = self.tender_token
    # create bid
    self.app.authorization = ("Basic", ("broker", ""))
    bid, token = self.create_bid(
        self.tender_id,
        {
            "tenderers": [test_organization],
            "value": {"amount": 500},
            "requirementResponses": test_requirement_response,
        },
    )
    # switch to active.qualification
    self.set_status('active.tendering', 'end')
    resp = self.check_chronograph()
    self.assertEqual(resp.json['data']['status'], 'active.qualification')
    # get awards
    self.app.authorization = ("Basic", ("broker", ""))
    response = self.app.get("/tenders/{}/awards?acc_token={}".format(tender_id, owner_token))
    # get pending award
    award_id = [i["id"] for i in response.json["data"] if i["status"] == "pending"][0]
    # set award as unsuccessful
    self.app.patch_json(
        "/tenders/{}/awards/{}?acc_token={}".format(tender_id, award_id, token),
        {"data": {"status": "unsuccessful"}},
    )
    # check status
    self.app.authorization = ("Basic", ("broker", ""))
    response = self.app.get("/tenders/{}".format(tender_id))
    self.assertEqual(response.json["data"]["status"], "unsuccessful")
def first_bid_tender(self):
    tender_id = self.tender_id
    owner_token = self.tender_token
    # create bid
    bid, bid_token1 = self.create_bid(
        self.tender_id,
        {
            "tenderers": [test_organization],
            "value": {"amount": 450},
            "requirementResponses": test_requirement_response,
        },
    )
    bid_1 = bid["id"]
    # create second bid
    bid, bid_token2 = self.create_bid(
        self.tender_id,
        {
            "tenderers": [test_organization],
            "value": {"amount": 300},
            "requirementResponses": test_requirement_response,
        },
    )
    bid_2 = bid["id"]
    self.set_status('active.tendering', 'end')
    resp = self.check_chronograph()
    self.assertEqual(resp.json['data']['status'], 'active.qualification')
    # get awards
    self.app.authorization = ("Basic", ("broker", ""))
    response = self.app.get("/tenders/{}/awards?acc_token={}".format(tender_id, owner_token))
    # get pending award
    award = [i for i in response.json["data"] if i["status"] == "pending"][0]
    award_id = award['id']
    self.assertEqual(award['bid_id'], bid_2)
    self.assertEqual(award['value']['amount'], 300)
    # set award as unsuccessful
    self.app.patch_json(
        "/tenders/{}/awards/{}?acc_token={}".format(tender_id, award_id, bid_token2),
        {"data": {"status": "unsuccessful"}},
    )
    # get awards
    self.app.authorization = ("Basic", ("broker", ""))
    response = self.app.get("/tenders/{}/awards?acc_token={}".format(tender_id, owner_token))
    # get pending award
    award = [i for i in response.json["data"] if i["status"] == "pending"][0]
    award2_id = award['id']
    self.assertEqual(award['bid_id'], bid_1)
    self.assertEqual(award['value']['amount'], 450)
    self.assertNotEqual(award_id, award2_id)
    # get awards
    self.app.authorization = ("Basic", ("broker", ""))
    response = self.app.get("/tenders/{}/awards?acc_token={}".format(tender_id, owner_token))
    # get pending award
    award_id = [i["id"] for i in response.json["data"] if i["status"] == "pending"][0]
    # set award as active
    self.app.patch_json(
        "/tenders/{}/awards/{}?acc_token={}".format(tender_id, award_id, bid_token1),
        {"data": {"status": "active"}}
    )
    # get contract id
    response = self.app.get("/tenders/{}".format(tender_id))
    contract_id = response.json["data"]["contracts"][-1]["id"]
    # create tender contract document for test
    response = self.app.post(
        "/tenders/{}/contracts/{}/documents?acc_token={}".format(tender_id, contract_id, owner_token),
        upload_files=[("file", "name.doc", b"content")],
        status=201,
    )
    self.assertEqual(response.status, "201 Created")
    self.assertEqual(response.content_type, "application/json")
    doc_id = response.json["data"]["id"]
    self.assertIn(doc_id, response.headers["Location"])
    # sign contract
    self.app.authorization = ("Basic", ("broker", ""))
    self.app.patch_json(
        "/tenders/{}/contracts/{}?acc_token={}".format(tender_id, contract_id, owner_token),
        {"data": {"status": "active", "value": {"valueAddedTaxIncluded": False}}},
    )
    # check status
    self.app.authorization = ("Basic", ("broker", ""))
    response = self.app.get("/tenders/{}".format(tender_id))
    self.assertEqual(response.json["data"]["status"], "complete")
    response = self.app.post(
        "/tenders/{}/contracts/{}/documents?acc_token={}".format(tender_id, contract_id, owner_token),
        upload_files=[("file", "name.doc", b"content")],
        status=403,
    )
    self.assertEqual(response.status, "403 Forbidden")
    self.assertEqual(response.content_type, "application/json")
    self.assertEqual(
        response.json["errors"][0]["description"], "Can't add document in current (complete) tender status"
    )
    response = self.app.patch_json(
        "/tenders/{}/contracts/{}/documents/{}?acc_token={}".format(tender_id, contract_id, doc_id, owner_token),
        {"data": {"description": "document description"}},
        status=403,
    )
    self.assertEqual(response.status, "403 Forbidden")
    self.assertEqual(response.content_type, "application/json")
    self.assertEqual(
        response.json["errors"][0]["description"], "Can't update document in current (complete) tender status"
    )
    response = self.app.put(
        "/tenders/{}/contracts/{}/documents/{}?acc_token={}".format(tender_id, contract_id, doc_id, owner_token),
        upload_files=[("file", "name.doc", b"content3")],
        status=403,
    )
    self.assertEqual(response.status, "403 Forbidden")
    self.assertEqual(response.content_type, "application/json")
    self.assertEqual(
        response.json["errors"][0]["description"], "Can't update document in current (complete) tender status"
    )
def lost_contract_for_active_award(self):
    tender_id = self.tender_id
    owner_token = self.tender_token
    # create bid
    self.app.authorization = ("Basic", ("broker", ""))
    bid, token = self.create_bid(
        self.tender_id,
        {
            "tenderers": [test_organization],
            "value": {"amount": 500},
            "requirementResponses": test_requirement_response,
        },
    )
    # switch to active.qualification
    self.set_status("active.tendering", 'end')
    resp = self.check_chronograph().json
    self.assertEqual(resp['data']['status'], 'active.qualification')
    # get awards
    self.app.authorization = ("Basic", ("broker", ""))
    response = self.app.get("/tenders/{}/awards?acc_token={}".format(tender_id, owner_token))
    # get pending award
    award_id = [i["id"] for i in response.json["data"] if i["status"] == "pending"][0]
    # set award as active
    self.app.patch_json(
        "/tenders/{}/awards/{}?acc_token={}".format(tender_id, award_id, token), {"data": {"status": "active"}}
    )
    # lost contract
    tender = self.db.get(tender_id)
    del tender["contracts"]
    self.db.save(tender)
    # check tender
    response = self.app.get("/tenders/{}".format(tender_id))
    self.assertEqual(response.json["data"]["status"], "active.awarded")
    self.assertNotIn("contracts", response.json["data"])
    self.assertIn("next_check", response.json["data"])
    # create lost contract
    response = self.check_chronograph()
    self.assertEqual(response.json["data"]["status"], "active.awarded")
    self.assertIn("contracts", response.json["data"])
    self.assertNotIn("next_check", response.json["data"])
    contract_id = response.json["data"]["contracts"][-1]["id"]
    # sign contract
    self.app.authorization = ("Basic", ("broker", ""))
    self.app.patch_json(
        "/tenders/{}/contracts/{}?acc_token={}".format(tender_id, contract_id, owner_token),
        {"data": {"status": "active", "value": {"valueAddedTaxIncluded": False}}},
    )
    # check status
    self.app.authorization = ("Basic", ("broker", ""))
    response = self.app.get("/tenders/{}".format(tender_id))
    self.assertEqual(response.json["data"]["status"], "complete")
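All four flows above locate the "pending" award by indexing a list comprehension (`[...][0]`), which raises a bare `IndexError` when no award is pending. A small helper — hypothetical, not part of this test suite — expresses the same lookup with a clearer failure mode:

```python
def first_pending_award(awards):
    """Return the first award dict whose status is 'pending'.

    `awards` is the list returned by GET /tenders/{id}/awards.
    """
    for award in awards:
        if award["status"] == "pending":
            return award
    raise LookupError("no award in 'pending' status")
```

With it, `award_id = first_pending_award(response.json["data"])["id"]` would replace the repeated comprehension.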
| 43.034791 | 164 | 0.637771 | 9,378 | 86,586 | 5.774366 | 0.050117 | 0.120494 | 0.123597 | 0.066941 | 0.773914 | 0.745679 | 0.720417 | 0.704997 | 0.681951 | 0.669726 | 0 | 0.015493 | 0.191186 | 86,586 | 2,011 | 165 | 43.056191 | 0.757754 | 0.017151 | 0 | 0.576587 | 0 | 0.001165 | 0.267533 | 0.059253 | 0 | 0 | 0 | 0 | 0.311008 | 1 | 0.013978 | false | 0 | 0.006407 | 0 | 0.020384 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
1c8d67def6c3a9e7b0b9bfe27fd07a97f6e9f14f | 90 | py | Python | snacks/admin.py | shahd1995913/djangox | 8e3c11bd893a96b686d0c18a324c77f8e7dd876d | [
"MIT"
] | null | null | null | snacks/admin.py | shahd1995913/djangox | 8e3c11bd893a96b686d0c18a324c77f8e7dd876d | [
"MIT"
] | 1 | 2021-11-17T19:51:48.000Z | 2021-11-17T19:51:48.000Z | snacks/admin.py | shahd1995913/djangox | 8e3c11bd893a96b686d0c18a324c77f8e7dd876d | [
"MIT"
] | null | null | null | from .models import Snack
from django.contrib import admin
admin.site.register(Snack)
| 11.25 | 32 | 0.788889 | 13 | 90 | 5.461538 | 0.692308 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.144444 | 90 | 7 | 33 | 12.857143 | 0.922078 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
98e0b570b572a018216e20abe4b392f286b1ca89 | 225 | py | Python | UnityEngine/Events/UnityAction/__init__.py | Grim-es/udon-pie-auto-completion | c2cd86554ed615cdbbb01e19fa40665eafdfaedc | [
"MIT"
] | null | null | null | UnityEngine/Events/UnityAction/__init__.py | Grim-es/udon-pie-auto-completion | c2cd86554ed615cdbbb01e19fa40665eafdfaedc | [
"MIT"
] | null | null | null | UnityEngine/Events/UnityAction/__init__.py | Grim-es/udon-pie-auto-completion | c2cd86554ed615cdbbb01e19fa40665eafdfaedc | [
"MIT"
] | null | null | null | from UdonPie import UnityEngine
from UdonPie.Undefined import *
class UnityAction:
def __new__(cls, arg1=None):
'''
:returns: UnityAction
:rtype: UnityEngine.UnityAction
'''
pass
| 18.75 | 39 | 0.622222 | 21 | 225 | 6.47619 | 0.714286 | 0.161765 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006289 | 0.293333 | 225 | 11 | 40 | 20.454545 | 0.849057 | 0.235556 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0.2 | 0.4 | 0 | 0.8 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 4 |
c70cff7261b038a056770c5ccb5d404398917130 | 651 | py | Python | src/yupdates/yapi.py | yupdates/yupdates-sdk-py | d767b1dd884a6597c3310dc409d10a10d435fe84 | [
"MIT"
] | 1 | 2022-01-18T15:55:08.000Z | 2022-01-18T15:55:08.000Z | src/yupdates/yapi.py | yupdates/yupdates-sdk-py | d767b1dd884a6597c3310dc409d10a10d435fe84 | [
"MIT"
] | null | null | null | src/yupdates/yapi.py | yupdates/yupdates-sdk-py | d767b1dd884a6597c3310dc409d10a10d435fe84 | [
"MIT"
] | null | null | null | """
This module is a convenience wrapper around a default instance of `YupdatesClient` for quick
scripts and easy calls from a Python interpreter.
This will always maintain parity with the `YupdatesClient` methods, so you can build programs
on top of this module or store one `YupdatesClient` instance in a variable and use the instance
methods on that.
See the `YupdatesClient` class and method documentation for a particular call's usage.
Example:
    from yupdates import yapi
    yapi.ping()
"""
from .client import yupdates_client


def ping():
    return yupdates_client().ping()


def ping_bool():
    return yupdates_client().ping_bool()
| 25.038462 | 95 | 0.764977 | 95 | 651 | 5.189474 | 0.610526 | 0.085193 | 0.081136 | 0.097363 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173579 | 651 | 25 | 96 | 26.04 | 0.916357 | 0.760369 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | true | 0 | 0.2 | 0.4 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 4 |
c71bc9fc2c03ee4e7d29d52bf63758fd81a3725f | 239 | py | Python | src/openpersonen/api/serializers/verblijfs_titel_historie.py | maykinmedia/open-personen | ddcf083ccd4eb864c5305bcd8bc75c6c64108272 | [
"RSA-MD"
] | 2 | 2020-08-26T11:24:43.000Z | 2021-07-28T09:46:40.000Z | src/openpersonen/api/serializers/verblijfs_titel_historie.py | maykinmedia/open-personen | ddcf083ccd4eb864c5305bcd8bc75c6c64108272 | [
"RSA-MD"
] | 153 | 2020-08-26T10:45:35.000Z | 2021-12-10T17:33:16.000Z | src/openpersonen/api/serializers/verblijfs_titel_historie.py | maykinmedia/open-personen | ddcf083ccd4eb864c5305bcd8bc75c6c64108272 | [
"RSA-MD"
] | null | null | null | from rest_framework import serializers
from .verblijfs_titel import VerblijfsTitelSerializer
class VerblijfsTitelHistorieSerializer(VerblijfsTitelSerializer):
geheimhoudingPersoonsgegevens = serializers.BooleanField(required=False)
| 29.875 | 76 | 0.878661 | 18 | 239 | 11.555556 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.083682 | 239 | 7 | 77 | 34.142857 | 0.949772 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
c71fc9883e3a40901f8962899a28612bfb974a3d | 143 | py | Python | src/rotest/core/result/monitor/__init__.py | gregoil/rotest | c443bc1b99e02f047adfcab9943966f0023f652c | [
"MIT"
] | 26 | 2017-06-11T18:21:17.000Z | 2021-02-21T20:36:30.000Z | src/rotest/core/result/monitor/__init__.py | gregoil/rotest | c443bc1b99e02f047adfcab9943966f0023f652c | [
"MIT"
] | 143 | 2017-06-29T11:18:35.000Z | 2021-06-10T17:23:46.000Z | src/rotest/core/result/monitor/__init__.py | gregoil/rotest | c443bc1b99e02f047adfcab9943966f0023f652c | [
"MIT"
] | 11 | 2017-06-12T09:16:14.000Z | 2021-07-11T23:20:59.000Z | from .monitor import (AbstractMonitor, AbstractResourceMonitor, require_attr,
skip_if_block, skip_if_case, skip_if_flow)
| 47.666667 | 77 | 0.727273 | 16 | 143 | 6.0625 | 0.75 | 0.185567 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.216783 | 143 | 2 | 78 | 71.5 | 0.866071 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
c727a0d5c7f074a8a0d47a2af39a4980e9cbd89e | 92 | py | Python | Python Part 1 Exercises/number_7.py | ryanzhao2/grade11cs | 173b9f50db49368ea2042f6803d6674dd9f185cd | [
"Apache-2.0"
] | null | null | null | Python Part 1 Exercises/number_7.py | ryanzhao2/grade11cs | 173b9f50db49368ea2042f6803d6674dd9f185cd | [
"Apache-2.0"
] | null | null | null | Python Part 1 Exercises/number_7.py | ryanzhao2/grade11cs | 173b9f50db49368ea2042f6803d6674dd9f185cd | [
"Apache-2.0"
] | null | null | null | def clear_screen():
for i in range(25):
print('\n')
clear_screen()
| 9.2 | 23 | 0.5 | 12 | 92 | 3.666667 | 0.833333 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.033898 | 0.358696 | 92 | 9 | 24 | 10.222222 | 0.711864 | 0 | 0 | 0 | 0 | 0 | 0.022727 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0 | 0.25 | 0.25 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
c72bc3cdaffbf24886ad017ea36ace674ce49859 | 139 | py | Python | setup.py | yunqingqing/django_practice | 95ac5f2aee2fd1f736cec795c05489a23548abac | [
"Apache-2.0"
] | null | null | null | setup.py | yunqingqing/django_practice | 95ac5f2aee2fd1f736cec795c05489a23548abac | [
"Apache-2.0"
] | 7 | 2020-06-05T17:11:56.000Z | 2021-09-22T17:43:59.000Z | setup.py | yunqingqing/django_practice | 95ac5f2aee2fd1f736cec795c05489a23548abac | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
import setuptools
setuptools.setup(name="django_practice",
                 packages=setuptools.find_packages())
| 19.857143 | 53 | 0.654676 | 14 | 139 | 6.357143 | 0.785714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009009 | 0.201439 | 139 | 6 | 54 | 23.166667 | 0.792793 | 0.151079 | 0 | 0 | 0 | 0 | 0.12931 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
c72f675faf49b71e48566b3b0478548010fcb099 | 10,991 | py | Python | dynaban/tests/code.py | laukik-hase/imitation_of_human_arm_on_robotic_manipulator | 995beb1ab41597ca6cbecd0baecdef1ef13450f9 | [
"MIT"
] | 3 | 2021-11-13T16:54:31.000Z | 2021-11-13T20:50:18.000Z | dynaban/tests/code.py | laukik-hase/human_arm_imitation | 995beb1ab41597ca6cbecd0baecdef1ef13450f9 | [
"MIT"
] | null | null | null | dynaban/tests/code.py | laukik-hase/human_arm_imitation | 995beb1ab41597ca6cbecd0baecdef1ef13450f9 | [
"MIT"
] | null | null | null | # This file was automatically generated by SWIG (http://www.swig.org).
# Version 3.0.12
#
# Do not make changes to this file unless you know what you are doing--modify
# the SWIG interface file instead.
from sys import version_info as _swig_python_version_info
if _swig_python_version_info >= (2, 7, 0):
    def swig_import_helper():
        import importlib
        pkg = __name__.rpartition('.')[0]
        mname = '.'.join((pkg, '_code')).lstrip('.')
        try:
            return importlib.import_module(mname)
        except ImportError:
            return importlib.import_module('_code')
    _code = swig_import_helper()
    del swig_import_helper
elif _swig_python_version_info >= (2, 6, 0):
    def swig_import_helper():
        from os.path import dirname
        import imp
        fp = None
        try:
            fp, pathname, description = imp.find_module('_code', [dirname(__file__)])
        except ImportError:
            import _code
            return _code
        try:
            _mod = imp.load_module('_code', fp, pathname, description)
        finally:
            if fp is not None:
                fp.close()
        return _mod
    _code = swig_import_helper()
    del swig_import_helper
else:
    import _code
del _swig_python_version_info
try:
    _swig_property = property
except NameError:
    pass  # Python < 2.2 doesn't have 'property'.

try:
    import builtins as __builtin__
except ImportError:
    import __builtin__
def _swig_setattr_nondynamic(self, class_type, name, value, static=1):
    if (name == "thisown"):
        return self.this.own(value)
    if (name == "this"):
        if type(value).__name__ == 'SwigPyObject':
            self.__dict__[name] = value
            return
    method = class_type.__swig_setmethods__.get(name, None)
    if method:
        return method(self, value)
    if (not static):
        if _newclass:
            object.__setattr__(self, name, value)
        else:
            self.__dict__[name] = value
    else:
        raise AttributeError("You cannot add attributes to %s" % self)


def _swig_setattr(self, class_type, name, value):
    return _swig_setattr_nondynamic(self, class_type, name, value, 0)


def _swig_getattr(self, class_type, name):
    if (name == "thisown"):
        return self.this.own()
    method = class_type.__swig_getmethods__.get(name, None)
    if method:
        return method(self)
    raise AttributeError("'%s' object has no attribute '%s'" % (class_type.__name__, name))


def _swig_repr(self):
    try:
        strthis = "proxy of " + self.this.__repr__()
    except __builtin__.Exception:
        strthis = ""
    return "<%s.%s; %s >" % (self.__class__.__module__, self.__class__.__name__, strthis,)


try:
    _object = object
    _newclass = 1
except __builtin__.Exception:
    class _object:
        pass
    _newclass = 0
class SwigPyIterator(_object):
    __swig_setmethods__ = {}
    __setattr__ = lambda self, name, value: _swig_setattr(self, SwigPyIterator, name, value)
    __swig_getmethods__ = {}
    __getattr__ = lambda self, name: _swig_getattr(self, SwigPyIterator, name)

    def __init__(self, *args, **kwargs):
        raise AttributeError("No constructor defined - class is abstract")
    __repr__ = _swig_repr
    __swig_destroy__ = _code.delete_SwigPyIterator
    __del__ = lambda self: None

    def value(self):
        return _code.SwigPyIterator_value(self)

    def incr(self, n=1):
        return _code.SwigPyIterator_incr(self, n)

    def decr(self, n=1):
        return _code.SwigPyIterator_decr(self, n)

    def distance(self, x):
        return _code.SwigPyIterator_distance(self, x)

    def equal(self, x):
        return _code.SwigPyIterator_equal(self, x)

    def copy(self):
        return _code.SwigPyIterator_copy(self)

    def next(self):
        return _code.SwigPyIterator_next(self)

    def __next__(self):
        return _code.SwigPyIterator___next__(self)

    def previous(self):
        return _code.SwigPyIterator_previous(self)

    def advance(self, n):
        return _code.SwigPyIterator_advance(self, n)

    def __eq__(self, x):
        return _code.SwigPyIterator___eq__(self, x)

    def __ne__(self, x):
        return _code.SwigPyIterator___ne__(self, x)

    def __iadd__(self, n):
        return _code.SwigPyIterator___iadd__(self, n)

    def __isub__(self, n):
        return _code.SwigPyIterator___isub__(self, n)

    def __add__(self, n):
        return _code.SwigPyIterator___add__(self, n)

    def __sub__(self, *args):
        return _code.SwigPyIterator___sub__(self, *args)

    def __iter__(self):
        return self
SwigPyIterator_swigregister = _code.SwigPyIterator_swigregister
SwigPyIterator_swigregister(SwigPyIterator)
class VecDouble(_object):
    __swig_setmethods__ = {}
    __setattr__ = lambda self, name, value: _swig_setattr(self, VecDouble, name, value)
    __swig_getmethods__ = {}
    __getattr__ = lambda self, name: _swig_getattr(self, VecDouble, name)
    __repr__ = _swig_repr

    def iterator(self):
        return _code.VecDouble_iterator(self)

    def __iter__(self):
        return self.iterator()

    def __nonzero__(self):
        return _code.VecDouble___nonzero__(self)

    def __bool__(self):
        return _code.VecDouble___bool__(self)

    def __len__(self):
        return _code.VecDouble___len__(self)

    def __getslice__(self, i, j):
        return _code.VecDouble___getslice__(self, i, j)

    def __setslice__(self, *args):
        return _code.VecDouble___setslice__(self, *args)

    def __delslice__(self, i, j):
        return _code.VecDouble___delslice__(self, i, j)

    def __delitem__(self, *args):
        return _code.VecDouble___delitem__(self, *args)

    def __getitem__(self, *args):
        return _code.VecDouble___getitem__(self, *args)

    def __setitem__(self, *args):
        return _code.VecDouble___setitem__(self, *args)

    def pop(self):
        return _code.VecDouble_pop(self)

    def append(self, x):
        return _code.VecDouble_append(self, x)

    def empty(self):
        return _code.VecDouble_empty(self)

    def size(self):
        return _code.VecDouble_size(self)

    def swap(self, v):
        return _code.VecDouble_swap(self, v)

    def begin(self):
        return _code.VecDouble_begin(self)

    def end(self):
        return _code.VecDouble_end(self)

    def rbegin(self):
        return _code.VecDouble_rbegin(self)

    def rend(self):
        return _code.VecDouble_rend(self)

    def clear(self):
        return _code.VecDouble_clear(self)

    def get_allocator(self):
        return _code.VecDouble_get_allocator(self)

    def pop_back(self):
        return _code.VecDouble_pop_back(self)

    def erase(self, *args):
        return _code.VecDouble_erase(self, *args)

    def __init__(self, *args):
        this = _code.new_VecDouble(*args)
        try:
            self.this.append(this)
        except __builtin__.Exception:
            self.this = this

    def push_back(self, x):
        return _code.VecDouble_push_back(self, x)

    def front(self):
        return _code.VecDouble_front(self)

    def back(self):
        return _code.VecDouble_back(self)

    def assign(self, n, x):
        return _code.VecDouble_assign(self, n, x)

    def resize(self, *args):
        return _code.VecDouble_resize(self, *args)

    def insert(self, *args):
        return _code.VecDouble_insert(self, *args)

    def reserve(self, n):
        return _code.VecDouble_reserve(self, n)

    def capacity(self):
        return _code.VecDouble_capacity(self)
    __swig_destroy__ = _code.delete_VecDouble
    __del__ = lambda self: None
VecDouble_swigregister = _code.VecDouble_swigregister
VecDouble_swigregister(VecDouble)
class VecVecdouble(_object):
    __swig_setmethods__ = {}
    __setattr__ = lambda self, name, value: _swig_setattr(self, VecVecdouble, name, value)
    __swig_getmethods__ = {}
    __getattr__ = lambda self, name: _swig_getattr(self, VecVecdouble, name)
    __repr__ = _swig_repr

    def iterator(self):
        return _code.VecVecdouble_iterator(self)

    def __iter__(self):
        return self.iterator()

    def __nonzero__(self):
        return _code.VecVecdouble___nonzero__(self)

    def __bool__(self):
        return _code.VecVecdouble___bool__(self)

    def __len__(self):
        return _code.VecVecdouble___len__(self)

    def __getslice__(self, i, j):
        return _code.VecVecdouble___getslice__(self, i, j)

    def __setslice__(self, *args):
        return _code.VecVecdouble___setslice__(self, *args)

    def __delslice__(self, i, j):
        return _code.VecVecdouble___delslice__(self, i, j)

    def __delitem__(self, *args):
        return _code.VecVecdouble___delitem__(self, *args)

    def __getitem__(self, *args):
        return _code.VecVecdouble___getitem__(self, *args)

    def __setitem__(self, *args):
        return _code.VecVecdouble___setitem__(self, *args)

    def pop(self):
        return _code.VecVecdouble_pop(self)

    def append(self, x):
        return _code.VecVecdouble_append(self, x)

    def empty(self):
        return _code.VecVecdouble_empty(self)

    def size(self):
        return _code.VecVecdouble_size(self)

    def swap(self, v):
        return _code.VecVecdouble_swap(self, v)

    def begin(self):
        return _code.VecVecdouble_begin(self)

    def end(self):
        return _code.VecVecdouble_end(self)

    def rbegin(self):
        return _code.VecVecdouble_rbegin(self)

    def rend(self):
        return _code.VecVecdouble_rend(self)

    def clear(self):
        return _code.VecVecdouble_clear(self)

    def get_allocator(self):
        return _code.VecVecdouble_get_allocator(self)

    def pop_back(self):
        return _code.VecVecdouble_pop_back(self)

    def erase(self, *args):
        return _code.VecVecdouble_erase(self, *args)

    def __init__(self, *args):
        this = _code.new_VecVecdouble(*args)
        try:
            self.this.append(this)
        except __builtin__.Exception:
            self.this = this

    def push_back(self, x):
        return _code.VecVecdouble_push_back(self, x)

    def front(self):
        return _code.VecVecdouble_front(self)

    def back(self):
        return _code.VecVecdouble_back(self)

    def assign(self, n, x):
        return _code.VecVecdouble_assign(self, n, x)

    def resize(self, *args):
        return _code.VecVecdouble_resize(self, *args)

    def insert(self, *args):
        return _code.VecVecdouble_insert(self, *args)

    def reserve(self, n):
        return _code.VecVecdouble_reserve(self, n)

    def capacity(self):
        return _code.VecVecdouble_capacity(self)
    __swig_destroy__ = _code.delete_VecVecdouble
    __del__ = lambda self: None
VecVecdouble_swigregister = _code.VecVecdouble_swigregister
VecVecdouble_swigregister(VecVecdouble)
def average(i_matrix):
    return _code.average(i_matrix)
average = _code.average
# This file is compatible with both classic and new-style classes.
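Beyond the STL vector wrappers, the generated module exposes a single free function, `average`, taking a `VecVecdouble` (a wrapped `std::vector<std::vector<double>>`). The real computation lives in the compiled `_code` extension; assuming it means the mean over all elements of the matrix (an assumption, not confirmed by the wrapper), a pure-Python sketch of the same interface would be:

```python
def average(i_matrix):
    """Mean over every element of a matrix given as nested sequences.

    Hypothetical pure-Python stand-in for the compiled _code.average.
    """
    values = [x for row in i_matrix for x in row]
    return sum(values) / len(values)
```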
| 27.825316 | 92 | 0.666636 | 1,313 | 10,991 | 5.059406 | 0.140137 | 0.120428 | 0.082192 | 0.058859 | 0.596116 | 0.50414 | 0.47885 | 0.397561 | 0.291585 | 0.175373 | 0 | 0.002266 | 0.237012 | 10,991 | 394 | 93 | 27.895939 | 0.789888 | 0.02684 | 0 | 0.415225 | 1 | 0 | 0.016844 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.314879 | false | 0.00692 | 0.065744 | 0.287197 | 0.788927 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 4 |
c773b8a8855913f3b14d1f29870250ef036181cf | 1,065 | py | Python | pya/backend/base.py | thomas-hermann/pya | 7dccda7e40c39c09f53ef97eed4b10790faff0ad | [
"MIT"
] | 23 | 2019-10-18T07:48:44.000Z | 2022-01-24T06:51:17.000Z | pya/backend/base.py | dreinsch/pya | 7dccda7e40c39c09f53ef97eed4b10790faff0ad | [
"MIT"
] | 35 | 2019-09-16T18:56:23.000Z | 2021-04-03T09:53:09.000Z | pya/backend/base.py | dreinsch/pya | 7dccda7e40c39c09f53ef97eed4b10790faff0ad | [
"MIT"
] | 3 | 2020-07-07T19:27:19.000Z | 2021-06-20T15:17:53.000Z | from abc import abstractmethod, ABC
class BackendBase(ABC):
    @abstractmethod
    def get_device_count(self):
        raise NotImplementedError

    @abstractmethod
    def get_device_info_by_index(self):
        raise NotImplementedError

    @abstractmethod
    def get_default_output_device_info(self):
        raise NotImplementedError

    @abstractmethod
    def get_default_input_device_info(self):
        raise NotImplementedError

    @abstractmethod
    def open(self, *args, **kwargs):
        raise NotImplementedError

    @abstractmethod
    def terminate(self):
        raise NotImplementedError

    @abstractmethod
    def process_buffer(self, *args, **kwargs):
        raise NotImplementedError


class StreamBase(ABC):
    @abstractmethod
    def is_active(self):
        raise NotImplementedError

    @abstractmethod
    def start_stream(self):
        raise NotImplementedError

    @abstractmethod
    def stop_stream(self):
        raise NotImplementedError

    @abstractmethod
    def close(self):
        raise NotImplementedError
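Because every method of `BackendBase` and `StreamBase` is decorated with `@abstractmethod`, `abc` refuses to instantiate any subclass until all hooks are overridden; the `raise NotImplementedError` bodies are only a second line of defense. A standalone illustration of the pattern (using hypothetical class names, not the pya classes themselves):

```python
from abc import ABC, abstractmethod


class Backend(ABC):
    @abstractmethod
    def get_device_count(self):
        raise NotImplementedError


class NullBackend(Backend):
    # Overriding the abstract hook makes the subclass instantiable.
    def get_device_count(self):
        return 0
```

Instantiating `Backend()` directly raises `TypeError` at construction time, long before any method is called.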
| 20.480769 | 46 | 0.699531 | 99 | 1,065 | 7.343434 | 0.323232 | 0.257221 | 0.34663 | 0.507565 | 0.675378 | 0.447043 | 0.240715 | 0 | 0 | 0 | 0 | 0 | 0.243192 | 1,065 | 51 | 47 | 20.882353 | 0.901985 | 0 | 0 | 0.611111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.305556 | false | 0 | 0.027778 | 0 | 0.388889 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
c7938372f573665a1ea1d911a4aad413a4cc429e | 9 | py | Python | hdp/process/img/recognition.py | Open-Health-Data-Project/ohdp | 9b7e339ecf09cb9319ee4d507327701619b70317 | [
"BSD-3-Clause"
] | 10 | 2020-05-06T17:59:37.000Z | 2021-02-15T23:11:05.000Z | hdp/process/img/recognition.py | Open-Health-Data-Project/ohdp | 9b7e339ecf09cb9319ee4d507327701619b70317 | [
"BSD-3-Clause"
] | 2 | 2020-07-02T16:36:23.000Z | 2020-07-14T21:51:07.000Z | hdp/process/img/recognition.py | Open-Health-Data-Project/ohdp | 9b7e339ecf09cb9319ee4d507327701619b70317 | [
"BSD-3-Clause"
] | 11 | 2020-05-08T18:49:55.000Z | 2020-06-29T17:33:10.000Z | # Team 4
| 4.5 | 8 | 0.555556 | 2 | 9 | 2.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 0.333333 | 9 | 1 | 9 | 9 | 0.666667 | 0.666667 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
c79f42ec44757bad85071709751e9c50e18cdf07 | 411 | py | Python | BiliReminder/Apps/User/forms.py | HoBeedzc/BiliReminder | bf9f2bde7e41cb3c0816b39159d5a916b01609fa | [
"MIT"
] | 2 | 2021-02-22T15:01:36.000Z | 2021-12-17T03:15:02.000Z | BiliReminder/Apps/User/forms.py | HoBeedzc/BiliReminder | bf9f2bde7e41cb3c0816b39159d5a916b01609fa | [
"MIT"
] | null | null | null | BiliReminder/Apps/User/forms.py | HoBeedzc/BiliReminder | bf9f2bde7e41cb3c0816b39159d5a916b01609fa | [
"MIT"
] | null | null | null | from django import forms
class LoginForm(forms.Form):
    uid = forms.CharField(required=True)
    pwd = forms.CharField(required=True)


class SignupForm(forms.Form):
    email = forms.CharField(required=True)
    name = forms.CharField(required=True)
    pwd = forms.CharField(required=True)
    pwd2 = forms.CharField(required=True)


class LogoffForm(forms.Form):
    pwd = forms.CharField(required=True)
c7a1bac432af2c592332d43f6838cc948070bf52 | 159 | py | Python | run.py | timkpaine/live-pnl-test | 04898a9552c3858d1960d602722f5e23c75037a7 | [
"Apache-2.0"
] | null | null | null | run.py | timkpaine/live-pnl-test | 04898a9552c3858d1960d602722f5e23c75037a7 | [
"Apache-2.0"
] | 4 | 2018-06-01T03:56:15.000Z | 2019-01-01T00:42:39.000Z | run.py | timkpaine/live-pnl-test | 04898a9552c3858d1960d602722f5e23c75037a7 | [
"Apache-2.0"
] | 1 | 2019-11-27T17:59:09.000Z | 2019-11-27T17:59:09.000Z | import os
from app import app, socketio
if __name__ == "__main__":
    socketio.run(app, host='0.0.0.0', debug=True, port=int(os.environ.get('PORT', 8080)))
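The `port=int(os.environ.get('PORT', 8080))` idiom falls back to port 8080 when the `PORT` environment variable is unset. Factored into a helper (hypothetical, not part of this repo) the parsing becomes easy to test in isolation:

```python
import os


def get_port(default=8080):
    """Listening port from the PORT environment variable, else `default`."""
    return int(os.environ.get("PORT", default))
```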
# --- ci/test_kernelsMatricesStore.py (joacorapela/svGPFA) ---
import sys
import os
import pdb
import math
from scipy.io import loadmat
import numpy as np
import torch
sys.path.append("../src")
from stats.kernels import PeriodicKernel, ExponentialQuadraticKernel
from stats.svGPFA.kernelsMatricesStore import IndPointsLocsKMS, IndPointsLocsAndAllTimesKMS, IndPointsLocsAndAssocTimesKMS
def test_eval_IndPointsLocsKMS():
    tol = 1e-5
    tolKzzi = 6e-2
    dataFilename = os.path.join(os.path.dirname(__file__), "data/BuildKernelMatrices.mat")
    mat = loadmat(dataFilename)
    nLatents = mat['Z'].shape[0]
    nTrials = mat['Z'][0, 0].shape[2]
    Z0 = [torch.from_numpy(mat['Z'][(i, 0)]).type(torch.DoubleTensor).permute(2, 0, 1) for i in range(nLatents)]
    leasKzz = [torch.from_numpy(mat['Kzz'][(i, 0)]).type(torch.DoubleTensor).permute(2, 0, 1) for i in range(nLatents)]
    leasKzzi = [torch.from_numpy(mat['Kzzi'][(i, 0)]).type(torch.DoubleTensor).permute(2, 0, 1) for i in range(nLatents)]
    leasKtz = [torch.from_numpy(mat['Ktz'][(i, 0)]).type(torch.DoubleTensor).permute(2, 0, 1) for i in range(nLatents)]
    leasKttDiag = torch.from_numpy(mat['Ktt']).type(torch.DoubleTensor).permute(2, 0, 1)
    kernelNames = mat["kernelNames"]
    hprs = mat["hprs"]
    kernels = [[None] for k in range(nLatents)]
    kernelsParams0 = [[None] for k in range(nLatents)]
    for k in range(nLatents):
        if np.char.equal(kernelNames[0, k][0], "PeriodicKernel"):
            kernels[k] = PeriodicKernel(scale=1.0)
            kernelsParams0[k] = torch.tensor([float(hprs[k, 0][0]),
                                              float(hprs[k, 0][1])],
                                             dtype=torch.double)
        elif np.char.equal(kernelNames[0, k][0], "rbfKernel"):
            kernels[k] = ExponentialQuadraticKernel(scale=1.0)
            kernelsParams0[k] = torch.tensor([float(hprs[k, 0][0])],
                                             dtype=torch.double)
        else:
            raise ValueError("Invalid kernel name: %s" % (kernelNames[0, k][0]))
    dataFilename = os.path.join(os.path.dirname(__file__), "data/BuildKernelMatrices_fromSpikes.mat")
    mat = loadmat(dataFilename)
    Y = [torch.from_numpy(mat['Y'][tr, 0]).type(torch.DoubleTensor) for tr in range(nTrials)]
    leasKtz_spikes = [[torch.from_numpy(mat['Ktz'][i, j]).type(torch.DoubleTensor) for j in range(nTrials)] for i in range(nLatents)]
    leasKttDiag_spikes = [[torch.from_numpy(mat['Ktt'][i, j]).type(torch.DoubleTensor) for j in range(nTrials)] for i in range(nLatents)]
    kmsParams0 = {"kernelsParams0": kernelsParams0,
                  "inducingPointsLocs0": Z0}
    indPointsLocsKMS = IndPointsLocsKMS()
    indPointsLocsKMS.setKernels(kernels=kernels)
    indPointsLocsKMS.setInitialParams(initialParams=kmsParams0)
    indPointsLocsKMS.setEpsilon(epsilon=1e-5)  # Fix: need to read indPointsLocsKMSEpsilon from Matlab's CI test data
    indPointsLocsKMS.buildKernelsMatrices()

    Kzz = indPointsLocsKMS.getKzz()
    for k in range(len(Kzz)):
        error = math.sqrt(((Kzz[k] - leasKzz[k]) ** 2).flatten().mean())
        assert error < tol
    '''
    Kzzi = indPointsLocsKMS.getKzzi()
    for k in range(len(Kzzi)):
        error = math.sqrt(((Kzzi[k] - leasKzzi[k]) ** 2).flatten().mean())
        assert error < tolKzzi
    '''
def test_eval_IndPointsLocsAndAllTimesKMS():
    tol = 1e-5
    tolKzzi = 6e-2
    dataFilename = os.path.join(os.path.dirname(__file__), "data/BuildKernelMatrices.mat")
    mat = loadmat(dataFilename)
    nLatents = mat['Z'].shape[0]
    nTrials = mat['Z'][0, 0].shape[2]
    t = torch.from_numpy(mat['tt']).type(torch.DoubleTensor).permute(2, 0, 1)
    Z0 = [torch.from_numpy(mat['Z'][(i, 0)]).type(torch.DoubleTensor).permute(2, 0, 1) for i in range(nLatents)]
    leasKzz = [torch.from_numpy(mat['Kzz'][(i, 0)]).type(torch.DoubleTensor).permute(2, 0, 1) for i in range(nLatents)]
    leasKzzi = [torch.from_numpy(mat['Kzzi'][(i, 0)]).type(torch.DoubleTensor).permute(2, 0, 1) for i in range(nLatents)]
    leasKtz = [torch.from_numpy(mat['Ktz'][(i, 0)]).type(torch.DoubleTensor).permute(2, 0, 1) for i in range(nLatents)]
    leasKttDiag = torch.from_numpy(mat['Ktt']).type(torch.DoubleTensor).permute(2, 0, 1)
    kernelNames = mat["kernelNames"]
    hprs = mat["hprs"]
    kernels = [[None] for k in range(nLatents)]
    kernelsParams0 = [[None] for k in range(nLatents)]
    for k in range(nLatents):
        if np.char.equal(kernelNames[0, k][0], "PeriodicKernel"):
            kernels[k] = PeriodicKernel(scale=1.0)
            kernelsParams0[k] = torch.tensor([float(hprs[k, 0][0]),
                                              float(hprs[k, 0][1])],
                                             dtype=torch.double)
        elif np.char.equal(kernelNames[0, k][0], "rbfKernel"):
            kernels[k] = ExponentialQuadraticKernel(scale=1.0)
            kernelsParams0[k] = torch.tensor([float(hprs[k, 0][0])],
                                             dtype=torch.double)
        else:
            raise ValueError("Invalid kernel name: %s" % (kernelNames[0, k][0]))
    dataFilename = os.path.join(os.path.dirname(__file__), "data/BuildKernelMatrices_fromSpikes.mat")
    mat = loadmat(dataFilename)
    Y = [torch.from_numpy(mat['Y'][tr, 0]).type(torch.DoubleTensor) for tr in range(nTrials)]
    leasKtz_spikes = [[torch.from_numpy(mat['Ktz'][i, j]).type(torch.DoubleTensor) for j in range(nTrials)] for i in range(nLatents)]
    leasKttDiag_spikes = [[torch.from_numpy(mat['Ktt'][i, j]).type(torch.DoubleTensor) for j in range(nTrials)] for i in range(nLatents)]
    kmsParams0 = {"kernelsParams0": kernelsParams0,
                  "inducingPointsLocs0": Z0}
    indPointsLocsAndAllTimesKMS = IndPointsLocsAndAllTimesKMS()
    indPointsLocsAndAllTimesKMS.setKernels(kernels=kernels)
    indPointsLocsAndAllTimesKMS.setTimes(times=t)
    indPointsLocsAndAllTimesKMS.setInitialParams(initialParams=kmsParams0)
    indPointsLocsAndAllTimesKMS.buildKernelsMatrices()

    Ktz_allTimes = indPointsLocsAndAllTimesKMS.getKtz()
    for k in range(len(Ktz_allTimes)):
        error = math.sqrt(((Ktz_allTimes[k] - leasKtz[k]) ** 2).flatten().mean())
        assert error < tol
    KttDiag_allTimes = indPointsLocsAndAllTimesKMS.getKttDiag()
    error = math.sqrt(((KttDiag_allTimes - leasKttDiag) ** 2).flatten().mean())
    assert error < tol
def test_eval_IndPointsLocsAndAssocTimesKMS():
    tol = 1e-5
    tolKzzi = 6e-2
    dataFilename = os.path.join(os.path.dirname(__file__), "data/BuildKernelMatrices.mat")
    mat = loadmat(dataFilename)
    nLatents = mat['Z'].shape[0]
    nTrials = mat['Z'][0, 0].shape[2]
    t = torch.from_numpy(mat['tt']).type(torch.DoubleTensor).permute(2, 0, 1)
    Z0 = [torch.from_numpy(mat['Z'][(i, 0)]).type(torch.DoubleTensor).permute(2, 0, 1) for i in range(nLatents)]
    leasKzz = [torch.from_numpy(mat['Kzz'][(i, 0)]).type(torch.DoubleTensor).permute(2, 0, 1) for i in range(nLatents)]
    leasKzzi = [torch.from_numpy(mat['Kzzi'][(i, 0)]).type(torch.DoubleTensor).permute(2, 0, 1) for i in range(nLatents)]
    leasKtz = [torch.from_numpy(mat['Ktz'][(i, 0)]).type(torch.DoubleTensor).permute(2, 0, 1) for i in range(nLatents)]
    leasKttDiag = torch.from_numpy(mat['Ktt']).type(torch.DoubleTensor).permute(2, 0, 1)
    kernelNames = mat["kernelNames"]
    hprs = mat["hprs"]
    kernels = [[None] for k in range(nLatents)]
    kernelsParams0 = [[None] for k in range(nLatents)]
    for k in range(nLatents):
        if np.char.equal(kernelNames[0, k][0], "PeriodicKernel"):
            kernels[k] = PeriodicKernel(scale=1.0)
            kernelsParams0[k] = torch.tensor([float(hprs[k, 0][0]),
                                              float(hprs[k, 0][1])],
                                             dtype=torch.double)
        elif np.char.equal(kernelNames[0, k][0], "rbfKernel"):
            kernels[k] = ExponentialQuadraticKernel(scale=1.0)
            kernelsParams0[k] = torch.tensor([float(hprs[k, 0][0])],
                                             dtype=torch.double)
        else:
            raise ValueError("Invalid kernel name: %s" % (kernelNames[0, k][0]))
    dataFilename = os.path.join(os.path.dirname(__file__), "data/BuildKernelMatrices_fromSpikes.mat")
    mat = loadmat(dataFilename)
    Y = [torch.from_numpy(mat['Y'][tr, 0]).type(torch.DoubleTensor) for tr in range(nTrials)]
    leasKtz_spikes = [[torch.from_numpy(mat['Ktz'][i, j]).type(torch.DoubleTensor) for j in range(nTrials)] for i in range(nLatents)]
    leasKttDiag_spikes = [[torch.from_numpy(mat['Ktt'][i, j]).type(torch.DoubleTensor) for j in range(nTrials)] for i in range(nLatents)]
    kmsParams0 = {"kernelsParams0": kernelsParams0,
                  "inducingPointsLocs0": Z0}
    indPointsLocsAndAssocTimesKMS = IndPointsLocsAndAssocTimesKMS()
    indPointsLocsAndAssocTimesKMS.setKernels(kernels=kernels)
    indPointsLocsAndAssocTimesKMS.setTimes(times=Y)
    indPointsLocsAndAssocTimesKMS.setInitialParams(initialParams=kmsParams0)
    indPointsLocsAndAssocTimesKMS.buildKernelsMatrices()

    Ktz_associatedTimes = indPointsLocsAndAssocTimesKMS.getKtz()
    for k in range(nLatents):
        for tr in range(nTrials):
            error = math.sqrt(((Ktz_associatedTimes[k][tr] - leasKtz_spikes[k][tr]) ** 2).flatten().mean())
            assert error < tol
    KttDiag_associatedTimes = indPointsLocsAndAssocTimesKMS.getKttDiag()
    for k in range(nLatents):
        for tr in range(nTrials):
            error = math.sqrt(((KttDiag_associatedTimes[k][tr] - leasKttDiag_spikes[k][tr]) ** 2).flatten().mean())
            assert error < tol


if __name__ == '__main__':
    test_eval_IndPointsLocsKMS()
    test_eval_IndPointsLocsAndAllTimesKMS()
    test_eval_IndPointsLocsAndAssocTimesKMS()
# --- inbm-lib/tests/unit/inbm_vision_lib/test_path_prefixes.py (ahameedx/intel-inb-manageability) ---
from unittest import TestCase
from mock import patch
class TestPathPrefixes(TestCase):
    @patch('platform.system', return_value='Windows')
    def test_client_creation(self, platform):
        import inbm_vision_lib.path_prefixes
# --- core/api/request/eval_request.py (rits-dajare/DaaS) ---
from pydantic import BaseModel
class EvalRequest(BaseModel):
    dajare: str
# --- djangovalidators/apps.py (jpic/django-validators) ---
from django.apps import AppConfig
class DurationwidgetConfig(AppConfig):
    name = 'django_validators'
# --- servidor/machine_learning/static_beacons.py (FelipeLimaM/ItsMyLife-Framework) ---
'''
This class returns the hardcoded static beacons (for now)
'''


class StaticBeacons():
    def build(self, df):
        static = ["0C:F3:EE:09:3A:01", "0C:F3:EE:09:38:3C", "0C:F3:EE:09:39:55",
                  "0C:F3:EE:09:36:47", "0C:F3:EE:09:39:A4", "0C:F3:EE:09:38:EF",
                  "0C:F3:EE:09:35:E0", "20:91:48:DF:2D:09", "20:91:48:DF:06:B1",
                  "20:91:48:DE:EA:4B", "FD:3C:04:B6:A8:18", "DA:29:71:A1:D4:97",
                  "20:91:48:DE:E5:6E", "2C:31:21:5C:3F:78", "0C:F3:EE:09:3A:01",
                  "0C:F3:EE:09:3A:11", "0C:F3:EE:09:39:A4", "0C:F3:EE:09:36:47",
                  "20:91:48:DF:02:5F"]
        return [s for s in static if s in df]
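# The filter in build() only needs a container supporting `in`; note that with
# a pandas DataFrame, `s in df` tests column-name membership. A hedged sketch
# of the same pattern (the `filter_present` name and sample IDs are illustrative):

```python
def filter_present(candidates, container):
    # keep only the candidate beacon IDs that appear in the container
    return [s for s in candidates if s in container]


present = filter_present(["0C:F3:EE:09:3A:01", "AA:BB:CC:00:11:22"],
                         ["0C:F3:EE:09:3A:01", "20:91:48:DF:02:5F"])
```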
# --- mezzanine_fluent_pages/mezzanine_page/apps.py (sjdines/mezzanine-fluent-pages) ---
from django.apps import AppConfig
class FluentMezzaninePageConfig(AppConfig):
    """
    App configuration for the `mezzanine_page` app.
    """
    label = 'fluent_mezzanine_page'
    name = 'mezzanine_fluent_pages.mezzanine_page'
# --- django_orghierarchy/apps.py (City-of-Turku/django-orghierarchy) ---
from django.apps import AppConfig
class DjangoOrghierarchyConfig(AppConfig):
    name = 'django_orghierarchy'
# --- xchem_db/models.py (TJGorrie/pipeline) ---
# This is an auto-generated Django model module.
# You'll have to do the following manually to clean this up:
# * Rearrange models' order
# * Make sure each model has one field with primary_key=True
# * Make sure each ForeignKey has `on_delete` set to the desired behavior.
# * Remove `managed = False` lines if you wish to allow Django to create, modify, and delete the table
# Feel free to rename the models, but don't rename db_table values or field names.
from __future__ import unicode_literals
from django.db import models
from django.contrib.postgres.fields import ArrayField
import os
class Tasks(models.Model):
task_name = models.CharField(max_length=255, blank=False, null=False, unique=False, db_index=True)
uuid = models.CharField(max_length=37, blank=False, null=False, unique=True, db_index=True)
class Meta:
app_label = 'xchem_db'
db_table = 'tasks'
unique_together = ('task_name', 'uuid')
class Target(models.Model):
target_name = models.CharField(max_length=255, blank=False, null=False, unique=True, db_index=True)
class Meta:
app_label = 'xchem_db'
db_table = 'target'
class Compounds(models.Model):
smiles = models.CharField(max_length=255, blank=True, null=True, db_index=True, unique=True)
class Meta:
if os.getcwd() != '/dls/science/groups/i04-1/software/luigi_pipeline/pipelineDEV':
app_label = 'xchem_db'
db_table = 'compounds'
class Reference(models.Model):
reference_pdb = models.CharField(max_length=255, null=True, default='not_assigned', unique=True)
class Meta:
if os.getcwd() != '/dls/science/groups/i04-1/software/luigi_pipeline/pipelineDEV':
app_label = 'xchem_db'
db_table = 'reference'
class Proposals(models.Model):
# TODO - can we refactor this for title
proposal = models.CharField(max_length=255, blank=False, null=False, unique=True, db_index=True)
title = models.CharField(max_length=10, blank=True, null=True)
fedids = models.TextField(blank=True, null=True)
class Meta:
if os.getcwd() != '/dls/science/groups/i04-1/software/luigi_pipeline/pipelineDEV':
app_label = 'xchem_db'
db_table = 'proposals'
class SoakdbFiles(models.Model):
filename = models.CharField(max_length=255, blank=False, null=False, unique=True)
modification_date = models.BigIntegerField(blank=False, null=False)
proposal = models.ForeignKey(Proposals, on_delete=models.CASCADE, unique=False)
visit = models.TextField(blank=False, null=False)
status = models.IntegerField(blank=True, null=True)
class Meta:
if os.getcwd() != '/dls/science/groups/i04-1/software/luigi_pipeline/pipelineDEV':
app_label = 'xchem_db'
db_table = 'soakdb_files'
class Crystal(models.Model):
crystal_name = models.CharField(max_length=255, blank=False, null=False, db_index=True)
target = models.ForeignKey(Target, on_delete=models.CASCADE)
compound = models.ForeignKey(Compounds, on_delete=models.CASCADE, null=True, blank=True)
visit = models.ForeignKey(SoakdbFiles, on_delete=models.CASCADE)
product = models.CharField(max_length=255, blank=True, null=True)
# model types
PREPROCESSING = 'PP'
PANDDA = 'PD'
PROASIS = 'PR'
REFINEMENT = 'RE'
COMPCHEM = 'CC'
DEPOSITION = 'DP'
CHOICES = (
(PREPROCESSING, 'preprocessing'),
(PANDDA, 'pandda'),
(REFINEMENT, 'refinement'),
(COMPCHEM, 'comp_chem'),
(DEPOSITION, 'deposition')
)
status = models.CharField(choices=CHOICES, max_length=2, default=PREPROCESSING)
class Meta:
if os.getcwd() != '/dls/science/groups/i04-1/software/luigi_pipeline/pipelineDEV':
app_label = 'xchem_db'
db_table = 'crystal'
unique_together = ('crystal_name', 'visit', 'compound', 'product')
class DataProcessing(models.Model):
auto_assigned = models.TextField(blank=True, null=True)
cchalf_high = models.FloatField(blank=True, null=True)
cchalf_low = models.FloatField(blank=True, null=True)
cchalf_overall = models.FloatField(blank=True, null=True)
completeness_high = models.FloatField(blank=True, null=True)
completeness_low = models.FloatField(blank=True, null=True)
completeness_overall = models.FloatField(blank=True, null=True)
crystal_name = models.ForeignKey(Crystal, on_delete=models.CASCADE, unique=True) # changed to foreign key
dimple_mtz_path = models.TextField(blank=True, null=True)
dimple_pdb_path = models.TextField(blank=True, null=True)
dimple_status = models.TextField(blank=True, null=True)
image_path = models.TextField(blank=True, null=True)
isig_high = models.FloatField(blank=True, null=True)
isig_low = models.FloatField(blank=True, null=True)
isig_overall = models.FloatField(blank=True, null=True)
lattice = models.TextField(blank=True, null=True)
log_name = models.TextField(blank=True, null=True)
logfile_path = models.TextField(blank=True, null=True)
mtz_name = models.TextField(blank=True, null=True)
mtz_path = models.TextField(blank=True, null=True)
multiplicity_high = models.FloatField(blank=True, null=True)
multiplicity_low = models.FloatField(blank=True, null=True)
multiplicity_overall = models.FloatField(blank=True, null=True)
original_directory = models.TextField(blank=True, null=True)
point_group = models.TextField(blank=True, null=True)
program = models.TextField(blank=True, null=True)
r_cryst = models.FloatField(blank=True, null=True)
r_free = models.FloatField(blank=True, null=True)
r_merge_high = models.FloatField(blank=True, null=True)
r_merge_low = models.FloatField(blank=True, null=True)
r_merge_overall = models.FloatField(blank=True, null=True)
res_high = models.FloatField(blank=True, null=True)
res_high_15_sigma = models.FloatField(blank=True, null=True)
res_high_outer_shell = models.FloatField(blank=True, null=True)
res_low = models.FloatField(blank=True, null=True)
res_low_inner_shell = models.FloatField(blank=True, null=True)
res_overall = models.TextField(blank=True, null=True) # range
score = models.FloatField(blank=True, null=True)
spacegroup = models.TextField(blank=True, null=True)
status = models.TextField(blank=True, null=True)
unique_ref_overall = models.IntegerField(blank=True, null=True)
unit_cell = models.TextField(blank=True, null=True)
unit_cell_vol = models.FloatField(blank=True, null=True)
class Meta:
if os.getcwd() != '/dls/science/groups/i04-1/software/luigi_pipeline/pipelineDEV':
app_label = 'xchem_db'
db_table = 'data_processing'
class Dimple(models.Model):
crystal_name = models.ForeignKey(Crystal, on_delete=models.CASCADE, unique=True) # changed to foreign key
mtz_path = models.CharField(max_length=255, blank=True, null=True)
pdb_path = models.CharField(max_length=255, blank=True, null=True)
r_free = models.FloatField(blank=True, null=True)
res_high = models.FloatField(blank=True, null=True)
status = models.TextField(blank=True, null=True)
reference = models.ForeignKey(Reference, blank=True, null=True, on_delete=models.CASCADE)
class Meta:
if os.getcwd() != '/dls/science/groups/i04-1/software/luigi_pipeline/pipelineDEV':
app_label = 'xchem_db'
db_table = 'dimple'
unique_together = ('pdb_path', 'mtz_path')
class Lab(models.Model):
cryo_frac = models.FloatField(blank=True, null=True)
cryo_status = models.TextField(blank=True, null=True)
cryo_stock_frac = models.FloatField(blank=True, null=True)
cryo_transfer_vol = models.FloatField(blank=True, null=True)
crystal_name = models.ForeignKey(Crystal, on_delete=models.CASCADE, unique=True) # changed to foreign key
data_collection_visit = models.TextField(blank=True, null=True)
expr_conc = models.FloatField(blank=True, null=True)
harvest_status = models.TextField(blank=True, null=True)
library_name = models.TextField(blank=True, null=True)
library_plate = models.TextField(blank=True, null=True)
mounting_result = models.TextField(blank=True, null=True)
mounting_time = models.TextField(blank=True, null=True)
soak_status = models.TextField(blank=True, null=True)
soak_time = models.TextField(blank=True, null=True)
soak_vol = models.FloatField(blank=True, null=True)
solv_frac = models.FloatField(blank=True, null=True)
stock_conc = models.FloatField(blank=True, null=True)
visit = models.TextField(blank=True, null=True)
class Meta:
if os.getcwd() != '/dls/science/groups/i04-1/software/luigi_pipeline/pipelineDEV':
app_label = 'xchem_db'
db_table = 'lab'
class Refinement(models.Model):
bound_conf = models.CharField(max_length=255, blank=True, null=True, unique=True)
cif = models.TextField(blank=True, null=True)
cif_prog = models.TextField(blank=True, null=True)
cif_status = models.TextField(blank=True, null=True)
crystal_name = models.ForeignKey(Crystal, on_delete=models.CASCADE, unique=True) # changed to foreign key
lig_bound_conf = models.TextField(blank=True, null=True)
lig_cc = models.TextField(blank=True, null=True)
lig_confidence = models.TextField(blank=True, null=True)
lig_confidence_int = models.IntegerField(blank=True, null=True)
lig_confidence_string = models.TextField(blank=True, null=True)
matrix_weight = models.TextField(blank=True, null=True)
molprobity_score = models.FloatField(blank=True, null=True)
mtz_free = models.TextField(blank=True, null=True)
mtz_latest = models.TextField(blank=True, null=True)
outcome = models.IntegerField(blank=True, null=True)
pdb_latest = models.TextField(blank=True, null=True)
r_free = models.FloatField(blank=True, null=True)
ramachandran_favoured = models.TextField(blank=True, null=True)
ramachandran_outliers = models.TextField(blank=True, null=True)
rcryst = models.FloatField(blank=True, null=True)
refinement_path = models.TextField(blank=True, null=True)
res = models.FloatField(blank=True, null=True)
rmsd_angles = models.TextField(blank=True, null=True)
rmsd_bonds = models.TextField(blank=True, null=True)
spacegroup = models.TextField(blank=True, null=True)
status = models.TextField(blank=True, null=True)
class Meta:
if os.getcwd() != '/dls/science/groups/i04-1/software/luigi_pipeline/pipelineDEV':
app_label = 'xchem_db'
db_table = 'refinement'
class ProasisHits(models.Model):
refinement = models.ForeignKey(Refinement, on_delete=models.CASCADE)
pdb_file = models.TextField(blank=False, null=False)
crystal_name = models.ForeignKey(Crystal, on_delete=models.CASCADE) # changed to foreign key
modification_date = models.TextField(blank=True, null=True)
strucid = models.TextField(blank=True, null=True)
ligand_list = models.TextField(blank=True, null=True)
mtz = models.TextField(blank=False, null=False)
two_fofc = models.TextField(blank=False, null=False)
fofc = models.TextField(blank=False, null=False)
sdf = models.TextField(blank=True, null=True)
altconf = models.CharField(max_length=255, blank=True, null=True)
added = models.DateTimeField(auto_now_add=True)
class Meta:
if os.getcwd() != '/dls/science/groups/i04-1/software/luigi_pipeline/pipelineDEV':
app_label = 'xchem_db'
db_table = 'proasis_hits'
unique_together = ('refinement', 'crystal_name', 'altconf')
class LigandEdstats(models.Model):
baa = models.FloatField(blank=True, null=True) # Field name made lowercase.
ccpa = models.FloatField(blank=True, null=True) # Field name made lowercase.
ccsa = models.FloatField(blank=True, null=True) # Field name made lowercase.
npa = models.FloatField(blank=True, null=True) # Field name made lowercase.
rga = models.FloatField(blank=True, null=True) # Field name made lowercase.
ra = models.FloatField(blank=True, null=True) # Field name made lowercase.
srga = models.FloatField(blank=True, null=True) # Field name made lowercase.
zccpa = models.FloatField(blank=True, null=True) # Field name made lowercase.
zd_a = models.FloatField(blank=True, null=True) # Field name made lowercase.
zd_a_0 = models.FloatField(blank=True, null=True) # Field name made lowercase.
zda = models.FloatField(blank=True, null=True) # Field name made lowercase.
zoa = models.FloatField(blank=True, null=True) # Field name made lowercase.
crystal_name = models.ForeignKey(Crystal, on_delete=models.CASCADE) # changed to foreign key # changed from crystal
ligand = models.CharField(max_length=255, blank=True, null=True)
strucid = models.ForeignKey(ProasisHits, on_delete=models.CASCADE)
class Meta:
if os.getcwd() != '/dls/science/groups/i04-1/software/luigi_pipeline/pipelineDEV':
app_label = 'xchem_db'
db_table = 'ligand_edstats'
unique_together = ('crystal_name', 'ligand', 'strucid')
class ProasisLeads(models.Model):
reference_pdb = models.ForeignKey(Reference, to_field='reference_pdb', on_delete=models.CASCADE, unique=True)
strucid = models.CharField(max_length=255, blank=True, null=True, unique=True)
class Meta:
if os.getcwd() != '/dls/science/groups/i04-1/software/luigi_pipeline/pipelineDEV':
app_label = 'xchem_db'
db_table = 'proasis_leads'
class PanddaAnalysis(models.Model):
pandda_dir = models.CharField(max_length=255, unique=True)
class Meta:
if os.getcwd() != '/dls/science/groups/i04-1/software/luigi_pipeline/pipelineDEV':
app_label = 'xchem_db'
db_table = 'pandda_analysis'
class PanddaRun(models.Model):
input_dir = models.TextField(blank=True, null=True)
pandda_analysis = models.ForeignKey(PanddaAnalysis, on_delete=models.CASCADE)
pandda_log = models.CharField(max_length=255, unique=True)
pandda_version = models.TextField(blank=True, null=True)
sites_file = models.TextField(blank=True, null=True)
events_file = models.TextField(blank=True, null=True)
class Meta:
if os.getcwd() != '/dls/science/groups/i04-1/software/luigi_pipeline/pipelineDEV':
app_label = 'xchem_db'
db_table = 'pandda_run'
class PanddaStatisticalMap(models.Model):
resolution_from = models.FloatField(blank=True, null=True)
resolution_to = models.FloatField(blank=True, null=True)
dataset_list = models.TextField()
pandda_run = models.ForeignKey(PanddaRun, on_delete=models.CASCADE)
class Meta:
if os.getcwd() != '/dls/science/groups/i04-1/software/luigi_pipeline/pipelineDEV':
app_label = 'xchem_db'
db_table = 'pandda_statistical_map'
unique_together = ('resolution_from', 'resolution_to', 'pandda_run')
class PanddaSite(models.Model):
pandda_run = models.ForeignKey(PanddaRun, on_delete=models.CASCADE)
site = models.IntegerField(blank=True, null=True, db_index=True)
site_aligned_centroid_x = models.FloatField(blank=True, null=True)
site_aligned_centroid_y = models.FloatField(blank=True, null=True)
site_aligned_centroid_z = models.FloatField(blank=True, null=True)
site_native_centroid_x = models.FloatField(blank=True, null=True)
site_native_centroid_y = models.FloatField(blank=True, null=True)
site_native_centroid_z = models.FloatField(blank=True, null=True)
class Meta:
if os.getcwd() != '/dls/science/groups/i04-1/software/luigi_pipeline/pipelineDEV':
app_label = 'xchem_db'
db_table = 'pandda_site'
unique_together = ('pandda_run', 'site')

class PanddaEvent(models.Model):
    crystal = models.ForeignKey(Crystal, on_delete=models.CASCADE)
    site = models.ForeignKey(PanddaSite, on_delete=models.CASCADE)
    refinement = models.ForeignKey(Refinement, on_delete=models.CASCADE)
    data_proc = models.ForeignKey(DataProcessing, on_delete=models.CASCADE)
    pandda_run = models.ForeignKey(PanddaRun, on_delete=models.CASCADE)
    event = models.IntegerField(blank=True, null=True, db_index=True)
    event_centroid_x = models.FloatField(blank=True, null=True)
    event_centroid_y = models.FloatField(blank=True, null=True)
    event_centroid_z = models.FloatField(blank=True, null=True)
    event_dist_from_site_centroid = models.TextField(blank=True, null=True)
    lig_centroid_x = models.FloatField(blank=True, null=True)
    lig_centroid_y = models.FloatField(blank=True, null=True)
    lig_centroid_z = models.FloatField(blank=True, null=True)
    lig_dist_event = models.FloatField(blank=True, null=True)
    lig_id = models.TextField(blank=True, null=True)
    pandda_event_map_native = models.TextField(blank=True, null=True)
    pandda_event_map_cut = models.TextField(blank=True, null=True)
    pandda_model_pdb = models.TextField(blank=True, null=True)
    pandda_input_mtz = models.TextField(blank=True, null=True)
    pandda_input_pdb = models.TextField(blank=True, null=True)
    ligand_confidence_inspect = models.TextField(blank=True, null=True)
    ligand_confidence = models.TextField(blank=True, null=True)
    comment = models.TextField(blank=True, null=True)
    interesting = models.BooleanField()
    event_status = models.TextField(blank=True, null=True)
    created_date = models.DateTimeField(auto_now_add=True, null=True)
    modified_date = models.DateTimeField(auto_now=True, null=True)

    # model types
    NONE = 'NA'
    SOAKDB = 'SD'
    FRAGSPECT = 'FS'
    CHOICES = (
        (NONE, 'none'),
        (SOAKDB, 'soak_db'),
        (FRAGSPECT, 'fragspect')
    )
    ligand_confidence_source = models.CharField(choices=CHOICES, max_length=2, default=NONE)

    class Meta:
        if os.getcwd() != '/dls/science/groups/i04-1/software/luigi_pipeline/pipelineDEV':
            app_label = 'xchem_db'
        db_table = 'pandda_event'
        unique_together = ('site', 'event', 'crystal', 'pandda_run')
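The `ligand_confidence_source` field stores a two-letter code drawn from `CHOICES`; Django exposes the human-readable label through a generated `get_ligand_confidence_source_display()` accessor, which is essentially a dict lookup over the choices tuple. A minimal framework-free sketch of that mapping (the `display_label` helper name is mine, not part of the model):

```python
# Mirror of the PanddaEvent choices tuple: stored code -> human-readable label.
CHOICES = (
    ('NA', 'none'),
    ('SD', 'soak_db'),
    ('FS', 'fragspect'),
)

def display_label(code, choices=CHOICES):
    """Return the label for a stored choice code (hypothetical helper)."""
    return dict(choices).get(code, code)
```

For example, `display_label('SD')` yields `'soak_db'`; an unknown code falls back to the code itself.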

class PanddaEventStats(models.Model):
    event = models.ForeignKey(PanddaEvent, on_delete=models.CASCADE)
    one_minus_bdc = models.FloatField(blank=True, null=True)
    cluster_size = models.IntegerField(blank=True, null=True)
    glob_corr_av_map = models.FloatField(blank=True, null=True)
    glob_corr_mean_map = models.FloatField(blank=True, null=True)
    loc_corr_av_map = models.FloatField(blank=True, null=True)
    loc_corr_mean_map = models.FloatField(blank=True, null=True)
    z_mean = models.FloatField(blank=True, null=True)
    z_peak = models.FloatField(blank=True, null=True)
    b_factor_scaled = models.FloatField(blank=True, null=True)
    high_res = models.FloatField(blank=True, null=True)
    low_res = models.FloatField(blank=True, null=True)
    r_free = models.FloatField(blank=True, null=True)
    r_work = models.FloatField(blank=True, null=True)
    ref_rmsd = models.FloatField(blank=True, null=True)
    wilson_scaled_b = models.FloatField(blank=True, null=True)
    wilson_scaled_ln_dev = models.FloatField(blank=True, null=True)
    wilson_scaled_ln_dev_z = models.FloatField(blank=True, null=True)
    wilson_scaled_ln_rmsd = models.FloatField(blank=True, null=True)
    wilson_scaled_ln_rmsd_z = models.FloatField(blank=True, null=True)
    wilson_scaled_below_four_rmsd = models.FloatField(blank=True, null=True)
    wilson_scaled_below_four_rmsd_z = models.FloatField(blank=True, null=True)
    wilson_scaled_above_four_rmsd = models.FloatField(blank=True, null=True)
    wilson_scaled_above_four_rmsd_z = models.FloatField(blank=True, null=True)
    wilson_scaled_rmsd_all = models.FloatField(blank=True, null=True)
    wilson_scaled_rmsd_all_z = models.FloatField(blank=True, null=True)
    wilson_unscaled = models.FloatField(blank=True, null=True)
    wilson_unscaled_ln_dev = models.FloatField(blank=True, null=True)
    wilson_unscaled_ln_dev_z = models.FloatField(blank=True, null=True)
    wilson_unscaled_ln_rmsd = models.FloatField(blank=True, null=True)
    wilson_unscaled_ln_rmsd_z = models.FloatField(blank=True, null=True)
    wilson_unscaled_below_four_rmsd = models.FloatField(blank=True, null=True)
    wilson_unscaled_below_four_rmsd_z = models.FloatField(blank=True, null=True)
    wilson_unscaled_above_four_rmsd = models.FloatField(blank=True, null=True)
    wilson_unscaled_above_four_rmsd_z = models.FloatField(blank=True, null=True)
    wilson_unscaled_rmsd_all = models.FloatField(blank=True, null=True)
    wilson_unscaled_rmsd_all_z = models.FloatField(blank=True, null=True)
    resolution = models.FloatField(blank=True, null=True)
    map_uncertainty = models.FloatField(blank=True, null=True)
    obs_map_mean = models.FloatField(blank=True, null=True)
    obs_map_rms = models.FloatField(blank=True, null=True)
    z_map_kurt = models.FloatField(blank=True, null=True)
    z_map_mean = models.FloatField(blank=True, null=True)
    z_map_skew = models.FloatField(blank=True, null=True)
    z_map_std = models.FloatField(blank=True, null=True)
    scl_map_mean = models.FloatField(blank=True, null=True)
    scl_map_rms = models.FloatField(blank=True, null=True)

    class Meta:
        if os.getcwd() != '/dls/science/groups/i04-1/software/luigi_pipeline/pipelineDEV':
            app_label = 'xchem_db'
        db_table = 'pandda_event_stats'

class ProasisPandda(models.Model):
    crystal = models.ForeignKey(Crystal, on_delete=models.CASCADE)
    hit = models.ForeignKey(ProasisHits, on_delete=models.CASCADE)
    event = models.ForeignKey(PanddaEvent, on_delete=models.CASCADE)
    event_map_native = models.TextField(blank=False, null=False)
    model_pdb = models.TextField(blank=False, null=False)

    class Meta:
        if os.getcwd() != '/dls/science/groups/i04-1/software/luigi_pipeline/pipelineDEV':
            app_label = 'xchem_db'
        db_table = 'proasis_pandda'
        unique_together = ('crystal', 'hit', 'event')

class ProasisOut(models.Model):
    crystal = models.ForeignKey(Crystal, on_delete=models.CASCADE)  # done
    proasis = models.ForeignKey(ProasisHits, on_delete=models.CASCADE)  # done
    ligand = models.CharField(max_length=255, blank=False, null=False)  # done
    ligid = models.IntegerField(blank=True, null=True)
    root = models.TextField(blank=True, null=True)  # root directory for all crystals in this project (target)
    start = models.TextField(blank=True, null=True)  # directory name for this crystal within root
    curated = models.TextField(blank=True, null=True)  # done
    sdf = models.TextField(blank=True, null=True)  # done
    apo = models.TextField(blank=True, null=True)  # done
    mol = models.TextField(blank=True, null=True)  # done
    mol2 = models.TextField(blank=True, null=True)  # done
    h_mol = models.TextField(blank=True, null=True)  # done
    stripped = models.TextField(blank=True, null=True)  # done
    event = models.TextField(blank=True, null=True)
    mtz = models.TextField(blank=True, null=True)
    contacts = models.TextField(blank=True, null=True)  # done
    acc = models.TextField(blank=True, null=True)
    don = models.TextField(blank=True, null=True)
    lip = models.TextField(blank=True, null=True)
    pmap = models.TextField(blank=True, null=True)
    ppdb = models.TextField(blank=True, null=True)
    pjson = models.TextField(blank=True, null=True)
    pmtz = models.TextField(blank=True, null=True)
    added = models.DateTimeField(auto_now_add=True)

    class Meta:
        if os.getcwd() != '/dls/science/groups/i04-1/software/luigi_pipeline/pipelineDEV':
            app_label = 'xchem_db'
        db_table = 'proasis_out'
        unique_together = ('crystal', 'proasis', 'ligand', 'ligid')

class Occupancy(models.Model):
    crystal = models.ForeignKey(Crystal, on_delete=models.CASCADE)
    refinement = models.ForeignKey(Refinement, on_delete=models.CASCADE)
    refine_log = models.TextField(blank=True, null=True)
    all_occupancy = ArrayField(models.FloatField())
    occupancy = models.FloatField(blank=True, null=True)
    occupancy_group = models.IntegerField(blank=True, null=True)
    complete_group = models.TextField(blank=True, null=True)
    resid = models.IntegerField(blank=True, null=True)
    alte = models.CharField(max_length=1, blank=True, null=True)
    state = models.CharField(max_length=7, blank=True, null=True)
    resname = models.CharField(max_length=3, blank=True, null=True)
    comment = models.TextField(blank=True, null=True)
    edited = models.DateTimeField(auto_now=True)
    added = models.DateTimeField(auto_now_add=True)
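`all_occupancy` is a PostgreSQL `ArrayField` of floats, presumably holding the occupancy value at each refinement cycle, while `occupancy` stores a single summary value. Under that assumption, the summary could be derived from the series like this (`final_occupancy` and `mean_occupancy` are illustrative helpers, not part of the pipeline):

```python
def final_occupancy(all_occupancy):
    """Return the last recorded occupancy, or None for an empty series."""
    return all_occupancy[-1] if all_occupancy else None

def mean_occupancy(all_occupancy):
    """Average occupancy across refinement cycles, or None for an empty series."""
    return sum(all_occupancy) / len(all_occupancy) if all_occupancy else None
```

Guarding the empty case matters here because the array column can legitimately be empty before any cycles have been parsed from the refine log.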

class ConvergenceRefinement(Refinement):
    """Refinement with more cycles to convergence

    Notes
    -----
    Inheritance:
    https://godjango.com/blog/django-abstract-base-class-multi-table-inheritance/
    """
    original_refinement = models.ForeignKey(Refinement, on_delete=models.CASCADE, related_name='+')
    success = models.NullBooleanField(null=True)
    cycles = models.IntegerField(null=True, blank=True)
    error = models.TextField(blank=True, null=True)

class ConvergenceOccupancy(Occupancy):
    convergence_refinement = models.ForeignKey(ConvergenceRefinement, on_delete=models.CASCADE, related_name='+')


class NonSuperposedRefinement(Refinement):
    original_refinement = models.ForeignKey(Refinement, on_delete=models.CASCADE, related_name='+')


class NonSuperposedOccupancy(Occupancy):
    nonsuper_refinement = models.ForeignKey(NonSuperposedRefinement, on_delete=models.CASCADE, related_name='+')

class ReviewResponses(models.Model):
    crystal = models.ForeignKey(Crystal, on_delete=models.CASCADE)  # This may not be correctly linked in psql...
    fedid = models.TextField(blank=False, null=False)
    decision_int = models.IntegerField(blank=False, null=False)
    decision_str = models.TextField(blank=False, null=False)
    reason = models.TextField(blank=False, null=False)
    time_submitted = models.IntegerField(blank=False, null=False)

    class Meta:
        if os.getcwd() != '/dls/science/groups/i04-1/software/luigi_pipeline/pipelineDEV':
            app_label = 'xchem_db'
        db_table = 'review_responses'
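`time_submitted` is stored as a plain `IntegerField` rather than a `DateTimeField`, which suggests a Unix epoch timestamp. Assuming that interpretation (the model itself does not say), converting it back to a timezone-aware datetime for display is a one-liner (`submitted_at` is my name for the helper):

```python
from datetime import datetime, timezone

def submitted_at(time_submitted):
    """Interpret the stored integer as Unix epoch seconds and return a UTC datetime."""
    return datetime.fromtimestamp(time_submitted, tz=timezone.utc)
```

Keeping the conversion explicit (and pinned to UTC) avoids the ambiguity a naive `fromtimestamp` call would introduce on servers with a non-UTC local timezone.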
| 47.086022 | 120 | 0.724671 | 3,454 | 26,274 | 5.35524 | 0.115229 | 0.098611 | 0.145321 | 0.204033 | 0.782613 | 0.761367 | 0.671028 | 0.527707 | 0.431638 | 0.369033 | 0 | 0.005741 | 0.158065 | 26,274 | 557 | 121 | 47.170557 | 0.830433 | 0.052485 | 0 | 0.242697 | 1 | 0 | 0.083451 | 0.052479 | 0 | 0 | 0 | 0.001795 | 0 | 1 | 0 | false | 0 | 0.008989 | 0 | 0.804494 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 4 |
4084d37358eca1aab0f46b1693bc1dc5e34986ec | 6,675 | py | Python | tests/components/climate/test_device_action.py | miccico/core | 14c205384171dee59c1a908f8449f9864778b2dc | [
"Apache-2.0"
] | 6 | 2017-08-02T19:26:39.000Z | 2020-03-14T22:47:41.000Z | tests/components/climate/test_device_action.py | miccico/core | 14c205384171dee59c1a908f8449f9864778b2dc | [
"Apache-2.0"
] | 58 | 2020-08-03T07:33:02.000Z | 2022-03-31T06:02:05.000Z | tests/components/climate/test_device_action.py | miccico/core | 14c205384171dee59c1a908f8449f9864778b2dc | [
"Apache-2.0"
] | 14 | 2018-08-19T16:28:26.000Z | 2021-09-02T18:26:53.000Z | """The tests for Climate device actions."""
import pytest
import voluptuous_serialize

import homeassistant.components.automation as automation
from homeassistant.components.climate import DOMAIN, const, device_action
from homeassistant.helpers import config_validation as cv, device_registry
from homeassistant.setup import async_setup_component

from tests.common import (
    MockConfigEntry,
    assert_lists_same,
    async_get_device_automations,
    async_mock_service,
    mock_device_registry,
    mock_registry,
)
from tests.components.blueprint.conftest import stub_blueprint_populate  # noqa


@pytest.fixture
def device_reg(hass):
    """Return an empty, loaded, registry."""
    return mock_device_registry(hass)


@pytest.fixture
def entity_reg(hass):
    """Return an empty, loaded, registry."""
    return mock_registry(hass)

async def test_get_actions(hass, device_reg, entity_reg):
    """Test we get the expected actions from a climate."""
    config_entry = MockConfigEntry(domain="test", data={})
    config_entry.add_to_hass(hass)
    device_entry = device_reg.async_get_or_create(
        config_entry_id=config_entry.entry_id,
        connections={(device_registry.CONNECTION_NETWORK_MAC, "12:34:56:AB:CD:EF")},
    )
    entity_reg.async_get_or_create(DOMAIN, "test", "5678", device_id=device_entry.id)
    hass.states.async_set("climate.test_5678", const.HVAC_MODE_COOL, {})
    hass.states.async_set("climate.test_5678", "attributes", {"supported_features": 17})
    expected_actions = [
        {
            "domain": DOMAIN,
            "type": "set_hvac_mode",
            "device_id": device_entry.id,
            "entity_id": "climate.test_5678",
        },
        {
            "domain": DOMAIN,
            "type": "set_preset_mode",
            "device_id": device_entry.id,
            "entity_id": "climate.test_5678",
        },
    ]
    actions = await async_get_device_automations(hass, "action", device_entry.id)
    assert_lists_same(actions, expected_actions)

async def test_get_action_hvac_only(hass, device_reg, entity_reg):
    """Test we get the expected actions from a climate."""
    config_entry = MockConfigEntry(domain="test", data={})
    config_entry.add_to_hass(hass)
    device_entry = device_reg.async_get_or_create(
        config_entry_id=config_entry.entry_id,
        connections={(device_registry.CONNECTION_NETWORK_MAC, "12:34:56:AB:CD:EF")},
    )
    entity_reg.async_get_or_create(DOMAIN, "test", "5678", device_id=device_entry.id)
    hass.states.async_set("climate.test_5678", const.HVAC_MODE_COOL, {})
    hass.states.async_set("climate.test_5678", "attributes", {"supported_features": 1})
    expected_actions = [
        {
            "domain": DOMAIN,
            "type": "set_hvac_mode",
            "device_id": device_entry.id,
            "entity_id": "climate.test_5678",
        },
    ]
    actions = await async_get_device_automations(hass, "action", device_entry.id)
    assert_lists_same(actions, expected_actions)

async def test_action(hass):
    """Test for actions."""
    hass.states.async_set(
        "climate.entity",
        const.HVAC_MODE_COOL,
        {
            const.ATTR_HVAC_MODES: [const.HVAC_MODE_COOL, const.HVAC_MODE_OFF],
            const.ATTR_PRESET_MODES: [const.PRESET_HOME, const.PRESET_AWAY],
        },
    )

    assert await async_setup_component(
        hass,
        automation.DOMAIN,
        {
            automation.DOMAIN: [
                {
                    "trigger": {
                        "platform": "event",
                        "event_type": "test_event_set_hvac_mode",
                    },
                    "action": {
                        "domain": DOMAIN,
                        "device_id": "abcdefgh",
                        "entity_id": "climate.entity",
                        "type": "set_hvac_mode",
                        "hvac_mode": const.HVAC_MODE_OFF,
                    },
                },
                {
                    "trigger": {
                        "platform": "event",
                        "event_type": "test_event_set_preset_mode",
                    },
                    "action": {
                        "domain": DOMAIN,
                        "device_id": "abcdefgh",
                        "entity_id": "climate.entity",
                        "type": "set_preset_mode",
                        "preset_mode": const.PRESET_AWAY,
                    },
                },
            ]
        },
    )

    set_hvac_mode_calls = async_mock_service(hass, "climate", "set_hvac_mode")
    set_preset_mode_calls = async_mock_service(hass, "climate", "set_preset_mode")

    hass.bus.async_fire("test_event_set_hvac_mode")
    await hass.async_block_till_done()
    assert len(set_hvac_mode_calls) == 1
    assert len(set_preset_mode_calls) == 0

    hass.bus.async_fire("test_event_set_preset_mode")
    await hass.async_block_till_done()
    assert len(set_hvac_mode_calls) == 1
    assert len(set_preset_mode_calls) == 1

async def test_capabilities(hass):
    """Test getting capabilities."""
    hass.states.async_set(
        "climate.entity",
        const.HVAC_MODE_COOL,
        {
            const.ATTR_HVAC_MODES: [const.HVAC_MODE_COOL, const.HVAC_MODE_OFF],
            const.ATTR_PRESET_MODES: [const.PRESET_HOME, const.PRESET_AWAY],
        },
    )

    # Set HVAC mode
    capabilities = await device_action.async_get_action_capabilities(
        hass,
        {
            "domain": DOMAIN,
            "device_id": "abcdefgh",
            "entity_id": "climate.entity",
            "type": "set_hvac_mode",
        },
    )

    assert capabilities and "extra_fields" in capabilities

    assert voluptuous_serialize.convert(
        capabilities["extra_fields"], custom_serializer=cv.custom_serializer
    ) == [
        {
            "name": "hvac_mode",
            "options": [("cool", "cool"), ("off", "off")],
            "required": True,
            "type": "select",
        }
    ]

    # Set preset mode
    capabilities = await device_action.async_get_action_capabilities(
        hass,
        {
            "domain": DOMAIN,
            "device_id": "abcdefgh",
            "entity_id": "climate.entity",
            "type": "set_preset_mode",
        },
    )

    assert capabilities and "extra_fields" in capabilities

    assert voluptuous_serialize.convert(
        capabilities["extra_fields"], custom_serializer=cv.custom_serializer
    ) == [
        {
            "name": "preset_mode",
            "options": [("home", "home"), ("away", "away")],
            "required": True,
            "type": "select",
        }
    ]
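The tests above lean on `assert_lists_same` from `tests.common`, which compares the expected and actual action lists without regard to ordering. A rough order-insensitive stand-in that works for unhashable items like dicts (my sketch, not the Home Assistant helper itself):

```python
def lists_same(a, b):
    """True if both lists hold the same items, ignoring order.

    Items need not be hashable, so this uses remove() instead of sets;
    duplicates are matched one-for-one.
    """
    remaining = list(b)
    if len(a) != len(remaining):
        return False
    for item in a:
        if item in remaining:
            remaining.remove(item)
        else:
            return False
    return True
```

Matching duplicates one-for-one (rather than converting to sets) is what makes the comparison safe for lists of dicts, as in the device-action payloads above.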
| 32.881773 | 88 | 0.594607 | 723 | 6,675 | 5.146611 | 0.160443 | 0.047299 | 0.032518 | 0.029024 | 0.746842 | 0.741467 | 0.733674 | 0.718624 | 0.675625 | 0.651975 | 0 | 0.011621 | 0.290936 | 6,675 | 202 | 89 | 33.044554 | 0.774562 | 0.021423 | 0 | 0.514793 | 0 | 0 | 0.164384 | 0.015746 | 0 | 0 | 0 | 0 | 0.071006 | 1 | 0.011834 | false | 0 | 0.047337 | 0 | 0.071006 | 0.005917 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
40869fc64c7b4f47893154af2bdf0e5cd337ecd8 | 90 | py | Python | yir/relay/__init__.py | zhiqwang/yir | 5dae3060e40ccc24804ef38591a03d4ca36b38c1 | [
"Apache-2.0"
] | 4 | 2021-11-16T08:54:02.000Z | 2021-11-16T10:39:31.000Z | yir/relay/__init__.py | zhiqwang/huluir | 5dae3060e40ccc24804ef38591a03d4ca36b38c1 | [
"Apache-2.0"
] | 4 | 2021-12-06T19:02:54.000Z | 2022-03-07T07:14:48.000Z | yir/relay/__init__.py | zhiqwang/huluir | 5dae3060e40ccc24804ef38591a03d4ca36b38c1 | [
"Apache-2.0"
] | null | null | null | # Copyright (c) 2021, Zhiqiang Wang. All Rights Reserved.
from .dot import JitVisualizer
| 22.5 | 57 | 0.766667 | 12 | 90 | 5.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.052632 | 0.155556 | 90 | 3 | 58 | 30 | 0.855263 | 0.611111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
409d52edcc2a0ccd941f44d0d5abeeaec3cbd1d1 | 4,164 | py | Python | test/python/testfiledatabase.py | neuml/paperetl | 5b2737db07c88d0249c9768b67e6fa8eb344eb6d | [
"Apache-2.0"
] | 95 | 2020-07-10T17:51:14.000Z | 2022-03-30T21:03:59.000Z | test/python/testfiledatabase.py | neuml/paperetl | 5b2737db07c88d0249c9768b67e6fa8eb344eb6d | [
"Apache-2.0"
] | 40 | 2020-07-24T20:07:44.000Z | 2021-12-17T18:09:53.000Z | test/python/testfiledatabase.py | neuml/paperetl | 5b2737db07c88d0249c9768b67e6fa8eb344eb6d | [
"Apache-2.0"
] | 12 | 2020-07-16T01:16:51.000Z | 2022-02-19T19:01:48.000Z | """
File ETL to database tests
"""
import sqlite3
# pylint: disable=E0401
from paperetl.file.execute import Execute
from testprocess import TestProcess
from utils import Utils

class TestFileDatabase(TestProcess):
    """
    File ETL to database tests
    """

    @classmethod
    def setUpClass(cls):
        """
        One-time initialization. Run File ETL process for the test dataset.
        """

        # Build articles database
        Execute.run(Utils.FILE + "/data", Utils.FILE + "/models", Utils.STUDY)

    def setUp(self):
        """
        Initialization run before each test. Opens a database connection.
        """

        # Connect to articles database
        self.db = sqlite3.connect(Utils.FILE + "/models/articles.sqlite")

        # Create database cursor
        self.cur = self.db.cursor()
    def testArticleCount(self):
        """
        Test number of articles
        """

        self.articleCount(17)
    def testArticles(self):
        """
        Test article metadata
        """

        hashes = {
            "1000": "6fe045e72f58c43e6f10275b1498cc54",
            "1001": "c4d0e20fc00eb09ccadaf6a6ba3d6d0d",
            "00398e4c637f5e5447e35e63669187f0239c0357": "3d33d69c9000a17a458d9295ab1e7457",
            "00c4c8c42473d25ebb38c4a8a14200c6900be2e9": "0285f5accebeff1397fbdd5aa9ee51c4",
            "17a845a8681cca77a4497462e797172148448d7d": "dcedb273321b4e936bcc0c7b8b85488c",
            "1d6a755d67e76049551898de66c95f77b9420b0c": "0346bb2747c4d4c1e06284cdba67c5df",
            "3d2fb136bbd9bd95f86fc49bdcf5ad08ada6913b": "54c9504e3b83b23cdd973d501528f5be",
            "5ea7c57e339a078196ec69223c4681fd7a5aab8b": "e3374c8a38f198c136e553642ca4a747",
            "6cb7a79749913fa0c2c3748cbfee2f654d5cea36": "ff7ef702bdc5908009c37bdb6b5fd1d5",
            "a09f0fcf41e01f2cdb5685b5000964797f679132": "1ef532b4d24ce367f5a1600ee9c93bef",
            "b9f6e3d2dd7d18902ac3a538789d836793dd48b2": "4f60700131c5015cee997b7a1b74948f",
            "dff0088d65a56e2673d11ad2f7a180687cab6f70": "d6a6cf1830e6088b8e63fb897edc7a2c",
            "33024096": "498a6bb5ab19795ae4f8243f18748309",
            "33046957": "734487f06cd1765d6f3cd8980ba91881",
            "33100476": "b38f08aa1694b850bf0efab9c01db89d",
            "33126180": "3dfe0a75a817edac9472bc10c8127a0c",
            "33268238": "4c6440d7532785c8486882daafbd183e",
        }

        self.articles(hashes)
    def testSectionCount(self):
        """
        Test number of sections
        """

        self.sectionCount(3640)
    def testSections(self):
        """
        Test section content
        """

        hashes = {
            "1000": "2ab25b41ef2b0ff011fad6ed5978cb3b",
            "1001": "000e1842e52760c4596e6e8db3b53be8",
            "00398e4c637f5e5447e35e63669187f0239c0357": "d6e428814a980bdbed5f50b2669f6785",
            "00c4c8c42473d25ebb38c4a8a14200c6900be2e9": "cfd8bf5cffec86b836e497ac4973ea05",
            "17a845a8681cca77a4497462e797172148448d7d": "422895ced816b2686ed4ea34c62e0f3a",
            "1d6a755d67e76049551898de66c95f77b9420b0c": "644e14e17727715a205f1f92a5d98d83",
            "3d2fb136bbd9bd95f86fc49bdcf5ad08ada6913b": "a3429b2fce7efa296282cf8923a37fe6",
            "5ea7c57e339a078196ec69223c4681fd7a5aab8b": "4ec8a19839db5adf64c50b250d8c1081",
            "6cb7a79749913fa0c2c3748cbfee2f654d5cea36": "d9f536cbca1abaab53682423d0b51af5",
            "a09f0fcf41e01f2cdb5685b5000964797f679132": "f04bb10cd15dea0180e9ae610a8ada9a",
            "b9f6e3d2dd7d18902ac3a538789d836793dd48b2": "79581ee4ec2f6b6339afba4df559c927",
            "dff0088d65a56e2673d11ad2f7a180687cab6f70": "d214010dbae4fff941875b4dfe57ee71",
            "33024096": "4507f361159aa92f0b2969f2bda62183",
            "33046957": "555c48f3611dbb12416588fdc2551e46",
            "33100476": "1b70d681039f3907266df6cb934400cb",
            "33126180": "0d182d334545b11561271247350a0ca0",
            "33268238": "d46b8f96bcc6d59c2d55d5bb5c257a8b",
        }

        self.sections(hashes)
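The expected values in these tests are 32-character hex digests, consistent with MD5 over each article's or section's text. Assuming that is how the test fixtures derive them (the hashing code itself lives elsewhere in paperetl and is not shown here), the core step looks roughly like:

```python
import hashlib

def content_hash(text):
    """MD5 hex digest of a text payload (illustrative, not paperetl's exact code)."""
    return hashlib.md5(text.encode("utf-8")).hexdigest()
```

Pinning a stable digest per article makes the ETL tests sensitive to any change in parsed content while keeping the fixtures tiny: a drift in parsing shows up as a single mismatched hash rather than a megabyte diff.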
| 41.227723 | 97 | 0.662104 | 201 | 4,164 | 13.716418 | 0.572139 | 0.011607 | 0.006529 | 0.012332 | 0.015959 | 0 | 0 | 0 | 0 | 0 | 0 | 0.422754 | 0.256964 | 4,164 | 100 | 98 | 41.64 | 0.468326 | 0.090538 | 0 | 0 | 0 | 0 | 0.557273 | 0.527463 | 0 | 0 | 0 | 0 | 0 | 1 | 0.113208 | false | 0 | 0.075472 | 0 | 0.207547 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
40b6dcb0a22af4a47844bcd381a0cb6e4fe1d8cc | 89 | py | Python | stinkin/lincoln/apps.py | NewsNerdsAtCoJMC/ProjectTicoTeam7 | 904ce0d74b986a64e8bedf31c499b6f57bb65863 | [
"MIT"
] | null | null | null | stinkin/lincoln/apps.py | NewsNerdsAtCoJMC/ProjectTicoTeam7 | 904ce0d74b986a64e8bedf31c499b6f57bb65863 | [
"MIT"
] | null | null | null | stinkin/lincoln/apps.py | NewsNerdsAtCoJMC/ProjectTicoTeam7 | 904ce0d74b986a64e8bedf31c499b6f57bb65863 | [
"MIT"
] | null | null | null | from django.apps import AppConfig
class LincolnConfig(AppConfig):
    name = 'lincoln'
| 14.833333 | 33 | 0.752809 | 10 | 89 | 6.7 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.168539 | 89 | 5 | 34 | 17.8 | 0.905405 | 0 | 0 | 0 | 0 | 0 | 0.078652 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
40dcce655a25818e5ce58c54ad016f1404c89e3e | 75 | py | Python | src/model/final_test.py | Nix-code/Nix-code-Neural-Machine-Translation-communication-system-to-diminish-language-barrier | f54d3ad007a3a6b20101522bf9cc6300fb89cdf7 | [
"MIT"
] | 9 | 2021-12-31T01:31:41.000Z | 2022-01-09T12:24:28.000Z | src/model/final_test.py | Nix-code/Nix-code-Neural-Machine-Translation-communication-system-to-diminish-language-barrier | f54d3ad007a3a6b20101522bf9cc6300fb89cdf7 | [
"MIT"
] | null | null | null | src/model/final_test.py | Nix-code/Nix-code-Neural-Machine-Translation-communication-system-to-diminish-language-barrier | f54d3ad007a3a6b20101522bf9cc6300fb89cdf7 | [
"MIT"
] | 2 | 2021-12-31T01:31:44.000Z | 2022-01-04T11:41:39.000Z | from translate import*
# load the saved latest checkpoint
mycheck()
| 10.714286 | 38 | 0.72 | 9 | 75 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.16 | 75 | 6 | 39 | 12.5 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
dc07cbb2de2936d0f628bedea3552a3b28f39dd3 | 225 | py | Python | files/ta/Splunk_TA_nix/bin/ta_unix/decorators.py | robruma/puppet-splunk | 48726344643642e37faa96561a2fab82a1088ae5 | [
"MIT"
] | 14 | 2015-01-08T15:50:36.000Z | 2019-11-02T15:31:51.000Z | files/ta/Splunk_TA_nix/bin/ta_unix/decorators.py | robruma/puppet-splunk | 48726344643642e37faa96561a2fab82a1088ae5 | [
"MIT"
] | 32 | 2015-02-08T14:24:41.000Z | 2020-07-08T17:16:30.000Z | files/ta/Splunk_TA_nix/bin/ta_unix/decorators.py | robruma/puppet-splunk | 48726344643642e37faa96561a2fab82a1088ae5 | [
"MIT"
] | 33 | 2015-02-17T21:01:35.000Z | 2021-02-23T10:39:49.000Z | import cherrypy
def host_app(fn):
    def decorator(self, *args, **kwargs):
        kwargs.update({'host_app': cherrypy.request.path_info.split('/')[3]})
        return fn(self, *args, **kwargs)
    return decorator
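The decorator injects a `host_app` keyword argument derived from `cherrypy.request.path_info`, taking the fourth `/`-separated segment. With a Splunk-style path such as `/en-US/custom/<app>/<endpoint>` (my example; the real URL layout may differ), the extraction reduces to a plain split (`host_app_from_path` is a hypothetical helper name):

```python
def host_app_from_path(path_info):
    """Fourth '/'-separated segment of the request path, as in the decorator above."""
    return path_info.split('/')[3]
```

Note that a leading `/` means `split('/')` yields an empty first element, so index 3 is the third real path segment; shorter paths would raise `IndexError`.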
| 20.454545 | 78 | 0.622222 | 28 | 225 | 4.892857 | 0.607143 | 0.10219 | 0.20438 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005682 | 0.217778 | 225 | 10 | 79 | 22.5 | 0.772727 | 0 | 0 | 0 | 0 | 0 | 0.040179 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.166667 | 0 | 0.833333 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 4 |
904c982a8d90ce6c2ab4748ae70fc0b98c35ec87 | 99 | py | Python | tests/utils.py | ninoseki/niteru | 9d05a221bffc6c399ce10586ce5546fdb0b483c8 | [
"MIT"
] | 3 | 2021-04-30T17:33:53.000Z | 2021-08-24T13:05:49.000Z | tests/utils.py | ninoseki/niteru | 9d05a221bffc6c399ce10586ce5546fdb0b483c8 | [
"MIT"
] | null | null | null | tests/utils.py | ninoseki/niteru | 9d05a221bffc6c399ce10586ce5546fdb0b483c8 | [
"MIT"
] | null | null | null | def almost_equal(x: float, y: float, threshold: float = 0.0001):
    return abs(x - y) < threshold
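`almost_equal` applies an absolute tolerance, which is what is needed when comparing floats produced by accumulated arithmetic; `0.1 + 0.2`, for instance, is not bit-identical to `0.3` in IEEE-754 doubles. A self-contained copy of the helper with that classic case:

```python
def almost_equal(x: float, y: float, threshold: float = 0.0001) -> bool:
    """Absolute-tolerance float comparison (copy of the helper above)."""
    return abs(x - y) < threshold

# 0.1 + 0.2 == 0.3 is False in binary floating point, but the error is ~5.6e-17,
# far inside the default 1e-4 threshold.
close = almost_equal(0.1 + 0.2, 0.3)
```

An absolute threshold like this suits similarity scores bounded to a small range; for values spanning many orders of magnitude a relative tolerance (as in `math.isclose`) would be the safer choice.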
| 33 | 64 | 0.666667 | 16 | 99 | 4.0625 | 0.6875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0625 | 0.191919 | 99 | 2 | 65 | 49.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 4 |
905ea98b3b9c8234e532f2dd1a21d958d38ce0f1 | 4,371 | py | Python | osgar/drivers/test_gps.py | m3d/osgar_archive_2020 | 556b534e59f8aa9b6c8055e2785c8ae75a1a0a0e | [
"MIT"
] | 12 | 2017-02-16T10:22:59.000Z | 2022-03-20T05:48:06.000Z | osgar/drivers/test_gps.py | m3d/osgar_archive_2020 | 556b534e59f8aa9b6c8055e2785c8ae75a1a0a0e | [
"MIT"
] | 618 | 2016-08-30T04:46:12.000Z | 2022-03-25T16:03:10.000Z | osgar/drivers/test_gps.py | robotika/osgar | 6f4f584d5553ab62c08a1c7bb493fefdc9033173 | [
"MIT"
] | 11 | 2016-08-27T20:02:55.000Z | 2022-03-07T08:53:53.000Z | import unittest
from osgar.drivers import gps


class GPSTest(unittest.TestCase):

    def test_checksum(self):
        self.assertEqual(gps.checksum(b'GNGGA,182433.10,5007.71882,N,01422.50467,E,1,05,6.09,305.1,M,44.3,M,,'), b'41')

    def test_str2ms(self):
        self.assertEqual(gps.str2ms(b'5007.71882'), 180463129)

    def test_str2ms_err_input(self):
        self.assertEqual(gps.str2ms(b'01422.489915\x0052'), None)  # the checksum is passing because a zero byte is added

    def test_parse_line(self):
        line = b'$GNGGA,182433.10,5007.71882,N,01422.50467,E,1,05,6.09,305.1,M,44.3,M,,*41'
        self.assertEqual(gps.parse_line(line), [51750280, 180463129])  # arc milliseconds (x, y) = (lon, lat)

    def test_split(self):
        buf = b'blabla$GPGGAsomething*12\r\n'
        self.assertEqual(gps.split_buffer(buf), (b'\r\n', b'$GPGGAsomething*12'))
        self.assertEqual(gps.split_buffer(b'$toofew*1'), (b'$toofew*1', b''))

    def test_parsing_old_GPGGA(self):
        line = b'$GPGGA,051852.000,5005.0244,N,01430.3360,E,1,06,3.8,253.1,M,45.4,M,,0000*58'
        self.assertEqual(gps.parse_line(line), [52220160, 180301464])  # arc milliseconds (x, y) = (lon, lat)
    def test_split_with_binary_data(self):
        buf = b'\xb5b\x010\x04\x01\xf8n\x8a\x0e\x15\x04\x00\x00\x05\x02\r\x07&&@\x00S\xff\xff\xff\x02\x04\x10\x07$\xa5' \
              b'\x00\x00\x00\x00\x00\x00\x13\x05\x04\x04\x15\x00h\x00\x00\x00\x00\x00\x03\x06\r\x07\x1b\r#\x00\xee\xfa' \
              b'\xff\xff\x00\x0c\r\x07/*`\x00\x1d\x00\x00\x00\x04\x0e\r\x07"\x17\x0b\x01\x12\x01\x00\x00\n\x18\x0c\x07"' \
              b'\x07\x9e\x00\x8b\x00\x00\x00\x01\x19\r\x070S8\x00\xc8\xff\xff\xff\x0f\x1a\x04\x01\x00\x04\x1f\x01\x00\x00' \
              b'\x00\x00\r\x1d\r\x071C\xe3\x00\xcf\xff\xff\xff\x06\x1f\r\x07&).\x01\xa2\x03\x00\x00\x14 \x00\x07$\xa5\x00' \
              b'\x00\x00\x00\x00\x00\x0cA\x04\x07\x1c\x05\x1b\x00\x00\x00\x00\x00\x12G\x04\x01\x00\x02*\x01\x00\x00\x00\x00' \
              b'\x0bH\x04\x04\x13\x0bW\x01\x00\x00\x00\x00\x08I\r\x07\x1b\x1c=\x01]\x02\x00\x00\x11O\r\x07)\x1c\x88\x00\xdf' \
              b'\xfe\xff\xff\x10P\r\x07(VD\x01}\xff\xff\xff\tQ\r\x07\',)\x00\x1a\x02\x00\x00\x07R\r\x07*N\xb5\x00\xd7\xfe' \
              b'\xff\xff\x0eS\r\x07!\x19\xd0\x00\xca\x00\x00\x00\x83\x17\xb5b\x01\x03\x10\x00\xf8n\x8a\x0e\x03\xdd\x00\x08' \
              b'iy\x00\x00\xdd\xd9\x03\x00\x95($GNGGA,194535.40,5007.71276,N,01422.49206,E,1,12,0.80,284.6,M,44.3,M,,*42\r\n'
        buf, data = gps.split_buffer(buf)
        self.assertTrue(data.startswith(bytes([0xB5, 0x62])))
        gps.parse_bin(data)
        buf, data = gps.split_buffer(buf)
        self.assertTrue(data.startswith(bytes([0xB5, 0x62])))
        gps.parse_bin(data)
        buf, nmea = gps.split_buffer(buf)
        self.assertEqual(nmea, b'$GNGGA,194535.40,5007.71276,N,01422.49206,E,1,12,0.80,284.6,M,44.3,M,,*42')
    def test_invalid_coordinates(self):
        self.assertEqual(gps.parse_line(b'$GPGGA,053446.426,,,,,0,00,,,M,0.0,M,,0000*56'), gps.INVALID_COORDINATES)

    def test_parse_rel_position(self):
        data = b'\xb5b\x01<(\x00\x00\x00\x00\x00\x00&\x08\x18\xd9\n\x00\x00N\xeb\xff\xff]\xff\xff\xffV\xd0\xd0\x00' \
               b'\xf6\x00\x00\x00)\x01\x00\x00\xea\x01\x00\x00\x0f\x00\x00\x00/\xc6'
        ret = gps.parse_bin(data)
        self.assertIsNotNone(ret)
        self.assertIn('rel_position', ret)
        self.assertEqual(ret['rel_position'], [-5298, 2777])

    def test_parse_velocity_and_heading(self):
        data = b'\xb5b\x01\x12$\x00\x00&\x08\x18\xfe\xff\xff\xff\x00\x00\x00\x00\xfd\xff\xff\xff' \
               b'\x04\x00\x00\x00\x02\x00\x00\x00\xd5e\x92\x01\x03\x00\x00\x00S\xcc,\x00\x93\x14'
        ret = gps.parse_bin(data)
        self.assertIsNone(ret)

    def test_parse_position_velocity_time_solution(self):
        data = b"\xb5b\x01\x07\\\x00\x00&\x08\x18\xe2\x07\x04\x1a\x0f;\x1d\xf7\xe9\x03\x00\x00" \
               b"\xf5\x1a\xe6\x0b\x03C\xea\x0e@`\x91\x08\x94\xfa\xe0\x1d\xd2\xf1\x04\x00\xadD\x04" \
               b"\x00'\x00\x00\x001\x00\x00\x00\xee\xff\xff\xff\x04\x00\x00\x00\xe2\xff\xff\xff\x12" \
               b"\x00\x00\x00\xd5e\x92\x01 \x00\x00\x00S\xcc,\x00\xa2\x00\x00\x00\x12\x19&;\x00" \
               b"\x00\x00\x00\x00\x00\x00\x00W\x03"
        ret = gps.parse_bin(data)
        self.assertIsNone(ret)


# vim: expandtab sw=4 ts=4
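The expected values in these tests encode two standard NMEA 0183 conventions: the checksum is the XOR of every byte between `$` and `*`, rendered as two uppercase hex digits, and coordinates arrive as `[d]ddmm.mmmm` (degrees plus decimal minutes), which OSGAR converts to integer arc milliseconds, i.e. `(degrees * 60 + minutes) * 60000`. Minimal reimplementations consistent with the fixtures above (sketches, not the actual `osgar.drivers.gps` code):

```python
def nmea_checksum(payload: bytes) -> bytes:
    """XOR of all payload bytes (the part between '$' and '*'), as two hex digits."""
    acc = 0
    for b in payload:
        acc ^= b
    return b'%02X' % acc

def str2ms(s: str) -> int:
    """Convert NMEA '[d]ddmm.mmmm' (degrees + decimal minutes) to arc milliseconds."""
    dot = s.index('.')
    degrees = int(s[:dot - 2])   # everything before the two minute digits
    minutes = float(s[dot - 2:])  # 'mm.mmmm' part
    return round((degrees * 60 + minutes) * 60000)
```

For the latitude `5007.71882` used throughout the tests this gives `(50 * 60 + 7.71882) * 60000 = 180463129`, matching the expected `test_str2ms` value.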
| 56.038462 | 124 | 0.641043 | 802 | 4,371 | 3.437656 | 0.269327 | 0.163221 | 0.127312 | 0.078346 | 0.367428 | 0.278201 | 0.206384 | 0.184621 | 0.15778 | 0.13239 | 0 | 0.248713 | 0.155571 | 4,371 | 77 | 125 | 56.766234 | 0.498239 | 0.034546 | 0 | 0.183333 | 0 | 0.216667 | 0.504389 | 0.47758 | 0 | 0 | 0.003796 | 0 | 0.266667 | 1 | 0.183333 | false | 0 | 0.033333 | 0 | 0.233333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
9076f84be858afb7efe68f05f078b81c502a2d92 | 84 | py | Python | pycodeanalyzer/core/languages/analyzers/__init__.py | miong/pycodeanalyzer | 6728d8f77385a1145db67952167710cf412b2343 | [
"MIT"
] | 3 | 2022-03-25T16:13:16.000Z | 2022-03-26T06:42:39.000Z | pycodeanalyzer/core/languages/analyzers/__init__.py | miong/pycodeanalyzer | 6728d8f77385a1145db67952167710cf412b2343 | [
"MIT"
] | null | null | null | pycodeanalyzer/core/languages/analyzers/__init__.py | miong/pycodeanalyzer | 6728d8f77385a1145db67952167710cf412b2343 | [
"MIT"
] | null | null | null | """Languages analyzer package.
Package of all parsers for supported languages.
"""
| 16.8 | 47 | 0.761905 | 10 | 84 | 6.4 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 84 | 4 | 48 | 21 | 0.888889 | 0.904762 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
909830c31710e9b28e6a74d98d27f92afdd668ed | 574 | py | Python | lib/galaxy/tool_shed/repository_type.py | rikeshi/galaxy | c536a877e4a9b3d12aa0d00fd4d5e705109a0d0a | [
"CC-BY-3.0"
] | 1,085 | 2015-02-18T16:14:38.000Z | 2022-03-30T23:52:07.000Z | lib/galaxy/tool_shed/repository_type.py | rikeshi/galaxy | c536a877e4a9b3d12aa0d00fd4d5e705109a0d0a | [
"CC-BY-3.0"
] | 11,253 | 2015-02-18T17:47:32.000Z | 2022-03-31T21:47:03.000Z | lib/galaxy/tool_shed/repository_type.py | rikeshi/galaxy | c536a877e4a9b3d12aa0d00fd4d5e705109a0d0a | [
"CC-BY-3.0"
] | 1,000 | 2015-02-18T16:18:10.000Z | 2022-03-29T08:22:56.000Z | REPOSITORY_DEPENDENCY_DEFINITION_FILENAME = 'repository_dependencies.xml'
REPOSITORY_SUITE_DEFINITION = 'repository_suite_definition'
TOOL_DEPENDENCY_DEFINITION = 'tool_dependency_definition'
TOOL_DEPENDENCY_DEFINITION_FILENAME = 'tool_dependencies.xml'
UNRESTRICTED = 'unrestricted'
types = [UNRESTRICTED, TOOL_DEPENDENCY_DEFINITION, REPOSITORY_SUITE_DEFINITION]
__all__ = (
    'REPOSITORY_DEPENDENCY_DEFINITION_FILENAME',
    'REPOSITORY_SUITE_DEFINITION',
    'TOOL_DEPENDENCY_DEFINITION',
    'TOOL_DEPENDENCY_DEFINITION_FILENAME',
    'UNRESTRICTED',
    'types',
)
| 33.764706 | 79 | 0.827526 | 53 | 574 | 8.320755 | 0.207547 | 0.362812 | 0.326531 | 0.385488 | 0.61678 | 0.421769 | 0.421769 | 0.421769 | 0.331066 | 0 | 0 | 0 | 0.097561 | 574 | 16 | 80 | 35.875 | 0.851351 | 0 | 0 | 0 | 0 | 0 | 0.45122 | 0.400697 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
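The repository_type module above is a flat constants module: module-level strings, a convenience `types` list, and `__all__` to control wildcard imports. As a hedged illustration of how such a module is typically consumed, the sketch below redefines the same constants locally and adds a hypothetical `is_known_type` helper (the helper is not part of the Galaxy module):

```python
# Constants mirror lib/galaxy/tool_shed/repository_type.py; the helper
# below is a hypothetical consumer added for illustration only.
REPOSITORY_SUITE_DEFINITION = 'repository_suite_definition'
TOOL_DEPENDENCY_DEFINITION = 'tool_dependency_definition'
UNRESTRICTED = 'unrestricted'

types = [UNRESTRICTED, TOOL_DEPENDENCY_DEFINITION, REPOSITORY_SUITE_DEFINITION]


def is_known_type(repo_type: str) -> bool:
    """Return True when repo_type is one of the recognized repository types."""
    return repo_type in types


print(is_known_type('unrestricted'))
print(is_known_type('not_a_real_type'))
```

Keeping the canonical values in one list means validation stays a simple membership test wherever the module is imported.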
90ab0e00395646d5033fdb478a3f998bfd70959f | 95 | py | Python | dqs_poc_app/apps.py | Stark27398/dqs | 5a88a5195927643108a26a8f02792c800d092733 | [
"MIT"
] | null | null | null | dqs_poc_app/apps.py | Stark27398/dqs | 5a88a5195927643108a26a8f02792c800d092733 | [
"MIT"
] | 2 | 2020-06-05T23:25:52.000Z | 2021-06-10T22:00:48.000Z | dqs_poc_app/apps.py | unknownjedi/dqs | 5a88a5195927643108a26a8f02792c800d092733 | [
"MIT"
] | null | null | null | from django.apps import AppConfig
class DqsPocAppConfig(AppConfig):
    name = 'dqs_poc_app'
| 15.833333 | 33 | 0.768421 | 12 | 95 | 5.916667 | 0.916667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.157895 | 95 | 5 | 34 | 19 | 0.8875 | 0 | 0 | 0 | 0 | 0 | 0.115789 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
90e6a76b53778063273175c30ae48af5319131b5 | 1,040 | py | Python | DigitalMeLib/servers/gedis/systemactors/chat.py | jdelrue/digital_me | e5c92c405c0cea419ce18d25863f35d1bfe5a428 | [
"Apache-2.0"
] | null | null | null | DigitalMeLib/servers/gedis/systemactors/chat.py | jdelrue/digital_me | e5c92c405c0cea419ce18d25863f35d1bfe5a428 | [
"Apache-2.0"
] | 72 | 2018-08-01T06:13:46.000Z | 2019-02-01T15:50:20.000Z | DigitalMeLib/servers/gedis/systemactors/chat.py | jdelrue/digital_me | e5c92c405c0cea419ce18d25863f35d1bfe5a428 | [
"Apache-2.0"
] | 2 | 2018-08-05T08:09:13.000Z | 2018-11-21T13:11:28.000Z | from jumpscale import j
JSBASE = j.application.jsbase_get_class()


class chat(JSBASE):
    """
    """

    def __init__(self):
        JSBASE.__init__(self)
        self.chatbot = j.servers.gedis.latest.chatbot
        # check self.chatbot.chatflows for the existing chatflows
        # all required commands are here

    def work_get(self, sessionid, schema_out):
        """
        ```in
        sessionid = "" (S)
        ```
        ```out
        cat = "" (S)
        msg = "" (S)
        error = "" (S)
        options = L(S)
        ```
        """
        res = self.chatbot.session_work_get(sessionid)
        return res

    def work_report(self, sessionid, result):
        """
        ```in
        sessionid = "" (S)
        result = "" (S)
        ```
        ```out
        ```
        """
        self.chatbot.session_work_set(sessionid, result)
        return

    def session_alive(self, sessionid, schema_out):
        # TODO:*1 check if greenlet is alive
        pass

    def ping(self):
        return 'PONG' | 20 | 65 | 0.506731 | 108 | 1,040 | 4.703704 | 0.481481 | 0.086614 | 0.074803 | 0.086614 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001517 | 0.366346 | 1,040 | 52 | 66 | 20 | 0.769348 | 0.264423 | 0 | 0 | 0 | 0 | 0.006667 | 0 | 0 | 0 | 0 | 0.019231 | 0 | 1 | 0.3125 | false | 0.0625 | 0.0625 | 0.0625 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 4 |
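The chat actor above simply proxies session work items through to a chatbot object (`session_work_get` / `session_work_set`). A minimal in-memory sketch of that get/report round trip follows; the `FakeChatbot` class is an assumption for illustration, since the real chatbot lives behind `j.servers.gedis`:

```python
class FakeChatbot:
    """In-memory stand-in for the gedis chatbot session store."""

    def __init__(self):
        self._work = {}

    def session_work_get(self, sessionid):
        return self._work.get(sessionid)

    def session_work_set(self, sessionid, result):
        self._work[sessionid] = result


class Chat:
    """Mirrors the work_get / work_report methods of the actor above."""

    def __init__(self, chatbot):
        self.chatbot = chatbot

    def work_get(self, sessionid):
        return self.chatbot.session_work_get(sessionid)

    def work_report(self, sessionid, result):
        self.chatbot.session_work_set(sessionid, result)


actor = Chat(FakeChatbot())
actor.work_report('sess1', {'msg': 'hello'})
result = actor.work_get('sess1')
```

The actor holds no state of its own, which is why both methods are one-liners over the chatbot.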
90f6c72806139881a8ebc1b7d89813853ca685a5 | 223 | py | Python | utils/paginator.py | mallusrgreatv2/PyHDISCORD | e414976441cbdb3a57b2c545ab164810bebe2e4b | [
"MIT"
] | 2 | 2021-07-05T12:00:39.000Z | 2021-07-05T12:00:49.000Z | utils/paginator.py | mallusrgreatv2/PyHDISCORD | e414976441cbdb3a57b2c545ab164810bebe2e4b | [
"MIT"
] | null | null | null | utils/paginator.py | mallusrgreatv2/PyHDISCORD | e414976441cbdb3a57b2c545ab164810bebe2e4b | [
"MIT"
] | null | null | null | import discord
from discord.ext.buttons import Paginator
class Pag(Paginator):
    async def teardown(self):
        try:
            await self.page.clear_reactions()
        except discord.HTTPException:
            pass | 24.777778 | 45 | 0.659193 | 25 | 223 | 5.84 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.273543 | 223 | 9 | 46 | 24.777778 | 0.901235 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.125 | 0.25 | 0 | 0.375 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 4 |
292d4ba914046b93da871cc64c66e0f7877c7898 | 153 | py | Python | ExerciciosPythonMundo1/53. Separando os dígitos.py | juniorppb/arquivos-python | 25690e9ba65352fc332a0517ef65f1042e44edff | [
"MIT"
] | null | null | null | ExerciciosPythonMundo1/53. Separando os dígitos.py | juniorppb/arquivos-python | 25690e9ba65352fc332a0517ef65f1042e44edff | [
"MIT"
] | null | null | null | ExerciciosPythonMundo1/53. Separando os dígitos.py | juniorppb/arquivos-python | 25690e9ba65352fc332a0517ef65f1042e44edff | [
"MIT"
] | null | null | null | frase = str(input('Digite o ano que você nasceu:'))
dividido = list(frase)  # list() separates the string into individual digit characters; split() would only break on whitespace and fail for input like "1990"
print(dividido[0])
print(dividido[1])
print(dividido[2])
print(dividido[3])
| 21.857143 | 51 | 0.718954 | 24 | 153 | 4.583333 | 0.666667 | 0.472727 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.028986 | 0.098039 | 153 | 6 | 52 | 25.5 | 0.768116 | 0 | 0 | 0 | 0 | 0 | 0.189542 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.666667 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 4 |
293f2dea2333a35854b9bfc524e0aa1a41adac7b | 121 | py | Python | hackpyapp_project/hackpyapp/admin.py | abdielfloresz/HackPyAppRep | 4f8110af44f5b9460383fc3f92b4056146da1bb4 | [
"MIT"
] | null | null | null | hackpyapp_project/hackpyapp/admin.py | abdielfloresz/HackPyAppRep | 4f8110af44f5b9460383fc3f92b4056146da1bb4 | [
"MIT"
] | null | null | null | hackpyapp_project/hackpyapp/admin.py | abdielfloresz/HackPyAppRep | 4f8110af44f5b9460383fc3f92b4056146da1bb4 | [
"MIT"
] | null | null | null | from django.contrib import admin
# Register your models here.
from .models import BodyPart
admin.site.register(BodyPart) | 24.2 | 32 | 0.818182 | 17 | 121 | 5.823529 | 0.647059 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115702 | 121 | 5 | 33 | 24.2 | 0.925234 | 0.214876 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
294414b219a98032b0a9eee382344f0e85e12208 | 206 | py | Python | FastDetector/models/CustomV1/__init__.py | Yoshi-E/Object-Localizer-Project | 0f55009581d207cce6345a3e2c44a8a91c9bb3c4 | [
"MIT"
] | null | null | null | FastDetector/models/CustomV1/__init__.py | Yoshi-E/Object-Localizer-Project | 0f55009581d207cce6345a3e2c44a8a91c9bb3c4 | [
"MIT"
] | null | null | null | FastDetector/models/CustomV1/__init__.py | Yoshi-E/Object-Localizer-Project | 0f55009581d207cce6345a3e2c44a8a91c9bb3c4 | [
"MIT"
] | null | null | null | from .model import FastModel
import sys
import os
# importing sibling package
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
if __name__ == "__main__":
    print("LOAD OK") | 20.6 | 76 | 0.747573 | 30 | 206 | 4.733333 | 0.666667 | 0.126761 | 0.183099 | 0.211268 | 0.225352 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.121359 | 206 | 10 | 77 | 20.6 | 0.78453 | 0.126214 | 0 | 0 | 0 | 0 | 0.083799 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0.166667 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
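The `sys.path.append` line above climbs two directory levels from `__file__` so a sibling package becomes importable. The double `os.path.dirname` call can be sanity-checked in isolation (the path below is a made-up example, not the real package layout):

```python
import os

# Example path standing in for abspath(__file__) of CustomV1/__init__.py.
path = os.path.join(os.sep, 'project', 'FastDetector', 'models',
                    'CustomV1', '__init__.py')

parent = os.path.dirname(path)        # strips the filename -> .../CustomV1
grandparent = os.path.dirname(parent) # strips the package dir -> .../models

print(grandparent)
```

Appending `grandparent` to `sys.path` makes every package directly under `models/` importable by name.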
29465cebc863e4386c2cd3e59ad0bf1aa0d5da8b | 102,740 | py | Python | synapse/tests/test_lib_grammar.py | kcreyts/synapse | fe740fd1e0febfa32f8d431b32ab48f8a0cf306e | [
"Apache-2.0"
] | 1 | 2021-02-15T22:07:05.000Z | 2021-02-15T22:07:05.000Z | synapse/tests/test_lib_grammar.py | kcreyts/synapse | fe740fd1e0febfa32f8d431b32ab48f8a0cf306e | [
"Apache-2.0"
] | null | null | null | synapse/tests/test_lib_grammar.py | kcreyts/synapse | fe740fd1e0febfa32f8d431b32ab48f8a0cf306e | [
"Apache-2.0"
] | null | null | null | import unittest
import lark # type: ignore
import synapse.exc as s_exc
import synapse.lib.parser as s_parser
import synapse.lib.datfile as s_datfile
import synapse.lib.grammar as s_grammar
import synapse.tests.utils as s_t_utils
# flake8: noqa: E501
_Queries = [
'metrics.edits.byprop inet:fqdn:domain --newv $lib.null',
'tee // comment',
'inet:fqdn=newp.com\n | tee\n { inet:fqdn } // faz\n | uniq',
'inet:fqdn=newp.com\n | tee\n { inet:fqdn }\n /* faz */\n | uniq',
'hehe.haha\xa0foo // a comment | uniq ',
'inet:ipv4 --> *',
'inet:ipv4 <-- *',
'inet:fqdn=woot.com\xa0[ <(refs)+ { media:news } ]',
'inet:fqdn=woot.com [ <(refs)+ { media:news } ]',
'$refs = refs media:news -($refs)> * -(#foo or #bar)',
'$refs = refs media:news <($refs)- (inet:ipv4,inet:ipv6) -(#foo or #bar)',
'media:news -(refs)> * -(#foo or #bar)',
'media:news <(refs)- $bar -(#foo or #bar)',
'media:news [ -(refs)> { inet:fqdn=woot.com } ]',
'media:news [ +(refs)> { inet:fqdn=woot.com } ]',
'cron add --monthly=-1:12:30 {#bar}',
'$foo=$(1 or 1 or 0)',
'$foo=$(1 and 1 and 0)',
'$var=tag1 #base.$var',
'test:str $var=tag1 +#base.$var@=2014',
'test:str $var=tag1 -> #base.$var',
'$var=hehe [test:str=foo :$var=heval]',
'[test:str=heval] test:str $var=hehe +:$var',
'[test:str=foo :tick=2019] $var=tick [-:$var]',
'test:str=foo $var=hehe :$var -> test:str',
'test:str=foo $var=seen [.$var=2019]',
'test:str $var="seen" +.$var',
'test:str=foo $var="seen" [ -.$var ] | spin | test:str=foo',
'$var=hehe [test:str=foo :$hehe=heval]',
'#tag.$bar',
'+#tag.$bar',
'+#tag.$bar.*',
'''#tag.$"escaped \\"string\\""''',
'''+#tag.$"escaped \\"string\\"".*''',
'''[+#tag.$"escaped \\"string\\""]''',
r'''test:str $"some\bvar"=$node.repr()''',
'$x = 0 while $($x < 10) { $x=$($x+1) [test:int=$x] }',
'[test:int?=4] [ test:int?=nonono ]',
'[test:int=4\xa0+?#hehe.haha +?#hehe.newp=newp +#hehe.yes=2020]',
'[test:str=foo :tick?=2019 ]',
'[test:str=a] switch $node.form() { hehe\xa0: {[+#baz]} }',
'[test:type10=2 :strprop=1] spin | test:type10 +$(:strprop) $foo=1 +$foo',
'inet:fqdn#xxx.xxxxxx.xxxx.xx for $tag in $node.tags(xxx.xxxxxx.*.xx) { <- edge:refs +#xx <- graph:cluster [ +#foo] ->edge:refs }',
' +(syn:tag~=aka.*.mal.*)',
'+(syn:tag^=aka or syn:tag^=cno or syn:tag^=rep)',
'[test:str=foo][test:int=42]',
'|help',
"[ test:str=abcd :tick=2015 +#cool ]",
'{ #baz } test:str=foo',
'##baz.faz',
'#$tag [ -#$tag ]',
'#$tag',
'#foo',
' #foo',
'#foo ',
'#hehe.haha',
'$hehe.haha',
'#test.bar +test:pivcomp -+> *',
'#test.bar +test:pivcomp -> *',
'#test.bar +test:str <+- *',
'#test.bar +test:str <- *',
'test:migr <- edge:refs',
'#test.bar -#test -+> *',
'#test.bar -#test -> *',
'#test.bar -#test <+- *',
'#test.bar -#test <- *',
'$bar=5.5.5.5 [ inet:ipv4=$bar ]',
'$blah = $lib.dict(foo=vertex.link) [ inet:fqdn=$blah.foo ]',
'($tick, $tock) = .seen',
'.created',
'.created<2010',
'.created>2010',
'.created*range=("2010", "?")',
'.created*range=(2010, 3001)',
'.created="2001"',
'.created="{created}"',
'.seen [ -.seen ]',
'.seen~="^r"',
"[graph:node='*' :type=m1]",
'[ geo:place="*" :latlong=(-30.0,20.22) ]',
'[ inet:asn=200 :name=visi ]',
'[ inet:dns:a = ( woot.com , 12.34.56.78 ) ]',
'[ inet:dns:a=$blob.split("|") ]',
'[ inet:dns:a=(vertex.link, 5.5.5.5) +#nope ]',
'[ inet:dns:a=(woot.com,1.2.3.4) ]',
'[ inet:dns:a=(woot.com, 1.2.3.4) +#yepr ]',
'[ inet:dns:a=(woot.com, 1.2.3.4) inet:dns:a=(vertex.link, 1.2.3.4) ]',
'[ inet:dns:a=(woot.com,1.2.3.4) .seen=(2015,2016) ]',
'[ inet:fqdn = hehe.com inet:ipv4 = 127.0.0.1 hash:md5 = d41d8cd98f00b204e9800998ecf8427e]',
'[ inet:fqdn = woot.com ]',
'[ inet:fqdn=vertex.link inet:ipv4=1.2.3.4 ]',
'[ inet:fqdn=woot.com +#bad=(2015,2016) ]',
'[ inet:fqdn=woot.com ] -> *',
'[ inet:fqdn=woot.com inet:fqdn=vertex.link ] [ inet:user = :zone ] +inet:user',
'[ inet:ipv4 = 94.75.194.194 :loc = nl ]',
'[ inet:ipv4=$foo ]',
'[ test:int=$hehe.haha ]',
'[ inet:ipv4=1.2.3.0/30 inet:ipv4=5.5.5.5 ]',
'[ inet:ipv4=1.2.3.4 :asn=2 ]',
'[ inet:ipv4=1.2.3.4 :loc=us inet:dns:a=(vertex.link,1.2.3.4) ]',
'[ inet:ipv4=1.2.3.4 ]',
'[ inet:ipv4=192.168.1.0/24]',
'[ inet:ipv4=4.3.2.1 :loc=zz inet:dns:a=(example.com,4.3.2.1) ]',
'[inet:ipv4=197.231.221.211 :asn=37560 :loc=lr.lo.voinjama :latlong="8.4219,-9.7478" :dns:rev=exit1.ipredator.se +#cno.anon.tor.exit = (2017/12/19, 2019/02/15) ]',
'[ inet:user=visi inet:user=whippit ]',
'[ test:comp=(10, haha) +#foo.bar -#foo.bar ]',
'[ test:comp=(127,newp) ] [test:comp=(127,127)]',
"[test:comp=(123, test) test:comp=(123, duck) test:comp=(123, mode)]",
'[ test:guid="*" :tick=2015 ]',
'[ test:guid="*" :tick=2016 ]',
'[ test:guid="*" :tick=2017 ]',
'[ test:pivcomp=(foo,bar) :tick=2018 ]',
'[ test:pivcomp=(foo,bar) ]',
'[ test:pivcomp=(hehe,haha) :tick=2015 +#foo=(2014,2016) ]',
'[ test:pivcomp=(xxx,yyy) :width=42 ]',
'[ test:str="foo bar" :tick=2018]',
'[ test:str=bar +#baz ]',
'[ test:str=foo +#$tag ]',
'test:str=foo +#$tag',
'[ test:str=foo +#bar ] +(#baz or not .seen)',
'[ test:str=foo +#bar ] +(not .seen)',
'[ test:str=foo +#bar ] { [ +#baz ] -#bar }',
'[ test:str=foo test:str=bar ] | sleep 10',
'[ test:str=foo test:str=bar ] | spin',
'[ test:str=foo test:str=bar ]',
'[ test:str=foo test:str=bar test:int=42 ]',
'[ test:str=haha +#bar=2015 ]',
'[ test:str=haha +#foo ]',
'[ test:str=hehe +#foo=(2014,2016) ]',
'[ test:str=hehe ]',
'[ test:str=oof +#bar ] { [ test:int=0xdeadbeef ] }',
'[ test:str=visi +#foo.bar ] -> # [ +#baz.faz ]',
'[ test:str=visi +#foo.bar ] -> #',
'[ test:str=visi test:int=20 +#foo.bar ]',
'[ test:str=woot +#foo=(2015,2018) +#bar .seen=(2014,2016) ]',
'[ test:str=woot +#foo=(2015,2018) .seen=(2014,2016) ]',
'[ test:str=woot +#foo=(2015,2018) ]',
'[ test:str=woot .seen=(2014,2015) ]',
'[ test:str=woot .seen=20 ]',
'[-#foo]',
'[edge:has=((test:str, foobar), (test:str, foo))]',
'[edge:refs=((test:comp, (2048, horton)), (test:comp, (4096, whoville)))]',
'[edge:refs=((test:comp, (9001, "A mean one")), (test:comp, (40000, greeneggs)))]',
'[edge:refs=((test:int, 16), (test:comp, (9999, greenham)))]',
'[edge:refs=((test:str, 123), (test:int, 123))]',
'[inet:dns:query=(tcp://1.2.3.4, "", 1)]',
'[inet:dns:query=(tcp://1.2.3.4, "foo*.haha.com", 1)]',
'[inet:ipv4=1.2.3.1-1.2.3.3]',
'[inet:ipv4=1.2.3.4 :asn=10] [meta:seen=(abcd, (inet:asn, 10))]',
'[meta:seen=(abcd, (test:str, pennywise))]',
'[meta:source=abcd +#omit.nopiv] [meta:seen=(abcd, (test:pivtarg, foo))]',
'[test:comp=(1234, 5678)]',
'[test:comp=(3, foob) +#meep.gorp +#bleep.zlorp +#cond]',
'[test:guid="*" :tick=2001]',
'[test:guid=abcd :tick=2015]',
'[test:int=1 test:int=2 test:int=3]',
'[test:int=10 :loc=us.va]',
'[test:int=2 :loc=us.va.sydney]',
'[test:int=20]',
'[test:int=3 :loc=""]',
'[test:int=4 :loc=us.va.fairfax]',
'[test:int=9 :loc=us.ओं]',
'[test:int=99999]',
'[test:pivcomp=(foo, 123)]',
'[test:str=beep test:str=boop]',
'[test:str=foo :tick=201808021201]',
'[test:str=hehe] | iden abcd | count',
'[test:str=hello]',
'edge:refs +:n1*range=((test:comp, (1000, green)), (test:comp, (3000, ham)))',
'edge:refs',
'edge:wentto',
'file:bytes:size=4',
'for $fqdn in $fqdns { [ inet:fqdn=$fqdn ] }',
'for ($fqdn, $ipv4) in $dnsa { [ inet:dns:a=($fqdn,$ipv4) ] }',
'for ($fqdn,$ipv4,$boom) in $dnsa { [ inet:dns:a=($fqdn,$ipv4) ] }',
'geo:place +geo:place:latlong*near=((34.1, -118.3), 10km)',
'geo:place -:latlong*near=((34.1, -118.3), 50m)',
'geo:place:latlong*near=(("34.118560", "-118.300370"), 2600m)',
'geo:place:latlong*near=(("34.118560", "-118.300370"), 50m)',
'geo:place:latlong*near=((0, 0), 50m)',
'geo:place:latlong*near=((34.1, -118.3), 10km)',
'geo:place=$place <- edge:has <- *',
'geo:place=$place <- edge:has <- ps:person',
'geo:place=abcd $latlong=:latlong $radius=:radius | spin | tel:mob:telem:latlong*near=($latlong, 3km)',
'graph:cluster=abcd | noderefs -d 2 --join',
'help',
'iden 2cdd997872b10a65407ad5fadfa28e0d',
'iden deadb33f',
'$foo=42 iden deadb33f',
'inet:asn=10 | noderefs -of inet:ipv4 --join -d 3',
'inet:dns:a +{ :ipv4 -> inet:ipv4 +:loc=us }',
'inet:dns:a +{ :ipv4 -> inet:ipv4 -:loc=us }',
'inet:dns:a -{ :ipv4 -> inet:ipv4 +:loc=us }',
'inet:dns:a -{ :ipv4 -> inet:ipv4 -:loc=us }',
'inet:dns:a :ipv4 -> *',
'inet:dns:a = (woot.com, 12.34.56.78) [ .seen=( 201708010123, 201708100456 ) ]',
'inet:dns:a = (woot.com, 12.34.56.78) [ .seen=( 201708010123, \"?\" ) ]',
'inet:dns:a',
'inet:dns:a=(woot.com,1.2.3.4) $hehe=:fqdn +:fqdn=$hehe',
'inet:dns:a=(woot.com,1.2.3.4) $hehe=:fqdn -:fqdn=$hehe',
'inet:dns:a=(woot.com,1.2.3.4) $hehe=:fqdn inet:fqdn=$hehe',
'inet:dns:a=(woot.com,1.2.3.4) $newp=.seen',
'inet:dns:a=(woot.com,1.2.3.4) $seen=.seen :fqdn -> inet:fqdn [ .seen=$seen ]',
'inet:dns:a=(woot.com,1.2.3.4) [ .seen=(2015,2018) ]',
'inet:dns:query=(tcp://1.2.3.4, "", 1) :name -> inet:fqdn',
'inet:dns:query=(tcp://1.2.3.4, "foo*.haha.com", 1) :name -> inet:fqdn',
'inet:fqdn +#bad $fqdnbad=#bad -> inet:dns:a:fqdn +.seen@=$fqdnbad',
'inet:fqdn=woot.com -> inet:dns:a -> inet:ipv4',
'inet:fqdn=woot.com -> inet:dns:a',
'inet:fqdn=woot.com | delnode',
'inet:fqdn | graph --filter { -#nope }',
'inet:fqdn=woot.com',
'inet:ipv4 +:asn::name=visi',
'inet:ipv4 +inet:ipv4=1.2.3.0/30',
'inet:ipv4 +inet:ipv4=1.2.3.1-1.2.3.3',
'inet:ipv4 +inet:ipv4=10.2.1.4/32',
'inet:ipv4 -> test:str',
'inet:ipv4 | reindex --subs',
'inet:ipv4:loc=us',
'inet:ipv4:loc=zz',
'inet:ipv4=1.2.3.1-1.2.3.3',
'inet:ipv4=192.168.1.0/24',
'inet:ipv4=1.2.3.4 +:asn',
'inet:ipv4=1.2.3.4 +{ -> inet:dns:a } < 2 ',
'inet:ipv4=1.2.3.4 +( { -> inet:dns:a }<=1 )',
'inet:ipv4=1.2.3.4 +( { -> inet:dns:a } !=2 )',
'inet:ipv4=1.2.3.4|limit 20',
'inet:ipv4=12.34.56.78 [ :loc = us.oh.wilmington ]',
'inet:ipv4=12.34.56.78 inet:fqdn=woot.com [ inet:ipv4=1.2.3.4 :asn=10101 inet:fqdn=woowoo.com +#my.tag ]',
'inet:user | limit --woot',
'inet:user | limit 1',
'inet:user | limit 10 | +inet:user=visi',
'inet:user | limit 10 | [ +#foo.bar ]',
'media:news = 00a1f0d928e25729b9e86e2d08c127ce [ :summary = \"\" ]',
'meta:seen:meta:source=$sorc -> *',
'meta:seen:meta:source=$sorc :node -> *',
'meta:source=8f1401de15918358d5247e21ca29a814',
'movetag a.b a.m',
'movetag hehe woot',
'ps:person=$pers -> edge:has -> *',
'ps:person=$pers -> edge:has -> geo:place',
'ps:person=$pers -> edge:wentto +:time@=(2014,2017) -> geo:place',
'ps:person=$pers -> edge:wentto -> *',
'ps:person=$pers -> edge:wentto :n2 -> *',
'reindex --form-counts',
'sudo | [ inet:ipv4=1.2.3.4 ]',
'sudo | [ test:cycle0=foo :test:cycle1=bar ]',
'sudo | [ test:guid="*" ]',
'sudo | [ test:str=foo +#lol ]',
'sudo | [ test:str=foo ]',
'sudo | [test:str=123 :tick=2018]',
'sudo | test:int=6 | delnode',
'syn:tag=a.b +#foo',
'syn:tag=aaa.barbarella.ddd',
'syn:tag=baz.faz [ +#foo.bar ]',
'syn:tag=foo.bar -> *',
'syn:tag=foo.bar -> test:str',
'syn:tag=foo.bar -> test:str:tick',
'test:comp +(:hehe<2 and :haha=test)',
'test:comp +(:hehe<2 or #meep.gorp)',
'test:comp +(:hehe<2 or :haha=test)',
'test:comp +:haha*range=(grinch, meanone)',
'test:comp +test:comp*range=((1024, grinch), (4096, zemeanone))',
'test:comp -> * | uniq | count',
'test:comp -> *',
'test:comp -> test:int',
'test:comp:haha~="^lulz"',
'test:comp:haha~="^zerg"',
'test:comp#bar +:hehe=1010 +:haha=test10 +#bar',
'test:guid +test:guid*range=(abcd, dcbe)',
'test:guid | max tick',
'test:guid | min tick',
'test:int +:loc=""',
'test:int +:loc="us.va. syria"',
'test:int +:loc=u',
'test:int +:loc=us',
'test:int +:loc=us.v',
'test:int +:loc=us.va.sydney',
'test:int +:loc^=""',
'test:int +:loc^=23',
'test:int +:loc^=u',
'test:int +:loc^=us',
'test:int +:loc^=us.',
'test:int +:loc^=us.va.',
'test:int +:loc^=us.va.fairfax.reston',
'test:int +test:int<30',
'test:int +test:int<=30',
'test:int <=20',
'test:int | noderefs | +test:comp*range=((1000, grinch), (4000, whoville))',
'test:int:loc=""',
'test:int:loc=u',
'test:int:loc=us',
'test:int:loc^=""',
'test:int:loc^=23',
'test:int:loc^=u',
'test:int:loc^=us',
'test:int:loc^=us.',
'test:int:loc^=us.va.fairfax.reston',
'test:int<30',
'test:int<=30',
'test:int=123 | noderefs -te',
'test:int=123 | noderefs',
'test:int=1234 [test:str=$node.form()] -test:int',
'test:int=1234 [test:str=$node.value()] -test:int',
'test:int=3735928559',
'test:int=8675309',
'test:int>30',
'test:int>=20',
'test:pivcomp -> test:int',
'test:pivcomp | noderefs --join --degrees 2',
'test:pivcomp | noderefs --join -d 3',
'test:pivcomp | noderefs --join',
'test:pivcomp | noderefs -j --degrees 2',
'test:pivcomp | noderefs',
'test:pivcomp:tick=$foo',
'test:pivcomp=$foo',
'test:pivcomp=(foo,bar) +{ :lulz -> test:str +#baz } +test:pivcomp',
'test:pivcomp=(foo,bar) -+> *',
'test:pivcomp=(foo,bar) -+> test:pivtarg',
'test:pivcomp=(foo,bar) -> *',
'test:pivcomp=(foo,bar) -> test:pivtarg',
'test:pivcomp=(foo,bar) -{ :lulz -> test:str +#baz }',
'test:pivcomp=(foo,bar) :lulz -+> test:str',
'test:pivcomp=(foo,bar) :lulz -> test:str',
'test:pivcomp=(foo,bar) :targ -> test:pivtarg',
'test:pivcomp=(hehe,haha) $ticktock=#foo -> test:pivtarg +.seen@=$ticktock',
'test:pivcomp=(hehe,haha)',
'test:pivtarg=hehe [ .seen=2015 ]',
'test:str +#*',
'test:str +#**.bar.baz',
'test:str +#**.baz',
'test:str +#*.bad',
'test:str +#foo.**.baz',
'test:str +#foo.*.baz',
'#foo@=("2013", "2015")',
'test:str +#foo@=(2014, 20141231)',
'test:str +#foo@=(2015, 2018)',
'test:str +#foo@=2016',
'test:str +:bar*range=((test:str, c), (test:str, q))',
'test:str +:tick*range=(19701125, 20151212)',
'test:str +:tick=($test, "+- 2day")',
'test:str +:tick=(2015, "+1 day")',
'test:str +:tick=(20150102, "-3 day")',
'test:str +:tick=(20150201, "+1 day")',
'test:str +:tick=2015',
'test:str +:tick@="-1 day"',
'test:str +:tick@=("now+2days", "-3 day")',
'test:str +:tick@=("now-1day", "?")',
'test:str +:tick@=2015',
'test:str +:tick@=(2015, "+1 day")',
'test:str +:tick@=(20150102+1day, "-4 day")',
'test:str +:tick@=(20150102, "-4 day")',
'test:str +:tick@=(now, "-1 day")',
'test:str +test:str:tick<201808021202',
'test:str +test:str:tick<=201808021202',
'test:str +test:str:tick>201808021202',
'test:str +test:str:tick>=201808021202',
'test:str -#*',
'test:str [+#foo.bar=(2000,2002)]',
'test:str [+#foo.bar=(2000,20020601)]',
'test:str [+#foo.bar]',
'test:str [-#foo]',
'test:str [-:tick]',
'test:str | delnode --force',
'test:str | noderefs -d 3 --unique',
'test:str | noderefs -d 3',
'test:str#foo',
'test:str#foo.bar',
'test:str#foo@=(2012,2022)',
'test:str#foo@=2016',
'test:str',
'test:str:tick<201808021202',
'test:str:tick<=201808021202',
'test:str:tick=(20131231, "+2 days")',
'test:str:tick=2015',
'test:str:tick>201808021202',
'test:str:tick>=201808021202',
'test:str= foo',
'test:str="foo bar" +test:str',
'test:str="foo bar" -test:str:tick',
'test:str="foo bar" [ -:tick ]',
'test:str=$foo',
'test:str=123 [:baz="test:guid:tick=2015"]',
'test:str=123 | noderefs --traverse-edge',
'test:str=123 | noderefs',
'test:str=1234 test:str=duck test:str=knight',
'test:str=a +:tick*range=(20000101, 20101201)',
'test:str=bar -+> test:pivcomp:lulz',
'test:str=bar -> test:pivcomp:lulz',
'test:str=bar <+- *',
'test:str=bar <- *',
'test:str=bar test:pivcomp=(foo,bar) [+#test.bar]',
'test:str=foo +#lol@=2016',
'test:str=foo <+- edge:has',
'test:str=foo <- edge:has',
'test:str=foo | delnode',
'test:str=foobar -+> edge:has',
'test:str=foobar -> edge:has <+- test:str',
'test:str=foobar -> edge:has <- test:str',
'test:str=hello [:tick="2001"]',
'test:str=hello [:tick="2002"]',
'test:str=pennywise | noderefs --join -d 9 --traverse-edge',
'test:str=pennywise | noderefs -d 3 --omit-traversal-tag=omit.nopiv --omit-traversal-tag=test',
'test:str=visi -> #*',
'test:str=visi -> #foo.*',
'test:str=woot $foo=#foo +.seen@=$foo',
'test:str=woot +.seen@=#bar',
'test:str=woot +.seen@=(2012,2015)',
'test:str=woot +.seen@=2012',
'test:str~="zip"',
'''
for $foo in $foos {
($fqdn, $ipv4) = $foo.split("|")
[ inet:dns:a=($fqdn, $ipv4) ]
} ''',
''' /* A comment */ test:int ''',
''' test:int // a comment''',
'''/* multi
line */ test:int ''',
'''
inet:fqdn | graph
--degrees 2
--filter { -#nope }
--pivot { <- meta:seen <- meta:source }
--form-pivot inet:fqdn {<- * | limit 20}
--form-pivot inet:fqdn {-> * | limit 20}
--form-filter inet:fqdn {-inet:fqdn:issuffix=1}
--form-pivot syn:tag {-> *}
--form-pivot * {-> #} ''',
'''
for $foo in $foos {
($fqdn, $ipv4) = $foo.split("|")
[ inet:dns:a=($fqdn, $ipv4) ]
} ''',
'''
for $tag in $node.tags() {
-> test:int [ +#$tag ]
} ''',
'''
for $tag in $node.tags(fo*) {
-> test:int [ -#$tag ]
}
''',
'''
[
inet:email:message="*"
:to=woot@woot.com
:from=visi@vertex.link
:replyto=root@root.com
:subject="hi there"
:date=2015
:body="there are mad sploitz here!"
:bytes="*"
]
{[ inet:email:message:link=($node, https://www.vertex.link) ]}
{[ inet:email:message:attachment=($node, "*") ] -inet:email:message [ :name=sploit.exe ]}
{[ edge:has=($node, ('inet:email:header', ('to', 'Visi Kensho <visi@vertex.link>'))) ]}
''',
'$x = $(1 / 3)',
'$x = $(1 * 3)',
'$x = $(1 * 3 + 2)',
'$x = $(1 -3.2 / -3.2)',
'$x = $(1 + 3 / 2 )',
'$x = $((1 + 3)/ 2)',
'$foo=42 $foo2=43 $x = $($foo * $foo2)',
'$yep=$(42 < 43)',
'$yep=$(42 > 43)',
'$yep=$(42 >= 43)',
'$yep=$(42 + 4 <= 43 * 43)',
'$foo=4.3 $bar=4.2 $baz=$($foo + $bar)',
'inet:ipv4=1 $foo=.created $bar=$($foo +1 )',
"$x=$($lib.time.offset('2 days'))",
'$foo = 1 $bar = 2 inet:ipv4=$($foo + $bar)',
'',
'hehe.haha --size 10 --query "foo_bar.stuff:baz"',
'if $foo {[+#woot]}',
'if $foo {[+#woot]} else {[+#nowoot]}',
'if $foo {[+#woot]} elif $(1-1) {[+#nowoot]}',
'if $foo {[+#woot]} elif $(1-1) {[+#nowoot]} else {[+#nonowoot] }',
'if ($data ~= "hehe") {$lib.print(yes)} else {$lib.print(no)}',
'$foo=$(1 or 0 and 0)',
'$foo=$(not 1 and 1)',
'$foo=$(not 1 > 1)',
'#baz.faz:lol',
'foo:bar#baz.faz:lol',
'#baz.faz:lol=20',
'foo:bar#baz.faz:lol=20',
'+#foo.bar:lol',
'+#foo.bar:lol=20',
'[ -#baz.faz:lol ]',
'[ +#baz.faz:lol=20 ]',
'#tag:somegeoloctypebecauseihatelife*near=($lat, $long)',
'*$foo*near=20',
'[ test:str = $foo.woot.var.$bar.mar.$car ]',
'test:str = $foo.$\'space key\'.subkey',
'''
for $iterkey in $foo.$"bar key".$\'biz key\' {
inet:ipv4=$foo.$"bar key".$\'biz key\'.$iterkey
}
''',
''' [(ou:org=c71cd602f73af5bed208da21012fdf54 :loc=us )]''',
'function x(y, z) { return ($( $x - $y ) ) }',
'function echo(arg) { return ($arg) }',
'function a(arg){}',
'function a (arg) {return ( ) }',
'$name = asdf $foo = $lib.dict() $foo.bar = asdf $foo."bar baz" = asdf $foo.$name = asdf',
'[test:str=a] switch $node.form() { hehe: {[+#baz]} }',
'[test:str=a] switch $woot { hehe: {[+#baz]} }',
'[test:str=c] switch $woot { hehe: {[+#baz]} *: {[+#jaz]} }',
'[test:str=c] switch $woot { hehe: {[+#baz]} "haha hoho": {[+#faz]} "lolz:lulz": {[+#jaz]} }',
'''
/* A
multiline
comment */
[ inet:ipv4=1.2.3.4 ] // this is a comment
// and this too...
switch $foo {
// The bar case...
bar: {
[ +#hehe.haha ]
}
/*
The
baz
case
*/
'baz faz': {}
} ''',
'''
for $foo in $foos {
[ inet:ipv4=1.2.3.4 ]
switch $foo {
bar: { [ +#ohai ] break }
baz: { [ +#visi ] continue }
}
[ inet:ipv4=5.6.7.8 ]
[ +#hehe ]
} ''',
'switch $a { "a": { } }',
'switch $a { "test:str" : { } *: {}}',
'switch $a { "test:this:works:" : { } * : {}}',
'''switch $a { 'single:quotes' : { } "doubele:quotes": {} noquotes: { } * : {}}''',
'switch $category { } switch $type { *: { } }',
]
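`_Queries` pairs positionally with `_ParseResults` below: entry *i* of `_ParseResults` is the expected stringified parse tree for query *i*. A toy sketch of that parallel-list comparison follows; `toy_parse` is a stand-in for the real lark-based parser that the tests reach through `synapse.lib.parser`:

```python
queries = ['test:str', 'inet:fqdn']
expected = [
    'Query: [LiftProp: [Const: test:str]]',
    'Query: [LiftProp: [Const: inet:fqdn]]',
]


def toy_parse(text):
    # Stand-in: the real tests parse the query and stringify the AST.
    return 'Query: [LiftProp: [Const: %s]]' % (text,)


for text, want in zip(queries, expected):
    assert toy_parse(text) == want, (text, want)
```

Keeping the queries and expected trees in two parallel lists makes regenerating the expectations (as the "Generated with print_parse_list" comment notes) a mechanical step.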
# Generated with print_parse_list below
_ParseResults = [
'Query: [CmdOper: [Const: metrics.edits.byprop, List: [Const: inet:fqdn:domain, Const: --newv, VarDeref: [VarValue: [Const: lib], Const: null]]]]',
'Query: [CmdOper: [Const: tee, Const: ()]]',
'Query: [LiftPropBy: [Const: inet:fqdn, Const: =, Const: newp.com], CmdOper: [Const: tee, List: [ArgvQuery: [Query: [LiftProp: [Const: inet:fqdn]]]]], CmdOper: [Const: uniq, Const: ()]]',
'Query: [LiftPropBy: [Const: inet:fqdn, Const: =, Const: newp.com], CmdOper: [Const: tee, List: [ArgvQuery: [Query: [LiftProp: [Const: inet:fqdn]]]]], CmdOper: [Const: uniq, Const: ()]]',
'Query: [CmdOper: [Const: hehe.haha, List: [Const: foo]]]',
'Query: [LiftProp: [Const: inet:ipv4], N1WalkNPivo: [], isjoin=False]',
'Query: [LiftProp: [Const: inet:ipv4], N2WalkNPivo: [], isjoin=False]',
'Query: [LiftPropBy: [Const: inet:fqdn, Const: =, Const: woot.com], EditEdgeAdd: [Const: refs, SubQuery: [Query: [LiftProp: [Const: media:news]]]]]',
'Query: [LiftPropBy: [Const: inet:fqdn, Const: =, Const: woot.com], EditEdgeAdd: [Const: refs, SubQuery: [Query: [LiftProp: [Const: media:news]]]]]',
'Query: [SetVarOper: [Const: refs, Const: refs], LiftProp: [Const: media:news], N1Walk: [VarValue: [Const: refs], Const: *], FiltOper: [Const: -, OrCond: [TagCond: [TagMatch: [Const: foo]], TagCond: [TagMatch: [Const: bar]]]]]',
'Query: [SetVarOper: [Const: refs, Const: refs], LiftProp: [Const: media:news], N2Walk: [VarValue: [Const: refs], List: [Const: inet:ipv4, Const: inet:ipv6]], FiltOper: [Const: -, OrCond: [TagCond: [TagMatch: [Const: foo]], TagCond: [TagMatch: [Const: bar]]]]]',
'Query: [LiftProp: [Const: media:news], N1Walk: [Const: refs, Const: *], FiltOper: [Const: -, OrCond: [TagCond: [TagMatch: [Const: foo]], TagCond: [TagMatch: [Const: bar]]]]]',
'Query: [LiftProp: [Const: media:news], N2Walk: [Const: refs, VarValue: [Const: bar]], FiltOper: [Const: -, OrCond: [TagCond: [TagMatch: [Const: foo]], TagCond: [TagMatch: [Const: bar]]]]]',
'Query: [LiftProp: [Const: media:news], EditEdgeDel: [Const: refs, SubQuery: [Query: [LiftPropBy: [Const: inet:fqdn, Const: =, Const: woot.com]]]]]',
'Query: [LiftProp: [Const: media:news], EditEdgeAdd: [Const: refs, SubQuery: [Query: [LiftPropBy: [Const: inet:fqdn, Const: =, Const: woot.com]]]]]',
'Query: [CmdOper: [Const: cron, List: [Const: add, Const: --monthly, Const: -1:12:30, ArgvQuery: [Query: [LiftTag: [TagName: [Const: bar]]]]]]]',
'Query: [SetVarOper: [Const: foo, DollarExpr: [ExprOrNode: [ExprOrNode: [Const: 1, Const: or, Const: 1], Const: or, Const: 0]]]]',
'Query: [SetVarOper: [Const: foo, DollarExpr: [ExprAndNode: [ExprAndNode: [Const: 1, Const: and, Const: 1], Const: and, Const: 0]]]]',
'Query: [SetVarOper: [Const: var, Const: tag1], LiftTag: [TagName: [Const: base, VarValue: [Const: var]]]]',
'Query: [LiftProp: [Const: test:str], SetVarOper: [Const: var, Const: tag1], FiltOper: [Const: +, TagValuCond: [TagMatch: [Const: base, VarValue: [Const: var]], Const: @=, Const: 2014]]]',
'Query: [LiftProp: [Const: test:str], SetVarOper: [Const: var, Const: tag1], PivotToTags: [TagMatch: [Const: base, VarValue: [Const: var]]], isjoin=False]',
'Query: [SetVarOper: [Const: var, Const: hehe], EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: foo], EditPropSet: [RelProp: [VarValue: [Const: var]], Const: =, Const: heval]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: heval], LiftProp: [Const: test:str], SetVarOper: [Const: var, Const: hehe], FiltOper: [Const: +, HasRelPropCond: [RelProp: [VarValue: [Const: var]]]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: foo], EditPropSet: [RelProp: [Const: tick], Const: =, Const: 2019], SetVarOper: [Const: var, Const: tick], EditPropDel: [RelProp: [VarValue: [Const: var]]]]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: foo], SetVarOper: [Const: var, Const: hehe], PropPivot: [RelPropValue: [RelProp: [VarValue: [Const: var]]], AbsProp: test:str], isjoin=False]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: foo], SetVarOper: [Const: var, Const: seen], EditPropSet: [UnivProp: [VarValue: [Const: var]], Const: =, Const: 2019]]',
'Query: [LiftProp: [Const: test:str], SetVarOper: [Const: var, Const: seen], FiltOper: [Const: +, HasRelPropCond: [UnivProp: [VarValue: [Const: var]]]]]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: foo], SetVarOper: [Const: var, Const: seen], EditUnivDel: [UnivProp: [VarValue: [Const: var]]], CmdOper: [Const: spin, Const: ()], LiftPropBy: [Const: test:str, Const: =, Const: foo]]',
'Query: [SetVarOper: [Const: var, Const: hehe], EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: foo], EditPropSet: [RelProp: [VarValue: [Const: hehe]], Const: =, Const: heval]]',
'Query: [LiftTag: [TagName: [Const: tag, VarValue: [Const: bar]]]]',
'Query: [FiltOper: [Const: +, TagCond: [TagMatch: [Const: tag, VarValue: [Const: bar]]]]]',
'Query: [FiltOper: [Const: +, TagCond: [TagMatch: [Const: tag, VarValue: [Const: bar], Const: *]]]]',
'Query: [LiftTag: [TagName: [Const: tag, VarValue: [Const: "escaped \\"string\\""]]]]',
'Query: [FiltOper: [Const: +, TagCond: [TagMatch: [Const: tag, VarValue: [Const: "escaped \\"string\\""], Const: *]]]]',
'Query: [EditTagAdd: [TagName: [Const: tag, VarValue: [Const: "escaped \\"string\\""]]]]',
'Query: [LiftProp: [Const: test:str], SetVarOper: [Const: some\x08var, FuncCall: [VarDeref: [VarValue: [Const: node], Const: repr], CallArgs: [], CallKwargs: []]]]',
'Query: [SetVarOper: [Const: x, Const: 0], WhileLoop: [DollarExpr: [ExprNode: [VarValue: [Const: x], Const: <, Const: 10]], SubQuery: [Query: [SetVarOper: [Const: x, DollarExpr: [ExprNode: [VarValue: [Const: x], Const: +, Const: 1]]], EditNodeAdd: [FormName: [Const: test:int], Const: =, VarValue: [Const: x]]]]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:int], Const: ?=, Const: 4], EditNodeAdd: [FormName: [Const: test:int], Const: ?=, Const: nonono]]',
'Query: [EditNodeAdd: [FormName: [Const: test:int], Const: =, Const: 4], EditTagAdd: [Const: ?, TagName: [Const: hehe.haha]], EditTagAdd: [Const: ?, TagName: [Const: hehe.newp], Const: newp], EditTagAdd: [TagName: [Const: hehe.yes], Const: 2020]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: foo], EditPropSet: [RelProp: [Const: tick], Const: ?=, Const: 2019]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: a], SwitchCase: [FuncCall: [VarDeref: [VarValue: [Const: node], Const: form], CallArgs: [], CallKwargs: []], CaseEntry: [Const: hehe, SubQuery: [Query: [EditTagAdd: [TagName: [Const: baz]]]]]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:type10], Const: =, Const: 2], EditPropSet: [RelProp: [Const: strprop], Const: =, Const: 1], CmdOper: [Const: spin, Const: ()], LiftProp: [Const: test:type10], FiltOper: [Const: +, DollarExpr: [RelPropValue: [Const: strprop]]], SetVarOper: [Const: foo, Const: 1], FiltOper: [Const: +, VarValue: [Const: foo]]]',
'Query: [LiftFormTag: [Const: inet:fqdn, TagName: [Const: xxx.xxxxxx.xxxx.xx]], ForLoop: [Const: tag, FuncCall: [VarDeref: [VarValue: [Const: node], Const: tags], CallArgs: [Const: xxx.xxxxxx.*.xx], CallKwargs: []], SubQuery: [Query: [PivotInFrom: [AbsProp: edge:refs], isjoin=False, FiltOper: [Const: +, TagCond: [TagMatch: [Const: xx]]], PivotInFrom: [AbsProp: graph:cluster], isjoin=False, EditTagAdd: [TagName: [Const: foo]], FormPivot: [AbsProp: edge:refs], isjoin=False]]]]',
'Query: [FiltOper: [Const: +, AbsPropCond: [AbsProp: syn:tag, Const: ~=, Const: aka.*.mal.*]]]',
'Query: [FiltOper: [Const: +, OrCond: [OrCond: [AbsPropCond: [AbsProp: syn:tag, Const: ^=, Const: aka], AbsPropCond: [AbsProp: syn:tag, Const: ^=, Const: cno]], AbsPropCond: [AbsProp: syn:tag, Const: ^=, Const: rep]]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: foo], EditNodeAdd: [FormName: [Const: test:int], Const: =, Const: 42]]',
'Query: [CmdOper: [Const: help, Const: ()]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: abcd], EditPropSet: [RelProp: [Const: tick], Const: =, Const: 2015], EditTagAdd: [TagName: [Const: cool]]]',
'Query: [SubQuery: [Query: [LiftTag: [TagName: [Const: baz]]]], LiftPropBy: [Const: test:str, Const: =, Const: foo]]',
'Query: [LiftTagTag: [TagName: [Const: baz.faz]]]',
'Query: [LiftTag: [VarValue: [Const: tag]], EditTagDel: [VarValue: [Const: tag]]]',
'Query: [LiftTag: [VarValue: [Const: tag]]]',
'Query: [LiftTag: [TagName: [Const: foo]]]',
'Query: [LiftTag: [TagName: [Const: foo]]]',
'Query: [LiftTag: [TagName: [Const: foo]]]',
'Query: [LiftTag: [TagName: [Const: hehe.haha]]]',
'Query: [VarEvalOper: [VarDeref: [VarValue: [Const: hehe], Const: haha]]]',
'Query: [LiftTag: [TagName: [Const: test.bar]], FiltOper: [Const: +, HasAbsPropCond: [AbsProp: test:pivcomp]], PivotOut: [], isjoin=True]',
'Query: [LiftTag: [TagName: [Const: test.bar]], FiltOper: [Const: +, HasAbsPropCond: [AbsProp: test:pivcomp]], PivotOut: [], isjoin=False]',
'Query: [LiftTag: [TagName: [Const: test.bar]], FiltOper: [Const: +, HasAbsPropCond: [AbsProp: test:str]], PivotIn: [], isjoin=True]',
'Query: [LiftTag: [TagName: [Const: test.bar]], FiltOper: [Const: +, HasAbsPropCond: [AbsProp: test:str]], PivotIn: [], isjoin=False]',
'Query: [LiftProp: [Const: test:migr], PivotInFrom: [AbsProp: edge:refs], isjoin=False]',
'Query: [LiftTag: [TagName: [Const: test.bar]], FiltOper: [Const: -, TagCond: [TagMatch: [Const: test]]], PivotOut: [], isjoin=True]',
'Query: [LiftTag: [TagName: [Const: test.bar]], FiltOper: [Const: -, TagCond: [TagMatch: [Const: test]]], PivotOut: [], isjoin=False]',
'Query: [LiftTag: [TagName: [Const: test.bar]], FiltOper: [Const: -, TagCond: [TagMatch: [Const: test]]], PivotIn: [], isjoin=True]',
'Query: [LiftTag: [TagName: [Const: test.bar]], FiltOper: [Const: -, TagCond: [TagMatch: [Const: test]]], PivotIn: [], isjoin=False]',
'Query: [SetVarOper: [Const: bar, Const: 5.5.5.5], EditNodeAdd: [FormName: [Const: inet:ipv4], Const: =, VarValue: [Const: bar]]]',
'Query: [SetVarOper: [Const: blah, FuncCall: [VarDeref: [VarValue: [Const: lib], Const: dict], CallArgs: [], CallKwargs: [CallKwarg: [Const: foo, Const: vertex.link]]]], EditNodeAdd: [FormName: [Const: inet:fqdn], Const: =, VarDeref: [VarValue: [Const: blah], Const: foo]]]',
"Query: [VarListSetOper: [VarList: ['tick', 'tock'], UnivPropValue: [UnivProp: [Const: .seen]]]]",
'Query: [LiftProp: [Const: .created]]',
'Query: [LiftPropBy: [Const: .created, Const: <, Const: 2010]]',
'Query: [LiftPropBy: [Const: .created, Const: >, Const: 2010]]',
'Query: [LiftPropBy: [Const: .created, Const: range=, List: [Const: 2010, Const: ?]]]',
'Query: [LiftPropBy: [Const: .created, Const: range=, List: [Const: 2010, Const: 3001]]]',
'Query: [LiftPropBy: [Const: .created, Const: =, Const: 2001]]',
'Query: [LiftPropBy: [Const: .created, Const: =, Const: {created}]]',
'Query: [LiftProp: [Const: .seen], EditUnivDel: [UnivProp: [Const: .seen]]]',
'Query: [LiftPropBy: [Const: .seen, Const: ~=, Const: ^r]]',
'Query: [EditNodeAdd: [FormName: [Const: graph:node], Const: =, Const: *], EditPropSet: [RelProp: [Const: type], Const: =, Const: m1]]',
'Query: [EditNodeAdd: [FormName: [Const: geo:place], Const: =, Const: *], EditPropSet: [RelProp: [Const: latlong], Const: =, List: [Const: -30.0, Const: 20.22]]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:asn], Const: =, Const: 200], EditPropSet: [RelProp: [Const: name], Const: =, Const: visi]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:dns:a], Const: =, List: [Const: woot.com, Const: 12.34.56.78]]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:dns:a], Const: =, FuncCall: [VarDeref: [VarValue: [Const: blob], Const: split], CallArgs: [Const: |], CallKwargs: []]]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:dns:a], Const: =, List: [Const: vertex.link, Const: 5.5.5.5]], EditTagAdd: [TagName: [Const: nope]]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:dns:a], Const: =, List: [Const: woot.com, Const: 1.2.3.4]]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:dns:a], Const: =, List: [Const: woot.com, Const: 1.2.3.4]], EditTagAdd: [TagName: [Const: yepr]]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:dns:a], Const: =, List: [Const: woot.com, Const: 1.2.3.4]], EditNodeAdd: [FormName: [Const: inet:dns:a], Const: =, List: [Const: vertex.link, Const: 1.2.3.4]]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:dns:a], Const: =, List: [Const: woot.com, Const: 1.2.3.4]], EditPropSet: [UnivProp: [Const: .seen], Const: =, List: [Const: 2015, Const: 2016]]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:fqdn], Const: =, Const: hehe.com], EditNodeAdd: [FormName: [Const: inet:ipv4], Const: =, Const: 127.0.0.1], EditNodeAdd: [FormName: [Const: hash:md5], Const: =, Const: d41d8cd98f00b204e9800998ecf8427e]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:fqdn], Const: =, Const: woot.com]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:fqdn], Const: =, Const: vertex.link], EditNodeAdd: [FormName: [Const: inet:ipv4], Const: =, Const: 1.2.3.4]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:fqdn], Const: =, Const: woot.com], EditTagAdd: [TagName: [Const: bad], List: [Const: 2015, Const: 2016]]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:fqdn], Const: =, Const: woot.com], PivotOut: [], isjoin=False]',
'Query: [EditNodeAdd: [FormName: [Const: inet:fqdn], Const: =, Const: woot.com], EditNodeAdd: [FormName: [Const: inet:fqdn], Const: =, Const: vertex.link], EditNodeAdd: [FormName: [Const: inet:user], Const: =, RelPropValue: [Const: zone]], FiltOper: [Const: +, HasAbsPropCond: [AbsProp: inet:user]]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:ipv4], Const: =, Const: 94.75.194.194], EditPropSet: [RelProp: [Const: loc], Const: =, Const: nl]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:ipv4], Const: =, VarValue: [Const: foo]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:int], Const: =, VarDeref: [VarValue: [Const: hehe], Const: haha]]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:ipv4], Const: =, Const: 1.2.3.0/30], EditNodeAdd: [FormName: [Const: inet:ipv4], Const: =, Const: 5.5.5.5]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:ipv4], Const: =, Const: 1.2.3.4], EditPropSet: [RelProp: [Const: asn], Const: =, Const: 2]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:ipv4], Const: =, Const: 1.2.3.4], EditPropSet: [RelProp: [Const: loc], Const: =, Const: us], EditNodeAdd: [FormName: [Const: inet:dns:a], Const: =, List: [Const: vertex.link, Const: 1.2.3.4]]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:ipv4], Const: =, Const: 1.2.3.4]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:ipv4], Const: =, Const: 192.168.1.0/24]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:ipv4], Const: =, Const: 4.3.2.1], EditPropSet: [RelProp: [Const: loc], Const: =, Const: zz], EditNodeAdd: [FormName: [Const: inet:dns:a], Const: =, List: [Const: example.com, Const: 4.3.2.1]]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:ipv4], Const: =, Const: 197.231.221.211], EditPropSet: [RelProp: [Const: asn], Const: =, Const: 37560], EditPropSet: [RelProp: [Const: loc], Const: =, Const: lr.lo.voinjama], EditPropSet: [RelProp: [Const: latlong], Const: =, Const: 8.4219,-9.7478], EditPropSet: [RelProp: [Const: dns:rev], Const: =, Const: exit1.ipredator.se], EditTagAdd: [TagName: [Const: cno.anon.tor.exit], List: [Const: 2017/12/19, Const: 2019/02/15]]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:user], Const: =, Const: visi], EditNodeAdd: [FormName: [Const: inet:user], Const: =, Const: whippit]]',
'Query: [EditNodeAdd: [FormName: [Const: test:comp], Const: =, List: [Const: 10, Const: haha]], EditTagAdd: [TagName: [Const: foo.bar]], EditTagDel: [TagName: [Const: foo.bar]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:comp], Const: =, List: [Const: 127, Const: newp]], EditNodeAdd: [FormName: [Const: test:comp], Const: =, List: [Const: 127, Const: 127]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:comp], Const: =, List: [Const: 123, Const: test]], EditNodeAdd: [FormName: [Const: test:comp], Const: =, List: [Const: 123, Const: duck]], EditNodeAdd: [FormName: [Const: test:comp], Const: =, List: [Const: 123, Const: mode]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:guid], Const: =, Const: *], EditPropSet: [RelProp: [Const: tick], Const: =, Const: 2015]]',
'Query: [EditNodeAdd: [FormName: [Const: test:guid], Const: =, Const: *], EditPropSet: [RelProp: [Const: tick], Const: =, Const: 2016]]',
'Query: [EditNodeAdd: [FormName: [Const: test:guid], Const: =, Const: *], EditPropSet: [RelProp: [Const: tick], Const: =, Const: 2017]]',
'Query: [EditNodeAdd: [FormName: [Const: test:pivcomp], Const: =, List: [Const: foo, Const: bar]], EditPropSet: [RelProp: [Const: tick], Const: =, Const: 2018]]',
'Query: [EditNodeAdd: [FormName: [Const: test:pivcomp], Const: =, List: [Const: foo, Const: bar]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:pivcomp], Const: =, List: [Const: hehe, Const: haha]], EditPropSet: [RelProp: [Const: tick], Const: =, Const: 2015], EditTagAdd: [TagName: [Const: foo], List: [Const: 2014, Const: 2016]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:pivcomp], Const: =, List: [Const: xxx, Const: yyy]], EditPropSet: [RelProp: [Const: width], Const: =, Const: 42]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: foo bar], EditPropSet: [RelProp: [Const: tick], Const: =, Const: 2018]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: bar], EditTagAdd: [TagName: [Const: baz]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: foo], EditTagAdd: [VarValue: [Const: tag]]]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: foo], FiltOper: [Const: +, TagCond: [VarValue: [Const: tag]]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: foo], EditTagAdd: [TagName: [Const: bar]], FiltOper: [Const: +, OrCond: [TagCond: [TagMatch: [Const: baz]], NotCond: [HasRelPropCond: [UnivProp: [Const: .seen]]]]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: foo], EditTagAdd: [TagName: [Const: bar]], FiltOper: [Const: +, NotCond: [HasRelPropCond: [UnivProp: [Const: .seen]]]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: foo], EditTagAdd: [TagName: [Const: bar]], SubQuery: [Query: [EditTagAdd: [TagName: [Const: baz]], FiltOper: [Const: -, TagCond: [TagMatch: [Const: bar]]]]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: foo], EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: bar], CmdOper: [Const: sleep, List: [Const: 10]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: foo], EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: bar], CmdOper: [Const: spin, Const: ()]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: foo], EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: bar]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: foo], EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: bar], EditNodeAdd: [FormName: [Const: test:int], Const: =, Const: 42]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: haha], EditTagAdd: [TagName: [Const: bar], Const: 2015]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: haha], EditTagAdd: [TagName: [Const: foo]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: hehe], EditTagAdd: [TagName: [Const: foo], List: [Const: 2014, Const: 2016]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: hehe]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: oof], EditTagAdd: [TagName: [Const: bar]], SubQuery: [Query: [EditNodeAdd: [FormName: [Const: test:int], Const: =, Const: 0xdeadbeef]]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: visi], EditTagAdd: [TagName: [Const: foo.bar]], PivotToTags: [TagMatch: []], isjoin=False, EditTagAdd: [TagName: [Const: baz.faz]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: visi], EditTagAdd: [TagName: [Const: foo.bar]], PivotToTags: [TagMatch: []], isjoin=False]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: visi], EditNodeAdd: [FormName: [Const: test:int], Const: =, Const: 20], EditTagAdd: [TagName: [Const: foo.bar]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: woot], EditTagAdd: [TagName: [Const: foo], List: [Const: 2015, Const: 2018]], EditTagAdd: [TagName: [Const: bar]], EditPropSet: [UnivProp: [Const: .seen], Const: =, List: [Const: 2014, Const: 2016]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: woot], EditTagAdd: [TagName: [Const: foo], List: [Const: 2015, Const: 2018]], EditPropSet: [UnivProp: [Const: .seen], Const: =, List: [Const: 2014, Const: 2016]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: woot], EditTagAdd: [TagName: [Const: foo], List: [Const: 2015, Const: 2018]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: woot], EditPropSet: [UnivProp: [Const: .seen], Const: =, List: [Const: 2014, Const: 2015]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: woot], EditPropSet: [UnivProp: [Const: .seen], Const: =, Const: 20]]',
'Query: [EditTagDel: [TagName: [Const: foo]]]',
'Query: [EditNodeAdd: [FormName: [Const: edge:has], Const: =, List: [List: [Const: test:str, Const: foobar], List: [Const: test:str, Const: foo]]]]',
'Query: [EditNodeAdd: [FormName: [Const: edge:refs], Const: =, List: [List: [Const: test:comp, List: [Const: 2048, Const: horton]], List: [Const: test:comp, List: [Const: 4096, Const: whoville]]]]]',
'Query: [EditNodeAdd: [FormName: [Const: edge:refs], Const: =, List: [List: [Const: test:comp, List: [Const: 9001, Const: A mean one]], List: [Const: test:comp, List: [Const: 40000, Const: greeneggs]]]]]',
'Query: [EditNodeAdd: [FormName: [Const: edge:refs], Const: =, List: [List: [Const: test:int, Const: 16], List: [Const: test:comp, List: [Const: 9999, Const: greenham]]]]]',
'Query: [EditNodeAdd: [FormName: [Const: edge:refs], Const: =, List: [List: [Const: test:str, Const: 123], List: [Const: test:int, Const: 123]]]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:dns:query], Const: =, List: [Const: tcp://1.2.3.4, Const: , Const: 1]]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:dns:query], Const: =, List: [Const: tcp://1.2.3.4, Const: foo*.haha.com, Const: 1]]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:ipv4], Const: =, Const: 1.2.3.1-1.2.3.3]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:ipv4], Const: =, Const: 1.2.3.4], EditPropSet: [RelProp: [Const: asn], Const: =, Const: 10], EditNodeAdd: [FormName: [Const: meta:seen], Const: =, List: [Const: abcd, List: [Const: inet:asn, Const: 10]]]]',
'Query: [EditNodeAdd: [FormName: [Const: meta:seen], Const: =, List: [Const: abcd, List: [Const: test:str, Const: pennywise]]]]',
'Query: [EditNodeAdd: [FormName: [Const: meta:source], Const: =, Const: abcd], EditTagAdd: [TagName: [Const: omit.nopiv]], EditNodeAdd: [FormName: [Const: meta:seen], Const: =, List: [Const: abcd, List: [Const: test:pivtarg, Const: foo]]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:comp], Const: =, List: [Const: 1234, Const: 5678]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:comp], Const: =, List: [Const: 3, Const: foob]], EditTagAdd: [TagName: [Const: meep.gorp]], EditTagAdd: [TagName: [Const: bleep.zlorp]], EditTagAdd: [TagName: [Const: cond]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:guid], Const: =, Const: *], EditPropSet: [RelProp: [Const: tick], Const: =, Const: 2001]]',
'Query: [EditNodeAdd: [FormName: [Const: test:guid], Const: =, Const: abcd], EditPropSet: [RelProp: [Const: tick], Const: =, Const: 2015]]',
'Query: [EditNodeAdd: [FormName: [Const: test:int], Const: =, Const: 1], EditNodeAdd: [FormName: [Const: test:int], Const: =, Const: 2], EditNodeAdd: [FormName: [Const: test:int], Const: =, Const: 3]]',
'Query: [EditNodeAdd: [FormName: [Const: test:int], Const: =, Const: 10], EditPropSet: [RelProp: [Const: loc], Const: =, Const: us.va]]',
'Query: [EditNodeAdd: [FormName: [Const: test:int], Const: =, Const: 2], EditPropSet: [RelProp: [Const: loc], Const: =, Const: us.va.sydney]]',
'Query: [EditNodeAdd: [FormName: [Const: test:int], Const: =, Const: 20]]',
'Query: [EditNodeAdd: [FormName: [Const: test:int], Const: =, Const: 3], EditPropSet: [RelProp: [Const: loc], Const: =, Const: ]]',
'Query: [EditNodeAdd: [FormName: [Const: test:int], Const: =, Const: 4], EditPropSet: [RelProp: [Const: loc], Const: =, Const: us.va.fairfax]]',
'Query: [EditNodeAdd: [FormName: [Const: test:int], Const: =, Const: 9], EditPropSet: [RelProp: [Const: loc], Const: =, Const: us.ओं]]',
'Query: [EditNodeAdd: [FormName: [Const: test:int], Const: =, Const: 99999]]',
'Query: [EditNodeAdd: [FormName: [Const: test:pivcomp], Const: =, List: [Const: foo, Const: 123]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: beep], EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: boop]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: foo], EditPropSet: [RelProp: [Const: tick], Const: =, Const: 201808021201]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: hehe], CmdOper: [Const: iden, List: [Const: abcd]], CmdOper: [Const: count, Const: ()]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: hello]]',
'Query: [LiftProp: [Const: edge:refs], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: n1]], Const: range=, List: [List: [Const: test:comp, List: [Const: 1000, Const: green]], List: [Const: test:comp, List: [Const: 3000, Const: ham]]]]]]',
'Query: [LiftProp: [Const: edge:refs]]',
'Query: [LiftProp: [Const: edge:wentto]]',
'Query: [LiftPropBy: [Const: file:bytes:size, Const: =, Const: 4]]',
'Query: [ForLoop: [Const: fqdn, VarValue: [Const: fqdns], SubQuery: [Query: [EditNodeAdd: [FormName: [Const: inet:fqdn], Const: =, VarValue: [Const: fqdn]]]]]]',
"Query: [ForLoop: [VarList: ['fqdn', 'ipv4'], VarValue: [Const: dnsa], SubQuery: [Query: [EditNodeAdd: [FormName: [Const: inet:dns:a], Const: =, List: [VarValue: [Const: fqdn], VarValue: [Const: ipv4]]]]]]]",
"Query: [ForLoop: [VarList: ['fqdn', 'ipv4', 'boom'], VarValue: [Const: dnsa], SubQuery: [Query: [EditNodeAdd: [FormName: [Const: inet:dns:a], Const: =, List: [VarValue: [Const: fqdn], VarValue: [Const: ipv4]]]]]]]",
'Query: [LiftProp: [Const: geo:place], FiltOper: [Const: +, AbsPropCond: [AbsProp: geo:place:latlong, Const: near=, List: [List: [Const: 34.1, Const: -118.3], Const: 10km]]]]',
'Query: [LiftProp: [Const: geo:place], FiltOper: [Const: -, RelPropCond: [RelPropValue: [RelProp: [Const: latlong]], Const: near=, List: [List: [Const: 34.1, Const: -118.3], Const: 50m]]]]',
'Query: [LiftPropBy: [Const: geo:place:latlong, Const: near=, List: [List: [Const: 34.118560, Const: -118.300370], Const: 2600m]]]',
'Query: [LiftPropBy: [Const: geo:place:latlong, Const: near=, List: [List: [Const: 34.118560, Const: -118.300370], Const: 50m]]]',
'Query: [LiftPropBy: [Const: geo:place:latlong, Const: near=, List: [List: [Const: 0, Const: 0], Const: 50m]]]',
'Query: [LiftPropBy: [Const: geo:place:latlong, Const: near=, List: [List: [Const: 34.1, Const: -118.3], Const: 10km]]]',
'Query: [LiftPropBy: [Const: geo:place, Const: =, VarValue: [Const: place]], PivotInFrom: [AbsProp: edge:has], isjoin=False, PivotIn: [], isjoin=False]',
'Query: [LiftPropBy: [Const: geo:place, Const: =, VarValue: [Const: place]], PivotInFrom: [AbsProp: edge:has], isjoin=False, PivotInFrom: [AbsProp: ps:person], isjoin=False]',
'Query: [LiftPropBy: [Const: geo:place, Const: =, Const: abcd], SetVarOper: [Const: latlong, RelPropValue: [Const: latlong]], SetVarOper: [Const: radius, RelPropValue: [Const: radius]], CmdOper: [Const: spin, Const: ()], LiftPropBy: [Const: tel:mob:telem:latlong, Const: near=, List: [VarValue: [Const: latlong], Const: 3km]]]',
'Query: [LiftPropBy: [Const: graph:cluster, Const: =, Const: abcd], CmdOper: [Const: noderefs, List: [Const: -d, Const: 2, Const: --join]]]',
'Query: [CmdOper: [Const: help, Const: ()]]',
'Query: [CmdOper: [Const: iden, List: [Const: 2cdd997872b10a65407ad5fadfa28e0d]]]',
'Query: [CmdOper: [Const: iden, List: [Const: deadb33f]]]',
'Query: [SetVarOper: [Const: foo, Const: 42], CmdOper: [Const: iden, List: [Const: deadb33f]]]',
'Query: [LiftPropBy: [Const: inet:asn, Const: =, Const: 10], CmdOper: [Const: noderefs, List: [Const: -of, Const: inet:ipv4, Const: --join, Const: -d, Const: 3]]]',
'Query: [LiftProp: [Const: inet:dns:a], FiltOper: [Const: +, SubqCond: [Query: [PropPivot: [RelPropValue: [RelProp: [Const: ipv4]], AbsProp: inet:ipv4], isjoin=False, FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: loc]], Const: =, Const: us]]]]]]',
'Query: [LiftProp: [Const: inet:dns:a], FiltOper: [Const: +, SubqCond: [Query: [PropPivot: [RelPropValue: [RelProp: [Const: ipv4]], AbsProp: inet:ipv4], isjoin=False, FiltOper: [Const: -, RelPropCond: [RelPropValue: [RelProp: [Const: loc]], Const: =, Const: us]]]]]]',
'Query: [LiftProp: [Const: inet:dns:a], FiltOper: [Const: -, SubqCond: [Query: [PropPivot: [RelPropValue: [RelProp: [Const: ipv4]], AbsProp: inet:ipv4], isjoin=False, FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: loc]], Const: =, Const: us]]]]]]',
'Query: [LiftProp: [Const: inet:dns:a], FiltOper: [Const: -, SubqCond: [Query: [PropPivot: [RelPropValue: [RelProp: [Const: ipv4]], AbsProp: inet:ipv4], isjoin=False, FiltOper: [Const: -, RelPropCond: [RelPropValue: [RelProp: [Const: loc]], Const: =, Const: us]]]]]]',
'Query: [LiftProp: [Const: inet:dns:a], PropPivotOut: [RelProp: [Const: ipv4]], isjoin=False]',
'Query: [LiftPropBy: [Const: inet:dns:a, Const: =, List: [Const: woot.com, Const: 12.34.56.78]], EditPropSet: [UnivProp: [Const: .seen], Const: =, List: [Const: 201708010123, Const: 201708100456]]]',
'Query: [LiftPropBy: [Const: inet:dns:a, Const: =, List: [Const: woot.com, Const: 12.34.56.78]], EditPropSet: [UnivProp: [Const: .seen], Const: =, List: [Const: 201708010123, Const: ?]]]',
'Query: [LiftProp: [Const: inet:dns:a]]',
'Query: [LiftPropBy: [Const: inet:dns:a, Const: =, List: [Const: woot.com, Const: 1.2.3.4]], SetVarOper: [Const: hehe, RelPropValue: [Const: fqdn]], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: fqdn]], Const: =, VarValue: [Const: hehe]]]]',
'Query: [LiftPropBy: [Const: inet:dns:a, Const: =, List: [Const: woot.com, Const: 1.2.3.4]], SetVarOper: [Const: hehe, RelPropValue: [Const: fqdn]], FiltOper: [Const: -, RelPropCond: [RelPropValue: [RelProp: [Const: fqdn]], Const: =, VarValue: [Const: hehe]]]]',
'Query: [LiftPropBy: [Const: inet:dns:a, Const: =, List: [Const: woot.com, Const: 1.2.3.4]], SetVarOper: [Const: hehe, RelPropValue: [Const: fqdn]], LiftPropBy: [Const: inet:fqdn, Const: =, VarValue: [Const: hehe]]]',
'Query: [LiftPropBy: [Const: inet:dns:a, Const: =, List: [Const: woot.com, Const: 1.2.3.4]], SetVarOper: [Const: newp, UnivPropValue: [UnivProp: [Const: .seen]]]]',
'Query: [LiftPropBy: [Const: inet:dns:a, Const: =, List: [Const: woot.com, Const: 1.2.3.4]], SetVarOper: [Const: seen, UnivPropValue: [UnivProp: [Const: .seen]]], PropPivot: [RelPropValue: [RelProp: [Const: fqdn]], AbsProp: inet:fqdn], isjoin=False, EditPropSet: [UnivProp: [Const: .seen], Const: =, VarValue: [Const: seen]]]',
'Query: [LiftPropBy: [Const: inet:dns:a, Const: =, List: [Const: woot.com, Const: 1.2.3.4]], EditPropSet: [UnivProp: [Const: .seen], Const: =, List: [Const: 2015, Const: 2018]]]',
'Query: [LiftPropBy: [Const: inet:dns:query, Const: =, List: [Const: tcp://1.2.3.4, Const: , Const: 1]], PropPivot: [RelPropValue: [RelProp: [Const: name]], AbsProp: inet:fqdn], isjoin=False]',
'Query: [LiftPropBy: [Const: inet:dns:query, Const: =, List: [Const: tcp://1.2.3.4, Const: foo*.haha.com, Const: 1]], PropPivot: [RelPropValue: [RelProp: [Const: name]], AbsProp: inet:fqdn], isjoin=False]',
'Query: [LiftProp: [Const: inet:fqdn], FiltOper: [Const: +, TagCond: [TagMatch: [Const: bad]]], SetVarOper: [Const: fqdnbad, TagValue: [TagName: [Const: bad]]], FormPivot: [AbsProp: inet:dns:a:fqdn], isjoin=False, FiltOper: [Const: +, RelPropCond: [RelPropValue: [UnivProp: [Const: .seen]], Const: @=, VarValue: [Const: fqdnbad]]]]',
'Query: [LiftPropBy: [Const: inet:fqdn, Const: =, Const: woot.com], FormPivot: [AbsProp: inet:dns:a], isjoin=False, FormPivot: [AbsProp: inet:ipv4], isjoin=False]',
'Query: [LiftPropBy: [Const: inet:fqdn, Const: =, Const: woot.com], FormPivot: [AbsProp: inet:dns:a], isjoin=False]',
'Query: [LiftPropBy: [Const: inet:fqdn, Const: =, Const: woot.com], CmdOper: [Const: delnode, Const: ()]]',
'Query: [LiftProp: [Const: inet:fqdn], CmdOper: [Const: graph, List: [Const: --filter, ArgvQuery: [Query: [FiltOper: [Const: -, TagCond: [TagMatch: [Const: nope]]]]]]]]',
'Query: [LiftPropBy: [Const: inet:fqdn, Const: =, Const: woot.com]]',
'Query: [LiftProp: [Const: inet:ipv4], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: asn::name]], Const: =, Const: visi]]]',
'Query: [LiftProp: [Const: inet:ipv4], FiltOper: [Const: +, AbsPropCond: [AbsProp: inet:ipv4, Const: =, Const: 1.2.3.0/30]]]',
'Query: [LiftProp: [Const: inet:ipv4], FiltOper: [Const: +, AbsPropCond: [AbsProp: inet:ipv4, Const: =, Const: 1.2.3.1-1.2.3.3]]]',
'Query: [LiftProp: [Const: inet:ipv4], FiltOper: [Const: +, AbsPropCond: [AbsProp: inet:ipv4, Const: =, Const: 10.2.1.4/32]]]',
'Query: [LiftProp: [Const: inet:ipv4], FormPivot: [AbsProp: test:str], isjoin=False]',
'Query: [LiftProp: [Const: inet:ipv4], CmdOper: [Const: reindex, List: [Const: --subs]]]',
'Query: [LiftPropBy: [Const: inet:ipv4:loc, Const: =, Const: us]]',
'Query: [LiftPropBy: [Const: inet:ipv4:loc, Const: =, Const: zz]]',
'Query: [LiftPropBy: [Const: inet:ipv4, Const: =, Const: 1.2.3.1-1.2.3.3]]',
'Query: [LiftPropBy: [Const: inet:ipv4, Const: =, Const: 192.168.1.0/24]]',
'Query: [LiftPropBy: [Const: inet:ipv4, Const: =, Const: 1.2.3.4], FiltOper: [Const: +, HasRelPropCond: [RelProp: [Const: asn]]]]',
'Query: [LiftPropBy: [Const: inet:ipv4, Const: =, Const: 1.2.3.4], FiltOper: [Const: +, SubqCond: [Query: [FormPivot: [AbsProp: inet:dns:a], isjoin=False], Const: <, Const: 2]]]',
'Query: [LiftPropBy: [Const: inet:ipv4, Const: =, Const: 1.2.3.4], FiltOper: [Const: +, SubqCond: [Query: [FormPivot: [AbsProp: inet:dns:a], isjoin=False], Const: <=, Const: 1]]]',
'Query: [LiftPropBy: [Const: inet:ipv4, Const: =, Const: 1.2.3.4], FiltOper: [Const: +, SubqCond: [Query: [FormPivot: [AbsProp: inet:dns:a], isjoin=False], Const: !=, Const: 2]]]',
'Query: [LiftPropBy: [Const: inet:ipv4, Const: =, Const: 1.2.3.4], CmdOper: [Const: limit, List: [Const: 20]]]',
'Query: [LiftPropBy: [Const: inet:ipv4, Const: =, Const: 12.34.56.78], EditPropSet: [RelProp: [Const: loc], Const: =, Const: us.oh.wilmington]]',
'Query: [LiftPropBy: [Const: inet:ipv4, Const: =, Const: 12.34.56.78], LiftPropBy: [Const: inet:fqdn, Const: =, Const: woot.com], EditNodeAdd: [FormName: [Const: inet:ipv4], Const: =, Const: 1.2.3.4], EditPropSet: [RelProp: [Const: asn], Const: =, Const: 10101], EditNodeAdd: [FormName: [Const: inet:fqdn], Const: =, Const: woowoo.com], EditTagAdd: [TagName: [Const: my.tag]]]',
'Query: [LiftProp: [Const: inet:user], CmdOper: [Const: limit, List: [Const: --woot]]]',
'Query: [LiftProp: [Const: inet:user], CmdOper: [Const: limit, List: [Const: 1]]]',
'Query: [LiftProp: [Const: inet:user], CmdOper: [Const: limit, List: [Const: 10]], FiltOper: [Const: +, AbsPropCond: [AbsProp: inet:user, Const: =, Const: visi]]]',
'Query: [LiftProp: [Const: inet:user], CmdOper: [Const: limit, List: [Const: 10]], EditTagAdd: [TagName: [Const: foo.bar]]]',
'Query: [LiftPropBy: [Const: media:news, Const: =, Const: 00a1f0d928e25729b9e86e2d08c127ce], EditPropSet: [RelProp: [Const: summary], Const: =, Const: ]]',
'Query: [LiftPropBy: [Const: meta:seen:meta:source, Const: =, VarValue: [Const: sorc]], PivotOut: [], isjoin=False]',
'Query: [LiftPropBy: [Const: meta:seen:meta:source, Const: =, VarValue: [Const: sorc]], PropPivotOut: [RelProp: [Const: node]], isjoin=False]',
'Query: [LiftPropBy: [Const: meta:source, Const: =, Const: 8f1401de15918358d5247e21ca29a814]]',
'Query: [CmdOper: [Const: movetag, List: [Const: a.b, Const: a.m]]]',
'Query: [CmdOper: [Const: movetag, List: [Const: hehe, Const: woot]]]',
'Query: [LiftPropBy: [Const: ps:person, Const: =, VarValue: [Const: pers]], FormPivot: [AbsProp: edge:has], isjoin=False, PivotOut: [], isjoin=False]',
'Query: [LiftPropBy: [Const: ps:person, Const: =, VarValue: [Const: pers]], FormPivot: [AbsProp: edge:has], isjoin=False, FormPivot: [AbsProp: geo:place], isjoin=False]',
'Query: [LiftPropBy: [Const: ps:person, Const: =, VarValue: [Const: pers]], FormPivot: [AbsProp: edge:wentto], isjoin=False, FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: time]], Const: @=, List: [Const: 2014, Const: 2017]]], FormPivot: [AbsProp: geo:place], isjoin=False]',
'Query: [LiftPropBy: [Const: ps:person, Const: =, VarValue: [Const: pers]], FormPivot: [AbsProp: edge:wentto], isjoin=False, PivotOut: [], isjoin=False]',
'Query: [LiftPropBy: [Const: ps:person, Const: =, VarValue: [Const: pers]], FormPivot: [AbsProp: edge:wentto], isjoin=False, PropPivotOut: [RelProp: [Const: n2]], isjoin=False]',
'Query: [CmdOper: [Const: reindex, List: [Const: --form-counts]]]',
'Query: [CmdOper: [Const: sudo, Const: ()], EditNodeAdd: [FormName: [Const: inet:ipv4], Const: =, Const: 1.2.3.4]]',
'Query: [CmdOper: [Const: sudo, Const: ()], EditNodeAdd: [FormName: [Const: test:cycle0], Const: =, Const: foo], EditPropSet: [RelProp: [Const: test:cycle1], Const: =, Const: bar]]',
'Query: [CmdOper: [Const: sudo, Const: ()], EditNodeAdd: [FormName: [Const: test:guid], Const: =, Const: *]]',
'Query: [CmdOper: [Const: sudo, Const: ()], EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: foo], EditTagAdd: [TagName: [Const: lol]]]',
'Query: [CmdOper: [Const: sudo, Const: ()], EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: foo]]',
'Query: [CmdOper: [Const: sudo, Const: ()], EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: 123], EditPropSet: [RelProp: [Const: tick], Const: =, Const: 2018]]',
'Query: [CmdOper: [Const: sudo, Const: ()], LiftPropBy: [Const: test:int, Const: =, Const: 6], CmdOper: [Const: delnode, Const: ()]]',
'Query: [LiftPropBy: [Const: syn:tag, Const: =, Const: a.b], FiltOper: [Const: +, TagCond: [TagMatch: [Const: foo]]]]',
'Query: [LiftPropBy: [Const: syn:tag, Const: =, Const: aaa.barbarella.ddd]]',
'Query: [LiftPropBy: [Const: syn:tag, Const: =, Const: baz.faz], EditTagAdd: [TagName: [Const: foo.bar]]]',
'Query: [LiftPropBy: [Const: syn:tag, Const: =, Const: foo.bar], PivotOut: [], isjoin=False]',
'Query: [LiftPropBy: [Const: syn:tag, Const: =, Const: foo.bar], FormPivot: [AbsProp: test:str], isjoin=False]',
'Query: [LiftPropBy: [Const: syn:tag, Const: =, Const: foo.bar], FormPivot: [AbsProp: test:str:tick], isjoin=False]',
'Query: [LiftProp: [Const: test:comp], FiltOper: [Const: +, AndCond: [RelPropCond: [RelPropValue: [RelProp: [Const: hehe]], Const: <, Const: 2], RelPropCond: [RelPropValue: [RelProp: [Const: haha]], Const: =, Const: test]]]]',
'Query: [LiftProp: [Const: test:comp], FiltOper: [Const: +, OrCond: [RelPropCond: [RelPropValue: [RelProp: [Const: hehe]], Const: <, Const: 2], TagCond: [TagMatch: [Const: meep.gorp]]]]]',
'Query: [LiftProp: [Const: test:comp], FiltOper: [Const: +, OrCond: [RelPropCond: [RelPropValue: [RelProp: [Const: hehe]], Const: <, Const: 2], RelPropCond: [RelPropValue: [RelProp: [Const: haha]], Const: =, Const: test]]]]',
'Query: [LiftProp: [Const: test:comp], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: haha]], Const: range=, List: [Const: grinch, Const: meanone]]]]',
'Query: [LiftProp: [Const: test:comp], FiltOper: [Const: +, AbsPropCond: [AbsProp: test:comp, Const: range=, List: [List: [Const: 1024, Const: grinch], List: [Const: 4096, Const: zemeanone]]]]]',
'Query: [LiftProp: [Const: test:comp], PivotOut: [], isjoin=False, CmdOper: [Const: uniq, Const: ()], CmdOper: [Const: count, Const: ()]]',
'Query: [LiftProp: [Const: test:comp], PivotOut: [], isjoin=False]',
'Query: [LiftProp: [Const: test:comp], FormPivot: [AbsProp: test:int], isjoin=False]',
'Query: [LiftPropBy: [Const: test:comp:haha, Const: ~=, Const: ^lulz]]',
'Query: [LiftPropBy: [Const: test:comp:haha, Const: ~=, Const: ^zerg]]',
'Query: [LiftFormTag: [Const: test:comp, TagName: [Const: bar]], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: hehe]], Const: =, Const: 1010]], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: haha]], Const: =, Const: test10]], FiltOper: [Const: +, TagCond: [TagMatch: [Const: bar]]]]',
'Query: [LiftProp: [Const: test:guid], FiltOper: [Const: +, AbsPropCond: [AbsProp: test:guid, Const: range=, List: [Const: abcd, Const: dcbe]]]]',
'Query: [LiftProp: [Const: test:guid], CmdOper: [Const: max, List: [Const: tick]]]',
'Query: [LiftProp: [Const: test:guid], CmdOper: [Const: min, List: [Const: tick]]]',
'Query: [LiftProp: [Const: test:int], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: loc]], Const: =, Const: ]]]',
'Query: [LiftProp: [Const: test:int], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: loc]], Const: =, Const: us.va. syria]]]',
'Query: [LiftProp: [Const: test:int], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: loc]], Const: =, Const: u]]]',
'Query: [LiftProp: [Const: test:int], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: loc]], Const: =, Const: us]]]',
'Query: [LiftProp: [Const: test:int], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: loc]], Const: =, Const: us.v]]]',
'Query: [LiftProp: [Const: test:int], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: loc]], Const: =, Const: us.va.sydney]]]',
'Query: [LiftProp: [Const: test:int], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: loc]], Const: ^=, Const: ]]]',
'Query: [LiftProp: [Const: test:int], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: loc]], Const: ^=, Const: 23]]]',
'Query: [LiftProp: [Const: test:int], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: loc]], Const: ^=, Const: u]]]',
'Query: [LiftProp: [Const: test:int], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: loc]], Const: ^=, Const: us]]]',
'Query: [LiftProp: [Const: test:int], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: loc]], Const: ^=, Const: us.]]]',
'Query: [LiftProp: [Const: test:int], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: loc]], Const: ^=, Const: us.va.]]]',
'Query: [LiftProp: [Const: test:int], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: loc]], Const: ^=, Const: us.va.fairfax.reston]]]',
'Query: [LiftProp: [Const: test:int], FiltOper: [Const: +, AbsPropCond: [AbsProp: test:int, Const: <, Const: 30]]]',
'Query: [LiftProp: [Const: test:int], FiltOper: [Const: +, AbsPropCond: [AbsProp: test:int, Const: <=, Const: 30]]]',
'Query: [LiftPropBy: [Const: test:int, Const: <=, Const: 20]]',
'Query: [LiftProp: [Const: test:int], CmdOper: [Const: noderefs, Const: ()], FiltOper: [Const: +, AbsPropCond: [AbsProp: test:comp, Const: range=, List: [List: [Const: 1000, Const: grinch], List: [Const: 4000, Const: whoville]]]]]',
'Query: [LiftPropBy: [Const: test:int:loc, Const: =, Const: ]]',
'Query: [LiftPropBy: [Const: test:int:loc, Const: =, Const: u]]',
'Query: [LiftPropBy: [Const: test:int:loc, Const: =, Const: us]]',
'Query: [LiftPropBy: [Const: test:int:loc, Const: ^=, Const: ]]',
'Query: [LiftPropBy: [Const: test:int:loc, Const: ^=, Const: 23]]',
'Query: [LiftPropBy: [Const: test:int:loc, Const: ^=, Const: u]]',
'Query: [LiftPropBy: [Const: test:int:loc, Const: ^=, Const: us]]',
'Query: [LiftPropBy: [Const: test:int:loc, Const: ^=, Const: us.]]',
'Query: [LiftPropBy: [Const: test:int:loc, Const: ^=, Const: us.va.fairfax.reston]]',
'Query: [LiftPropBy: [Const: test:int, Const: <, Const: 30]]',
'Query: [LiftPropBy: [Const: test:int, Const: <=, Const: 30]]',
'Query: [LiftPropBy: [Const: test:int, Const: =, Const: 123], CmdOper: [Const: noderefs, List: [Const: -te]]]',
'Query: [LiftPropBy: [Const: test:int, Const: =, Const: 123], CmdOper: [Const: noderefs, Const: ()]]',
'Query: [LiftPropBy: [Const: test:int, Const: =, Const: 1234], EditNodeAdd: [FormName: [Const: test:str], Const: =, FuncCall: [VarDeref: [VarValue: [Const: node], Const: form], CallArgs: [], CallKwargs: []]], FiltOper: [Const: -, HasAbsPropCond: [AbsProp: test:int]]]',
'Query: [LiftPropBy: [Const: test:int, Const: =, Const: 1234], EditNodeAdd: [FormName: [Const: test:str], Const: =, FuncCall: [VarDeref: [VarValue: [Const: node], Const: value], CallArgs: [], CallKwargs: []]], FiltOper: [Const: -, HasAbsPropCond: [AbsProp: test:int]]]',
'Query: [LiftPropBy: [Const: test:int, Const: =, Const: 3735928559]]',
'Query: [LiftPropBy: [Const: test:int, Const: =, Const: 8675309]]',
'Query: [LiftPropBy: [Const: test:int, Const: >, Const: 30]]',
'Query: [LiftPropBy: [Const: test:int, Const: >=, Const: 20]]',
'Query: [LiftProp: [Const: test:pivcomp], FormPivot: [AbsProp: test:int], isjoin=False]',
'Query: [LiftProp: [Const: test:pivcomp], CmdOper: [Const: noderefs, List: [Const: --join, Const: --degrees, Const: 2]]]',
'Query: [LiftProp: [Const: test:pivcomp], CmdOper: [Const: noderefs, List: [Const: --join, Const: -d, Const: 3]]]',
'Query: [LiftProp: [Const: test:pivcomp], CmdOper: [Const: noderefs, List: [Const: --join]]]',
'Query: [LiftProp: [Const: test:pivcomp], CmdOper: [Const: noderefs, List: [Const: -j, Const: --degrees, Const: 2]]]',
'Query: [LiftProp: [Const: test:pivcomp], CmdOper: [Const: noderefs, Const: ()]]',
'Query: [LiftPropBy: [Const: test:pivcomp:tick, Const: =, VarValue: [Const: foo]]]',
'Query: [LiftPropBy: [Const: test:pivcomp, Const: =, VarValue: [Const: foo]]]',
'Query: [LiftPropBy: [Const: test:pivcomp, Const: =, List: [Const: foo, Const: bar]], FiltOper: [Const: +, SubqCond: [Query: [PropPivot: [RelPropValue: [RelProp: [Const: lulz]], AbsProp: test:str], isjoin=False, FiltOper: [Const: +, TagCond: [TagMatch: [Const: baz]]]]]], FiltOper: [Const: +, HasAbsPropCond: [AbsProp: test:pivcomp]]]',
'Query: [LiftPropBy: [Const: test:pivcomp, Const: =, List: [Const: foo, Const: bar]], PivotOut: [], isjoin=True]',
'Query: [LiftPropBy: [Const: test:pivcomp, Const: =, List: [Const: foo, Const: bar]], FormPivot: [AbsProp: test:pivtarg], isjoin=True]',
'Query: [LiftPropBy: [Const: test:pivcomp, Const: =, List: [Const: foo, Const: bar]], PivotOut: [], isjoin=False]',
'Query: [LiftPropBy: [Const: test:pivcomp, Const: =, List: [Const: foo, Const: bar]], FormPivot: [AbsProp: test:pivtarg], isjoin=False]',
'Query: [LiftPropBy: [Const: test:pivcomp, Const: =, List: [Const: foo, Const: bar]], FiltOper: [Const: -, SubqCond: [Query: [PropPivot: [RelPropValue: [RelProp: [Const: lulz]], AbsProp: test:str], isjoin=False, FiltOper: [Const: +, TagCond: [TagMatch: [Const: baz]]]]]]]',
'Query: [LiftPropBy: [Const: test:pivcomp, Const: =, List: [Const: foo, Const: bar]], PropPivot: [RelPropValue: [RelProp: [Const: lulz]], AbsProp: test:str], isjoin=True]',
'Query: [LiftPropBy: [Const: test:pivcomp, Const: =, List: [Const: foo, Const: bar]], PropPivot: [RelPropValue: [RelProp: [Const: lulz]], AbsProp: test:str], isjoin=False]',
'Query: [LiftPropBy: [Const: test:pivcomp, Const: =, List: [Const: foo, Const: bar]], PropPivot: [RelPropValue: [RelProp: [Const: targ]], AbsProp: test:pivtarg], isjoin=False]',
'Query: [LiftPropBy: [Const: test:pivcomp, Const: =, List: [Const: hehe, Const: haha]], SetVarOper: [Const: ticktock, TagValue: [TagName: [Const: foo]]], FormPivot: [AbsProp: test:pivtarg], isjoin=False, FiltOper: [Const: +, RelPropCond: [RelPropValue: [UnivProp: [Const: .seen]], Const: @=, VarValue: [Const: ticktock]]]]',
'Query: [LiftPropBy: [Const: test:pivcomp, Const: =, List: [Const: hehe, Const: haha]]]',
'Query: [LiftPropBy: [Const: test:pivtarg, Const: =, Const: hehe], EditPropSet: [UnivProp: [Const: .seen], Const: =, Const: 2015]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, TagCond: [TagMatch: [Const: *]]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, TagCond: [TagMatch: [Const: **.bar.baz]]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, TagCond: [TagMatch: [Const: **.baz]]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, TagCond: [TagMatch: [Const: *.bad]]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, TagCond: [TagMatch: [Const: foo.**.baz]]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, TagCond: [TagMatch: [Const: foo.*.baz]]]]',
'Query: [LiftTag: [TagName: [Const: foo], Const: @=, List: [Const: 2013, Const: 2015]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, TagValuCond: [TagMatch: [Const: foo], Const: @=, List: [Const: 2014, Const: 20141231]]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, TagValuCond: [TagMatch: [Const: foo], Const: @=, List: [Const: 2015, Const: 2018]]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, TagValuCond: [TagMatch: [Const: foo], Const: @=, Const: 2016]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: bar]], Const: range=, List: [List: [Const: test:str, Const: c], List: [Const: test:str, Const: q]]]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: tick]], Const: range=, List: [Const: 19701125, Const: 20151212]]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: tick]], Const: =, List: [VarValue: [Const: test], Const: +- 2day]]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: tick]], Const: =, List: [Const: 2015, Const: +1 day]]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: tick]], Const: =, List: [Const: 20150102, Const: -3 day]]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: tick]], Const: =, List: [Const: 20150201, Const: +1 day]]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: tick]], Const: =, Const: 2015]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: tick]], Const: @=, Const: -1 day]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: tick]], Const: @=, List: [Const: now+2days, Const: -3 day]]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: tick]], Const: @=, List: [Const: now-1day, Const: ?]]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: tick]], Const: @=, Const: 2015]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: tick]], Const: @=, List: [Const: 2015, Const: +1 day]]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: tick]], Const: @=, List: [Const: 20150102+1day, Const: -4 day]]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: tick]], Const: @=, List: [Const: 20150102, Const: -4 day]]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: tick]], Const: @=, List: [Const: now, Const: -1 day]]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, AbsPropCond: [AbsProp: test:str:tick, Const: <, Const: 201808021202]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, AbsPropCond: [AbsProp: test:str:tick, Const: <=, Const: 201808021202]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, AbsPropCond: [AbsProp: test:str:tick, Const: >, Const: 201808021202]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: +, AbsPropCond: [AbsProp: test:str:tick, Const: >=, Const: 201808021202]]]',
'Query: [LiftProp: [Const: test:str], FiltOper: [Const: -, TagCond: [TagMatch: [Const: *]]]]',
'Query: [LiftProp: [Const: test:str], EditTagAdd: [TagName: [Const: foo.bar], List: [Const: 2000, Const: 2002]]]',
'Query: [LiftProp: [Const: test:str], EditTagAdd: [TagName: [Const: foo.bar], List: [Const: 2000, Const: 20020601]]]',
'Query: [LiftProp: [Const: test:str], EditTagAdd: [TagName: [Const: foo.bar]]]',
'Query: [LiftProp: [Const: test:str], EditTagDel: [TagName: [Const: foo]]]',
'Query: [LiftProp: [Const: test:str], EditPropDel: [RelProp: [Const: tick]]]',
'Query: [LiftProp: [Const: test:str], CmdOper: [Const: delnode, List: [Const: --force]]]',
'Query: [LiftProp: [Const: test:str], CmdOper: [Const: noderefs, List: [Const: -d, Const: 3, Const: --unique]]]',
'Query: [LiftProp: [Const: test:str], CmdOper: [Const: noderefs, List: [Const: -d, Const: 3]]]',
'Query: [LiftFormTag: [Const: test:str, TagName: [Const: foo]]]',
'Query: [LiftFormTag: [Const: test:str, TagName: [Const: foo.bar]]]',
'Query: [LiftFormTag: [Const: test:str, TagName: [Const: foo], Const: @=, List: [Const: 2012, Const: 2022]]]',
'Query: [LiftFormTag: [Const: test:str, TagName: [Const: foo], Const: @=, Const: 2016]]',
'Query: [LiftProp: [Const: test:str]]',
'Query: [LiftPropBy: [Const: test:str:tick, Const: <, Const: 201808021202]]',
'Query: [LiftPropBy: [Const: test:str:tick, Const: <=, Const: 201808021202]]',
'Query: [LiftPropBy: [Const: test:str:tick, Const: =, List: [Const: 20131231, Const: +2 days]]]',
'Query: [LiftPropBy: [Const: test:str:tick, Const: =, Const: 2015]]',
'Query: [LiftPropBy: [Const: test:str:tick, Const: >, Const: 201808021202]]',
'Query: [LiftPropBy: [Const: test:str:tick, Const: >=, Const: 201808021202]]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: foo]]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: foo bar], FiltOper: [Const: +, HasAbsPropCond: [AbsProp: test:str]]]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: foo bar], FiltOper: [Const: -, HasAbsPropCond: [AbsProp: test:str:tick]]]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: foo bar], EditPropDel: [RelProp: [Const: tick]]]',
'Query: [LiftPropBy: [Const: test:str, Const: =, VarValue: [Const: foo]]]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: 123], EditPropSet: [RelProp: [Const: baz], Const: =, Const: test:guid:tick=2015]]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: 123], CmdOper: [Const: noderefs, List: [Const: --traverse-edge]]]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: 123], CmdOper: [Const: noderefs, Const: ()]]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: 1234], LiftPropBy: [Const: test:str, Const: =, Const: duck], LiftPropBy: [Const: test:str, Const: =, Const: knight]]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: a], FiltOper: [Const: +, RelPropCond: [RelPropValue: [RelProp: [Const: tick]], Const: range=, List: [Const: 20000101, Const: 20101201]]]]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: bar], FormPivot: [AbsProp: test:pivcomp:lulz], isjoin=True]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: bar], FormPivot: [AbsProp: test:pivcomp:lulz], isjoin=False]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: bar], PivotIn: [], isjoin=True]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: bar], PivotIn: [], isjoin=False]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: bar], LiftPropBy: [Const: test:pivcomp, Const: =, List: [Const: foo, Const: bar]], EditTagAdd: [TagName: [Const: test.bar]]]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: foo], FiltOper: [Const: +, TagValuCond: [TagMatch: [Const: lol], Const: @=, Const: 2016]]]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: foo], PivotInFrom: [AbsProp: edge:has], isjoin=True]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: foo], PivotInFrom: [AbsProp: edge:has], isjoin=False]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: foo], CmdOper: [Const: delnode, Const: ()]]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: foobar], FormPivot: [AbsProp: edge:has], isjoin=True]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: foobar], FormPivot: [AbsProp: edge:has], isjoin=False, PivotInFrom: [AbsProp: test:str], isjoin=True]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: foobar], FormPivot: [AbsProp: edge:has], isjoin=False, PivotInFrom: [AbsProp: test:str], isjoin=False]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: hello], EditPropSet: [RelProp: [Const: tick], Const: =, Const: 2001]]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: hello], EditPropSet: [RelProp: [Const: tick], Const: =, Const: 2002]]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: pennywise], CmdOper: [Const: noderefs, List: [Const: --join, Const: -d, Const: 9, Const: --traverse-edge]]]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: pennywise], CmdOper: [Const: noderefs, List: [Const: -d, Const: 3, Const: --omit-traversal-tag, Const: omit.nopiv, Const: --omit-traversal-tag, Const: test]]]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: visi], PivotToTags: [TagMatch: [Const: *]], isjoin=False]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: visi], PivotToTags: [TagMatch: [Const: foo.*]], isjoin=False]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: woot], SetVarOper: [Const: foo, TagValue: [TagName: [Const: foo]]], FiltOper: [Const: +, RelPropCond: [RelPropValue: [UnivProp: [Const: .seen]], Const: @=, VarValue: [Const: foo]]]]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: woot], FiltOper: [Const: +, RelPropCond: [RelPropValue: [UnivProp: [Const: .seen]], Const: @=, TagValue: [TagName: [Const: bar]]]]]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: woot], FiltOper: [Const: +, RelPropCond: [RelPropValue: [UnivProp: [Const: .seen]], Const: @=, List: [Const: 2012, Const: 2015]]]]',
'Query: [LiftPropBy: [Const: test:str, Const: =, Const: woot], FiltOper: [Const: +, RelPropCond: [RelPropValue: [UnivProp: [Const: .seen]], Const: @=, Const: 2012]]]',
'Query: [LiftPropBy: [Const: test:str, Const: ~=, Const: zip]]',
"Query: [ForLoop: [Const: foo, VarValue: [Const: foos], SubQuery: [Query: [VarListSetOper: [VarList: ['fqdn', 'ipv4'], FuncCall: [VarDeref: [VarValue: [Const: foo], Const: split], CallArgs: [Const: |], CallKwargs: []]], EditNodeAdd: [FormName: [Const: inet:dns:a], Const: =, List: [VarValue: [Const: fqdn], VarValue: [Const: ipv4]]]]]]]",
'Query: [LiftProp: [Const: test:int]]',
'Query: [LiftProp: [Const: test:int]]',
'Query: [LiftProp: [Const: test:int]]',
'Query: [LiftProp: [Const: inet:fqdn], CmdOper: [Const: graph, List: [Const: --degrees, Const: 2, Const: --filter, ArgvQuery: [Query: [FiltOper: [Const: -, TagCond: [TagMatch: [Const: nope]]]]], Const: --pivot, ArgvQuery: [Query: [PivotInFrom: [AbsProp: meta:seen], isjoin=False, PivotInFrom: [AbsProp: meta:source], isjoin=False]], Const: --form-pivot, Const: inet:fqdn, ArgvQuery: [Query: [PivotIn: [], isjoin=False, CmdOper: [Const: limit, List: [Const: 20]]]], Const: --form-pivot, Const: inet:fqdn, ArgvQuery: [Query: [PivotOut: [], isjoin=False, CmdOper: [Const: limit, List: [Const: 20]]]], Const: --form-filter, Const: inet:fqdn, ArgvQuery: [Query: [FiltOper: [Const: -, AbsPropCond: [AbsProp: inet:fqdn:issuffix, Const: =, Const: 1]]]], Const: --form-pivot, Const: syn:tag, ArgvQuery: [Query: [PivotOut: [], isjoin=False]], Const: --form-pivot, Const: *, ArgvQuery: [Query: [PivotToTags: [TagMatch: []], isjoin=False]]]]]',
"Query: [ForLoop: [Const: foo, VarValue: [Const: foos], SubQuery: [Query: [VarListSetOper: [VarList: ['fqdn', 'ipv4'], FuncCall: [VarDeref: [VarValue: [Const: foo], Const: split], CallArgs: [Const: |], CallKwargs: []]], EditNodeAdd: [FormName: [Const: inet:dns:a], Const: =, List: [VarValue: [Const: fqdn], VarValue: [Const: ipv4]]]]]]]",
'Query: [ForLoop: [Const: tag, FuncCall: [VarDeref: [VarValue: [Const: node], Const: tags], CallArgs: [], CallKwargs: []], SubQuery: [Query: [FormPivot: [AbsProp: test:int], isjoin=False, EditTagAdd: [VarValue: [Const: tag]]]]]]',
'Query: [ForLoop: [Const: tag, FuncCall: [VarDeref: [VarValue: [Const: node], Const: tags], CallArgs: [Const: fo*], CallKwargs: []], SubQuery: [Query: [FormPivot: [AbsProp: test:int], isjoin=False, EditTagDel: [VarValue: [Const: tag]]]]]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:email:message], Const: =, Const: *], EditPropSet: [RelProp: [Const: to], Const: =, Const: woot@woot.com], EditPropSet: [RelProp: [Const: from], Const: =, Const: visi@vertex.link], EditPropSet: [RelProp: [Const: replyto], Const: =, Const: root@root.com], EditPropSet: [RelProp: [Const: subject], Const: =, Const: hi there], EditPropSet: [RelProp: [Const: date], Const: =, Const: 2015], EditPropSet: [RelProp: [Const: body], Const: =, Const: there are mad sploitz here!], EditPropSet: [RelProp: [Const: bytes], Const: =, Const: *], SubQuery: [Query: [EditNodeAdd: [FormName: [Const: inet:email:message:link], Const: =, List: [VarValue: [Const: node], Const: https://www.vertex.link]]]], SubQuery: [Query: [EditNodeAdd: [FormName: [Const: inet:email:message:attachment], Const: =, List: [VarValue: [Const: node], Const: *]], FiltOper: [Const: -, HasAbsPropCond: [AbsProp: inet:email:message]], EditPropSet: [RelProp: [Const: name], Const: =, Const: sploit.exe]]], SubQuery: [Query: [EditNodeAdd: [FormName: [Const: edge:has], Const: =, List: [VarValue: [Const: node], List: [Const: inet:email:header, List: [Const: to, Const: Visi Kensho <visi@vertex.link>]]]]]]]',
'Query: [SetVarOper: [Const: x, DollarExpr: [ExprNode: [Const: 1, Const: /, Const: 3]]]]',
'Query: [SetVarOper: [Const: x, DollarExpr: [ExprNode: [Const: 1, Const: *, Const: 3]]]]',
'Query: [SetVarOper: [Const: x, DollarExpr: [ExprNode: [ExprNode: [Const: 1, Const: *, Const: 3], Const: +, Const: 2]]]]',
'Query: [SetVarOper: [Const: x, DollarExpr: [ExprNode: [Const: 1, Const: -, ExprNode: [Const: 3.2, Const: /, Const: -3.2]]]]]',
'Query: [SetVarOper: [Const: x, DollarExpr: [ExprNode: [Const: 1, Const: +, ExprNode: [Const: 3, Const: /, Const: 2]]]]]',
'Query: [SetVarOper: [Const: x, DollarExpr: [ExprNode: [ExprNode: [Const: 1, Const: +, Const: 3], Const: /, Const: 2]]]]',
'Query: [SetVarOper: [Const: foo, Const: 42], SetVarOper: [Const: foo2, Const: 43], SetVarOper: [Const: x, DollarExpr: [ExprNode: [VarValue: [Const: foo], Const: *, VarValue: [Const: foo2]]]]]',
'Query: [SetVarOper: [Const: yep, DollarExpr: [ExprNode: [Const: 42, Const: <, Const: 43]]]]',
'Query: [SetVarOper: [Const: yep, DollarExpr: [ExprNode: [Const: 42, Const: >, Const: 43]]]]',
'Query: [SetVarOper: [Const: yep, DollarExpr: [ExprNode: [Const: 42, Const: >=, Const: 43]]]]',
'Query: [SetVarOper: [Const: yep, DollarExpr: [ExprNode: [ExprNode: [Const: 42, Const: +, Const: 4], Const: <=, ExprNode: [Const: 43, Const: *, Const: 43]]]]]',
'Query: [SetVarOper: [Const: foo, Const: 4.3], SetVarOper: [Const: bar, Const: 4.2], SetVarOper: [Const: baz, DollarExpr: [ExprNode: [VarValue: [Const: foo], Const: +, VarValue: [Const: bar]]]]]',
'Query: [LiftPropBy: [Const: inet:ipv4, Const: =, Const: 1], SetVarOper: [Const: foo, UnivPropValue: [UnivProp: [Const: .created]]], SetVarOper: [Const: bar, DollarExpr: [ExprNode: [VarValue: [Const: foo], Const: +, Const: 1]]]]',
'Query: [SetVarOper: [Const: x, DollarExpr: [FuncCall: [VarDeref: [VarDeref: [VarValue: [Const: lib], Const: time], Const: offset], CallArgs: [Const: 2 days], CallKwargs: []]]]]',
'Query: [SetVarOper: [Const: foo, Const: 1], SetVarOper: [Const: bar, Const: 2], LiftPropBy: [Const: inet:ipv4, Const: =, DollarExpr: [ExprNode: [VarValue: [Const: foo], Const: +, VarValue: [Const: bar]]]]]',
'Query: []',
'Query: [CmdOper: [Const: hehe.haha, List: [Const: --size, Const: 10, Const: --query, Const: foo_bar.stuff:baz]]]',
'Query: [IfStmt: [IfClause: [VarValue: [Const: foo], SubQuery: [Query: [EditTagAdd: [TagName: [Const: woot]]]]]]]',
'Query: [IfStmt: [IfClause: [VarValue: [Const: foo], SubQuery: [Query: [EditTagAdd: [TagName: [Const: woot]]]]], SubQuery: [Query: [EditTagAdd: [TagName: [Const: nowoot]]]]]]',
'Query: [IfStmt: [IfClause: [VarValue: [Const: foo], SubQuery: [Query: [EditTagAdd: [TagName: [Const: woot]]]]], IfClause: [DollarExpr: [ExprNode: [Const: 1, Const: -, Const: 1]], SubQuery: [Query: [EditTagAdd: [TagName: [Const: nowoot]]]]]]]',
'Query: [IfStmt: [IfClause: [VarValue: [Const: foo], SubQuery: [Query: [EditTagAdd: [TagName: [Const: woot]]]]], IfClause: [DollarExpr: [ExprNode: [Const: 1, Const: -, Const: 1]], SubQuery: [Query: [EditTagAdd: [TagName: [Const: nowoot]]]]], SubQuery: [Query: [EditTagAdd: [TagName: [Const: nonowoot]]]]]]',
'Query: [IfStmt: [IfClause: [DollarExpr: [ExprNode: [VarValue: [Const: data], Const: ~=, Const: hehe]], SubQuery: [Query: [VarEvalOper: [FuncCall: [VarDeref: [VarValue: [Const: lib], Const: print], CallArgs: [Const: yes], CallKwargs: []]]]]], SubQuery: [Query: [VarEvalOper: [FuncCall: [VarDeref: [VarValue: [Const: lib], Const: print], CallArgs: [Const: no], CallKwargs: []]]]]]]',
'Query: [SetVarOper: [Const: foo, DollarExpr: [ExprOrNode: [Const: 1, Const: or, ExprAndNode: [Const: 0, Const: and, Const: 0]]]]]',
'Query: [SetVarOper: [Const: foo, DollarExpr: [ExprAndNode: [UnaryExprNode: [Const: not, Const: 1], Const: and, Const: 1]]]]',
'Query: [SetVarOper: [Const: foo, DollarExpr: [UnaryExprNode: [Const: not, ExprNode: [Const: 1, Const: >, Const: 1]]]]]',
'Query: [LiftTagProp: [TagProp: [Const: baz.faz, Const: lol]]]',
'Query: [LiftFormTagProp: [FormTagProp: [Const: foo:bar, Const: baz.faz, Const: lol]]]',
'Query: [LiftTagProp: [TagProp: [Const: baz.faz, Const: lol], Const: =, Const: 20]]',
'Query: [LiftFormTagProp: [FormTagProp: [Const: foo:bar, Const: baz.faz, Const: lol], Const: =, Const: 20]]',
'Query: [FiltOper: [Const: +, HasTagPropCond: [TagProp: [Const: foo.bar, Const: lol]]]]',
'Query: [FiltOper: [Const: +, TagPropCond: [TagProp: [Const: foo.bar, Const: lol], Const: =, Const: 20]]]',
'Query: [EditTagPropDel: [TagProp: [Const: baz.faz, Const: lol]]]',
'Query: [EditTagPropSet: [TagProp: [Const: baz.faz, Const: lol], Const: =, Const: 20]]',
'Query: [LiftTagProp: [TagProp: [Const: tag, Const: somegeoloctypebecauseihatelife], Const: near=, List: [VarValue: [Const: lat], VarValue: [Const: long]]]]',
'Query: [LiftPropBy: [VarValue: [Const: foo], Const: near=, Const: 20]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, VarDeref: [VarDeref: [VarDeref: [VarDeref: [VarDeref: [VarValue: [Const: foo], Const: woot], Const: var], VarValue: [Const: bar]], Const: mar], VarValue: [Const: car]]]]',
'Query: [LiftPropBy: [Const: test:str, Const: =, VarDeref: [VarDeref: [VarValue: [Const: foo], VarValue: [Const: space key]], Const: subkey]]]',
'Query: [ForLoop: [Const: iterkey, VarDeref: [VarDeref: [VarValue: [Const: foo], VarValue: [Const: bar key]], VarValue: [Const: biz key]], SubQuery: [Query: [LiftPropBy: [Const: inet:ipv4, Const: =, VarDeref: [VarDeref: [VarDeref: [VarValue: [Const: foo], VarValue: [Const: bar key]], VarValue: [Const: biz key]], VarValue: [Const: iterkey]]]]]]]',
'Query: [EditParens: [EditNodeAdd: [FormName: [Const: ou:org], Const: =, Const: c71cd602f73af5bed208da21012fdf54], EditPropSet: [RelProp: [Const: loc], Const: =, Const: us]]]',
'Query: [Function: [Const: x, FuncArgs: [Const: y, Const: z], Query: [Return: [DollarExpr: [ExprNode: [VarValue: [Const: x], Const: -, VarValue: [Const: y]]]]]]]',
'Query: [Function: [Const: echo, FuncArgs: [Const: arg], Query: [Return: [VarValue: [Const: arg]]]]]',
'Query: [Function: [Const: a, FuncArgs: [Const: arg], Query: []]]',
'Query: [Function: [Const: a, FuncArgs: [Const: arg], Query: [Return: []]]]',
'Query: [SetVarOper: [Const: name, Const: asdf], SetVarOper: [Const: foo, FuncCall: [VarDeref: [VarValue: [Const: lib], Const: dict], CallArgs: [], CallKwargs: []]], SetItemOper: [VarValue: [Const: foo], Const: bar, Const: asdf], SetItemOper: [VarValue: [Const: foo], Const: bar baz, Const: asdf], SetItemOper: [VarValue: [Const: foo], VarValue: [Const: name], Const: asdf]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: a], SwitchCase: [FuncCall: [VarDeref: [VarValue: [Const: node], Const: form], CallArgs: [], CallKwargs: []], CaseEntry: [Const: hehe, SubQuery: [Query: [EditTagAdd: [TagName: [Const: baz]]]]]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: a], SwitchCase: [VarValue: [Const: woot], CaseEntry: [Const: hehe, SubQuery: [Query: [EditTagAdd: [TagName: [Const: baz]]]]]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: c], SwitchCase: [VarValue: [Const: woot], CaseEntry: [Const: hehe, SubQuery: [Query: [EditTagAdd: [TagName: [Const: baz]]]]], CaseEntry: [SubQuery: [Query: [EditTagAdd: [TagName: [Const: jaz]]]]]]]',
'Query: [EditNodeAdd: [FormName: [Const: test:str], Const: =, Const: c], SwitchCase: [VarValue: [Const: woot], CaseEntry: [Const: hehe, SubQuery: [Query: [EditTagAdd: [TagName: [Const: baz]]]]], CaseEntry: [Const: haha hoho, SubQuery: [Query: [EditTagAdd: [TagName: [Const: faz]]]]], CaseEntry: [Const: lolz:lulz, SubQuery: [Query: [EditTagAdd: [TagName: [Const: jaz]]]]]]]',
'Query: [EditNodeAdd: [FormName: [Const: inet:ipv4], Const: =, Const: 1.2.3.4], SwitchCase: [VarValue: [Const: foo], CaseEntry: [Const: bar, SubQuery: [Query: [EditTagAdd: [TagName: [Const: hehe.haha]]]]], CaseEntry: [Const: baz faz, SubQuery: [Query: []]]]]',
'Query: [ForLoop: [Const: foo, VarValue: [Const: foos], SubQuery: [Query: [EditNodeAdd: [FormName: [Const: inet:ipv4], Const: =, Const: 1.2.3.4], SwitchCase: [VarValue: [Const: foo], CaseEntry: [Const: bar, SubQuery: [Query: [EditTagAdd: [TagName: [Const: ohai]], BreakOper: []]]], CaseEntry: [Const: baz, SubQuery: [Query: [EditTagAdd: [TagName: [Const: visi]], ContinueOper: []]]]], EditNodeAdd: [FormName: [Const: inet:ipv4], Const: =, Const: 5.6.7.8], EditTagAdd: [TagName: [Const: hehe]]]]]]',
'Query: [SwitchCase: [VarValue: [Const: a], CaseEntry: [Const: a, SubQuery: [Query: []]]]]',
'Query: [SwitchCase: [VarValue: [Const: a], CaseEntry: [Const: test:str, SubQuery: [Query: []]], CaseEntry: [SubQuery: [Query: []]]]]',
'Query: [SwitchCase: [VarValue: [Const: a], CaseEntry: [Const: test:this:works:, SubQuery: [Query: []]], CaseEntry: [SubQuery: [Query: []]]]]',
'Query: [SwitchCase: [VarValue: [Const: a], CaseEntry: [Const: single:quotes, SubQuery: [Query: []]], CaseEntry: [Const: doubele:quotes, SubQuery: [Query: []]], CaseEntry: [Const: noquotes, SubQuery: [Query: []]], CaseEntry: [SubQuery: [Query: []]]]]',
'Query: [SwitchCase: [VarValue: [Const: category]], SwitchCase: [VarValue: [Const: type], CaseEntry: [SubQuery: [Query: []]]]]',
]
class GrammarTest(s_t_utils.SynTest):

    def test_grammar(self):
        '''
        Validates that we have no grammar ambiguities
        '''
        with s_datfile.openDatFile('synapse.lib/storm.lark') as larkf:
            grammar = larkf.read().decode()

        parser = lark.Lark(grammar, start='query', debug=True, ambiguity='explicit', keep_all_tokens=True,
                           propagate_positions=True)

        for i, query in enumerate(_Queries):
            try:
                tree = parser.parse(query)
                # print(f'#{i}: {query}')
                # print(tree, '\n')
                self.notin('_ambig', str(tree))
            except (lark.ParseError, lark.UnexpectedCharacters):
                print(f'Failure on parsing #{i}:\n{{{query}}}')
                raise
    async def test_parser(self):
        self.maxDiff = None
        for i, query in enumerate(_Queries):
            parser = s_parser.Parser(query)
            tree = parser.query()
            self.eq(str(tree), _ParseResults[i])
    def test_cmdrargs(self):
        q = '''add {inet:fqdn | graph 2 --filter { -#nope } } inet:f-M +1 { [ graph:node='*' :type=m1]}'''
        correct = (
            'add',
            '{inet:fqdn | graph 2 --filter { -#nope } }',
            'inet:f-M',
            '+1',
            "{ [ graph:node='*' :type=m1]}"
        )
        parser = s_parser.Parser(q)
        args = parser.cmdrargs()
        self.eq(args, correct)
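`cmdrargs` splits a raw command line on whitespace while keeping brace-delimited subqueries (which may nest) together as single arguments, as the expected tuple above shows. A standalone sketch of that kind of brace-aware splitting (a hypothetical helper for illustration, not synapse's actual implementation — it does not handle quoted strings):

```python
def split_args(text):
    # Split on whitespace, but keep brace-delimited {...} groups
    # (including nested braces) intact as single arguments.
    args, buf, depth = [], '', 0
    for ch in text:
        if ch == '{':
            depth += 1
        elif ch == '}':
            depth -= 1
        if ch.isspace() and depth == 0:
            if buf:
                args.append(buf)
            buf = ''
        else:
            buf += ch
    if buf:
        args.append(buf)
    return args

print(split_args('add {inet:fqdn | graph 2 --filter { -#nope } } +1'))
# ['add', '{inet:fqdn | graph 2 --filter { -#nope } }', '+1']
```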
    def test_parse_float(self):
        self.raises(s_exc.BadSyntax, s_grammar.parse_float, 'visi', 0)
        self.eq((4.2, 3), s_grammar.parse_float('4.2', 0))
        self.eq((-4.2, 4), s_grammar.parse_float('-4.2', 0))
        self.eq((-4.2, 8), s_grammar.parse_float('    -4.2', 0))
        self.eq((-4.2, 8), s_grammar.parse_float('    -4.2', 2))
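`parse_float` follows the `(text, offset) -> (value, new_offset)` convention the assertions above rely on: leading whitespace is consumed, and the returned offset points just past the parsed number (hence offset 8 for a four-space-prefixed `-4.2`). A minimal hypothetical sketch of that convention (not synapse's implementation, which raises `BadSyntax` rather than `ValueError`):

```python
import re

FLOAT_RE = re.compile(r'\s*([+-]?\d+(?:\.\d+)?)')

def parse_float(text, off):
    # Skip whitespace, match an optional sign and decimal number starting
    # at the given offset, and return (value, offset-past-the-number).
    m = FLOAT_RE.match(text, off)
    if m is None:
        raise ValueError(f'no float at offset {off}: {text!r}')
    return float(m.group(1)), m.end()

print(parse_float('4.2', 0))       # (4.2, 3)
print(parse_float('    -4.2', 2))  # (-4.2, 8)
```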
    def test_nom(self):
        self.eq(('xyz', 10), s_grammar.nom('   xyz    ', 0, 'wxyz', trim=True))
    def test_parse_cmd_string(self):
        self.eq(('newp', 9), s_parser.parse_cmd_string('help newp', 5))
    def test_syntax_error(self):
        query = 'test:str )'
        parser = s_parser.Parser(query)
        self.raises(s_exc.BadSyntax, parser.query)
    async def test_quotes(self):
        # Test vectors
        queries = (
            (r'''[test:str="WORDS \"THINGS STUFF.\""]''', 'WORDS "THINGS STUFF."'),
            (r'''[test:str='WORDS "THINGS STUFF."']''', 'WORDS "THINGS STUFF."'),
            (r'''[test:str="\""]''', '"'),
            (r'''[test:str="hello\\world!"]''', 'hello\\world!'),
            (r'''[test:str="hello\\\"world!"]''', 'hello\\"world!'),
            # Single quoted string
            (r'''[test:str='hello\\\"world!']''', 'hello\\\\\\"world!'),
            (r'''[test:str='hello\t"world!']''', 'hello\\t"world!'),
            # TAB
            (r'''[test:str="hello\tworld!"]''', 'hello\tworld!'),
            (r'''[test:str="hello\\tworld!"]''', 'hello\\tworld!'),
            (r'''[test:str="hello\\\tworld!"]''', 'hello\\\tworld!'),
            # LF / Newline
            (r'''[test:str="hello\nworld!"]''', 'hello\nworld!'),
            (r'''[test:str="hello\\nworld!"]''', 'hello\\nworld!'),
            (r'''[test:str="hello\\\nworld!"]''', 'hello\\\nworld!'),
            # CR / Carriage returns
            (r'''[test:str="hello\rworld!"]''', 'hello\rworld!'),
            (r'''[test:str="hello\\rworld!"]''', 'hello\\rworld!'),
            (r'''[test:str="hello\\\rworld!"]''', 'hello\\\rworld!'),
            # single quote escape
            (r'''[test:str="hello\'world!"]''', '''hello'world!'''),
            (r'''[test:str="hello'world!"]''', '''hello'world!'''),  # escape isn't technically required
            # BEL
            (r'''[test:str="hello\aworld!"]''', '''hello\aworld!'''),
            # BS
            (r'''[test:str="hello\bworld!"]''', '''hello\bworld!'''),
            # FF
            (r'''[test:str="hello\fworld!"]''', '''hello\fworld!'''),
            # VT
            (r'''[test:str="hello\vworld!"]''', '''hello\vworld!'''),
            # \xhh - hex
            (r'''[test:str="hello\xffworld!"]''', '''hello\xffworld!'''),
            # \ooo - octal
            (r'''[test:str="hello\040world!"]''', '''hello world!'''),
            # Items encoded as a python literal object wrapped in quotes
            # are not turned into their corresponding item, they are
            # treated as strings.
            (r'''[test:str="{'key': 'valu'}"]''', '''{'key': 'valu'}'''),
        )

        async with self.getTestCore() as core:
            for (query, valu) in queries:
                nodes = await core.nodes(query)
                self.len(1, nodes)
                self.eq(nodes[0].ndef[1], valu)
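The double-quoted vectors above use C-style backslash escapes. Outside of Storm, the same mappings (`\t`, `\n`, `\040`, `\xff`, ...) can be reproduced with Python's `unicode_escape` codec — a rough standalone illustration of the escape semantics, not how synapse actually decodes them:

```python
import codecs

def unescape(text):
    # Interpret C-style backslash escapes (\t, \n, \040, \xff, ...) in an
    # ASCII source string, mirroring the double-quoted vectors above.
    return codecs.decode(text.encode('ascii'), 'unicode_escape')

assert unescape(r'hello\tworld!') == 'hello\tworld!'    # \t -> TAB
assert unescape(r'hello\040world!') == 'hello world!'   # octal 040 -> space
assert unescape(r'hello\xffworld!') == 'hello\xffworld!'  # \xhh -> U+00FF
```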
    def test_isre_funcs(self):

        self.true(s_grammar.isCmdName('testcmd'))
        self.true(s_grammar.isCmdName('testcmd2'))
        self.true(s_grammar.isCmdName('testcmd.yup'))
        self.false(s_grammar.isCmdName('2testcmd'))
        self.false(s_grammar.isCmdName('testcmd:newp'))
        self.false(s_grammar.isCmdName('.hehe'))

        self.true(s_grammar.isUnivName('.hehe'))
        self.true(s_grammar.isUnivName('.hehe:haha'))
        self.true(s_grammar.isUnivName('.hehe.haha'))
        self.true(s_grammar.isUnivName('.hehe4'))
        self.true(s_grammar.isUnivName('.hehe.4haha'))
        self.true(s_grammar.isUnivName('.hehe:4haha'))
        self.false(s_grammar.isUnivName('.4hehe'))
        self.false(s_grammar.isUnivName('test:str'))
        self.false(s_grammar.isUnivName('test:str.hehe'))
        self.false(s_grammar.isUnivName('test:str.hehe:haha'))
        self.false(s_grammar.isUnivName('test:str.haha.hehe'))
        self.true(s_grammar.isUnivName('.foo:x'))
        self.true(s_grammar.isUnivName('.x:foo'))
        self.true(s_grammar.isUnivName('._haha'))

        self.true(s_grammar.isFormName('test:str'))
        self.true(s_grammar.isFormName('t2:str'))
        self.true(s_grammar.isFormName('test:str:yup'))
        self.true(s_grammar.isFormName('test:str123'))
        self.false(s_grammar.isFormName('test'))
        self.false(s_grammar.isFormName('2t:str'))
        self.false(s_grammar.isFormName('.hehe'))
        self.false(s_grammar.isFormName('testcmd'))
        self.true(s_grammar.isFormName('x:foo'))
        self.true(s_grammar.isFormName('foo:x'))

        self.true(s_grammar.isPropName('test:str'))
        self.true(s_grammar.isPropName('test:str:tick'))
        self.true(s_grammar.isPropName('test:str:_tick'))
        self.true(s_grammar.isPropName('_test:str:_tick'))
        self.true(s_grammar.isPropName('test:str:str123'))
        self.true(s_grammar.isPropName('test:str:123str'))
        self.true(s_grammar.isPropName('test:str:123:456'))
        self.true(s_grammar.isPropName('test:str.hehe'))
        self.true(s_grammar.isPropName('test:str.hehe.haha'))
        self.true(s_grammar.isPropName('test:str.hehe:haha'))
        self.true(s_grammar.isPropName('test:x'))
        self.true(s_grammar.isPropName('x:x'))
        self.false(s_grammar.isPropName('test'))
        self.false(s_grammar.isPropName('2t:str'))
        self.false(s_grammar.isPropName('.hehe'))
        self.false(s_grammar.isPropName('testcmd'))
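`isCmdName`, `isUnivName`, `isFormName`, and `isPropName` are regex gatekeepers over identifier shapes. A simplified, hypothetical approximation of the form-name rule that is consistent with the cases exercised above (the real synapse patterns are more involved):

```python
import re

# Hypothetical approximation: colon-joined lowercase alphanumeric segments,
# at least two of them, where the first segment must not start with a digit.
FORM_RE = re.compile(r'[a-z_][a-z0-9_]*(?::[a-z0-9_]+)+\Z')

def is_form_name(text):
    return FORM_RE.match(text) is not None

print(is_form_name('test:str'))  # True
print(is_form_name('2t:str'))    # False (leading digit)
print(is_form_name('test'))      # False (no colon-separated segment)
```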
def gen_parse_list():
    '''
    Generates the ASTs for a list of queries in order to compare ASTs between versions of parsers
    '''
    retn = []
    for i, query in enumerate(_Queries):
        parser = s_parser.Parser(query)
        tree = parser.query()
        retn.append(str(tree))
    return retn

def print_parse_list():
    for i in gen_parse_list():
        print(f' {repr(i)},')

if __name__ == '__main__':
    print_parse_list()
# File: __init__.py (repo: DrTol/radiator_performance-Python, license: MIT)

# Package maker :)
# File: run_analyzer.py (repo: qjhqqqqq/biliob-spider, license: MIT)

from biliob_analyzer.author_analyzer import AuthorAnalyzer
from biliob_analyzer.video_analyzer import VideoAnalyzer
import biliob_analyzer.author_rate_caculate
import biliob_analyzer.author_fans_watcher
import biliob_analyzer.author_rank
# import biliob_analyzer.video_rank
from biliob_analyzer.add_keyword import AddKeyword
AddKeyword().add_all_author()
AddKeyword().add_all_video()
author_analyzer = AuthorAnalyzer()
video_analyzer = VideoAnalyzer()
author_analyzer.author_filter()
video_analyzer.video_filter()
# File: project/apps/registration/migrations/0022_auto_20210702_1104.py (repo: dbinetti/barberscore, license: BSD-2-Clause)

# Generated by Django 2.2.20 on 2021-07-02 18:04
import apps.registration.models
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):

    dependencies = [
        ('bhs', '0008_auto_20210702_1049'),
        ('registration', '0021_auto_20200310_1234'),
    ]

    operations = [
        migrations.RemoveConstraint(
            model_name='entry',
            name='unique_entry',
        ),
        migrations.RemoveField(
            model_name='session',
            name='convention_id',
        ),
        migrations.AddField(
            model_name='session',
            name='convention',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to='bhs.Convention'),
        ),
        migrations.AlterField(
            model_name='assignment',
            name='district',
            field=models.CharField(blank=True, choices=[('BHS', [(200, 'CAR'), (205, 'CSD'), (210, 'DIX'), (215, 'EVG'), (220, 'FWD'), (225, 'ILL'), (230, 'JAD'), (235, 'LOL'), (240, 'MAD'), (345, 'NED'), (350, 'NSC'), (355, 'ONT'), (360, 'PIO'), (365, 'RMD'), (370, 'SLD'), (375, 'SUN'), (380, 'SWD')]), ('Associated', [(410, 'NxtGn'), (420, 'MBHA'), (430, 'HI'), (440, 'SAI')]), ('Affiliated', [(510, 'BABS'), (515, 'BHA'), (520, 'BHNZ'), (525, 'BinG'), (530, 'FABS'), (540, 'HHar'), (550, 'IABS'), (560, 'LABBS'), (565, 'SABS'), (570, 'SNOBS'), (575, 'SPATS')])], max_length=15, null=True),
        ),
        migrations.AlterField(
            model_name='contest',
            name='district',
            field=models.CharField(blank=True, choices=[(110, 'BHS'), (200, 'CAR'), (205, 'CSD'), (210, 'DIX'), (215, 'EVG'), (220, 'FWD'), (225, 'ILL'), (230, 'JAD'), (235, 'LOL'), (240, 'MAD'), (345, 'NED'), (350, 'NSC'), (355, 'ONT'), (360, 'PIO'), (365, 'RMD'), (370, 'SLD'), (375, 'SUN'), (380, 'SWD')], max_length=15, null=True),
        ),
        migrations.AlterField(
            model_name='contest',
            name='is_novice',
            field=models.BooleanField(default=False),
        ),
        migrations.AlterField(
            model_name='contest',
            name='is_single',
            field=models.BooleanField(default=True, help_text='Single-round award'),
        ),
        migrations.AlterField(
            model_name='entry',
            name='district',
            field=models.CharField(blank=True, choices=[('BHS', [(200, 'CAR'), (205, 'CSD'), (210, 'DIX'), (215, 'EVG'), (220, 'FWD'), (225, 'ILL'), (230, 'JAD'), (235, 'LOL'), (240, 'MAD'), (345, 'NED'), (350, 'NSC'), (355, 'ONT'), (360, 'PIO'), (365, 'RMD'), (370, 'SLD'), (375, 'SUN'), (380, 'SWD')]), ('Associated', [(410, 'NxtGn'), (420, 'MBHA'), (430, 'HI'), (440, 'SAI')]), ('Affiliated', [(510, 'BABS'), (515, 'BHA'), (520, 'BHNZ'), (525, 'BinG'), (530, 'FABS'), (540, 'HHar'), (550, 'IABS'), (560, 'LABBS'), (565, 'SABS'), (570, 'SNOBS'), (575, 'SPATS')])], max_length=15, null=True),
        ),
        migrations.AlterField(
            model_name='session',
            name='district',
            field=models.CharField(blank=True, choices=[(110, 'BHS'), (200, 'CAR'), (205, 'CSD'), (210, 'DIX'), (215, 'EVG'), (220, 'FWD'), (225, 'ILL'), (230, 'JAD'), (235, 'LOL'), (240, 'MAD'), (345, 'NED'), (350, 'NSC'), (355, 'ONT'), (360, 'PIO'), (365, 'RMD'), (370, 'SLD'), (375, 'SUN'), (380, 'SWD')], max_length=15, null=True),
        ),
        migrations.AlterField(
            model_name='session',
            name='num_rounds',
            field=models.IntegerField(default=1),
        ),
        migrations.AlterField(
            model_name='session',
            name='owners',
            field=models.ManyToManyField(default=apps.registration.models.Session.get_default_owners, related_name='sessions', to=settings.AUTH_USER_MODEL),
        ),
        migrations.AlterField(
            model_name='session',
            name='year',
            field=models.IntegerField(blank=True, choices=[(2031, 2031), (2030, 2030), (2029, 2029), (2028, 2028), (2027, 2027), (2026, 2026), (2025, 2025), (2024, 2024), (2023, 2023), (2022, 2022), (2021, 2021), (2020, 2020), (2019, 2019), (2018, 2018), (2017, 2017), (2016, 2016), (2015, 2015), (2014, 2014), (2013, 2013), (2012, 2012), (2011, 2011), (2010, 2010), (2009, 2009), (2008, 2008), (2007, 2007), (2006, 2006), (2005, 2005), (2004, 2004), (2003, 2003), (2002, 2002), (2001, 2001), (2000, 2000), (1999, 1999), (1998, 1998), (1997, 1997), (1996, 1996), (1995, 1995), (1994, 1994), (1993, 1993), (1992, 1992), (1991, 1991), (1990, 1990), (1989, 1989), (1988, 1988), (1987, 1987), (1986, 1986), (1985, 1985), (1984, 1984), (1983, 1983), (1982, 1982), (1981, 1981), (1980, 1980), (1979, 1979), (1978, 1978), (1977, 1977), (1976, 1976), (1975, 1975), (1974, 1974), (1973, 1973), (1972, 1972), (1971, 1971), (1970, 1970), (1969, 1969), (1968, 1968), (1967, 1967), (1966, 1966), (1965, 1965), (1964, 1964), (1963, 1963), (1962, 1962), (1961, 1961), (1960, 1960), (1959, 1959), (1958, 1958), (1957, 1957), (1956, 1956), (1955, 1955), (1954, 1954), (1953, 1953), (1952, 1952), (1951, 1951), (1950, 1950), (1949, 1949), (1948, 1948), (1947, 1947), (1946, 1946), (1945, 1945), (1944, 1944), (1943, 1943), (1942, 1942), (1941, 1941), (1940, 1940), (1939, 1939)], null=True),
        ),
    ]
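The `district` fields above use Django's grouped-choices format: a list mixing plain `(value, label)` pairs with `(group_label, [(value, label), ...])` groups. Validation only considers the inner pairs; the group labels are display-only. A standalone sketch of flattening such groups (plain Python, no Django required):

```python
def flatten_choices(choices):
    # Yield (value, label) pairs from a Django-style choices list,
    # descending into named groups like ('BHS', [(200, 'CAR'), ...]).
    for value, label in choices:
        if isinstance(label, (list, tuple)):
            yield from label
        else:
            yield value, label

choices = [
    ('BHS', [(200, 'CAR'), (205, 'CSD')]),
    ('Associated', [(410, 'NxtGn')]),
    (110, 'BHS'),
]
print(dict(flatten_choices(choices))[200])  # CAR
```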
# File: flappybird.py (repo: namitakarnatak/crispy-potato, license: bzip2-1.0.6)

# -*- coding: utf-8 -*-
"""
Created on Wed Mar 3 17:03:53 2021
@author: a
"""