# src/tfpose_ros/__init__.py (Tacha-S/tf-pose-estimation @ 56d7c89, Apache-2.0)
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
# from tfpose_ros.runner import infer, Estimator, get_estimator
# proteus/SplitOperator.py (robertsawko/proteus @ 6f1e4c2, NASA-1.3)
"""
A class hierarchy for split operator methods.
"""
import Profiling
log = Profiling.logEvent
class System:
def __init__(self):
self.name="Default System"
defaultSystem = System()
class SO_base:
"""
    Base class for operator splitting methods for systems.
    The base class implements sequential splitting with a fixed time
    step based on the list of time intervals.
    Here each model takes the same fixed time step.
"""
def __init__(self,modelList,system=defaultSystem,stepExact=True):
self.system=system
self.dt_system = 1.0
self.t_system_last=0.0
self.t_system=1.0
self.modelList=modelList
self.exitModelStep={}
for model in self.modelList:
self.exitModelStep[model] = False
self.stepExact = stepExact
self.updateAfterModelStep=True
self.stepExactEps = 1.0e-12
self.stepFailures = 0
#self.atol_iso=1.e-8 #tjp added for iterative SO routine tolerance
def converged(self):
#no iteration
if self.its > 0:
self.its=0
return True
else:
return False
def stepExact_system(self,tExact):
self.dt_system = tExact - self.t_system_last
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,m) for m in self.modelList]
for model in self.modelList:
model.stepController.dt_model = self.dt_system
model.stepController.set_dt_allLevels()
model.stepController.t_model = self.t_system
model.stepController.setSubsteps([self.t_system])
def choose_dt_system(self):
#fixed step
self.t_system = self.t_system_last+self.dt_system
self.stepSequence=[(self.t_system,m) for m in self.modelList]
for model in self.modelList:
model.stepController.dt_model = self.dt_system
model.stepController.set_dt_allLevels()
model.stepController.t_model = self.t_system
def initialize_dt_system(self,t0,tOut):
self.its=0
self.t_system_last = t0
self.dt_system = tOut - self.t_system_last
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,m) for m in self.modelList]
log("Initializing time step on system %s to dt = %12.5e" %
(self.system.name,self.dt_system),level=1)
log("Initializing step sequence for system %s to %s" %
(self.system.name,self.stepSequence),level=1)
def updateTimeHistory(self):
#update step
self.t_system_last = self.t_system
def retryModelStep_solverFailure(self,model):
self.retry = model.stepController.retryStep_solverFailure()
self.maxFailures = model.stepController.maxSolverFailures
return False#kick out to sequence level
def retryModelStep_errorFailure(self,model):
self.retry = model.stepController.retryStep_errorFailure()
self.maxFailures = model.stepController.maxErrorFailures
return False
def ignoreSequenceStepFailure(self,model):
return False#kick out to sequence level
def retrySequence_modelStepFailure(self):
#retry whole sequence with new min time step
self.stepFailures +=1
if (self.stepFailures < self.maxFailures and self.retry):
self.choose_dt_system()
return True
else:
return False
# def retryModelStep_solverFailure(self,model):
# return model.stepController.retryStep_solverFailure()
# def retryModelStep_errorFailure(self,model):
# return model.stepController.retryStep_errorFailure()
# def ignoreSequenceStepFailure(self,model):
# return False#don't try to recover
# def retrySequence_modelStepFailure(self):
# return False#don't try to recover
def modelStepTaken(self,model,t_stepSequence):
log("SO_base modelStepTaken for model= %s t_system_last= %s t_model_last= %s setting to t_stepSequence= %s " % (model.name,
self.t_system_last,
model.stepController.t_model_last,
t_stepSequence),level=3)
self.stepFailures=0
model.calculateAuxiliaryQuantitiesAfterStep()
model.stepController.t_model_last = t_stepSequence
def sequenceStepTaken(self,model):
self.stepFailures=0
def sequenceTaken(self):
#this is called when the sequence of steps is done
self.its += 1
for model in self.modelList:
model.stepController.updateTimeHistory()
model.stepController.choose_dt_model()
# tjp added for split operator class
def SysNorm(self,rSys=0):
""" Compute the maximum discrete residual value from both models"""
pass
def setFromOptions(self,soOptions):
"""
allow classes to set various numerical parameters
"""
pass
Sequential_FixedStep = SO_base
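SO_base pushes every model to the same target time in one fixed system step. A minimal standalone sketch of that bookkeeping, under the assumption that `StubStepController` and `StubModel` are hypothetical stand-ins for the real Proteus `StepControl` classes (they are not part of Proteus):

```python
class StubStepController:
    """Hypothetical stand-in for a Proteus step controller."""
    def __init__(self):
        self.dt_model = 0.0
        self.t_model = 0.0
    def set_dt_allLevels(self):
        pass  # the real controller propagates dt_model to all mesh levels

class StubModel:
    """Hypothetical stand-in for a Proteus model."""
    def __init__(self, name):
        self.name = name
        self.stepController = StubStepController()

def initialize_fixed_step(models, t0, tOut):
    """Mirror of SO_base.initialize_dt_system: one system step spans
    [t0, tOut] and every model is pushed to the same target time."""
    dt_system = tOut - t0
    t_system = t0 + dt_system
    for m in models:
        m.stepController.dt_model = dt_system
        m.stepController.set_dt_allLevels()
        m.stepController.t_model = t_system
    return dt_system, [(t_system, m) for m in models]

models = [StubModel("flow"), StubModel("transport")]
dt, seq = initialize_fixed_step(models, t0=0.0, tOut=1.0)
print(dt)         # 1.0
print(seq[0][0])  # 1.0
```

The step sequence pairs the common target time with each model, which is exactly what the sequential drivers below iterate over.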
class Sequential_FixedStep_Simple(SO_base):
"""
    Base class for operator splitting methods for systems.
    The base class implements sequential splitting with a fixed time
    step based on the list of time intervals.
    Here each model takes the same fixed time step.
"""
def __init__(self,modelList,system=defaultSystem,stepExact=True):
SO_base.__init__(self,modelList,system,stepExact)
def converged(self):
#no iteration
if self.its > 0:
self.its=0
return True
else:
return False
def stepExact_system(self,tExact):
self.dt_system = tExact - self.t_system_last
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,m) for m in self.modelList]
for model in self.modelList:
model.stepController.dt_model = self.dt_system
model.stepController.set_dt_allLevels()
model.stepController.t_model = self.t_system
model.stepController.setSubsteps([self.t_system])
def choose_dt_system(self):
#fixed step
self.t_system = self.t_system_last+self.dt_system
self.stepSequence=[(self.t_system,m) for m in self.modelList]
for model in self.modelList:
model.stepController.dt_model = self.dt_system
model.stepController.set_dt_allLevels()
model.stepController.t_model = self.t_system
def initialize_dt_system(self,t0,tOut):
self.its=0
self.t_system_last = t0
self.dt_system = tOut - self.t_system_last
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,m) for m in self.modelList]
log("Initializing time step on system %s to dt = %12.5e" %
(self.system.name,self.dt_system),level=1)
log("Initializing step sequence for system %s to %s" %
(self.system.name,self.stepSequence),level=1)
def updateTimeHistory(self):
#update step
self.t_system_last = self.t_system
def retryModelStep_solverFailure(self,model):
return False#don't try to recover
def retryModelStep_errorFailure(self,model):
return False#don't try to recover
def ignoreSequenceStepFailure(self,model):
return False#don't try to recover
def retrySequence_modelStepFailure(self):
return False#don't try to recover
class Sequential_NonUniformFixedStep(SO_base):
"""
    Step only to the system output time levels, allowing models to
    substep to the system steps if necessary.
"""
def __init__(self,modelList,system=defaultSystem,stepExact=True):
SO_base.__init__(self,modelList,system,stepExact=True)#force?
for model in self.modelList:
model.stepController.stepExact = True#mwf should be True
def stepExact_system(self,tExact):
old = False
if old:
if (self.dt_system > 0.0 and
self.t_system_last + self.dt_system >= tExact*(1.0-self.stepExactEps)):
self.dt_system = tExact - self.t_system_last
elif (self.dt_system < 0.0 and
self.t_system_last + self.dt_system <= tExact*(1.0 + self.stepExactEps)):
self.dt_system = tExact - self.t_system_last
else:
if (self.dt_system > 0.0):
if(self.t_system_last + self.dt_system >= tExact*(1.0-self.stepExactEps)):
log("===========================================================dt system orig" + str(self.dt_system),level=5)
self.dt_system = tExact - self.t_system_last
log("=========================================================dt system final" + str(self.dt_system),level=5)
elif( tExact - (self.t_system_last + self.dt_system) < self.dt_system/2.0 ): #if next step would be within dt/2 ball go ahead and cut a little bit
log("===========================================================dt system orig" + str(self.dt_system),level=5)
self.dt_system = (tExact - self.t_system_last)/2.0
log("=========================================================dt system final" + str(self.dt_system),level=5)
if (self.dt_system < 0.0):
if(self.t_system_last + self.dt_system <= tExact*(1.0 + self.stepExactEps)):
self.dt_system = tExact - self.t_system_last
elif( tExact - (self.t_system_last + self.dt_system) > self.dt_system/2.0 ): #if next step would be within dt/2 ball go ahead and cut a little bit
self.dt_system = (tExact - self.t_system_last)/2.0
        self.dt_system = tExact - self.t_system_last #NOTE: unconditional; overrides the dt/2 adjustment above and always steps exactly to tExact
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,m) for m in self.modelList]
def choose_dt_system(self):
#fixed system step
self.t_system = self.t_system_last+self.dt_system
self.stepSequence=[(self.t_system,m) for m in self.modelList]
for model in self.modelList:
model.stepController.set_dt_allLevels()
def modelStepTaken(self,model,t_stepSequence):
self.stepFailures=0
model.calculateAuxiliaryQuantitiesAfterStep()
#model.stepController.t_model_last = t_stepSequence
model.stepController.updateTimeHistory()
model.stepController.choose_dt_model()
def sequenceTaken(self):
#this is called when the sequence of steps is done
self.its += 1
#for model in self.modelList:
# model.stepController.updateTimeHistory()
# model.stepController.choose_dt_model()
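Several of the stepExact_system implementations share the same positive-dt adjustment: land exactly on the target when the step would overshoot, and split the remaining interval when the next step would finish inside a dt/2 ball of the target, so dt is never cut by more than roughly half. A standalone sketch of that rule (eps plays the role of stepExactEps):

```python
def clip_dt_to_target(t_last, dt, t_exact, eps=1.0e-12):
    """Positive-dt branch of the step-exact rule: step exactly to t_exact
    on overshoot; otherwise, if the following step would end within a
    dt/2 ball of t_exact, take two roughly equal steps instead."""
    if dt > 0.0:
        if t_last + dt >= t_exact * (1.0 - eps):
            dt = t_exact - t_last          # overshoot: land exactly on t_exact
        elif t_exact - (t_last + dt) < dt / 2.0:
            dt = (t_exact - t_last) / 2.0  # near miss: split the remainder in two
    return dt

print(clip_dt_to_target(0.0, 1.2, 1.0))  # 1.0 (clipped to land on t_exact)
print(clip_dt_to_target(0.0, 0.8, 1.0))  # 0.5 (two half steps instead of 0.8 + 0.2)
print(clip_dt_to_target(0.0, 0.4, 1.0))  # 0.4 (far from t_exact, unchanged)
```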
class Sequential_MinModelStep(SO_base):
"""
    Use the minimum model step as the system step and as every model's
    step. Models are forced to step exactly to each system step, with
    substepping if necessary, though this isn't strictly required.
"""
def __init__(self,modelList,system=defaultSystem,stepExact=True):
SO_base.__init__(self,modelList,system,stepExact)
for model in self.modelList:
model.stepController.stepExact = True
def converged(self):
#no iteration
if self.its > 0:
self.its=0
return True
else:
return False
def stepExact_system(self,tExact):
old = False
if old:
if (self.dt_system > 0.0 and
self.t_system_last + self.dt_system >= tExact*(1.0-self.stepExactEps)):
self.dt_system = tExact - self.t_system_last
elif (self.dt_system < 0.0 and
self.t_system_last + self.dt_system <= tExact*(1.0 + self.stepExactEps)):
self.dt_system = tExact - self.t_system_last
else:
if (self.dt_system > 0.0):
if(self.t_system_last + self.dt_system >= tExact*(1.0-self.stepExactEps)):
log("===========================================================dt system orig" + str(self.dt_system),level=5)
self.dt_system = tExact - self.t_system_last
log("=========================================================dt system final" + str(self.dt_system),level=5)
elif( tExact - (self.t_system_last + self.dt_system) < self.dt_system/2.0 ): #if next step would be within dt/2 ball go ahead and cut a little bit
log("===========================================================dt system orig" + str(self.dt_system),level=5)
self.dt_system = (tExact - self.t_system_last)/2.0
log("=========================================================dt system final" + str(self.dt_system),level=5)
if (self.dt_system < 0.0):
if(self.t_system_last + self.dt_system <= tExact*(1.0 + self.stepExactEps)):
self.dt_system = tExact - self.t_system_last
elif( tExact - (self.t_system_last + self.dt_system) > self.dt_system/2.0 ): #if next step would be within dt/2 ball go ahead and cut a little bit
self.dt_system = (tExact - self.t_system_last)/2.0
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,m) for m in self.modelList]
for model in self.modelList:
model.stepController.dt_model = self.dt_system
model.stepController.set_dt_allLevels()
model.stepController.t_model = self.t_system
#model.stepController.substeps= [self.t_system]
model.stepController.setSubsteps([self.t_system])
def choose_dt_system(self):
self.dt_system = min([model.stepController.dt_model for model in self.modelList])
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,model) for model in self.modelList]
for model in self.modelList:
model.stepController.dt_model = self.dt_system
model.stepController.set_dt_allLevels()
log("SplitOperator_Min choose_dt_system t_system_last= %s dt_system= %s t_system= %s " % (self.t_system_last,
self.dt_system,
self.t_system),3)
def initialize_dt_system(self,t0,tOut):
self.its=0
self.t_system_last = t0
self.dt_system = min([model.stepController.dt_model for model in self.modelList])
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,m) for m in self.modelList]
for model in self.modelList:
model.stepController.dt_model = self.dt_system
model.stepController.t_model = self.t_system
model.stepController.set_dt_allLevels()
model.stepController.initializeTimeHistory()
log("Initializing time step on system %s to dt = %12.5e" %
(self.system.name,
self.dt_system),
level=1)
log("Initializing step sequence for system %s to %s" %
(self.system.name,
self.stepSequence),
level=1)
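The core of Sequential_MinModelStep.choose_dt_system reduces to taking the minimum over all model steps, so the slowest model sets the pace for the whole system. A minimal sketch of just that rule:

```python
def choose_min_dt(dt_models, t_last):
    """Mirror of Sequential_MinModelStep.choose_dt_system: the smallest
    dt_model becomes the system step, and the new system time follows."""
    dt_system = min(dt_models)
    t_system = t_last + dt_system
    return dt_system, t_system

dt, t = choose_min_dt([0.5, 0.2, 1.0], t_last=3.0)
print(dt, t)  # 0.2 3.2
```

In the full class this dt_system is then copied into every model's stepController.dt_model, forcing all models onto the same step.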
class Sequential_MinFLCBDFModelStep(SO_base):
"""
Look at the minimum model step and make that the system step as
well as all the model steps
"""
def __init__(self,modelList,system=defaultSystem,stepExact=False):
from StepControl import FLCBDF_controller
SO_base.__init__(self,modelList,system,stepExact)
self.flcbdfList = []
for model in self.modelList:
if isinstance(model.stepController,FLCBDF_controller):
self.flcbdfList.append(model)
self.maxFailures = model.stepController.maxSolverFailures
self.stepFailures=0
self.updateAfterModelStep=False
def converged(self):
#no iteration
if self.its > 0:
self.its=0
return True
else:
return False
def stepExact_system(self,tExact):
#mwf needs to be checked
if (self.dt_system > 0.0 and
self.t_system_last + self.dt_system >= tExact*(1.0-self.stepExactEps)):
self.dt_system = tExact - self.t_system_last
elif (self.dt_system < 0.0 and
self.t_system_last + self.dt_system <= tExact*(1.0 + self.stepExactEps)):
self.dt_system = tExact - self.t_system_last
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,m) for m in self.modelList]
for model in self.modelList:
model.stepController.dt_model = self.dt_system
model.stepController.set_dt_allLevels()
model.stepController.t_model = self.t_system
#model.stepController.substeps= [self.t_system]
model.stepController.setSubsteps([self.t_system])
def choose_dt_system(self):
self.dt_system = min([model.stepController.dt_model for model in self.flcbdfList])
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,model) for model in self.modelList]
for model in self.modelList:
model.stepController.dt_model = self.dt_system
model.stepController.set_dt_allLevels()
def initialize_dt_system(self,t0,tOut):
self.its=0
self.t_system_last = t0
self.dt_system = min([model.stepController.dt_model for model in self.modelList])
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,m) for m in self.modelList]
for model in self.modelList:
model.stepController.dt_model = self.dt_system
model.stepController.t_model = self.t_system
model.stepController.set_dt_allLevels()
model.stepController.initializeTimeHistory()
log("Initializing time step on system %s to dt = %12.5e" %
(self.system.name,
self.dt_system),
level=1)
log("Initializing step sequence for system %s to %s" %
(self.system.name,
self.stepSequence),
level=1)
def retryModelStep_solverFailure(self,model):
self.retry = model.stepController.retryStep_solverFailure()
return False#kick out to sequence level
def retryModelStep_errorFailure(self,model):
self.retry = model.stepController.retryStep_errorFailure()
return False
def ignoreSequenceStepFailure(self,model):
return False#kick out to sequence level
def retrySequence_modelStepFailure(self):
#retry whole sequence with new min time step
self.stepFailures +=1
if (self.stepFailures < self.maxFailures and self.retry):
self.choose_dt_system()
return True
else:
return False
def modelStepTaken(self,model,t_stepSequence):
self.stepFailures=0
if model.stepController.t_model >= t_stepSequence:
self.exitModelStep[model] = True
model.calculateAuxiliaryQuantitiesAfterStep()
def sequenceStepTaken(self,model):
self.stepFailures=0
self.exitModelStep[model] = False #reset to False
def sequenceTaken(self):
#this is called when the sequence of steps is done
self.its += 1
#do not need to do step exact here, because taken care of by
#systemStepController forcing lock-step?
for model in self.modelList:
model.stepController.updateTimeHistory()
model.stepController.choose_dt_model()
#recompute auxiliary variables here?
class Sequential_MinAdaptiveModelStep(SO_base):
"""
Look at the minimum model step and make that the system step as
well as all the model steps
"""
def __init__(self,modelList,system=defaultSystem,stepExact=False):
from StepControl import FLCBDF_controller
SO_base.__init__(self,modelList,system,stepExact)
self.controllerList = []
for model in self.modelList:
if model.levelModelList[-1].timeIntegration.isAdaptive:
self.controllerList.append(model)
self.maxFailures = model.stepController.maxSolverFailures
#print "controllers",[c.name for c in self.controllerList]
self.stepFailures=0
self.updateAfterModelStep=False
def converged(self):
#no iteration
if self.its > 0:
self.its=0
return True
else:
return False
def stepExact_system(self,tExact):
#cek try to prevent cutting dt by more than a factor of 2
#todo, make standard approach?
old=False
if old:
if (self.dt_system > 0.0 and
self.t_system_last + self.dt_system >= tExact*(1.0-self.stepExactEps)):
log("===========================================================dt system orig" + str(self.dt_system),level=5)
self.dt_system = tExact - self.t_system_last
log("=========================================================dt system final" + str(self.dt_system),level=5)
elif (self.dt_system < 0.0 and
self.t_system_last + self.dt_system <= tExact*(1.0 + self.stepExactEps)):
self.dt_system = tExact - self.t_system_last
else:
if (self.dt_system > 0.0):
if(self.t_system_last + self.dt_system >= tExact*(1.0-self.stepExactEps)):
log("===========================================================dt system orig" + str(self.dt_system),level=5)
self.dt_system = tExact - self.t_system_last
log("=========================================================dt system final" + str(self.dt_system),level=5)
elif( tExact - (self.t_system_last + self.dt_system) < self.dt_system/2.0 ): #if next step would be within dt/2 ball go ahead and cut a little bit
log("===========================================================dt system orig" + str(self.dt_system),level=5)
self.dt_system = (tExact - self.t_system_last)/2.0
log("=========================================================dt system final" + str(self.dt_system),level=5)
if (self.dt_system < 0.0):
if(self.t_system_last + self.dt_system <= tExact*(1.0 + self.stepExactEps)):
self.dt_system = tExact - self.t_system_last
elif( tExact - (self.t_system_last + self.dt_system) > self.dt_system/2.0 ): #if next step would be within dt/2 ball go ahead and cut a little bit
self.dt_system = (tExact - self.t_system_last)/2.0
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,m) for m in self.modelList]
for model in self.modelList:
model.stepController.dt_model = self.dt_system
model.stepController.set_dt_allLevels()
model.stepController.t_model = self.t_system
#model.stepController.substeps= [self.t_system]
model.stepController.setSubsteps([self.t_system])
def choose_dt_system(self):
self.dt_system = min([model.stepController.dt_model for model in self.controllerList])
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,model) for model in self.modelList]
for model in self.modelList:
model.stepController.dt_model = self.dt_system
model.stepController.set_dt_allLevels()
#mwf this is not done in flcbdf system controller, should it be?
#model.stepController.t_model = self.t_system
def initialize_dt_system(self,t0,tOut):
self.its=0
self.t_system_last = t0
self.dt_system = min([model.stepController.dt_model for model in self.controllerList])
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,m) for m in self.modelList]
for model in self.modelList:
model.stepController.dt_model = self.dt_system
model.stepController.t_model = self.t_system
model.stepController.set_dt_allLevels()
model.stepController.initializeTimeHistory()
log("Initializing time step on system %s to dt = %12.5e" %
(self.system.name,
self.dt_system),
level=1)
log("Initializing step sequence for system %s to %s" %
(self.system.name,
self.stepSequence),
level=1)
def retryModelStep_solverFailure(self,model):
self.failureType='solver'
return False#kick out to sequence level
def retryModelStep_errorFailure(self,model):
self.failureType='error'
return False
def ignoreSequenceStepFailure(self,model):
return False#kick out to sequence level
def retrySequence_modelStepFailure(self):
self.retry=True
for model in self.controllerList:
if self.failureType == 'solver':
self.retry = self.retry and model.stepController.retryStep_solverFailure()
else:
self.retry = self.retry and model.stepController.retryStep_errorFailure()
if self.retry:
self.stepFailures +=1
if (self.stepFailures < self.maxFailures and self.retry):
self.choose_dt_system()
return True
else:
return False
def modelStepTaken(self,model,t_stepSequence):
self.stepFailures=0
if model.stepController.t_model >= t_stepSequence:
self.exitModelStep[model] = True
model.calculateAuxiliaryQuantitiesAfterStep()
def sequenceStepTaken(self,model):
self.stepFailures=0
self.exitModelStep[model] = False #reset to False
def sequenceTaken(self):
#this is called when the sequence of steps is done
#do not need to do step exact here, because taken care of by
#systemStepController forcing lock-step?
self.its += 1
for model in self.modelList:
model.stepController.updateTimeHistory()
model.stepController.choose_dt_model()
#recompute auxiliary variables here?
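Sequential_MinAdaptiveModelStep retries the whole sequence only if every adaptive controller agrees to retry and the failure budget is not exhausted. A standalone sketch of that unanimous-retry rule, with StubController as a hypothetical stand-in for a StepControl controller:

```python
def retry_sequence(controllers, failure_type, step_failures, max_failures):
    """Sketch of retrySequence_modelStepFailure: retry only if all
    controllers vote to retry, counting failures against a budget."""
    retry = True
    for c in controllers:
        if failure_type == 'solver':
            retry = retry and c.retryStep_solverFailure()
        else:
            retry = retry and c.retryStep_errorFailure()
    if retry:
        step_failures += 1
    return (step_failures < max_failures and retry), step_failures

class StubController:
    """Hypothetical controller that always votes the same way."""
    def __init__(self, ok):
        self.ok = ok
    def retryStep_solverFailure(self):
        return self.ok
    def retryStep_errorFailure(self):
        return self.ok

ok, n = retry_sequence([StubController(True), StubController(True)], 'solver', 0, 10)
print(ok, n)  # True 1
ok, n = retry_sequence([StubController(True), StubController(False)], 'solver', 0, 10)
print(ok, n)  # False 0 (one dissenting controller vetoes the retry)
```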
class ISO_fixed_MinAdaptiveModelStep(SO_base):
"""
    Iterative variant: use the minimum model step as the system step and
    as every model's step, repeating the step sequence several times
    (until its > 2) before accepting it.
"""
def __init__(self,modelList,system=defaultSystem,stepExact=False):
from StepControl import FLCBDF_controller
SO_base.__init__(self,modelList,system,stepExact)
self.controllerList = []
for model in self.modelList:
if model.levelModelList[-1].timeIntegration.isAdaptive:
self.controllerList.append(model)
self.maxFailures = model.stepController.maxSolverFailures
#print "controllers",[c.name for c in self.controllerList]
self.stepFailures=0
self.updateAfterModelStep=False
def converged(self):
#no iteration
if self.its > 2:
self.its=0
return True
else:
return False
def stepExact_system(self,tExact):
#cek try to prevent cutting dt by more than a factor of 2
#todo, make standard approach?
old=False
if old:
if (self.dt_system > 0.0 and
self.t_system_last + self.dt_system >= tExact*(1.0-self.stepExactEps)):
log("===========================================================dt system orig" + str(self.dt_system),level=5)
self.dt_system = tExact - self.t_system_last
log("=========================================================dt system final" + str(self.dt_system),level=5)
elif (self.dt_system < 0.0 and
self.t_system_last + self.dt_system <= tExact*(1.0 + self.stepExactEps)):
self.dt_system = tExact - self.t_system_last
else:
if (self.dt_system > 0.0):
if(self.t_system_last + self.dt_system >= tExact*(1.0-self.stepExactEps)):
log("===========================================================dt system orig" + str(self.dt_system),level=5)
self.dt_system = tExact - self.t_system_last
log("=========================================================dt system final" + str(self.dt_system),level=5)
elif( tExact - (self.t_system_last + self.dt_system) < self.dt_system/2.0 ): #if next step would be within dt/2 ball go ahead and cut a little bit
log("===========================================================dt system orig" + str(self.dt_system),level=5)
self.dt_system = (tExact - self.t_system_last)/2.0
log("=========================================================dt system final" + str(self.dt_system),level=5)
if (self.dt_system < 0.0):
if(self.t_system_last + self.dt_system <= tExact*(1.0 + self.stepExactEps)):
self.dt_system = tExact - self.t_system_last
elif( tExact - (self.t_system_last + self.dt_system) > self.dt_system/2.0 ): #if next step would be within dt/2 ball go ahead and cut a little bit
self.dt_system = (tExact - self.t_system_last)/2.0
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,m) for m in self.modelList]
for model in self.modelList:
model.stepController.dt_model = self.dt_system
model.stepController.set_dt_allLevels()
model.stepController.t_model = self.t_system
#model.stepController.substeps= [self.t_system]
model.stepController.setSubsteps([self.t_system])
def choose_dt_system(self):
self.dt_system = min([model.stepController.dt_model for model in self.controllerList])
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,model) for model in self.modelList]
for model in self.modelList:
model.stepController.dt_model = self.dt_system
model.stepController.set_dt_allLevels()
#mwf this is not done in flcbdf system controller, should it be?
#model.stepController.t_model = self.t_system
def initialize_dt_system(self,t0,tOut):
self.its=0
self.t_system_last = t0
self.dt_system = min([model.stepController.dt_model for model in self.controllerList])
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,m) for m in self.modelList]
for model in self.modelList:
model.stepController.dt_model = self.dt_system
model.stepController.t_model = self.t_system
model.stepController.set_dt_allLevels()
model.stepController.initializeTimeHistory()
log("Initializing time step on system %s to dt = %12.5e" %
(self.system.name,
self.dt_system),
level=1)
log("Initializing step sequence for system %s to %s" %
(self.system.name,
self.stepSequence),
level=1)
def retryModelStep_solverFailure(self,model):
self.failureType='solver'
        return False  # kick out to sequence level
def retryModelStep_errorFailure(self,model):
self.failureType='error'
return False
def ignoreSequenceStepFailure(self,model):
        return False  # kick out to sequence level
def retrySequence_modelStepFailure(self):
self.retry=True
for model in self.modelList:
if self.failureType == 'solver':
self.retry = self.retry and model.stepController.retryStep_solverFailure()
else:
self.retry = self.retry and model.stepController.retryStep_errorFailure()
if self.retry:
self.stepFailures +=1
if (self.stepFailures < self.maxFailures and self.retry):
self.choose_dt_system()
return True
else:
return False
def modelStepTaken(self,model,t_stepSequence):
self.stepFailures=0
if model.stepController.t_model >= t_stepSequence:
self.exitModelStep[model] = True
model.calculateAuxiliaryQuantitiesAfterStep()
def sequenceStepTaken(self,model):
self.stepFailures=0
self.exitModelStep[model] = False #reset to False
def sequenceTaken(self):
#this is called when the sequence of steps is done
#do not need to do step exact here, because taken care of by
#systemStepController forcing lock-step?
self.its += 1
if self.its > 2:
for model in self.modelList:
model.stepController.updateTimeHistory()
model.stepController.choose_dt_model()
#recompute auxiliary variables here?
class Sequential_MinAdaptiveModelStep_SS(SO_base):
"""
Look at the minimum model step and make that the system step as
well as all the model steps
"""
def __init__(self,modelList,system=defaultSystem,stepExact=False):
from StepControl import FLCBDF_controller
SO_base.__init__(self,modelList,system,stepExact)
self.controllerList = []
for model in self.modelList:
if model.levelModelList[-1].timeIntegration.isAdaptive:
self.controllerList.append(model)
self.maxFailures = model.stepController.maxSolverFailures
#print "controllers",[c.name for c in self.controllerList]
self.stepFailures=0
self.updateAfterModelStep=False
def converged(self):
#no iteration
if self.its > 0:
self.its=0
return True
else:
return False
def stepExact_system(self,tExact):
#cek try to prevent cutting dt by more than a factor of 2
#todo, make standard approach?
old=False
if old:
if (self.dt_system > 0.0 and
self.t_system_last + self.dt_system >= tExact*(1.0-self.stepExactEps)):
log("===========================================================dt system orig" + str(self.dt_system),level=5)
self.dt_system = tExact - self.t_system_last
log("=========================================================dt system final" + str(self.dt_system),level=5)
elif (self.dt_system < 0.0 and
self.t_system_last + self.dt_system <= tExact*(1.0 + self.stepExactEps)):
self.dt_system = tExact - self.t_system_last
else:
if (self.dt_system > 0.0):
if(self.t_system_last + self.dt_system >= tExact*(1.0-self.stepExactEps)):
log("===========================================================dt system orig" + str(self.dt_system),level=5)
self.dt_system = tExact - self.t_system_last
log("=========================================================dt system final" + str(self.dt_system),level=5)
elif( tExact - (self.t_system_last + self.dt_system) < self.dt_system/2.0 ): #if next step would be within dt/2 ball go ahead and cut a little bit
log("===========================================================dt system orig" + str(self.dt_system),level=5)
self.dt_system = (tExact - self.t_system_last)/2.0
log("=========================================================dt system final" + str(self.dt_system),level=5)
if (self.dt_system < 0.0):
if(self.t_system_last + self.dt_system <= tExact*(1.0 + self.stepExactEps)):
self.dt_system = tExact - self.t_system_last
elif( tExact - (self.t_system_last + self.dt_system) > self.dt_system/2.0 ): #if next step would be within dt/2 ball go ahead and cut a little bit
self.dt_system = (tExact - self.t_system_last)/2.0
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,m) for m in self.modelList]
for model in self.modelList:
model.stepController.dt_model = self.dt_system
model.stepController.set_dt_allLevels()
model.stepController.t_model = self.t_system
#model.stepController.substeps= [self.t_system]
model.stepController.setSubsteps([self.t_system])
def choose_dt_system(self):
self.dt_system = min([model.stepController.dt_model for model in self.controllerList])
self.max_its = max([model.solver.solverList[-1].its for model in self.controllerList])
if self.max_its < 3:
self.dt_system = self.dt_system_last*1.1
self.dt_system_last = self.dt_system
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,model) for model in self.modelList]
for model in self.modelList:
model.stepController.dt_model = self.dt_system
model.stepController.set_dt_allLevels()
#mwf this is not done in flcbdf system controller, should it be?
#model.stepController.t_model = self.t_system
def initialize_dt_system(self,t0,tOut):
self.its=0
self.t_system_last = t0
self.dt_system = min([model.stepController.dt_model for model in self.controllerList])
self.dt_system_last = self.dt_system
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,m) for m in self.modelList]
for model in self.modelList:
model.stepController.dt_model = self.dt_system
model.stepController.t_model = self.t_system
model.stepController.set_dt_allLevels()
model.stepController.initializeTimeHistory()
log("Initializing time step on system %s to dt = %12.5e" %
(self.system.name,
self.dt_system),
level=1)
log("Initializing step sequence for system %s to %s" %
(self.system.name,
self.stepSequence),
level=1)
def retryModelStep_solverFailure(self,model):
self.failureType='solver'
        return False  # kick out to sequence level
def retryModelStep_errorFailure(self,model):
self.failureType='error'
return False
def ignoreSequenceStepFailure(self,model):
        return False  # kick out to sequence level
def retrySequence_modelStepFailure(self):
self.retry=True
for model in self.modelList:
if self.failureType == 'solver':
self.retry = self.retry and model.stepController.retryStep_solverFailure()
else:
self.retry = self.retry and model.stepController.retryStep_errorFailure()
if self.retry:
self.stepFailures +=1
if (self.stepFailures < self.maxFailures and self.retry):
self.choose_dt_system()
return True
else:
return False
def modelStepTaken(self,model,t_stepSequence):
self.stepFailures=0
if model.stepController.t_model >= t_stepSequence:
self.exitModelStep[model] = True
model.calculateAuxiliaryQuantitiesAfterStep()
def sequenceStepTaken(self,model):
self.stepFailures=0
self.exitModelStep[model] = False #reset to False
def sequenceTaken(self):
#this is called when the sequence of steps is done
#do not need to do step exact here, because taken care of by
#systemStepController forcing lock-step?
self.its += 1
for model in self.modelList:
model.stepController.updateTimeHistory()
model.stepController.choose_dt_model()
#recompute auxiliary variables here?
class SequentialNotInOrder_MinFLCBDFModelStep(Sequential_MinFLCBDFModelStep):
"""
    Loop through the models in a specified order.
"""
#mwf hack set list for testing
def __init__(self,modelList,modelSequenceList=[1],system=defaultSystem,stepExact=False):
Sequential_MinFLCBDFModelStep.__init__(self,modelList,system=system,stepExact=stepExact)
        if modelSequenceList is None:
self.modelSequenceList = [i for i in range(len(modelList))]
else:
self.modelSequenceList = modelSequenceList
def stepExact_system(self,tExact):
#mwf needs to be checked
if (self.dt_system > 0.0 and
self.t_system_last + self.dt_system >= tExact*(1.0-self.stepExactEps)):
self.dt_system = tExact - self.t_system_last
elif (self.dt_system < 0.0 and
self.t_system_last + self.dt_system <= tExact*(1.0 + self.stepExactEps)):
self.dt_system = tExact - self.t_system_last
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,self.modelList[i]) for i in self.modelSequenceList]
for model in self.modelList:
model.stepController.dt_model = self.dt_system
model.stepController.set_dt_allLevels()
model.stepController.t_model = self.t_system
#model.stepController.substeps= [self.t_system]
model.stepController.setSubsteps([self.t_system])
def choose_dt_system(self):
self.dt_system = min([model.stepController.dt_model for model in self.flcbdfList])
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,self.modelList[i]) for i in self.modelSequenceList]
for model in self.modelList:
model.stepController.dt_model = self.dt_system
model.stepController.set_dt_allLevels()
def initialize_dt_system(self,t0,tOut):
self.its=0
self.t_system_last = t0
self.dt_system = min([model.stepController.dt_model for model in self.modelList])
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,self.modelList[i]) for i in self.modelSequenceList]
for model in self.modelList:
model.stepController.dt_model = self.dt_system
model.stepController.t_model = self.t_system
model.stepController.set_dt_allLevels()
model.stepController.initializeTimeHistory()
log("Initializing time step on system %s to dt = %12.5e" %
(self.system.name,
self.dt_system),
level=1)
log("Initializing step sequence for system %s to %s" %
(self.system.name,
self.stepSequence),
level=1)
class SequentialNotInOrder_MinAdaptiveModelStep(Sequential_MinAdaptiveModelStep):
"""
    Loop through the models in a specified order.
"""
#mwf hack set default list for testing
def __init__(self,modelList,modelSequenceList=[1],system=defaultSystem,stepExact=False):
Sequential_MinAdaptiveModelStep.__init__(self,modelList,system=system,stepExact=stepExact)
        if modelSequenceList is None:
self.modelSequenceList = [i for i in range(len(modelList))]
else:
self.modelSequenceList = modelSequenceList
def stepExact_system(self,tExact):
#mwf needs to be checked
if (self.dt_system > 0.0 and
self.t_system_last + self.dt_system >= tExact*(1.0-self.stepExactEps)):
self.dt_system = tExact - self.t_system_last
elif (self.dt_system < 0.0 and
self.t_system_last + self.dt_system <= tExact*(1.0 + self.stepExactEps)):
self.dt_system = tExact - self.t_system_last
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,self.modelList[i]) for i in self.modelSequenceList]
#self.stepSequence=[(self.t_system,m) for m in self.modelList]
for model in self.modelList:
model.stepController.dt_model = self.dt_system
model.stepController.set_dt_allLevels()
model.stepController.t_model = self.t_system
#model.stepController.substeps= [self.t_system]
model.stepController.setSubsteps([self.t_system])
def choose_dt_system(self):
self.dt_system = min([model.stepController.dt_model for model in self.controllerList])
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,self.modelList[i]) for i in self.modelSequenceList]
for model in self.modelList:
model.stepController.dt_model = self.dt_system
model.stepController.set_dt_allLevels()
#mwf this is not done in flcbdf system controller, should it be?
#model.stepController.t_model = self.t_system
def initialize_dt_system(self,t0,tOut):
self.its=0
self.t_system_last = t0
self.dt_system = min([model.stepController.dt_model for model in self.modelList])
self.t_system = self.t_system_last + self.dt_system
self.stepSequence=[(self.t_system,self.modelList[i]) for i in self.modelSequenceList]
for model in self.modelList:
model.stepController.dt_model = self.dt_system
model.stepController.t_model = self.t_system
model.stepController.set_dt_allLevels()
model.stepController.initializeTimeHistory()
log("Initializing time step on system %s to dt = %12.5e" %
(self.system.name,
self.dt_system),
level=1)
log("Initializing step sequence for system %s to %s" %
(self.system.name,
self.stepSequence),
level=1)
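The `stepExact_system` branches above all apply the same clipping heuristic: land exactly on the requested output time when the next step would reach (or overshoot) it, and otherwise halve the remaining interval when the step would end within `dt/2` of the target, so `dt` is never cut by more than a factor of two. A standalone sketch of that heuristic (hypothetical helper name, covering only the `dt > 0` branch, not part of the classes above):

```python
def clip_dt_to_output_time(t_last, dt, t_exact, eps=1.0e-10):
    """Sketch of the step-exact heuristic used in stepExact_system.

    - If t_last + dt would reach t_exact (within a relative tolerance eps),
      shrink dt so the step lands exactly on t_exact.
    - Else, if the step would end within dt/2 of t_exact, cut dt now by
      halving the remaining interval, so dt never shrinks by more than 2x.
    - Otherwise leave dt unchanged.
    """
    if dt > 0.0:
        if t_last + dt >= t_exact * (1.0 - eps):
            dt = t_exact - t_last
        elif t_exact - (t_last + dt) < dt / 2.0:
            dt = (t_exact - t_last) / 2.0
    return dt
```

For example, stepping from `t_last = 0.5` with `dt = 0.6` toward `t_exact = 1.0` overshoots, so the step is clipped to `0.5`; stepping from `0.0` with `dt = 0.9` lands within `dt/2` of the target, so the remaining interval is halved to `0.5` instead of leaving a tiny final step.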
# File: Chapter 01/Chap01_Example1.161.py (Anancha/Programming-Techniques-using-Python, MIT)
] | null | null | null | print(int('0b101')) | 19 | 19 | 0.684211 | 3 | 19 | 4.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.210526 | 0 | 19 | 1 | 19 | 19 | 0.473684 | 0 | 0 | 0 | 0 | 0 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
# File: component/tile/__init__.py (sepal-contrib/se.plan, MIT)
from .questionnaire_tile import *
from .map_tile import *
from .compute_tile import *
from .dashboard_tile import *
from .custom_aoi_tile import *
from .cost_tile import *
# File: cyclecount/models.py (Jeck-Liu-Create/GreaterWMS, Apache-2.0)
from django.db import models
class QTYRecorder(models.Model):
openid = models.CharField(max_length=255, verbose_name="Openid")
mode_code = models.CharField(max_length=255, verbose_name="Transaction Mode")
bin_name = models.CharField(max_length=255, verbose_name="Bin Name")
goods_code = models.CharField(max_length=255, verbose_name="Goods Code")
goods_qty = models.BigIntegerField(default=0, verbose_name="On Hand Stock")
creater = models.CharField(max_length=255, verbose_name="Who Create")
create_time = models.DateTimeField(auto_now_add=True, verbose_name="Create Time")
update_time = models.DateTimeField(auto_now=True, blank=True, null=True, verbose_name="Update Time")
class CyclecountModeDayModel(models.Model):
openid = models.CharField(max_length=255, verbose_name="Openid")
cyclecount_status = models.IntegerField(default=0, verbose_name="Cycle Count Status")
bin_name = models.CharField(max_length=255, verbose_name="Bin Name")
goods_code = models.CharField(max_length=255, verbose_name="Goods Code")
goods_qty = models.BigIntegerField(default=0, verbose_name="On Hand Stock")
    physical_inventory = models.BigIntegerField(default=0, verbose_name="Physical Inventory")
    difference = models.BigIntegerField(default=0, verbose_name="Difference")
creater = models.CharField(max_length=255, verbose_name="Who Create")
t_code = models.CharField(max_length=255, verbose_name="Transaction Code")
create_time = models.DateTimeField(auto_now_add=True, verbose_name="Create Time")
update_time = models.DateTimeField(auto_now=True, blank=True, null=True, verbose_name="Update Time")
class Meta:
db_table = 'cyclecountday'
verbose_name = 'data id'
verbose_name_plural = "data id"
ordering = ['openid']
def __str__(self):
        return str(self.pk)
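`CyclecountModeDayModel` stores the system's on-hand stock, the physically counted quantity, and their difference side by side. A minimal sketch of how the `difference` field might be filled in before saving (a hypothetical helper, not part of the models above; it assumes the sign convention "counted minus recorded", so positive means surplus):

```python
def cyclecount_difference(goods_qty, physical_inventory):
    """Difference between the physically counted stock and the on-hand
    stock recorded in the system: positive = surplus, negative = shortage."""
    return physical_inventory - goods_qty
```

With that convention, a record counted at 7 against an on-hand quantity of 10 would store a difference of -3 (a shortage of three units).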
# File: tests/test_api.py (pupil-labs/python-module-skeleton, MIT)
import pupil_labs.project_name as this_project
def test_package_metadata() -> None:
assert hasattr(this_project, "__version__")
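The test above only checks that the package exposes a `__version__` attribute. One common way to provide such an attribute is to look it up from the installed distribution metadata (a sketch, assuming an `importlib.metadata`-based approach; this is not necessarily how the skeleton itself does it):

```python
from importlib import metadata


def get_version(dist_name):
    """Resolve the installed distribution's version string; fall back to a
    placeholder when the package is not installed (e.g. a bare source tree)."""
    try:
        return metadata.version(dist_name)
    except metadata.PackageNotFoundError:
        return "unknown"
```

A module would then typically set `__version__ = get_version("my-dist-name")` at import time, which is exactly what the `hasattr` assertion exercises.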
# File: venv1/Lib/site-packages/tensorflow/python/debug/ops/gen_debug_ops.py (Soum-Soum/Tensorflow_Face_Finder, Apache-2.0 / MIT)
"""Python wrappers around TensorFlow ops.
This file is MACHINE GENERATED! Do not edit.
"""
import collections as _collections
import six as _six
from tensorflow.python import pywrap_tensorflow as _pywrap_tensorflow
from tensorflow.python.eager import context as _context
from tensorflow.python.eager import core as _core
from tensorflow.python.eager import execute as _execute
from tensorflow.python.framework import dtypes as _dtypes
from tensorflow.python.framework import errors as _errors
from tensorflow.python.framework import tensor_shape as _tensor_shape
from tensorflow.core.framework import op_def_pb2 as _op_def_pb2
# Needed to trigger the call to _set_call_cpp_shape_fn.
from tensorflow.python.framework import common_shapes as _common_shapes
from tensorflow.python.framework import op_def_registry as _op_def_registry
from tensorflow.python.framework import ops as _ops
from tensorflow.python.framework import op_def_library as _op_def_library
from tensorflow.python.util.tf_export import tf_export
@tf_export('copy')
def copy(input, tensor_name="", debug_ops_spec=[], name=None):
r"""Copy Op.
Performs CPU-to-CPU or GPU-to-GPU deep-copying of tensor, depending on the
device on which the tensor is allocated.
N.B.: If the all downstream attached debug ops are disabled given the current
gRPC gating status, the output will simply forward the input tensor without
deep-copying. See the documentation of Debug* ops for more details.
Unlike the CopyHost Op, this op does not have HostMemory constraint on its
input or output.
Args:
input: A `Tensor`. Input tensor.
tensor_name: An optional `string`. Defaults to `""`.
The name of the input tensor.
debug_ops_spec: An optional list of `strings`. Defaults to `[]`.
A list of debug op spec (op, url, gated_grpc) for attached debug
ops. Each element of the list has the format
<debug_op>;<grpc_url>;<gated_grpc>, wherein gated_grpc is boolean represented
as 0/1. E.g., "DebugIdentity;grpc://foo:3333;1",
"DebugIdentity;file:///tmp/tfdbg_1;0".
name: A name for the operation (optional).
Returns:
A `Tensor`. Has the same type as `input`.
Output tensor, deep-copied from input.
"""
_ctx = _context.context()
if not _ctx.executing_eagerly():
if tensor_name is None:
tensor_name = ""
tensor_name = _execute.make_str(tensor_name, "tensor_name")
if debug_ops_spec is None:
debug_ops_spec = []
if not isinstance(debug_ops_spec, (list, tuple)):
raise TypeError(
"Expected list for 'debug_ops_spec' argument to "
"'copy' Op, not %r." % debug_ops_spec)
debug_ops_spec = [_execute.make_str(_s, "debug_ops_spec") for _s in debug_ops_spec]
_, _, _op = _op_def_lib._apply_op_helper(
"Copy", input=input, tensor_name=tensor_name,
debug_ops_spec=debug_ops_spec, name=name)
_result = _op.outputs[:]
_inputs_flat = _op.inputs
_attrs = ("T", _op.get_attr("T"), "tensor_name",
_op.get_attr("tensor_name"), "debug_ops_spec",
_op.get_attr("debug_ops_spec"))
_execute.record_gradient(
"Copy", _inputs_flat, _attrs, _result, name)
_result, = _result
return _result
else:
try:
_result = _pywrap_tensorflow.TFE_Py_FastPathExecute(
_ctx._handle, _ctx.device_name, "Copy", name,
_ctx._post_execution_callbacks, input, "tensor_name", tensor_name,
"debug_ops_spec", debug_ops_spec)
return _result
except _core._FallbackException:
return copy_eager_fallback(
input, tensor_name=tensor_name, debug_ops_spec=debug_ops_spec,
name=name)
except _core._NotOkStatusException as e:
if name is not None:
message = e.message + " name: " + name
else:
message = e.message
_six.raise_from(_core._status_to_exception(e.code, message), None)
def copy_eager_fallback(input, tensor_name="", debug_ops_spec=[], name=None):
r"""This is the slowpath function for Eager mode.
This is for function copy
"""
_ctx = _context.context()
if tensor_name is None:
tensor_name = ""
tensor_name = _execute.make_str(tensor_name, "tensor_name")
if debug_ops_spec is None:
debug_ops_spec = []
if not isinstance(debug_ops_spec, (list, tuple)):
raise TypeError(
"Expected list for 'debug_ops_spec' argument to "
"'copy' Op, not %r." % debug_ops_spec)
debug_ops_spec = [_execute.make_str(_s, "debug_ops_spec") for _s in debug_ops_spec]
_attr_T, (input,) = _execute.args_to_matching_eager([input], _ctx)
_inputs_flat = [input]
_attrs = ("T", _attr_T, "tensor_name", tensor_name, "debug_ops_spec",
debug_ops_spec)
_result = _execute.execute(b"Copy", 1, inputs=_inputs_flat, attrs=_attrs,
ctx=_ctx, name=name)
_execute.record_gradient(
"Copy", _inputs_flat, _attrs, _result, name)
_result, = _result
return _result
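The `debug_ops_spec` elements accepted by `copy` (and `copy_host` below) are plain strings of the form `<debug_op>;<grpc_url>;<gated_grpc>`, with the boolean encoded as 0/1, per the docstring examples. A small sketch of building and splitting such spec strings (hypothetical helper names, not part of this generated module):

```python
def make_debug_op_spec(debug_op, url, gated_grpc):
    # Format one debug_ops_spec element as "<debug_op>;<url>;<gated_grpc>",
    # encoding the gated_grpc boolean as 0/1, e.g.
    # "DebugIdentity;grpc://foo:3333;1".
    return "%s;%s;%d" % (debug_op, url, 1 if gated_grpc else 0)


def parse_debug_op_spec(spec):
    # Inverse: split a spec element back into (debug_op, url, gated_grpc).
    # Assumes the URL itself contains no ';' characters, as in the
    # grpc:// and file:// examples above.
    debug_op, url, gated = spec.split(";")
    return debug_op, url, gated == "1"
```

For example, `make_debug_op_spec("DebugIdentity", "file:///tmp/tfdbg_1", False)` yields `"DebugIdentity;file:///tmp/tfdbg_1;0"`, matching the second docstring example.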
@tf_export('copy_host')
def copy_host(input, tensor_name="", debug_ops_spec=[], name=None):
r"""Copy Host Op.
Performs CPU-to-CPU deep-copying of tensor.
N.B.: If the all downstream attached debug ops are disabled given the current
gRPC gating status, the output will simply forward the input tensor without
deep-copying. See the documentation of Debug* ops for more details.
Unlike the Copy Op, this op has HostMemory constraint on its input or output.
Args:
input: A `Tensor`. Input tensor.
tensor_name: An optional `string`. Defaults to `""`.
The name of the input tensor.
debug_ops_spec: An optional list of `strings`. Defaults to `[]`.
A list of debug op spec (op, url, gated_grpc) for attached debug
ops. Each element of the list has the format
<debug_op>;<grpc_url>;<gated_grpc>, wherein gated_grpc is boolean represented
as 0/1. E.g., "DebugIdentity;grpc://foo:3333;1",
"DebugIdentity;file:///tmp/tfdbg_1;0".
name: A name for the operation (optional).
Returns:
A `Tensor`. Has the same type as `input`.
Output tensor, deep-copied from input.
"""
_ctx = _context.context()
if not _ctx.executing_eagerly():
if tensor_name is None:
tensor_name = ""
tensor_name = _execute.make_str(tensor_name, "tensor_name")
if debug_ops_spec is None:
debug_ops_spec = []
if not isinstance(debug_ops_spec, (list, tuple)):
raise TypeError(
"Expected list for 'debug_ops_spec' argument to "
"'copy_host' Op, not %r." % debug_ops_spec)
debug_ops_spec = [_execute.make_str(_s, "debug_ops_spec") for _s in debug_ops_spec]
_, _, _op = _op_def_lib._apply_op_helper(
"CopyHost", input=input, tensor_name=tensor_name,
debug_ops_spec=debug_ops_spec, name=name)
_result = _op.outputs[:]
_inputs_flat = _op.inputs
_attrs = ("T", _op.get_attr("T"), "tensor_name",
_op.get_attr("tensor_name"), "debug_ops_spec",
_op.get_attr("debug_ops_spec"))
_execute.record_gradient(
"CopyHost", _inputs_flat, _attrs, _result, name)
_result, = _result
return _result
else:
try:
_result = _pywrap_tensorflow.TFE_Py_FastPathExecute(
_ctx._handle, _ctx.device_name, "CopyHost", name,
_ctx._post_execution_callbacks, input, "tensor_name", tensor_name,
"debug_ops_spec", debug_ops_spec)
return _result
except _core._FallbackException:
return copy_host_eager_fallback(
input, tensor_name=tensor_name, debug_ops_spec=debug_ops_spec,
name=name)
except _core._NotOkStatusException as e:
if name is not None:
message = e.message + " name: " + name
else:
message = e.message
_six.raise_from(_core._status_to_exception(e.code, message), None)
def copy_host_eager_fallback(input, tensor_name="", debug_ops_spec=[], name=None):
r"""This is the slowpath function for Eager mode.
This is for function copy_host
"""
_ctx = _context.context()
if tensor_name is None:
tensor_name = ""
tensor_name = _execute.make_str(tensor_name, "tensor_name")
if debug_ops_spec is None:
debug_ops_spec = []
if not isinstance(debug_ops_spec, (list, tuple)):
raise TypeError(
"Expected list for 'debug_ops_spec' argument to "
"'copy_host' Op, not %r." % debug_ops_spec)
debug_ops_spec = [_execute.make_str(_s, "debug_ops_spec") for _s in debug_ops_spec]
_attr_T, (input,) = _execute.args_to_matching_eager([input], _ctx)
_inputs_flat = [input]
_attrs = ("T", _attr_T, "tensor_name", tensor_name, "debug_ops_spec",
debug_ops_spec)
_result = _execute.execute(b"CopyHost", 1, inputs=_inputs_flat,
attrs=_attrs, ctx=_ctx, name=name)
_execute.record_gradient(
"CopyHost", _inputs_flat, _attrs, _result, name)
_result, = _result
return _result
@tf_export('debug_identity')
def debug_identity(input, device_name="", tensor_name="", debug_urls=[], gated_grpc=False, name=None):
r"""Debug Identity Op.
Provides an identity mapping of the non-Ref type input tensor for debugging.
Args:
input: A `Tensor`. Input tensor, non-Reference type.
device_name: An optional `string`. Defaults to `""`.
tensor_name: An optional `string`. Defaults to `""`.
Name of the input tensor.
debug_urls: An optional list of `strings`. Defaults to `[]`.
List of URLs to debug targets, e.g.,
file:///foo/tfdbg_dump, grpc:://localhost:11011
gated_grpc: An optional `bool`. Defaults to `False`.
Whether this op will be gated. If any of the debug_urls of this
debug node is of the grpc:// scheme, when the value of this attribute is set
to True, the data will not actually be sent via the grpc stream unless this
debug op has been enabled at the debug_url. If all of the debug_urls of this
debug node are of the grpc:// scheme and the debug op is enabled at none of
them, the output will be an empty Tensor.
name: A name for the operation (optional).
Returns:
A `Tensor`. Has the same type as `input`.
Output tensor that equals the input tensor.
"""
_ctx = _context.context()
if not _ctx.executing_eagerly():
if device_name is None:
device_name = ""
device_name = _execute.make_str(device_name, "device_name")
if tensor_name is None:
tensor_name = ""
tensor_name = _execute.make_str(tensor_name, "tensor_name")
if debug_urls is None:
debug_urls = []
if not isinstance(debug_urls, (list, tuple)):
raise TypeError(
"Expected list for 'debug_urls' argument to "
"'debug_identity' Op, not %r." % debug_urls)
debug_urls = [_execute.make_str(_s, "debug_urls") for _s in debug_urls]
if gated_grpc is None:
gated_grpc = False
gated_grpc = _execute.make_bool(gated_grpc, "gated_grpc")
_, _, _op = _op_def_lib._apply_op_helper(
"DebugIdentity", input=input, device_name=device_name,
tensor_name=tensor_name, debug_urls=debug_urls, gated_grpc=gated_grpc,
name=name)
_result = _op.outputs[:]
_inputs_flat = _op.inputs
_attrs = ("T", _op.get_attr("T"), "device_name",
_op.get_attr("device_name"), "tensor_name",
_op.get_attr("tensor_name"), "debug_urls",
_op.get_attr("debug_urls"), "gated_grpc",
_op.get_attr("gated_grpc"))
_execute.record_gradient(
"DebugIdentity", _inputs_flat, _attrs, _result, name)
_result, = _result
return _result
else:
try:
_result = _pywrap_tensorflow.TFE_Py_FastPathExecute(
_ctx._handle, _ctx.device_name, "DebugIdentity", name,
_ctx._post_execution_callbacks, input, "device_name", device_name,
"tensor_name", tensor_name, "debug_urls", debug_urls, "gated_grpc",
gated_grpc)
return _result
except _core._FallbackException:
return debug_identity_eager_fallback(
input, device_name=device_name, tensor_name=tensor_name,
debug_urls=debug_urls, gated_grpc=gated_grpc, name=name)
except _core._NotOkStatusException as e:
if name is not None:
message = e.message + " name: " + name
else:
message = e.message
_six.raise_from(_core._status_to_exception(e.code, message), None)
def debug_identity_eager_fallback(input, device_name="", tensor_name="", debug_urls=[], gated_grpc=False, name=None):
r"""This is the slowpath function for Eager mode.
This is for function debug_identity
"""
_ctx = _context.context()
if device_name is None:
device_name = ""
device_name = _execute.make_str(device_name, "device_name")
if tensor_name is None:
tensor_name = ""
tensor_name = _execute.make_str(tensor_name, "tensor_name")
if debug_urls is None:
debug_urls = []
if not isinstance(debug_urls, (list, tuple)):
raise TypeError(
"Expected list for 'debug_urls' argument to "
"'debug_identity' Op, not %r." % debug_urls)
debug_urls = [_execute.make_str(_s, "debug_urls") for _s in debug_urls]
if gated_grpc is None:
gated_grpc = False
gated_grpc = _execute.make_bool(gated_grpc, "gated_grpc")
_attr_T, (input,) = _execute.args_to_matching_eager([input], _ctx)
_inputs_flat = [input]
_attrs = ("T", _attr_T, "device_name", device_name, "tensor_name",
tensor_name, "debug_urls", debug_urls, "gated_grpc", gated_grpc)
_result = _execute.execute(b"DebugIdentity", 1, inputs=_inputs_flat,
attrs=_attrs, ctx=_ctx, name=name)
_execute.record_gradient(
"DebugIdentity", _inputs_flat, _attrs, _result, name)
_result, = _result
return _result
@tf_export('debug_nan_count')
def debug_nan_count(input, device_name="", tensor_name="", debug_urls=[], gated_grpc=False, name=None):
  r"""Debug NaN Value Counter Op.
  Counts number of NaNs in the input tensor, for debugging.
  Args:
    input: A `Tensor`. Input tensor, non-Reference type.
    device_name: An optional `string`. Defaults to `""`.
    tensor_name: An optional `string`. Defaults to `""`.
      Name of the input tensor.
    debug_urls: An optional list of `strings`. Defaults to `[]`.
      List of URLs to debug targets, e.g.,
      file:///foo/tfdbg_dump, grpc://localhost:11011.
    gated_grpc: An optional `bool`. Defaults to `False`.
      Whether this op will be gated. If any of the debug_urls of this
      debug node is of the grpc:// scheme, when the value of this attribute is set
      to True, the data will not actually be sent via the grpc stream unless this
      debug op has been enabled at the debug_url. If all of the debug_urls of this
      debug node are of the grpc:// scheme and the debug op is enabled at none of
      them, the output will be an empty Tensor.
    name: A name for the operation (optional).
  Returns:
    A `Tensor` of type `int64`.
    An integer output tensor that is the number of NaNs in the input.
  """
  _ctx = _context.context()
  if not _ctx.executing_eagerly():
    if device_name is None:
      device_name = ""
    device_name = _execute.make_str(device_name, "device_name")
    if tensor_name is None:
      tensor_name = ""
    tensor_name = _execute.make_str(tensor_name, "tensor_name")
    if debug_urls is None:
      debug_urls = []
    if not isinstance(debug_urls, (list, tuple)):
      raise TypeError(
          "Expected list for 'debug_urls' argument to "
          "'debug_nan_count' Op, not %r." % debug_urls)
    debug_urls = [_execute.make_str(_s, "debug_urls") for _s in debug_urls]
    if gated_grpc is None:
      gated_grpc = False
    gated_grpc = _execute.make_bool(gated_grpc, "gated_grpc")
    _, _, _op = _op_def_lib._apply_op_helper(
        "DebugNanCount", input=input, device_name=device_name,
        tensor_name=tensor_name, debug_urls=debug_urls, gated_grpc=gated_grpc,
        name=name)
    _result = _op.outputs[:]
    _inputs_flat = _op.inputs
    _attrs = ("T", _op.get_attr("T"), "device_name",
              _op.get_attr("device_name"), "tensor_name",
              _op.get_attr("tensor_name"), "debug_urls",
              _op.get_attr("debug_urls"), "gated_grpc",
              _op.get_attr("gated_grpc"))
    _execute.record_gradient(
        "DebugNanCount", _inputs_flat, _attrs, _result, name)
    _result, = _result
    return _result
  else:
    try:
      _result = _pywrap_tensorflow.TFE_Py_FastPathExecute(
          _ctx._handle, _ctx.device_name, "DebugNanCount", name,
          _ctx._post_execution_callbacks, input, "device_name", device_name,
          "tensor_name", tensor_name, "debug_urls", debug_urls, "gated_grpc",
          gated_grpc)
      return _result
    except _core._FallbackException:
      return debug_nan_count_eager_fallback(
          input, device_name=device_name, tensor_name=tensor_name,
          debug_urls=debug_urls, gated_grpc=gated_grpc, name=name)
    except _core._NotOkStatusException as e:
      if name is not None:
        message = e.message + " name: " + name
      else:
        message = e.message
      _six.raise_from(_core._status_to_exception(e.code, message), None)
def debug_nan_count_eager_fallback(input, device_name="", tensor_name="", debug_urls=[], gated_grpc=False, name=None):
  r"""This is the slowpath function for Eager mode.
  This is for function debug_nan_count
  """
  _ctx = _context.context()
  if device_name is None:
    device_name = ""
  device_name = _execute.make_str(device_name, "device_name")
  if tensor_name is None:
    tensor_name = ""
  tensor_name = _execute.make_str(tensor_name, "tensor_name")
  if debug_urls is None:
    debug_urls = []
  if not isinstance(debug_urls, (list, tuple)):
    raise TypeError(
        "Expected list for 'debug_urls' argument to "
        "'debug_nan_count' Op, not %r." % debug_urls)
  debug_urls = [_execute.make_str(_s, "debug_urls") for _s in debug_urls]
  if gated_grpc is None:
    gated_grpc = False
  gated_grpc = _execute.make_bool(gated_grpc, "gated_grpc")
  _attr_T, (input,) = _execute.args_to_matching_eager([input], _ctx)
  _inputs_flat = [input]
  _attrs = ("T", _attr_T, "device_name", device_name, "tensor_name",
            tensor_name, "debug_urls", debug_urls, "gated_grpc", gated_grpc)
  _result = _execute.execute(b"DebugNanCount", 1, inputs=_inputs_flat,
                             attrs=_attrs, ctx=_ctx, name=name)
  _execute.record_gradient(
      "DebugNanCount", _inputs_flat, _attrs, _result, name)
  _result, = _result
  return _result
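# The wrappers above hand attributes to _execute.execute as one flat tuple that
# alternates attribute names and values. A minimal sketch of pairing such a
# tuple back up for inspection (the helper name is hypothetical and not part of
# this generated module):

```python
def attrs_to_dict(flat_attrs):
    # Even indices hold attribute names, odd indices hold their values.
    return dict(zip(flat_attrs[::2], flat_attrs[1::2]))

# Shape of the tuple built for DebugNanCount above (illustrative values).
attrs = ("T", "float32", "device_name", "", "tensor_name", "x",
         "debug_urls", [], "gated_grpc", False)
summary = attrs_to_dict(attrs)
# summary["tensor_name"] == "x"
```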
@tf_export('debug_numeric_summary')
def debug_numeric_summary(input, device_name="", tensor_name="", debug_urls=[], lower_bound=float('-inf'), upper_bound=float('inf'), mute_if_healthy=False, gated_grpc=False, name=None):
  r"""Debug Numeric Summary Op.
  Provide a basic summary of numeric value types, range and distribution.
  Args:
    input: A `Tensor`. Input tensor, non-Reference type, float or double.
    device_name: An optional `string`. Defaults to `""`.
    tensor_name: An optional `string`. Defaults to `""`.
      Name of the input tensor.
    debug_urls: An optional list of `strings`. Defaults to `[]`.
      List of URLs to debug targets, e.g.,
      file:///foo/tfdbg_dump, grpc://localhost:11011.
    lower_bound: An optional `float`. Defaults to `float('-inf')`.
      (float) The lower bound <= which values will be included in the
      generalized -inf count. Default: -inf.
    upper_bound: An optional `float`. Defaults to `float('inf')`.
      (float) The upper bound >= which values will be included in the
      generalized +inf count. Default: +inf.
    mute_if_healthy: An optional `bool`. Defaults to `False`.
      (bool) Do not send data to the debug URLs unless at least one
      of elements [2], [3] and [7] (i.e., the nan count and the generalized -inf and
      inf counts) is non-zero.
    gated_grpc: An optional `bool`. Defaults to `False`.
      Whether this op will be gated. If any of the debug_urls of this
      debug node is of the grpc:// scheme, when the value of this attribute is set
      to True, the data will not actually be sent via the grpc stream unless this
      debug op has been enabled at the debug_url. If all of the debug_urls of this
      debug node are of the grpc:// scheme and the debug op is enabled at none of
      them, the output will be an empty Tensor.
    name: A name for the operation (optional).
  Returns:
    A `Tensor` of type `float64`.
    A double tensor of shape [14 + nDimensions], where nDimensions is the
    number of dimensions of the tensor's shape. The elements of output are:
    [0]: is initialized (1.0) or not (0.0).
    [1]: total number of elements
    [2]: NaN element count
    [3]: generalized -inf count: elements <= lower_bound. lower_bound is -inf by
      default.
    [4]: negative element count (excluding -inf), if lower_bound is the default
      -inf. Otherwise, this is the count of elements > lower_bound and < 0.
    [5]: zero element count
    [6]: positive element count (excluding +inf), if upper_bound is the default
      +inf. Otherwise, this is the count of elements < upper_bound and > 0.
    [7]: generalized +inf count, elements >= upper_bound. upper_bound is +inf by
      default.
    Output elements [1:8] are all zero, if the tensor is uninitialized.
    [8]: minimum of all non-inf and non-NaN elements.
      If uninitialized or no such element exists: +inf.
    [9]: maximum of all non-inf and non-NaN elements.
      If uninitialized or no such element exists: -inf.
    [10]: mean of all non-inf and non-NaN elements.
      If uninitialized or no such element exists: NaN.
    [11]: variance of all non-inf and non-NaN elements.
      If uninitialized or no such element exists: NaN.
    [12]: Data type of the tensor encoded as an enum integer. See the DataType
      proto for more details.
    [13]: Number of dimensions of the tensor (ndims).
    [14+]: Sizes of the dimensions.
  """
  _ctx = _context.context()
  if not _ctx.executing_eagerly():
    if device_name is None:
      device_name = ""
    device_name = _execute.make_str(device_name, "device_name")
    if tensor_name is None:
      tensor_name = ""
    tensor_name = _execute.make_str(tensor_name, "tensor_name")
    if debug_urls is None:
      debug_urls = []
    if not isinstance(debug_urls, (list, tuple)):
      raise TypeError(
          "Expected list for 'debug_urls' argument to "
          "'debug_numeric_summary' Op, not %r." % debug_urls)
    debug_urls = [_execute.make_str(_s, "debug_urls") for _s in debug_urls]
    if lower_bound is None:
      lower_bound = float('-inf')
    lower_bound = _execute.make_float(lower_bound, "lower_bound")
    if upper_bound is None:
      upper_bound = float('inf')
    upper_bound = _execute.make_float(upper_bound, "upper_bound")
    if mute_if_healthy is None:
      mute_if_healthy = False
    mute_if_healthy = _execute.make_bool(mute_if_healthy, "mute_if_healthy")
    if gated_grpc is None:
      gated_grpc = False
    gated_grpc = _execute.make_bool(gated_grpc, "gated_grpc")
    _, _, _op = _op_def_lib._apply_op_helper(
        "DebugNumericSummary", input=input, device_name=device_name,
        tensor_name=tensor_name, debug_urls=debug_urls,
        lower_bound=lower_bound, upper_bound=upper_bound,
        mute_if_healthy=mute_if_healthy, gated_grpc=gated_grpc, name=name)
    _result = _op.outputs[:]
    _inputs_flat = _op.inputs
    _attrs = ("T", _op.get_attr("T"), "device_name",
              _op.get_attr("device_name"), "tensor_name",
              _op.get_attr("tensor_name"), "debug_urls",
              _op.get_attr("debug_urls"), "lower_bound",
              _op.get_attr("lower_bound"), "upper_bound",
              _op.get_attr("upper_bound"), "mute_if_healthy",
              _op.get_attr("mute_if_healthy"), "gated_grpc",
              _op.get_attr("gated_grpc"))
    _execute.record_gradient(
        "DebugNumericSummary", _inputs_flat, _attrs, _result, name)
    _result, = _result
    return _result
  else:
    try:
      _result = _pywrap_tensorflow.TFE_Py_FastPathExecute(
          _ctx._handle, _ctx.device_name, "DebugNumericSummary", name,
          _ctx._post_execution_callbacks, input, "device_name", device_name,
          "tensor_name", tensor_name, "debug_urls", debug_urls, "lower_bound",
          lower_bound, "upper_bound", upper_bound, "mute_if_healthy",
          mute_if_healthy, "gated_grpc", gated_grpc)
      return _result
    except _core._FallbackException:
      return debug_numeric_summary_eager_fallback(
          input, device_name=device_name, tensor_name=tensor_name,
          debug_urls=debug_urls, lower_bound=lower_bound,
          upper_bound=upper_bound, mute_if_healthy=mute_if_healthy,
          gated_grpc=gated_grpc, name=name)
    except _core._NotOkStatusException as e:
      if name is not None:
        message = e.message + " name: " + name
      else:
        message = e.message
      _six.raise_from(_core._status_to_exception(e.code, message), None)
def debug_numeric_summary_eager_fallback(input, device_name="", tensor_name="", debug_urls=[], lower_bound=float('-inf'), upper_bound=float('inf'), mute_if_healthy=False, gated_grpc=False, name=None):
  r"""This is the slowpath function for Eager mode.
  This is for function debug_numeric_summary
  """
  _ctx = _context.context()
  if device_name is None:
    device_name = ""
  device_name = _execute.make_str(device_name, "device_name")
  if tensor_name is None:
    tensor_name = ""
  tensor_name = _execute.make_str(tensor_name, "tensor_name")
  if debug_urls is None:
    debug_urls = []
  if not isinstance(debug_urls, (list, tuple)):
    raise TypeError(
        "Expected list for 'debug_urls' argument to "
        "'debug_numeric_summary' Op, not %r." % debug_urls)
  debug_urls = [_execute.make_str(_s, "debug_urls") for _s in debug_urls]
  if lower_bound is None:
    lower_bound = float('-inf')
  lower_bound = _execute.make_float(lower_bound, "lower_bound")
  if upper_bound is None:
    upper_bound = float('inf')
  upper_bound = _execute.make_float(upper_bound, "upper_bound")
  if mute_if_healthy is None:
    mute_if_healthy = False
  mute_if_healthy = _execute.make_bool(mute_if_healthy, "mute_if_healthy")
  if gated_grpc is None:
    gated_grpc = False
  gated_grpc = _execute.make_bool(gated_grpc, "gated_grpc")
  _attr_T, (input,) = _execute.args_to_matching_eager([input], _ctx)
  _inputs_flat = [input]
  _attrs = ("T", _attr_T, "device_name", device_name, "tensor_name",
            tensor_name, "debug_urls", debug_urls, "lower_bound", lower_bound,
            "upper_bound", upper_bound, "mute_if_healthy", mute_if_healthy,
            "gated_grpc", gated_grpc)
  _result = _execute.execute(b"DebugNumericSummary", 1, inputs=_inputs_flat,
                             attrs=_attrs, ctx=_ctx, name=name)
  _execute.record_gradient(
      "DebugNumericSummary", _inputs_flat, _attrs, _result, name)
  _result, = _result
  return _result
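# The DebugNumericSummary docstring specifies a fixed 14-element header
# followed by the tensor's dimension sizes. A minimal decoding sketch,
# assuming only that documented layout (the helper and field names below are
# hypothetical, not part of this generated module):

```python
_SUMMARY_FIELDS = (
    "is_initialized", "total_count", "nan_count", "neg_inf_count",
    "neg_count", "zero_count", "pos_count", "pos_inf_count",
    "min", "max", "mean", "variance", "dtype_enum", "ndims",
)

def decode_numeric_summary(vec):
    # First 14 entries are the fixed statistics [0]..[13]; the remainder
    # are the sizes of the tensor's dimensions ([14+]).
    stats = dict(zip(_SUMMARY_FIELDS, vec[:14]))
    stats["dims"] = list(vec[14:])
    return stats

# Illustrative summary vector for a 2x3 tensor with one NaN element.
example = [1.0, 6.0, 1.0, 0.0, 2.0, 1.0, 2.0, 0.0,
           -3.0, 5.0, 0.5, 7.9, 2.0, 2.0, 2.0, 3.0]
# decode_numeric_summary(example)["dims"] == [2.0, 3.0]
```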
def _InitOpDefLibrary(op_list_proto_bytes):
  op_list = _op_def_pb2.OpList()
  op_list.ParseFromString(op_list_proto_bytes)
  _op_def_registry.register_op_list(op_list)
  op_def_lib = _op_def_library.OpDefLibrary()
  op_def_lib.add_op_list(op_list)
  return op_def_lib
# op {
#   name: "Copy"
#   input_arg {
#     name: "input"
#     type_attr: "T"
#   }
#   output_arg {
#     name: "output"
#     type_attr: "T"
#   }
#   attr {
#     name: "T"
#     type: "type"
#   }
#   attr {
#     name: "tensor_name"
#     type: "string"
#     default_value {
#       s: ""
#     }
#   }
#   attr {
#     name: "debug_ops_spec"
#     type: "list(string)"
#     default_value {
#       list {
#       }
#     }
#   }
#   allows_uninitialized_input: true
# }
# op {
#   name: "CopyHost"
#   input_arg {
#     name: "input"
#     type_attr: "T"
#   }
#   output_arg {
#     name: "output"
#     type_attr: "T"
#   }
#   attr {
#     name: "T"
#     type: "type"
#   }
#   attr {
#     name: "tensor_name"
#     type: "string"
#     default_value {
#       s: ""
#     }
#   }
#   attr {
#     name: "debug_ops_spec"
#     type: "list(string)"
#     default_value {
#       list {
#       }
#     }
#   }
#   allows_uninitialized_input: true
# }
# op {
#   name: "DebugIdentity"
#   input_arg {
#     name: "input"
#     type_attr: "T"
#   }
#   output_arg {
#     name: "output"
#     type_attr: "T"
#   }
#   attr {
#     name: "T"
#     type: "type"
#   }
#   attr {
#     name: "device_name"
#     type: "string"
#     default_value {
#       s: ""
#     }
#   }
#   attr {
#     name: "tensor_name"
#     type: "string"
#     default_value {
#       s: ""
#     }
#   }
#   attr {
#     name: "debug_urls"
#     type: "list(string)"
#     default_value {
#       list {
#       }
#     }
#   }
#   attr {
#     name: "gated_grpc"
#     type: "bool"
#     default_value {
#       b: false
#     }
#   }
#   allows_uninitialized_input: true
# }
# op {
#   name: "DebugNanCount"
#   input_arg {
#     name: "input"
#     type_attr: "T"
#   }
#   output_arg {
#     name: "output"
#     type: DT_INT64
#   }
#   attr {
#     name: "T"
#     type: "type"
#   }
#   attr {
#     name: "device_name"
#     type: "string"
#     default_value {
#       s: ""
#     }
#   }
#   attr {
#     name: "tensor_name"
#     type: "string"
#     default_value {
#       s: ""
#     }
#   }
#   attr {
#     name: "debug_urls"
#     type: "list(string)"
#     default_value {
#       list {
#       }
#     }
#   }
#   attr {
#     name: "gated_grpc"
#     type: "bool"
#     default_value {
#       b: false
#     }
#   }
#   allows_uninitialized_input: true
# }
# op {
#   name: "DebugNumericSummary"
#   input_arg {
#     name: "input"
#     type_attr: "T"
#   }
#   output_arg {
#     name: "output"
#     type: DT_DOUBLE
#   }
#   attr {
#     name: "T"
#     type: "type"
#   }
#   attr {
#     name: "device_name"
#     type: "string"
#     default_value {
#       s: ""
#     }
#   }
#   attr {
#     name: "tensor_name"
#     type: "string"
#     default_value {
#       s: ""
#     }
#   }
#   attr {
#     name: "debug_urls"
#     type: "list(string)"
#     default_value {
#       list {
#       }
#     }
#   }
#   attr {
#     name: "lower_bound"
#     type: "float"
#     default_value {
#       f: -inf
#     }
#   }
#   attr {
#     name: "upper_bound"
#     type: "float"
#     default_value {
#       f: inf
#     }
#   }
#   attr {
#     name: "mute_if_healthy"
#     type: "bool"
#     default_value {
#       b: false
#     }
#   }
#   attr {
#     name: "gated_grpc"
#     type: "bool"
#     default_value {
#       b: false
#     }
#   }
#   allows_uninitialized_input: true
# }
_op_def_lib = _InitOpDefLibrary(b"\nl\n\004Copy\022\n\n\005input\"\001T\032\013\n\006output\"\001T\"\t\n\001T\022\004type\"\031\n\013tensor_name\022\006string\032\002\022\000\"\"\n\016debug_ops_spec\022\014list(string)\032\002\n\000\230\001\001\np\n\010CopyHost\022\n\n\005input\"\001T\032\013\n\006output\"\001T\"\t\n\001T\022\004type\"\031\n\013tensor_name\022\006string\032\002\022\000\"\"\n\016debug_ops_spec\022\014list(string)\032\002\n\000\230\001\001\n\244\001\n\rDebugIdentity\022\n\n\005input\"\001T\032\013\n\006output\"\001T\"\t\n\001T\022\004type\"\031\n\013device_name\022\006string\032\002\022\000\"\031\n\013tensor_name\022\006string\032\002\022\000\"\036\n\ndebug_urls\022\014list(string)\032\002\n\000\"\026\n\ngated_grpc\022\004bool\032\002(\000\230\001\001\n\243\001\n\rDebugNanCount\022\n\n\005input\"\001T\032\n\n\006output\030\t\"\t\n\001T\022\004type\"\031\n\013device_name\022\006string\032\002\022\000\"\031\n\013tensor_name\022\006string\032\002\022\000\"\036\n\ndebug_urls\022\014list(string)\032\002\n\000\"\026\n\ngated_grpc\022\004bool\032\002(\000\230\001\001\n\200\002\n\023DebugNumericSummary\022\n\n\005input\"\001T\032\n\n\006output\030\002\"\t\n\001T\022\004type\"\031\n\013device_name\022\006string\032\002\022\000\"\031\n\013tensor_name\022\006string\032\002\022\000\"\036\n\ndebug_urls\022\014list(string)\032\002\n\000\"\033\n\013lower_bound\022\005float\032\005%\000\000\200\377\"\033\n\013upper_bound\022\005float\032\005%\000\000\200\177\"\033\n\017mute_if_healthy\022\004bool\032\002(\000\"\026\n\ngated_grpc\022\004bool\032\002(\000\230\001\001")
import torch
ori = torch.load('/Users/ibobby/Dataset/model_weights/BasicVSR/v-bi.pth')
ori = ori['state_dict']
new = {}
for k in ori.keys():
    new[k.replace('generator.', '')] = ori[k]
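# The loop above only strips the 'generator.' prefix from each checkpoint key.
# On a toy state dict (toy keys, for illustration only) the same rename can be
# written as a dict comprehension:

```python
toy = {"generator.conv_first.weight": 0, "generator.conv_first.bias": 1}
renamed = {k.replace("generator.", ""): v for k, v in toy.items()}
# renamed == {"conv_first.weight": 0, "conv_first.bias": 1}
```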
# Rename
m = list()
u = list()
# PixelShufflePack
m += ["upsample1.main.0.weight", "upsample1.main.0.bias", "upsample2.main.0.weight", "upsample2.main.0.bias"]
u += ["upsample1.upsample_conv.weight", "upsample1.upsample_conv.bias", "upsample2.upsample_conv.weight", "upsample2.upsample_conv.bias"]
# ResidualBlock
m += [
    f"{direction}_resblocks.main.{block}.conv.{layer}.{param}"
    for direction in ("backward", "forward")
    for block in range(2, 32)
    for layer in (0, 2)
    for param in ("weight", "bias")
]
u += ["backward_resblocks.main.2.0.conv1.weight", "backward_resblocks.main.2.0.conv1.bias", "backward_resblocks.main.2.0.conv2.weight", "backward_resblocks.main.2.0.conv2.bias", "backward_resblocks.main.2.1.conv1.weight", "backward_resblocks.main.2.1.conv1.bias", "backward_resblocks.main.2.1.conv2.weight", "backward_resblocks.main.2.1.conv2.bias", "backward_resblocks.main.2.2.conv1.weight", "backward_resblocks.main.2.2.conv1.bias", "backward_resblocks.main.2.2.conv2.weight", "backward_resblocks.main.2.2.conv2.bias", "backward_resblocks.main.2.3.conv1.weight", "backward_resblocks.main.2.3.conv1.bias", "backward_resblocks.main.2.3.conv2.weight", "backward_resblocks.main.2.3.conv2.bias", "backward_resblocks.main.2.4.conv1.weight", "backward_resblocks.main.2.4.conv1.bias", "backward_resblocks.main.2.4.conv2.weight", "backward_resblocks.main.2.4.conv2.bias", "backward_resblocks.main.2.5.conv1.weight", "backward_resblocks.main.2.5.conv1.bias", "backward_resblocks.main.2.5.conv2.weight", "backward_resblocks.main.2.5.conv2.bias", "backward_resblocks.main.2.6.conv1.weight", "backward_resblocks.main.2.6.conv1.bias", "backward_resblocks.main.2.6.conv2.weight", "backward_resblocks.main.2.6.conv2.bias", "backward_resblocks.main.2.7.conv1.weight", "backward_resblocks.main.2.7.conv1.bias", "backward_resblocks.main.2.7.conv2.weight", "backward_resblocks.main.2.7.conv2.bias", "backward_resblocks.main.2.8.conv1.weight", "backward_resblocks.main.2.8.conv1.bias", "backward_resblocks.main.2.8.conv2.weight", "backward_resblocks.main.2.8.conv2.bias", "backward_resblocks.main.2.9.conv1.weight", "backward_resblocks.main.2.9.conv1.bias", "backward_resblocks.main.2.9.conv2.weight", "backward_resblocks.main.2.9.conv2.bias", "backward_resblocks.main.2.10.conv1.weight", "backward_resblocks.main.2.10.conv1.bias", "backward_resblocks.main.2.10.conv2.weight", "backward_resblocks.main.2.10.conv2.bias", "backward_resblocks.main.2.11.conv1.weight", "backward_resblocks.main.2.11.conv1.bias", 
"backward_resblocks.main.2.11.conv2.weight", "backward_resblocks.main.2.11.conv2.bias", "backward_resblocks.main.2.12.conv1.weight", "backward_resblocks.main.2.12.conv1.bias", "backward_resblocks.main.2.12.conv2.weight", "backward_resblocks.main.2.12.conv2.bias", "backward_resblocks.main.2.13.conv1.weight", "backward_resblocks.main.2.13.conv1.bias", "backward_resblocks.main.2.13.conv2.weight", "backward_resblocks.main.2.13.conv2.bias", "backward_resblocks.main.2.14.conv1.weight", "backward_resblocks.main.2.14.conv1.bias", "backward_resblocks.main.2.14.conv2.weight", "backward_resblocks.main.2.14.conv2.bias", "backward_resblocks.main.2.15.conv1.weight", "backward_resblocks.main.2.15.conv1.bias", "backward_resblocks.main.2.15.conv2.weight", "backward_resblocks.main.2.15.conv2.bias", "backward_resblocks.main.2.16.conv1.weight", "backward_resblocks.main.2.16.conv1.bias", "backward_resblocks.main.2.16.conv2.weight", "backward_resblocks.main.2.16.conv2.bias", "backward_resblocks.main.2.17.conv1.weight", "backward_resblocks.main.2.17.conv1.bias", "backward_resblocks.main.2.17.conv2.weight", "backward_resblocks.main.2.17.conv2.bias", "backward_resblocks.main.2.18.conv1.weight", "backward_resblocks.main.2.18.conv1.bias", "backward_resblocks.main.2.18.conv2.weight", "backward_resblocks.main.2.18.conv2.bias", "backward_resblocks.main.2.19.conv1.weight", "backward_resblocks.main.2.19.conv1.bias", "backward_resblocks.main.2.19.conv2.weight", "backward_resblocks.main.2.19.conv2.bias", "backward_resblocks.main.2.20.conv1.weight", "backward_resblocks.main.2.20.conv1.bias", "backward_resblocks.main.2.20.conv2.weight", "backward_resblocks.main.2.20.conv2.bias", "backward_resblocks.main.2.21.conv1.weight", "backward_resblocks.main.2.21.conv1.bias", "backward_resblocks.main.2.21.conv2.weight", "backward_resblocks.main.2.21.conv2.bias", "backward_resblocks.main.2.22.conv1.weight", "backward_resblocks.main.2.22.conv1.bias", "backward_resblocks.main.2.22.conv2.weight", 
"backward_resblocks.main.2.22.conv2.bias", "backward_resblocks.main.2.23.conv1.weight", "backward_resblocks.main.2.23.conv1.bias", "backward_resblocks.main.2.23.conv2.weight", "backward_resblocks.main.2.23.conv2.bias", "backward_resblocks.main.2.24.conv1.weight", "backward_resblocks.main.2.24.conv1.bias", "backward_resblocks.main.2.24.conv2.weight", "backward_resblocks.main.2.24.conv2.bias", "backward_resblocks.main.2.25.conv1.weight", "backward_resblocks.main.2.25.conv1.bias", "backward_resblocks.main.2.25.conv2.weight", "backward_resblocks.main.2.25.conv2.bias", "backward_resblocks.main.2.26.conv1.weight", "backward_resblocks.main.2.26.conv1.bias", "backward_resblocks.main.2.26.conv2.weight", "backward_resblocks.main.2.26.conv2.bias", "backward_resblocks.main.2.27.conv1.weight", "backward_resblocks.main.2.27.conv1.bias", "backward_resblocks.main.2.27.conv2.weight", "backward_resblocks.main.2.27.conv2.bias", "backward_resblocks.main.2.28.conv1.weight", "backward_resblocks.main.2.28.conv1.bias", "backward_resblocks.main.2.28.conv2.weight", "backward_resblocks.main.2.28.conv2.bias", "backward_resblocks.main.2.29.conv1.weight", "backward_resblocks.main.2.29.conv1.bias", "backward_resblocks.main.2.29.conv2.weight", "backward_resblocks.main.2.29.conv2.bias", "forward_resblocks.main.2.0.conv1.weight", "forward_resblocks.main.2.0.conv1.bias", "forward_resblocks.main.2.0.conv2.weight", "forward_resblocks.main.2.0.conv2.bias", "forward_resblocks.main.2.1.conv1.weight", "forward_resblocks.main.2.1.conv1.bias", "forward_resblocks.main.2.1.conv2.weight", "forward_resblocks.main.2.1.conv2.bias", "forward_resblocks.main.2.2.conv1.weight", "forward_resblocks.main.2.2.conv1.bias", "forward_resblocks.main.2.2.conv2.weight", "forward_resblocks.main.2.2.conv2.bias", "forward_resblocks.main.2.3.conv1.weight", "forward_resblocks.main.2.3.conv1.bias", "forward_resblocks.main.2.3.conv2.weight", "forward_resblocks.main.2.3.conv2.bias", "forward_resblocks.main.2.4.conv1.weight", 
"forward_resblocks.main.2.4.conv1.bias", "forward_resblocks.main.2.4.conv2.weight", "forward_resblocks.main.2.4.conv2.bias", "forward_resblocks.main.2.5.conv1.weight", "forward_resblocks.main.2.5.conv1.bias", "forward_resblocks.main.2.5.conv2.weight", "forward_resblocks.main.2.5.conv2.bias", "forward_resblocks.main.2.6.conv1.weight", "forward_resblocks.main.2.6.conv1.bias", "forward_resblocks.main.2.6.conv2.weight", "forward_resblocks.main.2.6.conv2.bias", "forward_resblocks.main.2.7.conv1.weight", "forward_resblocks.main.2.7.conv1.bias", "forward_resblocks.main.2.7.conv2.weight", "forward_resblocks.main.2.7.conv2.bias", "forward_resblocks.main.2.8.conv1.weight", "forward_resblocks.main.2.8.conv1.bias", "forward_resblocks.main.2.8.conv2.weight", "forward_resblocks.main.2.8.conv2.bias", "forward_resblocks.main.2.9.conv1.weight", "forward_resblocks.main.2.9.conv1.bias", "forward_resblocks.main.2.9.conv2.weight", "forward_resblocks.main.2.9.conv2.bias", "forward_resblocks.main.2.10.conv1.weight", "forward_resblocks.main.2.10.conv1.bias", "forward_resblocks.main.2.10.conv2.weight", "forward_resblocks.main.2.10.conv2.bias", "forward_resblocks.main.2.11.conv1.weight", "forward_resblocks.main.2.11.conv1.bias", "forward_resblocks.main.2.11.conv2.weight", "forward_resblocks.main.2.11.conv2.bias", "forward_resblocks.main.2.12.conv1.weight", "forward_resblocks.main.2.12.conv1.bias", "forward_resblocks.main.2.12.conv2.weight", "forward_resblocks.main.2.12.conv2.bias", "forward_resblocks.main.2.13.conv1.weight", "forward_resblocks.main.2.13.conv1.bias", "forward_resblocks.main.2.13.conv2.weight", "forward_resblocks.main.2.13.conv2.bias", "forward_resblocks.main.2.14.conv1.weight", "forward_resblocks.main.2.14.conv1.bias", "forward_resblocks.main.2.14.conv2.weight", "forward_resblocks.main.2.14.conv2.bias", "forward_resblocks.main.2.15.conv1.weight", "forward_resblocks.main.2.15.conv1.bias", "forward_resblocks.main.2.15.conv2.weight", "forward_resblocks.main.2.15.conv2.bias", 
"forward_resblocks.main.2.16.conv1.weight", "forward_resblocks.main.2.16.conv1.bias", "forward_resblocks.main.2.16.conv2.weight", "forward_resblocks.main.2.16.conv2.bias", "forward_resblocks.main.2.17.conv1.weight", "forward_resblocks.main.2.17.conv1.bias", "forward_resblocks.main.2.17.conv2.weight", "forward_resblocks.main.2.17.conv2.bias", "forward_resblocks.main.2.18.conv1.weight", "forward_resblocks.main.2.18.conv1.bias", "forward_resblocks.main.2.18.conv2.weight", "forward_resblocks.main.2.18.conv2.bias", "forward_resblocks.main.2.19.conv1.weight", "forward_resblocks.main.2.19.conv1.bias", "forward_resblocks.main.2.19.conv2.weight", "forward_resblocks.main.2.19.conv2.bias", "forward_resblocks.main.2.20.conv1.weight", "forward_resblocks.main.2.20.conv1.bias", "forward_resblocks.main.2.20.conv2.weight", "forward_resblocks.main.2.20.conv2.bias", "forward_resblocks.main.2.21.conv1.weight", "forward_resblocks.main.2.21.conv1.bias", "forward_resblocks.main.2.21.conv2.weight", "forward_resblocks.main.2.21.conv2.bias", "forward_resblocks.main.2.22.conv1.weight", "forward_resblocks.main.2.22.conv1.bias", "forward_resblocks.main.2.22.conv2.weight", "forward_resblocks.main.2.22.conv2.bias", "forward_resblocks.main.2.23.conv1.weight", "forward_resblocks.main.2.23.conv1.bias", "forward_resblocks.main.2.23.conv2.weight", "forward_resblocks.main.2.23.conv2.bias", "forward_resblocks.main.2.24.conv1.weight", "forward_resblocks.main.2.24.conv1.bias", "forward_resblocks.main.2.24.conv2.weight", "forward_resblocks.main.2.24.conv2.bias", "forward_resblocks.main.2.25.conv1.weight", "forward_resblocks.main.2.25.conv1.bias", "forward_resblocks.main.2.25.conv2.weight", "forward_resblocks.main.2.25.conv2.bias", "forward_resblocks.main.2.26.conv1.weight", "forward_resblocks.main.2.26.conv1.bias", "forward_resblocks.main.2.26.conv2.weight", "forward_resblocks.main.2.26.conv2.bias", "forward_resblocks.main.2.27.conv1.weight", "forward_resblocks.main.2.27.conv1.bias", 
"forward_resblocks.main.2.27.conv2.weight", "forward_resblocks.main.2.27.conv2.bias", "forward_resblocks.main.2.28.conv1.weight", "forward_resblocks.main.2.28.conv1.bias", "forward_resblocks.main.2.28.conv2.weight", "forward_resblocks.main.2.28.conv2.bias", "forward_resblocks.main.2.29.conv1.weight", "forward_resblocks.main.2.29.conv1.bias", "forward_resblocks.main.2.29.conv2.weight", "forward_resblocks.main.2.29.conv2.bias"]
# BasicVSRNet
m += ["upsample.0.weight", "upsample.0.bias", "upsample.2.main.0.weight", "upsample.2.main.0.bias", "upsample.4.main.0.weight", "upsample.4.main.0.bias", "upsample.6.weight", "upsample.6.bias", "upsample.8.weight", "upsample.8.bias"]
u += ["fusion.weight", "fusion.bias", "upsample1.main.0.weight", "upsample1.main.0.bias", "upsample2.main.0.weight", "upsample2.main.0.bias", "conv_hr.weight", "conv_hr.bias", "conv_last.weight", "conv_last.bias"]
for mk, uk in zip(m, u):
    new[mk] = new.pop(uk)
# Remove
u = list()
# SPyNet
u += ["spynet.mean", "spynet.std", "spynet.basic_module.0.basic_module.0.conv.weight", "spynet.basic_module.0.basic_module.0.conv.bias", "spynet.basic_module.0.basic_module.1.conv.weight", "spynet.basic_module.0.basic_module.1.conv.bias", "spynet.basic_module.0.basic_module.2.conv.weight", "spynet.basic_module.0.basic_module.2.conv.bias", "spynet.basic_module.0.basic_module.3.conv.weight", "spynet.basic_module.0.basic_module.3.conv.bias", "spynet.basic_module.0.basic_module.4.conv.weight", "spynet.basic_module.0.basic_module.4.conv.bias", "spynet.basic_module.1.basic_module.0.conv.weight", "spynet.basic_module.1.basic_module.0.conv.bias", "spynet.basic_module.1.basic_module.1.conv.weight", "spynet.basic_module.1.basic_module.1.conv.bias", "spynet.basic_module.1.basic_module.2.conv.weight", "spynet.basic_module.1.basic_module.2.conv.bias", "spynet.basic_module.1.basic_module.3.conv.weight", "spynet.basic_module.1.basic_module.3.conv.bias", "spynet.basic_module.1.basic_module.4.conv.weight", "spynet.basic_module.1.basic_module.4.conv.bias", "spynet.basic_module.2.basic_module.0.conv.weight", "spynet.basic_module.2.basic_module.0.conv.bias", "spynet.basic_module.2.basic_module.1.conv.weight", "spynet.basic_module.2.basic_module.1.conv.bias", "spynet.basic_module.2.basic_module.2.conv.weight", "spynet.basic_module.2.basic_module.2.conv.bias", "spynet.basic_module.2.basic_module.3.conv.weight", "spynet.basic_module.2.basic_module.3.conv.bias", "spynet.basic_module.2.basic_module.4.conv.weight", "spynet.basic_module.2.basic_module.4.conv.bias", "spynet.basic_module.3.basic_module.0.conv.weight", "spynet.basic_module.3.basic_module.0.conv.bias", "spynet.basic_module.3.basic_module.1.conv.weight", "spynet.basic_module.3.basic_module.1.conv.bias", "spynet.basic_module.3.basic_module.2.conv.weight", "spynet.basic_module.3.basic_module.2.conv.bias", "spynet.basic_module.3.basic_module.3.conv.weight", "spynet.basic_module.3.basic_module.3.conv.bias", 
"spynet.basic_module.3.basic_module.4.conv.weight", "spynet.basic_module.3.basic_module.4.conv.bias", "spynet.basic_module.4.basic_module.0.conv.weight", "spynet.basic_module.4.basic_module.0.conv.bias", "spynet.basic_module.4.basic_module.1.conv.weight", "spynet.basic_module.4.basic_module.1.conv.bias", "spynet.basic_module.4.basic_module.2.conv.weight", "spynet.basic_module.4.basic_module.2.conv.bias", "spynet.basic_module.4.basic_module.3.conv.weight", "spynet.basic_module.4.basic_module.3.conv.bias", "spynet.basic_module.4.basic_module.4.conv.weight", "spynet.basic_module.4.basic_module.4.conv.bias", "spynet.basic_module.5.basic_module.0.conv.weight", "spynet.basic_module.5.basic_module.0.conv.bias", "spynet.basic_module.5.basic_module.1.conv.weight", "spynet.basic_module.5.basic_module.1.conv.bias", "spynet.basic_module.5.basic_module.2.conv.weight", "spynet.basic_module.5.basic_module.2.conv.bias", "spynet.basic_module.5.basic_module.3.conv.weight", "spynet.basic_module.5.basic_module.3.conv.bias", "spynet.basic_module.5.basic_module.4.conv.weight", "spynet.basic_module.5.basic_module.4.conv.bias"]
for u_ in u:
    new.pop(u_)
# Save
# torch.save(new, '/Users/ibobby/Dataset/model_weights/BasicVSR/v-bi-fixed.pth')
print('model_fixed')
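The rename-and-prune logic above is plain dictionary surgery on the checkpoint's state dict. A minimal sketch with a toy dict (the keys and values here are illustrative stand-ins, not the real checkpoint's tensors):

```python
# Toy stand-in for a loaded checkpoint state_dict; values would be tensors.
old = {
    "fusion.weight": 1,   # upstream key to be renamed for the target model
    "spynet.mean": 2,     # key with no counterpart in the target model
}
rename = {"upsample.0.weight": "fusion.weight"}  # target key -> source key
remove = ["spynet.mean"]

new = dict(old)
for target_key, source_key in rename.items():
    new[target_key] = new.pop(source_key)  # move the value under the new name
for key in remove:
    new.pop(key, None)  # drop keys the target model does not define

print(sorted(new))  # ['upsample.0.weight']
```

The `zip(m, u)` loop in the script is exactly this target-to-source mapping, expressed as two parallel key lists.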
# File: redshells/data/__init__.py (repo: mski-iksm/redshells, license: MIT)
import redshells.data.data_frame_utils
from redshells.data.load_data_of_task import LoadDataOfTask
from redshells.data.load_existing_file import LoadExistingFile
# File: tensorflow_helpers/tensorflow_helpers/augmentation/__init__.py (repo: PatrickKalkman/python-package-creation, license: MIT)
from .cutmix_imagedatagenerator import CutMixImageDataGenerator
from .mixup_imagedatagenerator import MixupImageDataGenerator
# File: angr/procedures/definitions/win32_netsh.py (repo: r4b3rt/angr, license: BSD-2-Clause)
# pylint:disable=line-too-long
import logging
from ...sim_type import SimTypeFunction, SimTypeShort, SimTypeInt, SimTypeLong, SimTypeLongLong, SimTypeDouble, SimTypeFloat, SimTypePointer, SimTypeChar, SimStruct, SimTypeFixedSizeArray, SimTypeBottom, SimUnion, SimTypeBool
from ...calling_conventions import SimCCStdcall, SimCCMicrosoftAMD64
from .. import SIM_PROCEDURES as P
from . import SimLibrary
_l = logging.getLogger(name=__name__)
lib = SimLibrary()
lib.set_default_cc('X86', SimCCStdcall)
lib.set_default_cc('AMD64', SimCCMicrosoftAMD64)
lib.set_library_names("netsh.dll")
prototypes = \
{
#
'MatchEnumTag': SimTypeFunction([SimTypePointer(SimTypeInt(signed=True, label="Int"), label="IntPtr", offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimStruct({"pwszToken": SimTypePointer(SimTypeChar(label="Char"), offset=0), "dwValue": SimTypeInt(signed=False, label="UInt32")}, name="TOKEN_VALUE", pack=False, align=None), offset=0), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hModule", "pwcArg", "dwNumArg", "pEnumTable", "pdwValue"]),
#
'MatchToken': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=True, label="Int32"), arg_names=["pwszUserToken", "pwszCmdToken"]),
#
'PreprocessCommand': SimTypeFunction([SimTypePointer(SimTypeInt(signed=True, label="Int"), label="IntPtr", offset=0), SimTypePointer(SimTypePointer(SimTypeChar(label="Char"), offset=0), label="LPArray", offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimStruct({"pwszTag": SimTypePointer(SimTypeChar(label="Char"), offset=0), "dwRequired": SimTypeInt(signed=False, label="UInt32"), "bPresent": SimTypeInt(signed=True, label="Int32")}, name="TAG_TYPE", pack=False, align=None), label="LPArray", offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeInt(signed=False, label="UInt32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["hModule", "ppwcArguments", "dwCurrentIndex", "dwArgCount", "pttTags", "dwTagCount", "dwMinArgs", "dwMaxArgs", "pdwTagType"]),
#
'PrintError': SimTypeFunction([SimTypePointer(SimTypeInt(signed=True, label="Int"), label="IntPtr", offset=0), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hModule", "dwErrId"]),
#
'PrintMessageFromModule': SimTypeFunction([SimTypePointer(SimTypeInt(signed=True, label="Int"), label="IntPtr", offset=0), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["hModule", "dwMsgId"]),
#
'PrintMessage': SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["pwszFormat"]),
#
'RegisterContext': SimTypeFunction([SimTypePointer(SimStruct({"Anonymous": SimUnion({"Anonymous": SimStruct({"dwVersion": SimTypeInt(signed=False, label="UInt32"), "dwReserved": SimTypeInt(signed=False, label="UInt32")}, name="_Anonymous_e__Struct", pack=False, align=None), "_ullAlign": SimTypeLongLong(signed=False, label="UInt64")}, name="<anon>", label="None"), "pwszContext": SimTypePointer(SimTypeChar(label="Char"), offset=0), "guidHelper": SimTypeBottom(label="Guid"), "dwFlags": SimTypeInt(signed=False, label="UInt32"), "ulPriority": SimTypeInt(signed=False, label="UInt32"), "ulNumTopCmds": SimTypeInt(signed=False, label="UInt32"), "pTopCmds": SimTypePointer(SimStruct({"pwszCmdToken": SimTypePointer(SimTypeChar(label="Char"), offset=0), "pfnCmdHandler": SimTypePointer(SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypePointer(SimTypeChar(label="Char"), offset=0), label="LPArray", offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeBottom(label="Void"), offset=0), SimTypePointer(SimTypeInt(signed=True, label="Int32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["pwszMachine", "ppwcArguments", "dwCurrentIndex", "dwArgCount", "dwFlags", "pvData", "pbDone"]), offset=0), "dwShortCmdHelpToken": SimTypeInt(signed=False, label="UInt32"), "dwCmdHlpToken": SimTypeInt(signed=False, label="UInt32"), "dwFlags": SimTypeInt(signed=False, label="UInt32"), "pOsVersionCheck": SimTypePointer(SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=True, 
label="Int32"), arg_names=["CIMOSType", "CIMOSProductSuite", "CIMOSVersion", "CIMOSBuildNumber", "CIMServicePackMajorVersion", "CIMServicePackMinorVersion", "uiReserved", "dwReserved"]), offset=0)}, name="CMD_ENTRY", pack=False, align=None), offset=0), "ulNumGroups": SimTypeInt(signed=False, label="UInt32"), "pCmdGroups": SimTypePointer(SimStruct({"pwszCmdGroupToken": SimTypePointer(SimTypeChar(label="Char"), offset=0), "dwShortCmdHelpToken": SimTypeInt(signed=False, label="UInt32"), "ulCmdGroupSize": SimTypeInt(signed=False, label="UInt32"), "dwFlags": SimTypeInt(signed=False, label="UInt32"), "pCmdGroup": SimTypePointer(SimStruct({"pwszCmdToken": SimTypePointer(SimTypeChar(label="Char"), offset=0), "pfnCmdHandler": SimTypePointer(SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypePointer(SimTypeChar(label="Char"), offset=0), label="LPArray", offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeBottom(label="Void"), offset=0), SimTypePointer(SimTypeInt(signed=True, label="Int32"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["pwszMachine", "ppwcArguments", "dwCurrentIndex", "dwArgCount", "dwFlags", "pvData", "pbDone"]), offset=0), "dwShortCmdHelpToken": SimTypeInt(signed=False, label="UInt32"), "dwCmdHlpToken": SimTypeInt(signed=False, label="UInt32"), "dwFlags": SimTypeInt(signed=False, label="UInt32"), "pOsVersionCheck": SimTypePointer(SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=True, label="Int32"), arg_names=["CIMOSType", 
"CIMOSProductSuite", "CIMOSVersion", "CIMOSBuildNumber", "CIMServicePackMajorVersion", "CIMServicePackMinorVersion", "uiReserved", "dwReserved"]), offset=0)}, name="CMD_ENTRY", pack=False, align=None), offset=0), "pOsVersionCheck": SimTypePointer(SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=True, label="Int32"), arg_names=["CIMOSType", "CIMOSProductSuite", "CIMOSVersion", "CIMOSBuildNumber", "CIMServicePackMajorVersion", "CIMServicePackMinorVersion", "uiReserved", "dwReserved"]), offset=0)}, name="CMD_GROUP_ENTRY", pack=False, align=None), offset=0), "pfnCommitFn": SimTypePointer(SimTypeFunction([SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["dwAction"]), offset=0), "pfnDumpFn": SimTypePointer(SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypePointer(SimTypeChar(label="Char"), offset=0), label="LPArray", offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeBottom(label="Void"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["pwszRouter", "ppwcArguments", "dwArgCount", "pvData"]), offset=0), "pfnConnectFn": SimTypePointer(SimTypeFunction([SimTypePointer(SimTypeChar(label="Char"), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["pwszMachine"]), offset=0), "pReserved": SimTypePointer(SimTypeBottom(label="Void"), offset=0), "pfnOsVersionCheck": SimTypePointer(SimTypeFunction([SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32"), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), 
offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypePointer(SimTypeChar(label="Char"), offset=0), SimTypeInt(signed=False, label="UInt32"), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=True, label="Int32"), arg_names=["CIMOSType", "CIMOSProductSuite", "CIMOSVersion", "CIMOSBuildNumber", "CIMServicePackMajorVersion", "CIMServicePackMinorVersion", "uiReserved", "dwReserved"]), offset=0)}, name="NS_CONTEXT_ATTRIBUTES", pack=False, align=None), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["pChildContext"]),
#
'RegisterHelper': SimTypeFunction([SimTypePointer(SimTypeBottom(label="Guid"), offset=0), SimTypePointer(SimStruct({"Anonymous": SimUnion({"Anonymous": SimStruct({"dwVersion": SimTypeInt(signed=False, label="UInt32"), "dwReserved": SimTypeInt(signed=False, label="UInt32")}, name="_Anonymous_e__Struct", pack=False, align=None), "_ullAlign": SimTypeLongLong(signed=False, label="UInt64")}, name="<anon>", label="None"), "guidHelper": SimTypeBottom(label="Guid"), "pfnStart": SimTypePointer(SimTypeFunction([SimTypePointer(SimTypeBottom(label="Guid"), offset=0), SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["pguidParent", "dwVersion"]), offset=0), "pfnStop": SimTypePointer(SimTypeFunction([SimTypeInt(signed=False, label="UInt32")], SimTypeInt(signed=False, label="UInt32"), arg_names=["dwReserved"]), offset=0)}, name="NS_HELPER_ATTRIBUTES", pack=False, align=None), offset=0)], SimTypeInt(signed=False, label="UInt32"), arg_names=["pguidParentContext", "pfnRegisterSubContext"]),
}
lib.set_prototypes(prototypes)
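The registration above follows a simple registry pattern: each exported symbol name maps to a prototype carrying a return type and argument names, and the library object answers lookups by name. A stdlib-only sketch of that shape (the class and method names here are illustrative, not angr's actual API):

```python
from dataclasses import dataclass, field

@dataclass
class Prototype:
    return_type: str
    arg_names: list

@dataclass
class Library:
    prototypes: dict = field(default_factory=dict)

    def set_prototypes(self, protos):
        # Merge a name -> prototype mapping into the registry.
        self.prototypes.update(protos)

    def lookup(self, name):
        # Return the prototype for a symbol, or None if unregistered.
        return self.prototypes.get(name)

registry = Library()
registry.set_prototypes({
    "PrintError": Prototype("UInt32", ["hModule", "dwErrId"]),
})
print(registry.lookup("PrintError").arg_names)  # ['hModule', 'dwErrId']
```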
# File: Sample.py (repo: SachinSuryawanshi0909/New-Live-Project, license: MIT)
def sum(a, b):  # note: shadows the built-in sum()
    return a + b

def substract(a, b):
    return a - b

def Mul(a, b):
    return a * b
# File: EasyNN/batch/__init__.py (repo: danielwilczak101/EasyNN, license: MIT)
from EasyNN.batch.abc import Batch
from EasyNN.batch.mini import MiniBatch
# File: dopedefects/tests/test_dopedefects.py (repo: dopedefects/dopedefects, license: MIT)
import dopedefects as dd

def test_empty():
    return
# File: tsplot/__init__.py (repo: brett-hosking/tsplot, license: MIT)
from tsplot import time
from tsplot import plot
# from tsplot import data
# File: tests/examples/minlplib/ex14_1_2.py (repo: ouyang-w-19/decogo, license: MIT)
# NLP written by GAMS Convert at 04/21/18 13:51:43
#
# Equation counts
# Total E G L N X C B
# 10 2 0 8 0 0 0 0
#
# Variable counts
# x b i s1s s2s sc si
# Total cont binary integer sos1 sos2 scont sint
# 7 7 0 0 0 0 0 0
# FX 0 0 0 0 0 0 0 0
#
# Nonzero counts
# Total const NL DLL
# 43 17 26 0
#
# Reformulation has removed 1 variable and 1 equation
from pyomo.environ import *
model = m = ConcreteModel()
m.x1 = Var(within=Reals,bounds=(0.0001,100),initialize=0.0001)
m.x2 = Var(within=Reals,bounds=(0.0001,100),initialize=0.0001)
m.x3 = Var(within=Reals,bounds=(0.0001,100),initialize=0.0001)
m.x4 = Var(within=Reals,bounds=(0.0001,100),initialize=0.0001)
m.x5 = Var(within=Reals,bounds=(0.0001,100),initialize=0.0001)
m.x6 = Var(within=Reals,bounds=(None,None),initialize=0)
m.obj = Objective(expr= m.x6, sense=minimize)
m.c2 = Constraint(expr=m.x1*m.x2 + m.x1 - 3*m.x5 == 0)
m.c3 = Constraint(expr=2.8845e-6*m.x2**2 + 4.4975e-7*m.x2 + 2*m.x1*m.x2 + m.x1 + 0.000545176668613029*m.x2*m.x3 +
3.40735417883143e-5*m.x2*m.x4 + m.x3**2*m.x2 - 10*m.x5 - m.x6 <= 0)
m.c4 = Constraint(expr=(-2.8845e-6*m.x2**2) - 4.4975e-7*m.x2 - 2*m.x1*m.x2 - m.x1 - 0.000545176668613029*m.x2*m.x3 -
3.40735417883143e-5*m.x2*m.x4 - m.x3**2*m.x2 + 10*m.x5 - m.x6 <= 0)
m.c5 = Constraint(expr=0.386*m.x3**2 + 0.000410621754172864*m.x3 + 0.000545176668613029*m.x2*m.x3 + 2*m.x3**2*m.x2
- 8*m.x5 - m.x6 <= 0)
m.c6 = Constraint(expr=(-0.386*m.x3**2) - 0.000410621754172864*m.x3 - 0.000545176668613029*m.x2*m.x3 - 2*m.x3**2*m.x2
+ 8*m.x5 - m.x6 <= 0)
m.c7 = Constraint(expr=2*m.x4**2 + 3.40735417883143e-5*m.x2*m.x4 - 40*m.x5 - m.x6 <= 0)
m.c8 = Constraint(expr=(-2*m.x4**2) - 3.40735417883143e-5*m.x2*m.x4 + 40*m.x5 - m.x6 <= 0)
m.c9 = Constraint(expr=9.615e-7*m.x2**2 + 4.4975e-7*m.x2 + 0.193*m.x3**2 + 0.000410621754172864*m.x3 + m.x4**2 + m.x1*
m.x2 + m.x1 + 0.000545176668613029*m.x2*m.x3 + 3.40735417883143e-5*m.x2*m.x4 + m.x3**2*m.x2
- m.x6 <= 1)
m.c10 = Constraint(expr=(-9.615e-7*m.x2**2) - 4.4975e-7*m.x2 - 0.193*m.x3**2 - 0.000410621754172864*m.x3 - m.x4**2 -
m.x1*m.x2 - m.x1 - 0.000545176668613029*m.x2*m.x3 - 3.40735417883143e-5*m.x2*m.x4 - m.x3**2*m.x2
- m.x6 <= -1)
# File: test_runtime/test_nn_runtime/test_activations_runtime.py (repo: 792370706/Ivy, license: Apache-2.0)
"""
Collection of runtime tests for templated activation functions
"""
DIM = int(1e4)
# global
import os
import random
# local
import ivy.core.general as ivy_gen
import ivy.neural_net_functional.activations as ivy_act
this_file_dir = os.path.dirname(os.path.realpath(__file__))
import with_time_logs.ivy.neural_net.activations as ivy_act_w_time
from ivy import torch as _ivy_torch
from ivy import tensorflow as _ivy_tf
from ivy import mxnet as _ivy_mxnet
from ivy import jax as _ivy_jnp
from ivy import numpy as _ivy_np
from with_time_logs.ivy import torch as _ivy_torch_w_time
from with_time_logs.ivy import tensorflow as _ivy_tf_w_time
from with_time_logs.ivy import mxnet as _ivy_mxnet_w_time
from with_time_logs.ivy import jax as _ivy_jnp_w_time
from with_time_logs.ivy import numpy as _ivy_np_w_time
LIB_DICT = {_ivy_torch: _ivy_torch_w_time,
            _ivy_tf: _ivy_tf_w_time,
            _ivy_mxnet: _ivy_mxnet_w_time,
            _ivy_jnp: _ivy_jnp_w_time,
            _ivy_np: _ivy_np_w_time}
# local
import ivy_tests.helpers as helpers
from test_runtime.utils import append_to_file, log_time, write_times, TIMES_DICT
def test_relu():

    fname = os.path.join(this_file_dir, 'runtime_analysis/{}/activations/relu.txt'.format(DIM))
    if os.path.exists(fname):
        os.remove(fname)

    for lib, call in [(l, c) for l, c in helpers.calls if c not in [helpers.tf_graph_call, helpers.mx_graph_call]]:

        time_lib = LIB_DICT[lib]
        append_to_file(fname, '{}'.format(lib))

        x0 = ivy_gen.tensor([random.uniform(0, 1) for _ in range(DIM)], f=lib)

        # warm up both implementations once before timing
        ivy_act.relu(x0, f=lib)
        ivy_act_w_time.relu(x0, f=time_lib)
        TIMES_DICT.clear()

        for _ in range(100):

            log_time(fname, 'tb0')
            ivy_act_w_time.relu(x0, f=time_lib)
            log_time(fname, 'tb4', time_at_start=True)

            log_time(fname, 'tt0')
            ivy_act.relu(x0, f=lib)
            log_time(fname, 'tt1', time_at_start=True)

        write_times()

    append_to_file(fname, 'end of analysis')
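The tests above bracket each call with log_time markers and average over 100 repeats. A minimal self-contained sketch of the same measure-the-wrapped-call idea, using only the standard library (`time_call` is a hypothetical stand-in for the log_time/write_times harness, not part of it):

```python
import time

def time_call(fn, *args, repeats=100):
    # wall-clock the call `repeats` times and return the per-call
    # average in seconds, analogous to one tt0/tt1 bracket above
    start = time.perf_counter()
    for _ in range(repeats):
        fn(*args)
    return (time.perf_counter() - start) / repeats

avg = time_call(sum, range(1000))
```

Averaging over many repeats, after a warm-up call, is what keeps one-off costs (tracing, allocation) out of the reported per-call time.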
def test_leaky_relu():

    fname = os.path.join(this_file_dir, 'runtime_analysis/{}/activations/leaky_relu.txt'.format(DIM))
    if os.path.exists(fname):
        os.remove(fname)

    for lib, call in [(l, c) for l, c in helpers.calls if c not in [helpers.tf_graph_call, helpers.mx_graph_call]]:

        time_lib = LIB_DICT[lib]
        append_to_file(fname, '{}'.format(lib))

        x0 = ivy_gen.tensor([random.uniform(0, 1) for _ in range(DIM)], f=lib)

        ivy_act.leaky_relu(x0, f=lib)
        ivy_act_w_time.leaky_relu(x0, f=time_lib)
        TIMES_DICT.clear()

        for _ in range(100):

            log_time(fname, 'tb0')
            ivy_act_w_time.leaky_relu(x0, f=time_lib)
            log_time(fname, 'tb4', time_at_start=True)

            log_time(fname, 'tt0')
            ivy_act.leaky_relu(x0, f=lib)
            log_time(fname, 'tt1', time_at_start=True)

        write_times()

    append_to_file(fname, 'end of analysis')
def test_tanh():

    fname = os.path.join(this_file_dir, 'runtime_analysis/{}/activations/tanh.txt'.format(DIM))
    if os.path.exists(fname):
        os.remove(fname)

    for lib, call in [(l, c) for l, c in helpers.calls if c not in [helpers.tf_graph_call, helpers.mx_graph_call]]:

        time_lib = LIB_DICT[lib]
        append_to_file(fname, '{}'.format(lib))

        x0 = ivy_gen.tensor([random.uniform(0, 1) for _ in range(DIM)], f=lib)

        ivy_act.tanh(x0, f=lib)
        ivy_act_w_time.tanh(x0, f=time_lib)
        TIMES_DICT.clear()

        for _ in range(100):

            log_time(fname, 'tb0')
            ivy_act_w_time.tanh(x0, f=time_lib)
            log_time(fname, 'tb4', time_at_start=True)

            log_time(fname, 'tt0')
            ivy_act.tanh(x0, f=lib)
            log_time(fname, 'tt1', time_at_start=True)

        write_times()

    append_to_file(fname, 'end of analysis')
def test_sigmoid():

    fname = os.path.join(this_file_dir, 'runtime_analysis/{}/activations/sigmoid.txt'.format(DIM))
    if os.path.exists(fname):
        os.remove(fname)

    for lib, call in [(l, c) for l, c in helpers.calls if c not in [helpers.tf_graph_call, helpers.mx_graph_call]]:

        time_lib = LIB_DICT[lib]
        append_to_file(fname, '{}'.format(lib))

        x0 = ivy_gen.tensor([random.uniform(0, 1) for _ in range(DIM)], f=lib)

        ivy_act.sigmoid(x0, f=lib)
        ivy_act_w_time.sigmoid(x0, f=time_lib)
        TIMES_DICT.clear()

        for _ in range(100):

            log_time(fname, 'tb0')
            ivy_act_w_time.sigmoid(x0, f=time_lib)
            log_time(fname, 'tb4', time_at_start=True)

            log_time(fname, 'tt0')
            ivy_act.sigmoid(x0, f=lib)
            log_time(fname, 'tt1', time_at_start=True)

        write_times()

    append_to_file(fname, 'end of analysis')
def test_softmax():

    fname = os.path.join(this_file_dir, 'runtime_analysis/{}/activations/softmax.txt'.format(DIM))
    if os.path.exists(fname):
        os.remove(fname)

    for lib, call in [(l, c) for l, c in helpers.calls if c not in [helpers.tf_graph_call, helpers.mx_graph_call]]:

        time_lib = LIB_DICT[lib]
        append_to_file(fname, '{}'.format(lib))

        x0 = ivy_gen.tensor([random.uniform(0, 1) for _ in range(DIM)], f=lib)

        ivy_act.softmax(x0, f=lib)
        ivy_act_w_time.softmax(x0, f=time_lib)
        TIMES_DICT.clear()

        for _ in range(100):

            log_time(fname, 'tb0')
            ivy_act_w_time.softmax(x0, f=time_lib)
            log_time(fname, 'tb4', time_at_start=True)

            log_time(fname, 'tt0')
            ivy_act.softmax(x0, f=lib)
            log_time(fname, 'tt1', time_at_start=True)

        write_times()

    append_to_file(fname, 'end of analysis')
def test_softplus():

    fname = os.path.join(this_file_dir, 'runtime_analysis/{}/activations/softplus.txt'.format(DIM))
    if os.path.exists(fname):
        os.remove(fname)

    for lib, call in [(l, c) for l, c in helpers.calls if c not in [helpers.tf_graph_call, helpers.mx_graph_call]]:

        time_lib = LIB_DICT[lib]
        append_to_file(fname, '{}'.format(lib))

        x0 = ivy_gen.tensor([random.uniform(0, 1) for _ in range(DIM)], f=lib)

        ivy_act.softplus(x0, f=lib)
        ivy_act_w_time.softplus(x0, f=time_lib)
        TIMES_DICT.clear()

        for _ in range(100):

            log_time(fname, 'tb0')
            ivy_act_w_time.softplus(x0, f=time_lib)
            log_time(fname, 'tb4', time_at_start=True)

            log_time(fname, 'tt0')
            ivy_act.softplus(x0, f=lib)
            log_time(fname, 'tt1', time_at_start=True)

        write_times()

    append_to_file(fname, 'end of analysis')
# src/_dependencies/objects/data.py
# repo: dry-python/dependencies @ 1a8bba41ab42d0b5249b36471f5300d9faba81e7 (BSD-2-Clause)
from _dependencies.spec import _Spec
def _is_data(name, dependency):
    return True


def _build_data_spec(name, dependency):
    return _Spec(lambda: dependency, {}, set(), set(), lambda: "Scalar")
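To illustrate what `_build_data_spec` produces, here is a runnable sketch with a namedtuple standing in for `_Spec` (the real `_Spec` lives in `_dependencies.spec`; the field names below are illustrative assumptions, not its actual definition):

```python
from collections import namedtuple

# illustrative stand-in for _dependencies.spec._Spec
_Spec = namedtuple("_Spec", "factory args required optional kind")

def _build_data_spec(name, dependency):
    # a plain data value resolves via a zero-argument factory,
    # with no dependencies of its own
    return _Spec(lambda: dependency, {}, set(), set(), lambda: "Scalar")

spec = _build_data_spec("answer", 42)
```

The point of the zero-argument factory is that scalar data and injectable classes can then be resolved through one uniform call path.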
# tests/parser/range.8.test.py
# repo: veltri/DLV2 @ 944aaef803aa75e7ec51d7e0c2b0d964687fdd0e (Apache-2.0)
input = """
f(a).
%g(2..4).
%h(1..3).
f(b).
intersect(X) :- g(X), h(X).
"""
output = """
f(a).
%g(2..4).
%h(1..3).
f(b).
intersect(X) :- g(X), h(X).
"""
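These `input`/`output` programs use `%` line comments and ground facts. A small helper (hypothetical, not part of the DLV2 test driver) showing how the uncommented ground facts could be extracted from such a program string:

```python
import re

def ground_facts(program):
    # drop %-comments, then keep lines that are bare ground facts like "f(a)."
    # (rule lines such as "intersect(X) :- g(X), h(X)." are skipped)
    facts = []
    for line in program.splitlines():
        line = line.split("%")[0].strip()
        match = re.match(r"^([a-z]\w*\([^)]*\))\.$", line)
        if match:
            facts.append(match.group(1))
    return facts
```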
# test/utils/summarize_rollup_transform_test.py
# repo: umich-brcf-bioinf/Jacquard @ ff79fcb2ea67ff3c5e222ef1ed24083deb7c3f7c (Apache-2.0)
#pylint: disable=too-few-public-methods, invalid-name, line-too-long
#pylint: disable=too-many-instance-attributes, too-many-public-methods
from __future__ import print_function, absolute_import, division
import re
from jacquard.utils.utils import JQException
import jacquard.utils.utils as utils
import jacquard.variant_caller_transforms.common_tags as common_tags
import jacquard.variant_caller_transforms.mutect as mutect
import jacquard.utils.summarize_rollup_transform as summarize_caller
import jacquard.variant_caller_transforms.varscan as varscan
from jacquard.utils.vcf import VcfRecord
import test.utils.test_case as test_case
class CallersReportedListTagTestCase(test_case.JacquardBaseTestCase):

    def test_metaheader(self):
        split_metaheader = summarize_caller._CallersReportedListTag().metaheader.split("\n")
        self.assertEquals('##FORMAT=<ID={}CALLERS_REPORTED_LIST,Number=.,Type=String,Description="Comma-separated list variant callers which listed this variant in the Jacquard tagged VCF">'.format(summarize_caller.JQ_SUMMARY_TAG),
                          split_metaheader[0])

    def test_add_tag_values(self):
        line = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO|JQ_DP:{}{}:{}{}|X:1:1|Y:1:1\n".format(mutect.JQ_MUTECT_TAG, common_tags.CALLER_REPORTED, varscan.JQ_VARSCAN_TAG, common_tags.CALLER_REPORTED))
        processedVcfRecord = VcfRecord.parse_record(line, ["SA", "SB"])
        tag = summarize_caller._CallersReportedListTag()
        tag.add_tag_values(processedVcfRecord)
        expected = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO|JQ_DP:{}{}:{}{}:{}CALLERS_REPORTED_LIST|X:1:1:MT,VS|Y:1:1:MT,VS\n".format(mutect.JQ_MUTECT_TAG, common_tags.CALLER_REPORTED, varscan.JQ_VARSCAN_TAG, common_tags.CALLER_REPORTED, summarize_caller.JQ_SUMMARY_TAG))
        self.assertEquals(expected, processedVcfRecord.text())

    def test_add_tag_values_nullValues(self):
        line = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO|JQ_DP:{}{}:{}{}|X:1:.|Y:1:.\n".format(mutect.JQ_MUTECT_TAG, common_tags.CALLER_REPORTED, varscan.JQ_VARSCAN_TAG, common_tags.CALLER_REPORTED))
        processedVcfRecord = VcfRecord.parse_record(line, ["SA", "SB"])
        tag = summarize_caller._CallersReportedListTag()
        tag.add_tag_values(processedVcfRecord)
        expected = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO|JQ_DP:{}{}:{}{}:{}CALLERS_REPORTED_LIST|X:1:.:MT|Y:1:.:MT\n".format(mutect.JQ_MUTECT_TAG, common_tags.CALLER_REPORTED, varscan.JQ_VARSCAN_TAG, common_tags.CALLER_REPORTED, summarize_caller.JQ_SUMMARY_TAG))
        self.assertEquals(expected, processedVcfRecord.text())
class CallersReportedTagTestCase(test_case.JacquardBaseTestCase):

    def test_metaheader(self):
        split_metaheader = summarize_caller._CallersReportedTag().metaheader.split("\n")
        self.assertEquals('##FORMAT=<ID={}{},Number=1,Type=Integer,Description="Count of variant callers which listed this variant in the Jacquard tagged VCF">'.format(summarize_caller.JQ_SUMMARY_TAG, summarize_caller.JQ_REPORTED),
                          split_metaheader[0])

    def test_add_tag_values(self):
        line = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO|JQ_DP:{}{}:{}{}|X:1:1|Y:1:1\n".format(mutect.JQ_MUTECT_TAG, common_tags.CALLER_REPORTED, varscan.JQ_VARSCAN_TAG, common_tags.CALLER_REPORTED))
        processedVcfRecord = VcfRecord.parse_record(line, ["SA", "SB"])
        tag = summarize_caller._CallersReportedTag()
        tag.add_tag_values(processedVcfRecord)
        expected = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO|JQ_DP:{}{}:{}{}:{}{}|X:1:1:2|Y:1:1:2\n".format(mutect.JQ_MUTECT_TAG, common_tags.CALLER_REPORTED, varscan.JQ_VARSCAN_TAG, common_tags.CALLER_REPORTED, summarize_caller.JQ_SUMMARY_TAG, summarize_caller.JQ_REPORTED))
        self.assertEquals(expected, processedVcfRecord.text())

    def test_add_tag_values_nullValues(self):
        line = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO|JQ_DP:{}{}:{}{}|X:.:.|Y:.:.\n".format(mutect.JQ_MUTECT_TAG, common_tags.CALLER_REPORTED, varscan.JQ_VARSCAN_TAG, common_tags.CALLER_REPORTED))
        processedVcfRecord = VcfRecord.parse_record(line, ["SA", "SB"])
        tag = summarize_caller._CallersReportedTag()
        tag.add_tag_values(processedVcfRecord)
        expected = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO|JQ_DP:{}{}:{}{}:{}{}|X:.:.:0|Y:.:.:0\n".format(mutect.JQ_MUTECT_TAG, common_tags.CALLER_REPORTED, varscan.JQ_VARSCAN_TAG, common_tags.CALLER_REPORTED, summarize_caller.JQ_SUMMARY_TAG, summarize_caller.JQ_REPORTED))
        self.assertEquals(expected, processedVcfRecord.text())
class CallersPassedListTagTestCase(test_case.JacquardBaseTestCase):

    def test_metaheader(self):
        split_metaheader = summarize_caller._CallersPassedListTag().metaheader.split("\n")
        self.assertEquals('##FORMAT=<ID={}CALLERS_PASSED_LIST,Number=.,Type=String,Description="Comma-separated list of variant caller short-names where FILTER = PASS for this variant in the Jacquard tagged VCF">'.format(summarize_caller.JQ_SUMMARY_TAG),
                          split_metaheader[0])

    def test_add_tag_values(self):
        line = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO|JQ_DP:{}{}:{}{}|X:1:1|Y:1:0\n".format(mutect.JQ_MUTECT_TAG, common_tags.CALLER_PASSED, varscan.JQ_VARSCAN_TAG, common_tags.CALLER_PASSED))
        processedVcfRecord = VcfRecord.parse_record(line, ["SA", "SB"])
        tag = summarize_caller._CallersPassedListTag()
        tag.add_tag_values(processedVcfRecord)
        expected = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO|JQ_DP:{}{}:{}{}:{}CALLERS_PASSED_LIST|X:1:1:MT,VS|Y:1:0:MT\n".format(mutect.JQ_MUTECT_TAG, common_tags.CALLER_PASSED, varscan.JQ_VARSCAN_TAG, common_tags.CALLER_PASSED, summarize_caller.JQ_SUMMARY_TAG))
        self.assertEquals(expected, processedVcfRecord.text())

    def test_add_tag_values_NoCallersPassed(self):
        line = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO|JQ_DP:{}{}:{}{}|X:0:0|Y:0:0\n".format(mutect.JQ_MUTECT_TAG, common_tags.CALLER_PASSED, varscan.JQ_VARSCAN_TAG, common_tags.CALLER_PASSED))
        processedVcfRecord = VcfRecord.parse_record(line, ["SA", "SB"])
        tag = summarize_caller._CallersPassedListTag()
        tag.add_tag_values(processedVcfRecord)
        expected = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO|JQ_DP:{}{}:{}{}:{}CALLERS_PASSED_LIST|X:0:0:.|Y:0:0:.\n".format(mutect.JQ_MUTECT_TAG, common_tags.CALLER_PASSED, varscan.JQ_VARSCAN_TAG, common_tags.CALLER_PASSED, summarize_caller.JQ_SUMMARY_TAG))
        self.assertEquals(expected, processedVcfRecord.text())

    def test_add_tag_values_nullValues(self):
        line = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO|JQ_DP:{}{}:{}{}|X:1:.|Y:1:.\n".format(mutect.JQ_MUTECT_TAG, common_tags.CALLER_PASSED, varscan.JQ_VARSCAN_TAG, common_tags.CALLER_PASSED))
        processedVcfRecord = VcfRecord.parse_record(line, ["SA", "SB"])
        tag = summarize_caller._CallersPassedListTag()
        tag.add_tag_values(processedVcfRecord)
        expected = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO|JQ_DP:{}{}:{}{}:{}CALLERS_PASSED_LIST|X:1:.:MT|Y:1:.:MT\n".format(mutect.JQ_MUTECT_TAG, common_tags.CALLER_PASSED, varscan.JQ_VARSCAN_TAG, common_tags.CALLER_PASSED, summarize_caller.JQ_SUMMARY_TAG))
        self.assertEquals(expected, processedVcfRecord.text())
class CallersPassedTagTestCase(test_case.JacquardBaseTestCase):

    def test_metaheader(self):
        split_metaheader = summarize_caller._CallersPassedTag().metaheader.split("\n")
        self.assertEquals('##FORMAT=<ID={}{},Number=1,Type=Integer,Description="Count of variant callers where FILTER = PASS for this variant in the Jacquard tagged VCF">'.format(summarize_caller.JQ_SUMMARY_TAG, summarize_caller.JQ_PASSED),
                          split_metaheader[0])

    def test_add_tag_values(self):
        line = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO|JQ_DP:{}{}:{}{}|X:1:1|Y:1:0\n".format(mutect.JQ_MUTECT_TAG, common_tags.CALLER_PASSED, varscan.JQ_VARSCAN_TAG, common_tags.CALLER_PASSED))
        processedVcfRecord = VcfRecord.parse_record(line, ["SA", "SB"])
        tag = summarize_caller._CallersPassedTag()
        tag.add_tag_values(processedVcfRecord)
        expected = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO|JQ_DP:{}{}:{}{}:{}{}|X:1:1:2|Y:1:0:1\n".format(mutect.JQ_MUTECT_TAG, common_tags.CALLER_PASSED, varscan.JQ_VARSCAN_TAG, common_tags.CALLER_PASSED, summarize_caller.JQ_SUMMARY_TAG, summarize_caller.JQ_PASSED))
        self.assertEquals(expected, processedVcfRecord.text())

    def test_add_tag_values_nullValues(self):
        line = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO|JQ_DP:{}{}:{}{}|X:.:.|Y:.:.\n".format(mutect.JQ_MUTECT_TAG, common_tags.CALLER_PASSED, varscan.JQ_VARSCAN_TAG, common_tags.CALLER_PASSED))
        processedVcfRecord = VcfRecord.parse_record(line, ["SA", "SB"])
        tag = summarize_caller._CallersPassedTag()
        tag.add_tag_values(processedVcfRecord)
        expected = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO|JQ_DP:{}{}:{}{}:{}{}|X:.:.:0|Y:.:.:0\n".format(mutect.JQ_MUTECT_TAG, common_tags.CALLER_PASSED, varscan.JQ_VARSCAN_TAG, common_tags.CALLER_PASSED, summarize_caller.JQ_SUMMARY_TAG, summarize_caller.JQ_PASSED))
        self.assertEquals(expected, processedVcfRecord.text())
class SamplesReportedTestCase(test_case.JacquardBaseTestCase):

    def test_metaheader(self):
        split_metaheader = summarize_caller._SamplesReported().metaheader.split("\n")
        self.assertEquals('##INFO=<ID={}{},Number=1,Type=Integer,Description="Count of samples where this variant appeared in any of the Jacquard tagged VCFs (regardless of quality/filtering)">'.format(summarize_caller.JQ_SUMMARY_TAG, summarize_caller.JQ_SAMPLES_REPORTED),
                          split_metaheader[0])

    def test_add_tag_values(self):
        line = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO|JQ_DP:{}{}|X:2|Y:1\n".format(summarize_caller.JQ_SUMMARY_TAG, summarize_caller.JQ_REPORTED))
        processedVcfRecord = VcfRecord.parse_record(line, ["SA", "SB"])
        tag = summarize_caller._SamplesReported()
        tag.add_tag_values(processedVcfRecord)
        expected = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO;{}{}=2|JQ_DP:{}{}|X:2|Y:1\n".format(summarize_caller.JQ_SUMMARY_TAG, summarize_caller.JQ_SAMPLES_REPORTED, summarize_caller.JQ_SUMMARY_TAG, summarize_caller.JQ_REPORTED))
        self.assertEquals(expected, processedVcfRecord.text())

    def test_add_tag_values_nullValues(self):
        line = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO|JQ_DP:{}{}|X:.|Y:.\n".format(summarize_caller.JQ_SUMMARY_TAG, summarize_caller.JQ_REPORTED))
        processedVcfRecord = VcfRecord.parse_record(line, ["SA", "SB"])
        tag = summarize_caller._SamplesReported()
        tag.add_tag_values(processedVcfRecord)
        expected = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO;{}{}=0|JQ_DP:{}{}|X:.|Y:.\n".format(summarize_caller.JQ_SUMMARY_TAG, summarize_caller.JQ_SAMPLES_REPORTED, summarize_caller.JQ_SUMMARY_TAG, summarize_caller.JQ_REPORTED))
        self.assertEquals(expected, processedVcfRecord.text())
class SamplesPassedTestCase(test_case.JacquardBaseTestCase):

    def test_metaheader(self):
        split_metaheader = summarize_caller._SamplesPassed().metaheader.split("\n")
        self.assertEquals('##INFO=<ID={}{},Number=1,Type=Integer,Description="Count of samples where a variant caller passed the filter in any of the Jacquard tagged VCFs">'.format(summarize_caller.JQ_SUMMARY_TAG, summarize_caller.JQ_SAMPLES_PASSED),
                          split_metaheader[0])

    def test_add_tag_values_onePassed(self):
        line = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO|JQ_DP:{}{}|X:2|Y:0\n".format(summarize_caller.JQ_SUMMARY_TAG, summarize_caller.JQ_PASSED))
        processedVcfRecord = VcfRecord.parse_record(line, ["SA", "SB"])
        tag = summarize_caller._SamplesPassed()
        tag.add_tag_values(processedVcfRecord)
        expected = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO;{}{}=1|JQ_DP:{}{}|X:2|Y:0\n".format(summarize_caller.JQ_SUMMARY_TAG, summarize_caller.JQ_SAMPLES_PASSED, summarize_caller.JQ_SUMMARY_TAG, summarize_caller.JQ_PASSED))
        self.assertEquals(expected, processedVcfRecord.text())

    def test_add_tag_values_nonePassed(self):
        line = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO|JQ_DP:{}{}|X:0|Y:0\n".format(summarize_caller.JQ_SUMMARY_TAG, summarize_caller.JQ_PASSED))
        vcf_record = VcfRecord.parse_record(line, ["SA", "SB"])
        tag = summarize_caller._SamplesPassed()
        tag.add_tag_values(vcf_record)
        info_tag = summarize_caller.JQ_SUMMARY_TAG + summarize_caller.JQ_SAMPLES_PASSED
        self.assertIn(info_tag, vcf_record.info_dict)
        self.assertEquals("0", vcf_record.info_dict[info_tag])

    def test_add_tag_values_nullValues(self):
        line = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO|JQ_DP:{}{}|X:.|Y:.\n".format(summarize_caller.JQ_SUMMARY_TAG, summarize_caller.JQ_PASSED))
        processedVcfRecord = VcfRecord.parse_record(line, ["SA", "SB"])
        tag = summarize_caller._SamplesPassed()
        tag.add_tag_values(processedVcfRecord)
        expected = self.entab("CHROM|POS|ID|REF|ALT|QUAL|FILTER|INFO;{}{}=0|JQ_DP:{}{}|X:.|Y:.\n".format(summarize_caller.JQ_SUMMARY_TAG, summarize_caller.JQ_SAMPLES_PASSED, summarize_caller.JQ_SUMMARY_TAG, summarize_caller.JQ_PASSED))
        self.assertEquals(expected, processedVcfRecord.text())
class HCGenotypeTagTestCase(test_case.JacquardBaseTestCase):
    _TAG_ID = "{}HC_GT".format(summarize_caller.JQ_SUMMARY_TAG)

    def test_metaheader(self):
        self.assertEquals('##FORMAT=<ID={}HC_GT,Number=1,Type=String,Description="High confidence consensus genotype (inferred from JQ_*_GT and JQ_*_CALLER_PASSED). Majority rules; ties go to the least unusual variant (0/1>0/2>1/1). Variants which failed their filter are ignored. Phasing is removed.">'.format(summarize_caller.JQ_SUMMARY_TAG),
                          summarize_caller._HCGenotypeTag().metaheader)

    def test_add_tag_values(self):
        tag = summarize_caller._HCGenotypeTag()
        sample_tag_values = {"SA": {"JQ_foo_GT": "0/1", "JQ_bar_GT": "0/1", "JQ_baz_GT": "."},
                             "SB": {"JQ_foo_GT": "0/0", "JQ_bar_GT": "0/0", "JQ_baz_GT": "0/1"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("0/1", record.sample_tag_values["SA"][HCGenotypeTagTestCase._TAG_ID])
        self.assertEquals("0/0", record.sample_tag_values["SB"][HCGenotypeTagTestCase._TAG_ID])

    def test_add_tag_values_allNulls(self):
        tag = summarize_caller._HCGenotypeTag()
        sample_tag_values = {"SA": {"JQ_foo_GT": ".", "JQ_bar_GT": ".", "JQ_baz_GT": "."}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals(".", record.sample_tag_values["SA"][HCGenotypeTagTestCase._TAG_ID])

    def test_add_tag_values_onlyCountGTTags(self):
        tag = summarize_caller._HCGenotypeTag()
        sample_tag_values = {"SA": {"JQ_foo_XX": "0/1", "JQ_bar_GT": ".", "JQ_baz_GT": "0/0"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("0/0", record.sample_tag_values["SA"][HCGenotypeTagTestCase._TAG_ID])

    def test_add_tag_values_missingGTTags(self):
        tag = summarize_caller._HCGenotypeTag()
        sample_tag_values = {"SA": {"JQ_foo_XX": "0/1"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals(".", record.sample_tag_values["SA"][HCGenotypeTagTestCase._TAG_ID])

    def test_prioritize_genotype(self):
        tag = summarize_caller._HCGenotypeTag()
        self.assertEquals("0/1", tag._prioritize_genotype(["0/1"]))
        self.assertEquals("0/1", tag._prioritize_genotype(["0/0", "0/1"]))
        self.assertEquals("0/0", tag._prioritize_genotype(["0/0", "0/0", "1/1"]))
        self.assertEquals("1/1", tag._prioritize_genotype(["1/1", "1/1", "0/1"]))
        self.assertEquals("0/1", tag._prioritize_genotype(["1/1", "0/1", "0/1"]))
        self.assertEquals("0/1", tag._prioritize_genotype(["0/1", "0/2", "1/1", "1/2", "2/2"]))

    def test_prioritize_genotype_dephases(self):
        tag = summarize_caller._HCGenotypeTag()
        self.assertEquals("0/1", tag._prioritize_genotype(["0|1"]))
        self.assertEquals("0/1", tag._prioritize_genotype(["1|0"]))
        self.assertEquals("1/3", tag._prioritize_genotype(["3|1"]))
        self.assertEquals("0/0", tag._prioritize_genotype(["0|0", "0/0"]))
        self.assertEquals("0/1", tag._prioritize_genotype(["0|0", "0|1", "1|0"]))
        self.assertEquals("1/1", tag._prioritize_genotype(["0|0", "1|1"]))
        self.assertEquals("1/1", tag._prioritize_genotype(["1|1", "1|1", "0|1"]))
        self.assertEquals("0/1", tag._prioritize_genotype(["1|1", "0|1", "1|0"]))
        self.assertEquals("0/1", tag._prioritize_genotype(["0|1", "0|2"]))
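The assertions above pin down the consensus-genotype rules: strip phasing and sort the alleles, take the majority genotype, and break ties toward the least unusual call (0/1 > 0/2 > 1/1, with 0/0 losing every tie). A self-contained sketch of that logic (a reconstruction consistent with these assertions, not Jacquard's actual `_prioritize_genotype` implementation):

```python
from collections import Counter

def dephase(gt):
    # "1|0" -> "0/1": drop phasing and sort the alleles numerically
    return "/".join(sorted(gt.replace("|", "/").split("/"), key=int))

def unusualness(gt):
    # smaller tuple = preferred in a tie: 0/1 < 0/2 < 1/1 < 1/2 < 2/2 < 0/0
    a, b = sorted(int(x) for x in gt.split("/"))
    if a == 0 and b == 0:
        return (2, 0, 0)  # hom-ref loses every tie
    return ((0 if a == 0 else 1), b, a)

def prioritize_genotype(genotypes):
    # majority rules; unusualness() only matters when counts are tied
    counts = Counter(dephase(gt) for gt in genotypes)
    return min(counts, key=lambda gt: (-counts[gt],) + unusualness(gt))
```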
class AlleleFreqRangeTagTestCase(test_case.JacquardBaseTestCase):
    _TAG_ID = "{}AF_RANGE".format(summarize_caller.JQ_SUMMARY_TAG)

    def test_metaheader(self):
        split_meta_header = summarize_caller._AlleleFreqRangeTag().metaheader.split("\n")
        self.assertEqual('##FORMAT=<ID={0}AF_RANGE,Number=1,Type=Float,Description="Max(allele frequency) - min (allele frequency) across recognized callers.">'.format(summarize_caller.JQ_SUMMARY_TAG),
                         split_meta_header[0])

    def test_add_tag_values(self):
        tag = summarize_caller._AlleleFreqRangeTag()
        sample_tag_values = {"SA": {"JQ_foo_AF": "0", "JQ_bar_AF": "0.1", "JQ_baz_AF": "0.2"},
                             "SB": {"JQ_foo_AF": "0.2", "JQ_bar_AF": "0.3", "JQ_baz_AF": "0.4"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("0.2", record.sample_tag_values["SA"][AlleleFreqRangeTagTestCase._TAG_ID])
        self.assertEquals("0.2", record.sample_tag_values["SB"][AlleleFreqRangeTagTestCase._TAG_ID])

    def test_add_tag_values_nullsDoNotCount(self):
        tag = summarize_caller._AlleleFreqRangeTag()
        sample_tag_values = {"SA": {"JQ_foo_AF": ".", "JQ_bar_AF": "0.1", "JQ_baz_AF": "0.2"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("0.1", record.sample_tag_values["SA"][AlleleFreqRangeTagTestCase._TAG_ID])

    def test_add_tag_values_oneValueReturnsNull(self):
        tag = summarize_caller._AlleleFreqRangeTag()
        sample_tag_values = {"SA": {"JQ_foo_AF": "."}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals(".", record.sample_tag_values["SA"][AlleleFreqRangeTagTestCase._TAG_ID])

    def test_add_tag_values_OnlyCountAFTags(self):
        tag = summarize_caller._AlleleFreqRangeTag()
        sample_tag_values = {"SA": {"JQ_foo_XX": "0", "JQ_bar_AF": "0.1", "JQ_baz_AF": "0.2"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("0.1", record.sample_tag_values["SA"][AlleleFreqRangeTagTestCase._TAG_ID])

    def test_add_tag_values_allNulls(self):
        tag = summarize_caller._AlleleFreqRangeTag()
        sample_tag_values = {"SA": {"JQ_foo_AF": ".", "JQ_bar_AF": ".", "JQ_baz_AF": "."}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals(".", record.sample_tag_values["SA"][AlleleFreqRangeTagTestCase._TAG_ID])

    def test_add_tag_values_missingTag(self):
        tag = summarize_caller._AlleleFreqRangeTag()
        sample_tag_values = {"SA": {"JQ_foo_XX": "."}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals(".", record.sample_tag_values["SA"][AlleleFreqRangeTagTestCase._TAG_ID])

    def test_add_tag_values_multAlts(self):
        tag = summarize_caller._AlleleFreqRangeTag()
        sample_tag_values = {"SA": {"JQ_foo_AF": "0,0.1", "JQ_bar_AF": "0.1,0.2"},
                             "SB": {"JQ_foo_AF": "0.2", "JQ_bar_AF": "0.3"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("0.1,0.1", record.sample_tag_values["SA"][AlleleFreqRangeTagTestCase._TAG_ID])

    def test_add_tag_values_rounds(self):
        tag = summarize_caller._AlleleFreqRangeTag()
        sample_tag_values = {"SA": {"JQ_foo_AF": "0", "JQ_bar_AF": "0.666666"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("0.6667", record.sample_tag_values["SA"][AlleleFreqRangeTagTestCase._TAG_ID])

    def test_add_tag_values_inconsistentMultAlt(self):
        tag = summarize_caller._AlleleFreqRangeTag()
        sample_tag_values = {"SA": {"JQ_foo_AF": "0,0.1", "JQ_bar_AF": "0.2"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        self.assertRaisesRegexp(JQException,
                                r"Error summarizing values \[.*\] at record \[.*\]",
                                tag.add_tag_values,
                                record)
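Taken together, these expectations say the range tag works per alt allele on comma-separated values, rounds to four decimal places, returns "." when fewer than two callers reported, and rejects inconsistent multi-alt lengths. A hedged sketch of that summarization (`af_range` is an illustrative name, not the Jacquard API):

```python
def af_range(values):
    # values like ["0,0.1", "0.1,0.2"]; "." entries do not count
    reported = [v.split(",") for v in values if v != "."]
    if len(reported) < 2:
        return "."
    if len({len(alts) for alts in reported}) != 1:
        # mirrors the inconsistent multi-alt error case in the tests above
        raise ValueError("inconsistent multi-alt values: {}".format(values))
    per_alt = []
    for column in zip(*reported):        # one column per alt allele
        nums = [float(x) for x in column]
        per_alt.append(str(round(max(nums) - min(nums), 4)))
    return ",".join(per_alt)
```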
class AlleleFreqAverageTagTestCase(test_case.JacquardBaseTestCase):
    _TAG_ID = "{}AF_AVERAGE".format(summarize_caller.JQ_SUMMARY_TAG)

    def test_metaheader(self):
        split_meta_header = summarize_caller._AlleleFreqAverageTag().metaheader.split("\n")
        self.assertEqual('##FORMAT=<ID={0}AF_AVERAGE,Number=1,Type=Float,Description="Average allele frequency across recognized variant callers that reported frequency for this sample-locus [average(JQ_*_AF)].">'.format(summarize_caller.JQ_SUMMARY_TAG),
                         split_meta_header[0])

    def test_add_tag_values(self):
        tag = summarize_caller._AlleleFreqAverageTag()
        sample_tag_values = {"SA": {"JQ_foo_AF": "0", "JQ_bar_AF": "0.1", "JQ_baz_AF": "0.2"},
                             "SB": {"JQ_foo_AF": "0.2", "JQ_bar_AF": "0.3", "JQ_baz_AF": "0.4"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("0.1", record.sample_tag_values["SA"][AlleleFreqAverageTagTestCase._TAG_ID])
        self.assertEquals("0.3", record.sample_tag_values["SB"][AlleleFreqAverageTagTestCase._TAG_ID])

    def test_add_tag_values_nullsDoNotCount(self):
        tag = summarize_caller._AlleleFreqAverageTag()
        sample_tag_values = {"SA": {"JQ_foo_AF": ".", "JQ_bar_AF": ".", "JQ_baz_AF": "0.2"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("0.2", record.sample_tag_values["SA"][AlleleFreqAverageTagTestCase._TAG_ID])

    def test_add_tag_values_OnlyCountAFTags(self):
        tag = summarize_caller._AlleleFreqAverageTag()
        sample_tag_values = {"SA": {"JQ_foo_XX": "0", "JQ_bar_XX": "0.1", "JQ_baz_AF": "0.2"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("0.2", record.sample_tag_values["SA"][AlleleFreqAverageTagTestCase._TAG_ID])

    def test_add_tag_values_allNulls(self):
        tag = summarize_caller._AlleleFreqAverageTag()
        sample_tag_values = {"SA": {"JQ_foo_AF": ".", "JQ_bar_AF": ".", "JQ_baz_AF": "."}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals(".", record.sample_tag_values["SA"][AlleleFreqAverageTagTestCase._TAG_ID])

    def test_add_tag_values_missingTag(self):
        tag = summarize_caller._AlleleFreqAverageTag()
        sample_tag_values = {"SA": {"JQ_foo_XX": "."}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals(".", record.sample_tag_values["SA"][AlleleFreqAverageTagTestCase._TAG_ID])

    def test_add_tag_values_multAlts(self):
        tag = summarize_caller._AlleleFreqAverageTag()
        sample_tag_values = {"SA": {"JQ_foo_AF": "0,0.1", "JQ_bar_AF": "0.1,0.2"},
                             "SB": {"JQ_foo_AF": "0.2", "JQ_bar_AF": "0.3"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("0.05,0.15", record.sample_tag_values["SA"][AlleleFreqAverageTagTestCase._TAG_ID])

    def test_add_tag_values_rounds(self):
        tag = summarize_caller._AlleleFreqAverageTag()
        sample_tag_values = {"SA": {"JQ_foo_AF": "0", "JQ_bar_AF": "0.666666"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("0.3333", record.sample_tag_values["SA"][AlleleFreqAverageTagTestCase._TAG_ID])

    def test_add_tag_values_inconsistentMultAlt(self):
        tag = summarize_caller._AlleleFreqAverageTag()
        sample_tag_values = {"SA": {"JQ_foo_AF": "0,0.1", "JQ_bar_AF": "0.2"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        self.assertRaisesRegexp(JQException,
                                r"Error summarizing values \[.*\] at record \[.*\]",
                                tag.add_tag_values,
                                record)
class DepthRangeTagTestCase(test_case.JacquardBaseTestCase):
    _TAG_ID = "{}DP_RANGE".format(summarize_caller.JQ_SUMMARY_TAG)

    def test_metaheader(self):
        split_meta_header = summarize_caller._DepthRangeTag().metaheader.split("\n")
        self.assertEquals('##FORMAT=<ID={0}DP_RANGE,Number=1,Type=Float,Description="Max(depth) - min (depth) across recognized callers.">'.format(summarize_caller.JQ_SUMMARY_TAG),
                          split_meta_header[0])

    def test_add_tag_values(self):
        tag = summarize_caller._DepthRangeTag()
        sample_tag_values = {"SA": {"JQ_foo_DP": "0", "JQ_bar_DP": "1", "JQ_baz_DP": "2"},
                             "SB": {"JQ_foo_DP": "2", "JQ_bar_DP": "3", "JQ_baz_DP": "4"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("2", record.sample_tag_values["SA"][DepthRangeTagTestCase._TAG_ID])
        self.assertEquals("2", record.sample_tag_values["SB"][DepthRangeTagTestCase._TAG_ID])

    def test_add_tag_values_oneValueReturnsNull(self):
        tag = summarize_caller._DepthRangeTag()
        sample_tag_values = {"SA": {"JQ_foo_DP": "."}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals(".", record.sample_tag_values["SA"][DepthRangeTagTestCase._TAG_ID])

    def test_add_tag_values_nullsDoNotCount(self):
        tag = summarize_caller._DepthRangeTag()
        sample_tag_values = {"SA": {"JQ_foo_DP": ".", "JQ_bar_DP": "1", "JQ_baz_DP": "2"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("1", record.sample_tag_values["SA"][DepthRangeTagTestCase._TAG_ID])

    def test_add_tag_values_OnlyCountDPTags(self):
        tag = summarize_caller._DepthRangeTag()
        sample_tag_values = {"SA": {"JQ_foo_XX": "0", "JQ_bar_DP": "1", "JQ_baz_DP": "2"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("1", record.sample_tag_values["SA"][DepthRangeTagTestCase._TAG_ID])

    def test_add_tag_values_allNulls(self):
        tag = summarize_caller._DepthRangeTag()
        sample_tag_values = {"SA": {"JQ_foo_DP": ".", "JQ_bar_DP": ".", "JQ_baz_DP": "."}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals(".", record.sample_tag_values["SA"][DepthRangeTagTestCase._TAG_ID])

    def test_add_tag_values_missingTag(self):
        tag = summarize_caller._DepthRangeTag()
        sample_tag_values = {"SA": {"JQ_foo_XX": "."}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals(".", record.sample_tag_values["SA"][DepthRangeTagTestCase._TAG_ID])

    def test_add_tag_values_multAlts(self):
        tag = summarize_caller._DepthRangeTag()
        sample_tag_values = {"SA": {"JQ_foo_DP": "0,1", "JQ_bar_DP": "2,3"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("2,2", record.sample_tag_values["SA"][DepthRangeTagTestCase._TAG_ID])

    def test_add_tag_values_rounds(self):
        tag = summarize_caller._DepthRangeTag()
        sample_tag_values = {"SA": {"JQ_foo_DP": "0", "JQ_bar_DP": "42.666666"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("42.6667", record.sample_tag_values["SA"][DepthRangeTagTestCase._TAG_ID])

    def test_add_tag_values_inconsistentMultAlt(self):
        tag = summarize_caller._DepthRangeTag()
        sample_tag_values = {"SA": {"JQ_foo_DP": "0,0.1", "JQ_bar_DP": "0.2"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        self.assertRaisesRegexp(JQException,
                                r"Error summarizing values \[.*\] at record \[.*\]",
                                tag.add_tag_values,
                                record)
class DepthAverageTagTestCase(test_case.JacquardBaseTestCase):
    _TAG_ID = "{}DP_AVERAGE".format(summarize_caller.JQ_SUMMARY_TAG)

    def test_metaheader(self):
        split_meta_header = summarize_caller._DepthAverageTag().metaheader.split("\n")
        self.assertEquals('##FORMAT=<ID={}DP_AVERAGE,Number=1,Type=Float,Description="Average depth across recognized variant callers that reported depth for this sample-locus; rounded to integer [round(average(JQ_*_DP))].">'.format(summarize_caller.JQ_SUMMARY_TAG),
                          split_meta_header[0])

    def test_add_tag_values(self):
        tag = summarize_caller._DepthAverageTag()
        sample_tag_values = {"SA": {"JQ_foo_DP": "0", "JQ_bar_DP": "1", "JQ_baz_DP": "2"},
                             "SB": {"JQ_foo_DP": "2", "JQ_bar_DP": "3", "JQ_baz_DP": "4"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("1", record.sample_tag_values["SA"][DepthAverageTagTestCase._TAG_ID])
        self.assertEquals("3", record.sample_tag_values["SB"][DepthAverageTagTestCase._TAG_ID])

    def test_add_tag_values_oneValueReturnsNull(self):
        tag = summarize_caller._DepthAverageTag()
        sample_tag_values = {"SA": {"JQ_foo_DP": "."}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals(".", record.sample_tag_values["SA"][DepthAverageTagTestCase._TAG_ID])

    def test_add_tag_values_nullsDoNotCount(self):
        tag = summarize_caller._DepthAverageTag()
        sample_tag_values = {"SA": {"JQ_foo_DP": ".", "JQ_bar_DP": "1", "JQ_baz_DP": "2"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("1.5", record.sample_tag_values["SA"][DepthAverageTagTestCase._TAG_ID])

    def test_add_tag_values_OnlyCountDPTags(self):
        tag = summarize_caller._DepthAverageTag()
        sample_tag_values = {"SA": {"JQ_foo_XX": "0", "JQ_bar_DP": "1", "JQ_baz_DP": "2"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("1.5", record.sample_tag_values["SA"][DepthAverageTagTestCase._TAG_ID])

    def test_add_tag_values_allNulls(self):
        tag = summarize_caller._DepthAverageTag()
        sample_tag_values = {"SA": {"JQ_foo_DP": ".", "JQ_bar_DP": ".", "JQ_baz_DP": "."}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals(".", record.sample_tag_values["SA"][DepthAverageTagTestCase._TAG_ID])

    def test_add_tag_values_missingTag(self):
        tag = summarize_caller._DepthAverageTag()
        sample_tag_values = {"SA": {"JQ_foo_XX": "."}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals(".", record.sample_tag_values["SA"][DepthAverageTagTestCase._TAG_ID])

    def test_add_tag_values_multAlts(self):
        tag = summarize_caller._DepthAverageTag()
        sample_tag_values = {"SA": {"JQ_foo_DP": "0,1", "JQ_bar_DP": "2,3"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("1,2", record.sample_tag_values["SA"][DepthAverageTagTestCase._TAG_ID])

    def test_add_tag_values_rounds(self):
        tag = summarize_caller._DepthAverageTag()
        sample_tag_values = {"SA": {"JQ_foo_DP": "0", "JQ_bar_DP": "3"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("1.5", record.sample_tag_values["SA"][DepthAverageTagTestCase._TAG_ID])

    def test_add_tag_values_inconsistentMultAlt(self):
        tag = summarize_caller._DepthAverageTag()
        sample_tag_values = {"SA": {"JQ_foo_DP": "0,0.1", "JQ_bar_DP": "0.2"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        self.assertRaisesRegexp(JQException,
                                r"Error summarizing values \[.*\] at record \[.*\]",
                                tag.add_tag_values,
                                record)
class SomaticTagTestCase(test_case.JacquardBaseTestCase):
    _TAG_ID = "{}SOM_COUNT".format(summarize_caller.JQ_SUMMARY_TAG)

    def test_metaheader(self):
        split_meta_header = summarize_caller._SomaticTag().metaheader.split("\n")
        self.assertEqual('##FORMAT=<ID={0}SOM_COUNT,Number=1,Type=Integer,'
                         'Description="Count of recognized variant callers '
                         'that reported confident somatic call for this '
                         'sample-locus.">'
                         .format(summarize_caller.JQ_SUMMARY_TAG), split_meta_header[0])

    def test_add_tag_values(self):
        tag = summarize_caller._SomaticTag()
        sample_tag_values = {"SA": {"JQ_foo_HC_SOM": "1", "JQ_bar_HC_SOM": "1"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("2", record.sample_tag_values["SA"][SomaticTagTestCase._TAG_ID])

    def test_add_tag_values_allZero(self):
        tag = summarize_caller._SomaticTag()
        sample_tag_values = {"SA": {"JQ_foo_HC_SOM": "0", "JQ_bar_HC_SOM": "0"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("0", record.sample_tag_values["SA"][SomaticTagTestCase._TAG_ID])

    def test_add_tag_values_nullsDoNotCount(self):
        tag = summarize_caller._SomaticTag()
        sample_tag_values = {"SA": {"JQ_foo_HC_SOM": "1", "JQ_bar_HC_SOM": "."}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("1", record.sample_tag_values["SA"][SomaticTagTestCase._TAG_ID])

    def test_add_tag_values_OnlyCountHCSOMTags(self):
        tag = summarize_caller._SomaticTag()
        sample_tag_values = {"SA": {"JQ_foo": "1", "JQ_foo_HC_SOM": "1", "JQ_bar_HC_SOM": "1"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("2", record.sample_tag_values["SA"][SomaticTagTestCase._TAG_ID])

    def test_add_tag_values_allNulls(self):
        tag = summarize_caller._SomaticTag()
        sample_tag_values = {"SA": {"JQ_foo_HC_SOM": ".", "JQ_bar_HC_SOM": "."}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals(".", record.sample_tag_values["SA"][SomaticTagTestCase._TAG_ID])

    def test_add_tag_values_missingTag(self):
        tag = summarize_caller._SomaticTag()
        sample_tag_values = {"SA": {"JQ_foo": "1"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals(".", record.sample_tag_values["SA"][SomaticTagTestCase._TAG_ID])

    def test_add_tag_values_multAlts(self):
        tag = summarize_caller._SomaticTag()
        sample_tag_values = {"SA": {"JQ_foo_HC_SOM": "0,1", "JQ_bar_HC_SOM": "1,1"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        tag.add_tag_values(record)
        self.assertEquals("1,2", record.sample_tag_values["SA"][SomaticTagTestCase._TAG_ID])

    def test_add_tag_values_inconsistentMultAlt(self):
        tag = summarize_caller._SomaticTag()
        sample_tag_values = {"SA": {"JQ_foo_HC_SOM": "0,0.1", "JQ_bar_HC_SOM": "0.2"}}
        record = VcfRecord("CHROM", "POS", "REF", "ALT",
                           sample_tag_values=sample_tag_values)
        self.assertRaisesRegexp(JQException,
                                r"Error summarizing values \[.*\] at record \[.*\]",
                                tag.add_tag_values,
                                record)
class SummarizeCallerTestCase(test_case.JacquardBaseTestCase):
    @staticmethod
    def _sum_to_string(collection_of_numbers):
        return str(sum(collection_of_numbers))

    def test_aggregate_numeric_values_ints(self):
        input_values = ["0", "1", "2", "4"]
        actual_value = summarize_caller._aggregate_numeric_values(input_values,
                                                                  self._sum_to_string)
        self.assertEquals("7", actual_value)

    def test_aggregate_numeric_values_floats(self):
        input_values = ["0.0", "0.1", "0.2", "0.4"]
        actual_value = summarize_caller._aggregate_numeric_values(input_values,
                                                                  self._sum_to_string)
        self.assertEquals("0.7", actual_value)

    def test_aggregate_numeric_values_listsOfInts(self):
        input_values = ["0,1", "2,4"]
        actual_value = summarize_caller._aggregate_numeric_values(input_values,
                                                                  self._sum_to_string)
        self.assertEquals("2,5", actual_value)

    def test_aggregate_numeric_values_listsOfFloats(self):
        input_values = ["0.0,0.1", "0.2,0.4"]
        actual_value = summarize_caller._aggregate_numeric_values(input_values,
                                                                  self._sum_to_string)
        self.assertEquals("0.2,0.5", actual_value)

    def test_aggregate_numeric_values_listsOf3(self):
        input_values = ["0,1,3", "4,5,6"]
        actual_value = summarize_caller._aggregate_numeric_values(input_values,
                                                                  self._sum_to_string)
        self.assertEquals("4,6,9", actual_value)

    def test_get_non_null_values(self):
        sample_tag_values = {"SA": {"JQ_A_AF": "0",
                                    "JQ_B_AF": "1",
                                    "JQ_C_AF": "2"}}
        record = VcfRecord("chr1", "42", "A", "C", sample_tag_values=sample_tag_values)
        actual_values = summarize_caller._get_non_null_values(record,
                                                              "SA",
                                                              common_tags.ALLELE_FREQ_TAG)
        self.assertEquals(["0", "1", "2"], sorted(actual_values))

    def test_get_non_null_values_hasNulls(self):
        sample_tag_values = {"SA": {"JQ_A_AF": ".",
                                    "JQ_B_AF": ".",
                                    "JQ_C_AF": "2"}}
        record = VcfRecord("chr1", "42", "A", "C", sample_tag_values=sample_tag_values)
        actual_values = summarize_caller._get_non_null_values(record,
                                                              "SA",
                                                              common_tags.ALLELE_FREQ_TAG)
        self.assertEquals(["2"], actual_values)

    def test_get_non_null_values_invalidSample(self):
        sample_tag_values = {"SA": {"JQ_A_AF": ".",
                                    "JQ_B_AF": ".",
                                    "JQ_C_AF": "2"}}
        record = VcfRecord("chr1", "42", "A", "C", sample_tag_values=sample_tag_values)
        self.assertRaisesRegexp(utils.JQException,
                                r"Sample \[FOO\] was not recognized",
                                summarize_caller._get_non_null_values,
                                record,
                                "FOO",
                                re.compile("^JQ_.*_AF$"))

    def test_get_non_null_values_missingTag(self):
        sample_tag_values = {"SA": {"JQ_A_AF": ".",
                                    "JQ_B_AF": ".",
                                    "JQ_C_AF": "2"}}
        record = VcfRecord("chr1", "42", "A", "C", sample_tag_values=sample_tag_values)
        actual_values = summarize_caller._get_non_null_values(record,
                                                              "SA",
                                                              common_tags.DEPTH_TAG)
        self.assertEquals([], actual_values)

    def test_summary_tag_prefix(self):
        self.assertEquals("JQ_SUMMARY_", summarize_caller.JQ_SUMMARY_TAG)

    def test_add_tags(self):
        sample_tag_values = {"SA": {"JQ_foo_AF": "0",
                                    "JQ_VS_CALLER_REPORTED": "1",
                                    "JQ_MT_CALLER_REPORTED": "1",
                                    "JQ_VS_CALLER_PASSED": "0",
                                    "JQ_MT_CALLER_PASSED": "0"},
                             "SB": {"JQ_foo_AF": "0.2",
                                    "JQ_VS_CALLER_REPORTED": "1",
                                    "JQ_MT_CALLER_REPORTED": "1",
                                    "JQ_VS_CALLER_PASSED": "0",
                                    "JQ_MT_CALLER_PASSED": "0"}}
        record = VcfRecord("chr1", "42", "A", "C", sample_tag_values=sample_tag_values)
        actual_record = summarize_caller.SummarizeCaller().add_tags(record)
        # self.assertEquals("2", actual_record.info_dict["JQ_SUMMARY_SAMPLES_REPORTED_COUNT"])
        # self.assertEquals("0", actual_record.info_dict["JQ_SUMMARY_SAMPLES_PASSED_COUNT"])
        self.assertEquals("2", actual_record.sample_tag_values["SA"]["JQ_SUMMARY_CALLERS_REPORTED_COUNT"])
        self.assertEquals("0", actual_record.sample_tag_values["SA"]["JQ_SUMMARY_CALLERS_PASSED_COUNT"])
        self.assertEquals("0", actual_record.sample_tag_values["SA"]["JQ_SUMMARY_AF_AVERAGE"])
        self.assertEquals(".", actual_record.sample_tag_values["SA"]["JQ_SUMMARY_AF_RANGE"])
        self.assertEquals(".", actual_record.sample_tag_values["SA"]["JQ_SUMMARY_DP_AVERAGE"])
        self.assertEquals(".", actual_record.sample_tag_values["SA"]["JQ_SUMMARY_DP_RANGE"])
        self.assertEquals(".", actual_record.sample_tag_values["SA"]["JQ_SUMMARY_SOM_COUNT"])
        self.assertEquals("2", actual_record.sample_tag_values["SB"]["JQ_SUMMARY_CALLERS_REPORTED_COUNT"])
        self.assertEquals("0", actual_record.sample_tag_values["SB"]["JQ_SUMMARY_CALLERS_PASSED_COUNT"])
        self.assertEquals("0.2", actual_record.sample_tag_values["SB"]["JQ_SUMMARY_AF_AVERAGE"])
        self.assertEquals(".", actual_record.sample_tag_values["SB"]["JQ_SUMMARY_AF_RANGE"])
        self.assertEquals(".", actual_record.sample_tag_values["SB"]["JQ_SUMMARY_DP_AVERAGE"])
        self.assertEquals(".", actual_record.sample_tag_values["SB"]["JQ_SUMMARY_DP_RANGE"])
        self.assertEquals(".", actual_record.sample_tag_values["SB"]["JQ_SUMMARY_SOM_COUNT"])

    def test_get_new_metaheaders(self):
        expected = ('##FORMAT=<ID={}{},'
                    'Number=1,'
                    'Type=Integer,'
                    'Description="Count of variant callers which listed this variant in the Jacquard tagged VCF">')\
                    .format(summarize_caller.JQ_SUMMARY_TAG,
                            summarize_caller.JQ_REPORTED)
        actual = summarize_caller.SummarizeCaller().get_metaheaders()
        split_actual = actual[0].split("\n")
        first_meta_header = split_actual[0]
        self.assertEqual(expected, first_meta_header)
        self.assertEqual(12, len(actual))
        self.assertEqual(1, len(split_actual))
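The aggregation tests above fully pin down the position-wise behavior: scalar strings are treated as one-element lists, comma-separated multi-alt values are aggregated column by column, and results are rendered back as comma-joined strings. The following is a minimal sketch consistent with those tests, not jacquard's actual `_aggregate_numeric_values`; the names `aggregate_numeric_values` and `sum_to_string` are illustrative, and the sketch omits the inconsistent-multi-alt error handling and folds the int/float formatting (which the real code handles elsewhere) into the summing helper.

```python
def aggregate_numeric_values(values, aggregate_fn):
    # Sketch only. Each value is either a scalar string ("4") or a
    # comma-separated per-allele list ("0,1"); aggregation is position-wise.
    split_values = [value.split(",") for value in values]
    columns = zip(*split_values)
    return ",".join(aggregate_fn([float(item) for item in column])
                    for column in columns)

def sum_to_string(numbers):
    # Illustrative helper: sums and renders integers without a trailing
    # ".0", so [0.0, 1.0, 2.0, 4.0] becomes "7" rather than "7.0".
    total = sum(numbers)
    return str(int(total)) if total == int(total) else str(round(total, 4))

print(aggregate_numeric_values(["0,1", "2,4"], sum_to_string))  # 2,5
```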
| 58.718204 | 344 | 0.653296 | 5,842 | 47,092 | 4.904998 | 0.044163 | 0.106474 | 0.113593 | 0.05992 | 0.912302 | 0.900715 | 0.886861 | 0.866446 | 0.83413 | 0.821986 | 0 | 0.014338 | 0.204684 | 47,092 | 801 | 345 | 58.791511 | 0.750754 | 0.006795 | 0 | 0.665123 | 0 | 0.058642 | 0.167764 | 0.065472 | 0 | 0 | 0 | 0 | 0.186728 | 1 | 0.135802 | false | 0.063272 | 0.015432 | 0.001543 | 0.182099 | 0.001543 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
5489dc4857b65a65c560d98cf7feb9d8c75018f3 | 439 | py | Python | send_Node.py | AntonAlbertovich/Eusocial-Cluster-Utility | fef4f583b6151bb40e54d6825d65d668581c2121 | [
"MIT"
] | 2 | 2019-03-22T15:08:31.000Z | 2019-03-23T20:10:40.000Z | send_Node.py | AntonAlbertovich/Eusocial-Cluster-Utility | fef4f583b6151bb40e54d6825d65d668581c2121 | [
"MIT"
] | 1 | 2019-03-23T20:08:12.000Z | 2019-03-23T20:08:12.000Z | send_Node.py | AntonAlbertovich/Eusocial-Cluster-Utility | fef4f583b6151bb40e54d6825d65d668581c2121 | [
"MIT"
] | 1 | 2019-03-23T19:56:07.000Z | 2019-03-23T19:56:07.000Z | import os
import socket
# Custom file distribution protocol for send_Node_2
dir_path = os.path.dirname(os.path.realpath(__file__))
os.chdir(dir_path)
os.system("sshpass -p 'root' scp -r /home/user/Eusocial-Cluster/Machine02 user@192.168.1.00:/home/user/Eusocial-Cluster")
os.system("sshpass -p 'root' scp -r /home/user/Eusocial-Cluster/Machine02 user@192.168.1.01:/home/user/Eusocial-Cluster")
print('192.168.1.02')
print('Machine02')
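The script repeats the same sshpass+scp invocation once per target host. A sketch of the same idea with the hosts factored into a list is shown below; the host list and paths simply mirror the hard-coded values above, and `build_scp_command` is a hypothetical helper, not part of the original script.

```python
import shlex

# Values mirror the hard-coded strings in the script above.
TARGET_HOSTS = ["192.168.1.00", "192.168.1.01"]
SOURCE_DIR = "/home/user/Eusocial-Cluster/Machine02"
DEST_DIR = "/home/user/Eusocial-Cluster"

def build_scp_command(host, user="user", password="root"):
    # The same sshpass+scp invocation the script issues through os.system,
    # expressed as an argument list.
    return ["sshpass", "-p", password, "scp", "-r",
            SOURCE_DIR, "{}@{}:{}".format(user, host, DEST_DIR)]

for host in TARGET_HOSTS:
    command = build_scp_command(host)
    # Print instead of executing; swap in subprocess.run(command, check=True)
    # to actually copy.
    print(" ".join(shlex.quote(part) for part in command))
```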
| 43.9 | 122 | 0.76082 | 74 | 439 | 4.405405 | 0.459459 | 0.09816 | 0.196319 | 0.282209 | 0.411043 | 0.411043 | 0.411043 | 0.411043 | 0.411043 | 0.411043 | 0 | 0.084158 | 0.079727 | 439 | 9 | 123 | 48.777778 | 0.722772 | 0.111617 | 0 | 0 | 0 | 0.25 | 0.615979 | 0.42268 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.25 | 0.25 | 0 | 0.25 | 0.25 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
b70398658ed333ad30518f7f8a4e3232171ab013 | 649 | py | Python | module_2/lab2_1_1_20.py | dzooli/pcep_prepare | ddf34991a2d6ef2cfe3bda706ec333e9caa2aea5 | [
"MIT"
] | null | null | null | module_2/lab2_1_1_20.py | dzooli/pcep_prepare | ddf34991a2d6ef2cfe3bda706ec333e9caa2aea5 | [
"MIT"
] | null | null | null | module_2/lab2_1_1_20.py | dzooli/pcep_prepare | ddf34991a2d6ef2cfe3bda706ec333e9caa2aea5 | [
"MIT"
] | null | null | null | print()
print(" * "*2)
print(" * * "*2)
print(" * * "*2)
print(" * * "*2)
print("*** ***"*2)
print(" * * "*2)
print(" * * "*2)
print(" ***** "*2)
print()
print(" "*4, "*"," "*4, sep='', end='|\n')
print(" "*3, "* *"," "*3, sep='', end='|\n')
print(" "*2,"* *"," "*2, sep='', end='|\n')
print(" ", "* *"," ", sep='', end='|\n')
print("*** ***", sep='', end='|\n')
print(" "*3,"*"," "*1,"*"," "*3, sep='', end='|\n')
print(" "*3,"*"," "*1,"*"," "*3, sep='', end='|\n')
print(" "*3,"*"," "*1,"*"," "*3, sep='', end='|\n')
print(" "*3,"*"*3," "*3, sep='', end='|\n')
print("Hello\nWorld")
print('Hello\\nWorld')
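The drawing above leans on two `print` keywords: `sep` controls what joins the positional arguments and `end` replaces the trailing newline. A small self-checking demo:

```python
import io
from contextlib import redirect_stdout

buffer = io.StringIO()
with redirect_stdout(buffer):
    # sep='' joins the arguments with nothing between them; end='|\n'
    # replaces the default trailing newline with a pipe plus newline.
    print(" " * 3, "*" * 3, " " * 3, sep='', end='|\n')
assert buffer.getvalue() == "   ***   |\n"
```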
| 27.041667 | 51 | 0.33282 | 78 | 649 | 2.769231 | 0.128205 | 0.25 | 0.291667 | 0.5 | 0.810185 | 0.810185 | 0.625 | 0.625 | 0.513889 | 0.513889 | 0 | 0.049713 | 0.194145 | 649 | 23 | 52 | 28.217391 | 0.363289 | 0 | 0 | 0.47619 | 0 | 0 | 0.269646 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 8 |
3fcd7c868eb1a66eecacdc7db1edfed5b2ce722b | 163 | py | Python | calculator.py | shyed2001/Python_Programming | 93ef958e3d8aa77f9191b550972235ce4fe4a6cb | [
"bzip2-1.0.6"
] | 2 | 2019-05-01T04:32:14.000Z | 2019-05-04T11:28:18.000Z | calculator.py | shyed2001/python-learning-basics | 93ef958e3d8aa77f9191b550972235ce4fe4a6cb | [
"bzip2-1.0.6"
] | null | null | null | calculator.py | shyed2001/python-learning-basics | 93ef958e3d8aa77f9191b550972235ce4fe4a6cb | [
"bzip2-1.0.6"
] | null | null | null | def add(n1, n2):
    return n1 + n2

def subtract(n1, n2):
    return n1 - n2

def divide(n1, n2):
    return n1 / n2

def multiply(n1, n2):
    return n1 * n2
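A quick usage check of the calculator functions (re-declared here so the snippet runs on its own); note `divide` uses true division, so dividing by zero raises `ZeroDivisionError`.

```python
def add(n1, n2):
    return n1 + n2

def divide(n1, n2):
    # True division: divide(7, 2) is 3.5, not 3.
    return n1 / n2

print(add(2, 3))     # 5
print(divide(7, 2))  # 3.5
```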
| 10.866667 | 21 | 0.570552 | 28 | 163 | 3.321429 | 0.285714 | 0.344086 | 0.430108 | 0.516129 | 0.698925 | 0.548387 | 0 | 0 | 0 | 0 | 0 | 0.141593 | 0.306748 | 163 | 14 | 22 | 11.642857 | 0.681416 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
3fd2c6cc46e50fd0233b36d4186b7d2711baf507 | 218 | py | Python | Markov_Chain/src/utils.py | AlexCubo/Spiced_Projects | 9fb9a744f397a9e5142646cf3619c1c969ee428d | [
"MIT"
] | null | null | null | Markov_Chain/src/utils.py | AlexCubo/Spiced_Projects | 9fb9a744f397a9e5142646cf3619c1c969ee428d | [
"MIT"
] | null | null | null | Markov_Chain/src/utils.py | AlexCubo/Spiced_Projects | 9fb9a744f397a9e5142646cf3619c1c969ee428d | [
"MIT"
] | null | null | null | from pathlib import Path
def get_project_root():
    ''' Returns the absolute path of the project including the project folder.
    '''
    return Path(__file__).parent.parent
# TODO: create initial data/ structure
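`get_project_root` climbs two levels from the module file, which lands on the project folder when the module lives at `<project>/src/utils.py`. An illustration with a hypothetical path (the directory names are made up for the example):

```python
from pathlib import PurePosixPath

# Hypothetical layout: <project>/src/utils.py; two .parent hops from the
# module file reach the project folder itself.
module_file = PurePosixPath("/home/user/Markov_Chain/src/utils.py")
project_root = module_file.parent.parent
print(project_root)  # /home/user/Markov_Chain
```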
| 21.8 | 78 | 0.729358 | 29 | 218 | 5.275862 | 0.758621 | 0.130719 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.192661 | 218 | 9 | 79 | 24.222222 | 0.869318 | 0.518349 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
3ff2c0596cabf84082f8112891e4a7c04a48ebcb | 916 | py | Python | WatchDogs_Visualisation/oldApps/tweet-map/venv2/lib/python3.7/site-packages/dash/dependencies.py | tnreddy09/WatchDogs_StockMarketAnalysis | 0c72430da633785fcb14e40d8b007c86081d515d | [
"Apache-2.0"
] | 4 | 2020-02-05T11:26:47.000Z | 2021-05-26T07:48:46.000Z | WatchDogs_Visualisation/oldApps/tweet-map/venv2/lib/python3.7/site-packages/dash/dependencies.py | prashanth-thipparthi/WatchDogs_StockMarketAnalysis | 0c72430da633785fcb14e40d8b007c86081d515d | [
"Apache-2.0"
] | null | null | null | WatchDogs_Visualisation/oldApps/tweet-map/venv2/lib/python3.7/site-packages/dash/dependencies.py | prashanth-thipparthi/WatchDogs_StockMarketAnalysis | 0c72430da633785fcb14e40d8b007c86081d515d | [
"Apache-2.0"
] | null | null | null | # pylint: disable=old-style-class, too-few-public-methods
class Output:
    def __init__(self, component_id, component_property):
        self.component_id = component_id
        self.component_property = component_property


# pylint: disable=old-style-class, too-few-public-methods
class Input:
    def __init__(self, component_id, component_property):
        self.component_id = component_id
        self.component_property = component_property


# pylint: disable=old-style-class, too-few-public-methods
class State:
    def __init__(self, component_id, component_property):
        self.component_id = component_id
        self.component_property = component_property


# pylint: disable=old-style-class, too-few-public-methods
class Event:
    def __init__(self, component_id, component_event):
        self.component_id = component_id
        self.component_event = component_event
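These dependency classes are plain value holders: each records which component attribute a Dash callback reads or writes. A standalone illustration (the `Output` class is re-declared here so the snippet runs on its own; the component id `'my-graph'` is a made-up example):

```python
# Mirrors the Output class defined above.
class Output:
    def __init__(self, component_id, component_property):
        self.component_id = component_id
        self.component_property = component_property

# A dependency object simply names the target attribute, e.g. the 'figure'
# property of the component whose id is 'my-graph'.
target = Output('my-graph', 'figure')
print(target.component_id, target.component_property)  # my-graph figure
```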
| 33.925926 | 58 | 0.724891 | 112 | 916 | 5.571429 | 0.169643 | 0.25 | 0.192308 | 0.307692 | 0.927885 | 0.927885 | 0.878205 | 0.815705 | 0.815705 | 0.815705 | 0 | 0 | 0.19214 | 916 | 26 | 59 | 35.230769 | 0.843243 | 0.24345 | 0 | 0.625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
b751f8e1219eb915a9e07b9c07c1f8700b5c900b | 2,857 | py | Python | Ago-Dic-2021/cedillo-aparicio-cristina/Practica-4/composite_test.py | AnhellO/DAS_Sistemas | 07b4eca78357d02d225d570033d05748d91383e3 | [
"MIT"
] | 41 | 2017-09-26T09:36:32.000Z | 2022-03-19T18:05:25.000Z | Ago-Dic-2021/cedillo-aparicio-cristina/Practica-4/composite_test.py | AnhellO/DAS_Sistemas | 07b4eca78357d02d225d570033d05748d91383e3 | [
"MIT"
] | 67 | 2017-09-11T05:06:12.000Z | 2022-02-14T04:44:04.000Z | Ago-Dic-2021/cedillo-aparicio-cristina/Practica-4/composite_test.py | AnhellO/DAS_Sistemas | 07b4eca78357d02d225d570033d05748d91383e3 | [
"MIT"
] | 210 | 2017-09-01T00:10:08.000Z | 2022-03-19T18:05:12.000Z | from composite import *
def test(capsys):
    gerenteGeneral = CompositeElement('Gerente General')
    gerente1 = CompositeElement('Gerente 1')
    gerenteGeneral.add(gerente1)
    gerenteGeneral.details()
    out, _ = capsys.readouterr()
    assert "'Gerente General' es superior de 'Gerente 1'" in out


def test2(capsys):
    gerenteGeneral = CompositeElement('Gerente General')
    programador1 = LeafElement('Programador 1')
    gerenteGeneral.add(programador1)
    gerenteGeneral.details()
    out, _ = capsys.readouterr()
    assert "'Gerente General' es supervisor de 'Programador 1'" in out


def test3(capsys):
    gerenteGeneral = CompositeElement('Gerente General')
    gerente1 = CompositeElement("Gerente 1")
    programador1 = LeafElement('Programador 1-1')
    gerenteGeneral.add(gerente1)
    gerente1.add(programador1)
    gerenteGeneral.details()
    out, _ = capsys.readouterr()
    assert "'Gerente General' es superior de 'Gerente 1'\n'Gerente 1' es supervisor de 'Programador 1-1'" in out


def test4(capsys):
    gerenteGeneral = CompositeElement("Gerente General")
    gerente1 = CompositeElement("Gerente 1")
    gerente2 = CompositeElement("Gerente 2")
    programador11 = LeafElement("Programador 1-1")
    programador12 = LeafElement("Programador 1-2")
    programador21 = LeafElement("Programador 2-1")
    programador22 = LeafElement("Programador 2-2")
    gerenteGeneral.add(gerente1)
    gerenteGeneral.add(gerente2)
    gerente1.add(programador11)
    gerente1.add(programador12)
    gerente2.add(programador21)
    gerente2.add(programador22)
    gerenteGeneral.details()
    out, _ = capsys.readouterr()
    assert "'Gerente General' es superior de 'Gerente 1'\n'Gerente 1' es supervisor de 'Programador 1-1'\n'Gerente 1' es supervisor de 'Programador 1-2'\n'Gerente General' es superior de 'Gerente 2'\n'Gerente 2' es supervisor de 'Programador 2-1'\n'Gerente 2' es supervisor de 'Programador 2-2'" in out


def test5(capsys):
    gerenteGeneral = CompositeElement("Gerente General")
    gerente1 = CompositeElement("Gerente 1")
    gerente2 = CompositeElement("Gerente 2")
    programador11 = LeafElement("Programador 1-1")
    programador12 = LeafElement("Programador 1-2")
    programador21 = LeafElement("Programador 2-1")
    programador22 = LeafElement("Programador 2-2")
    gerenteGeneral.add(gerente1)
    gerenteGeneral.add(gerente2)
    gerente1.add(programador11)
    gerente1.add(programador12)
    gerente2.add(programador21)
    gerente2.add(programador22)
    gerente2.remove(programador21)
    gerenteGeneral.details()
    out, _ = capsys.readouterr()
assert "'Gerente General' es superior de 'Gerente 1'\n'Gerente 1' es supervisor de 'Programador 1-1'\n'Gerente 1' es supervisor de 'Programador 1-2'\n'Gerente General' es superior de 'Gerente 2'\n'Gerente 2' es supervisor de 'Programador 2-2'" in out | 44.640625 | 302 | 0.728736 | 325 | 2,857 | 6.390769 | 0.113846 | 0.050072 | 0.060664 | 0.108329 | 0.882523 | 0.846895 | 0.846895 | 0.846895 | 0.846895 | 0.767935 | 0 | 0.050188 | 0.163108 | 2,857 | 64 | 303 | 44.640625 | 0.818486 | 0 | 0 | 0.711864 | 0 | 0.050847 | 0.342547 | 0 | 0 | 0 | 0 | 0 | 0.084746 | 1 | 0.084746 | false | 0 | 0.016949 | 0 | 0.101695 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b7745ea8f8bc74f9525623f45209ed41fbff50f6 | 39 | py | Python | twitterspider/TwitterSpider/untils/__init__.py | AaronZhang123356/Social-Network-Analysis | 0d105c709b61532200f512dbcdf2211909ab326d | [
"MIT"
] | null | null | null | twitterspider/TwitterSpider/untils/__init__.py | AaronZhang123356/Social-Network-Analysis | 0d105c709b61532200f512dbcdf2211909ab326d | [
"MIT"
] | null | null | null | twitterspider/TwitterSpider/untils/__init__.py | AaronZhang123356/Social-Network-Analysis | 0d105c709b61532200f512dbcdf2211909ab326d | [
"MIT"
] | null | null | null | #encoding:utf-8
#time:2020/11/17 20:23 | 19.5 | 22 | 0.717949 | 9 | 39 | 3.111111 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.361111 | 0.076923 | 39 | 2 | 22 | 19.5 | 0.416667 | 0.897436 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
b7826aa0f72e9869e5136a62844ccd1a61b37edc | 7,912 | py | Python | otp/nametag/NametagConstants.py | ksmit799/POTCO-PS | 520d38935ae8df4b452c733a82c94dddac01e275 | [
"Apache-2.0"
] | 8 | 2017-01-24T04:33:29.000Z | 2020-11-01T08:36:24.000Z | otp/nametag/NametagConstants.py | ksmit799/Pirates-Online-Remake | 520d38935ae8df4b452c733a82c94dddac01e275 | [
"Apache-2.0"
] | 1 | 2017-03-02T18:05:17.000Z | 2017-03-14T06:47:10.000Z | otp/nametag/NametagConstants.py | ksmit799/Pirates-Online-Remake | 520d38935ae8df4b452c733a82c94dddac01e275 | [
"Apache-2.0"
] | 11 | 2017-03-02T18:46:07.000Z | 2020-11-01T08:36:26.000Z | CFNoQuitButton=256
CFPageButton=16
CFQuicktalker=4
CFQuitButton=32
CFReversed=64
CFSndOpenchat=128
CFSpeech=1
CFThought=2
CFTimeout=8
CCNormal = 0
CCNoChat = 1
CCNonPlayer = 2
CCSuit = 3
CCToonBuilding = 4
CCSuitBuilding = 5
CCHouseBuilding = 6
CCSpeedChat = 7
CCFreeChat = 8
NAMETAG_COLORS = {
CCNormal: (
# Normal FG BG
((0.3, 0.3, 0.7, 1.0), (0.8, 0.8, 0.8, 0.5), # Name
(0.0, 0.0, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Click FG BG
((0.3, 0.3, 0.7, 1.0), (0.2, 0.2, 0.2, 0.6), # Name
(1.0, 0.5, 0.5, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Hover FG BG
((0.5, 0.5, 1.0, 1.0), (1.0, 1.0, 1.0, 1.0), # Name
(0.0, 0.6, 0.6, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Disable FG BG
((0.3, 0.3, 0.7, 1.0), (0.8, 0.8, 0.8, 0.5), # Name
(0.0, 0.0, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
),
CCNoChat: (
# Normal FG BG
((0.8, 0.4, 0.0, 1.0), (0.8, 0.8, 0.8, 0.5), # Name
(0.0, 0.0, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Click FG BG
((1.0, 0.5, 0.5, 1.0), (0.2, 0.2, 0.2, 0.6), # Name
(1.0, 0.5, 0.5, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Hover FG BG
((1.0, 0.5, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0), # Name
(0.0, 0.6, 0.6, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Disable FG BG
((0.8, 0.4, 0.0, 1.0), (0.8, 0.8, 0.8, 0.5), # Name
(0.0, 0.0, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
),
CCNonPlayer: (
# Normal FG BG
((0.8, 0.4, 0.0, 1.0), (0.8, 0.8, 0.8, 0.5), # Name
(0.0, 0.0, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Click FG BG
((0.8, 0.4, 0.0, 1.0), (0.8, 0.8, 0.8, 0.5), # Name
(0.0, 0.0, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Hover FG BG
((0.8, 0.4, 0.0, 1.0), (0.8, 0.8, 0.8, 0.5), # Name
(0.0, 0.0, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Disable FG BG
((0.8, 0.4, 0.0, 1.0), (0.8, 0.8, 0.8, 0.5), # Name
(0.0, 0.0, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
),
CCSuit: (
# Normal FG BG
((0.2, 0.2, 0.2, 1.0), (0.8, 0.8, 0.8, 0.5), # Name
(0.0, 0.0, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Click FG BG
((0.2, 0.2, 0.2, 1.0), (0.2, 0.2, 0.2, 0.6), # Name
(1.0, 0.5, 0.5, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Hover FG BG
((0.4, 0.4, 0.4, 1.0), (1.0, 1.0, 1.0, 0.7), # Name
(0.0, 0.6, 0.6, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Disable FG BG
((0.2, 0.2, 0.2, 1.0), (0.8, 0.8, 0.8, 0.5), # Name
(0.0, 0.0, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
),
CCSuitBuilding: (
# Normal FG BG
((0.2, 0.2, 0.2, 1.0), (0.8, 0.8, 0.8, 0.5), # Name
(0.0, 0.0, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Click FG BG
((0.2, 0.2, 0.2, 1.0), (0.2, 0.2, 0.2, 0.6), # Name
(1.0, 0.5, 0.5, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Hover FG BG
((0.4, 0.4, 0.4, 1.0), (1.0, 1.0, 1.0, 0.7), # Name
(0.0, 0.6, 0.6, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Disable FG BG
((0.2, 0.2, 0.5, 1.0), (0.8, 0.8, 0.8, 0.5), # Name
(0.0, 0.0, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
),
CCToonBuilding: (
# Normal FG BG
((0.2, 0.6, 0.9, 1.0), (0.8, 0.8, 0.8, 0.5), # Name
(0.0, 0.0, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Click FG BG
((0.2, 0.6, 0.9, 1.0), (0.8, 0.8, 0.8, 0.5), # Name
(0.0, 0.0, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Hover FG BG
((0.2, 0.6, 0.9, 1.0), (0.8, 0.8, 0.8, 0.5), # Name
(0.0, 0.0, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Disable FG BG
((0.2, 0.6, 0.9, 1.0), (0.8, 0.8, 0.8, 0.5), # Name
(0.0, 0.0, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
),
CCHouseBuilding: (
# Normal FG BG
((0.2, 0.6, 0.9, 1.0), (0.8, 0.8, 0.8, 0.5), # Name
(0.0, 0.0, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Click FG BG
((0.2, 0.2, 0.5, 1.0), (0.2, 0.2, 0.2, 0.6), # Name
(1.0, 0.5, 0.5, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Hover FG BG
((0.5, 0.5, 1.0, 1.0), (1.0, 1.0, 1.0, 1.0), # Name
(0.0, 0.6, 0.6, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Disable FG BG
((0.0, 0.6, 0.2, 1.0), (0.8, 0.8, 0.8, 0.5), # Name
(0.0, 0.0, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
),
CCSpeedChat: (
# Normal FG BG
((0.0, 0.6, 0.2, 1.0), (0.8, 0.8, 0.8, 0.5), # Name
(0.0, 0.0, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Click FG BG
((0.0, 0.5, 0.0, 1.0), (0.5, 0.5, 0.5, 0.6), # Name
(1.0, 0.5, 0.5, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Hover FG BG
((0.0, 0.7, 0.2, 1.0), (1.0, 1.0, 1.0, 0.7), # Name
(0.0, 0.6, 0.6, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Disable FG BG
((0.0, 0.6, 0.2, 1.0), (0.8, 0.8, 0.8, 0.5), # Name
(0.0, 0.0, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
),
CCFreeChat: (
# Normal FG BG
((0.3, 0.3, 0.7, 1.0), (0.8, 0.8, 0.8, 0.5), # Name
(0.0, 0.0, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Click FG BG
((0.2, 0.2, 0.5, 1.0), (0.2, 0.2, 0.2, 0.6), # Name
(1.0, 0.5, 0.5, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Hover FG BG
((0.5, 0.5, 1.0, 1.0), (1.0, 1.0, 1.0, 1.0), # Name
(0.0, 0.6, 0.6, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
# Disable FG BG
((0.3, 0.3, 0.7, 1.0), (0.8, 0.8, 0.8, 0.5), # Name
(0.0, 0.0, 0.0, 1.0), (1.0, 1.0, 1.0, 1.0)), # Chat
),
}
WTNormal = 0
WTQuickTalker = 1
WTSystem = 2
WTBattleSOS = 3
WTEmote = 4
WTToontownBoardingGroup = 5
WHISPER_COLORS = {
WTNormal: (
# Normal FG BG
((0.0, 0.0, 0.0, 1.0), (0.2, 0.6, 0.8, 0.6)),
# Click FG BG
((1.0, 0.5, 0.5, 1.0), (1.0, 1.0, 1.0, 1.0)),
# Hover FG BG
((0.0, 0.0, 0.0, 1.0), (0.2, 0.7, 0.9, 0.6)),
# Disable FG BG
((0.0, 0.0, 0.0, 1.0), (0.2, 0.7, 0.8, 0.6)),
),
WTQuickTalker: (
# Normal FG BG
((0.0, 0.0, 0.0, 1.0), (0.2, 0.6, 0.8, 0.6)),
# Click FG BG
((1.0, 0.5, 0.5, 1.0), (1.0, 1.0, 1.0, 1.0)),
# Hover FG BG
((0.0, 0.0, 0.0, 1.0), (0.2, 0.7, 0.9, 0.6)),
# Disable FG BG
((0.0, 0.0, 0.0, 1.0), (0.2, 0.7, 0.8, 0.6)),
),
WTSystem: (
# Normal FG BG
((0.0, 0.0, 0.0, 1.0), (0.8, 0.3, 0.6, 0.6)),
# Click FG BG
((1.0, 0.5, 0.5, 1.0), (1.0, 1.0, 1.0, 1.0)),
# Hover FG BG
((0.0, 0.0, 0.0, 1.0), (0.8, 0.4, 1.0, 1.0)),
# Disable FG BG
((0.0, 0.0, 0.0, 1.0), (0.8, 0.3, 0.6, 0.6)),
),
# TODO: WTBattleSOS
# TODO: WTEmote
# TODO: WTToontownBoardingGroup
}
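# Each entry in the tables above maps a color code (or whisper type) to four
# (FG, BG) RGBA pairs in Normal/Click/Hover/Disable order, as the comments note.
# A minimal sketch of indexing such a table follows; the `state_colors` helper,
# the state constants, and the `demo` table are illustrative assumptions for
# this dump, not the game engine's actual API.

```python
# Illustrative state ordering matching the "Normal/Click/Hover/Disable" comments.
NORMAL, CLICK, HOVER, DISABLE = range(4)


def state_colors(table, key, state):
    """Return the (fg, bg) RGBA pair for one interaction state.

    `table` has the same shape as WHISPER_COLORS above: each value holds
    four (fg, bg) tuples ordered Normal, Click, Hover, Disable.
    """
    return table[key][state]


# A tiny stand-in table with the same shape as one WHISPER_COLORS entry.
demo = {
    0: (
        ((0.0, 0.0, 0.0, 1.0), (0.2, 0.6, 0.8, 0.6)),  # Normal
        ((1.0, 0.5, 0.5, 1.0), (1.0, 1.0, 1.0, 1.0)),  # Click
        ((0.0, 0.0, 0.0, 1.0), (0.2, 0.7, 0.9, 0.6)),  # Hover
        ((0.0, 0.0, 0.0, 1.0), (0.2, 0.7, 0.8, 0.6)),  # Disable
    ),
}

fg, bg = state_colors(demo, 0, HOVER)
```

# The real lookup code lives elsewhere in the codebase; this only documents
# the layout of the tuples in the tables above.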
# File: src/genie/libs/parser/junos/tests/ShowSystemQueuesNoForwarding/cli/equal/golden_output_1_expected.py
# Repo: balmasea/genieparser (Apache-2.0)

expected_output = {
"queues-statistics": {
"interface-queues-statistics": {
"interface-queue": [
{
"max-octets-allowed": "12500",
"max-packets-allowed": "41",
"name": "lsi",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "0",
"max-packets-allowed": "0",
"name": "dsc",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "0",
"max-packets-allowed": "0",
"name": "lo0",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "12500",
"max-packets-allowed": "41",
"name": "gre",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "12500",
"max-packets-allowed": "41",
"name": "ipip",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "0",
"max-packets-allowed": "0",
"name": "tap",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "12500",
"max-packets-allowed": "41",
"name": "pime",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "12500",
"max-packets-allowed": "41",
"name": "pimd",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "12500000",
"max-packets-allowed": "41666",
"name": "fxp0",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "12500000",
"max-packets-allowed": "41666",
"name": "em1",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "12500",
"max-packets-allowed": "41",
"name": "mtun",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "0",
"max-packets-allowed": "0",
"name": "demux0",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "12500000",
"max-packets-allowed": "41666",
"name": "cbp0",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "12500000",
"max-packets-allowed": "41666",
"name": "pip0",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "125000",
"max-packets-allowed": "416",
"name": "pp0",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "12500000",
"max-packets-allowed": "41666",
"name": "irb",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "12500000",
"max-packets-allowed": "41666",
"name": "vtep",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "12500000",
"max-packets-allowed": "41666",
"name": "esi",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "12500000",
"max-packets-allowed": "41666",
"name": "rbeb",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "0",
"max-packets-allowed": "0",
"name": "fti0",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "0",
"max-packets-allowed": "0",
"name": "fti1",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "0",
"max-packets-allowed": "0",
"name": "fti2",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "0",
"max-packets-allowed": "0",
"name": "fti3",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "0",
"max-packets-allowed": "0",
"name": "fti4",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "0",
"max-packets-allowed": "0",
"name": "fti5",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "0",
"max-packets-allowed": "0",
"name": "fti6",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "0",
"max-packets-allowed": "0",
"name": "fti7",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "12500000",
"max-packets-allowed": "41666",
"name": "jsrv",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "0",
"max-packets-allowed": "0",
"name": "lc-0/0/0",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "0",
"max-packets-allowed": "0",
"name": "pfh-0/0/0",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "0",
"max-packets-allowed": "0",
"name": "pfe-0/0/0",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "1250000",
"max-packets-allowed": "4166",
"name": "ge-0/0/0",
"number-of-queue-drops": "3",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "1250000",
"max-packets-allowed": "4166",
"name": "ge-0/0/1",
"number-of-queue-drops": "3",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "1250000",
"max-packets-allowed": "4166",
"name": "ge-0/0/2",
"number-of-queue-drops": "132",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "1250000",
"max-packets-allowed": "4166",
"name": "ge-0/0/3",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "1250000",
"max-packets-allowed": "4166",
"name": "ge-0/0/4",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "1250000",
"max-packets-allowed": "4166",
"name": "ge-0/0/5",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "1250000",
"max-packets-allowed": "4166",
"name": "ge-0/0/6",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "1250000",
"max-packets-allowed": "4166",
"name": "ge-0/0/7",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "1250000",
"max-packets-allowed": "4166",
"name": "ge-0/0/8",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "1250000",
"max-packets-allowed": "4166",
"name": "ge-0/0/9",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
]
},
"protocol-queues-statistics": {
"protocol-queue": [
{
"max-octets-allowed": "1000000",
"max-packets-allowed": "1000",
"name": "splfwdq",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "1000000",
"max-packets-allowed": "1000",
"name": "splnetq",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "1000000",
"max-packets-allowed": "1000",
"name": "optionq",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "50000",
"max-packets-allowed": "50",
"name": "icmpq",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "0",
"max-packets-allowed": "0",
"name": "frlmiq",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "25000",
"max-packets-allowed": "1000",
"name": "spppintrq",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "0",
"max-packets-allowed": "0",
"name": "atmctlpktq",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "0",
"max-packets-allowed": "0",
"name": "atmoamq",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "1250000",
"max-packets-allowed": "4166",
"name": "tnpintrq",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "200000",
"max-packets-allowed": "200",
"name": "tagintrq",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
{
"max-octets-allowed": "200000",
"max-packets-allowed": "200",
"name": "tagfragq",
"number-of-queue-drops": "0",
"octets-in-queue": "0",
"packets-in-queue": "0",
},
]
},
}
}
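# The fixture above is a nested mapping whose leaf counters are all strings, as
# the Junos parser emits them. A small sketch of walking such a structure — the
# `interfaces_with_drops` helper is an assumption for illustration, not part of
# genieparser.

```python
def interfaces_with_drops(parsed):
    """Return {name: drops} for interface queues with a nonzero drop count.

    `parsed` has the same shape as `expected_output` above; counters are
    stored as strings, so they are converted with int() before comparing.
    """
    queues = parsed["queues-statistics"]["interface-queues-statistics"]["interface-queue"]
    return {q["name"]: int(q["number-of-queue-drops"])
            for q in queues
            if int(q["number-of-queue-drops"]) != 0}


# Miniature input with the same nesting as the fixture.
sample = {
    "queues-statistics": {
        "interface-queues-statistics": {
            "interface-queue": [
                {"name": "ge-0/0/0", "number-of-queue-drops": "3"},
                {"name": "lo0", "number-of-queue-drops": "0"},
                {"name": "ge-0/0/2", "number-of-queue-drops": "132"},
            ]
        }
    }
}

drops = interfaces_with_drops(sample)
```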
# File: tle/util/db/__init__.py
# Repo: Bigo-s-cult/TLE (MIT)

from .cache_db_conn import *
from .user_db_conn import *

# File: sources/ir_datasets_extensions/__init__.py
# Repo: giguru/datasets_for_haystack (MIT)

from . import canard
from . import orconvqa

# File: ukb/models/frame/__init__.py
# Repo: wi905252/ukb-cardiac-mri (Apache-2.0)

from .fnn import *
from .vgg import *
from .lenet import *
from .densenet import *
from .densenet_pretrained import *
from .densenet_av import *

# File: tests/module/checks/test_python.py
# Repo: chutz/pkgcheck (BSD-3-Clause)

from pkgcheck.checks import python
from .. import misc
class TestPythonCheck(misc.ReportTestCase):
check = python.PythonCheck(None)
check_kls = python.PythonCheck
def mk_pkg(self, cpv="app-foo/bar-1", **kwargs):
kwargs.setdefault('EAPI', '7')
return misc.FakePkg(cpv, data=kwargs)
def test_multiple_eclasses(self):
r = self.assertReport(
self.check,
self.mk_pkg(_eclasses_=['python-any-r1', 'python-single-r1'],
DEPEND='dev-lang/python'))
assert isinstance(r, python.PythonEclassError)
def test_missing_eclass_depend(self):
self.assertNoReport(
self.check,
self.mk_pkg(_eclasses_=['python-any-r1'], DEPEND='dev-lang/python'))
self.assertNoReport(self.check, self.mk_pkg(DEPEND='dev-foo/frobnicate'))
r = self.assertReport(self.check, self.mk_pkg(DEPEND='dev-lang/python'))
assert isinstance(r, python.MissingPythonEclass)
assert 'missing python-any-r1 eclass usage for DEPEND="dev-lang/python"' == str(r)
assert isinstance(
self.assertReport(self.check, self.mk_pkg(DEPEND='dev-lang/python:2.7')),
python.MissingPythonEclass)
assert isinstance(
self.assertReport(self.check, self.mk_pkg(DEPEND='dev-lang/python:*')),
python.MissingPythonEclass)
assert isinstance(
self.assertReport(self.check, self.mk_pkg(DEPEND='=dev-lang/python-2*')),
python.MissingPythonEclass)
assert isinstance(
self.assertReport(
self.check,
self.mk_pkg(DEPEND='|| ( dev-lang/python:2.7 dev-lang/python:3.6 )')),
python.MissingPythonEclass)
assert isinstance(
self.assertReport(self.check, self.mk_pkg(DEPEND='dev-python/pypy')),
python.MissingPythonEclass)
def test_missing_eclass_bdepend(self):
self.assertNoReport(
self.check,
self.mk_pkg(_eclasses_=['python-any-r1'], BDEPEND='dev-lang/python'))
self.assertNoReport(self.check, self.mk_pkg(BDEPEND='dev-foo/frobnicate'))
assert isinstance(
self.assertReport(self.check, self.mk_pkg(BDEPEND='dev-lang/python')),
python.MissingPythonEclass)
assert isinstance(
self.assertReport(self.check, self.mk_pkg(BDEPEND='dev-lang/python:2.7')),
python.MissingPythonEclass)
assert isinstance(
self.assertReport(self.check, self.mk_pkg(BDEPEND='dev-lang/python:*')),
python.MissingPythonEclass)
assert isinstance(
self.assertReport(self.check, self.mk_pkg(BDEPEND='=dev-lang/python-2*')),
python.MissingPythonEclass)
assert isinstance(
self.assertReport(
self.check,
self.mk_pkg(BDEPEND='|| ( dev-lang/python:2.7 dev-lang/python:3.6 )')),
python.MissingPythonEclass)
assert isinstance(
self.assertReport(self.check, self.mk_pkg(BDEPEND='dev-python/pypy')),
python.MissingPythonEclass)
def test_missing_eclass_rdepend(self):
self.assertNoReport(
self.check,
self.mk_pkg(_eclasses_=['python-r1'], RDEPEND='dev-lang/python:2.7'))
self.assertNoReport(
self.check,
self.mk_pkg(_eclasses_=['python-single-r1'], RDEPEND='dev-lang/python:2.7'))
self.assertNoReport(self.check, self.mk_pkg(RDEPEND='dev-foo/frobnicate'))
r = self.assertReport(self.check, self.mk_pkg(RDEPEND='dev-lang/python'))
assert isinstance(r, python.MissingPythonEclass)
assert 'missing python-r1 or python-single-r1 eclass' in str(r)
assert isinstance(
self.assertReport(self.check, self.mk_pkg(RDEPEND='dev-lang/python:2.7')),
python.MissingPythonEclass)
assert isinstance(
self.assertReport(self.check, self.mk_pkg(RDEPEND='dev-lang/python:=')),
python.MissingPythonEclass)
assert isinstance(
self.assertReport(self.check, self.mk_pkg(RDEPEND='=dev-lang/python-2*')),
python.MissingPythonEclass)
assert isinstance(
self.assertReport(
self.check,
self.mk_pkg(RDEPEND='|| ( dev-lang/python:2.7 dev-lang/python:3.6 )')),
python.MissingPythonEclass)
assert isinstance(
self.assertReport(self.check, self.mk_pkg(RDEPEND='dev-python/pypy')),
python.MissingPythonEclass)
# special exception: virtual/pypy
self.assertNoReport(self.check, self.mk_pkg(cpv='virtual/pypy-4.1',
RDEPEND='|| ( dev-python/pypy:0/41 dev-python/pypy-bin:0/41 )'))
self.assertNoReport(self.check, self.mk_pkg(cpv='virtual/pypy3-4.1',
RDEPEND='|| ( dev-python/pypy3:0/41 dev-python/pypy3-bin:0/41 )'))
def test_missing_eclass_pdepend(self):
self.assertNoReport(
self.check,
self.mk_pkg(_eclasses_=['python-r1'], PDEPEND='dev-lang/python:2.7'))
self.assertNoReport(
self.check,
self.mk_pkg(_eclasses_=['python-single-r1'], PDEPEND='dev-lang/python:2.7'))
self.assertNoReport(self.check, self.mk_pkg(PDEPEND='dev-foo/frobnicate'))
assert isinstance(
self.assertReport(self.check, self.mk_pkg(PDEPEND='dev-lang/python')),
python.MissingPythonEclass)
assert isinstance(
self.assertReport(self.check, self.mk_pkg(PDEPEND='dev-lang/python:2.7')),
python.MissingPythonEclass)
assert isinstance(
self.assertReport(self.check, self.mk_pkg(PDEPEND='dev-lang/python:=')),
python.MissingPythonEclass)
assert isinstance(
self.assertReport(self.check, self.mk_pkg(PDEPEND='=dev-lang/python-2*')),
python.MissingPythonEclass)
assert isinstance(
self.assertReport(
self.check,
self.mk_pkg(PDEPEND='|| ( dev-lang/python:2.7 dev-lang/python:3.6 )')),
python.MissingPythonEclass)
assert isinstance(
self.assertReport(self.check, self.mk_pkg(PDEPEND='dev-python/pypy')),
python.MissingPythonEclass)
def test_valid_packages(self):
self.assertNoReport(
self.check,
self.mk_pkg(
_eclasses_=['python-r1'],
IUSE='python_targets_python2_7 '
'python_targets_python3_6',
RDEPEND='python_targets_python2_7? ( '
' dev-lang/python:2.7 ) '
'python_targets_python3_6? ( '
' dev-lang/python:3.6 )',
REQUIRED_USE='|| ( python_targets_python2_7 '
' python_targets_python3_6 )'))
# python-single-r1 with one implementation does not use PST
self.assertNoReport(
self.check,
self.mk_pkg(_eclasses_=['python-single-r1'],
IUSE='python_targets_python2_7',
RDEPEND='python_targets_python2_7? ( '
' dev-lang/python:2.7 )',
REQUIRED_USE='python_targets_python2_7'))
self.assertNoReport(
self.check,
self.mk_pkg(
_eclasses_=['python-single-r1'],
IUSE='python_targets_python2_7 '
'python_targets_python3_6 '
'python_single_target_python2_7 '
'python_single_target_python3_6',
RDEPEND='python_single_target_python2_7? ( '
' dev-lang/python:2.7 ) '
'python_single_target_python3_6? ( '
' dev-lang/python:3.6 )',
REQUIRED_USE='^^ ( python_single_target_python2_7 '
' python_single_target_python3_6 ) '
'python_single_target_python2_7? ( '
' python_targets_python2_7 ) '
'python_single_target_python3_6? ( '
' python_targets_python3_6 )'))
self.assertNoReport(
self.check,
self.mk_pkg(_eclasses_=['python-any-r1'],
DEPEND='|| ( '
' dev-lang/python:2.7 '
' dev-lang/python:3.6 )'))
self.assertNoReport(
self.check,
self.mk_pkg(_eclasses_=['python-any-r1'], DEPEND='dev-lang/python:2.7'))
self.assertNoReport(
self.check,
self.mk_pkg(_eclasses_=['python-any-r1'],
BDEPEND='|| ( '
' dev-lang/python:2.7 '
' dev-lang/python:3.6 )'))
def test_single_use_mismatch(self):
r = self.assertReport(
self.check,
self.mk_pkg(
_eclasses_=['python-single-r1'],
IUSE='python_targets_python2_7 '
'python_targets_python3_6 '
'python_single_target_python2_7',
RDEPEND='python_single_target_python2_7? ( '
' dev-lang/python:2.7 )',
REQUIRED_USE='python_single_target_python2_7 '
'python_single_target_python2_7? ( '
' python_targets_python2_7 )'))
assert isinstance(r, python.PythonSingleUseMismatch)
assert 'mismatched flags in IUSE: PYTHON_TARGETS=( python2_7 python3_6 )' in str(r)
r = self.assertReport(
self.check,
self.mk_pkg(
_eclasses_=['python-single-r1'],
IUSE='python_targets_python2_7 '
'python_single_target_python2_7 '
'python_single_target_python3_6',
RDEPEND='python_single_target_python2_7? ( '
' dev-lang/python:2.7 ) '
'python_single_target_python3_6? ( '
' dev-lang/python:3.6 )',
REQUIRED_USE='^^ ( python_single_target_python2_7 '
' python_single_target_python3_6 )'))
assert isinstance(r, python.PythonSingleUseMismatch)
assert 'mismatched flags in IUSE: PYTHON_TARGETS=( python2_7 )' in str(r)
def test_missing_required_use(self):
r = self.assertReport(
self.check,
self.mk_pkg(
_eclasses_=['python-r1'],
IUSE='python_targets_python2_7 '
'python_targets_python3_6',
RDEPEND='python_targets_python2_7? ( '
' dev-lang/python:2.7 ) '
'python_targets_python3_6? ( '
' dev-lang/python:3.6 )'))
assert isinstance(r, python.PythonMissingRequiredUse)
assert 'missing REQUIRED_USE="${PYTHON_REQUIRED_USE}"' == str(r)
# incomplete REQUIRED_USE (e.g. use of python_gen_useflags)
assert isinstance(
self.assertReport(
self.check,
self.mk_pkg(
_eclasses_=['python-r1'],
IUSE='python_targets_python2_7 '
'python_targets_python3_6',
RDEPEND='python_targets_python2_7? ( '
' dev-lang/python:2.7 ) '
'python_targets_python3_6? ( '
' dev-lang/python:3.6 )',
REQUIRED_USE='|| ( python_targets_python2_7 )')),
python.PythonMissingRequiredUse)
assert isinstance(
self.assertReport(
self.check,
self.mk_pkg(
_eclasses_=['python-r1'],
IUSE='python_targets_python2_7 '
'python_targets_python3_6 '
'python_targets_python3_7',
RDEPEND='python_targets_python2_7? ( '
' dev-lang/python:2.7 ) '
'python_targets_python3_6? ( '
' dev-lang/python:3.6 ) '
'python_targets_python3_7? ( '
' dev-lang/python:3.7 )',
REQUIRED_USE='|| ( python_targets_python3_6 '
' python_targets_python3_7 )')),
python.PythonMissingRequiredUse)
assert isinstance(
self.assertReport(
self.check,
self.mk_pkg(
_eclasses_=['python-single-r1'],
IUSE='python_targets_python2_7 '
'python_targets_python3_6 '
'python_single_target_python2_7 '
'python_single_target_python3_6',
RDEPEND='python_single_target_python2_7? ( '
' dev-lang/python:2.7 ) '
'python_single_target_python3_6? ( '
' dev-lang/python:3.6 )')),
python.PythonMissingRequiredUse)
# incomplete REQUIRED_USE
assert isinstance(
self.assertReport(
self.check,
self.mk_pkg(
_eclasses_=['python-single-r1'],
IUSE='python_targets_python2_7 '
'python_targets_python3_6 '
'python_single_target_python2_7 '
'python_single_target_python3_6',
RDEPEND='python_single_target_python2_7? ( '
' dev-lang/python:2.7 ) '
'python_single_target_python3_6? ( '
' dev-lang/python:3.6 )',
REQUIRED_USE='^^ ( python_single_target_python2_7 )')),
python.PythonMissingRequiredUse)
# || instead of ^^ in python-single-r1
assert isinstance(
self.assertReport(
self.check,
self.mk_pkg(
_eclasses_=['python-single-r1'],
IUSE='python_targets_python2_7 '
'python_targets_python3_6 '
'python_single_target_python2_7 '
'python_single_target_python3_6',
RDEPEND='python_single_target_python2_7? ( '
' dev-lang/python:2.7 ) '
'python_single_target_python3_6? ( '
' dev-lang/python:3.6 )',
REQUIRED_USE='|| ( python_targets_python2_7 '
' python_targets_python3_6 )')),
python.PythonMissingRequiredUse)
def test_missing_deps(self):
r = self.assertReport(
self.check,
self.mk_pkg(
_eclasses_=['python-r1'],
IUSE='python_targets_python2_7 '
'python_targets_python3_6',
REQUIRED_USE='|| ( python_targets_python2_7 '
' python_targets_python3_6 )'))
assert isinstance(r, python.PythonMissingDeps)
assert 'missing RDEPEND="${PYTHON_DEPS}"' == str(r)
# incomplete deps
r = self.assertReport(
self.check,
self.mk_pkg(
_eclasses_=['python-r1'],
IUSE='python_targets_python2_7 '
'python_targets_python3_6',
RDEPEND='python_targets_python2_7? ( '
' dev-lang/python:2.7 )',
REQUIRED_USE='|| ( python_targets_python2_7 '
' python_targets_python3_6 )'))
assert isinstance(r, python.PythonMissingDeps)
assert 'missing RDEPEND="${PYTHON_DEPS}"' == str(r)
# check that irrelevant dep with same USE conditional does not wrongly
# satisfy the check
assert isinstance(
self.assertReport(
self.check,
self.mk_pkg(
_eclasses_=['python-r1'],
IUSE='python_targets_python2_7 '
'python_targets_python3_6',
RDEPEND='python_targets_python2_7? ( '
' dev-foo/bar ) '
'python_targets_python3_6? ( '
' dev-lang/python:3.6 )',
REQUIRED_USE='|| ( python_targets_python2_7 '
' python_targets_python3_6 )')),
python.PythonMissingDeps)
# DEPEND only, RDEPEND missing
assert isinstance(
self.assertReport(
self.check,
self.mk_pkg(
_eclasses_=['python-r1'],
IUSE='python_targets_python2_7 '
'python_targets_python3_6',
DEPEND='python_targets_python2_7? ( '
' dev-lang/python:2.7 ) '
'python_targets_python3_6? ( '
' dev-lang/python:3.6 )',
REQUIRED_USE='|| ( python_targets_python2_7 '
' python_targets_python3_6 )')),
python.PythonMissingDeps)
assert isinstance(
self.assertReport(
self.check,
self.mk_pkg(
_eclasses_=['python-single-r1'],
IUSE='python_targets_python2_7 '
'python_targets_python3_6 '
'python_single_target_python2_7 '
'python_single_target_python3_6',
REQUIRED_USE='^^ ( python_single_target_python2_7 '
' python_single_target_python3_6 ) '
'python_single_target_python2_7? ( '
' python_targets_python2_7 ) '
'python_single_target_python3_6? ( '
' python_targets_python3_6 )')),
python.PythonMissingDeps)
# incomplete deps
assert isinstance(
self.assertReport(
self.check,
self.mk_pkg(
_eclasses_=['python-single-r1'],
IUSE='python_targets_python2_7 '
'python_targets_python3_6 '
'python_single_target_python2_7 '
'python_single_target_python3_6',
RDEPEND='python_single_target_python2_7? ( '
' dev-lang/python:2.7 )',
REQUIRED_USE='^^ ( python_single_target_python2_7 '
' python_single_target_python3_6 ) '
'python_single_target_python2_7? ( '
' python_targets_python2_7 ) '
'python_single_target_python3_6? ( '
' python_targets_python3_6 )')),
python.PythonMissingDeps)
# check that irrelevant dep with same USE conditional does not wrongly
# satisfy the check
assert isinstance(
self.assertReport(
self.check,
self.mk_pkg(
_eclasses_=['python-single-r1'],
IUSE='python_targets_python2_7 '
'python_targets_python3_6 '
'python_single_target_python2_7 '
'python_single_target_python3_6',
RDEPEND='python_single_target_python2_7? ( '
' dev-foo/bar ) '
'python_single_target_python3_6? ( '
' dev-lang/python:3.6 )',
REQUIRED_USE='^^ ( python_single_target_python2_7 '
' python_single_target_python3_6 ) '
'python_single_target_python2_7? ( '
' python_targets_python2_7 ) '
'python_single_target_python3_6? ( '
' python_targets_python3_6 )')),
python.PythonMissingDeps)
# DEPEND only, RDEPEND missing
assert isinstance(
self.assertReport(
self.check,
self.mk_pkg(
_eclasses_=['python-single-r1'],
IUSE='python_targets_python2_7 '
'python_targets_python3_6 '
'python_single_target_python2_7 '
'python_single_target_python3_6',
DEPEND='python_single_target_python2_7? ( '
' dev-lang/python:2.7 ) '
'python_single_target_python3_6? ( '
' dev-lang/python:3.6 )',
REQUIRED_USE='^^ ( python_single_target_python2_7 '
' python_single_target_python3_6 ) '
'python_single_target_python2_7? ( '
' python_targets_python2_7 ) '
'python_single_target_python3_6? ( '
' python_targets_python3_6 )')),
python.PythonMissingDeps)
# check that the check isn't wrongly satisfied by PYTHON_TARGETS
# in python-single-r1 (PYTHON_SINGLE_TARGET expected)
assert isinstance(
self.assertReport(
self.check,
self.mk_pkg(
_eclasses_=['python-single-r1'],
IUSE='python_targets_python2_7 '
'python_targets_python3_6 '
'python_single_target_python2_7 '
'python_single_target_python3_6',
RDEPEND='python_targets_python2_7? ( '
' dev-lang/python:2.7 ) '
'python_targets_python3_6? ( '
' dev-lang/python:3.6 )',
REQUIRED_USE='^^ ( python_single_target_python2_7 '
' python_single_target_python3_6 ) '
'python_single_target_python2_7? ( '
' python_targets_python2_7 ) '
'python_single_target_python3_6? ( '
' python_targets_python3_6 )')),
python.PythonMissingDeps)
assert isinstance(
self.assertReport(self.check, self.mk_pkg(_eclasses_=['python-any-r1'])),
python.PythonMissingDeps)
def test_runtime_dep_in_any_r1(self):
r = self.assertReport(
self.check,
self.mk_pkg(
_eclasses_=['python-any-r1'],
DEPEND='|| ( '
' dev-lang/python:2.7 '
' dev-lang/python:3.6 )',
RDEPEND='|| ( '
' dev-lang/python:2.7 '
' dev-lang/python:3.6 )'))
assert isinstance(r, python.PythonRuntimeDepInAnyR1)
assert 'inherits python-any-r1 with RDEPEND="dev-lang/python:2.7"' in str(r)
# shouldn't trigger for blockers
self.assertNoReport(
self.check,
self.mk_pkg(
_eclasses_=['python-any-r1'],
DEPEND='dev-lang/python:2.7',
RDEPEND='!dev-python/pypy3-bin:0'))
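The fragment ends with `test_runtime_dep_in_any_r1`, which pins down the rule under test: a `python-any-r1` ebuild needs Python only at build time, so a non-blocker `dev-lang/python` atom in RDEPEND is reported, while blocker atoms (`!...`) are ignored. A toy sketch of that rule follows — illustrative only, not pkgcheck's implementation; the function name and flat atom-list input are invented here:

```python
# Toy illustration (not pkgcheck's real check): flag runtime Python deps
# in python-any-r1 ebuilds, ignoring blocker atoms ("!...").
def runtime_dep_in_any_r1(eclasses, rdepend_atoms):
    """Return the offending runtime dev-lang/python atoms, skipping blockers."""
    if 'python-any-r1' not in eclasses:
        return []
    return [atom for atom in rdepend_atoms
            if atom.startswith('dev-lang/python') and not atom.startswith('!')]
```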
| 46.034682 | 91 | 0.514733 | 2,235 | 23,892 | 5.182998 | 0.052349 | 0.098757 | 0.083046 | 0.081578 | 0.905991 | 0.893042 | 0.890107 | 0.884151 | 0.874741 | 0.850829 | 0 | 0.03378 | 0.387912 | 23,892 | 518 | 92 | 46.123552 | 0.758342 | 0.025866 | 0 | 0.80303 | 0 | 0.012987 | 0.310745 | 0.184762 | 0 | 0 | 0 | 0 | 0.248918 | 1 | 0.02381 | false | 0 | 0.004329 | 0 | 0.036797 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
4df4bc07ffb2e17109c5846c875b3cc0e1112fbb | 25,985 | py | Python | lib/rucio/tests/test_throttler.py | elichad/rucio | 0705e4124e18c853c4661f85e00238cb0c6bd750 | [
"Apache-2.0"
] | null | null | null | lib/rucio/tests/test_throttler.py | elichad/rucio | 0705e4124e18c853c4661f85e00238cb0c6bd750 | [
"Apache-2.0"
] | null | null | null | lib/rucio/tests/test_throttler.py | elichad/rucio | 0705e4124e18c853c4661f85e00238cb0c6bd750 | [
"Apache-2.0"
] | null | null | null | # Copyright 2018 CERN for the benefit of the ATLAS collaboration.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# Authors:
# - Hannes Hansen <hannes.jakob.hansen@cern.ch>, 2019
#
# PY3K COMPATIBLE
from datetime import datetime
from nose.tools import assert_equal
from rucio.common.types import InternalAccount, InternalScope
from rucio.common.utils import generate_uuid
from rucio.core.config import set, remove_option
from rucio.core.did import attach_dids, add_did
from rucio.core.replica import add_replica
from rucio.core.request import queue_requests, get_request_by_did
from rucio.core.rse import get_rse_id, set_rse_transfer_limits
from rucio.daemons.conveyor import throttler
from rucio.db.sqla import session, models, constants
class TestThrottlerGroupedFIFO(object):
@classmethod
def setUpClass(cls):
cls.db_session = session.get_session()
cls.dialect = cls.db_session.bind.dialect.name
cls.dest_rse = 'MOCK'
cls.source_rse = 'MOCK4'
cls.dest_rse_id = get_rse_id(cls.dest_rse)
cls.source_rse_id = get_rse_id(cls.source_rse)
cls.scope = InternalScope('mock')
cls.account = InternalAccount('root')
cls.user_activity = 'User Subscription'
cls.all_activities = 'all_activities'
set('throttler_release_strategy', 'dest_%s' % cls.dest_rse_id, 'grouped_fifo')
def setUp(self):
self.db_session.query(models.Request).delete()
remove_option('throttler', '%s,%s' % (self.user_activity, self.dest_rse), session=self.db_session)
remove_option('throttler', '%s,%s' % (self.all_activities, self.dest_rse), session=self.db_session)
self.db_session.commit()
@classmethod
def tearDownClass(cls):
cls.db_session.commit()
cls.db_session.close()
def test_throttler_grouped_fifo_all(self):
""" THROTTLER (CLIENTS): throttler release all waiting requests (grouped fifo). """
if self.dialect == 'mysql':
return True
# no threshold -> release all waiting requests
name1 = generate_uuid()
name2 = generate_uuid()
name3 = generate_uuid()
name4 = generate_uuid()
dataset_1_name = generate_uuid()
add_did(self.scope, dataset_1_name, constants.DIDType.DATASET, self.account, session=self.db_session)
add_replica(self.source_rse_id, self.scope, name1, 1, self.account, session=self.db_session)
add_replica(self.source_rse_id, self.scope, name2, 1, self.account, session=self.db_session)
add_replica(self.source_rse_id, self.scope, name3, 1, self.account, session=self.db_session)
add_replica(self.source_rse_id, self.scope, name4, 1, self.account, session=self.db_session)
attach_dids(self.scope, dataset_1_name, [{'name': name1, 'scope': self.scope}], self.account, session=self.db_session)
attach_dids(self.scope, dataset_1_name, [{'name': name2, 'scope': self.scope}], self.account, session=self.db_session)
requests = [{
'source_rse_id': self.source_rse_id,
'dest_rse_id': self.dest_rse_id,
'request_type': constants.RequestType.TRANSFER,
'request_id': generate_uuid(),
'name': name1,
'bytes': 1,
'scope': self.scope,
'retry_count': 1,
'rule_id': generate_uuid(),
'requested_at': datetime.now().replace(year=2000),
'attributes': {
'activity': self.user_activity,
'bytes': 1,
'md5': '',
'adler32': ''
}
}, {
'source_rse_id': self.source_rse_id,
'dest_rse_id': self.dest_rse_id,
'request_type': constants.RequestType.TRANSFER,
'request_id': generate_uuid(),
'name': name2,
'bytes': 2,
'requested_at': datetime.now().replace(year=2020),
'rule_id': generate_uuid(),
'scope': self.scope,
'retry_count': 1,
'attributes': {
'activity': self.user_activity,
'bytes': 2,
'md5': '',
'adler32': ''
}
}, {
'source_rse_id': self.source_rse_id,
'dest_rse_id': self.dest_rse_id,
'request_type': constants.RequestType.TRANSFER,
'request_id': generate_uuid(),
'name': name3,
'bytes': 3,
'requested_at': datetime.now().replace(year=2021), # requested after the request below but small enough for max_volume check
'rule_id': generate_uuid(),
'scope': self.scope,
'retry_count': 1,
'attributes': {
'activity': self.user_activity,
'bytes': 3,
'md5': '',
'adler32': ''
}
}, {
'source_rse_id': self.source_rse_id,
'dest_rse_id': self.dest_rse_id,
'request_type': constants.RequestType.TRANSFER,
'request_id': generate_uuid(),
'name': name4,
'bytes': 3000,
'requested_at': datetime.now().replace(year=2020),
'rule_id': generate_uuid(),
'scope': self.scope,
'retry_count': 1,
'attributes': {
'activity': self.user_activity,
'bytes': 3000,
'md5': '',
'adler32': ''
}
}]
queue_requests(requests, session=self.db_session)
self.db_session.commit()
throttler.run(once=True, sleep_time=1)
request_1 = get_request_by_did(self.scope, name1, self.dest_rse_id)
assert_equal(request_1['state'], constants.RequestState.QUEUED)
request_2 = get_request_by_did(self.scope, name2, self.dest_rse_id)
assert_equal(request_2['state'], constants.RequestState.QUEUED)
request_3 = get_request_by_did(self.scope, name3, self.dest_rse_id)
assert_equal(request_3['state'], constants.RequestState.QUEUED)
request_4 = get_request_by_did(self.scope, name4, self.dest_rse_id)
assert_equal(request_4['state'], constants.RequestState.QUEUED)
def test_throttler_grouped_fifo_nothing(self):
""" THROTTLER (CLIENTS): throttler release nothing (grouped fifo). """
if self.dialect == 'mysql':
return True
        # four waiting requests and one active request but threshold is 1
        # more than 80% of the transfer limit is already used -> release nothing
set('throttler', '%s,%s' % (self.all_activities, self.dest_rse), 1, session=self.db_session)
request = models.Request(dest_rse_id=self.dest_rse_id, bytes=2, activity=self.user_activity, state=constants.RequestState.SUBMITTED)
request.save(session=self.db_session)
name1 = generate_uuid()
name2 = generate_uuid()
name3 = generate_uuid()
name4 = generate_uuid()
dataset_1_name = generate_uuid()
add_did(self.scope, dataset_1_name, constants.DIDType.DATASET, self.account, session=self.db_session)
add_replica(self.source_rse_id, self.scope, name1, 1, self.account, session=self.db_session)
add_replica(self.source_rse_id, self.scope, name2, 1, self.account, session=self.db_session)
add_replica(self.source_rse_id, self.scope, name3, 1, self.account, session=self.db_session)
add_replica(self.source_rse_id, self.scope, name4, 1, self.account, session=self.db_session)
attach_dids(self.scope, dataset_1_name, [{'name': name1, 'scope': self.scope}], self.account, session=self.db_session)
attach_dids(self.scope, dataset_1_name, [{'name': name2, 'scope': self.scope}], self.account, session=self.db_session)
requests = [{
'source_rse_id': self.source_rse_id,
'dest_rse_id': self.dest_rse_id,
'request_type': constants.RequestType.TRANSFER,
'request_id': generate_uuid(),
'name': name1,
'bytes': 1,
'scope': self.scope,
'retry_count': 1,
'rule_id': generate_uuid(),
'requested_at': datetime.now().replace(year=2000),
'attributes': {
'activity': self.user_activity,
'bytes': 1,
'md5': '',
'adler32': ''
}
}, {
'source_rse_id': self.source_rse_id,
'dest_rse_id': self.dest_rse_id,
'request_type': constants.RequestType.TRANSFER,
'request_id': generate_uuid(),
'name': name2,
'bytes': 2,
'requested_at': datetime.now().replace(year=2020),
'rule_id': generate_uuid(),
'scope': self.scope,
'retry_count': 1,
'attributes': {
'activity': self.user_activity,
'bytes': 2,
'md5': '',
'adler32': ''
}
}, {
'source_rse_id': self.source_rse_id,
'dest_rse_id': self.dest_rse_id,
'request_type': constants.RequestType.TRANSFER,
'request_id': generate_uuid(),
'name': name3,
'bytes': 3,
'requested_at': datetime.now().replace(year=2021), # requested after the request below but small enough for max_volume check
'rule_id': generate_uuid(),
'scope': self.scope,
'retry_count': 1,
'attributes': {
'activity': self.user_activity,
'bytes': 3,
'md5': '',
'adler32': ''
}
}, {
'source_rse_id': self.source_rse_id,
'dest_rse_id': self.dest_rse_id,
'request_type': constants.RequestType.TRANSFER,
'request_id': generate_uuid(),
'name': name4,
'bytes': 3000,
'requested_at': datetime.now().replace(year=2020),
'rule_id': generate_uuid(),
'scope': self.scope,
'retry_count': 1,
'attributes': {
'activity': self.user_activity,
'bytes': 3000,
'md5': '',
'adler32': ''
}
}]
queue_requests(requests, session=self.db_session)
self.db_session.commit()
throttler.run(once=True, sleep_time=1)
request_1 = get_request_by_did(self.scope, name1, self.dest_rse_id)
assert_equal(request_1['state'], constants.RequestState.WAITING)
request_2 = get_request_by_did(self.scope, name2, self.dest_rse_id)
assert_equal(request_2['state'], constants.RequestState.WAITING)
request_3 = get_request_by_did(self.scope, name3, self.dest_rse_id)
assert_equal(request_3['state'], constants.RequestState.WAITING)
request_4 = get_request_by_did(self.scope, name4, self.dest_rse_id)
assert_equal(request_4['state'], constants.RequestState.WAITING)
def test_throttler_grouped_fifo_subset(self):
""" THROTTLER (CLIENTS): throttler release subset of waiting requests (grouped fifo). """
if self.dialect == 'mysql':
return True
set_rse_transfer_limits(self.dest_rse_id, self.all_activities, volume=10, max_transfers=1, session=self.db_session)
set('throttler', '%s,%s' % (self.all_activities, self.dest_rse), 1, session=self.db_session) # threshold used by throttler
name1 = generate_uuid()
name2 = generate_uuid()
name3 = generate_uuid()
name4 = generate_uuid()
dataset_1_name = generate_uuid()
add_did(self.scope, dataset_1_name, constants.DIDType.DATASET, self.account, session=self.db_session)
add_replica(self.source_rse_id, self.scope, name1, 1, self.account, session=self.db_session)
add_replica(self.source_rse_id, self.scope, name2, 1, self.account, session=self.db_session)
add_replica(self.source_rse_id, self.scope, name3, 1, self.account, session=self.db_session)
add_replica(self.source_rse_id, self.scope, name4, 1, self.account, session=self.db_session)
attach_dids(self.scope, dataset_1_name, [{'name': name1, 'scope': self.scope}], self.account, session=self.db_session)
attach_dids(self.scope, dataset_1_name, [{'name': name2, 'scope': self.scope}], self.account, session=self.db_session)
requests = [{
'source_rse_id': self.source_rse_id,
'dest_rse_id': self.dest_rse_id,
'request_type': constants.RequestType.TRANSFER,
'request_id': generate_uuid(),
'name': name1,
'bytes': 1,
'scope': self.scope,
'retry_count': 1,
'rule_id': generate_uuid(),
'requested_at': datetime.now().replace(year=2000),
'attributes': {
'activity': self.user_activity,
'bytes': 1,
'md5': '',
'adler32': ''
}
}, {
'source_rse_id': self.source_rse_id,
'dest_rse_id': self.dest_rse_id,
'request_type': constants.RequestType.TRANSFER,
'request_id': generate_uuid(),
'name': name2,
'bytes': 2,
'requested_at': datetime.now().replace(year=2020),
'rule_id': generate_uuid(),
'scope': self.scope,
'retry_count': 1,
'attributes': {
'activity': self.user_activity,
'bytes': 2,
'md5': '',
'adler32': ''
}
}, {
'source_rse_id': self.source_rse_id,
'dest_rse_id': self.dest_rse_id,
'request_type': constants.RequestType.TRANSFER,
'request_id': generate_uuid(),
'name': name3,
'bytes': 3,
'requested_at': datetime.now().replace(year=2021), # requested after the request below but small enough for max_volume check
'rule_id': generate_uuid(),
'scope': self.scope,
'retry_count': 1,
'attributes': {
'activity': self.user_activity,
'bytes': 3,
'md5': '',
'adler32': ''
}
}, {
'source_rse_id': self.source_rse_id,
'dest_rse_id': self.dest_rse_id,
'request_type': constants.RequestType.TRANSFER,
'request_id': generate_uuid(),
'name': name4,
'bytes': 3000,
'requested_at': datetime.now().replace(year=2020),
'rule_id': generate_uuid(),
'scope': self.scope,
'retry_count': 1,
'attributes': {
'activity': self.user_activity,
'bytes': 3000,
'md5': '',
'adler32': ''
}
}]
queue_requests(requests, session=self.db_session)
self.db_session.commit()
throttler.run(once=True, sleep_time=1)
# released because it got requested first
request_1 = get_request_by_did(self.scope, name1, self.dest_rse_id)
assert_equal(request_1['state'], constants.RequestState.QUEUED)
# released because the DID is attached to the same dataset
request_2 = get_request_by_did(self.scope, name2, self.dest_rse_id)
assert_equal(request_2['state'], constants.RequestState.QUEUED)
# released because of available volume
request_3 = get_request_by_did(self.scope, name3, self.dest_rse_id)
assert_equal(request_3['state'], constants.RequestState.QUEUED)
# still waiting because there is no free volume
request_4 = get_request_by_did(self.scope, name4, self.dest_rse_id)
assert_equal(request_4['state'], constants.RequestState.WAITING)
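The comments on the four assertions above spell out the grouped-FIFO contract: requests are grouped by dataset, groups are served oldest-request-first, and a group is released only while its volume still fits under the configured cap — which is why the newer 3-byte request is released while the older 3000-byte one keeps waiting. A sketch of that ordering, written to mirror these test expectations (it is not rucio's release code, and the flat dict shape is invented for illustration):

```python
def grouped_fifo_release(requests, max_volume):
    """Grouped FIFO as exercised by the test above (illustrative only)."""
    # Group requests by dataset; standalone requests form their own group.
    groups = {}
    for req in requests:
        groups.setdefault(req['dataset'], []).append(req)
    # Serve groups oldest-first, keyed by each group's oldest request.
    ordered = sorted(groups.values(),
                     key=lambda grp: min(r['requested_at'] for r in grp))
    released, used = [], 0
    for group in ordered:
        volume = sum(r['bytes'] for r in group)
        if used + volume > max_volume:
            continue  # group does not fit; smaller, newer groups may still fit
        used += volume
        released.extend(sorted(group, key=lambda r: r['requested_at']))
    return released
```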
class TestThrottlerFIFO(object):
@classmethod
def setUpClass(cls):
cls.db_session = session.get_session()
cls.dest_rse = 'MOCK'
cls.source_rse = 'MOCK4'
cls.dest_rse_id = get_rse_id(cls.dest_rse)
cls.source_rse_id = get_rse_id(cls.source_rse)
cls.scope = InternalScope('mock')
cls.account = InternalAccount('root')
cls.user_activity = 'User Subscription'
cls.all_activities = 'all_activities'
set_rse_transfer_limits(cls.dest_rse_id, cls.user_activity, max_transfers=1, session=cls.db_session)
set('throttler_release_strategy', 'dest_%s' % cls.dest_rse_id, 'fifo', session=cls.db_session)
def setUp(self):
self.db_session.query(models.Request).delete()
remove_option('throttler', '%s,%s' % (self.user_activity, self.dest_rse), session=self.db_session)
remove_option('throttler', '%s,%s' % (self.all_activities, self.dest_rse), session=self.db_session)
self.db_session.commit()
@classmethod
def tearDownClass(cls):
cls.db_session.commit()
cls.db_session.close()
def test_throttler_fifo_release_all(self):
""" THROTTLER (CLIENTS): throttler release all waiting requests (fifo). """
# no threshold -> release all waiting requests
name1 = generate_uuid()
name2 = generate_uuid()
add_replica(self.source_rse_id, self.scope, name1, 1, self.account, session=self.db_session)
add_replica(self.source_rse_id, self.scope, name2, 1, self.account, session=self.db_session)
requests = [{
'dest_rse_id': self.dest_rse_id,
'source_rse_id': self.source_rse_id,
'request_type': constants.RequestType.TRANSFER,
'request_id': generate_uuid(),
'name': name1,
'account': self.account,
'scope': self.scope,
'rule_id': generate_uuid(),
'retry_count': 1,
'requested_at': datetime.now().replace(year=2018),
'attributes': {
'activity': self.user_activity,
'bytes': 1,
'md5': '',
'adler32': ''
}
}, {
'dest_rse_id': self.dest_rse_id,
'source_rse_id': self.source_rse_id,
'request_type': constants.RequestType.TRANSFER,
'request_id': generate_uuid(),
'requested_at': datetime.now().replace(year=2020),
'name': name2,
'account': self.account,
'scope': self.scope,
'rule_id': generate_uuid(),
'retry_count': 1,
'attributes': {
'activity': self.user_activity,
'bytes': 1,
'md5': '',
'adler32': ''
}
}]
queue_requests(requests, session=self.db_session)
self.db_session.commit()
throttler.run(once=True, sleep_time=1)
request = get_request_by_did(self.scope, name1, self.dest_rse_id)
assert_equal(request['state'], constants.RequestState.QUEUED)
request2 = get_request_by_did(self.scope, name2, self.dest_rse_id)
assert_equal(request2['state'], constants.RequestState.QUEUED)
# active transfers + waiting requests are less than the threshold -> release all waiting requests
self.db_session.query(models.Request).delete()
self.db_session.commit()
name1 = generate_uuid()
add_replica(self.source_rse_id, self.scope, name1, 1, self.account, session=self.db_session)
set('throttler', '%s,%s' % (self.user_activity, self.dest_rse), 3, session=self.db_session)
request = models.Request(dest_rse_id=self.dest_rse_id, activity=self.user_activity, state=constants.RequestState.SUBMITTED)
request.save(session=self.db_session)
requests = [{
'dest_rse_id': self.dest_rse_id,
'source_rse_id': self.source_rse_id,
'request_type': constants.RequestType.TRANSFER,
'request_id': generate_uuid(),
'name': name1,
'account': self.account,
'scope': self.scope,
'rule_id': generate_uuid(),
'retry_count': 1,
'requested_at': datetime.now().replace(year=2018),
'attributes': {
'activity': self.user_activity,
'bytes': 1,
'md5': '',
'adler32': ''
}
}]
queue_requests(requests, session=self.db_session)
self.db_session.commit()
throttler.run(once=True, sleep_time=1)
request = get_request_by_did(self.scope, name1, self.dest_rse_id)
assert_equal(request['state'], constants.RequestState.QUEUED)
def test_throttler_fifo_release_nothing(self):
""" THROTTLER (CLIENTS): throttler release nothing (fifo). """
        # two waiting requests and one active request but threshold is 1
        # more than 80% of the transfer limit is already used -> release nothing
set('throttler', '%s,%s' % (self.user_activity, self.dest_rse), 1, session=self.db_session)
request = models.Request(dest_rse_id=self.dest_rse_id, bytes=2, activity=self.user_activity, state=constants.RequestState.SUBMITTED)
request.save(session=self.db_session)
name1 = generate_uuid()
name2 = generate_uuid()
add_replica(self.source_rse_id, self.scope, name1, 1, self.account, session=self.db_session)
add_replica(self.source_rse_id, self.scope, name2, 1, self.account, session=self.db_session)
requests = [{
'dest_rse_id': self.dest_rse_id,
'source_rse_id': self.source_rse_id,
'request_type': constants.RequestType.TRANSFER,
'request_id': generate_uuid(),
'name': name1,
'account': self.account,
'scope': self.scope,
'rule_id': generate_uuid(),
'retry_count': 1,
'requested_at': datetime.now().replace(year=2018),
'attributes': {
'activity': self.user_activity,
'bytes': 1,
'md5': '',
'adler32': ''
}
}, {
'dest_rse_id': self.dest_rse_id,
'source_rse_id': self.source_rse_id,
'request_type': constants.RequestType.TRANSFER,
'request_id': generate_uuid(),
'requested_at': datetime.now().replace(year=2020),
'name': name2,
'account': self.account,
'scope': self.scope,
'rule_id': generate_uuid(),
'retry_count': 1,
'attributes': {
'activity': self.user_activity,
'bytes': 1,
'md5': '',
'adler32': ''
}
}]
queue_requests(requests, session=self.db_session)
self.db_session.commit()
throttler.run(once=True, sleep_time=1)
request = get_request_by_did(self.scope, name1, self.dest_rse_id)
assert_equal(request['state'], constants.RequestState.WAITING)
request2 = get_request_by_did(self.scope, name2, self.dest_rse_id)
assert_equal(request2['state'], constants.RequestState.WAITING)
def test_throttler_fifo_release_subset(self):
""" THROTTLER (CLIENTS): throttler release subset of waiting requests (fifo). """
# two waiting requests and no active requests but threshold is 1 -> release only 1 request
set('throttler', '%s,%s' % (self.user_activity, self.dest_rse), 1, session=self.db_session)
name1 = generate_uuid()
name2 = generate_uuid()
add_replica(self.source_rse_id, self.scope, name1, 1, self.account, session=self.db_session)
add_replica(self.source_rse_id, self.scope, name2, 1, self.account, session=self.db_session)
requests = [{
'dest_rse_id': self.dest_rse_id,
'source_rse_id': self.source_rse_id,
'request_type': constants.RequestType.TRANSFER,
'request_id': generate_uuid(),
'name': name1,
'account': self.account,
'scope': self.scope,
'rule_id': generate_uuid(),
'retry_count': 1,
'requested_at': datetime.now().replace(year=2018),
'attributes': {
'activity': self.user_activity,
'bytes': 1,
'md5': '',
'adler32': ''
}
}, {
'dest_rse_id': self.dest_rse_id,
'source_rse_id': self.source_rse_id,
'request_type': constants.RequestType.TRANSFER,
'request_id': generate_uuid(),
'requested_at': datetime.now().replace(year=2020),
'name': name2,
'account': self.account,
'scope': self.scope,
'rule_id': generate_uuid(),
'retry_count': 1,
'attributes': {
'activity': self.user_activity,
'bytes': 1,
'md5': '',
'adler32': ''
}
}]
queue_requests(requests, session=self.db_session)
self.db_session.commit()
throttler.run(once=True, sleep_time=1)
request = get_request_by_did(self.scope, name1, self.dest_rse_id)
assert_equal(request['state'], constants.RequestState.QUEUED)
request2 = get_request_by_did(self.scope, name2, self.dest_rse_id)
assert_equal(request2['state'], constants.RequestState.WAITING)
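The three FIFO tests above all reduce to one rule: back off completely when more than 80% of the per-activity limit is already consumed by active transfers, otherwise release the oldest waiting requests into the remaining slots. A compact sketch of that rule, written to mirror the tests' expectations — it is not rucio's throttler code:

```python
def fifo_release(waiting, active_count, threshold):
    """Plain FIFO release with the 80% back-off rule (illustrative only)."""
    # Release nothing when >80% of the limit is already used.
    if active_count > 0.8 * threshold:
        return []
    slots = threshold - active_count
    # Oldest requests first.
    return sorted(waiting, key=lambda r: r['requested_at'])[:slots]
```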
| 44.647766 | 140 | 0.594112 | 2,985 | 25,985 | 4.912563 | 0.077052 | 0.045349 | 0.042349 | 0.077741 | 0.897845 | 0.887343 | 0.88332 | 0.869203 | 0.866544 | 0.858429 | 0 | 0.020699 | 0.284202 | 25,985 | 581 | 141 | 44.724613 | 0.767688 | 0.0792 | 0 | 0.902299 | 0 | 0 | 0.118162 | 0.00218 | 0 | 0 | 0 | 0 | 0.038314 | 1 | 0.022989 | false | 0 | 0.021073 | 0 | 0.05364 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
127879b428699a779249919c6cc2d3c5c31cf1a2 | 89 | py | Python | tests/test_greeting.py | tothattilamate/pytest_noahgift | 9ce506495c15a7030575626b03841dfd97e03dbc | [
"MIT"
] | null | null | null | tests/test_greeting.py | tothattilamate/pytest_noahgift | 9ce506495c15a7030575626b03841dfd97e03dbc | [
"MIT"
] | null | null | null | tests/test_greeting.py | tothattilamate/pytest_noahgift | 9ce506495c15a7030575626b03841dfd97e03dbc | [
"MIT"
] | null | null | null | from greeting import my_name
def test_my_name():
"My name is: bob" == my_name('bob') | 22.25 | 39 | 0.685393 | 16 | 89 | 3.5625 | 0.5625 | 0.421053 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.179775 | 89 | 4 | 39 | 22.25 | 0.780822 | 0 | 0 | 0 | 0 | 0 | 0.2 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
12de7ef5ee35c2da2676bf75751dc9ad9db4d063 | 4,781 | py | Python | explainer/ebp/ebp.py | jhammelman/visual-attribution | 0405b1b148f426c72f36f6887c7864f289e6c0e2 | [
"BSD-2-Clause"
] | 146 | 2018-05-23T19:57:31.000Z | 2022-02-20T15:20:32.000Z | explainer/ebp/ebp.py | jhammelman/visual-attribution | 0405b1b148f426c72f36f6887c7864f289e6c0e2 | [
"BSD-2-Clause"
] | 7 | 2020-03-31T03:45:29.000Z | 2022-03-11T23:43:54.000Z | explainer/ebp/ebp.py | jhammelman/visual-attribution | 0405b1b148f426c72f36f6887c7864f289e6c0e2 | [
"BSD-2-Clause"
] | 31 | 2018-09-05T22:09:56.000Z | 2021-12-02T15:47:04.000Z | import types
import torch
from explainer.ebp.functions import EBConv2d, EBLinear, EBAvgPool2d
def get_layer(model, key_list):
    """Walk ``model._modules`` along ``key_list`` and return the nested sub-module."""
    a = model
    for key in key_list:
        a = a._modules[key]
    return a
class ExcitationBackpropExplainer(object):
def __init__(self, model, output_layer_keys=None):
self.output_layer = get_layer(model, output_layer_keys)
self.model = model
self._override_backward()
self._register_hooks()
def _override_backward(self):
def new_linear(self, x):
return EBLinear.apply(x, self.weight, self.bias)
def new_conv2d(self, x):
return EBConv2d.apply(x, self.weight, self.bias, self.stride,
self.padding, self.dilation, self.groups)
def new_avgpool2d(self, x):
return EBAvgPool2d.apply(x, self.kernel_size, self.stride,
self.padding, self.ceil_mode, self.count_include_pad)
def replace(m):
name = m.__class__.__name__
if name == 'Linear':
m.forward = types.MethodType(new_linear, m)
elif name == 'Conv2d':
m.forward = types.MethodType(new_conv2d, m)
elif name == 'AvgPool2d':
m.forward = types.MethodType(new_avgpool2d, m)
self.model.apply(replace)
def _register_hooks(self):
self.intermediate_vars = []
def forward_hook(m, i, o):
self.intermediate_vars.append(o)
self.output_layer.register_forward_hook(forward_hook)
def explain(self, inp, ind=None):
self.intermediate_vars = []
output = self.model(inp)
output_var = self.intermediate_vars[0]
if ind is None:
ind = output.data.max(1)[1]
grad_out = output.data.clone()
grad_out.fill_(0.0)
grad_out.scatter_(1, ind.unsqueeze(0).t(), 1.0)
attmap_var = torch.autograd.grad(output, output_var, grad_out, retain_graph=True)
attmap = attmap_var[0].data.clone()
attmap = torch.clamp(attmap.sum(1).unsqueeze(1), min=0.0)
return attmap
class ContrastiveExcitationBackpropExplainer(object):
def __init__(self, model, intermediate_layer_keys=None, output_layer_keys=None, final_linear_keys=None):
self.intermediate_layer = get_layer(model, intermediate_layer_keys)
self.output_layer = get_layer(model, output_layer_keys)
self.final_linear = get_layer(model, final_linear_keys)
self.model = model
self._override_backward()
self._register_hooks()
def _override_backward(self):
def new_linear(self, x):
return EBLinear.apply(x, self.weight, self.bias)
def new_conv2d(self, x):
return EBConv2d.apply(x, self.weight, self.bias, self.stride,
self.padding, self.dilation, self.groups)
def new_avgpool2d(self, x):
return EBAvgPool2d.apply(x, self.kernel_size, self.stride,
self.padding, self.ceil_mode, self.count_include_pad)
def replace(m):
name = m.__class__.__name__
if name == 'Linear':
m.forward = types.MethodType(new_linear, m)
elif name == 'Conv2d':
m.forward = types.MethodType(new_conv2d, m)
elif name == 'AvgPool2d':
m.forward = types.MethodType(new_avgpool2d, m)
self.model.apply(replace)
def _register_hooks(self):
self.intermediate_vars = []
def forward_hook(m, i, o):
self.intermediate_vars.append(o)
self.intermediate_layer.register_forward_hook(forward_hook)
self.output_layer.register_forward_hook(forward_hook)
def explain(self, inp, ind=None):
self.intermediate_vars = []
output = self.model(inp)
output_var, intermediate_var = self.intermediate_vars
if ind is None:
ind = output.data.max(1)[1]
grad_out = output.data.clone()
grad_out.fill_(0.0)
grad_out.scatter_(1, ind.unsqueeze(0).t(), 1.0)
self.final_linear.weight.data *= -1.0
neg_map_var = torch.autograd.grad(output, intermediate_var, grad_out, retain_graph=True)
neg_map = neg_map_var[0].data.clone()
self.final_linear.weight.data *= -1.0
pos_map_var = torch.autograd.grad(output, intermediate_var, grad_out, retain_graph=True)
pos_map = pos_map_var[0].data.clone()
diff = pos_map - neg_map
attmap_var = torch.autograd.grad(intermediate_var, output_var, diff, retain_graph=True)
attmap = attmap_var[0].data.clone()
attmap = torch.clamp(attmap.sum(1).unsqueeze(1), min=0.0)
return attmap
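In `ContrastiveExcitationBackpropExplainer.explain`, the map computed with negated final-linear weights is subtracted from the map computed with the normal weights, and only the positive remainder survives the final clamp. The elementwise core of that combination, isolated as a plain-Python sketch (the real code operates on tensors and backpropagates `diff` one more stage before clamping):

```python
def contrastive_combine(pos_map, neg_map):
    # Keep only locations where target-class evidence exceeds the
    # negated-weights evidence; everything else is clamped to zero.
    return [max(p - n, 0.0) for p, n in zip(pos_map, neg_map)]
```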
| 37.062016 | 108 | 0.622882 | 605 | 4,781 | 4.669421 | 0.157025 | 0.056637 | 0.056637 | 0.04885 | 0.811681 | 0.764248 | 0.748319 | 0.729204 | 0.729204 | 0.729204 | 0 | 0.014659 | 0.272328 | 4,781 | 128 | 109 | 37.351563 | 0.797356 | 0 | 0 | 0.764706 | 0 | 0 | 0.008785 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.186275 | false | 0 | 0.029412 | 0.058824 | 0.323529 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
12f0badc1d2d37c517b908663a6ab514ab98b240 | 200 | py | Python | ooquery/__init__.py | gisce/ooquery | f2995dc3b6152006a74d8d231c6e4a74b1751f7b | [
"MIT"
] | 5 | 2015-10-07T10:05:47.000Z | 2019-02-26T13:55:01.000Z | ooquery/__init__.py | gisce/ooquery | f2995dc3b6152006a74d8d231c6e4a74b1751f7b | [
"MIT"
] | 32 | 2016-09-06T10:38:59.000Z | 2021-07-05T12:17:51.000Z | ooquery/__init__.py | gisce/ooquery | f2995dc3b6152006a74d8d231c6e4a74b1751f7b | [
"MIT"
] | null | null | null | # coding=utf-8
from __future__ import absolute_import
from ooquery.expression import Expression
from ooquery.parser import Parser
from ooquery.operators import *
from ooquery.ooquery import OOQuery
| 22.222222 | 41 | 0.84 | 27 | 200 | 6.037037 | 0.407407 | 0.269939 | 0.208589 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005682 | 0.12 | 200 | 8 | 42 | 25 | 0.920455 | 0.06 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
425b3219eb7d27bc182f82c56e8c34f7e6f4a633 | 2,287 | py | Python | migrations/versions/2021_12_26_22_50_add_new_kind_field_to_events_tables.py | horia141/jupiter | 2c721d1d44e1cd2607ad9936e54a20ea254741dc | [
"MIT"
] | 15 | 2019-05-05T14:34:58.000Z | 2022-02-25T09:57:28.000Z | migrations/versions/2021_12_26_22_50_add_new_kind_field_to_events_tables.py | horia141/jupiter | 2c721d1d44e1cd2607ad9936e54a20ea254741dc | [
"MIT"
] | 3 | 2020-02-22T16:09:39.000Z | 2021-12-18T21:33:06.000Z | migrations/versions/2021_12_26_22_50_add_new_kind_field_to_events_tables.py | horia141/jupiter | 2c721d1d44e1cd2607ad9936e54a20ea254741dc | [
"MIT"
] | null | null | null | """Add new kind field to events tables and backfill the value
Revision ID: 5fe8147f542c
Revises: 670ea976fe9f
Create Date: 2021-12-26 22:50:11.688817
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = '5fe8147f542c'
down_revision = '670ea976fe9f'
branch_labels = None
depends_on = None
def upgrade() -> None:
op.execute("""
alter table person_event
add column kind VARCHAR(16);""")
op.execute("""
update person_event
set kind = 'create'
where name = 'Created' and kind is Null;""")
op.execute("""
update person_event
set kind = 'archived'
where name = 'Archived' and kind is Null;""")
op.execute("""
update person_event
set kind = 'update'
where kind is Null;""")
op.execute("""
alter table prm_database_event
add column kind VARCHAR(16);""")
op.execute("""
update prm_database_event
set kind = 'create'
where name = 'Created' and kind is Null;""")
op.execute("""
update prm_database_event
set kind = 'archived'
where name = 'Archived' and kind is Null;""")
op.execute("""
update prm_database_event
set kind = 'update'
where kind is Null;""")
op.execute("""
alter table metric_entry_event
add column kind VARCHAR(16);""")
op.execute("""
update metric_entry_event
set kind = 'create'
where name = 'Created' and kind is Null;""")
op.execute("""
update metric_entry_event
set kind = 'archived'
where name = 'Archived' and kind is Null;""")
op.execute("""
update metric_entry_event
set kind = 'update'
where kind is Null;""")
op.execute("""
alter table metric_event
add column kind VARCHAR(16);""")
op.execute("""
update metric_event
set kind = 'create'
where name = 'Created' and kind is Null;""")
op.execute("""
update metric_event
set kind = 'archived'
where name = 'Archived' and kind is Null;""")
op.execute("""
update metric_event
set kind = 'update'
where kind is Null;""")
def downgrade() -> None:
pass
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from . import _utilities
from . import outputs
from ._inputs import *
__all__ = ['MdbKafkaClusterArgs', 'MdbKafkaCluster']
@pulumi.input_type
class MdbKafkaClusterArgs:
def __init__(__self__, *,
config: pulumi.Input['MdbKafkaClusterConfigArgs'],
network_id: pulumi.Input[str],
deletion_protection: Optional[pulumi.Input[bool]] = None,
description: Optional[pulumi.Input[str]] = None,
environment: Optional[pulumi.Input[str]] = None,
folder_id: Optional[pulumi.Input[str]] = None,
host_group_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
labels: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
name: Optional[pulumi.Input[str]] = None,
security_group_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
subnet_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
topics: Optional[pulumi.Input[Sequence[pulumi.Input['MdbKafkaClusterTopicArgs']]]] = None,
users: Optional[pulumi.Input[Sequence[pulumi.Input['MdbKafkaClusterUserArgs']]]] = None):
"""
The set of arguments for constructing a MdbKafkaCluster resource.
:param pulumi.Input['MdbKafkaClusterConfigArgs'] config: Configuration of the Kafka cluster. The structure is documented below.
:param pulumi.Input[str] network_id: ID of the network to which the Kafka cluster belongs.
:param pulumi.Input[bool] deletion_protection: Inhibits deletion of the cluster. Can be either `true` or `false`.
:param pulumi.Input[str] description: Description of the Kafka cluster.
:param pulumi.Input[str] environment: Deployment environment of the Kafka cluster. Can be either `PRESTABLE` or `PRODUCTION`.
The default is `PRODUCTION`.
:param pulumi.Input[str] folder_id: The ID of the folder that the resource belongs to. If it is not provided, the default provider folder is used.
:param pulumi.Input[Sequence[pulumi.Input[str]]] host_group_ids: A list of IDs of the host groups to place VMs of the cluster on.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] labels: A set of key/value label pairs to assign to the Kafka cluster.
:param pulumi.Input[str] name: The name of the Kafka cluster.
:param pulumi.Input[Sequence[pulumi.Input[str]]] security_group_ids: Security group IDs to which the Kafka cluster belongs.
:param pulumi.Input[Sequence[pulumi.Input[str]]] subnet_ids: IDs of the subnets to which the Kafka cluster belongs.
:param pulumi.Input[Sequence[pulumi.Input['MdbKafkaClusterTopicArgs']]] topics: To manage topics, please switch to using a separate resource type `MdbKafkaTopic`.
:param pulumi.Input[Sequence[pulumi.Input['MdbKafkaClusterUserArgs']]] users: A user of the Kafka cluster. The structure is documented below.
"""
pulumi.set(__self__, "config", config)
pulumi.set(__self__, "network_id", network_id)
if deletion_protection is not None:
pulumi.set(__self__, "deletion_protection", deletion_protection)
if description is not None:
pulumi.set(__self__, "description", description)
if environment is not None:
pulumi.set(__self__, "environment", environment)
if folder_id is not None:
pulumi.set(__self__, "folder_id", folder_id)
if host_group_ids is not None:
pulumi.set(__self__, "host_group_ids", host_group_ids)
if labels is not None:
pulumi.set(__self__, "labels", labels)
if name is not None:
pulumi.set(__self__, "name", name)
if security_group_ids is not None:
pulumi.set(__self__, "security_group_ids", security_group_ids)
if subnet_ids is not None:
pulumi.set(__self__, "subnet_ids", subnet_ids)
if topics is not None:
warnings.warn("""to manage topics, please switch to using a separate resource type yandex_mdb_kafka_topic""", DeprecationWarning)
pulumi.log.warn("""topics is deprecated: to manage topics, please switch to using a separate resource type yandex_mdb_kafka_topic""")
if topics is not None:
pulumi.set(__self__, "topics", topics)
if users is not None:
pulumi.set(__self__, "users", users)
@property
@pulumi.getter
def config(self) -> pulumi.Input['MdbKafkaClusterConfigArgs']:
"""
Configuration of the Kafka cluster. The structure is documented below.
"""
return pulumi.get(self, "config")
@config.setter
def config(self, value: pulumi.Input['MdbKafkaClusterConfigArgs']):
pulumi.set(self, "config", value)
@property
@pulumi.getter(name="networkId")
def network_id(self) -> pulumi.Input[str]:
"""
ID of the network to which the Kafka cluster belongs.
"""
return pulumi.get(self, "network_id")
@network_id.setter
def network_id(self, value: pulumi.Input[str]):
pulumi.set(self, "network_id", value)
@property
@pulumi.getter(name="deletionProtection")
def deletion_protection(self) -> Optional[pulumi.Input[bool]]:
"""
Inhibits deletion of the cluster. Can be either `true` or `false`.
"""
return pulumi.get(self, "deletion_protection")
@deletion_protection.setter
def deletion_protection(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "deletion_protection", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
Description of the Kafka cluster.
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter
def environment(self) -> Optional[pulumi.Input[str]]:
"""
Deployment environment of the Kafka cluster. Can be either `PRESTABLE` or `PRODUCTION`.
The default is `PRODUCTION`.
"""
return pulumi.get(self, "environment")
@environment.setter
def environment(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "environment", value)
@property
@pulumi.getter(name="folderId")
def folder_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the folder that the resource belongs to. If it is not provided, the default provider folder is used.
"""
return pulumi.get(self, "folder_id")
@folder_id.setter
def folder_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "folder_id", value)
@property
@pulumi.getter(name="hostGroupIds")
def host_group_ids(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
A list of IDs of the host groups to place VMs of the cluster on.
"""
return pulumi.get(self, "host_group_ids")
@host_group_ids.setter
def host_group_ids(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "host_group_ids", value)
@property
@pulumi.getter
def labels(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
A set of key/value label pairs to assign to the Kafka cluster.
"""
return pulumi.get(self, "labels")
@labels.setter
def labels(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "labels", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the Kafka cluster.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="securityGroupIds")
def security_group_ids(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Security group IDs to which the Kafka cluster belongs.
"""
return pulumi.get(self, "security_group_ids")
@security_group_ids.setter
def security_group_ids(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "security_group_ids", value)
@property
@pulumi.getter(name="subnetIds")
def subnet_ids(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
IDs of the subnets to which the Kafka cluster belongs.
"""
return pulumi.get(self, "subnet_ids")
@subnet_ids.setter
def subnet_ids(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "subnet_ids", value)
@property
@pulumi.getter
def topics(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['MdbKafkaClusterTopicArgs']]]]:
"""
To manage topics, please switch to using a separate resource type `MdbKafkaTopic`.
"""
return pulumi.get(self, "topics")
@topics.setter
def topics(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['MdbKafkaClusterTopicArgs']]]]):
pulumi.set(self, "topics", value)
@property
@pulumi.getter
def users(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['MdbKafkaClusterUserArgs']]]]:
"""
A user of the Kafka cluster. The structure is documented below.
"""
return pulumi.get(self, "users")
@users.setter
def users(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['MdbKafkaClusterUserArgs']]]]):
pulumi.set(self, "users", value)
@pulumi.input_type
class _MdbKafkaClusterState:
def __init__(__self__, *,
config: Optional[pulumi.Input['MdbKafkaClusterConfigArgs']] = None,
created_at: Optional[pulumi.Input[str]] = None,
deletion_protection: Optional[pulumi.Input[bool]] = None,
description: Optional[pulumi.Input[str]] = None,
environment: Optional[pulumi.Input[str]] = None,
folder_id: Optional[pulumi.Input[str]] = None,
health: Optional[pulumi.Input[str]] = None,
host_group_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
hosts: Optional[pulumi.Input[Sequence[pulumi.Input['MdbKafkaClusterHostArgs']]]] = None,
labels: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
name: Optional[pulumi.Input[str]] = None,
network_id: Optional[pulumi.Input[str]] = None,
security_group_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
status: Optional[pulumi.Input[str]] = None,
subnet_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
topics: Optional[pulumi.Input[Sequence[pulumi.Input['MdbKafkaClusterTopicArgs']]]] = None,
users: Optional[pulumi.Input[Sequence[pulumi.Input['MdbKafkaClusterUserArgs']]]] = None):
"""
Input properties used for looking up and filtering MdbKafkaCluster resources.
:param pulumi.Input['MdbKafkaClusterConfigArgs'] config: Configuration of the Kafka cluster. The structure is documented below.
:param pulumi.Input[str] created_at: Timestamp of cluster creation.
:param pulumi.Input[bool] deletion_protection: Inhibits deletion of the cluster. Can be either `true` or `false`.
:param pulumi.Input[str] description: Description of the Kafka cluster.
:param pulumi.Input[str] environment: Deployment environment of the Kafka cluster. Can be either `PRESTABLE` or `PRODUCTION`.
The default is `PRODUCTION`.
:param pulumi.Input[str] folder_id: The ID of the folder that the resource belongs to. If it is not provided, the default provider folder is used.
:param pulumi.Input[str] health: Aggregated health of the cluster.
:param pulumi.Input[Sequence[pulumi.Input[str]]] host_group_ids: A list of IDs of the host groups to place VMs of the cluster on.
:param pulumi.Input[Sequence[pulumi.Input['MdbKafkaClusterHostArgs']]] hosts: A host of the Kafka cluster. The structure is documented below.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] labels: A set of key/value label pairs to assign to the Kafka cluster.
:param pulumi.Input[str] name: The name of the Kafka cluster.
:param pulumi.Input[str] network_id: ID of the network to which the Kafka cluster belongs.
:param pulumi.Input[Sequence[pulumi.Input[str]]] security_group_ids: Security group IDs to which the Kafka cluster belongs.
:param pulumi.Input[str] status: Status of the cluster. Can be one of `CREATING`, `STARTING`, `RUNNING`, `UPDATING`, `STOPPING`, `STOPPED`, `ERROR` or `STATUS_UNKNOWN`.
For more information, see the `status` field of the JSON representation in [the official documentation](https://cloud.yandex.com/docs/managed-kafka/api-ref/Cluster/).
:param pulumi.Input[Sequence[pulumi.Input[str]]] subnet_ids: IDs of the subnets to which the Kafka cluster belongs.
:param pulumi.Input[Sequence[pulumi.Input['MdbKafkaClusterTopicArgs']]] topics: To manage topics, please switch to using a separate resource type `MdbKafkaTopic`.
:param pulumi.Input[Sequence[pulumi.Input['MdbKafkaClusterUserArgs']]] users: A user of the Kafka cluster. The structure is documented below.
"""
if config is not None:
pulumi.set(__self__, "config", config)
if created_at is not None:
pulumi.set(__self__, "created_at", created_at)
if deletion_protection is not None:
pulumi.set(__self__, "deletion_protection", deletion_protection)
if description is not None:
pulumi.set(__self__, "description", description)
if environment is not None:
pulumi.set(__self__, "environment", environment)
if folder_id is not None:
pulumi.set(__self__, "folder_id", folder_id)
if health is not None:
pulumi.set(__self__, "health", health)
if host_group_ids is not None:
pulumi.set(__self__, "host_group_ids", host_group_ids)
if hosts is not None:
pulumi.set(__self__, "hosts", hosts)
if labels is not None:
pulumi.set(__self__, "labels", labels)
if name is not None:
pulumi.set(__self__, "name", name)
if network_id is not None:
pulumi.set(__self__, "network_id", network_id)
if security_group_ids is not None:
pulumi.set(__self__, "security_group_ids", security_group_ids)
if status is not None:
pulumi.set(__self__, "status", status)
if subnet_ids is not None:
pulumi.set(__self__, "subnet_ids", subnet_ids)
if topics is not None:
warnings.warn("""to manage topics, please switch to using a separate resource type yandex_mdb_kafka_topic""", DeprecationWarning)
pulumi.log.warn("""topics is deprecated: to manage topics, please switch to using a separate resource type yandex_mdb_kafka_topic""")
if topics is not None:
pulumi.set(__self__, "topics", topics)
if users is not None:
pulumi.set(__self__, "users", users)
@property
@pulumi.getter
def config(self) -> Optional[pulumi.Input['MdbKafkaClusterConfigArgs']]:
"""
Configuration of the Kafka cluster. The structure is documented below.
"""
return pulumi.get(self, "config")
@config.setter
def config(self, value: Optional[pulumi.Input['MdbKafkaClusterConfigArgs']]):
pulumi.set(self, "config", value)
@property
@pulumi.getter(name="createdAt")
def created_at(self) -> Optional[pulumi.Input[str]]:
"""
Timestamp of cluster creation.
"""
return pulumi.get(self, "created_at")
@created_at.setter
def created_at(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "created_at", value)
@property
@pulumi.getter(name="deletionProtection")
def deletion_protection(self) -> Optional[pulumi.Input[bool]]:
"""
Inhibits deletion of the cluster. Can be either `true` or `false`.
"""
return pulumi.get(self, "deletion_protection")
@deletion_protection.setter
def deletion_protection(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "deletion_protection", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
Description of the Kafka cluster.
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter
def environment(self) -> Optional[pulumi.Input[str]]:
"""
Deployment environment of the Kafka cluster. Can be either `PRESTABLE` or `PRODUCTION`.
The default is `PRODUCTION`.
"""
return pulumi.get(self, "environment")
@environment.setter
def environment(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "environment", value)
@property
@pulumi.getter(name="folderId")
def folder_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the folder that the resource belongs to. If it is not provided, the default provider folder is used.
"""
return pulumi.get(self, "folder_id")
@folder_id.setter
def folder_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "folder_id", value)
@property
@pulumi.getter
def health(self) -> Optional[pulumi.Input[str]]:
"""
Aggregated health of the cluster.
"""
return pulumi.get(self, "health")
@health.setter
def health(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "health", value)
@property
@pulumi.getter(name="hostGroupIds")
def host_group_ids(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
A list of IDs of the host groups to place VMs of the cluster on.
"""
return pulumi.get(self, "host_group_ids")
@host_group_ids.setter
def host_group_ids(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "host_group_ids", value)
@property
@pulumi.getter
def hosts(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['MdbKafkaClusterHostArgs']]]]:
"""
A host of the Kafka cluster. The structure is documented below.
"""
return pulumi.get(self, "hosts")
@hosts.setter
def hosts(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['MdbKafkaClusterHostArgs']]]]):
pulumi.set(self, "hosts", value)
@property
@pulumi.getter
def labels(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
"""
A set of key/value label pairs to assign to the Kafka cluster.
"""
return pulumi.get(self, "labels")
@labels.setter
def labels(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "labels", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the Kafka cluster.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="networkId")
def network_id(self) -> Optional[pulumi.Input[str]]:
"""
ID of the network to which the Kafka cluster belongs.
"""
return pulumi.get(self, "network_id")
@network_id.setter
def network_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "network_id", value)
@property
@pulumi.getter(name="securityGroupIds")
def security_group_ids(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
Security group IDs to which the Kafka cluster belongs.
"""
return pulumi.get(self, "security_group_ids")
@security_group_ids.setter
def security_group_ids(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "security_group_ids", value)
@property
@pulumi.getter
def status(self) -> Optional[pulumi.Input[str]]:
"""
Status of the cluster. Can be one of `CREATING`, `STARTING`, `RUNNING`, `UPDATING`, `STOPPING`, `STOPPED`, `ERROR` or `STATUS_UNKNOWN`.
For more information see `status` field of JSON representation in [the official documentation](https://cloud.yandex.com/docs/managed-kafka/api-ref/Cluster/).
"""
return pulumi.get(self, "status")
@status.setter
def status(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "status", value)
@property
@pulumi.getter(name="subnetIds")
def subnet_ids(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
IDs of the subnets to which the Kafka cluster belongs.
"""
return pulumi.get(self, "subnet_ids")
@subnet_ids.setter
def subnet_ids(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "subnet_ids", value)
@property
@pulumi.getter
def topics(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['MdbKafkaClusterTopicArgs']]]]:
"""
To manage topics, please switch to using a separate resource type `MdbKafkaTopic`.
"""
return pulumi.get(self, "topics")
@topics.setter
def topics(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['MdbKafkaClusterTopicArgs']]]]):
pulumi.set(self, "topics", value)
@property
@pulumi.getter
def users(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['MdbKafkaClusterUserArgs']]]]:
"""
A user of the Kafka cluster. The structure is documented below.
"""
return pulumi.get(self, "users")
@users.setter
def users(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['MdbKafkaClusterUserArgs']]]]):
pulumi.set(self, "users", value)
class MdbKafkaCluster(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
config: Optional[pulumi.Input[pulumi.InputType['MdbKafkaClusterConfigArgs']]] = None,
deletion_protection: Optional[pulumi.Input[bool]] = None,
description: Optional[pulumi.Input[str]] = None,
environment: Optional[pulumi.Input[str]] = None,
folder_id: Optional[pulumi.Input[str]] = None,
host_group_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
labels: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
name: Optional[pulumi.Input[str]] = None,
network_id: Optional[pulumi.Input[str]] = None,
security_group_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
subnet_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
topics: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbKafkaClusterTopicArgs']]]]] = None,
users: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbKafkaClusterUserArgs']]]]] = None,
__props__=None):
"""
Manages a Kafka cluster within Yandex.Cloud. For more information, see
[the official documentation](https://cloud.yandex.com/docs/managed-kafka/concepts).
## Example Usage
Example of creating a single-node Kafka cluster.
```python
import pulumi
import pulumi_yandex as yandex
foo_vpc_network = yandex.VpcNetwork("fooVpcNetwork")
foo_vpc_subnet = yandex.VpcSubnet("fooVpcSubnet",
network_id=foo_vpc_network.id,
v4_cidr_blocks=["10.5.0.0/24"],
zone="ru-central1-a")
foo_mdb_kafka_cluster = yandex.MdbKafkaCluster("fooMdbKafkaCluster",
config=yandex.MdbKafkaClusterConfigArgs(
assign_public_ip=False,
brokers_count=1,
kafka=yandex.MdbKafkaClusterConfigKafkaArgs(
kafka_config=yandex.MdbKafkaClusterConfigKafkaKafkaConfigArgs(
compression_type="COMPRESSION_TYPE_ZSTD",
default_replication_factor="1",
log_flush_interval_messages="1024",
log_flush_interval_ms="1000",
log_flush_scheduler_interval_ms="1000",
log_preallocate=True,
log_retention_bytes="1073741824",
log_retention_hours="168",
log_retention_minutes="10080",
log_retention_ms="86400000",
log_segment_bytes="134217728",
num_partitions="10",
),
resources=yandex.MdbKafkaClusterConfigKafkaResourcesArgs(
disk_size=32,
disk_type_id="network-ssd",
resource_preset_id="s2.micro",
),
),
schema_registry=False,
unmanaged_topics=False,
version="2.6",
zones=["ru-central1-a"],
),
environment="PRESTABLE",
network_id=foo_vpc_network.id,
subnet_ids=[foo_vpc_subnet.id],
users=[
yandex.MdbKafkaClusterUserArgs(
name="producer-application",
password="password",
permissions=[yandex.MdbKafkaClusterUserPermissionArgs(
role="ACCESS_ROLE_PRODUCER",
topic_name="input",
)],
),
yandex.MdbKafkaClusterUserArgs(
name="worker",
password="password",
permissions=[
yandex.MdbKafkaClusterUserPermissionArgs(
role="ACCESS_ROLE_CONSUMER",
topic_name="input",
),
yandex.MdbKafkaClusterUserPermissionArgs(
role="ACCESS_ROLE_PRODUCER",
topic_name="output",
),
],
),
])
```
Example of creating an HA Kafka cluster with two brokers per AZ (6 brokers + 3 ZooKeeper hosts).
```python
import pulumi
import pulumi_yandex as yandex
foo_vpc_network = yandex.VpcNetwork("fooVpcNetwork")
foo_vpc_subnet = yandex.VpcSubnet("fooVpcSubnet",
network_id=foo_vpc_network.id,
v4_cidr_blocks=["10.1.0.0/24"],
zone="ru-central1-a")
bar = yandex.VpcSubnet("bar",
network_id=foo_vpc_network.id,
v4_cidr_blocks=["10.2.0.0/24"],
zone="ru-central1-b")
baz = yandex.VpcSubnet("baz",
network_id=foo_vpc_network.id,
v4_cidr_blocks=["10.3.0.0/24"],
zone="ru-central1-c")
foo_mdb_kafka_cluster = yandex.MdbKafkaCluster("fooMdbKafkaCluster",
config=yandex.MdbKafkaClusterConfigArgs(
assign_public_ip=True,
brokers_count=2,
kafka=yandex.MdbKafkaClusterConfigKafkaArgs(
kafka_config=yandex.MdbKafkaClusterConfigKafkaKafkaConfigArgs(
compression_type="COMPRESSION_TYPE_ZSTD",
default_replication_factor="6",
log_flush_interval_messages="1024",
log_flush_interval_ms="1000",
log_flush_scheduler_interval_ms="1000",
log_preallocate=True,
log_retention_bytes="1073741824",
log_retention_hours="168",
log_retention_minutes="10080",
log_retention_ms="86400000",
log_segment_bytes="134217728",
num_partitions="10",
),
resources=yandex.MdbKafkaClusterConfigKafkaResourcesArgs(
disk_size=128,
disk_type_id="network-ssd",
resource_preset_id="s2.medium",
),
),
schema_registry=False,
unmanaged_topics=False,
version="2.6",
zones=[
"ru-central1-a",
"ru-central1-b",
"ru-central1-c",
],
zookeeper=yandex.MdbKafkaClusterConfigZookeeperArgs(
resources=yandex.MdbKafkaClusterConfigZookeeperResourcesArgs(
disk_size=20,
disk_type_id="network-ssd",
resource_preset_id="s2.micro",
),
),
),
environment="PRESTABLE",
network_id=foo_vpc_network.id,
subnet_ids=[
foo_vpc_subnet.id,
bar.id,
baz.id,
],
users=[
yandex.MdbKafkaClusterUserArgs(
name="producer-application",
password="password",
permissions=[yandex.MdbKafkaClusterUserPermissionArgs(
role="ACCESS_ROLE_PRODUCER",
topic_name="input",
)],
),
yandex.MdbKafkaClusterUserArgs(
name="worker",
password="password",
permissions=[
yandex.MdbKafkaClusterUserPermissionArgs(
role="ACCESS_ROLE_CONSUMER",
topic_name="input",
),
yandex.MdbKafkaClusterUserPermissionArgs(
role="ACCESS_ROLE_PRODUCER",
topic_name="output",
),
],
),
])
```
## Import
A cluster can be imported using the `id` of the resource, e.g.
```sh
$ pulumi import yandex:index/mdbKafkaCluster:MdbKafkaCluster foo cluster_id
```
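Because the inline `topics` argument is deprecated, topics are meant to be managed with the separate `MdbKafkaTopic` resource. A hedged sketch of that pattern, reusing the cluster from the examples above; the argument names (`cluster_id`, `partitions`, `replication_factor`) are assumptions to verify against the `MdbKafkaTopic` resource documentation:

```python
import pulumi_yandex as yandex

# Hypothetical standalone topic replacing an inline `topics` entry
# on the cluster resource.
input_topic = yandex.MdbKafkaTopic("input",
    cluster_id=foo_mdb_kafka_cluster.id,
    name="input",
    partitions=4,
    replication_factor=1)
```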
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[pulumi.InputType['MdbKafkaClusterConfigArgs']] config: Configuration of the Kafka cluster. The structure is documented below.
:param pulumi.Input[bool] deletion_protection: Inhibits deletion of the cluster. Can be either `true` or `false`.
:param pulumi.Input[str] description: Description of the Kafka cluster.
:param pulumi.Input[str] environment: Deployment environment of the Kafka cluster. Can be either `PRESTABLE` or `PRODUCTION`.
The default is `PRODUCTION`.
:param pulumi.Input[str] folder_id: The ID of the folder that the resource belongs to. If it is not provided, the default provider folder is used.
:param pulumi.Input[Sequence[pulumi.Input[str]]] host_group_ids: A list of IDs of the host groups to place VMs of the cluster on.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] labels: A set of key/value label pairs to assign to the Kafka cluster.
:param pulumi.Input[str] name: The name of the Kafka cluster.
:param pulumi.Input[str] network_id: ID of the network to which the Kafka cluster belongs.
:param pulumi.Input[Sequence[pulumi.Input[str]]] security_group_ids: Security group IDs to which the Kafka cluster belongs.
:param pulumi.Input[Sequence[pulumi.Input[str]]] subnet_ids: IDs of the subnets to which the Kafka cluster belongs.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbKafkaClusterTopicArgs']]]] topics: To manage topics, please switch to using a separate resource type `MdbKafkaTopic`.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbKafkaClusterUserArgs']]]] users: A user of the Kafka cluster. The structure is documented below.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: MdbKafkaClusterArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Manages a Kafka cluster within Yandex.Cloud. For more information, see
[the official documentation](https://cloud.yandex.com/docs/managed-kafka/concepts).
## Example Usage
Example of creating a single-node Kafka cluster.
```python
import pulumi
import pulumi_yandex as yandex
foo_vpc_network = yandex.VpcNetwork("fooVpcNetwork")
foo_vpc_subnet = yandex.VpcSubnet("fooVpcSubnet",
network_id=foo_vpc_network.id,
v4_cidr_blocks=["10.5.0.0/24"],
zone="ru-central1-a")
foo_mdb_kafka_cluster = yandex.MdbKafkaCluster("fooMdbKafkaCluster",
config=yandex.MdbKafkaClusterConfigArgs(
assign_public_ip=False,
brokers_count=1,
kafka=yandex.MdbKafkaClusterConfigKafkaArgs(
kafka_config=yandex.MdbKafkaClusterConfigKafkaKafkaConfigArgs(
compression_type="COMPRESSION_TYPE_ZSTD",
default_replication_factor="1",
log_flush_interval_messages="1024",
log_flush_interval_ms="1000",
log_flush_scheduler_interval_ms="1000",
log_preallocate=True,
log_retention_bytes="1073741824",
log_retention_hours="168",
log_retention_minutes="10080",
log_retention_ms="86400000",
log_segment_bytes="134217728",
num_partitions="10",
),
resources=yandex.MdbKafkaClusterConfigKafkaResourcesArgs(
disk_size=32,
disk_type_id="network-ssd",
resource_preset_id="s2.micro",
),
),
schema_registry=False,
unmanaged_topics=False,
version="2.6",
zones=["ru-central1-a"],
),
environment="PRESTABLE",
network_id=foo_vpc_network.id,
subnet_ids=[foo_vpc_subnet.id],
users=[
yandex.MdbKafkaClusterUserArgs(
name="producer-application",
password="password",
permissions=[yandex.MdbKafkaClusterUserPermissionArgs(
role="ACCESS_ROLE_PRODUCER",
topic_name="input",
)],
),
yandex.MdbKafkaClusterUserArgs(
name="worker",
password="password",
permissions=[
yandex.MdbKafkaClusterUserPermissionArgs(
role="ACCESS_ROLE_CONSUMER",
topic_name="input",
),
yandex.MdbKafkaClusterUserPermissionArgs(
role="ACCESS_ROLE_PRODUCER",
topic_name="output",
),
],
),
])
```
Example of creating an HA Kafka cluster with two brokers per AZ (6 brokers + 3 ZooKeeper hosts).
```python
import pulumi
import pulumi_yandex as yandex
foo_vpc_network = yandex.VpcNetwork("fooVpcNetwork")
foo_vpc_subnet = yandex.VpcSubnet("fooVpcSubnet",
network_id=foo_vpc_network.id,
v4_cidr_blocks=["10.1.0.0/24"],
zone="ru-central1-a")
bar = yandex.VpcSubnet("bar",
network_id=foo_vpc_network.id,
v4_cidr_blocks=["10.2.0.0/24"],
zone="ru-central1-b")
baz = yandex.VpcSubnet("baz",
network_id=foo_vpc_network.id,
v4_cidr_blocks=["10.3.0.0/24"],
zone="ru-central1-c")
foo_mdb_kafka_cluster = yandex.MdbKafkaCluster("fooMdbKafkaCluster",
config=yandex.MdbKafkaClusterConfigArgs(
assign_public_ip=True,
brokers_count=2,
kafka=yandex.MdbKafkaClusterConfigKafkaArgs(
kafka_config=yandex.MdbKafkaClusterConfigKafkaKafkaConfigArgs(
compression_type="COMPRESSION_TYPE_ZSTD",
default_replication_factor="6",
log_flush_interval_messages="1024",
log_flush_interval_ms="1000",
log_flush_scheduler_interval_ms="1000",
log_preallocate=True,
log_retention_bytes="1073741824",
log_retention_hours="168",
log_retention_minutes="10080",
log_retention_ms="86400000",
log_segment_bytes="134217728",
num_partitions="10",
),
resources=yandex.MdbKafkaClusterConfigKafkaResourcesArgs(
disk_size=128,
disk_type_id="network-ssd",
resource_preset_id="s2.medium",
),
),
schema_registry=False,
unmanaged_topics=False,
version="2.6",
zones=[
"ru-central1-a",
"ru-central1-b",
"ru-central1-c",
],
zookeeper=yandex.MdbKafkaClusterConfigZookeeperArgs(
resources=yandex.MdbKafkaClusterConfigZookeeperResourcesArgs(
disk_size=20,
disk_type_id="network-ssd",
resource_preset_id="s2.micro",
),
),
),
environment="PRESTABLE",
network_id=foo_vpc_network.id,
subnet_ids=[
foo_vpc_subnet.id,
bar.id,
baz.id,
],
users=[
yandex.MdbKafkaClusterUserArgs(
name="producer-application",
password="password",
permissions=[yandex.MdbKafkaClusterUserPermissionArgs(
role="ACCESS_ROLE_PRODUCER",
topic_name="input",
)],
),
yandex.MdbKafkaClusterUserArgs(
name="worker",
password="password",
permissions=[
yandex.MdbKafkaClusterUserPermissionArgs(
role="ACCESS_ROLE_CONSUMER",
topic_name="input",
),
yandex.MdbKafkaClusterUserPermissionArgs(
role="ACCESS_ROLE_PRODUCER",
topic_name="output",
),
],
),
])
```
## Import
A cluster can be imported using the `id` of the resource, e.g.
```sh
$ pulumi import yandex:index/mdbKafkaCluster:MdbKafkaCluster foo cluster_id
```
:param str resource_name: The name of the resource.
:param MdbKafkaClusterArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(MdbKafkaClusterArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
config: Optional[pulumi.Input[pulumi.InputType['MdbKafkaClusterConfigArgs']]] = None,
deletion_protection: Optional[pulumi.Input[bool]] = None,
description: Optional[pulumi.Input[str]] = None,
environment: Optional[pulumi.Input[str]] = None,
folder_id: Optional[pulumi.Input[str]] = None,
host_group_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
labels: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
name: Optional[pulumi.Input[str]] = None,
network_id: Optional[pulumi.Input[str]] = None,
security_group_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
subnet_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
topics: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbKafkaClusterTopicArgs']]]]] = None,
users: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbKafkaClusterUserArgs']]]]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = MdbKafkaClusterArgs.__new__(MdbKafkaClusterArgs)
if config is None and not opts.urn:
raise TypeError("Missing required property 'config'")
__props__.__dict__["config"] = config
__props__.__dict__["deletion_protection"] = deletion_protection
__props__.__dict__["description"] = description
__props__.__dict__["environment"] = environment
__props__.__dict__["folder_id"] = folder_id
__props__.__dict__["host_group_ids"] = host_group_ids
__props__.__dict__["labels"] = labels
__props__.__dict__["name"] = name
if network_id is None and not opts.urn:
raise TypeError("Missing required property 'network_id'")
__props__.__dict__["network_id"] = network_id
__props__.__dict__["security_group_ids"] = security_group_ids
__props__.__dict__["subnet_ids"] = subnet_ids
if topics is not None and not opts.urn:
warnings.warn("""to manage topics, please switch to using a separate resource type yandex_mdb_kafka_topic""", DeprecationWarning)
pulumi.log.warn("""topics is deprecated: to manage topics, please switch to using a separate resource type yandex_mdb_kafka_topic""")
__props__.__dict__["topics"] = topics
__props__.__dict__["users"] = users
__props__.__dict__["created_at"] = None
__props__.__dict__["health"] = None
__props__.__dict__["hosts"] = None
__props__.__dict__["status"] = None
super(MdbKafkaCluster, __self__).__init__(
'yandex:index/mdbKafkaCluster:MdbKafkaCluster',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
config: Optional[pulumi.Input[pulumi.InputType['MdbKafkaClusterConfigArgs']]] = None,
created_at: Optional[pulumi.Input[str]] = None,
deletion_protection: Optional[pulumi.Input[bool]] = None,
description: Optional[pulumi.Input[str]] = None,
environment: Optional[pulumi.Input[str]] = None,
folder_id: Optional[pulumi.Input[str]] = None,
health: Optional[pulumi.Input[str]] = None,
host_group_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
hosts: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbKafkaClusterHostArgs']]]]] = None,
labels: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
name: Optional[pulumi.Input[str]] = None,
network_id: Optional[pulumi.Input[str]] = None,
security_group_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
status: Optional[pulumi.Input[str]] = None,
subnet_ids: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
topics: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbKafkaClusterTopicArgs']]]]] = None,
users: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbKafkaClusterUserArgs']]]]] = None) -> 'MdbKafkaCluster':
"""
Get an existing MdbKafkaCluster resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[pulumi.InputType['MdbKafkaClusterConfigArgs']] config: Configuration of the Kafka cluster. The structure is documented below.
:param pulumi.Input[str] created_at: Timestamp of cluster creation.
:param pulumi.Input[bool] deletion_protection: Inhibits deletion of the cluster. Can be either `true` or `false`.
:param pulumi.Input[str] description: Description of the Kafka cluster.
:param pulumi.Input[str] environment: Deployment environment of the Kafka cluster. Can be either `PRESTABLE` or `PRODUCTION`.
The default is `PRODUCTION`.
:param pulumi.Input[str] folder_id: The ID of the folder that the resource belongs to. If it is not provided, the default provider folder is used.
        :param pulumi.Input[str] health: Aggregated health of the cluster.
:param pulumi.Input[Sequence[pulumi.Input[str]]] host_group_ids: A list of IDs of the host groups to place VMs of the cluster on.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbKafkaClusterHostArgs']]]] hosts: A host of the Kafka cluster. The structure is documented below.
:param pulumi.Input[Mapping[str, pulumi.Input[str]]] labels: A set of key/value label pairs to assign to the Kafka cluster.
        :param pulumi.Input[str] name: The name of the Kafka cluster.
        :param pulumi.Input[str] network_id: ID of the network to which the Kafka cluster belongs.
        :param pulumi.Input[Sequence[pulumi.Input[str]]] security_group_ids: IDs of the security groups to which the Kafka cluster belongs.
:param pulumi.Input[str] status: Status of the cluster. Can be either `CREATING`, `STARTING`, `RUNNING`, `UPDATING`, `STOPPING`, `STOPPED`, `ERROR` or `STATUS_UNKNOWN`.
For more information see `status` field of JSON representation in [the official documentation](https://cloud.yandex.com/docs/managed-kafka/api-ref/Cluster/).
        :param pulumi.Input[Sequence[pulumi.Input[str]]] subnet_ids: IDs of the subnets to which the Kafka cluster belongs.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbKafkaClusterTopicArgs']]]] topics: To manage topics, please switch to using a separate resource type `MdbKafkaTopic`.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['MdbKafkaClusterUserArgs']]]] users: A user of the Kafka cluster. The structure is documented below.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _MdbKafkaClusterState.__new__(_MdbKafkaClusterState)
__props__.__dict__["config"] = config
__props__.__dict__["created_at"] = created_at
__props__.__dict__["deletion_protection"] = deletion_protection
__props__.__dict__["description"] = description
__props__.__dict__["environment"] = environment
__props__.__dict__["folder_id"] = folder_id
__props__.__dict__["health"] = health
__props__.__dict__["host_group_ids"] = host_group_ids
__props__.__dict__["hosts"] = hosts
__props__.__dict__["labels"] = labels
__props__.__dict__["name"] = name
__props__.__dict__["network_id"] = network_id
__props__.__dict__["security_group_ids"] = security_group_ids
__props__.__dict__["status"] = status
__props__.__dict__["subnet_ids"] = subnet_ids
__props__.__dict__["topics"] = topics
__props__.__dict__["users"] = users
return MdbKafkaCluster(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter
def config(self) -> pulumi.Output['outputs.MdbKafkaClusterConfig']:
"""
Configuration of the Kafka cluster. The structure is documented below.
"""
return pulumi.get(self, "config")
@property
@pulumi.getter(name="createdAt")
def created_at(self) -> pulumi.Output[str]:
"""
Timestamp of cluster creation.
"""
return pulumi.get(self, "created_at")
@property
@pulumi.getter(name="deletionProtection")
def deletion_protection(self) -> pulumi.Output[bool]:
"""
Inhibits deletion of the cluster. Can be either `true` or `false`.
"""
return pulumi.get(self, "deletion_protection")
@property
@pulumi.getter
def description(self) -> pulumi.Output[Optional[str]]:
"""
Description of the Kafka cluster.
"""
return pulumi.get(self, "description")
@property
@pulumi.getter
def environment(self) -> pulumi.Output[Optional[str]]:
"""
Deployment environment of the Kafka cluster. Can be either `PRESTABLE` or `PRODUCTION`.
The default is `PRODUCTION`.
"""
return pulumi.get(self, "environment")
@property
@pulumi.getter(name="folderId")
def folder_id(self) -> pulumi.Output[str]:
"""
The ID of the folder that the resource belongs to. If it is not provided, the default provider folder is used.
"""
return pulumi.get(self, "folder_id")
@property
@pulumi.getter
def health(self) -> pulumi.Output[str]:
"""
        Aggregated health of the cluster.
"""
return pulumi.get(self, "health")
@property
@pulumi.getter(name="hostGroupIds")
def host_group_ids(self) -> pulumi.Output[Sequence[str]]:
"""
A list of IDs of the host groups to place VMs of the cluster on.
"""
return pulumi.get(self, "host_group_ids")
@property
@pulumi.getter
def hosts(self) -> pulumi.Output[Sequence['outputs.MdbKafkaClusterHost']]:
"""
A host of the Kafka cluster. The structure is documented below.
"""
return pulumi.get(self, "hosts")
@property
@pulumi.getter
def labels(self) -> pulumi.Output[Mapping[str, str]]:
"""
A set of key/value label pairs to assign to the Kafka cluster.
"""
return pulumi.get(self, "labels")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
        The name of the Kafka cluster.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter(name="networkId")
def network_id(self) -> pulumi.Output[str]:
"""
        ID of the network to which the Kafka cluster belongs.
"""
return pulumi.get(self, "network_id")
@property
@pulumi.getter(name="securityGroupIds")
def security_group_ids(self) -> pulumi.Output[Sequence[str]]:
"""
        IDs of the security groups to which the Kafka cluster belongs.
"""
return pulumi.get(self, "security_group_ids")
@property
@pulumi.getter
def status(self) -> pulumi.Output[str]:
"""
Status of the cluster. Can be either `CREATING`, `STARTING`, `RUNNING`, `UPDATING`, `STOPPING`, `STOPPED`, `ERROR` or `STATUS_UNKNOWN`.
For more information see `status` field of JSON representation in [the official documentation](https://cloud.yandex.com/docs/managed-kafka/api-ref/Cluster/).
"""
return pulumi.get(self, "status")
@property
@pulumi.getter(name="subnetIds")
def subnet_ids(self) -> pulumi.Output[Optional[Sequence[str]]]:
"""
        IDs of the subnets to which the Kafka cluster belongs.
"""
return pulumi.get(self, "subnet_ids")
@property
@pulumi.getter
def topics(self) -> pulumi.Output[Optional[Sequence['outputs.MdbKafkaClusterTopic']]]:
"""
To manage topics, please switch to using a separate resource type `MdbKafkaTopic`.
"""
return pulumi.get(self, "topics")
@property
@pulumi.getter
def users(self) -> pulumi.Output[Optional[Sequence['outputs.MdbKafkaClusterUser']]]:
"""
A user of the Kafka cluster. The structure is documented below.
"""
return pulumi.get(self, "users")
| 45.752508 | 188 | 0.60826 | 5,838 | 54,720 | 5.509078 | 0.056869 | 0.096107 | 0.059636 | 0.055189 | 0.927616 | 0.916392 | 0.89727 | 0.889186 | 0.878708 | 0.866146 | 0 | 0.008125 | 0.286988 | 54,720 | 1,195 | 189 | 45.790795 | 0.816204 | 0.434722 | 0 | 0.784466 | 1 | 0 | 0.125586 | 0.038802 | 0 | 0 | 0 | 0 | 0 | 1 | 0.163107 | false | 0.001942 | 0.013592 | 0 | 0.275728 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
# --- venvctrl/__init__.py (repo: flowersjg/venvctrl, license: MIT) ---
"""Python API for managing virtualenv."""
from __future__ import division
from __future__ import absolute_import
from __future__ import print_function
from __future__ import unicode_literals
# --- yowsup/layers/protocol_groups/protocolentities/iq_groups_participants_add.py (repo: rihbyne/yowsup, license: MIT) ---
'''
<iq type="set" id="{{id}}" xmlns="w:g", to="{{group_jid}}">
<add>
<participant jid="{{jid}}""></participant>
<participant jid="{{jid}}""></participant>
</add>
</iq>
'''
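
# The stanza documented above can be sketched with the standard library's
# ElementTree. This is an illustration only: yowsup assembles these stanzas
# from its own protocol-entity classes rather than ElementTree, and the
# function name and argument names below are assumptions, not yowsup API.
def _build_add_participants_iq(id_, group_jid, participant_jids):
    import xml.etree.ElementTree as ET
    iq = ET.Element('iq', {'type': 'set', 'id': id_, 'xmlns': 'w:g', 'to': group_jid})
    add = ET.SubElement(iq, 'add')
    for jid in participant_jids:
        # one <participant jid="..."/> child per member being added
        ET.SubElement(add, 'participant', {'jid': jid})
    return ET.tostring(iq, encoding='unicode')
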
# --- tests/test_broadcast.py (repo: jmeyers314/DP_SNe, license: BSD-2-Clause) ---
import numpy as np
import dpmm
from test_utils import timer
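
# The tests below lean on one NumPy idiom throughout: inserting np.newaxis into
# one operand so that two 1-d arrays broadcast into a full 2-d (or higher) grid
# of results. A minimal standalone sketch of that pattern (plain NumPy, no dpmm;
# the helper name is ours, not part of the test suite proper):
def _broadcast_outer_demo():
    x = np.array([1.0, 2.0])           # shape (2,)
    mu = np.array([1.0, 2.0, 3.0])     # shape (3,)
    grid = x[:, np.newaxis] - mu       # (2, 1) op (3,)  ->  (2, 3)
    assert grid.shape == (2, 3)
    assert grid[1, 2] == x[1] - mu[2]  # element (i, j) pairs x[i] with mu[j]
    return grid
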
@timer
def test_GaussianMeanKnownVariance():
"""Test broadcasting rules for GaussianMeanKnownVariance prior."""
# Test sample() method:
prior = dpmm.GaussianMeanKnownVariance(1.0, 1.0, 1.0)
arr = prior.sample()
assert isinstance(arr, float)
arr = prior.sample(size=1)
assert isinstance(arr, np.ndarray)
assert arr.shape == (1,)
assert arr.dtype == float
arr = prior.sample(size=(1,))
assert isinstance(arr, np.ndarray)
assert arr.shape == (1,)
assert arr.dtype == float
arr = prior.sample(size=10)
assert isinstance(arr, np.ndarray)
assert arr.shape == (10,)
assert arr.dtype == float
arr = prior.sample(size=(10, 20))
assert isinstance(arr, np.ndarray)
assert arr.shape == (10, 20)
assert arr.dtype == float
# Test like1() method:
prior = dpmm.GaussianMeanKnownVariance(1.0, 1.0, 1.0)
x = 1.0
mu = 1.0
arr = prior.like1(x, mu)
assert isinstance(arr, float)
x = np.array([1.0])
arr = prior.like1(x, mu)
assert isinstance(arr, np.ndarray)
assert arr.shape == (1,)
assert arr.dtype == float
assert arr[0] == prior.like1(x[0], mu)
x = np.array([1.0, 2.0])
arr = prior.like1(x, mu)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2,)
assert arr.dtype == float
for i, r in np.ndenumerate(arr):
assert r == prior.like1(x[i], mu)
x = np.array([[1.0, 2.0], [3.0, 4.0]])
arr = prior.like1(x, mu)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2, 2)
assert arr.dtype == float
for (i, j), r in np.ndenumerate(arr):
assert r == prior.like1(x[i, j], mu)
x = np.array([1.0, 2.0])
mu = np.array([2.0, 3.0])
arr = prior.like1(x, mu)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2,)
assert arr.dtype == float
for i, r in np.ndenumerate(arr):
assert r == prior.like1(x[i], mu[i])
x = np.array([1.0, 2.0])
mu = np.array([1.0, 2.0, 3.0])
arr = prior.like1(x[:, np.newaxis], mu)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2, 3)
assert arr.dtype == float
for (i, j), r in np.ndenumerate(arr):
assert r == prior.like1(x[i], mu[j])
arr = prior.like1(x, mu[:, np.newaxis])
assert isinstance(arr, np.ndarray)
assert arr.shape == (3, 2)
assert arr.dtype == float
for (i, j), r in np.ndenumerate(arr):
assert r == prior.like1(x[j], mu[i])
x = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
mu = np.array([10.0, 11.0, 12.0, 13.0])
arr = prior.like1(x[:, :, np.newaxis], mu)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2, 3, 4)
assert arr.dtype == float
for (i, j, k), r in np.ndenumerate(arr):
assert r == prior.like1(x[i, j], mu[k])
arr = prior.like1(x, mu[:, np.newaxis, np.newaxis])
assert isinstance(arr, np.ndarray)
assert arr.shape == (4, 2, 3)
assert arr.dtype == float
for (i, j, k), r in np.ndenumerate(arr):
assert r == prior.like1(x[j, k], mu[i])
# Test __call__() method:
prior = dpmm.GaussianMeanKnownVariance(1.0, 1.0, 1.0)
mu = 1.0
arr = prior(mu)
assert isinstance(arr, float)
mu = np.array([1.0])
arr = prior(mu)
assert isinstance(arr, np.ndarray)
assert arr.shape == (1,)
assert arr.dtype == float
assert arr[0] == prior(mu[0])
mu = np.array([1.0, 2.0])
arr = prior(mu)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2,)
assert arr.dtype == float
for i, r in np.ndenumerate(arr):
assert r == prior(mu[i])
mu = np.array([[1.0, 2.0], [3.0, 4.0]])
arr = prior(mu)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2, 2)
assert arr.dtype == float
for (i, j), r in np.ndenumerate(arr):
assert r == prior(mu[i, j])
# Should _post_params method do any broadcasting?
# Test pred method():
    prior = dpmm.GaussianMeanKnownVariance(1.0, 1.0, 1.0)
x = 1.0
arr = prior.pred(x)
assert isinstance(arr, float)
x = np.array([1.0])
arr = prior.pred(x)
assert isinstance(arr, np.ndarray)
assert arr.shape == (1,)
assert arr.dtype == float
assert arr[0] == prior.pred(x[0])
x = np.array([1.0, 2.0])
arr = prior.pred(x)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2,)
assert arr.dtype == float
for i, r in np.ndenumerate(arr):
assert r == prior.pred(x[i])
    x = np.arange(6.0).reshape(3, 2) + 1
arr = prior.pred(x)
assert isinstance(arr, np.ndarray)
assert arr.shape == (3, 2)
assert arr.dtype == float
for (i, j), r in np.ndenumerate(arr):
assert r == prior.pred(x[i, j])


@timer
def test_InvGamma():
"""Test broadcasting rules for InvGamma prior."""
# Test sample() method:
prior = dpmm.InvGamma(1.0, 1.0, 0.0)
arr = prior.sample()
assert isinstance(arr, float)
arr = prior.sample(size=1)
assert isinstance(arr, np.ndarray)
assert arr.shape == (1,)
assert arr.dtype == float
arr = prior.sample(size=(1,))
assert isinstance(arr, np.ndarray)
assert arr.shape == (1,)
assert arr.dtype == float
arr = prior.sample(size=10)
assert isinstance(arr, np.ndarray)
assert arr.shape == (10,)
assert arr.dtype == float
arr = prior.sample(size=(10, 20))
assert isinstance(arr, np.ndarray)
assert arr.shape == (10, 20)
assert arr.dtype == float
# Test like1() method:
prior = dpmm.InvGamma(1.0, 1.0, 0.0)
x = 1.0
var = 1.0
arr = prior.like1(x, var)
assert isinstance(arr, float)
x = np.array([1.0])
arr = prior.like1(x, var)
assert isinstance(arr, np.ndarray)
assert arr.shape == (1,)
assert arr.dtype == float
assert arr[0] == prior.like1(x[0], var)
x = np.array([1.0, 2.0])
arr = prior.like1(x, var)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2,)
assert arr.dtype == float
for i, r in np.ndenumerate(arr):
assert r == prior.like1(x[i], var)
x = np.array([[1.0, 2.0], [3.0, 4.0]])
arr = prior.like1(x, var)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2, 2)
assert arr.dtype == float
for (i, j), r in np.ndenumerate(arr):
assert r == prior.like1(x[i, j], var)
x = np.array([1.0, 2.0])
var = np.array([2.0, 3.0])
arr = prior.like1(x, var)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2,)
assert arr.dtype == float
for i, r in np.ndenumerate(arr):
assert r == prior.like1(x[i], var[i])
x = np.array([1.0, 2.0])
var = np.array([1.0, 2.0, 3.0])
arr = prior.like1(x[:, np.newaxis], var)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2, 3)
assert arr.dtype == float
for (i, j), r in np.ndenumerate(arr):
assert r == prior.like1(x[i], var[j])
arr = prior.like1(x, var[:, np.newaxis])
assert isinstance(arr, np.ndarray)
assert arr.shape == (3, 2)
assert arr.dtype == float
for (i, j), r in np.ndenumerate(arr):
assert r == prior.like1(x[j], var[i])
x = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
var = np.array([10.0, 11.0, 12.0, 13.0])
arr = prior.like1(x[:, :, np.newaxis], var)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2, 3, 4)
assert arr.dtype == float
for (i, j, k), r in np.ndenumerate(arr):
assert r == prior.like1(x[i, j], var[k])
arr = prior.like1(x, var[:, np.newaxis, np.newaxis])
assert isinstance(arr, np.ndarray)
assert arr.shape == (4, 2, 3)
assert arr.dtype == float
for (i, j, k), r in np.ndenumerate(arr):
assert r == prior.like1(x[j, k], var[i])
# Test __call__() method:
prior = dpmm.InvGamma(1.0, 1.0, 0.0)
var = 1.0
arr = prior(var)
assert isinstance(arr, float)
var = np.array([1.0])
arr = prior(var)
assert isinstance(arr, np.ndarray)
assert arr.shape == (1,)
assert arr.dtype == float
assert arr[0] == prior(var[0])
var = np.array([1.0, 2.0])
arr = prior(var)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2,)
assert arr.dtype == float
for i, r in np.ndenumerate(arr):
assert r == prior(var[i])
var = np.array([[1.0, 2.0], [3.0, 4.0]])
arr = prior(var)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2, 2)
assert arr.dtype == float
for (i, j), r in np.ndenumerate(arr):
assert r == prior(var[i, j])
# Should _post_params method do any broadcasting?
# Test pred method():
prior = dpmm.InvGamma(1.0, 1.0, 0.0)
x = 1.0
arr = prior.pred(x)
assert isinstance(arr, float)
x = np.array([1.0])
arr = prior.pred(x)
assert isinstance(arr, np.ndarray)
assert arr.shape == (1,)
assert arr.dtype == float
assert arr[0] == prior.pred(x[0])
x = np.array([1.0, 2.0])
arr = prior.pred(x)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2,)
assert arr.dtype == float
for i, r in np.ndenumerate(arr):
assert r == prior.pred(x[i])
    x = np.arange(6.0).reshape(3, 2) + 1
arr = prior.pred(x)
assert isinstance(arr, np.ndarray)
assert arr.shape == (3, 2)
assert arr.dtype == float
for (i, j), r in np.ndenumerate(arr):
assert r == prior.pred(x[i, j])


@timer
def test_InvGamma2D():
"""Test broadcasting rules for InvGamma2D prior."""
# Test sample() method:
prior = dpmm.InvGamma2D(1.0, 1.0, np.array([0.0, 0.0]))
arr = prior.sample()
assert isinstance(arr, float)
arr = prior.sample(size=1)
assert isinstance(arr, np.ndarray)
assert arr.shape == (1,)
assert arr.dtype == float
arr = prior.sample(size=(1,))
assert isinstance(arr, np.ndarray)
assert arr.shape == (1,)
assert arr.dtype == float
arr = prior.sample(size=10)
assert isinstance(arr, np.ndarray)
assert arr.shape == (10,)
assert arr.dtype == float
arr = prior.sample(size=(10, 20))
assert isinstance(arr, np.ndarray)
assert arr.shape == (10, 20)
assert arr.dtype == float
# Test like1() method:
prior = dpmm.InvGamma2D(1.0, 1.0, np.array([0.0, 0.0]))
x = np.array([1.0, 2.0]) # Data is 2D, so trailing axis should always be len 2.
var = 1.0
arr = prior.like1(x, var)
assert isinstance(arr, float)
x = np.array([1.0]) # If trailing axis is not 2, then should get an AssertionError
var = 1.0
np.testing.assert_raises(AssertionError, prior.like1, x, var)
x = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
var = 1.0
arr = prior.like1(x, var)
assert isinstance(arr, np.ndarray)
assert arr.shape == (3,)
assert arr.dtype == float
for i, r in np.ndenumerate(arr):
assert r == prior.like1(x[i], var)
x = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
var = np.array([2.0, 3.0])
arr = prior.like1(x, var[:, np.newaxis])
assert isinstance(arr, np.ndarray)
assert arr.shape == (2, 3)
assert arr.dtype == float
for (i, j), r in np.ndenumerate(arr):
assert r == prior.like1(x[j], var[i])
x = np.arange(24, dtype=float).reshape(3, 4, 2)
var = np.array([2.0, 3.0])
    arr = prior.like1(x[:, :, np.newaxis, :], var)
assert isinstance(arr, np.ndarray)
assert arr.shape == (3, 4, 2)
assert arr.dtype == float
for (i, j, k), r in np.ndenumerate(arr):
assert r == prior.like1(x[i, j], var[k])
x = np.arange(24, dtype=float).reshape(3, 4, 2)
var = np.arange(12, dtype=float).reshape(3, 4) + 1 # add 1 so we don't divide by zero
arr = prior.like1(x, var)
assert isinstance(arr, np.ndarray)
assert arr.shape == (3, 4)
assert arr.dtype == float
for (i, j), r in np.ndenumerate(arr):
assert r == prior.like1(x[i, j], var[i, j])
# Test __call__() method:
prior = dpmm.InvGamma2D(1.0, 1.0, np.array([0.0, 0.0]))
var = 1.0
arr = prior(var)
assert isinstance(arr, float)
var = np.array([1.0])
arr = prior(var)
assert isinstance(arr, np.ndarray)
assert arr.shape == (1,)
assert arr.dtype == float
assert arr[0] == prior(var[0])
var = np.array([1.0, 2.0])
arr = prior(var)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2,)
assert arr.dtype == float
for i, r in np.ndenumerate(arr):
assert r == prior(var[i])
var = np.array([[1.0, 2.0], [3.0, 4.0]])
arr = prior(var)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2, 2)
assert arr.dtype == float
for (i, j), r in np.ndenumerate(arr):
assert r == prior(var[i, j])
# Should _post_params method do any broadcasting?
# Test pred method():
prior = dpmm.InvGamma2D(1.0, 1.0, np.array([0.0, 0.0]))
x = 1.0
np.testing.assert_raises(AssertionError, prior.pred, x)
x = np.array([1.0, 2.0])
arr = prior.pred(x)
assert isinstance(arr, float)
x = np.arange(24, dtype=float).reshape(3, 4, 2)
arr = prior.pred(x)
assert isinstance(arr, np.ndarray)
assert arr.shape == (3, 4)
assert arr.dtype == float
for (i, j), r in np.ndenumerate(arr):
assert r == prior.pred(x[i, j])


@timer
def test_NormInvChi2():
"""Test broadcasting rules for NormInvChi2 prior."""
# Test sample() method:
prior = dpmm.NormInvChi2(1.0, 1.0, 1.0, 1.0)
arr = prior.sample()
assert isinstance(arr, np.void)
assert arr.dtype == prior.model_dtype
arr = prior.sample(size=1)
assert isinstance(arr, np.ndarray)
assert arr.shape == (1,)
assert arr.dtype == prior.model_dtype
arr = prior.sample(size=(1,))
assert isinstance(arr, np.ndarray)
assert arr.shape == (1,)
assert arr.dtype == prior.model_dtype
arr = prior.sample(size=10)
assert isinstance(arr, np.ndarray)
assert arr.shape == (10,)
assert arr.dtype == prior.model_dtype
arr = prior.sample(size=(10, 20))
assert isinstance(arr, np.ndarray)
assert arr.shape == (10, 20)
assert arr.dtype == prior.model_dtype
# Test like1() method:
prior = dpmm.NormInvChi2(1.0, 1.0, 1.0, 1.0)
x = 1.0
mu = 1.0
var = 1.0
arr = prior.like1(x, mu, var)
assert isinstance(arr, float)
x = np.array([1.0])
mu = 1.0
var = 1.0
arr = prior.like1(x, mu, var)
assert isinstance(arr, np.ndarray)
assert arr.shape == (1,)
assert arr.dtype == float
assert arr[0] == prior.like1(x[0], mu, var)
x = np.array([1.0, 2.0, 3.0, 4.0])
mu = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
var = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
arr = prior.like1(x[:, np.newaxis, np.newaxis], mu, var)
assert isinstance(arr, np.ndarray)
assert arr.shape == (4, 2, 3)
assert arr.dtype == float
for (i, j, k), r in np.ndenumerate(arr):
assert r == prior.like1(x[i], mu[j, k], var[j, k])
theta = np.zeros((2, 3), dtype=prior.model_dtype)
theta['mu'] = mu
theta['var'] = var
arr = prior.like1(x[:, np.newaxis, np.newaxis], theta)
assert isinstance(arr, np.ndarray)
assert arr.shape == (4, 2, 3)
assert arr.dtype == float
for (i, j, k), r in np.ndenumerate(arr):
assert r == prior.like1(x[i], theta[j, k])
arr = prior.like1(x, mu[:, :, np.newaxis], var[:, :, np.newaxis])
assert isinstance(arr, np.ndarray)
assert arr.shape == (2, 3, 4)
assert arr.dtype == float
for (i, j, k), r in np.ndenumerate(arr):
assert r == prior.like1(x[k], mu[i, j], var[i, j])
arr = prior.like1(x, theta[:, :, np.newaxis])
assert isinstance(arr, np.ndarray)
assert arr.shape == (2, 3, 4)
assert arr.dtype == float
for (i, j, k), r in np.ndenumerate(arr):
assert r == prior.like1(x[k], theta[i, j])
# Test __call__() method:
prior = dpmm.NormInvChi2(1.0, 1.0, 1.0, 1.0)
mu = 1.0
var = 1.0
arr = prior(mu, var)
assert isinstance(arr, float)
mu = np.array([1.0])
arr = prior(mu, var)
assert isinstance(arr, np.ndarray)
assert arr.shape == (1,)
assert arr.dtype == float
assert arr[0] == prior(mu[0], var)
mu = np.array([1.0, 2.0])
var = np.array([10.0, 11.0, 12.0])
arr = prior(mu[:, np.newaxis], var)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2, 3)
assert arr.dtype == float
for (i, j), r in np.ndenumerate(arr):
assert r == prior(mu[i], var[j])
theta = np.zeros((2, 3), dtype=prior.model_dtype)
theta['mu'] = mu[:, np.newaxis]
theta['var'] = var
arr = prior(theta)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2, 3)
assert arr.dtype == float
for (i, j), r in np.ndenumerate(arr):
        assert r == prior(theta[i, j])
# Should _post_params method do any broadcasting?
# Test pred method():
prior = dpmm.NormInvChi2(1.0, 1.0, 1.0, 1.0)
x = 1.0
arr = prior.pred(x)
assert isinstance(arr, float)
x = np.array([1.0])
arr = prior.pred(x)
assert isinstance(arr, np.ndarray)
assert arr.shape == (1,)
assert arr.dtype == float
assert arr[0] == prior.pred(x[0])
x = np.array([1.0, 2.0])
arr = prior.pred(x)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2,)
assert arr.dtype == float
for i, r in np.ndenumerate(arr):
assert r == prior.pred(x[i])
x = np.array([[1.0, 2.0], [3.0, 4.0]])
arr = prior.pred(x)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2, 2)
assert arr.dtype == float
for (i, j), r in np.ndenumerate(arr):
assert r == prior.pred(x[i, j])
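
# The NormInvChi2 and NormInvGamma tests pack (mu, var) parameter pairs into
# NumPy structured arrays via prior.model_dtype. A minimal sketch of that
# pattern with a hand-built dtype (the field layout here is an assumption made
# for illustration, not necessarily dpmm's actual model_dtype):
def _structured_theta_demo():
    model_dtype = np.dtype([('mu', float), ('var', float)])
    theta = np.zeros((2, 3), dtype=model_dtype)
    theta['mu'] = np.array([1.0, 2.0])[:, np.newaxis]   # broadcast down columns
    theta['var'] = np.array([10.0, 11.0, 12.0])         # broadcast across rows
    # field access returns a plain float view with the same (2, 3) shape
    assert theta['mu'].shape == theta.shape == (2, 3)
    return theta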


@timer
def test_NormInvGamma():
"""Test broadcasting rules for NormInvGamma prior."""
# Test sample() method:
prior = dpmm.NormInvGamma(1.0, 1.0, 1.0, 1.0)
arr = prior.sample()
assert isinstance(arr, np.void)
assert arr.dtype == prior.model_dtype
arr = prior.sample(size=1)
assert isinstance(arr, np.ndarray)
assert arr.shape == (1,)
assert arr.dtype == prior.model_dtype
arr = prior.sample(size=(1,))
assert isinstance(arr, np.ndarray)
assert arr.shape == (1,)
assert arr.dtype == prior.model_dtype
arr = prior.sample(size=10)
assert isinstance(arr, np.ndarray)
assert arr.shape == (10,)
assert arr.dtype == prior.model_dtype
arr = prior.sample(size=(10, 20))
assert isinstance(arr, np.ndarray)
assert arr.shape == (10, 20)
assert arr.dtype == prior.model_dtype
# Test like1() method:
prior = dpmm.NormInvGamma(1.0, 1.0, 1.0, 1.0)
x = 1.0
mu = 1.0
var = 1.0
arr = prior.like1(x, mu, var)
assert isinstance(arr, float)
x = np.array([1.0])
mu = 1.0
var = 1.0
arr = prior.like1(x, mu, var)
assert isinstance(arr, np.ndarray)
assert arr.shape == (1,)
assert arr.dtype == float
assert arr[0] == prior.like1(x[0], mu, var)
x = np.array([1.0, 2.0, 3.0, 4.0])
mu = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
var = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
arr = prior.like1(x[:, np.newaxis, np.newaxis], mu, var)
assert isinstance(arr, np.ndarray)
assert arr.shape == (4, 2, 3)
assert arr.dtype == float
for (i, j, k), r in np.ndenumerate(arr):
assert r == prior.like1(x[i], mu[j, k], var[j, k])
theta = np.zeros((2, 3), dtype=prior.model_dtype)
theta['mu'] = mu
theta['var'] = var
arr = prior.like1(x[:, np.newaxis, np.newaxis], theta)
assert isinstance(arr, np.ndarray)
assert arr.shape == (4, 2, 3)
assert arr.dtype == float
for (i, j, k), r in np.ndenumerate(arr):
assert r == prior.like1(x[i], theta[j, k])
arr = prior.like1(x, mu[:, :, np.newaxis], var[:, :, np.newaxis])
assert isinstance(arr, np.ndarray)
assert arr.shape == (2, 3, 4)
assert arr.dtype == float
for (i, j, k), r in np.ndenumerate(arr):
assert r == prior.like1(x[k], mu[i, j], var[i, j])
arr = prior.like1(x, theta[:, :, np.newaxis])
assert isinstance(arr, np.ndarray)
assert arr.shape == (2, 3, 4)
assert arr.dtype == float
for (i, j, k), r in np.ndenumerate(arr):
assert r == prior.like1(x[k], theta[i, j])
# Test __call__() method:
prior = dpmm.NormInvGamma(1.0, 1.0, 1.0, 1.0)
mu = 1.0
var = 1.0
arr = prior(mu, var)
assert isinstance(arr, float)
mu = np.array([1.0])
arr = prior(mu, var)
assert isinstance(arr, np.ndarray)
assert arr.shape == (1,)
assert arr.dtype == float
assert arr[0] == prior(mu[0], var)
mu = np.array([1.0, 2.0])
var = np.array([10.0, 11.0, 12.0])
arr = prior(mu[:, np.newaxis], var)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2, 3)
assert arr.dtype == float
for (i, j), r in np.ndenumerate(arr):
assert r == prior(mu[i], var[j])
theta = np.zeros((2, 3), dtype=prior.model_dtype)
theta['mu'] = mu[:, np.newaxis]
theta['var'] = var
arr = prior(theta)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2, 3)
assert arr.dtype == float
for (i, j), r in np.ndenumerate(arr):
assert r == prior(theta[i, j])
    # Should the _post_params() method do any broadcasting?
    # Test pred() method:
prior = dpmm.NormInvGamma(1.0, 1.0, 1.0, 1.0)
x = 1.0
arr = prior.pred(x)
assert isinstance(arr, float)
x = np.array([1.0])
arr = prior.pred(x)
assert isinstance(arr, np.ndarray)
assert arr.shape == (1,)
assert arr.dtype == float
assert arr[0] == prior.pred(x[0])
x = np.array([1.0, 2.0])
arr = prior.pred(x)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2,)
assert arr.dtype == float
for i, r in np.ndenumerate(arr):
assert r == prior.pred(x[i])
x = np.array([[1.0, 2.0], [3.0, 4.0]])
arr = prior.pred(x)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2, 2)
assert arr.dtype == float
for (i, j), r in np.ndenumerate(arr):
assert r == prior.pred(x[i, j])
@timer
def test_NormInvWish():
"""Test broadcasting rules for NormInvWish prior."""
# Test sample() method:
mu_0 = np.arange(3.0)
kappa_0 = 3.0
    Lam_0 = np.eye(3) + 0.01*np.arange(9).reshape(3, 3)
Lam_0 += Lam_0.T # To make symmetric
nu_0 = 3
prior = dpmm.NormInvWish(mu_0, kappa_0, Lam_0, nu_0)
arr = prior.sample()
assert isinstance(arr, np.void)
assert arr.dtype == prior.model_dtype
arr = prior.sample(size=1)
assert isinstance(arr, np.ndarray)
assert arr.shape == (1,)
assert arr.dtype == prior.model_dtype
arr = prior.sample(size=(1,))
assert isinstance(arr, np.ndarray)
assert arr.shape == (1,)
assert arr.dtype == prior.model_dtype
arr = prior.sample(size=10)
assert isinstance(arr, np.ndarray)
assert arr.shape == (10,)
assert arr.dtype == prior.model_dtype
arr = prior.sample(size=(10, 20))
assert isinstance(arr, np.ndarray)
assert arr.shape == (10, 20)
assert arr.dtype == prior.model_dtype
# Test like1() method:
prior = dpmm.NormInvWish(mu_0, kappa_0, Lam_0, nu_0)
x = np.arange(3.0)
mu = np.arange(3.0)+1.0
Sig = np.eye(3) + 0.03*np.arange(9).reshape(3, 3)
Sig += Sig.T
arr = prior.like1(x, mu, Sig)
assert isinstance(arr, float)
    # If the trailing axis of x is not of dim 3 (for these prior parameters), we should get an AssertionError
xbad = np.arange(2.0)
np.testing.assert_raises(AssertionError, prior.like1, xbad, mu, Sig)
# And similar checks for mu and Sig
mubad = np.arange(4.0)
np.testing.assert_raises(AssertionError, prior.like1, x, mubad, Sig)
Sigbad = np.eye(2)
np.testing.assert_raises(AssertionError, prior.like1, x, mu, Sigbad)
    # Try some non-trivial broadcasts
mu = np.arange(6.0).reshape(2, 3)
arr = prior.like1(x, mu, Sig)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2,)
for i, r in np.ndenumerate(arr):
assert r == prior.like1(x, mu[i], Sig)
theta = np.zeros((2,), dtype=prior.model_dtype)
theta['mu'] = mu
theta['Sig'] = Sig
arr = prior.like1(x, theta)
for i, r in np.ndenumerate(arr):
assert r == prior.like1(x, theta[i])
mu = np.empty((3, 4, 3), dtype=float)
Sig = np.empty((3, 4, 3, 3), dtype=float)
for i in range(3):
for j in range(4):
mu[i, j] = np.arange(3.0)
Sig[i, j] = np.eye(3)+0.1*i+0.2*j
arr = prior.like1(x, mu, Sig)
for (i, j), r in np.ndenumerate(arr):
assert r == prior.like1(x, mu[i, j], Sig[i, j])
theta = np.empty((3, 4), dtype=prior.model_dtype)
theta['mu'] = mu
theta['Sig'] = Sig
arr = prior.like1(x, theta)
for (i, j), r in np.ndenumerate(arr):
assert r == prior.like1(x, theta[i, j])
mu = np.arange(6.0).reshape(2, 3)
arr = prior.like1(x, mu[:, np.newaxis, np.newaxis, :], Sig)
for (i, j, k), r in np.ndenumerate(arr):
assert r == prior.like1(x, mu[i], Sig[j, k])
theta = np.empty((2, 3, 4), dtype=prior.model_dtype)
theta['mu'] = (np.arange(6.0).reshape(2, 3))[:, np.newaxis, np.newaxis, :]
theta['Sig'] = Sig
arr = prior.like1(x, theta)
for (i, j, k), r in np.ndenumerate(arr):
assert r == prior.like1(x, theta[i, j, k])
# Test __call__() method:
prior = dpmm.NormInvWish(mu_0, kappa_0, Lam_0, nu_0)
mu = np.arange(3.0)
Sig = np.eye(3)
arr = prior(mu, Sig)
assert isinstance(arr, float)
theta = np.zeros(1, dtype=prior.model_dtype)
theta['mu'] = mu
theta['Sig'] = Sig
arr = prior(theta[0])
assert isinstance(arr, float)
assert arr == prior(mu, Sig)
mu = np.arange(6.0).reshape(2, 3)
arr = prior(mu, Sig)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2,)
assert arr.dtype == float
for i, r in np.ndenumerate(arr):
assert r == prior(mu[i], Sig)
theta = np.zeros(2, dtype=prior.model_dtype)
theta['mu'] = mu
theta['Sig'] = Sig
arr = prior(theta)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2,)
assert arr.dtype == float
for i, r in np.ndenumerate(arr):
assert r == prior(theta[i])
mu = np.empty((3, 4, 3), dtype=float)
Sig = np.empty((3, 4, 3, 3), dtype=float)
for i in range(3):
for j in range(4):
mu[i, j] = np.arange(3.0)
Sig[i, j] = np.eye(3)+0.1*i+0.2*j
arr = prior(mu, Sig)
for (i, j), r in np.ndenumerate(arr):
assert r == prior(mu[i, j], Sig[i, j])
theta = np.zeros((3, 4), dtype=prior.model_dtype)
theta['mu'] = mu
theta['Sig'] = Sig
arr = prior(theta)
for (i, j), r in np.ndenumerate(arr):
assert r == prior(theta[i, j])
mu = np.arange(6.0).reshape(2, 3)
arr = prior(mu[:, np.newaxis, np.newaxis, :], Sig)
for (i, j, k), r in np.ndenumerate(arr):
assert r == prior(mu[i], Sig[j, k])
theta = np.zeros((2, 3, 4), dtype=prior.model_dtype)
theta['mu'] = mu[:, np.newaxis, np.newaxis, :]
theta['Sig'] = Sig
arr = prior(theta)
for (i, j, k), r in np.ndenumerate(arr):
assert r == prior(theta[i, j, k])
    # Should the _post_params() method do any broadcasting?
    # Test pred() method:
prior = dpmm.NormInvWish(mu_0, kappa_0, Lam_0, nu_0)
x = np.arange(3.0)+1
arr = prior.pred(x)
assert isinstance(arr, float)
x = np.arange(6.0).reshape(2, 3)
arr = prior.pred(x)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2,)
assert arr.dtype == float
for i, r in np.ndenumerate(arr):
assert r == prior.pred(x[i])
x = np.arange(24.0).reshape(2, 4, 3)
arr = prior.pred(x)
assert isinstance(arr, np.ndarray)
assert arr.shape == (2, 4)
assert arr.dtype == float
for (i, j), r in np.ndenumerate(arr):
np.testing.assert_almost_equal(r, prior.pred(x[i, j]))
if __name__ == '__main__':
test_GaussianMeanKnownVariance()
test_InvGamma()
test_InvGamma2D()
test_NormInvChi2()
test_NormInvGamma()
test_NormInvWish()
# Chapter04/impexp.py from PacktPublishing/Modernizing-Oracle-Tuxedo-Applications-with-Python (MIT)
import tuxedo as t
print(t.tpexport("Hello, world!"))
print(t.tpexport({"TA_CLASS": "Hello, world!"}))
print(t.tpexport({"TA_CLASS": "Hello, world!"}, t.TPEX_STRING))
print(t.tpimport(t.tpexport({"TA_CLASS": "Hello, world!"})))
print(
t.tpimport(
t.tpexport({"TA_CLASS": "Hello, world!"}, t.TPEX_STRING),
t.TPEX_STRING,
)
)
# tests/v2_validation/cattlevalidationtest/core/test_webhook_scale_service.py
# from bmdepesa/validation-tests (Apache-2.0)
from common_fixtures import *  # NOQA
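The `create_webhook` and `delete_webhook_verify` helpers come from `common_fixtures`; a rough sketch of what registering a receiver involves is below. The endpoint path and function name are assumptions for illustration, not taken from the fixture's actual code:

```python
import json

def receiver_request(base_url, project_id, payload):
    # Hypothetical helper: build the POST that registers a webhook
    # receiver with the Rancher webhook service. The /v1-webhooks
    # path is an assumption about the service, not read from this file.
    url = base_url + "/v1-webhooks/receivers?projectId=" + project_id
    return ("POST", url, json.dumps(payload))
```

The tests then read `url` and `id` out of the JSON response and POST to the returned trigger URL to fire the driver.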
def test_webhook_scaleup(client):
# This method tests the service scale up using webhook token
launch_config = {"imageUuid": TEST_IMAGE_UUID}
service, env = create_env_and_svc(client, launch_config)
assert service.state == "inactive"
service = client.wait_success(service.activate(), 90)
assert service.state == "active"
assert service.scale == 1
data = {
"name": "servicescaleuptest",
"driver": "scaleService",
"scaleServiceConfig": {
"action": "up",
"amount": 2,
"serviceId": service.id,
"min": 1,
"max": 4,
}
}
# Create Webhook
resp = create_webhook(env.accountId, data)
assert resp.status_code == 200
assert resp.url is not None
json_resp = json.loads(resp.content)
webhook_url = json_resp["url"]
webhook_id = json_resp["id"]
print "Webhook is " + repr(webhook_url)
print "Id is " + repr(webhook_id)
# Execute Webhook and verify that the scale is incremented by
# the amount specified
wh_resp = requests.post(webhook_url)
assert wh_resp.status_code == 200
service = client.wait_success(service)
assert service.scale == 3
# Delete the Webhook
delete_webhook_verify(env.accountId, webhook_id)
delete_all(client, [env])
def test_webhook_scaleup_beyond_max(client):
# This method tests the service scale up beyond the maximum allowed scale
launch_config = {"imageUuid": TEST_IMAGE_UUID}
service, env = create_env_and_svc(client, launch_config)
assert service.state == "inactive"
service = client.wait_success(service.activate(), 90)
assert service.state == "active"
assert service.scale == 1
data = {
"name": "servicescaleupbeyondmaxtest",
"driver": "scaleService",
"scaleServiceConfig": {
"action": "up",
"amount": 2,
"serviceId": service.id,
"min": 1,
"max": 3,
}
}
# Create Webhook
resp = create_webhook(env.accountId, data)
assert resp.status_code == 200
assert resp.url is not None
json_resp = json.loads(resp.content)
webhook_url = json_resp["url"]
webhook_id = json_resp["id"]
print "Webhook is " + repr(webhook_url)
print "Id is " + repr(webhook_id)
# Execute Webhook
wh_resp = requests.post(webhook_url)
assert wh_resp.status_code == 200
service = client.wait_success(service)
assert service.scale == 3
# Execute webhook again and ensure the scale
# cannot be incremented beyond max scale
wh_resp = requests.post(webhook_url)
assert wh_resp.status_code == 400
service = client.reload(service)
assert service.scale == 3
json_resp = json.loads(wh_resp.content)
print json_resp
expected_response = "Error Cannot scale above provided max " \
"scale value in executing driver for scaleService"
assert json_resp['message'] == expected_response
# Delete the Webhook
delete_webhook_verify(env.accountId, webhook_id)
delete_all(client, [env])
def test_webhook_scaleup_beyond_max_1(client):
# This method tests the service scale cannot got up beyond the max scale
# when the initial request to scale up itself is beyond the max scale
launch_config = {"imageUuid": TEST_IMAGE_UUID}
service, env = create_env_and_svc(client, launch_config, 2)
assert service.state == "inactive"
service = client.wait_success(service.activate(), 90)
assert service.state == "active"
assert service.scale == 2
data = {
"name": "servicescaleupbeyondmaxtest_1",
"driver": "scaleService",
"scaleServiceConfig": {
"action": "up",
"amount": 3,
"serviceId": service.id,
"min": 1,
"max": 4,
}
}
# Create Webhook
resp = create_webhook(env.accountId, data)
assert resp.status_code == 200
assert resp.url is not None
json_resp = json.loads(resp.content)
webhook_url = json_resp["url"]
webhook_id = json_resp["id"]
print "Webhook is " + repr(webhook_url)
print "Id is " + repr(webhook_id)
# Execute webhook and ensure the scale cannot be
# incremented beyond max scale
wh_resp = requests.post(webhook_url)
assert wh_resp.status_code == 400
service = client.reload(service)
assert service.scale == 2
json_resp = json.loads(wh_resp.content)
print "Json response is"
print json_resp
expected_response = "Error Cannot scale above provided max " \
"scale value in executing driver for scaleService"
assert json_resp['message'] == expected_response
# Scale down service to 1
service = client.update(service, name=service.name, scale=1)
service = client.wait_success(service, 300)
assert service.state == "active"
assert service.scale == 1
# Execute the webhook generated earlier and ensure service scale up
# is successful
wh_resp = requests.post(webhook_url)
assert wh_resp.status_code == 200
service = client.reload(service)
assert service.scale == 4
# Delete the Webhook
delete_webhook_verify(env.accountId, webhook_id)
delete_all(client, [env])
def test_webhook_scaledown(client):
# This method tests the service scale down using webhook token
launch_config = {"imageUuid": TEST_IMAGE_UUID}
service, env = create_env_and_svc(client, launch_config, scale=3)
assert service.state == "inactive"
service = client.wait_success(service.activate(), 90)
assert service.state == "active"
assert service.scale == 3
data = {
"name": "servicescaledowntest",
"driver": "scaleService",
"scaleServiceConfig": {
"action": "down",
"amount": 2,
"serviceId": service.id,
"min": 1,
"max": 4,
}
}
# Create Webhook
resp = create_webhook(env.accountId, data)
assert resp.status_code == 200
assert resp.url is not None
json_resp = json.loads(resp.content)
webhook_url = json_resp["url"]
webhook_id = json_resp["id"]
print "Webhook is" + repr(webhook_url)
print "Id is" + repr(webhook_id)
# Execute Webhook and ensure the scale is decremented
# by the amount specified
wh_resp = requests.post(webhook_url)
assert wh_resp.status_code == 200
service = client.wait_success(service)
assert service.scale == 1
# Delete the Webhook
delete_webhook_verify(env.accountId, webhook_id)
delete_all(client, [env])
def test_webhook_scaledown_below_min(client):
# This method tests the service scale down below the minimum allowed scale
env = client.create_stack(name=random_str())
env = client.wait_success(env)
assert env.state == "active"
launch_config = {"imageUuid": TEST_IMAGE_UUID}
service, env = create_env_and_svc(client, launch_config, scale=3)
service = client.wait_success(service)
assert service.state == "inactive"
service = client.wait_success(service.activate(), 90)
assert service.state == "active"
assert service.scale == 3
data = {
"name": "servicescaledownbelowmintest",
"driver": "scaleService",
"scaleServiceConfig": {
"action": "down",
"amount": 2,
"serviceId": service.id,
"min": 1,
"max": 4,
}
}
# Create Webhook
resp = create_webhook(env.accountId, data)
assert resp.status_code == 200
assert resp.url is not None
json_resp = json.loads(resp.content)
webhook_url = json_resp["url"]
webhook_id = json_resp["id"]
print "Webhook is" + repr(webhook_url)
print "Id is" + repr(webhook_id)
# Execute Webhook
wh_resp = requests.post(webhook_url)
assert wh_resp.status_code == 200
service = client.wait_success(service)
assert service.scale == 1
# Execute Webhook and ensure scale cannot be
# decremented beyond the min value
wh_resp = requests.post(webhook_url)
assert wh_resp.status_code == 400
service = client.reload(service)
assert service.scale == 1
json_resp = json.loads(wh_resp.content)
print json_resp
expected_response = "Error Cannot scale below provided min " \
"scale value in executing driver for scaleService"
assert json_resp['message'] == expected_response
# Delete the Webhook
delete_webhook_verify(env.accountId, webhook_id)
delete_all(client, [env])
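The up/down clamping exercised by the four tests above can be summarized as a pure function. This is a sketch of the scaleService driver's behavior as observed through the assertions (the 200/400 responses and error messages), not the webhook service's actual code:

```python
def apply_scale(current, action, amount, min_scale, max_scale):
    # Move `current` by `amount` in the given direction, rejecting any
    # request that would land outside [min_scale, max_scale]; a
    # rejection corresponds to the HTTP 400 responses asserted above.
    new_scale = current + amount if action == "up" else current - amount
    if new_scale > max_scale:
        raise ValueError("Cannot scale above provided max scale value")
    if new_scale < min_scale:
        raise ValueError("Cannot scale below provided min scale value")
    return new_scale
```

For example, `apply_scale(3, "down", 2, 1, 4)` returns 1, and a second "down" by 2 raises, matching the scale-down tests.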
def test_webhook_scaledown_below_min_1(client):
# This method tests the service scale cannot go down below the min scale
# when the initial request to scale down itself is below the min scale
env = client.create_stack(name=random_str())
env = client.wait_success(env)
assert env.state == "active"
launch_config = {"imageUuid": TEST_IMAGE_UUID}
service, env = create_env_and_svc(client, launch_config, scale=4)
service = client.wait_success(service)
assert service.state == "inactive"
service = client.wait_success(service.activate(), 90)
assert service.state == "active"
assert service.scale == 4
data = {
"name": "servicescaledownbelowmintest_1",
"driver": "scaleService",
"scaleServiceConfig": {
"action": "down",
"amount": 3,
"serviceId": service.id,
"min": 2,
"max": 4,
}
}
# Create Webhook
resp = create_webhook(env.accountId, data)
assert resp.status_code == 200
assert resp.url is not None
json_resp = json.loads(resp.content)
webhook_url = json_resp["url"]
webhook_id = json_resp["id"]
print "Webhook is" + repr(webhook_url)
print "Id is" + repr(webhook_id)
# Execute Webhook and ensure scale cannot be
# decremented below the min value
wh_resp = requests.post(webhook_url)
assert wh_resp.status_code == 400
service = client.reload(service)
assert service.scale == 4
json_resp = json.loads(wh_resp.content)
print json_resp
expected_response = "Error Cannot scale below provided min " \
"scale value in executing driver for scaleService"
assert json_resp['message'] == expected_response
# Scale up service to 5
service = client.update(service, name=service.name, scale=5)
service = client.wait_success(service, 300)
assert service.state == "active"
assert service.scale == 5
# Execute the webhook generated earlier and ensure service scale down
# is successful
wh_resp = requests.post(webhook_url)
assert wh_resp.status_code == 200
service = client.reload(service)
assert service.scale == 2
# Delete the Webhook
delete_webhook_verify(env.accountId, webhook_id)
delete_all(client, [env])
def test_webhook_invalid_scale_action(client):
# This method tests the use of invalid scale action
launch_config = {"imageUuid": TEST_IMAGE_UUID}
service, env = create_env_and_svc(client, launch_config)
assert service.state == "inactive"
service = client.wait_success(service.activate(), 90)
assert service.state == "active"
assert service.scale == 1
data = {
"name": "invalidactiontest",
"driver": "scaleService",
"scaleServiceConfig": {
"action": "updown",
"amount": 2,
"serviceId": service.id,
"min": 1,
"max": 4,
}
}
# Create Webhook and verify invalid action cannot be specified
resp = create_webhook(env.accountId, data)
assert resp.status_code == 400
json_resp = json.loads(resp.content)
print "JSON response is:"
print json_resp
expected_response = "Invalid action updown"
assert json_resp['message'] == expected_response
delete_all(client, [env])
def test_webhook_invalid_service_id(client):
# This method tests the use of invalid service id
env = client.create_stack(name=random_str())
env = client.wait_success(env)
assert env.state == "active"
# Provide an invalid service for the serviceId field
data = {
"name": "invalidserviceidtest",
"driver": "scaleService",
"scaleServiceConfig": {
"action": "down",
"amount": 2,
"serviceId": "1s1000a",
"min": 1,
"max": 4,
}
}
# Create Webhook and verify invalid service id cannot be specified
resp = create_webhook(env.accountId, data)
assert resp.status_code == 400
json_resp = json.loads(resp.content)
print "JSON response is:"
print json_resp
expected_response = "Invalid service 1s1000a"
assert json_resp['message'] == expected_response
delete_all(client, [env])
def test_webhook_scaleup_invalid_zero_amount(client):
# This method tests the scale amount of zero
launch_config = {"imageUuid": TEST_IMAGE_UUID}
service, env = create_env_and_svc(client, launch_config)
assert service.state == "inactive"
service = client.wait_success(service.activate(), 90)
assert service.state == "active"
data = {
"name": "zeroamounttest",
"driver": "scaleService",
"scaleServiceConfig": {
"action": "up",
"amount": 0,
"serviceId": service.id,
"min": 1,
"max": 4,
}
}
# Create Webhook and verify zero amount cannot be specified
r = create_webhook(env.accountId, data)
assert r.status_code == 400
assert r.url is not None
resp = json.loads(r.content)
print resp
expected_message = "Invalid amount: 0"
assert resp['message'] == expected_message
delete_all(client, [env])
def test_webhook_scaleup_invalid_negative_amount(client):
# This method tests the negative value for scale amount
launch_config = {"imageUuid": TEST_IMAGE_UUID}
service, env = create_env_and_svc(client, launch_config)
assert service.state == "inactive"
service = client.wait_success(service.activate(), 90)
assert service.state == "active"
assert service.scale == 1
data = {
"name": "negativeamounttest",
"driver": "scaleService",
"scaleServiceConfig": {
"action": "up",
"amount": -1,
"serviceId": service.id,
"min": 1,
"max": 4,
}
}
# Create Webhook and verify negative amount cannot be specified
r = create_webhook(env.accountId, data)
assert r.status_code == 400
assert r.url is not None
resp = json.loads(r.content)
print resp
expected_message = "Invalid amount: -1"
assert resp['message'] == expected_message
delete_all(client, [env])
def test_webhook_scaleup_invalid_zero_min(client):
# This method tests the zero value for min scale
launch_config = {"imageUuid": TEST_IMAGE_UUID}
service, env = create_env_and_svc(client, launch_config)
assert service.state == "inactive"
service = client.wait_success(service.activate(), 90)
assert service.state == "active"
assert service.scale == 1
data = {
"name": "zeromintest",
"driver": "scaleService",
"scaleServiceConfig": {
"action": "up",
"amount": 1,
"serviceId": service.id,
"min": 0,
"max": 4,
}
}
# Create Webhook and verify zero min scale cannot be specified
r = create_webhook(env.accountId, data)
assert r.status_code == 400
assert r.url is not None
resp = json.loads(r.content)
print resp
print resp['message']
expected_message = "Minimum scale not provided/invalid"
assert resp['message'] == expected_message
delete_all(client, [env])
def test_webhook_scaleup_invalid_negative_min(client):
# This method tests the negative value for min scale
launch_config = {"imageUuid": TEST_IMAGE_UUID}
service, env = create_env_and_svc(client, launch_config)
assert service.state == "inactive"
service = client.wait_success(service.activate(), 90)
assert service.state == "active"
assert service.scale == 1
data = {
"name": "negativemintest",
"driver": "scaleService",
"scaleServiceConfig": {
"action": "up",
"amount": 1,
"serviceId": service.id,
"min": -1,
"max": 4,
}
}
# Create Webhook and verify negative min scale cannot be specified
r = create_webhook(env.accountId, data)
assert r.status_code == 400
assert r.url is not None
resp = json.loads(r.content)
print resp
expected_message = "Minimum scale not provided/invalid"
assert resp['message'] == expected_message
delete_all(client, [env])
def test_webhook_scaleup_invalid_zero_max(client):
# This method tests the zero value for max scale
launch_config = {"imageUuid": TEST_IMAGE_UUID}
service, env = create_env_and_svc(client, launch_config)
assert service.state == "inactive"
service = client.wait_success(service.activate(), 90)
assert service.state == "active"
assert service.scale == 1
data = {
"name": "zeromaxtest",
"driver": "scaleService",
"scaleServiceConfig": {
"action": "up",
"amount": 1,
"serviceId": service.id,
"min": 1,
"max": 0,
}
}
# Create Webhook and verify zero max scale cannot be specified
r = create_webhook(env.accountId, data)
assert r.status_code == 400
assert r.url is not None
resp = json.loads(r.content)
print resp
print resp['message']
expected_message = "Maximum scale not provided/invalid"
assert resp['message'] == expected_message
delete_all(client, [env])
def test_webhook_scaleup_invalid_negative_max(client):
# This method tests the negative value for max scale
launch_config = {"imageUuid": TEST_IMAGE_UUID}
service, env = create_env_and_svc(client, launch_config)
assert service.state == "inactive"
service = client.wait_success(service.activate(), 90)
assert service.state == "active"
assert service.scale == 1
data = {
"name": "negativemaxtest",
"driver": "scaleService",
"scaleServiceConfig": {
"action": "up",
"amount": 1,
"serviceId": service.id,
"min": 1,
"max": -1,
}
}
# Create Webhook and verify negative max scale cannot be specified
r = create_webhook(env.accountId, data)
assert r.status_code == 400
assert r.url is not None
resp = json.loads(r.content)
print resp
print resp['message']
expected_message = "Maximum scale not provided/invalid"
assert resp['message'] == expected_message
delete_all(client, [env])
def test_webhook_duplicatename(client):
# This method tests that a duplicate webhook cannot be generated
launch_config = {"imageUuid": TEST_IMAGE_UUID}
service, env = create_env_and_svc(client, launch_config)
assert service.state == "inactive"
service = client.wait_success(service.activate(), 90)
assert service.state == "active"
assert service.scale == 1
data = {
"name": "duplicatenametest",
"driver": "scaleService",
"scaleServiceConfig": {
"action": "up",
"amount": 2,
"serviceId": service.id,
"min": 1,
"max": 4,
}
}
# Create Webhook
resp = create_webhook(env.accountId, data)
assert resp.status_code == 200
assert resp.url is not None
json_resp = json.loads(resp.content)
webhook_url = json_resp["url"]
webhook_id = json_resp["id"]
print "Webhook is " + repr(webhook_url)
print "Id is " + repr(webhook_id)
resp = create_webhook(env.accountId, data)
assert resp.status_code == 400
json_resp = json.loads(resp.content)
print json_resp
expected_message = "Cannot have duplicate webhook name, webhook " \
+ data["name"] + " already exists"
assert json_resp['message'] == expected_message
# Delete the Webhook
delete_webhook_verify(env.accountId, webhook_id)
delete_all(client, [env])
def test_webhook_external_service(client):
# This method tests that an external service cannot be
# scaled up/down using webhook
env = create_env(client)
random_name = random_str()
ext_service_name = random_name.replace("-", "")
ext_service = client.create_externalService(
name=ext_service_name, stackId=env.id, hostname="google.com")
ext_service = client.wait_success(ext_service, 90)
assert ext_service.state == "inactive"
activate_svc(client, ext_service)
    assert ext_service.state == "active"
data = {
"name": "extservicetest",
"driver": "scaleService",
"scaleServiceConfig": {
"action": "up",
"amount": 1,
"serviceId": ext_service.id,
"min": 1,
"max": 4,
}
}
# Create Webhook
resp = create_webhook(env.accountId, data)
assert resp.status_code == 400
json_resp = json.loads(resp.content)
print json_resp
expected_message = "Can only create webhooks for Services. " \
"The supplied service is of type externalService"
assert json_resp['message'] == expected_message
delete_all(client, [env])
def test_webhook_global_service(client):
# This method tests that a global service cannot be
# scaled up/down using webhook
launch_config = {"imageUuid": TEST_IMAGE_UUID}
launch_config["labels"] = {"io.rancher.scheduler.global": "true"}
service, env = create_env_and_svc(
client, launch_config, scale=None)
service = client.wait_success(service.activate(), 90)
assert service.state == "active"
data = {
"name": "globalservicetest",
"driver": "scaleService",
"scaleServiceConfig": {
"action": "up",
"amount": 1,
"serviceId": service.id,
"min": 1,
"max": 4,
}
}
# Create Webhook
resp = create_webhook(env.accountId, data)
assert resp.status_code == 400
json_resp = json.loads(resp.content)
print json_resp
expected_message = "Cannot create webhook for global service " + service.id
assert json_resp['message'] == expected_message
delete_all(client, [env])
def test_webhook_service_no_image(client):
# This method tests that a service with no image
# cannot be scaled up/down using webhook
launch_config = {"imageUuid": "docker:rancher/none"}
env = create_env(client)
# Create Service
random_name = random_str()
service_name = random_name.replace("-", "")
service = client.create_service(
name=service_name, stackId=env.id,
launchConfig=launch_config, scale=0,
selectorContainer="test=none")
service = client.wait_success(service)
assert service.state == "inactive"
service.activate()
service = client.wait_success(service)
assert service.state == "active"
data = {
"name": "servicewithnoimagetest",
"driver": "scaleService",
"scaleServiceConfig": {
"action": "up",
"amount": 1,
"serviceId": service.id,
"min": 1,
"max": 4,
}
}
# Create Webhook
resp = create_webhook(env.accountId, data)
assert resp.status_code == 400
json_resp = json.loads(resp.content)
print json_resp
expected_message = "Cannot create webhook for service " \
"with no image " + service.id
assert json_resp['message'] == expected_message
delete_all(client, [env])
def test_webhook_missing_projectid(client):
# This method tests that executing a webhook with missing
# project id gives an error message
launch_config = {"imageUuid": TEST_IMAGE_UUID}
service, env = create_env_and_svc(client, launch_config)
assert service.state == "inactive"
service = client.wait_success(service.activate(), 90)
assert service.state == "active"
assert service.scale == 1
data = {
"name": "missingprojectidtest",
"driver": "scaleService",
"scaleServiceConfig": {
"action": "up",
"amount": 1,
"serviceId": service.id,
"min": 1,
"max": 4,
}
}
# Create Webhook
resp = create_webhook(env.accountId, data)
assert resp.status_code == 200
assert resp.url is not None
json_resp = json.loads(resp.content)
webhook_url = json_resp["url"]
webhook_id = json_resp["id"]
print "Webhook is " + repr(webhook_url)
print "Id is " + repr(webhook_id)
# Remove the project id (last three characters) from the URL
webhook_url = webhook_url[:-3]
print webhook_url
# Execute webhook with missing project Id and verify that it gives
# an "Invalid" error message
wh_resp = requests.post(webhook_url)
assert wh_resp.status_code == 400
json_resp = json.loads(wh_resp.content)
print "JSON Response is"
print json_resp
expected_keyword = "Invalid"
jsonstrresponse = str(json_resp['message'])
    assert expected_keyword in jsonstrresponse
# Delete the Webhook
delete_webhook_verify(env.accountId, webhook_id)
delete_all(client, [env])


def test_webhook_invalid_projectid(client):
    # This method tests that executing a webhook with an invalid
    # project id gives an error message
    launch_config = {"imageUuid": TEST_IMAGE_UUID}
    service, env = create_env_and_svc(client, launch_config)
    assert service.state == "inactive"
    service = client.wait_success(service.activate(), 90)
    assert service.state == "active"
    assert service.scale == 1
    data = {
        "name": "invalidprojectidtest",
        "driver": "scaleService",
        "scaleServiceConfig": {
            "action": "up",
            "amount": 1,
            "serviceId": service.id,
            "min": 1,
            "max": 4,
        }
    }
    # Create Webhook
    resp = create_webhook(env.accountId, data)
    assert resp.status_code == 200
    assert resp.url is not None
    json_resp = json.loads(resp.content)
    webhook_url = json_resp["url"]
    webhook_id = json_resp["id"]
    print "Webhook is " + repr(webhook_url)
    print "Id is " + repr(webhook_id)
    webhook_url_split = webhook_url.split("projectId=")
    print webhook_url_split
    projectId = webhook_url_split[1]
    print projectId
    invalid_projectId = "1e1000"
    print webhook_url
    # Use the invalid project id "1e1000" in the URL
    webhook_url_with_invalid_projectId = webhook_url_split[0] + \
        "projectId=" + invalid_projectId
    print "Webhook URL with invalid project id:"
    print webhook_url_with_invalid_projectId
    # Execute webhook with invalid project Id and
    # verify that it gives an error message
    wh_resp = requests.post(webhook_url_with_invalid_projectId)
    print "Response is : "
    print wh_resp
    assert wh_resp.status_code == 500
    json_resp = json.loads(wh_resp.content)
    print "JSON Response is"
    print json_resp
    expected_keyword = "Error"
    assert expected_keyword in str(json_resp['message'])
    # Delete the Webhook
    delete_webhook_verify(env.accountId, webhook_id)
    delete_all(client, [env])


def test_webhook_invalid_token(client):
    # This method tests that executing a webhook with an
    # invalid token gives an error message
    launch_config = {"imageUuid": TEST_IMAGE_UUID}
    service, env = create_env_and_svc(client, launch_config)
    assert service.state == "inactive"
    service = client.wait_success(service.activate(), 90)
    assert service.state == "active"
    assert service.scale == 1
    data = {
        "name": "invalidtokentest",
        "driver": "scaleService",
        "scaleServiceConfig": {
            "action": "up",
            "amount": 1,
            "serviceId": service.id,
            "min": 1,
            "max": 4,
        }
    }
    # Create Webhook
    resp = create_webhook(env.accountId, data)
    assert resp.status_code == 200
    assert resp.url is not None
    json_resp = json.loads(resp.content)
    webhook_url = json_resp["url"]
    webhook_id = json_resp["id"]
    print "Webhook is " + repr(webhook_url)
    print "Id is " + repr(webhook_id)
    webhook_url_split = webhook_url.split("key=")
    key = webhook_url_split[1]
    # Create invalid key by removing first 3 characters of the key
    modified_key = key[3:]
    print "Modified key"
    print modified_key
    print webhook_url
    # Use the invalid key in the URL
    webhook_url_with_invalid_token = webhook_url_split[0] + "key=" \
        + modified_key
    print "Webhook URL with Invalid token:"
    print webhook_url_with_invalid_token
    # Execute webhook with invalid key/token and verify that it gives
    # an error message
    wh_resp = requests.post(webhook_url_with_invalid_token)
    print wh_resp
    assert wh_resp.status_code == 403
    json_resp = json.loads(wh_resp.content)
    print "JSON Response is"
    print json_resp
    expected_keyword = "revoked"
    assert expected_keyword in str(json_resp['message'])
    # Delete the Webhook
    delete_webhook_verify(env.accountId, webhook_id)
    delete_all(client, [env])


def test_webhook_execute_deleted_webhook(client):
    # This method tests that executing a deleted webhook gives
    # the appropriate error message
    launch_config = {"imageUuid": TEST_IMAGE_UUID}
    service, env = create_env_and_svc(client, launch_config)
    assert service.state == "inactive"
    service = client.wait_success(service.activate(), 90)
    assert service.state == "active"
    assert service.scale == 1
    data = {
        "name": "executedeletedwebhooktest",
        "driver": "scaleService",
        "scaleServiceConfig": {
            "action": "up",
            "amount": 1,
            "serviceId": service.id,
            "min": 1,
            "max": 4,
        }
    }
    # Create Webhook
    resp = create_webhook(env.accountId, data)
    assert resp.status_code == 200
    assert resp.url is not None
    json_resp = json.loads(resp.content)
    webhook_url = json_resp["url"]
    webhook_id = json_resp["id"]
    print "Webhook is " + repr(webhook_url)
    print "Id is " + repr(webhook_id)
    # Execute Webhook and verify that the scale is incremented by
    # the amount specified
    wh_resp = requests.post(webhook_url)
    assert wh_resp.status_code == 200
    service = client.wait_success(service)
    assert service.scale == 2
    # Delete the Webhook
    delete_webhook_verify(env.accountId, webhook_id)
    # Execute the deleted webhook and verify that it gives an error message
    wh_resp = requests.post(webhook_url)
    assert wh_resp.status_code == 403
    json_resp = json.loads(wh_resp.content)
    print "JSON Response is"
    print json_resp
    expected_message = "Requested webhook has been revoked"
    assert expected_message in str(json_resp['message'])
    delete_all(client, [env])


def test_webhook_invalid_driver(client):
    # This method tests the use of an invalid driver
    launch_config = {"imageUuid": TEST_IMAGE_UUID}
    service, env = create_env_and_svc(client, launch_config)
    assert service.state == "inactive"
    service = client.wait_success(service.activate(), 90)
    assert service.state == "active"
    assert service.scale == 1
    data = {
        "name": "invaliddrivertest",
        "driver": "scaleServiceupDown",
        "scaleServiceConfig": {
            "action": "up",
            "amount": 2,
            "serviceId": service.id,
            "min": 1,
            "max": 4,
        }
    }
    # Create Webhook and verify invalid action cannot be specified
    resp = create_webhook(env.accountId, data)
    assert resp.status_code == 400
    json_resp = json.loads(resp.content)
    print "JSON response is:"
    print json_resp
    expected_response = "Invalid driver " + data['driver']
    assert json_resp['message'] == expected_response
    delete_all(client, [env])


def test_webhook_list_single_webhook(client):
    # This method tests listing a single webhook
    launch_config = {"imageUuid": TEST_IMAGE_UUID}
    service, env = create_env_and_svc(client, launch_config)
    assert service.state == "inactive"
    service = client.wait_success(service.activate(), 90)
    assert service.state == "active"
    assert service.scale == 1
    data = {
        "name": "listsinglewebhooktest",
        "driver": "scaleService",
        "scaleServiceConfig": {
            "action": "up",
            "amount": 2,
            "serviceId": service.id,
            "min": 1,
            "max": 4,
        }
    }
    # Create Webhook
    resp = create_webhook(env.accountId, data)
    assert resp.status_code == 200
    assert resp.url is not None
    json_resp = json.loads(resp.content)
    webhook_url = json_resp["url"]
    webhook_id = json_resp["id"]
    print "Webhook is " + repr(webhook_url)
    print "Id is " + repr(webhook_id)
    # List the webhook by id and ensure we get the correct response
    resp = list_webhook(env.accountId, webhook_id=webhook_id)
    assert resp["name"] == "listsinglewebhooktest"
    assert resp["state"] == "active"
    assert resp["driver"] == "scaleService"
    # Execute Webhook and verify that the scale is incremented by
    # the amount specified
    wh_resp = requests.post(webhook_url)
    assert wh_resp.status_code == 200
    service = client.wait_success(service)
    assert service.scale == 3
    # Delete the Webhook
    delete_webhook_verify(env.accountId, webhook_id)
    delete_all(client, [env])

from django.conf.urls import url
import jupyter.views

urlpatterns = [url(r'^assignments$', jupyter.views.update_jupyter_task, name="jupyter.views.update_jupyter_task"), ]

# -*- coding: utf-8 -*-
import pytest
from comport.department.models import Department, Extractor
from comport.data.models import IncidentsUpdated, UseOfForceIncidentWPD, CitizenComplaintWPD
from testclient.JSON_test_client import JSONTestClient
@pytest.mark.usefixtures('db')
class TestExtractorWPD:
def test_post_uof_data(self, testapp):
''' New and updated UOF data from the extractor is processed as expected.
'''
# Set up the extractor
department = Department.create(name="W Police Department", short_name="WPD", load_defaults=False)
extractor, _ = Extractor.from_department_and_password(department=department, password="password")
# Set the correct authorization
testapp.authorization = ('Basic', (extractor.username, 'password'))
# post to the heartbeat URL to start the update
response = testapp.post_json("/data/heartbeat", params={"heartbeat": "heartbeat"})
# Post 5 fake incidents to the UOF endpoint
uof_count = 5
test_client = JSONTestClient()
uof_data = test_client.make_uof(count=uof_count, short_name=department.short_name)
response = testapp.post_json("/data/UOF", params={'month': 0, 'year': 0, 'data': uof_data})
# assert that we got the expected reponse
assert response.status_code == 200
# there are 5 incident rows in the database
check_uofs = UseOfForceIncidentWPD.query.all()
assert len(check_uofs) == uof_count
for incident in uof_data:
# verify that the opaqueIDs posted match those in the database
assert UseOfForceIncidentWPD.query.filter_by(opaque_id=incident['opaqueId']).first() is not None
# verify that the opaqueIds are recorded in IncidentsUpdated tables
record_updated = IncidentsUpdated.query.filter_by(opaque_id=incident['opaqueId']).first()
assert record_updated is not None
assert record_updated.department_id == department.id
assert record_updated.incident_type == "uof"
# post to the heartbeat URL to start the new update
response = testapp.post_json("/data/heartbeat", params={"heartbeat": "heartbeat"})
# Create 5 more fake incidents
new_data = test_client.make_uof(count=uof_count, short_name=department.short_name)
# give them the same opaqueIds as the first batch
for idx, _ in enumerate(new_data):
new_data[idx]['opaqueId'] = uof_data[idx]['opaqueId']
# post the new incident rows
response = testapp.post_json("/data/UOF", params={'month': 0, 'year': 0, 'data': new_data})
# assert that we got the expected reponse
assert response.status_code == 200
# there are 5 incident rows in the database
check_uofs = UseOfForceIncidentWPD.query.all()
assert len(check_uofs) == uof_count
# verify that the opaqueIDs posted match those in the database
for incident in uof_data:
assert UseOfForceIncidentWPD.query.filter_by(opaque_id=incident['opaqueId']).first() is not None
# Create 5 more fake incidents
new_data = test_client.make_uof(count=uof_count, short_name=department.short_name)
# give them the same opaqueIds as the first batch
for idx, _ in enumerate(new_data):
new_data[idx]['opaqueId'] = uof_data[idx]['opaqueId']
# post the new incident rows without starting a new update
response = testapp.post_json("/data/UOF", params={'month': 0, 'year': 0, 'data': new_data})
# assert that we got the expected reponse
assert response.status_code == 200
# there are 10 incident rows in the database
check_uofs = UseOfForceIncidentWPD.query.all()
assert len(check_uofs) == uof_count * 2

    def test_all_uof_records_destroyed_when_new_record_posted(self, testapp):
        ''' Posting a new record with an id that matches a set of past records destroys all of them.
        '''
        # Set up the extractor
        department = Department.create(name="W Police Department", short_name="WPD", load_defaults=False)
        extractor, _ = Extractor.from_department_and_password(department=department, password="password")
        # Set the correct authorization
        testapp.authorization = ('Basic', (extractor.username, 'password'))
        # post to the heartbeat URL to start the update
        response = testapp.post_json("/data/heartbeat", params={"heartbeat": "heartbeat"})
        # Post 5 fake incidents with an identical opaqueId to the UOF endpoint
        uof_count = 5
        test_client = JSONTestClient()
        uof_data = test_client.make_uof(count=uof_count, short_name=department.short_name)
        use_id = uof_data[0]['opaqueId']
        for idx, _ in enumerate(uof_data):
            uof_data[idx]['opaqueId'] = use_id
        response = testapp.post_json("/data/UOF", params={'month': 0, 'year': 0, 'data': uof_data})
        # assert that we got the expected response
        assert response.status_code == 200
        # there are 5 incident rows in the database
        check_uofs = UseOfForceIncidentWPD.query.all()
        assert len(check_uofs) == uof_count
        # all the records in the database have the same opaqueId
        uof_records = UseOfForceIncidentWPD.query.filter_by(opaque_id=use_id).all()
        assert len(uof_records) == uof_count
        # verify that the opaqueId is recorded in an IncidentsUpdated table
        record_updated = IncidentsUpdated.query.filter_by(opaque_id=use_id).first()
        assert record_updated is not None
        assert record_updated.incident_type == "uof"
        assert record_updated.department_id == department.id
        # post to the heartbeat URL to start a new update
        response = testapp.post_json("/data/heartbeat", params={"heartbeat": "heartbeat"})
        # Create 1 new fake incident
        new_data = test_client.make_uof(count=1, short_name=department.short_name)
        # give it the same opaqueId as the first batch
        new_data[0]['opaqueId'] = use_id
        # post the new incident row
        response = testapp.post_json("/data/UOF", params={'month': 0, 'year': 0, 'data': new_data})
        # assert that we got the expected response
        assert response.status_code == 200
        # there is now only 1 incident row in the database
        check_uofs = UseOfForceIncidentWPD.query.all()
        assert len(check_uofs) == 1
        # verify that the opaqueID posted matches that in the database
        assert check_uofs[0].opaque_id == use_id
        # verify that the opaqueId is recorded in an IncidentsUpdated table
        record_updated = IncidentsUpdated.query.filter_by(opaque_id=use_id).first()
        assert record_updated is not None
        assert record_updated.incident_type == "uof"
        assert record_updated.department_id == department.id

    def test_post_complaints_data(self, testapp):
        ''' New and updated complaints data from the extractor is processed as expected.
        '''
        # Set up the extractor
        department = Department.create(name="W Police Department", short_name="WPD", load_defaults=False)
        extractor, _ = Extractor.from_department_and_password(department=department, password="password")
        # Set the correct authorization
        testapp.authorization = ('Basic', (extractor.username, 'password'))
        # post to the heartbeat URL to start the update
        response = testapp.post_json("/data/heartbeat", params={"heartbeat": "heartbeat"})
        # Post 5 fake incidents to the complaints endpoint
        complaints_count = 5
        test_client = JSONTestClient()
        complaints_data = test_client.make_complaints(count=complaints_count, short_name=department.short_name)
        response = testapp.post_json("/data/complaints", params={'month': 0, 'year': 0, 'data': complaints_data})
        # assert that we got the expected response
        assert response.status_code == 200
        # there are 5 incident rows in the database
        check_complaints = CitizenComplaintWPD.query.all()
        assert len(check_complaints) == complaints_count
        for incident in complaints_data:
            # verify that the opaqueIDs posted match those in the database
            assert CitizenComplaintWPD.query.filter_by(opaque_id=incident['opaqueId']).first() is not None
            # verify that the opaqueIds are recorded in IncidentsUpdated tables
            record_updated = IncidentsUpdated.query.filter_by(opaque_id=incident['opaqueId']).first()
            assert record_updated is not None
            assert record_updated.incident_type == "complaints"
            assert record_updated.department_id == department.id
        # post to the heartbeat URL to start a new update
        response = testapp.post_json("/data/heartbeat", params={"heartbeat": "heartbeat"})
        # Create 5 more fake incidents
        new_data = test_client.make_complaints(count=complaints_count, short_name=department.short_name)
        # give them the same opaqueIds as the first batch
        for idx, _ in enumerate(new_data):
            new_data[idx]['opaqueId'] = complaints_data[idx]['opaqueId']
        # post the new incident rows
        response = testapp.post_json("/data/complaints", params={'month': 0, 'year': 0, 'data': new_data})
        # assert that we got the expected response
        assert response.status_code == 200
        # there are 5 incident rows in the database
        check_complaints = CitizenComplaintWPD.query.all()
        assert len(check_complaints) == complaints_count
        # verify that the opaqueIDs posted match those in the database
        for incident in complaints_data:
            assert CitizenComplaintWPD.query.filter_by(opaque_id=incident['opaqueId']).first() is not None
        # Create 5 more fake incidents
        new_data = test_client.make_complaints(count=complaints_count, short_name=department.short_name)
        # give them the same opaqueIds as the first batch
        for idx, _ in enumerate(new_data):
            new_data[idx]['opaqueId'] = complaints_data[idx]['opaqueId']
        # post the new incident rows without starting a new update
        response = testapp.post_json("/data/complaints", params={'month': 0, 'year': 0, 'data': new_data})
        # assert that we got the expected response
        assert response.status_code == 200
        # there are 10 incident rows in the database
        check_complaints = CitizenComplaintWPD.query.all()
        assert len(check_complaints) == complaints_count * 2

    def test_all_complaints_records_destroyed_when_new_record_posted(self, testapp):
        ''' Posting a new record with an id that matches a set of past records destroys all of them.
        '''
        # Set up the extractor
        department = Department.create(name="W Police Department", short_name="WPD", load_defaults=False)
        extractor, _ = Extractor.from_department_and_password(department=department, password="password")
        # Set the correct authorization
        testapp.authorization = ('Basic', (extractor.username, 'password'))
        # post to the heartbeat URL to start the update
        response = testapp.post_json("/data/heartbeat", params={"heartbeat": "heartbeat"})
        # Post 5 fake incidents with an identical opaqueId to the complaint endpoint
        complaints_count = 5
        test_client = JSONTestClient()
        complaints_data = test_client.make_complaints(count=complaints_count, short_name=department.short_name)
        use_id = complaints_data[0]['opaqueId']
        for idx, _ in enumerate(complaints_data):
            complaints_data[idx]['opaqueId'] = use_id
        response = testapp.post_json("/data/complaints", params={'month': 0, 'year': 0, 'data': complaints_data})
        # assert that we got the expected response
        assert response.status_code == 200
        # there are 5 incident rows in the database
        check_complaints = CitizenComplaintWPD.query.all()
        assert len(check_complaints) == complaints_count
        # all the records in the database have the same id
        complaint_records = CitizenComplaintWPD.query.filter_by(opaque_id=use_id).all()
        assert len(complaint_records) == complaints_count
        # verify that the opaqueId is recorded in an IncidentsUpdated table
        record_updated = IncidentsUpdated.query.filter_by(opaque_id=use_id).first()
        assert record_updated is not None
        assert record_updated.incident_type == "complaints"
        assert record_updated.department_id == department.id
        # post to the heartbeat URL to start a new update
        response = testapp.post_json("/data/heartbeat", params={"heartbeat": "heartbeat"})
        # Create 1 new fake incident
        new_data = test_client.make_complaints(count=1, short_name=department.short_name)
        # give it the same opaqueId as the first batch
        new_data[0]['opaqueId'] = use_id
        # post the new incident
        response = testapp.post_json("/data/complaints", params={'month': 0, 'year': 0, 'data': new_data})
        # assert that we got the expected response
        assert response.status_code == 200
        # there is now only 1 incident row in the database
        check_complaints = CitizenComplaintWPD.query.all()
        assert len(check_complaints) == 1
        # verify that the opaqueID posted matches that in the database
        assert check_complaints[0].opaque_id == use_id
        # verify that the opaqueId is recorded in an IncidentsUpdated table
        record_updated = IncidentsUpdated.query.filter_by(opaque_id=use_id).first()
        assert record_updated is not None
        assert record_updated.incident_type == "complaints"
        assert record_updated.department_id == department.id

# -*- coding: iso-8859-1 -*-
# Maintainer: joaander
import hoomd;
import hoomd.md;
from hoomd import *
hoomd.context.initialize()
import unittest
import os
import tempfile
import numpy
# unit tests for analyze.log
class analyze_log_tests (unittest.TestCase):
def setUp(self):
init.create_lattice(lattice.sc(a=2.1878096788957757),n=[5,5,4]);
hoomd.context.current.sorter.set_params(grid=8)
if hoomd.comm.get_rank() == 0:
tmp = tempfile.mkstemp(suffix='.test.log');
self.tmp_file = tmp[1];
else:
self.tmp_file = "invalid";
# tests basic creation of the analyzer
def test(self):
hoomd.analyze.log(quantities = ['test1', 'test2', 'test3'], period = 10, filename=self.tmp_file);
hoomd.run(100);
# tests with phase
def test_phase(self):
hoomd.analyze.log(quantities = ['test1', 'test2', 'test3'], period = 10, filename=self.tmp_file, phase=0);
hoomd.run(100);
# test set_params
def test_set_params(self):
ana = hoomd.analyze.log(quantities = ['test1', 'test2', 'test3'], period = 10, filename=self.tmp_file);
ana.set_params(quantities = ['test1']);
hoomd.run(100);
ana.set_params(delimiter = ' ');
hoomd.run(100);
ana.set_params(quantities = ['test2', 'test3'], delimiter=',')
hoomd.run(100);
ana.set_params(quantities = [u'test4', u'test5'], delimiter=',')
hoomd.run(100);
# test variable period
def test_variable(self):
ana = hoomd.analyze.log(quantities = ['test1', 'test2', 'test3'], period = lambda n: n*10, filename=self.tmp_file);
hoomd.run(100);
# test the initialization checks
def test_init_checks(self):
ana = hoomd.analyze.log(quantities = ['test1', 'test2', 'test3'], period = 10, filename=self.tmp_file);
ana.cpp_analyzer = None;
self.assertRaises(RuntimeError, ana.enable);
self.assertRaises(RuntimeError, ana.disable);
def tearDown(self):
hoomd.context.initialize();
if (hoomd.comm.get_rank()==0):
os.remove(self.tmp_file);

# test analyze.log with query
class analyze_log_query_tests (unittest.TestCase):
    def setUp(self):
        init.create_lattice(lattice.sc(a=1.5),n=[8,8,8]); # must be close enough to interact
        nl = hoomd.md.nlist.cell()
        self.pair = hoomd.md.pair.lj(r_cut=2.5, nlist = nl)
        self.pair.pair_coeff.set('A', 'A', epsilon=1.0, sigma=1.0)
        hoomd.md.integrate.mode_standard(dt=0.005);
        hoomd.md.integrate.langevin(hoomd.group.all(), seed=1, kT=1.0);

        hoomd.context.current.sorter.set_params(grid=8)

    # tests query with no output file
    def test(self):
        log = hoomd.analyze.log(quantities = ['potential_energy', 'kinetic_energy'], period = 10, filename=None);
        hoomd.run(102);
        t0 = log.query('timestep');
        U0 = log.query('potential_energy');
        K0 = log.query('kinetic_energy');
        hoomd.run(2);
        t1 = log.query('timestep');
        U1 = log.query('potential_energy');
        K1 = log.query('kinetic_energy');

        self.assertEqual(int(t0), 102)
        self.assertNotEqual(K0, 0);
        self.assertNotEqual(U0, 0);
        self.assertEqual(int(t1), 104)
        self.assertNotEqual(U0, U1);
        self.assertNotEqual(K0, K1);

    # tests basic creation of the analyzer
    def test_with_file(self):
        if hoomd.comm.get_rank() == 0:
            tmp = tempfile.mkstemp(suffix='.test.log');
            self.tmp_file = tmp[1];
        else:
            self.tmp_file = "invalid";

        log = hoomd.analyze.log(quantities = ['potential_energy', 'kinetic_energy'], period = 10, filename=self.tmp_file);
        hoomd.run(11);
        t0 = log.query('timestep');
        U0 = log.query('potential_energy');
        K0 = log.query('kinetic_energy');
        hoomd.run(2);
        t1 = log.query('timestep');
        U1 = log.query('potential_energy');
        K1 = log.query('kinetic_energy');

        self.assertEqual(int(t0), 10)
        self.assertNotEqual(K0, 0);
        self.assertNotEqual(U0, 0);
        self.assertEqual(int(t1), 10)
        self.assertEqual(U0, U1);
        self.assertEqual(K0, K1);

    def tearDown(self):
        self.pair = None;
        hoomd.context.initialize();

try:
    import h5py
except ImportError:
    enable_hdf5 = False
else:
    enable_hdf5 = True

if enable_hdf5:
    import hoomd.hdf5

# test hdf5.log with query
@unittest.skipIf(not enable_hdf5, "no h5py module available.")
class analyze_log_hdf5_query_tests (unittest.TestCase):
    def setUp(self):
        init.create_lattice(lattice.sc(a=1.5),n=[8,8,8]); # must be close enough to interact
        nl = hoomd.md.nlist.cell()
        self.pair = hoomd.md.pair.lj(r_cut=2.5, nlist = nl)
        self.pair.pair_coeff.set('A', 'A', epsilon=1.0, sigma=1.0)
        hoomd.md.integrate.mode_standard(dt=0.005);
        hoomd.md.integrate.langevin(hoomd.group.all(), seed=1, kT=1.0);

        hoomd.context.current.sorter.set_params(grid=8)

    # tests basic creation of the analyzer
    def test_with_file(self):
        if hoomd.comm.get_rank() == 0:
            tmp = tempfile.mkstemp(suffix='.test.h5');
            self.tmp_file = tmp[1];
        else:
            self.tmp_file = "invalid";

        with hoomd.hdf5.File(self.tmp_file,"a") as h5file:
            log = hoomd.hdf5.log(h5file, quantities = ['potential_energy', 'kinetic_energy'], period = 10);
            hoomd.run(11);
            t0 = log.query('timestep');
            U0 = log.query('potential_energy');
            K0 = log.query('kinetic_energy');
            hoomd.run(2);
            t1 = log.query('timestep');
            U1 = log.query('potential_energy');
            K1 = log.query('kinetic_energy');

            self.assertEqual(int(t0), 10)
            self.assertNotEqual(K0, 0);
            self.assertNotEqual(U0, 0);
            self.assertEqual(int(t1), 10)
            self.assertEqual(U0, U1);
            self.assertEqual(K0, K1);

    def tearDown(self):
        self.pair = None;
        hoomd.context.initialize();
        if (hoomd.comm.get_rank()==0):
            os.remove(self.tmp_file);

# unit tests for analyze.log_hdf5
@unittest.skipIf(not enable_hdf5, "no h5py module available.")
class analyze_log_hdf5_tests (unittest.TestCase):
    def setUp(self):
        init.create_lattice(lattice.sc(a=2.1878096788957757),n=[5,5,4]);
        hoomd.context.current.sorter.set_params(grid=8)

        if hoomd.comm.get_rank() == 0:
            tmp = tempfile.mkstemp(suffix='.test.log');
            self.tmp_file = tmp[1];
        else:
            self.tmp_file = "invalid";

    # tests basic creation of the analyzer
    def test(self):
        with hoomd.hdf5.File(self.tmp_file,"a") as h5file:
            hoomd.hdf5.log(h5file, quantities = ['test1', 'test2', 'test3'], period = 10);
            hoomd.run(100);

    def test_matrix(self):
        def callback(timestep):
            return numpy.random.rand(2, 3)

        with hoomd.hdf5.File(self.tmp_file,"a") as h5file:
            ana = hoomd.hdf5.log(h5file, quantities = ['test1', 'test2', 'test3'], matrix_quantities=["mtest1","mtest2"], period = 10);
            ana.register_callback("mtest1",callback,matrix=True)
            ana.register_callback("mtest2",callback,matrix=True)
            hoomd.run(100);

    # tests with phase
    def test_phase(self):
        def callback(timestep):
            return numpy.random.rand(2,3)

        with hoomd.hdf5.File(self.tmp_file, "a") as h5file:
            ana = hoomd.hdf5.log(h5file,quantities = ['test1', 'test2', 'test3'],matrix_quantities=["mtest1","mtest2"], period = 10,phase=0);
            ana.register_callback("mtest1",callback,matrix=True)
            ana.register_callback("mtest2",callback,matrix=True)
            hoomd.run(100);

    # test set_params
    def test_set_params(self):
        def callback(timestep):
            return numpy.random.rand(2, 3)

        with hoomd.hdf5.File(self.tmp_file,"a") as h5file:
            ana = hoomd.hdf5.log(h5file,quantities = ['test1', 'test2', 'test3'], matrix_quantities=["mtest1","mtest2"], period = 10);
            ana.register_callback("mtest1", callback, matrix=True)
            ana.register_callback("mtest2", callback, matrix=True)
            # hdf5 logger does not support changing the number of logged quantities on the fly.
            # ana.set_params(quantities = ['test1']);
            # hoomd.run(100);
            # ana.set_params(quantities = ['test2', 'test3'])
            # hoomd.run(100);
            # ana.set_params(quantities = [u'test4', u'test5'])
            # hoomd.run(100);
            # ana.set_params(matrix_quantities = [])
            # hoomd.run(100);
            ana.set_params(matrix_quantities = ["mtest1"])
            hoomd.run(100);
            ana.set_params(matrix_quantities = ["mtest1"])
            hoomd.run(100);

    # test the initialization checks
    def test_init_checks(self):
        with hoomd.hdf5.File(self.tmp_file,"a") as h5file:
            ana = hoomd.hdf5.log(h5file, quantities = ['test1', 'test2', 'test3'], period = 10);
            ana.cpp_analyzer = None;
            self.assertRaises(RuntimeError, ana.enable);
            self.assertRaises(RuntimeError, ana.disable);

    def tearDown(self):
        hoomd.context.initialize();
        if (hoomd.comm.get_rank()==0):
            os.remove(self.tmp_file);

if __name__ == '__main__':
    unittest.main(argv = ['test.py', '-v'])

from numpy import array


def case_de_141():
    ppc = {"version": '2'}
    ppc["baseMVA"] = 100.0
    ppc["bus"] = array([
[75, 2, 104.23, 20.85, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[502, 2, 230.56, 46.11, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[44, 2, 142.1, 28.42, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[492, 2, 78.44, 15.69, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[180, 2, 45.51, 9.1, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[46, 1, 0, 0, 0, 0, 5, -8.213670625178798e+17,0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[21, 2, 913.42, 182.68, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[33, 2, 188.05, 37.61, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[559, 2, 69.58, 13.92, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[18, 1, 0, 0, 0, 0, 5, -1.1120497296195262e+17,0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[73, 2, 83.63, 16.73, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[503, 2, 70.61, 14.12, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[511, 2, 103.38, 20.68, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[72, 2, 261.22, 52.24, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[288, 2, 61.04, 12.21, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[339, 2, 152.46, 30.49, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[321, 2, 196.61, 39.32, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[8, 2, 151.04, 30.21, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[292, 2, 124.55, 24.91, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[343, 2, 110.9, 22.18, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[12, 1, 0, 0, 0, 0, 5, -1.9147485426082883e+18,0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[340, 2, 128.9, 25.78, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[177, 2, 26.53, 5.31, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[497, 2, 963.4, 192.68, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[102, 2, 139.76, 27.95, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[311, 2, 192.11, 38.42, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[429, 2, 328.82, 65.76, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[32, 1, 0, 0, 0, 0, 5, 1.8589075141947197e+18,0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[22, 1, 0, 0, 0, 0, 5, -2.1083883574050054e+17,0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[101, 2, 72.21, 14.44, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[71, 2, 159.49, 31.9, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[558, 2, 130.02, 26.0, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[217, 2, 39.34, 7.87, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[322, 2, 25.03, 5.01, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[278, 2, 145.44, 29.09, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[498, 2, 45.18, 9.04, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[47, 2, 327.96, 65.59, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[346, 2, 301.83, 60.37, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[74, 1, 0, 0, 0, 0, 5, -3.9710816853684905e+18,0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[557, 2, 220.49, 44.1, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[42, 1, 0, 0, 0, 0, 5, -6.512187363009073e+17,0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[39, 3, 64.51, 12.9, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[45, 2, 75.43, 15.09, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[493, 2, 101.1, 20.22, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[98, 2, 101.97, 20.39, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[435, 2, 145.68, 29.14, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[338, 2, 246.51, 49.3, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[79, 2, 100.61, 20.12, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[78, 1, 0, 0, 0, 0, 5, -6.38261154212576e+19,0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[512, 2, 68.29, 13.66, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[276, 2, 186.31, 37.26, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[569, 2, 180.43, 36.09, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[37, 1, 0, 0, 0, 0, 5, -1.1050623093187891e+17,0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[179, 2, 51.77, 10.35, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[325, 2, 149.96, 29.99, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[20, 1, 0, 0, 0, 0, 5, -2.2857291971530646e+17,0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[496, 2, 7.7, 1.54, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[279, 1, 0, 0, 0, 0, 5, 82337.0212,0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[436, 2, 77.77, 15.55, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[345, 2, 304.04, 60.81, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[505, 2, 327.96, 65.59, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[290, 1, 0, 0, 0, 0, 5, -215811156346.3258,0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[363, 2, 307.67, 61.53, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[11, 2, 89.5, 17.9, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[277, 1, 0, 0, 0, 0, 5, 8.630450418300285e+17,0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[441, 2, 57.34, 11.47, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[280, 1, 0, 0, 0, 0, 5, 1.5547907899345816e+16,0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[504, 2, 46.24, 9.25, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[181, 2, 34.35, 6.87, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[291, 2, 63.18, 12.64, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[344, 2, 278.05, 55.61, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[40, 2, 67.39, 13.48, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[80, 2, 106.87, 21.37, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[183, 2, 465.75, 93.15, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[440, 2, 74.79, 14.96, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[43, 2, 111.07, 22.21, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[10, 1, 0, 0, 0, 0, 5, -2.8816130864663066e+17,0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[81, 2, 120.64, 24.13, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[560, 2, 108.7, 21.74, 0, 0, 5, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[341, 2, 116.53, 23.31, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[17, 2, 85.98, 17.2, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[41, 2, 72.43, 14.49, 0, 0, 5, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[192, 2, 54.57, 10.91, 0, 0, 0, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[342, 2, 202.14, 40.43, 0, 0, 0, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[218, 2, 119.86, 23.97, 0, 0, 0, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[65, 2, 5.33, 1.07, 0, 0, 0, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[289, 2, 96.0, 19.2, 0, 0, 0, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[324, 2, 460.34, 92.07, 0, 0, 0, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[48, 2, 225.43, 45.09, 0, 0, 0, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[332, 1, 0, 0, 0, 0, 0, -1.2178270042878761e+20,0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[422, 2, 75.05, 15.01, 0, 0, 0, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[23, 2, 119.6, 23.92, 0, 0, 0, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[570, 2, 281.68, 56.34, 0, 0, 0, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[38, 2, 197.02, 39.4, 0, 0, 0, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[31, 2, 149.98, 30.0, 0, 0, 0, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[49, 2, 57.02, 11.4, 0, 0, 0, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[182, 2, 1.56, 0.31, 0, 0, 0, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[9, 2, 102.14, 20.43, 0, 0, 0, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[323, 2, 2.6, 0.52, 0, 0, 0, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[400, 2, 54.6, 10.92, 0, 0, 0, 1.0, 0, 220.0, 0, 1.1, 0.9, 0.6, 10 ],
[30, 1, 0, 0, 0, 0, 0, -4.832486022139699e+17,0, 380.0, 0, 1.1, 0.9, 0.6, 10 ],
[25, 2, 57.2, 11.44, 0, 0, 0, 1.0, 0, 380.0, 0, 1.1, 0.9, 0.6, 10 ]
    ])
    ppc["gen"] = array([
[102, 0, 0, 33.95, -8.49, 1.0, 100, 1, 67.9, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 101.85, 13.58, 20.37, 20.37, 27.16 ],
[493, 0, 0, 75.0, -18.75, 1.0, 100, 1, 150.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 225.0, 30.0, 45.0, 45.0, 60.0 ],
[493, 0, 0, 15.0, -3.75, 1.0, 100, 1, 30.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 45.0, 6.0, 9.0, 9.0, 12.0 ],
[177, 0, 0, 16.35, -4.09, 1.0, 100, 1, 32.7, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 49.05, 6.54, 9.81, 9.81, 13.08 ],
[180, 0, 0, 12.7, -3.18, 1.0, 100, 1, 25.4, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 38.1, 5.08, 7.62, 7.62, 10.16 ],
[180, 0, 0, 166.75, -41.69, 1.0, 100, 1, 333.5, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 500.25, 66.7, 100.05, 100.05, 133.4 ],
[180, 0, 0, 14.45, -3.61, 1.0, 100, 1, 28.9, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 43.35, 5.78, 8.67, 8.67, 11.56 ],
[183, 0, 0, 11.25, -2.81, 1.0, 100, 1, 22.5, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 33.75, 4.5, 6.75, 6.75, 9.0 ],
[183, 0, 0, 383.0, -95.75, 1.0, 100, 1, 766.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 1149.0, 153.2, 229.8, 229.8, 306.4 ],
[183, 0, 0, 19.0, -4.75, 1.0, 100, 1, 38.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 57.0, 7.6, 11.4, 11.4, 15.2 ],
[183, 0, 0, 12.0, -3.0, 1.0, 100, 1, 24.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 36.0, 4.8, 7.2, 7.2, 9.6 ],
[496, 0, 0, 26.4, -6.6, 1.0, 100, 1, 52.8, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 79.2, 10.56, 15.84, 15.84, 21.12 ],
[21, 0, 0, 63.5, -15.88, 1.0, 100, 1, 127.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 190.5, 25.4, 38.1, 38.1, 50.8 ],
[21, 0, 0, 97.0, -24.25, 1.0, 100, 1, 194.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 291.0, 38.8, 58.2, 58.2, 77.6 ],
[21, 0, 0, 8.2, -2.05, 1.0, 100, 1, 16.4, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 24.6, 3.28, 4.92, 4.92, 6.56 ],
[217, 0, 0, 54.0, -13.5, 1.0, 100, 1, 108.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 162.0, 21.6, 32.4, 32.4, 43.2 ],
[217, 0, 0, 254.0, -63.5, 1.0, 100, 1, 508.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 762.0, 101.6, 152.4, 152.4, 203.2 ],
[217, 0, 0, 8.5, -2.12, 1.0, 100, 1, 17.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 25.5, 3.4, 5.1, 5.1, 6.8 ],
[498, 0, 0, 149.25, -37.31, 1.0, 100, 1, 298.5, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 447.75, 59.7, 89.55, 89.55, 119.4 ],
[557, 0, 0, 45.4, -11.35, 1.0, 100, 1, 90.8, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 136.2, 18.16, 27.24, 27.24, 36.32 ],
[558, 0, 0, 37.0, -9.25, 1.0, 100, 1, 74.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 111.0, 14.8, 22.2, 22.2, 29.6 ],
[559, 0, 0, 8.5, -2.12, 1.0, 100, 1, 17.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 25.5, 3.4, 5.1, 5.1, 6.8 ],
[288, 0, 0, 7.85, -1.96, 1.0, 100, 1, 15.7, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 23.55, 3.14, 4.71, 4.71, 6.28 ],
[289, 0, 0, 552.5, -138.12, 1.0, 100, 1, 1105.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 1657.5, 221.0, 331.5, 331.5, 442.0 ],
[560, 0, 0, 10.15, -2.54, 1.0, 100, 1, 20.3, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 30.45, 4.06, 6.09, 6.09, 8.12 ],
[560, 0, 0, 108.0, -27.0, 1.0, 100, 1, 216.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 324.0, 43.2, 64.8, 64.8, 86.4 ],
[560, 0, 0, 29.5, -7.38, 1.0, 100, 1, 59.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 88.5, 11.8, 17.7, 17.7, 23.6 ],
[292, 0, 0, 6.75, -1.69, 1.0, 100, 1, 13.5, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 20.25, 2.7, 4.05, 4.05, 5.4 ],
[292, 0, 0, 5.6, -1.4, 1.0, 100, 1, 11.2, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 16.8, 2.24, 3.36, 3.36, 4.48 ],
[31, 0, 0, 97.7, -24.42, 1.0, 100, 1, 195.4, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 293.1, 39.08, 58.62, 58.62, 78.16 ],
[311, 0, 0, 437.5, -109.38, 1.0, 100, 1, 875.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 1312.5, 175.0, 262.5, 262.5, 350.0 ],
[321, 0, 0, 6.45, -1.61, 1.0, 100, 1, 12.9, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 19.35, 2.58, 3.87, 3.87, 5.16 ],
[324, 0, 0, 159.9, -39.98, 1.0, 100, 1, 319.8, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 479.7, 63.96, 95.94, 95.94, 127.92 ],
[325, 0, 0, 13.45, -3.36, 1.0, 100, 1, 26.9, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 40.35, 5.38, 8.07, 8.07, 10.76 ],
[502, 0, 0, 54.55, -13.64, 1.0, 100, 1, 109.1, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 163.65, 21.82, 32.73, 32.73, 43.64 ],
[33, 0, 0, 15.9, -3.98, 1.0, 100, 1, 31.8, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 47.7, 6.36, 9.54, 9.54, 12.72 ],
[570, 0, 0, 26.4, -6.6, 1.0, 100, 1, 52.8, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 79.2, 10.56, 15.84, 15.84, 21.12 ],
[570, 0, 0, 44.5, -11.12, 1.0, 100, 1, 89.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 133.5, 17.8, 26.7, 26.7, 35.6 ],
[338, 0, 0, 149.8, -37.45, 1.0, 100, 1, 299.6, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 449.4, 59.92, 89.88, 89.88, 119.84 ],
[338, 0, 0, 41.25, -10.31, 1.0, 100, 1, 82.5, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 123.75, 16.5, 24.75, 24.75, 33.0 ],
[339, 0, 0, 67.0, -16.75, 1.0, 100, 1, 134.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 201.0, 26.8, 40.2, 40.2, 53.6 ],
[339, 0, 0, 79.5, -19.88, 1.0, 100, 1, 159.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 238.5, 31.8, 47.7, 47.7, 63.6 ],
[339, 0, 0, 55.5, -13.88, 1.0, 100, 1, 111.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 166.5, 22.2, 33.3, 33.3, 44.4 ],
[339, 0, 0, 21.35, -5.34, 1.0, 100, 1, 42.7, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 64.05, 8.54, 12.81, 12.81, 17.08 ],
[339, 0, 0, 29.0, -7.25, 1.0, 100, 1, 58.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 87.0, 11.6, 17.4, 17.4, 23.2 ],
[340, 0, 0, 9.75, -2.44, 1.0, 100, 1, 19.5, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 29.25, 3.9, 5.85, 5.85, 7.8 ],
[342, 0, 0, 34.98, -8.74, 1.0, 100, 1, 69.95, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 104.93, 13.99, 20.98, 20.98, 27.98 ],
[345, 0, 0, 105.5, -26.38, 1.0, 100, 1, 211.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 316.5, 42.2, 63.3, 63.3, 84.4 ],
[345, 0, 0, 44.5, -11.12, 1.0, 100, 1, 89.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 133.5, 17.8, 26.7, 26.7, 35.6 ],
[345, 0, 0, 163.5, -40.88, 1.0, 100, 1, 327.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 490.5, 65.4, 98.1, 98.1, 130.8 ],
[346, 0, 0, 229.45, -57.36, 1.0, 100, 1, 458.9, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 688.35, 91.78, 137.67, 137.67, 183.56 ],
[363, 0, 0, 40.9, -10.22, 1.0, 100, 1, 81.8, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 122.7, 16.36, 24.54, 24.54, 32.72 ],
[363, 0, 0, 344.0, -86.0, 1.0, 100, 1, 688.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 1032.0, 137.6, 206.4, 206.4, 275.2 ],
[363, 0, 0, 18.0, -4.5, 1.0, 100, 1, 36.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 54.0, 7.2, 10.8, 10.8, 14.4 ],
[503, 0, 0, 26.0, -6.5, 1.0, 100, 1, 52.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 78.0, 10.4, 15.6, 15.6, 20.8 ],
[503, 0, 0, 680.0, -170.0, 1.0, 100, 1, 1360.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 2040.0, 272.0, 408.0, 408.0, 544.0 ],
[503, 0, 0, 29.2, -7.3, 1.0, 100, 1, 58.4, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 87.6, 11.68, 17.52, 17.52, 23.36 ],
[39, 0, 0, 1149.65, -287.41, 1.0, 100, 1, 2299.3, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 3448.95, 459.86, 689.79, 689.79, 919.72 ],
[40, 0, 0, 24.0, -6.0, 1.0, 100, 1, 48.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 72.0, 9.6, 14.4, 14.4, 19.2 ],
[400, 0, 0, 44.0, -11.0, 1.0, 100, 1, 88.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 132.0, 17.6, 26.4, 26.4, 35.2 ],
[400, 0, 0, 30.0, -7.5, 1.0, 100, 1, 60.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 90.0, 12.0, 18.0, 18.0, 24.0 ],
[400, 0, 0, 79.0, -19.75, 1.0, 100, 1, 158.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 237.0, 31.6, 47.4, 47.4, 63.2 ],
[422, 0, 0, 37.0, -9.25, 1.0, 100, 1, 74.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 111.0, 14.8, 22.2, 22.2, 29.6 ],
[43, 0, 0, 98.0, -24.5, 1.0, 100, 1, 196.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 294.0, 39.2, 58.8, 58.8, 78.4 ],
[43, 0, 0, 8.5, -2.12, 1.0, 100, 1, 17.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 25.5, 3.4, 5.1, 5.1, 6.8 ],
[429, 0, 0, 82.0, -20.5, 1.0, 100, 1, 164.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 246.0, 32.8, 49.2, 49.2, 65.6 ],
[44, 0, 0, 13.0, -3.25, 1.0, 100, 1, 26.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 39.0, 5.2, 7.8, 7.8, 10.4 ],
[435, 0, 0, 91.0, -22.75, 1.0, 100, 1, 182.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 273.0, 36.4, 54.6, 54.6, 72.8 ],
[435, 0, 0, 30.75, -7.69, 1.0, 100, 1, 61.5, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 92.25, 12.3, 18.45, 18.45, 24.6 ],
[436, 0, 0, 13.25, -3.31, 1.0, 100, 1, 26.5, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 39.75, 5.3, 7.95, 7.95, 10.6 ],
[440, 0, 0, 7.35, -1.84, 1.0, 100, 1, 14.7, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 22.05, 2.94, 4.41, 4.41, 5.88 ],
[441, 0, 0, 37.5, -9.38, 1.0, 100, 1, 75.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 112.5, 15.0, 22.5, 22.5, 30.0 ],
[45, 0, 0, 148.0, -37.0, 1.0, 100, 1, 296.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 444.0, 59.2, 88.8, 88.8, 118.4 ],
[45, 0, 0, 11.55, -2.89, 1.0, 100, 1, 23.1, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 34.65, 4.62, 6.93, 6.93, 9.24 ],
[47, 0, 0, 222.0, -55.5, 1.0, 100, 1, 444.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 666.0, 88.8, 133.2, 133.2, 177.6 ],
[47, 0, 0, 15.85, -3.96, 1.0, 100, 1, 31.7, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 47.55, 6.34, 9.51, 9.51, 12.68 ],
[49, 0, 0, 176.0, -44.0, 1.0, 100, 1, 352.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 528.0, 70.4, 105.6, 105.6, 140.8 ],
[49, 0, 0, 33.0, -8.25, 1.0, 100, 1, 66.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 99.0, 13.2, 19.8, 19.8, 26.4 ],
[49, 0, 0, 18.75, -4.69, 1.0, 100, 1, 37.5, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 56.25, 7.5, 11.25, 11.25, 15.0 ],
[65, 0, 0, 82.25, -20.56, 1.0, 100, 1, 164.5, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 246.75, 32.9, 49.35, 49.35, 65.8 ],
[71, 0, 0, 24.5, -6.12, 1.0, 100, 1, 49.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 73.5, 9.8, 14.7, 14.7, 19.6 ],
[71, 0, 0, 80.55, -20.14, 1.0, 100, 1, 161.1, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 241.65, 32.22, 48.33, 48.33, 64.44 ],
[71, 0, 0, 4.95, -1.24, 1.0, 100, 1, 9.9, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 14.85, 1.98, 2.97, 2.97, 3.96 ],
[72, 0, 0, 450.0, -112.5, 1.0, 100, 1, 900.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 1350.0, 180.0, 270.0, 270.0, 360.0 ],
[72, 0, 0, 75.5, -18.88, 1.0, 100, 1, 151.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 226.5, 30.2, 45.3, 45.3, 60.4 ],
[72, 0, 0, 60.0, -15.0, 1.0, 100, 1, 120.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 180.0, 24.0, 36.0, 36.0, 48.0 ],
[511, 0, 0, 61.0, -15.25, 1.0, 100, 1, 122.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 183.0, 24.4, 36.6, 36.6, 48.8 ],
[511, 0, 0, 11.65, -2.91, 1.0, 100, 1, 23.3, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 34.95, 4.66, 6.99, 6.99, 9.32 ],
[75, 0, 0, 24.5, -6.12, 1.0, 100, 1, 49.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 73.5, 9.8, 14.7, 14.7, 19.6 ],
[79, 0, 0, 375.0, -93.75, 1.0, 100, 1, 750.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 1125.0, 150.0, 225.0, 225.0, 300.0 ],
[79, 0, 0, 9.35, -2.34, 1.0, 100, 1, 18.7, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 28.05, 3.74, 5.61, 5.61, 7.48 ],
[81, 0, 0, 1417.5, -354.38, 1.0, 100, 1, 2835.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 4252.5, 567.0, 850.5, 850.5, 1134.0 ],
[81, 0, 0, 62.25, -15.56, 1.0, 100, 1, 124.5, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.95, 186.75, 24.9, 37.35, 37.35, 49.8 ],
[218, 0, 0, 2.82, -0.7, 1.0, 100, 1, 5.63, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 8.45, 1.13, 1.69, 1.69, 2.25 ],
[498, 0, 0, 104.07, -26.02, 1.0, 100, 1, 208.14, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 312.21, 41.63, 62.44, 62.44, 83.26 ],
[8, 0, 0, 6.15, -1.54, 1.0, 100, 1, 12.3, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 18.45, 2.46, 3.69, 3.69, 4.92 ],
[9, 0, 0, 7.39, -1.85, 1.0, 100, 1, 14.78, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 22.17, 2.96, 4.43, 4.43, 5.91 ],
[11, 0, 0, 44.04, -11.01, 1.0, 100, 1, 88.08, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 132.12, 17.62, 26.42, 26.42, 35.23 ],
[17, 0, 0, 16.6, -4.15, 1.0, 100, 1, 33.19, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 49.79, 6.64, 9.96, 9.96, 13.28 ],
[21, 0, 0, 0.17, -0.04, 1.0, 100, 1, 0.34, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 0.51, 0.07, 0.1, 0.1, 0.14 ],
[23, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[25, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[31, 0, 0, 24.89, -6.22, 1.0, 100, 1, 49.78, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 74.67, 9.96, 14.93, 14.93, 19.91 ],
[33, 0, 0, 148.98, -37.25, 1.0, 100, 1, 297.96, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 446.95, 59.59, 89.39, 89.39, 119.19 ],
[38, 0, 0, 0.81, -0.2, 1.0, 100, 1, 1.61, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 2.42, 0.32, 0.48, 0.48, 0.64 ],
[39, 0, 0, 179.75, -44.94, 1.0, 100, 1, 359.5, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 539.24, 71.9, 107.85, 107.85, 143.8 ],
[40, 0, 0, 117.18, -29.3, 1.0, 100, 1, 234.37, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 351.55, 46.87, 70.31, 70.31, 93.75 ],
[41, 0, 0, 323.91, -80.98, 1.0, 100, 1, 647.83, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 971.74, 129.57, 194.35, 194.35, 259.13 ],
[43, 0, 0, 150.75, -37.69, 1.0, 100, 1, 301.5, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 452.25, 60.3, 90.45, 90.45, 120.6 ],
[44, 0, 0, 3.18, -0.8, 1.0, 100, 1, 6.37, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 9.55, 1.27, 1.91, 1.91, 2.55 ],
[45, 0, 0, 69.01, -17.25, 1.0, 100, 1, 138.01, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 207.02, 27.6, 41.4, 41.4, 55.21 ],
[47, 0, 0, 0.82, -0.2, 1.0, 100, 1, 1.64, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 2.45, 0.33, 0.49, 0.49, 0.65 ],
[48, 0, 0, 2.48, -0.62, 1.0, 100, 1, 4.96, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 7.45, 0.99, 1.49, 1.49, 1.99 ],
[49, 0, 0, 92.09, -23.02, 1.0, 100, 1, 184.18, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 276.27, 36.84, 55.25, 55.25, 73.67 ],
[65, 0, 0, 8.12, -2.03, 1.0, 100, 1, 16.24, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 24.36, 3.25, 4.87, 4.87, 6.5 ],
[71, 0, 0, 198.2, -49.55, 1.0, 100, 1, 396.4, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 594.6, 79.28, 118.92, 118.92, 158.56 ],
[72, 0, 0, 250.19, -62.55, 1.0, 100, 1, 500.38, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 750.57, 100.08, 150.11, 150.11, 200.15 ],
[73, 0, 0, 295.36, -73.84, 1.0, 100, 1, 590.72, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 886.08, 118.14, 177.22, 177.22, 236.29 ],
[75, 0, 0, 289.0, -72.25, 1.0, 100, 1, 578.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 867.0, 115.6, 173.4, 173.4, 231.2 ],
[79, 0, 0, 57.79, -14.45, 1.0, 100, 1, 115.57, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 173.36, 23.11, 34.67, 34.67, 46.23 ],
[80, 0, 0, 18.33, -4.58, 1.0, 100, 1, 36.66, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 54.99, 7.33, 11.0, 11.0, 14.66 ],
[81, 0, 0, 156.4, -39.1, 1.0, 100, 1, 312.8, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 469.2, 62.56, 93.84, 93.84, 125.12 ],
[98, 0, 0, 8.91, -2.23, 1.0, 100, 1, 17.82, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 26.74, 3.56, 5.35, 5.35, 7.13 ],
[101, 0, 0, 75.31, -18.83, 1.0, 100, 1, 150.62, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 225.94, 30.12, 45.19, 45.19, 60.25 ],
[102, 0, 0, 27.76, -6.94, 1.0, 100, 1, 55.52, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 83.28, 11.1, 16.66, 16.66, 22.21 ],
[177, 0, 0, 41.87, -10.47, 1.0, 100, 1, 83.74, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 125.6, 16.75, 25.12, 25.12, 33.49 ],
[179, 0, 0, 186.46, -46.61, 1.0, 100, 1, 372.92, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 559.38, 74.58, 111.88, 111.88, 149.17 ],
[180, 0, 0, 114.72, -28.68, 1.0, 100, 1, 229.43, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 344.15, 45.89, 68.83, 68.83, 91.77 ],
[181, 0, 0, 114.8, -28.7, 1.0, 100, 1, 229.6, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 344.4, 45.92, 68.88, 68.88, 91.84 ],
[182, 0, 0, 20.0, -5.0, 1.0, 100, 1, 40.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 59.99, 8.0, 12.0, 12.0, 16.0 ],
[183, 0, 0, 3.56, -0.89, 1.0, 100, 1, 7.11, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 10.67, 1.42, 2.13, 2.13, 2.84 ],
[192, 0, 0, 65.45, -16.36, 1.0, 100, 1, 130.91, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 196.36, 26.18, 39.27, 39.27, 52.36 ],
[217, 0, 0, 8.08, -2.02, 1.0, 100, 1, 16.16, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 24.24, 3.23, 4.85, 4.85, 6.46 ],
[218, 0, 0, 7.52, -1.88, 1.0, 100, 1, 15.03, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 22.55, 3.01, 4.51, 4.51, 6.01 ],
[276, 0, 0, 158.56, -39.64, 1.0, 100, 1, 317.13, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 475.69, 63.43, 95.14, 95.14, 126.85 ],
[278, 0, 0, 188.38, -47.1, 1.0, 100, 1, 376.77, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 565.15, 75.35, 113.03, 113.03, 150.71 ],
[288, 0, 0, 37.31, -9.33, 1.0, 100, 1, 74.62, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 111.93, 14.92, 22.39, 22.39, 29.85 ],
[289, 0, 0, 15.1, -3.78, 1.0, 100, 1, 30.21, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 45.31, 6.04, 9.06, 9.06, 12.08 ],
[291, 0, 0, 11.07, -2.77, 1.0, 100, 1, 22.14, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 33.22, 4.43, 6.64, 6.64, 8.86 ],
[292, 0, 0, 7.05, -1.76, 1.0, 100, 1, 14.09, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 21.14, 2.82, 4.23, 4.23, 5.64 ],
[311, 0, 0, 19.45, -4.86, 1.0, 100, 1, 38.91, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 58.36, 7.78, 11.67, 11.67, 15.56 ],
[321, 0, 0, 25.52, -6.38, 1.0, 100, 1, 51.04, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 76.56, 10.21, 15.31, 15.31, 20.41 ],
[322, 0, 0, 48.15, -12.04, 1.0, 100, 1, 96.29, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 144.44, 19.26, 28.89, 28.89, 38.52 ],
[323, 0, 0, 12.56, -3.14, 1.0, 100, 1, 25.12, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 37.68, 5.02, 7.54, 7.54, 10.05 ],
[324, 0, 0, 29.74, -7.44, 1.0, 100, 1, 59.48, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 89.22, 11.9, 17.84, 17.84, 23.79 ],
[325, 0, 0, 74.68, -18.67, 1.0, 100, 1, 149.37, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 224.05, 29.87, 44.81, 44.81, 59.75 ],
[338, 0, 0, 45.89, -11.47, 1.0, 100, 1, 91.77, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 137.66, 18.35, 27.53, 27.53, 36.71 ],
[339, 0, 0, 73.27, -18.32, 1.0, 100, 1, 146.55, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 219.82, 29.31, 43.96, 43.96, 58.62 ],
[340, 0, 0, 89.39, -22.35, 1.0, 100, 1, 178.77, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 268.16, 35.75, 53.63, 53.63, 71.51 ],
[341, 0, 0, 1.68, -0.42, 1.0, 100, 1, 3.36, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 5.04, 0.67, 1.01, 1.01, 1.34 ],
[342, 0, 0, 45.21, -11.3, 1.0, 100, 1, 90.43, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 135.64, 18.09, 27.13, 27.13, 36.17 ],
[343, 0, 0, 17.37, -4.34, 1.0, 100, 1, 34.74, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 52.11, 6.95, 10.42, 10.42, 13.9 ],
[344, 0, 0, 0.78, -0.2, 1.0, 100, 1, 1.56, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 2.34, 0.31, 0.47, 0.47, 0.62 ],
[345, 0, 0, 1.56, -0.39, 1.0, 100, 1, 3.12, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 4.68, 0.62, 0.94, 0.94, 1.25 ],
[346, 0, 0, 1.74, -0.43, 1.0, 100, 1, 3.48, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 5.21, 0.7, 1.04, 1.04, 1.39 ],
[363, 0, 0, 1.07, -0.27, 1.0, 100, 1, 2.13, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 3.2, 0.43, 0.64, 0.64, 0.85 ],
[400, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[422, 0, 0, 34.25, -8.56, 1.0, 100, 1, 68.5, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 102.75, 13.7, 20.55, 20.55, 27.4 ],
[429, 0, 0, 0.36, -0.09, 1.0, 100, 1, 0.72, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 1.08, 0.14, 0.22, 0.22, 0.29 ],
[435, 0, 0, 94.67, -23.67, 1.0, 100, 1, 189.34, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 284.01, 37.87, 56.8, 56.8, 75.74 ],
[436, 0, 0, 17.35, -4.34, 1.0, 100, 1, 34.7, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 52.05, 6.94, 10.41, 10.41, 13.88 ],
[440, 0, 0, 85.02, -21.25, 1.0, 100, 1, 170.04, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 255.05, 34.01, 51.01, 51.01, 68.01 ],
[441, 0, 0, 94.1, -23.53, 1.0, 100, 1, 188.2, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 282.3, 37.64, 56.46, 56.46, 75.28 ],
[492, 0, 0, 142.02, -35.51, 1.0, 100, 1, 284.04, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 426.07, 56.81, 85.21, 85.21, 113.62 ],
[493, 0, 0, 104.33, -26.08, 1.0, 100, 1, 208.66, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 312.99, 41.73, 62.6, 62.6, 83.46 ],
[496, 0, 0, 79.91, -19.98, 1.0, 100, 1, 159.82, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 239.73, 31.96, 47.95, 47.95, 63.93 ],
[497, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[498, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[502, 0, 0, 117.59, -29.4, 1.0, 100, 1, 235.18, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 352.77, 47.04, 70.55, 70.55, 94.07 ],
[503, 0, 0, 176.7, -44.18, 1.0, 100, 1, 353.41, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 530.11, 70.68, 106.02, 106.02, 141.36 ],
[504, 0, 0, 12.72, -3.18, 1.0, 100, 1, 25.43, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 38.15, 5.09, 7.63, 7.63, 10.17 ],
[505, 0, 0, 0.24, -0.06, 1.0, 100, 1, 0.47, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 0.71, 0.09, 0.14, 0.14, 0.19 ],
[511, 0, 0, 344.92, -86.23, 1.0, 100, 1, 689.84, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 1034.76, 137.97, 206.95, 206.95, 275.94 ],
[512, 0, 0, 39.14, -9.78, 1.0, 100, 1, 78.28, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 117.42, 15.66, 23.48, 23.48, 31.31 ],
[557, 0, 0, 39.69, -9.92, 1.0, 100, 1, 79.38, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 119.07, 15.88, 23.81, 23.81, 31.75 ],
[558, 0, 0, 112.76, -28.19, 1.0, 100, 1, 225.53, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 338.29, 45.11, 67.66, 67.66, 90.21 ],
[559, 0, 0, 40.95, -10.24, 1.0, 100, 1, 81.9, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 122.85, 16.38, 24.57, 24.57, 32.76 ],
[560, 0, 0, 192.25, -48.06, 1.0, 100, 1, 384.51, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 576.76, 76.9, 115.35, 115.35, 153.8 ],
[569, 0, 0, 16.55, -4.14, 1.0, 100, 1, 33.09, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 49.64, 6.62, 9.93, 9.93, 13.24 ],
[570, 0, 0, 70.87, -17.72, 1.0, 100, 1, 141.74, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0.95, 212.61, 28.35, 42.52, 42.52, 56.7 ],
[8, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[9, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[11, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[17, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[21, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[23, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[25, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[31, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[33, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[38, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[39, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[40, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[41, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[43, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[44, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[45, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[47, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[48, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[49, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[65, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[71, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[72, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[73, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[75, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[79, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[80, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[81, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[98, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[101, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[102, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[177, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[179, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[180, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[181, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[182, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[183, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[192, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[217, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[218, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[276, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[278, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[288, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[289, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[291, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[292, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[311, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[321, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[322, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[323, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[324, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[325, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[338, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[339, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[340, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[341, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[342, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[343, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[344, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[345, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[346, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[363, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[400, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[422, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[429, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[435, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[436, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[440, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[441, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[492, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[493, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[496, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[497, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[498, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[502, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[503, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[504, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[505, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[511, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[512, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[557, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[558, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[559, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[560, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[569, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
[570, 0, 0, 0.0, -0.0, 1.0, 100, 1, 0.0, 0.0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0.95, 0.0, 0.0, 0.0, 0.0, 0.0 ],
])
ppc["branch"] = array([
[8, 9, 0.00024379, 0.00243793, 0.35006327, 2395, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[492, 11, 0.0045562, 0.01822479, 0.04820045, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[11, 493, 0.00757174, 0.03028694, 0.0801021, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[492, 493, 0.01130413, 0.04521653, 0.11958747, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[17, 18, 0.00462352, 0.04623523, 0.9335989, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[17, 12, 0.0005602, 0.00560203, 0.1131183, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[20, 21, 0.00108334, 0.01083345, 0.09722357, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[20, 22, 0.00099339, 0.00993386, 0.3566014, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[497, 23, 0.0005476, 0.00219041, 0.00579315, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[25, 22, 0.00035578, 0.00355783, 0.03192931, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[8, 21, 0.00098947, 0.00989474, 0.0887992, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[31, 32, 0.00299776, 0.02997761, 0.60531903, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[32, 33, 0.00167622, 0.01676223, 0.33846928, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[37, 10, 0.00240464, 0.0240464, 0.48555384, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[10, 38, 0.00068488, 0.0068488, 0.13829351, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[37, 38, 0.00143783, 0.01437835, 1.16133176, 1796, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[39, 40, 0.00452163, 0.0452163, 0.91302431, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[39, 41, 0.0017467, 0.01746699, 0.35269996, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[42, 41, 0.00311454, 0.03114543, 0.6289001, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[18, 42, 0.00343975, 0.03439751, 0.69456727, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[492, 43, 0.00910612, 0.03642446, 0.09633445, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[492, 43, 0.00909587, 0.03638347, 0.09622603, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[44, 45, 0.00640579, 0.02562314, 0.0677674, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[44, 505, 0.00151537, 0.00606149, 0.01603126, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[46, 12, 0.00029449, 0.00294494, 0.1057163, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[46, 12, 0.00029482, 0.00294823, 0.10583438, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[47, 48, 0.00053442, 0.00534418, 0.01199019, 299, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[31, 33, 0.0013476, 0.01347599, 0.27211226, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[71, 72, 0.00088786, 0.00887864, 0.31872128, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[73, 74, 0.00125295, 0.01252955, 0.25300129, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[37, 75, 0.00274591, 0.02745914, 0.5544652, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[72, 75, 0.00066887, 0.00668871, 0.24010838, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[37, 72, 0.00362221, 0.03622207, 0.73140949, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[73, 72, 0.00254751, 0.02547507, 0.51440208, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[18, 40, 0.00130277, 0.0130277, 0.26306019, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[492, 45, 0.00771758, 0.0308703, 0.18370115, 520, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[10, 74, 0.00301674, 0.03016736, 0.60915055, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[45, 511, 0.02050843, 0.08203372, 0.05424015, 173, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[78, 32, 0.00134588, 0.0134588, 0.48313778, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[79, 80, 0.00076233, 0.00762327, 0.06841417, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[79, 80, 0.00076174, 0.00761738, 0.06836134, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[81, 79, 0.00215305, 0.02153047, 0.19322279, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[81, 79, 0.00215357, 0.02153566, 0.1932694, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[42, 98, 0.00061861, 0.00618611, 0.22206638, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[42, 98, 0.00061835, 0.00618352, 0.22197315, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[42, 101, 0.00081653, 0.00816534, 0.29311568, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[102, 42, 0.0012403, 0.01240305, 0.44523901, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[37, 39, 0.00065102, 0.00651021, 0.23370076, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[177, 496, 0.00932496, 0.03729983, 0.09864961, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[177, 496, 0.00931603, 0.03726413, 0.09855518, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[179, 493, 0.01426992, 0.05707967, 0.15096279, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[180, 181, 0.01025686, 0.04102744, 0.10850827, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[182, 180, 0.00433818, 0.01735273, 0.04589403, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[179, 181, 0.00489306, 0.01957223, 0.05176412, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[180, 493, 0.0166914, 0.06676562, 0.17657993, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[183, 30, 0.00049645, 0.00496451, 0.17821369, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[183, 21, 0.00025687, 0.00256873, 0.36884485, 2395, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[183, 21, 0.00051295, 0.0051295, 0.18413654, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[183, 30, 0.00049609, 0.00496087, 0.17808317, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[504, 192, 0.00015355, 0.00061421, 0.00162446, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[504, 192, 0.00015421, 0.00061686, 0.00163145, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[217, 98, 0.00012787, 0.00127874, 0.04590362, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[504, 218, 0.00687025, 0.02748099, 0.07268099, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[177, 504, 0.01763702, 0.0705481, 0.18658373, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[37, 39, 0.00086777, 0.00867775, 0.1752243, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[20, 22, 0.00099413, 0.00994131, 0.35686864, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[20, 21, 0.00108314, 0.01083137, 0.09720492, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[503, 276, 0.00335322, 0.01341289, 0.03547406, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[503, 276, 0.00335372, 0.01341488, 0.03547931, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[503, 504, 0.03215471, 0.12861884, 0.34016769, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[503, 504, 0.03216364, 0.12865455, 0.34026211, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[177, 218, 0.01082595, 0.0433038, 0.11452874, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[182, 180, 0.00433157, 0.01732628, 0.04582409, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[277, 278, 0.00143837, 0.01438366, 0.51633804, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[277, 278, 0.00143823, 0.01438227, 0.51628832, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[557, 558, 0.01085322, 0.04341289, 0.25833884, 520, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[557, 559, 0.00853967, 0.03415868, 0.09034196, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[559, 558, 0.01118579, 0.04474314, 0.11833547, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[277, 78, 0.00358577, 0.03585769, 0.32180078, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[277, 279, 0.00213909, 0.02139093, 0.19197048, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[78, 279, 0.0015812, 0.01581198, 0.14190284, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[8, 102, 0.00151001, 0.01510007, 0.5420555, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[8, 101, 0.00192469, 0.01924688, 0.69091598, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[80, 288, 0.00159713, 0.01597126, 0.14333228, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[80, 288, 0.00159681, 0.01596814, 0.14330432, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[80, 289, 0.00013825, 0.0013825, 0.027916, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[80, 289, 0.00014241, 0.00142405, 0.02875502, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[80, 289, 0.00015471, 0.00154709, 0.03123945, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[80, 289, 0.00015129, 0.00151293, 0.03054959, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[276, 560, 0.00889322, 0.03557289, 0.02352056, 173, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[276, 560, 0.00889157, 0.03556628, 0.02351619, 173, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[37, 290, 0.00112768, 0.01127678, 0.22770489, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[290, 74, 0.00414344, 0.04143444, 0.83665966, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[290, 74, 0.00414319, 0.0414319, 0.83660839, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[37, 290, 0.0011259, 0.011259, 0.22734598, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[512, 291, 0.00265967, 0.01063868, 0.02813689, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[512, 291, 0.00266496, 0.01065983, 0.02819285, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[78, 292, 0.0011638, 0.01163804, 0.23499969, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[78, 292, 0.001163, 0.01162996, 0.23483654, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[71, 74, 0.00390452, 0.03904524, 0.78841612, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[74, 278, 0.00154224, 0.01542244, 0.55362774, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[74, 278, 0.00154245, 0.01542452, 0.55370232, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[32, 292, 0.00096794, 0.00967936, 0.34746542, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[503, 560, 0.00378512, 0.0151405, 0.16017272, 693, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[311, 280, 0.00034337, 0.00343369, 0.1232611, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[280, 278, 0.00097498, 0.00974977, 0.78748387, 1796, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[311, 32, 0.00241133, 0.02411334, 0.48690559, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[557, 321, 0.00500298, 0.0200119, 0.05292694, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[277, 65, 0.00188585, 0.01885849, 0.38079775, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[322, 288, 0.001309, 0.01309003, 0.26431871, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[322, 323, 0.00035076, 0.00350762, 0.07082712, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[322, 323, 0.00037006, 0.0037006, 0.0747239, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[277, 324, 0.00197195, 0.01971953, 0.39818407, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[324, 325, 0.00110351, 0.01103509, 0.2228246, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[277, 325, 0.00086657, 0.00866574, 0.17498191, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[332, 78, 0.00129444, 0.01294437, 0.26137749, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[322, 288, 0.001309, 0.01309003, 0.26431871, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[324, 288, 0.00126274, 0.01262742, 0.1133234, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[338, 559, 0.00230702, 0.0092281, 0.09762492, 693, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[339, 559, 0.00890149, 0.03560595, 0.02354242, 173, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[339, 340, 0.02177884, 0.08711537, 0.23040041, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[559, 340, 0.05245818, 0.20983273, 0.13874, 173, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[341, 292, 9.329e-05, 0.00093294, 0.07535316, 1796, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[338, 559, 0.00461405, 0.0184562, 0.04881246, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[557, 342, 0.00302595, 0.0121038, 0.03201181, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[558, 343, 0.00266256, 0.01065025, 0.11266997, 693, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[502, 340, 0.01086926, 0.04347702, 0.11498688, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[502, 340, 0.01086876, 0.04347504, 0.11498163, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[72, 32, 0.00135107, 0.01351073, 0.48500226, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[72, 32, 0.001351, 0.01351004, 0.4849774, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[344, 345, 5.763e-05, 0.00057629, 0.04654687, 1796, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[346, 47, 0.0001134, 0.001134, 0.04070792, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[46, 47, 8.975e-05, 0.00089751, 0.0322183, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[346, 345, 7.218e-05, 0.00072178, 0.02591013, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[363, 344, 2.663e-05, 0.00026627, 0.00955859, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[332, 78, 0.00129421, 0.01294206, 0.26133088, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[37, 49, 0.0016876, 0.01687604, 0.15145211, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[37, 49, 0.0016883, 0.01688296, 0.15151426, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[81, 74, 0.00150357, 0.01503566, 0.13493589, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[81, 74, 0.00150416, 0.01504155, 0.13498871, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[278, 80, 0.00325679, 0.03256787, 0.29227666, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[278, 80, 0.0032572, 0.03257202, 0.29231395, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[81, 278, 0.00421184, 0.04211842, 0.37798704, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[81, 278, 0.0042108, 0.04210803, 0.37789381, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[569, 570, 0.00813488, 0.0325395, 0.08605961, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[498, 400, 0.00303355, 0.01213421, 0.03209225, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[557, 342, 0.00300992, 0.01203967, 0.0318422, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[557, 321, 0.00500231, 0.02000926, 0.05291995, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[277, 65, 0.00188603, 0.01886034, 0.38083504, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[8, 21, 0.00098975, 0.00989751, 0.08882406, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[311, 32, 0.00241182, 0.02411819, 0.48700348, 898, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[422, 81, 0.0002536, 0.00253601, 0.02275915, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[422, 81, 0.00023449, 0.00234488, 0.02104382, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[422, 81, 0.00023272, 0.00232722, 0.02088534, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[422, 81, 0.00023518, 0.0023518, 0.02110597, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[422, 81, 0.00023269, 0.00232687, 0.02088223, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[422, 81, 0.0002536, 0.00253601, 0.02275915, 598, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[45, 429, 0.00640579, 0.02562314, 0.0677674, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[44, 429, 1.322e-05, 5.289e-05, 0.00013989, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[505, 429, 0.00150314, 0.00601256, 0.01590186, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[32, 436, 0.00044813, 0.0044813, 0.16086776, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[435, 436, 6.634e-05, 0.00066343, 0.02381569, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[78, 436, 0.00089768, 0.0089768, 0.32224515, 1197, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[181, 441, 0.01020132, 0.04080529, 0.10792074, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[440, 441, 3.306e-05, 0.00013223, 0.00034972, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[504, 441, 0.01479025, 0.05916099, 0.15646741, 346, 0, 0, 0, 0, 1, -30, 30, 0.1 ],
[10, 492, 0.0, 0.005, 0.0, 2000, 0, 0, 1.0, 0, 1, -30, 30, 0.1 ],
[12, 493, 0.0, 0.005, 0.0, 2000, 0, 0, 1.0, 0, 1, -30, 30, 0.1 ],
[18, 496, 0.0, 0.005, 0.0, 2000, 0, 0, 1.0, 0, 1, -30, 30, 0.1 ],
[20, 497, 0.0, 0.005, 0.0, 2000, 0, 0, 1.0, 0, 1, -30, 30, 0.1 ],
[22, 498, 0.0, 0.005, 0.0, 2000, 0, 0, 1.0, 0, 1, -30, 30, 0.1 ],
[32, 502, 0.0, 0.005, 0.0, 2000, 0, 0, 1.0, 0, 1, -30, 30, 0.1 ],
[37, 503, 0.0, 0.005, 0.0, 2000, 0, 0, 1.0, 0, 1, -30, 30, 0.1 ],
[42, 504, 0.0, 0.005, 0.0, 2000, 0, 0, 1.0, 0, 1, -30, 30, 0.1 ],
[46, 505, 0.0, 0.005, 0.0, 2000, 0, 0, 1.0, 0, 1, -30, 30, 0.1 ],
[74, 511, 0.0, 0.005, 0.0, 2000, 0, 0, 1.0, 0, 1, -30, 30, 0.1 ],
[78, 512, 0.0, 0.005, 0.0, 2000, 0, 0, 1.0, 0, 1, -30, 30, 0.1 ],
[277, 557, 0.0, 0.005, 0.0, 2000, 0, 0, 1.0, 0, 1, -30, 30, 0.1 ],
[279, 558, 0.0, 0.005, 0.0, 2000, 0, 0, 1.0, 0, 1, -30, 30, 0.1 ],
[280, 559, 0.0, 0.005, 0.0, 2000, 0, 0, 1.0, 0, 1, -30, 30, 0.1 ],
[290, 560, 0.0, 0.005, 0.0, 2000, 0, 0, 1.0, 0, 1, -30, 30, 0.1 ],
[332, 569, 0.0, 0.005, 0.0, 2000, 0, 0, 1.0, 0, 1, -30, 30, 0.1 ]
])
ppc["gencost"] = array([
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 6.0, 0, 9.0, 4.5, 7.2, 3.6 ],
[2, 0, 0, 3, 0, 28.0, 0, 42.0, 21.0, 33.6, 16.8 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 100.0, 0, 150.0, 75.0, 120.0, 60.0 ],
[2, 0, 0, 3, 0, 6.0, 0, 9.0, 4.5, 7.2, 3.6 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 25.0, 0, 37.5, 18.75, 30.0, 15.0 ],
[2, 0, 0, 3, 0, 100.0, 0, 150.0, 75.0, 120.0, 60.0 ],
[2, 0, 0, 3, 0, 6.0, 0, 9.0, 4.5, 7.2, 3.6 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 25.0, 0, 37.5, 18.75, 30.0, 15.0 ],
[2, 0, 0, 3, 0, 6.0, 0, 9.0, 4.5, 7.2, 3.6 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 25.0, 0, 37.5, 18.75, 30.0, 15.0 ],
[2, 0, 0, 3, 0, 6.0, 0, 9.0, 4.5, 7.2, 3.6 ],
[2, 0, 0, 3, 0, 100.0, 0, 150.0, 75.0, 120.0, 60.0 ],
[2, 0, 0, 3, 0, 10.0, 0, 15.0, 7.5, 12.0, 6.0 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 100.0, 0, 150.0, 75.0, 120.0, 60.0 ],
[2, 0, 0, 3, 0, 6.0, 0, 9.0, 4.5, 7.2, 3.6 ],
[2, 0, 0, 3, 0, 10.0, 0, 15.0, 7.5, 12.0, 6.0 ],
[2, 0, 0, 3, 0, 10.0, 0, 15.0, 7.5, 12.0, 6.0 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 6.0, 0, 9.0, 4.5, 7.2, 3.6 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 6.0, 0, 9.0, 4.5, 7.2, 3.6 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 10.0, 0, 15.0, 7.5, 12.0, 6.0 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 3.0, 0, 4.5, 2.25, 3.6, 1.8 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 3.0, 0, 4.5, 2.25, 3.6, 1.8 ],
[2, 0, 0, 3, 0, 10.0, 0, 15.0, 7.5, 12.0, 6.0 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 100.0, 0, 150.0, 75.0, 120.0, 60.0 ],
[2, 0, 0, 3, 0, 32.0, 0, 48.0, 24.0, 38.4, 19.2 ],
[2, 0, 0, 3, 0, 6.0, 0, 9.0, 4.5, 7.2, 3.6 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 25.0, 0, 37.5, 18.75, 30.0, 15.0 ],
[2, 0, 0, 3, 0, 100.0, 0, 150.0, 75.0, 120.0, 60.0 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 25.0, 0, 37.5, 18.75, 30.0, 15.0 ],
[2, 0, 0, 3, 0, 6.0, 0, 9.0, 4.5, 7.2, 3.6 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 8.0, 0, 12.0, 6.0, 9.6, 4.8 ],
[2, 0, 0, 3, 0, 6.0, 0, 9.0, 4.5, 7.2, 3.6 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 3.0, 0, 4.5, 2.25, 3.6, 1.8 ],
[2, 0, 0, 3, 0, 100.0, 0, 150.0, 75.0, 120.0, 60.0 ],
[2, 0, 0, 3, 0, 3.0, 0, 4.5, 2.25, 3.6, 1.8 ],
[2, 0, 0, 3, 0, 3.0, 0, 4.5, 2.25, 3.6, 1.8 ],
[2, 0, 0, 3, 0, 10.0, 0, 15.0, 7.5, 12.0, 6.0 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 6.0, 0, 9.0, 4.5, 7.2, 3.6 ],
[2, 0, 0, 3, 0, 10.0, 0, 15.0, 7.5, 12.0, 6.0 ],
[2, 0, 0, 3, 0, 6.0, 0, 9.0, 4.5, 7.2, 3.6 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 25.0, 0, 37.5, 18.75, 30.0, 15.0 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 26.0, 0, 39.0, 19.5, 31.2, 15.6 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 3.0, 0, 4.5, 2.25, 3.6, 1.8 ],
[2, 0, 0, 3, 0, 10.0, 0, 15.0, 7.5, 12.0, 6.0 ],
[2, 0, 0, 3, 0, 25.0, 0, 37.5, 18.75, 30.0, 15.0 ],
[2, 0, 0, 3, 0, 6.0, 0, 9.0, 4.5, 7.2, 3.6 ],
[2, 0, 0, 3, 0, 3.0, 0, 4.5, 2.25, 3.6, 1.8 ],
[2, 0, 0, 3, 0, 10.0, 0, 15.0, 7.5, 12.0, 6.0 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 6.0, 0, 9.0, 4.5, 7.2, 3.6 ],
[2, 0, 0, 3, 0, 10.0, 0, 15.0, 7.5, 12.0, 6.0 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 100.0, 0, 150.0, 75.0, 120.0, 60.0 ],
[2, 0, 0, 3, 0, 50.0, 0, 75.0, 37.5, 60.0, 30.0 ],
[2, 0, 0, 3, 0, 6.0, 0, 9.0, 4.5, 7.2, 3.6 ],
[2, 0, 0, 3, 0, 10.0, 0, 15.0, 7.5, 12.0, 6.0 ],
[2, 0, 0, 3, 0, 10.0, 0, 15.0, 7.5, 12.0, 6.0 ],
[2, 0, 0, 3, 0, 6.0, 0, 9.0, 4.5, 7.2, 3.6 ],
[2, 0, 0, 3, 0, 10.0, 0, 15.0, 7.5, 12.0, 6.0 ],
[2, 0, 0, 3, 0, 32.0, 0, 48.0, 24.0, 38.4, 19.2 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 1.5, 0.75, 1.2, 0.6 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
[2, 0, 0, 3, 0, 0.0, 0, 0.75, 0.375, 0.6, 0.3 ],
])
    return ppc

# infoblox_netmri/api/broker/v2_2_0/script_module_broker.py (infobloxopen/infoblox_netmri, Apache-2.0)
from ..broker import Broker
class ScriptModuleBroker(Broker):
controller = "script_modules"
def index(self, **kwargs):
"""Lists the available script modules. Any of the inputs listed may be be used to narrow the list; other inputs will be ignored. Of the various ways to query lists, using this method is most efficient.
**Inputs**
| ``api version min:`` 2.1
| ``api version max:`` 2.4
| ``required:`` False
| ``default:`` None
:param id: The internal NetMRI identifier of the Script Module.
:type id: Integer
| ``api version min:`` 2.5
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param id: The internal NetMRI identifier of the Script Module.
:type id: Array of Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 0
             :param start: The record number to return in the selected page of data. It will always appear, although it may not be the first record. See :limit for more information.
:type start: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` 1000
:param limit: The size of the page of data, that is, the maximum number of records returned. The limit size will be used to break the data up into pages and the first page with the start record will be returned. So if you have 100 records and use a :limit of 10 and a :start of 10, you will get records 10-19. The maximum limit is 10000.
:type limit: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` id
:param sort: The data field(s) to use for sorting the output. Default is id. Valid values are id, name, category, language, description, created_by, updated_by, created_at, updated_at.
:type sort: Array of String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` asc
:param dir: The direction(s) in which to sort the data. Default is 'asc'. Valid values are 'asc' and 'desc'.
:type dir: Array of String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param select: The list of attributes to return for each ScriptModule. Valid values are id, name, category, language, description, created_by, updated_by, created_at, updated_at. If empty or omitted, all attributes will be returned.
:type select: Array
| ``api version min:`` 2.8
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param goto_field: The field name for NIOS GOTO that is used for locating a row position of records.
:type goto_field: String
| ``api version min:`` 2.8
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param goto_value: The value of goto_field for NIOS GOTO that is used for locating a row position of records.
:type goto_value: String
**Outputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return script_modules: An array of the ScriptModule objects that match the specified input criteria.
:rtype script_modules: Array of ScriptModule
"""
return self.api_list_request(self._get_method_fullname("index"), kwargs)
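The `start`/`limit` paging described in the docstring above can be sketched offline; `page_records` is a hypothetical helper for illustration, not part of this client:

```python
# Hypothetical helper illustrating the start/limit paging semantics
# documented for index(); it is not part of the NetMRI API itself.
def page_records(start, limit, total):
    """Return the (first, last) record numbers a page covers, or None if out of range."""
    if start >= total or limit <= 0:
        return None
    return (start, min(start + limit, total) - 1)

# With 100 records, limit=10 and start=10 select records 10-19,
# matching the example in the :param limit description.
print(page_records(10, 10, 100))  # (10, 19)
```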
def destroy(self, **kwargs):
"""Deletes the specified script module from NetMRI.
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param id: The internal NetMRI identifier of the Script Module.
:type id: Integer
**Outputs**
"""
return self.api_request(self._get_method_fullname("destroy"), kwargs)
def export(self, **kwargs):
"""This method exports a Script Module
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param id: The internal NetMRI identifier of the Script Module
:type id: Integer
**Outputs**
"""
return self.api_request(self._get_method_fullname("export"), kwargs)
def create(self, **kwargs):
"""Creates a new script module.
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param category: User defined category of the Script Module.
:type category: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param description: A description for the Script Module.
:type description: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param language: The language of the script module
:type language: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param name: The unique name of the Script Module.
:type name: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param overwrite_ind: If set to 1, overwrite existing file. If set to 0, do not overwrite existing file
:type overwrite_ind: Boolean
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param script_source: Script source
:type script_source: String
**Outputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return id: The id of the newly created script module.
:rtype id: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return model: The class name of the newly created script module.
:rtype model: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return uri: A URI that may be used to retrieve the newly created script module.
:rtype uri: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return script_module: The newly created script module.
:rtype script_module: ScriptModule
"""
return self.api_request(self._get_method_fullname("create"), kwargs)
def update(self, **kwargs):
"""Updates an existing script module.
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param id: The internal NetMRI identifier of the Script Module.
:type id: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param category: User defined category of the Script Module. If omitted, this field will not be updated.
:type category: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param description: A description for the Script Module. If omitted, this field will not be updated.
:type description: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
             :param language: The language of the script module. If omitted, this field will not be updated.
:type language: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param name: The unique name of the Script Module.
:type name: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param overwrite_ind: If set to 1, overwrite existing file. If set to 0, do not overwrite existing file
:type overwrite_ind: Boolean
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param script_source: Script source
:type script_source: String
**Outputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return id: The id of the updated script module.
:rtype id: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return model: The class name of the updated script module.
:rtype model: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return uri: A URI that may be used to retrieve the updated script module.
:rtype uri: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:return script_module: The updated script module.
:rtype script_module: ScriptModule
"""
return self.api_request(self._get_method_fullname("update"), kwargs)
def show(self, **kwargs):
"""This method will make return the specified library module
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param id: The internal NetMRI identifier of the Library module to show
:type id: Integer
**Outputs**
"""
return self.api_request(self._get_method_fullname("show"), kwargs)
def duplicate(self, **kwargs):
"""This method will make a copy of the specified script
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param id: The internal NetMRI identifier of the script from which to make a copy
:type id: Integer
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param name: The name to be assigned to the new script
:type name: String
**Outputs**
"""
return self.api_request(self._get_method_fullname("duplicate"), kwargs)
def import_data(self, **kwargs):
"""This method imports a script module
**Inputs**
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` True
| ``default:`` None
:param sm_file: The script module file contents
:type sm_file: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param name: The script module file name
:type name: String
| ``api version min:`` None
| ``api version max:`` None
| ``required:`` False
| ``default:`` None
:param overwrite_ind: An indicator to overwrite an existing script module file with the same name. Overwrite if set to 1. Do not overwrite if set to 0
:type overwrite_ind: Boolean
**Outputs**
"""
return self.api_request(self._get_method_fullname("import"), kwargs)

# tests/basics/zip.py (peterson79/pycom-micropython-sigfox, MIT)
print(list(zip()))
print(list(zip([1], {2,3})))

# Paper_Specific_Versions/2019_DTI/Code/8-ADNI_classification_original_data.py (adamwild/AD-ML, MIT)
from clinica_ml_dwi.mlworkflow_dwi_utils import run_voxel_based_classification
# ########################
# ### Original classification
# ########################
caps_directory = 'PATH/TO/CAPS_DIR'
output_dir = 'PATH/TO/CLASSIFICATION_OUTPUT'
n_threads = 72
n_iterations = 250
test_size = 0.2
grid_search_folds = 10
modality = 'T1'
# ########################
# ### DWI
# ########################
###### CN vs AD
task = 'AD_vs_CN_VB'
diagnoses_tsv = 'PATH/TO/DIAGNOSIS_TSV'
subjects_visits_tsv = 'PATH/TO/SUBJECTS_VISITS_TSV'
run_voxel_based_classification(caps_directory, diagnoses_tsv, subjects_visits_tsv, output_dir,
task, n_threads, n_iterations, test_size, grid_search_folds)
##### CN vs pMCI
task = 'CN_vs_pMCI_VB'
diagnoses_tsv = 'PATH/TO/DIAGNOSIS_TSV'
subjects_visits_tsv = 'PATH/TO/SUBJECTS_VISITS_TSV'
run_voxel_based_classification(caps_directory, diagnoses_tsv, subjects_visits_tsv, output_dir,
task, n_threads, n_iterations, test_size, grid_search_folds)
###### CN vs MCI
task = 'CN_vs_MCI_VB'
diagnoses_tsv = 'PATH/TO/DIAGNOSIS_TSV'
subjects_visits_tsv = 'PATH/TO/SUBJECTS_VISITS_TSV'
run_voxel_based_classification(caps_directory, diagnoses_tsv, subjects_visits_tsv, output_dir,
task, n_threads, n_iterations, test_size, grid_search_folds)
###### sMCI vs pMCI
task = 'sMCI_vs_pMCI_VB'
diagnoses_tsv = 'PATH/TO/DIAGNOSIS_TSV'
subjects_visits_tsv = 'PATH/TO/SUBJECTS_VISITS_TSV'
run_voxel_based_classification(caps_directory, diagnoses_tsv, subjects_visits_tsv, output_dir,
task, n_threads, n_iterations, test_size, grid_search_folds)
# ########################
# ### T1
# ########################
###### CN vs AD
task = 'AD_vs_CN_VB_T1'
diagnoses_tsv = 'PATH/TO/DIAGNOSIS_TSV'
subjects_visits_tsv = 'PATH/TO/SUBJECTS_VISITS_TSV'
run_voxel_based_classification(caps_directory, diagnoses_tsv, subjects_visits_tsv, output_dir,
task, n_threads, n_iterations, test_size, grid_search_folds, modality=modality)
##### CN vs pMCI
task = 'CN_vs_pMCI_VB_T1'
diagnoses_tsv = 'PATH/TO/DIAGNOSIS_TSV'
subjects_visits_tsv = 'PATH/TO/SUBJECTS_VISITS_TSV'
run_voxel_based_classification(caps_directory, diagnoses_tsv, subjects_visits_tsv, output_dir,
task, n_threads, n_iterations, test_size, grid_search_folds, modality=modality)
###### CN vs MCI
task = 'CN_vs_MCI_VB_T1'
diagnoses_tsv = 'PATH/TO/DIAGNOSIS_TSV'
subjects_visits_tsv = 'PATH/TO/SUBJECTS_VISITS_TSV'
run_voxel_based_classification(caps_directory, diagnoses_tsv, subjects_visits_tsv, output_dir,
task, n_threads, n_iterations, test_size, grid_search_folds, modality=modality)
###### sMCI vs pMCI
task = 'sMCI_vs_pMCI_VB_T1'
diagnoses_tsv = 'PATH/TO/DIAGNOSIS_TSV'
subjects_visits_tsv = 'PATH/TO/SUBJECTS_VISITS_TSV'
run_voxel_based_classification(caps_directory, diagnoses_tsv, subjects_visits_tsv, output_dir,
task, n_threads, n_iterations, test_size, grid_search_folds, modality=modality)
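
# The eight blocks above repeat the same call with only `task` (and, for the
# T1 runs, `modality`) varying. A minimal sketch of the same driver as a loop;
# `build_jobs` is illustrative and not part of the original script:

```python
def build_jobs(dwi_tasks, t1_tasks):
    """Pair each task with the modality keyword it needs (None means default DWI)."""
    return [(t, None) for t in dwi_tasks] + [(t, 'T1') for t in t1_tasks]

comparisons = ['AD_vs_CN', 'CN_vs_pMCI', 'CN_vs_MCI', 'sMCI_vs_pMCI']
jobs = build_jobs([c + '_VB' for c in comparisons],
                  [c + '_VB_T1' for c in comparisons])
print(len(jobs))  # 8 task/modality combinations
```

Each `(task, modality)` pair would then feed a single `run_voxel_based_classification` call, passing `modality=modality` only when it is not None.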

# test/test.py (DeronW/journey, MIT)
# first, start up Node server normally
# then, run current file in pure python3
import unittest
import requests
class TestSpot(unittest.TestCase):
def test_ping(self):
r = requests.get("http://localhost:3000/ping")
assert r.status_code == 200
assert r.text == "pong"
class TestCoreAPI(unittest.TestCase):
def test_aggregate_default(self):
r = requests.post(
"http://localhost:3000/aggregate",
json={"points": [{"lat": 1, "lng": 1}, {"lat": 2, "lng": 2}]},
)
assert r.status_code == 200
def test_aggregate_polygon(self):
r = requests.post(
"http://localhost:3000/aggregate",
json={
"points": [{"lat": 1, "lng": 1}, {"lat": 2, "lng": 2}],
"pageSize": 20,
"pageNum": 2,
"distance": 9999,
"mode": "polylineBuffer",
"debug": True,
},
)
assert r.status_code == 200
def test_aggregate_ellipse(self):
r = requests.post(
"http://localhost:3000/aggregate",
json={
"points": [{"lat": 1, "lng": 1}, {"lat": 2, "lng": 2}],
"pageSize": 20,
"pageNum": 2,
"shrink": 0.3,
"mode": "bundingCircle",
"debug": True,
},
)
assert r.status_code == 200
def test_aggregate_400(self):
r = requests.post(
"http://localhost:3000/aggregate",
json={
"points": [{"lat": 1, "lng": 1}, {"lat": 2, "lng": 2}],
"shrink": 2,
"mode": "bundingCircle",
},
)
assert r.status_code == 400
r = requests.post(
"http://localhost:3000/aggregate",
json={
"points": [{"lat": 1, "lng": 1}, {"lat": 2, "lng": 2}],
"shrink": 0,
"mode": "bundingCircle",
},
)
assert r.status_code == 400
class TestPOI(unittest.TestCase):
def test_list(self):
r = requests.get("http://localhost:3000/poi/list")
assert r.status_code == 200
r = requests.get("http://localhost:3000/poi/list?pageNum=1&pageSize=2")
assert r.status_code == 200
def test_create(self):
source_id = 12345678
r = requests.post("http://localhost:3000/poi/delete?source_id=%s" % source_id)
assert r.status_code == 200
r = requests.post(
"http://localhost:3000/poi/create",
json={"source_id": source_id, "lat": 1, "lng": 1},
)
assert r.status_code == 200
r = requests.post(
"http://localhost:3000/poi/create",
json={"source_id": source_id, "lat": 1, "lng": 1},
)
assert r.status_code == 409
r = requests.get("http://localhost:3000/poi/info?source_id=%s" % source_id)
        assert r.status_code == 200
assert r.json()["data"]["source_id"] == source_id
r = requests.post("http://localhost:3000/poi/delete?source_id=%s" % source_id)
assert r.status_code == 200
def test_info(self):
source_id = 12345678
r = requests.post(
"http://localhost:3000/poi/create",
json={"source_id": source_id, "lat": 1, "lng": 1},
)
assert r.status_code == 200
r = requests.get("http://localhost:3000/poi/info?source_id=%s" % source_id)
assert r.status_code == 200
assert r.json()["data"]["lat"] == 1
r = requests.post("http://localhost:3000/poi/delete?source_id=%s" % source_id)
assert r.status_code == 200
r = requests.get("http://localhost:3000/poi/info?source_id=%s" % source_id)
assert r.status_code == 404
def test_update(self):
source_id = 12345678
r = requests.post("http://localhost:3000/poi/delete?source_id=%s" % source_id)
assert r.status_code == 200
r = requests.post(
"http://localhost:3000/poi/create",
json={"source_id": source_id, "lat": 1, "lng": 1},
)
assert r.status_code == 200
r = requests.post(
"http://localhost:3000/poi/update?source_id=%s" % source_id,
json={"source_id": 12345678, "lat": 2, "lng": 2},
)
assert r.status_code == 200
r = requests.get("http://localhost:3000/poi/info?source_id=%s" % source_id)
assert r.status_code == 200
assert r.json()["data"]["lat"] == 2
r = requests.post("http://localhost:3000/poi/delete?source_id=%s" % source_id)
assert r.status_code == 200
r = requests.post(
"http://localhost:3000/poi/update?source_id=%s" % source_id,
json={"source_id": 12345678, "lat": 3, "lng": 3},
)
assert r.status_code == 200
r = requests.get("http://localhost:3000/poi/info?source_id=%s" % source_id)
assert r.status_code == 200
assert r.json()["data"]["lat"] == 3
r = requests.post("http://localhost:3000/poi/delete?source_id=%s" % source_id)
assert r.status_code == 200
def test_remove(self):
source_id = 12345678
r = requests.post(
"http://localhost:3000/poi/create",
json={"source_id": source_id, "lat": 1, "lng": 1},
)
assert r.status_code == 200
r = requests.post("http://localhost:3000/poi/delete?source_id=%s" % source_id)
assert r.status_code == 200
r = requests.get("http://localhost:3000/poi/%s" % source_id)
assert r.status_code == 404
r = requests.post("http://localhost:3000/poi/delete?source_id=%s" % source_id)
assert r.status_code == 200

# battlescribe_parser/roster/file.py (JKolios/battlewhoosh, MIT)
import battlescribe_parser.bsdata.file
class RosterFile(battlescribe_parser.bsdata.file.File):
# TODO: Implement :)
def scrape_selections(self):
return []

# src/optimize_hyperparams.py (varunlakshmanan/GLASS, MIT)
import optuna
from sklearn.model_selection import cross_val_score
import warnings
warnings.filterwarnings("ignore", "Solver terminated early.*")
warnings.filterwarnings("ignore", "Maximum number of iteration reached before*")
# All objective functions define a single trial in Optuna for each type of estimator
def dt_objective(trial, x, y, estimator):
max_depth = int(trial.suggest_int("max_depth", 1, 15, 1))
min_samples_split = trial.suggest_discrete_uniform("min_samples_split", 0.005, 0.5, 0.005)
min_samples_leaf = trial.suggest_discrete_uniform("min_samples_leaf", 0.001, 0.1, 0.001)
min_weight_fraction_leaf = trial.suggest_discrete_uniform("min_weight_fraction_leaf", 0.0, 0.5, 0.05)
max_features = trial.suggest_discrete_uniform("max_features", 0.01, 0.25, 0.01)
max_leaf_nodes = int(trial.suggest_int("max_leaf_nodes", 64, 512, 64))
params = {
"max_depth": max_depth,
"min_samples_split": min_samples_split,
"min_samples_leaf": min_samples_leaf,
"min_weight_fraction_leaf": min_weight_fraction_leaf,
"max_features": max_features,
"max_leaf_nodes": max_leaf_nodes
}
estimator.set_params(**params)
score = cross_val_score(estimator, x, y)
accuracy = score.mean()
if accuracy < 0:
score = cross_val_score(estimator, x, y, scoring="neg_mean_squared_error")
accuracy = score.mean()
    return accuracy
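
# Each objective above takes (trial, x, y, estimator), while Optuna's
# study.optimize expects a callable of a single `trial` argument, so the extra
# arguments are bound first. A minimal sketch of that binding with
# functools.partial, using a stub objective so it runs without Optuna or
# scikit-learn (the stub stands in for cross_val_score(...).mean()):

```python
from functools import partial

def stub_objective(trial, x, y, estimator):
    # Stand-in for the real objectives above; returns a fake score.
    return len(x) + len(y)

objective = partial(stub_objective, x=[1, 2, 3], y=[0, 1, 1], estimator=None)
# study.optimize(objective, n_trials=50) would call objective(trial) each trial:
print(objective(trial=None))  # 6
```

With the real objectives, the equivalent binding would look like `partial(dt_objective, x=x, y=y, estimator=estimator)` or a one-argument lambda.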
def rf_objective(trial, x, y, estimator):
n_estimators = int(trial.suggest_int("n_estimators", 50, 200, 50))
max_depth = int(trial.suggest_int("max_depth", 1, 15, 1))
min_samples_split = trial.suggest_discrete_uniform("min_samples_split", 0.005, 0.5, 0.005)
min_samples_leaf = trial.suggest_discrete_uniform("min_samples_leaf", 0.001, 0.1, 0.001)
min_weight_fraction_leaf = trial.suggest_discrete_uniform("min_weight_fraction_leaf", 0.0, 0.5, 0.05)
max_features = trial.suggest_discrete_uniform("max_features", 0.01, 0.25, 0.01)
max_leaf_nodes = int(trial.suggest_int("max_leaf_nodes", 64, 512, 64))
params = {
"n_estimators": n_estimators,
"max_depth": max_depth,
"min_samples_split": min_samples_split,
"min_samples_leaf": min_samples_leaf,
"min_weight_fraction_leaf": min_weight_fraction_leaf,
"max_features": max_features,
"max_leaf_nodes": max_leaf_nodes
}
estimator.set_params(**params)
score = cross_val_score(estimator, x, y)
accuracy = score.mean()
if accuracy < 0:
score = cross_val_score(estimator, x, y, scoring="neg_mean_squared_error")
accuracy = score.mean()
    return accuracy
def ab_objective(trial, x, y, estimator):
n_estimators = int(trial.suggest_int("n_estimators", 50, 200, 50))
learning_rate = trial.suggest_discrete_uniform("learning_rate", 0.001, 1, 0.0001)
params = {
"n_estimators": n_estimators,
"learning_rate": learning_rate,
}
estimator.set_params(**params)
score = cross_val_score(estimator, x, y)
accuracy = score.mean()
if accuracy < 0:
score = cross_val_score(estimator, x, y, scoring="neg_mean_squared_error")
accuracy = score.mean()
    return accuracy
def bag_objective(trial, x, y, estimator):
n_estimators = int(trial.suggest_int("n_estimators", 50, 200, 50))
max_samples = trial.suggest_discrete_uniform("max_samples", 0.01, 1.0, 0.01)
max_features = trial.suggest_discrete_uniform("max_features", 0.01, 0.25, 0.01)
params = {
"n_estimators": n_estimators,
"max_samples": max_samples,
"max_features": max_features
}
estimator.set_params(**params)
score = cross_val_score(estimator, x, y)
accuracy = score.mean()
if accuracy < 0:
score = cross_val_score(estimator, x, y, scoring="neg_mean_squared_error")
accuracy = score.mean()
    return accuracy
def et_objective(trial, x, y, estimator):
n_estimators = int(trial.suggest_int("n_estimators", 50, 200, 50))
max_depth = int(trial.suggest_int("max_depth", 1, 15, 1))
min_samples_split = trial.suggest_discrete_uniform("min_samples_split", 0.005, 0.5, 0.005)
min_samples_leaf = trial.suggest_discrete_uniform("min_samples_leaf", 0.001, 0.1, 0.001)
min_weight_fraction_leaf = trial.suggest_discrete_uniform("min_weight_fraction_leaf", 0.0, 0.5, 0.05)
max_features = trial.suggest_discrete_uniform("max_features", 0.01, 0.25, 0.01)
max_leaf_nodes = int(trial.suggest_int("max_leaf_nodes", 64, 512, 64))
params = {
"n_estimators": n_estimators,
"max_depth": max_depth,
"min_samples_split": min_samples_split,
"min_samples_leaf": min_samples_leaf,
"min_weight_fraction_leaf": min_weight_fraction_leaf,
"max_features": max_features,
"max_leaf_nodes": max_leaf_nodes
}
estimator.set_params(**params)
score = cross_val_score(estimator, x, y)
accuracy = score.mean()
if accuracy < 0:
score = cross_val_score(estimator, x, y, scoring="neg_mean_squared_error")
accuracy = score.mean()
    return accuracy
def gb_objective(trial, x, y, estimator):
learning_rate = trial.suggest_discrete_uniform("learning_rate", 0.001, 1, 0.0001)
n_estimators = int(trial.suggest_int("n_estimators", 50, 200, 50))
min_samples_split = trial.suggest_discrete_uniform("min_samples_split", 0.005, 0.5, 0.005)
min_samples_leaf = trial.suggest_discrete_uniform("min_samples_leaf", 0.001, 0.1, 0.001)
min_weight_fraction_leaf = trial.suggest_discrete_uniform("min_weight_fraction_leaf", 0.0, 0.5, 0.05)
max_depth = int(trial.suggest_int("max_depth", 1, 15, 1))
max_features = trial.suggest_discrete_uniform("max_features", 0.01, 0.25, 0.01)
max_leaf_nodes = int(trial.suggest_int("max_leaf_nodes", 64, 512, 64))
params = {
"learning_rate": learning_rate,
"n_estimators": n_estimators,
"min_samples_split": min_samples_split,
"min_samples_leaf": min_samples_leaf,
"min_weight_fraction_leaf": min_weight_fraction_leaf,
"max_depth": max_depth,
"max_features": max_features,
"max_leaf_nodes": max_leaf_nodes
}
estimator.set_params(**params)
score = cross_val_score(estimator, x, y)
accuracy = score.mean()
if accuracy < 0:
score = cross_val_score(estimator, x, y, scoring="neg_mean_squared_error")
accuracy = score.mean()
return accuracy
return accuracy
def gp_objective(trial, x, y, estimator):
n_restarts_optimizer = int(trial.suggest_discrete_uniform("n_restarts_optimizer", 0, 100, 2))
params = {
"n_restarts_optimizer": n_restarts_optimizer
}
estimator.set_params(**params)
score = cross_val_score(estimator, x, y)
accuracy = score.mean()
if accuracy < 0:
score = cross_val_score(estimator, x, y, scoring="neg_mean_squared_error")
accuracy = score.mean()
return accuracy
return accuracy
def lr_objective(trial, x, y, estimator):
penalty = trial.suggest_categorical("penalty", ["l2", "none"])
tol = trial.suggest_discrete_uniform("tol", 10**(-6), 10**(-2), 10**(-4))
c = trial.suggest_discrete_uniform("C", 0.01, 10, 0.1)
solver = trial.suggest_categorical("solver", ["newton-cg", "lbfgs", "sag", "saga"])
max_iter = int(trial.suggest_discrete_uniform("max_iter", 100, 9100, 1000))
params = {
"penalty": penalty,
"tol": tol,
"C": c,
"solver": solver,
"max_iter": max_iter
}
estimator.set_params(**params)
score = cross_val_score(estimator, x, y)
accuracy = score.mean()
if accuracy < 0:
score = cross_val_score(estimator, x, y, scoring="neg_mean_squared_error")
accuracy = score.mean()
return accuracy
return accuracy
def pa_objective(trial, x, y, estimator):
c = trial.suggest_discrete_uniform("C", 0.01, 10, 0.1)
max_iter = int(trial.suggest_discrete_uniform("max_iter", 500, 5000, 500))
tol = trial.suggest_discrete_uniform("tol", 10**(-5), 10**(-1), 10**(-3))
params = {
"C": c,
"max_iter": max_iter,
"tol": tol,
}
estimator.set_params(**params)
score = cross_val_score(estimator, x, y)
accuracy = score.mean()
if accuracy < 0:
score = cross_val_score(estimator, x, y, scoring="neg_mean_squared_error")
accuracy = score.mean()
return accuracy
return accuracy
def ridge_objective(trial, x, y, estimator):
alpha = trial.suggest_discrete_uniform("alpha", 0.01, 10, 0.1)
max_iter = int(trial.suggest_discrete_uniform("max_iter", 500, 10000, 500))
tol = trial.suggest_discrete_uniform("tol", 10**(-5), 10**(-1), 10**(-3))
solver = trial.suggest_categorical("solver", ["auto", "svd", "cholesky", "lsqr", "sparse_cg", "sag", "saga"])
params = {
"solver": solver,
"alpha": alpha,
"max_iter": max_iter,
"tol": tol,
}
estimator.set_params(**params)
score = cross_val_score(estimator, x, y)
accuracy = score.mean()
if accuracy < 0:
score = cross_val_score(estimator, x, y, scoring="neg_mean_squared_error")
accuracy = score.mean()
return accuracy
return accuracy
def sgd_objective(trial, x, y, estimator):
penalty = trial.suggest_categorical("penalty", ["l2", "l1"])
alpha = trial.suggest_discrete_uniform("alpha", 0.00001, 0.01, 0.001)
l1_ratio = trial.suggest_discrete_uniform("l1_ratio", 0.01, 0.3, 0.01)
max_iter = int(trial.suggest_discrete_uniform("max_iter", 500, 20000, 500))
tol = trial.suggest_discrete_uniform("tol", 10**(-5), 10**(-1), 10**(-3))
params = {
"penalty": penalty,
"alpha": alpha,
"l1_ratio": l1_ratio,
"max_iter": max_iter,
"tol": tol,
}
estimator.set_params(**params)
score = cross_val_score(estimator, x, y)
accuracy = score.mean()
if accuracy < 0:
score = cross_val_score(estimator, x, y, scoring="neg_mean_squared_error")
accuracy = score.mean()
return accuracy
return accuracy
def per_objective(trial, x, y, estimator):
penalty = trial.suggest_categorical("penalty", ["l2", "l1", "elasticnet"])
alpha = trial.suggest_discrete_uniform("alpha", 0.00001, 0.01, 0.001)
max_iter = int(trial.suggest_discrete_uniform("max_iter", 500, 5000, 500))
tol = trial.suggest_discrete_uniform("tol", 10**(-5), 10**(-1), 10**(-3))
params = {
"penalty": penalty,
"alpha": alpha,
"max_iter": max_iter,
"tol": tol,
}
estimator.set_params(**params)
score = cross_val_score(estimator, x, y)
accuracy = score.mean()
if accuracy < 0:
score = cross_val_score(estimator, x, y, scoring="neg_mean_squared_error")
accuracy = score.mean()
return accuracy
return accuracy
def sv_objective(trial, x, y, estimator):
C = trial.suggest_discrete_uniform("C", 0.01, 10, 0.1)
tol = trial.suggest_discrete_uniform("tol", 10**(-5), 10**(-1), 10**(-3))
params = {
"C": C,
"tol": tol,
}
estimator.set_params(**params)
score = cross_val_score(estimator, x, y)
accuracy = score.mean()
if accuracy < 0:
score = cross_val_score(estimator, x, y, scoring="neg_mean_squared_error")
accuracy = score.mean()
return accuracy
return accuracy
def nu_sv_objective(trial, x, y, estimator):
    nu = trial.suggest_discrete_uniform("nu", 0.01, 1, 0.01)
tol = trial.suggest_discrete_uniform("tol", 10**(-5), 10**(-1), 10**(-3))
kernel = trial.suggest_categorical("kernel", ["linear", "poly", "rbf", "sigmoid"])
max_iter = int(trial.suggest_discrete_uniform("max_iter", 500, 20000, 500))
params = {
"nu": nu,
"kernel": kernel,
"tol": tol,
"max_iter": max_iter,
}
estimator.set_params(**params)
score = cross_val_score(estimator, x, y)
accuracy = score.mean()
if accuracy < 0:
score = cross_val_score(estimator, x, y, scoring="neg_mean_squared_error")
accuracy = score.mean()
return accuracy
return accuracy
def l_sv_objective(trial, x, y, estimator):
C = trial.suggest_discrete_uniform("C", 0.01, 10, 0.1)
tol = trial.suggest_discrete_uniform("tol", 10**(-5), 10**(-1), 10**(-3))
max_iter = int(trial.suggest_discrete_uniform("max_iter", 500, 40000, 500))
params = {
"C": C,
"tol": tol,
"max_iter": max_iter,
}
estimator.set_params(**params)
score = cross_val_score(estimator, x, y)
accuracy = score.mean()
if accuracy < 0:
score = cross_val_score(estimator, x, y, scoring="neg_mean_squared_error")
accuracy = score.mean()
return accuracy
return accuracy
def kn_objective(trial, x, y, estimator):
n_neighbors = int(trial.suggest_int("n_neighbors", 1, 10, 1))
weights = trial.suggest_categorical("weights", ["uniform", "distance"])
algorithm = trial.suggest_categorical("algorithm", ["auto", "ball_tree", "kd_tree", "brute"])
p = int(trial.suggest_discrete_uniform("p", 1, 10, 1))
params = {
"n_neighbors": n_neighbors,
"weights": weights,
"algorithm": algorithm,
"p": p
}
estimator.set_params(**params)
score = cross_val_score(estimator, x, y)
accuracy = score.mean()
if accuracy < 0:
score = cross_val_score(estimator, x, y, scoring="neg_mean_squared_error")
accuracy = score.mean()
return accuracy
return accuracy
def bnb_objective(trial, x, y, estimator):
alpha = trial.suggest_discrete_uniform("alpha", 0.0, 2.0, 0.001)
binarize = trial.suggest_discrete_uniform("binarize", 0.0, 2.0, 0.001)
params = {
"alpha": alpha,
"binarize": binarize,
}
estimator.set_params(**params)
score = cross_val_score(estimator, x, y)
accuracy = score.mean()
if accuracy < 0:
score = cross_val_score(estimator, x, y, scoring="neg_mean_squared_error")
accuracy = score.mean()
return accuracy
return accuracy
def gnb_objective(trial, x, y, estimator):
    var_smoothing = trial.suggest_discrete_uniform("var_smoothing", 10**(-10), 10**(-8), 10**(-9))  # 1**(-n) is always 1.0; powers of 10 were intended
params = {
"var_smoothing": var_smoothing,
}
estimator.set_params(**params)
score = cross_val_score(estimator, x, y)
accuracy = score.mean()
if accuracy < 0:
score = cross_val_score(estimator, x, y, scoring="neg_mean_squared_error")
accuracy = score.mean()
return accuracy
return accuracy
def lda_objective(trial, x, y, estimator):
solver = trial.suggest_categorical("solver", ["svd", "lsqr", "eigen"])
params = {
"solver": solver,
}
estimator.set_params(**params)
score = cross_val_score(estimator, x, y)
accuracy = score.mean()
if accuracy < 0:
score = cross_val_score(estimator, x, y, scoring="neg_mean_squared_error")
accuracy = score.mean()
return accuracy
return accuracy
def qda_objective(trial, x, y, estimator):
reg_param = trial.suggest_discrete_uniform("reg_param", 0.0, 1.0, 0.001)
params = {
"reg_param": reg_param,
}
estimator.set_params(**params)
score = cross_val_score(estimator, x, y)
accuracy = score.mean()
if accuracy < 0:
score = cross_val_score(estimator, x, y, scoring="neg_mean_squared_error")
accuracy = score.mean()
return accuracy
return accuracy
def xgb_objective(trial, x, y, estimator):
n_estimators = int(trial.suggest_int("n_estimators", 50, 200, 50))
    max_depth = int(trial.suggest_int("max_depth", 2, 14, 1))  # original step (14) exceeded the 2-6 range; a 2-14 range with step 1 was likely intended
learning_rate = trial.suggest_discrete_uniform("learning_rate", 0.0, 1.0, 0.001)
params = {
"n_estimators": n_estimators,
"max_depth": max_depth,
"learning_rate": learning_rate,
}
estimator.set_params(**params)
score = cross_val_score(estimator, x, y)
accuracy = score.mean()
if accuracy < 0:
score = cross_val_score(estimator, x, y, scoring="neg_mean_squared_error")
accuracy = score.mean()
return accuracy
return accuracy
def lgbm_objective(trial, x, y, estimator):
boosting_type = trial.suggest_categorical("boosting_type", ["gbdt", "dart", "goss"])
num_leaves = int(trial.suggest_int("num_leaves", 8, 128, 1))
max_depth = int(trial.suggest_int("max_depth", -1, 4, 1))
learning_rate = trial.suggest_discrete_uniform("learning_rate", 0.001, 0.25, 0.001)
n_estimators = int(trial.suggest_int("n_estimators", 50, 200, 50))
params = {
"boosting_type": boosting_type,
"num_leaves": num_leaves,
"max_depth": max_depth,
"learning_rate": learning_rate,
"n_estimators": n_estimators,
}
estimator.set_params(**params)
score = cross_val_score(estimator, x, y)
accuracy = score.mean()
if accuracy < 0:
score = cross_val_score(estimator, x, y, scoring="neg_mean_squared_error")
accuracy = score.mean()
return accuracy
return accuracy
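Every objective above ends with the same cross-validation fallback: score with the default scorer first, and switch to negated mean squared error when the mean comes back negative (as it can for regressors scored by R²). A sketch of factoring that into one shared helper — `cv_score` is an illustrative name, not part of the original code:

```python
from sklearn.model_selection import cross_val_score


def cv_score(estimator, x, y):
    # Mean score under the estimator's default scorer; when it is negative
    # (e.g. a regressor whose R^2 dips below zero), fall back to negated MSE
    # so that "maximize" still points the Optuna study the right way.
    accuracy = cross_val_score(estimator, x, y).mean()
    if accuracy < 0:
        accuracy = cross_val_score(
            estimator, x, y, scoring="neg_mean_squared_error").mean()
    return accuracy
```

Each objective body could then end with `return cv_score(estimator, x, y)` right after `estimator.set_params(**params)`, removing the duplicated fallback from every function.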
# Runs an Optuna study for each estimator using the matching objective defined above,
# then refits the estimator with the best parameters found within the timeout
def optimize_hyperparams(estimators, x, y, timeout):
print("Optimizing hyperparameters...")
optimized_estimators = []
for estimator in estimators:
if "DecisionTree" in str(estimator):
optuna.logging.set_verbosity(verbosity=optuna.logging.CRITICAL)
study = optuna.create_study(direction="maximize")
study.optimize(lambda trial: dt_objective(trial, x, y, estimator), timeout=timeout)
estimator.set_params(**study.best_params)
estimator.fit(x, y)
optimized_estimators.append(estimator)
elif "RandomForest" in str(estimator):
optuna.logging.set_verbosity(verbosity=optuna.logging.CRITICAL)
study = optuna.create_study(direction="maximize")
study.optimize(lambda trial: rf_objective(trial, x, y, estimator), timeout=timeout)
estimator.set_params(**study.best_params)
estimator.fit(x, y)
optimized_estimators.append(estimator)
elif "AdaBoost" in str(estimator):
optuna.logging.set_verbosity(verbosity=optuna.logging.CRITICAL)
study = optuna.create_study(direction="maximize")
study.optimize(lambda trial: ab_objective(trial, x, y, estimator), timeout=timeout)
estimator.set_params(**study.best_params)
estimator.fit(x, y)
optimized_estimators.append(estimator)
elif "Bagging" in str(estimator):
optuna.logging.set_verbosity(verbosity=optuna.logging.CRITICAL)
study = optuna.create_study(direction="maximize")
study.optimize(lambda trial: bag_objective(trial, x, y, estimator), timeout=timeout)
estimator.set_params(**study.best_params)
estimator.fit(x, y)
optimized_estimators.append(estimator)
elif "ExtraTrees" in str(estimator):
optuna.logging.set_verbosity(verbosity=optuna.logging.CRITICAL)
study = optuna.create_study(direction="maximize")
study.optimize(lambda trial: et_objective(trial, x, y, estimator), timeout=timeout)
estimator.set_params(**study.best_params)
estimator.fit(x, y)
optimized_estimators.append(estimator)
elif "GradientBoosting" in str(estimator):
optuna.logging.set_verbosity(verbosity=optuna.logging.CRITICAL)
study = optuna.create_study(direction="maximize")
study.optimize(lambda trial: gb_objective(trial, x, y, estimator), timeout=timeout)
estimator.set_params(**study.best_params)
estimator.fit(x, y)
optimized_estimators.append(estimator)
elif "GaussianProcess" in str(estimator):
optuna.logging.set_verbosity(verbosity=optuna.logging.CRITICAL)
study = optuna.create_study(direction="maximize")
study.optimize(lambda trial: gp_objective(trial, x, y, estimator), timeout=timeout)
estimator.set_params(**study.best_params)
estimator.fit(x, y)
optimized_estimators.append(estimator)
elif "LogisticRegression" in str(estimator):
optuna.logging.set_verbosity(verbosity=optuna.logging.CRITICAL)
study = optuna.create_study(direction="maximize")
study.optimize(lambda trial: lr_objective(trial, x, y, estimator), timeout=timeout)
estimator.set_params(**study.best_params)
estimator.fit(x, y)
optimized_estimators.append(estimator)
elif "PassiveAggressive" in str(estimator):
optuna.logging.set_verbosity(verbosity=optuna.logging.CRITICAL)
study = optuna.create_study(direction="maximize")
study.optimize(lambda trial: pa_objective(trial, x, y, estimator), timeout=timeout)
estimator.set_params(**study.best_params)
estimator.fit(x, y)
optimized_estimators.append(estimator)
elif "Ridge" in str(estimator):
optuna.logging.set_verbosity(verbosity=optuna.logging.CRITICAL)
study = optuna.create_study(direction="maximize")
study.optimize(lambda trial: ridge_objective(trial, x, y, estimator), timeout=timeout)
estimator.set_params(**study.best_params)
estimator.fit(x, y)
optimized_estimators.append(estimator)
elif "SGD" in str(estimator):
optuna.logging.set_verbosity(verbosity=optuna.logging.CRITICAL)
study = optuna.create_study(direction="maximize")
study.optimize(lambda trial: sgd_objective(trial, x, y, estimator), timeout=timeout)
estimator.set_params(**study.best_params)
estimator.fit(x, y)
optimized_estimators.append(estimator)
elif "Perceptron" in str(estimator):
optuna.logging.set_verbosity(verbosity=optuna.logging.CRITICAL)
study = optuna.create_study(direction="maximize")
study.optimize(lambda trial: per_objective(trial, x, y, estimator), timeout=timeout)
estimator.set_params(**study.best_params)
estimator.fit(x, y)
optimized_estimators.append(estimator)
elif "SVC()" == str(estimator) or "SVR()" == str(estimator):
optuna.logging.set_verbosity(verbosity=optuna.logging.CRITICAL)
study = optuna.create_study(direction="maximize")
study.optimize(lambda trial: sv_objective(trial, x, y, estimator), timeout=timeout)
estimator.set_params(**study.best_params)
estimator.fit(x, y)
optimized_estimators.append(estimator)
elif "NuSVC" in str(estimator) or "NuSVR" in str(estimator):
optuna.logging.set_verbosity(verbosity=optuna.logging.CRITICAL)
study = optuna.create_study(direction="maximize")
study.optimize(lambda trial: nu_sv_objective(trial, x, y, estimator), timeout=timeout)
estimator.set_params(**study.best_params)
estimator.fit(x, y)
optimized_estimators.append(estimator)
elif "LinearSVC" in str(estimator) or "LinearSVR" in str(estimator):
optuna.logging.set_verbosity(verbosity=optuna.logging.CRITICAL)
study = optuna.create_study(direction="maximize")
study.optimize(lambda trial: l_sv_objective(trial, x, y, estimator), timeout=timeout)
estimator.set_params(**study.best_params)
estimator.fit(x, y)
optimized_estimators.append(estimator)
elif "KNeighbors" in str(estimator):
optuna.logging.set_verbosity(verbosity=optuna.logging.CRITICAL)
study = optuna.create_study(direction="maximize")
study.optimize(lambda trial: kn_objective(trial, x, y, estimator), timeout=timeout)
estimator.set_params(**study.best_params)
estimator.fit(x, y)
optimized_estimators.append(estimator)
elif "BernoulliNB" in str(estimator):
optuna.logging.set_verbosity(verbosity=optuna.logging.CRITICAL)
study = optuna.create_study(direction="maximize")
study.optimize(lambda trial: bnb_objective(trial, x, y, estimator), timeout=timeout)
estimator.set_params(**study.best_params)
estimator.fit(x, y)
optimized_estimators.append(estimator)
elif "GaussianNB" in str(estimator):
optuna.logging.set_verbosity(verbosity=optuna.logging.CRITICAL)
study = optuna.create_study(direction="maximize")
study.optimize(lambda trial: gnb_objective(trial, x, y, estimator), timeout=timeout)
estimator.set_params(**study.best_params)
estimator.fit(x, y)
optimized_estimators.append(estimator)
elif "LinearDiscriminantAnalysis" in str(estimator):
optuna.logging.set_verbosity(verbosity=optuna.logging.CRITICAL)
study = optuna.create_study(direction="maximize")
study.optimize(lambda trial: lda_objective(trial, x, y, estimator), timeout=timeout)
estimator.set_params(**study.best_params)
estimator.fit(x, y)
optimized_estimators.append(estimator)
elif "QuadraticDiscriminantAnalysis" in str(estimator):
optuna.logging.set_verbosity(verbosity=optuna.logging.CRITICAL)
study = optuna.create_study(direction="maximize")
study.optimize(lambda trial: qda_objective(trial, x, y, estimator), timeout=timeout)
estimator.set_params(**study.best_params)
estimator.fit(x, y)
optimized_estimators.append(estimator)
elif "XGB" in str(estimator):
optuna.logging.set_verbosity(verbosity=optuna.logging.CRITICAL)
study = optuna.create_study(direction="maximize")
study.optimize(lambda trial: xgb_objective(trial, x, y, estimator), timeout=timeout)
estimator.set_params(**study.best_params)
estimator.fit(x, y)
optimized_estimators.append(estimator)
elif "LGBM" in str(estimator):
optuna.logging.set_verbosity(verbosity=optuna.logging.CRITICAL)
study = optuna.create_study(direction="maximize")
study.optimize(lambda trial: lgbm_objective(trial, x, y, estimator), timeout=timeout)
estimator.set_params(**study.best_params)
estimator.fit(x, y)
optimized_estimators.append(estimator)
return optimized_estimators
# benchmarks/SimResults/_bigLittle_hrrs_spec_tugberk_ml/SystemIPC/cmp_libquantum/power.py
# from TugberkArkose/MLScheduler (Unlicense)
power = {'BUSES': {'Area': 1.33155,
'Bus/Area': 1.33155,
'Bus/Gate Leakage': 0.00662954,
'Bus/Peak Dynamic': 0.0,
'Bus/Runtime Dynamic': 0.0,
'Bus/Subthreshold Leakage': 0.0691322,
'Bus/Subthreshold Leakage with power gating': 0.0259246,
'Gate Leakage': 0.00662954,
'Peak Dynamic': 0.0,
'Runtime Dynamic': 0.0,
'Subthreshold Leakage': 0.0691322,
'Subthreshold Leakage with power gating': 0.0259246},
'Core': [{'Area': 32.6082,
'Execution Unit/Area': 8.2042,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.0,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.202689,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.0,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.122718,
'Execution Unit/Instruction Scheduler/Area': 2.17927,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.328073,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.00115349,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.20978,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.161944,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.017004,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00962066,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00730101,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 1.00996,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00529112,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 2.07911,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.280429,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0800117,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0455351,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 4.84781,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.841232,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.000856399,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.55892,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.160834,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.0178624,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00897339,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.603206,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.114878,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.0641291,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.160075,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 5.17472,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.0,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.0058706,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.0424518,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.0434167,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.0424518,
'Execution Unit/Register Files/Runtime Dynamic': 0.0492873,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0442632,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00607074,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.102581,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.291753,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.0920413,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0345155,
'Execution Unit/Runtime Dynamic': 1.55231,
'Execution Unit/Subthreshold Leakage': 1.83518,
'Execution Unit/Subthreshold Leakage with power gating': 0.709678,
'Gate Leakage': 0.372997,
'Instruction Fetch Unit/Area': 5.86007,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.00137178,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.00137178,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.00119861,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000466078,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.000623684,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00456586,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.013017,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0590479,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0417376,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 2.65487,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.189128,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.14176,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 5.0042,
'Instruction Fetch Unit/Runtime Dynamic': 0.390208,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932587,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.408542,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0648581,
'L2/Runtime Dynamic': 0.0172214,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80969,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 1.94089,
'Load Store Unit/Data Cache/Runtime Dynamic': 0.365353,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0351387,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0227685,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.0227686,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 2.04884,
'Load Store Unit/Runtime Dynamic': 0.500408,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.0561434,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.112287,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591622,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283406,
'Memory Management Unit/Area': 0.434579,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0199255,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.0208993,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00813591,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.16507,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.0310056,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.358015,
'Memory Management Unit/Runtime Dynamic': 0.051905,
'Memory Management Unit/Subthreshold Leakage': 0.0769113,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0399462,
'Peak Dynamic': 17.2123,
'Renaming Unit/Area': 0.369768,
'Renaming Unit/FP Front End RAT/Area': 0.168486,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00489731,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 3.33511,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.0,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0437281,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.024925,
'Renaming Unit/Free List/Area': 0.0414755,
'Renaming Unit/Free List/Gate Leakage': 4.15911e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0401324,
'Renaming Unit/Free List/Runtime Dynamic': 0.00828093,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000670426,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000377987,
'Renaming Unit/Gate Leakage': 0.00863632,
'Renaming Unit/Int Front End RAT/Area': 0.114751,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.00038343,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.86945,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.0852802,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00611897,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00348781,
'Renaming Unit/Peak Dynamic': 4.56169,
'Renaming Unit/Runtime Dynamic': 0.0935611,
'Renaming Unit/Subthreshold Leakage': 0.070483,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0362779,
'Runtime Dynamic': 2.60562,
'Subthreshold Leakage': 6.21877,
'Subthreshold Leakage with power gating': 2.58311},
{'Area': 32.0201,
'Execution Unit/Area': 7.68434,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.0,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.202689,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.0,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.120359,
'Execution Unit/Instruction Scheduler/Area': 1.66526,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.275653,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.000977433,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.04181,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.0543723,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.0143453,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00810519,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00568913,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 0.805223,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00414562,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 1.6763,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.0877005,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0625755,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0355964,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 3.82262,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.584388,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.00056608,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.10451,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.0442682,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.00906853,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00364446,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.186341,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.0859892,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.047346,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.0621867,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 3.94439,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.0,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.00228062,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.0164919,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.0168666,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.0164919,
'Execution Unit/Register Files/Runtime Dynamic': 0.0191472,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0390912,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00537402,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.0347438,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.0964688,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.081478,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0305543,
'Execution Unit/Runtime Dynamic': 0.910023,
'Execution Unit/Subthreshold Leakage': 1.79543,
'Execution Unit/Subthreshold Leakage with power gating': 0.688821,
'Gate Leakage': 0.368936,
'Instruction Fetch Unit/Area': 5.85939,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.00057253,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.00057253,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.000505661,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000199572,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.00024229,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00189301,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.00523969,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0589979,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0162143,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 1.03137,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.0738241,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.0550709,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 3.29994,
'Instruction Fetch Unit/Runtime Dynamic': 0.152242,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932286,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.40843,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0280115,
'L2/Runtime Dynamic': 0.00965762,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80901,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 1.51897,
'Load Store Unit/Data Cache/Runtime Dynamic': 0.150761,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0350888,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0091184,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.0091184,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 1.56203,
'Load Store Unit/Runtime Dynamic': 0.204849,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.0224844,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.0449689,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591321,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283293,
'Memory Management Unit/Area': 0.4339,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.0079798,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.00840035,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00808595,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.0641267,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.0121027,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.233943,
'Memory Management Unit/Runtime Dynamic': 0.0205031,
'Memory Management Unit/Subthreshold Leakage': 0.0766103,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0398333,
'Peak Dynamic': 12.6578,
'Renaming Unit/Area': 0.303608,
'Renaming Unit/FP Front End RAT/Area': 0.131045,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00351123,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 2.51468,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.0,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0308571,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.0175885,
'Renaming Unit/Free List/Area': 0.0340654,
'Renaming Unit/Free List/Gate Leakage': 2.5481e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0306032,
'Renaming Unit/Free List/Runtime Dynamic': 0.00245313,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000370144,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000201064,
'Renaming Unit/Gate Leakage': 0.00708398,
'Renaming Unit/Int Front End RAT/Area': 0.0941223,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.000283242,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.731965,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.0279667,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00435488,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00248228,
'Renaming Unit/Peak Dynamic': 3.58947,
'Renaming Unit/Runtime Dynamic': 0.0304199,
'Renaming Unit/Subthreshold Leakage': 0.0552466,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0276461,
'Runtime Dynamic': 1.32769,
'Subthreshold Leakage': 6.16288,
'Subthreshold Leakage with power gating': 2.55328},
{'Area': 32.0201,
'Execution Unit/Area': 7.68434,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.0,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.202689,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.0,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.120359,
'Execution Unit/Instruction Scheduler/Area': 1.66526,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.275653,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.000977433,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.04181,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.05371,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.0143453,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00810519,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00568913,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 0.805223,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00414562,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 1.6763,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.0866323,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0625755,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0355964,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 3.82262,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.584388,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.00056608,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.10451,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.0437291,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.00906853,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00364446,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.184071,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.0859892,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.047346,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.061428,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 3.9429,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.0,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.00225284,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.0162907,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.0166611,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.0162907,
'Execution Unit/Register Files/Runtime Dynamic': 0.018914,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0390912,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00537402,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.0343199,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.0948756,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.081478,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0305543,
'Execution Unit/Runtime Dynamic': 0.905927,
'Execution Unit/Subthreshold Leakage': 1.79543,
'Execution Unit/Subthreshold Leakage with power gating': 0.688821,
'Gate Leakage': 0.368936,
'Instruction Fetch Unit/Area': 5.85939,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.000555719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.000555719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.000490572,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000193486,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.000239338,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00184135,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.00509449,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0589979,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0160168,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 1.01881,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.0712005,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.0544002,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 3.28676,
'Instruction Fetch Unit/Runtime Dynamic': 0.148553,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932286,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.40843,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.026332,
'L2/Runtime Dynamic': 0.00857008,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80901,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 1.5014,
'Load Store Unit/Data Cache/Runtime Dynamic': 0.14058,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0350888,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.00855007,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.00854996,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 1.54178,
'Load Store Unit/Runtime Dynamic': 0.191296,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.021083,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.0421655,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591321,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283293,
'Memory Management Unit/Area': 0.4339,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.00748243,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.00787769,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00808595,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.0633456,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.0116726,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.232308,
'Memory Management Unit/Runtime Dynamic': 0.0195503,
'Memory Management Unit/Subthreshold Leakage': 0.0766103,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0398333,
'Peak Dynamic': 12.6196,
'Renaming Unit/Area': 0.303608,
'Renaming Unit/FP Front End RAT/Area': 0.131045,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00351123,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 2.51468,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.0,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0308571,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.0175885,
'Renaming Unit/Free List/Area': 0.0340654,
'Renaming Unit/Free List/Gate Leakage': 2.5481e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0306032,
'Renaming Unit/Free List/Runtime Dynamic': 0.00242325,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000370144,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000201064,
'Renaming Unit/Gate Leakage': 0.00708398,
'Renaming Unit/Int Front End RAT/Area': 0.0941223,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.000283242,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.731965,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.0276474,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00435488,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00248228,
'Renaming Unit/Peak Dynamic': 3.58947,
'Renaming Unit/Runtime Dynamic': 0.0300706,
'Renaming Unit/Subthreshold Leakage': 0.0552466,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0276461,
'Runtime Dynamic': 1.30397,
'Subthreshold Leakage': 6.16288,
'Subthreshold Leakage with power gating': 2.55328},
{'Area': 32.0201,
'Execution Unit/Area': 7.68434,
'Execution Unit/Complex ALUs/Area': 0.235435,
'Execution Unit/Complex ALUs/Gate Leakage': 0.0132646,
'Execution Unit/Complex ALUs/Peak Dynamic': 0.0,
'Execution Unit/Complex ALUs/Runtime Dynamic': 0.202689,
'Execution Unit/Complex ALUs/Subthreshold Leakage': 0.20111,
'Execution Unit/Complex ALUs/Subthreshold Leakage with power gating': 0.0754163,
'Execution Unit/Floating Point Units/Area': 4.6585,
'Execution Unit/Floating Point Units/Gate Leakage': 0.0656156,
'Execution Unit/Floating Point Units/Peak Dynamic': 0.0,
'Execution Unit/Floating Point Units/Runtime Dynamic': 0.304033,
'Execution Unit/Floating Point Units/Subthreshold Leakage': 0.994829,
'Execution Unit/Floating Point Units/Subthreshold Leakage with power gating': 0.373061,
'Execution Unit/Gate Leakage': 0.120359,
'Execution Unit/Instruction Scheduler/Area': 1.66526,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Area': 0.275653,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Gate Leakage': 0.000977433,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Peak Dynamic': 1.04181,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Runtime Dynamic': 0.0491084,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage': 0.0143453,
'Execution Unit/Instruction Scheduler/FP Instruction Window/Subthreshold Leakage with power gating': 0.00810519,
'Execution Unit/Instruction Scheduler/Gate Leakage': 0.00568913,
'Execution Unit/Instruction Scheduler/Instruction Window/Area': 0.805223,
'Execution Unit/Instruction Scheduler/Instruction Window/Gate Leakage': 0.00414562,
'Execution Unit/Instruction Scheduler/Instruction Window/Peak Dynamic': 1.6763,
'Execution Unit/Instruction Scheduler/Instruction Window/Runtime Dynamic': 0.07921,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage': 0.0625755,
'Execution Unit/Instruction Scheduler/Instruction Window/Subthreshold Leakage with power gating': 0.0355964,
'Execution Unit/Instruction Scheduler/Peak Dynamic': 3.82262,
'Execution Unit/Instruction Scheduler/ROB/Area': 0.584388,
'Execution Unit/Instruction Scheduler/ROB/Gate Leakage': 0.00056608,
'Execution Unit/Instruction Scheduler/ROB/Peak Dynamic': 1.10451,
'Execution Unit/Instruction Scheduler/ROB/Runtime Dynamic': 0.0399825,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage': 0.00906853,
'Execution Unit/Instruction Scheduler/ROB/Subthreshold Leakage with power gating': 0.00364446,
'Execution Unit/Instruction Scheduler/Runtime Dynamic': 0.168301,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage': 0.0859892,
'Execution Unit/Instruction Scheduler/Subthreshold Leakage with power gating': 0.047346,
'Execution Unit/Integer ALUs/Area': 0.47087,
'Execution Unit/Integer ALUs/Gate Leakage': 0.0265291,
'Execution Unit/Integer ALUs/Peak Dynamic': 0.0561659,
'Execution Unit/Integer ALUs/Runtime Dynamic': 0.101344,
'Execution Unit/Integer ALUs/Subthreshold Leakage': 0.40222,
'Execution Unit/Integer ALUs/Subthreshold Leakage with power gating': 0.150833,
'Execution Unit/Peak Dynamic': 3.9326,
'Execution Unit/Register Files/Area': 0.570804,
'Execution Unit/Register Files/Floating Point RF/Area': 0.208131,
'Execution Unit/Register Files/Floating Point RF/Gate Leakage': 0.000232788,
'Execution Unit/Register Files/Floating Point RF/Peak Dynamic': 0.0,
'Execution Unit/Register Files/Floating Point RF/Runtime Dynamic': 0.00205983,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage': 0.00399698,
'Execution Unit/Register Files/Floating Point RF/Subthreshold Leakage with power gating': 0.00176968,
'Execution Unit/Register Files/Gate Leakage': 0.000622708,
'Execution Unit/Register Files/Integer RF/Area': 0.362673,
'Execution Unit/Register Files/Integer RF/Gate Leakage': 0.00038992,
'Execution Unit/Register Files/Integer RF/Peak Dynamic': 0.0148952,
'Execution Unit/Register Files/Integer RF/Runtime Dynamic': 0.0152337,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage': 0.00614175,
'Execution Unit/Register Files/Integer RF/Subthreshold Leakage with power gating': 0.00246675,
'Execution Unit/Register Files/Peak Dynamic': 0.0148952,
'Execution Unit/Register Files/Runtime Dynamic': 0.0172935,
'Execution Unit/Register Files/Subthreshold Leakage': 0.0101387,
'Execution Unit/Register Files/Subthreshold Leakage with power gating': 0.00423643,
'Execution Unit/Results Broadcast Bus/Area Overhead': 0.0390912,
'Execution Unit/Results Broadcast Bus/Gate Leakage': 0.00537402,
'Execution Unit/Results Broadcast Bus/Peak Dynamic': 0.03138,
'Execution Unit/Results Broadcast Bus/Runtime Dynamic': 0.088451,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage': 0.081478,
'Execution Unit/Results Broadcast Bus/Subthreshold Leakage with power gating': 0.0305543,
'Execution Unit/Runtime Dynamic': 0.882111,
'Execution Unit/Subthreshold Leakage': 1.79543,
'Execution Unit/Subthreshold Leakage with power gating': 0.688821,
'Gate Leakage': 0.368936,
'Instruction Fetch Unit/Area': 5.85939,
'Instruction Fetch Unit/Branch Predictor/Area': 0.138516,
'Instruction Fetch Unit/Branch Predictor/Chooser/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Chooser/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Chooser/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Chooser/Runtime Dynamic': 0.000506095,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Chooser/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/Gate Leakage': 0.000757657,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Area': 0.0435221,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Gate Leakage': 0.000278362,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Peak Dynamic': 0.0168831,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Runtime Dynamic': 0.000506095,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage': 0.00759719,
'Instruction Fetch Unit/Branch Predictor/Global Predictor/Subthreshold Leakage with power gating': 0.0039236,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Area': 0.0257064,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Gate Leakage': 0.000154548,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Peak Dynamic': 0.0142575,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Runtime Dynamic': 0.000446167,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage': 0.00384344,
'Instruction Fetch Unit/Branch Predictor/L1_Local Predictor/Subthreshold Leakage with power gating': 0.00198631,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Area': 0.0151917,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Gate Leakage': 8.00196e-05,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Peak Dynamic': 0.00527447,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Runtime Dynamic': 0.000175649,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage': 0.00181347,
'Instruction Fetch Unit/Branch Predictor/L2_Local Predictor/Subthreshold Leakage with power gating': 0.000957045,
'Instruction Fetch Unit/Branch Predictor/Peak Dynamic': 0.0597838,
'Instruction Fetch Unit/Branch Predictor/RAS/Area': 0.0105732,
'Instruction Fetch Unit/Branch Predictor/RAS/Gate Leakage': 4.63858e-05,
'Instruction Fetch Unit/Branch Predictor/RAS/Peak Dynamic': 0.0117602,
'Instruction Fetch Unit/Branch Predictor/RAS/Runtime Dynamic': 0.000218833,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage': 0.000932505,
'Instruction Fetch Unit/Branch Predictor/RAS/Subthreshold Leakage with power gating': 0.000494733,
'Instruction Fetch Unit/Branch Predictor/Runtime Dynamic': 0.00167719,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage': 0.0199703,
'Instruction Fetch Unit/Branch Predictor/Subthreshold Leakage with power gating': 0.0103282,
'Instruction Fetch Unit/Branch Target Buffer/Area': 0.64954,
'Instruction Fetch Unit/Branch Target Buffer/Gate Leakage': 0.00272758,
'Instruction Fetch Unit/Branch Target Buffer/Peak Dynamic': 0.177867,
'Instruction Fetch Unit/Branch Target Buffer/Runtime Dynamic': 0.00466095,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage': 0.0811682,
'Instruction Fetch Unit/Branch Target Buffer/Subthreshold Leakage with power gating': 0.0435357,
'Instruction Fetch Unit/Gate Leakage': 0.0589979,
'Instruction Fetch Unit/Instruction Buffer/Area': 0.0226323,
'Instruction Fetch Unit/Instruction Buffer/Gate Leakage': 6.83558e-05,
'Instruction Fetch Unit/Instruction Buffer/Peak Dynamic': 0.606827,
'Instruction Fetch Unit/Instruction Buffer/Runtime Dynamic': 0.0146445,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage': 0.00151885,
'Instruction Fetch Unit/Instruction Buffer/Subthreshold Leakage with power gating': 0.000701682,
'Instruction Fetch Unit/Instruction Cache/Area': 3.14635,
'Instruction Fetch Unit/Instruction Cache/Gate Leakage': 0.029931,
'Instruction Fetch Unit/Instruction Cache/Peak Dynamic': 0.931519,
'Instruction Fetch Unit/Instruction Cache/Runtime Dynamic': 0.0669734,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage': 0.367022,
'Instruction Fetch Unit/Instruction Cache/Subthreshold Leakage with power gating': 0.180386,
'Instruction Fetch Unit/Instruction Decoder/Area': 1.85799,
'Instruction Fetch Unit/Instruction Decoder/Gate Leakage': 0.0222493,
'Instruction Fetch Unit/Instruction Decoder/Peak Dynamic': 1.37404,
'Instruction Fetch Unit/Instruction Decoder/Runtime Dynamic': 0.0497394,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage': 0.442943,
'Instruction Fetch Unit/Instruction Decoder/Subthreshold Leakage with power gating': 0.166104,
'Instruction Fetch Unit/Peak Dynamic': 3.19524,
'Instruction Fetch Unit/Runtime Dynamic': 0.137695,
'Instruction Fetch Unit/Subthreshold Leakage': 0.932286,
'Instruction Fetch Unit/Subthreshold Leakage with power gating': 0.40843,
'L2/Area': 4.53318,
'L2/Gate Leakage': 0.015464,
'L2/Peak Dynamic': 0.0252124,
'L2/Runtime Dynamic': 0.00854446,
'L2/Subthreshold Leakage': 0.834142,
'L2/Subthreshold Leakage with power gating': 0.401066,
'Load Store Unit/Area': 8.80901,
'Load Store Unit/Data Cache/Area': 6.84535,
'Load Store Unit/Data Cache/Gate Leakage': 0.0279261,
'Load Store Unit/Data Cache/Peak Dynamic': 1.49562,
'Load Store Unit/Data Cache/Runtime Dynamic': 0.137772,
'Load Store Unit/Data Cache/Subthreshold Leakage': 0.527675,
'Load Store Unit/Data Cache/Subthreshold Leakage with power gating': 0.25085,
'Load Store Unit/Gate Leakage': 0.0350888,
'Load Store Unit/LoadQ/Area': 0.0836782,
'Load Store Unit/LoadQ/Gate Leakage': 0.00059896,
'Load Store Unit/LoadQ/Peak Dynamic': 0.0083632,
'Load Store Unit/LoadQ/Runtime Dynamic': 0.00836326,
'Load Store Unit/LoadQ/Subthreshold Leakage': 0.00941961,
'Load Store Unit/LoadQ/Subthreshold Leakage with power gating': 0.00536918,
'Load Store Unit/Peak Dynamic': 1.53512,
'Load Store Unit/Runtime Dynamic': 0.18738,
'Load Store Unit/StoreQ/Area': 0.322079,
'Load Store Unit/StoreQ/Gate Leakage': 0.00329971,
'Load Store Unit/StoreQ/Peak Dynamic': 0.0206222,
'Load Store Unit/StoreQ/Runtime Dynamic': 0.0412447,
'Load Store Unit/StoreQ/Subthreshold Leakage': 0.0345621,
'Load Store Unit/StoreQ/Subthreshold Leakage with power gating': 0.0197004,
'Load Store Unit/Subthreshold Leakage': 0.591321,
'Load Store Unit/Subthreshold Leakage with power gating': 0.283293,
'Memory Management Unit/Area': 0.4339,
'Memory Management Unit/Dtlb/Area': 0.0879726,
'Memory Management Unit/Dtlb/Gate Leakage': 0.00088729,
'Memory Management Unit/Dtlb/Peak Dynamic': 0.00731889,
'Memory Management Unit/Dtlb/Runtime Dynamic': 0.00769748,
'Memory Management Unit/Dtlb/Subthreshold Leakage': 0.0155699,
'Memory Management Unit/Dtlb/Subthreshold Leakage with power gating': 0.00887485,
'Memory Management Unit/Gate Leakage': 0.00808595,
'Memory Management Unit/Itlb/Area': 0.301552,
'Memory Management Unit/Itlb/Gate Leakage': 0.00393464,
'Memory Management Unit/Itlb/Peak Dynamic': 0.0579184,
'Memory Management Unit/Itlb/Runtime Dynamic': 0.0109796,
'Memory Management Unit/Itlb/Subthreshold Leakage': 0.0413758,
'Memory Management Unit/Itlb/Subthreshold Leakage with power gating': 0.0235842,
'Memory Management Unit/Peak Dynamic': 0.2266,
'Memory Management Unit/Runtime Dynamic': 0.0186771,
'Memory Management Unit/Subthreshold Leakage': 0.0766103,
'Memory Management Unit/Subthreshold Leakage with power gating': 0.0398333,
'Peak Dynamic': 12.5042,
'Renaming Unit/Area': 0.303608,
'Renaming Unit/FP Front End RAT/Area': 0.131045,
'Renaming Unit/FP Front End RAT/Gate Leakage': 0.00351123,
'Renaming Unit/FP Front End RAT/Peak Dynamic': 2.51468,
'Renaming Unit/FP Front End RAT/Runtime Dynamic': 0.0,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage': 0.0308571,
'Renaming Unit/FP Front End RAT/Subthreshold Leakage with power gating': 0.0175885,
'Renaming Unit/Free List/Area': 0.0340654,
'Renaming Unit/Free List/Gate Leakage': 2.5481e-05,
'Renaming Unit/Free List/Peak Dynamic': 0.0306032,
'Renaming Unit/Free List/Runtime Dynamic': 0.00221563,
'Renaming Unit/Free List/Subthreshold Leakage': 0.000370144,
'Renaming Unit/Free List/Subthreshold Leakage with power gating': 0.000201064,
'Renaming Unit/Gate Leakage': 0.00708398,
'Renaming Unit/Int Front End RAT/Area': 0.0941223,
'Renaming Unit/Int Front End RAT/Gate Leakage': 0.000283242,
'Renaming Unit/Int Front End RAT/Peak Dynamic': 0.731965,
'Renaming Unit/Int Front End RAT/Runtime Dynamic': 0.0252779,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage': 0.00435488,
'Renaming Unit/Int Front End RAT/Subthreshold Leakage with power gating': 0.00248228,
'Renaming Unit/Peak Dynamic': 3.58947,
'Renaming Unit/Runtime Dynamic': 0.0274935,
'Renaming Unit/Subthreshold Leakage': 0.0552466,
'Renaming Unit/Subthreshold Leakage with power gating': 0.0276461,
'Runtime Dynamic': 1.2619,
'Subthreshold Leakage': 6.16288,
'Subthreshold Leakage with power gating': 2.55328}],
'DRAM': {'Area': 0,
'Gate Leakage': 0,
'Peak Dynamic': 6.514505055053482,
'Runtime Dynamic': 6.514505055053482,
'Subthreshold Leakage': 4.252,
'Subthreshold Leakage with power gating': 4.252},
'L3': [{'Area': 61.9075,
'Gate Leakage': 0.0484137,
'Peak Dynamic': 0.309034,
'Runtime Dynamic': 0.10016,
'Subthreshold Leakage': 6.80085,
'Subthreshold Leakage with power gating': 3.32364}],
'Processor': {'Area': 191.908,
'Gate Leakage': 1.53485,
'Peak Dynamic': 55.3029,
'Peak Power': 88.4152,
'Runtime Dynamic': 6.59934,
'Subthreshold Leakage': 31.5774,
'Subthreshold Leakage with power gating': 13.9484,
'Total Cores/Area': 128.669,
'Total Cores/Gate Leakage': 1.4798,
'Total Cores/Peak Dynamic': 54.9939,
'Total Cores/Runtime Dynamic': 6.49918,
'Total Cores/Subthreshold Leakage': 24.7074,
'Total Cores/Subthreshold Leakage with power gating': 10.2429,
'Total L3s/Area': 61.9075,
'Total L3s/Gate Leakage': 0.0484137,
'Total L3s/Peak Dynamic': 0.309034,
'Total L3s/Runtime Dynamic': 0.10016,
'Total L3s/Subthreshold Leakage': 6.80085,
'Total L3s/Subthreshold Leakage with power gating': 3.32364,
'Total Leakage': 33.1122,
'Total NoCs/Area': 1.33155,
'Total NoCs/Gate Leakage': 0.00662954,
'Total NoCs/Peak Dynamic': 0.0,
'Total NoCs/Runtime Dynamic': 0.0,
'Total NoCs/Subthreshold Leakage': 0.0691322,
'Total NoCs/Subthreshold Leakage with power gating': 0.0259246}}
import os
import subprocess
import time
import zstacklib.utils.ssh as ssh
import zstackwoodpecker.test_util as test_util
import zstackwoodpecker.test_lib as test_lib
import zstackwoodpecker.operations.resource_operations as res_ops
import zstackwoodpecker.zstack_test.zstack_test_vm as zstack_vm_header
import zstackwoodpecker.operations.scenario_operations as scen_ops
def create_vlan_vm(image_name, l3_name=None, disk_offering_uuids=None):
image_uuid = test_lib.lib_get_image_by_name(image_name).uuid
if not l3_name:
l3_name = os.environ.get('l3PublicNetworkName')
l3_net_uuid = test_lib.lib_get_l3_by_name(l3_name).uuid
return create_vm([l3_net_uuid], image_uuid, 'zs_install_%s' % image_name, \
disk_offering_uuids)
def create_vm(l3_uuid_list, image_uuid, vm_name = None, \
disk_offering_uuids = None, default_l3_uuid = None):
vm_creation_option = test_util.VmOption()
conditions = res_ops.gen_query_conditions('name', '=', os.environ.get('instanceOfferingName_m'))
instance_offering_uuid = res_ops.query_resource(res_ops.INSTANCE_OFFERING, conditions)[0].uuid
vm_creation_option.set_instance_offering_uuid(instance_offering_uuid)
vm_creation_option.set_l3_uuids(l3_uuid_list)
vm_creation_option.set_image_uuid(image_uuid)
vm_creation_option.set_name(vm_name)
vm_creation_option.set_data_disk_uuids(disk_offering_uuids)
vm_creation_option.set_default_l3_uuid(default_l3_uuid)
vm = zstack_vm_header.ZstackTestVm()
vm.set_creation_option(vm_creation_option)
vm.create()
return vm
def create_vm_scenario(image_name, vm_name = None):
zstack_management_ip = test_lib.all_scenario_config.basicConfig.zstackManagementIp.text_
vm_creation_option = test_util.VmOption()
conditions = res_ops.gen_query_conditions('name', '=', os.environ.get('instanceOfferingName_m'))
instance_offering_uuid = scen_ops.query_resource(zstack_management_ip, res_ops.INSTANCE_OFFERING, conditions).inventories[0].uuid
vm_creation_option.set_instance_offering_uuid(instance_offering_uuid)
conditions = res_ops.gen_query_conditions('name', '=', 'public network')
l3_uuid = scen_ops.query_resource(zstack_management_ip, res_ops.L3_NETWORK, conditions).inventories[0].uuid
vm_creation_option.set_l3_uuids([l3_uuid])
vm_creation_option.set_default_l3_uuid(l3_uuid)
conditions = res_ops.gen_query_conditions('name', '=', image_name)
image_uuid = scen_ops.query_resource(zstack_management_ip, res_ops.IMAGE, conditions).inventories[0].uuid
vm_creation_option.set_image_uuid(image_uuid)
vm_creation_option.set_name(vm_name)
return scen_ops.create_vm(zstack_management_ip, vm_creation_option)
def destroy_vm_scenario(vm_uuid):
zstack_management_ip = test_lib.all_scenario_config.basicConfig.zstackManagementIp.text_
scen_ops.destroy_vm(zstack_management_ip, vm_uuid)
def create_instance_vm(image_name, instance_offering_uuid, l3_name=None, disk_offering_uuids = None, default_l3_uuid = None):
image_uuid = test_lib.lib_get_image_by_name(image_name).uuid
if not l3_name:
l3_name = os.environ.get('l3PublicNetworkName')
l3_net_uuid = test_lib.lib_get_l3_by_name(l3_name).uuid
vm_name = 'zs_install_%s' % image_name
vm_creation_option = test_util.VmOption()
vm_creation_option.set_instance_offering_uuid(instance_offering_uuid)
vm_creation_option.set_l3_uuids([l3_net_uuid])
vm_creation_option.set_image_uuid(image_uuid)
vm_creation_option.set_name(vm_name)
vm_creation_option.set_data_disk_uuids(disk_offering_uuids)
vm_creation_option.set_default_l3_uuid(default_l3_uuid)
vm = zstack_vm_header.ZstackTestVm()
vm.set_creation_option(vm_creation_option)
vm.create()
return vm
def check_str(string):
    if string is None:
        return ""
    return string
def execute_shell_in_process(cmd, tmp_file, timeout = 1200, no_timeout_excep = False):
logfd = open(tmp_file, 'w', 0)
process = subprocess.Popen(cmd, executable='/bin/sh', shell=True, stdout=logfd, stderr=logfd, universal_newlines=True)
start_time = time.time()
while process.poll() is None:
curr_time = time.time()
test_time = curr_time - start_time
if test_time > timeout:
process.kill()
logfd.close()
logfd = open(tmp_file, 'r')
test_util.test_logger('[shell:] %s [timeout logs:] %s' % (cmd, '\n'.join(logfd.readlines())))
logfd.close()
if no_timeout_excep:
test_util.test_logger('[shell:] %s timeout, after %d seconds' % (cmd, test_time))
return 1
else:
os.system('rm -f %s' % tmp_file)
test_util.test_fail('[shell:] %s timeout, after %d seconds' % (cmd, timeout))
        # test_time is a float; compare on whole seconds so the progress line actually prints
        if int(test_time) % 10 == 0:
print('shell script used: %ds' % int(test_time))
time.sleep(1)
logfd.close()
logfd = open(tmp_file, 'r')
test_util.test_logger('[shell:] %s [logs]: %s' % (cmd, '\n'.join(logfd.readlines())))
logfd.close()
return process.returncode
def execute_shell_in_process_stdout(cmd, tmp_file, timeout = 1200, no_timeout_excep = False):
logfd = open(tmp_file, 'w', 0)
process = subprocess.Popen(cmd, executable='/bin/sh', shell=True, stdout=logfd, universal_newlines=True)
start_time = time.time()
while process.poll() is None:
curr_time = time.time()
test_time = curr_time - start_time
if test_time > timeout:
process.kill()
logfd.close()
logfd = open(tmp_file, 'r')
test_util.test_logger('[shell:] %s [timeout logs:] %s' % (cmd, '\n'.join(logfd.readlines())))
logfd.close()
if no_timeout_excep:
test_util.test_logger('[shell:] %s timeout, after %d seconds' % (cmd, test_time))
return 1
else:
os.system('rm -f %s' % tmp_file)
test_util.test_fail('[shell:] %s timeout, after %d seconds' % (cmd, timeout))
        # test_time is a float; compare on whole seconds so the progress line actually prints
        if int(test_time) % 10 == 0:
print('shell script used: %ds' % int(test_time))
time.sleep(1)
logfd.close()
logfd = open(tmp_file, 'r')
stdout = '\n'.join(logfd.readlines())
logfd.close()
test_util.test_logger('[shell:] %s [logs]: %s' % (cmd, stdout))
return (process.returncode, stdout)
def scp_file_to_vm(vm_ip, src_file, target_file):
vm_username = os.environ['imageUsername']
vm_password = os.environ['imagePassword']
ssh.scp_file(src_file, target_file, vm_ip, vm_username, vm_password)
def copy_id_dsa(vm_ip, ssh_cmd, tmp_file):
src_file = '/root/.ssh/id_dsa'
target_file = '/root/.ssh/id_dsa'
if not os.path.exists(src_file):
os.system("ssh-keygen -t dsa -N '' -f %s" % src_file)
scp_file_to_vm(vm_ip, src_file, target_file)
cmd = '%s "chmod 600 /root/.ssh/id_dsa"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
def copy_id_dsa_pub(vm_ip):
src_file = '/root/.ssh/id_dsa.pub'
target_file = '/root/.ssh/authorized_keys'
if not os.path.exists(src_file):
os.system("ssh-keygen -t dsa -N '' -f %s" % src_file)
scp_file_to_vm(vm_ip, src_file, target_file)
def make_ssh_no_password(vm_ip, tmp_file):
ssh_cmd = 'ssh -oStrictHostKeyChecking=no -oCheckHostIP=no -oUserKnownHostsFile=/dev/null %s' % vm_ip
ssh.make_ssh_no_password(vm_ip, os.environ['imageUsername'], os.environ['imagePassword'])
copy_id_dsa(vm_ip, ssh_cmd, tmp_file)
copy_id_dsa_pub(vm_ip)
def update_mn_hostname(vm_ip, tmp_file):
ssh_cmd = 'ssh -oStrictHostKeyChecking=no -oCheckHostIP=no -oUserKnownHostsFile=/dev/null %s' % vm_ip
cmd = '%s "hostnamectl set-hostname zs-test" ' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
def update_repo(vm_ip, tmp_file):
ssh_cmd = 'ssh -oStrictHostKeyChecking=no -oCheckHostIP=no -oUserKnownHostsFile=/dev/null %s' % vm_ip
cmd = '''%s 'echo "172.20.198.214 repo.zstack.io" >> /etc/hosts' ''' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
def update_mn_ip(vm_ip, mn_ip, tmp_file):
    ssh_cmd = 'ssh -oStrictHostKeyChecking=no -oCheckHostIP=no -oUserKnownHostsFile=/dev/null %s' % vm_ip
    cmd = '%s "zstack-ctl change_ip --ip=%s"' % (ssh_cmd, vm_ip)
    process_result = execute_shell_in_process(cmd, tmp_file)
def reset_rabbitmq_for_13(vm_ip, tmp_file):
ssh_cmd = 'ssh -oStrictHostKeyChecking=no -oCheckHostIP=no -oUserKnownHostsFile=/dev/null %s' % vm_ip
cmd = '''%s 'rabbitmqctl add_user zstack zstack.password' ''' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
cmd = '%s "rabbitmqctl set_user_tags zstack administrator" ' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
cmd = '''%s 'rabbitmqctl set_permissions -p / zstack ".*" ".*" ".*" ' ''' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
def reset_rabbitmq(vm_ip, tmp_file):
ssh_cmd = 'ssh -oStrictHostKeyChecking=no -oCheckHostIP=no -oUserKnownHostsFile=/dev/null %s' % vm_ip
cmd = '%s "zstack-ctl reset_rabbitmq" ' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
def start_mn(vm_ip, tmp_file):
ssh_cmd = 'ssh -oStrictHostKeyChecking=no -oCheckHostIP=no -oUserKnownHostsFile=/dev/null %s' % vm_ip
cmd = '%s "zstack-ctl start"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
time.sleep(40)
def start_node(vm_ip, tmp_file):
ssh_cmd = 'ssh -oStrictHostKeyChecking=no -oCheckHostIP=no -oUserKnownHostsFile=/dev/null %s' % vm_ip
cmd = '%s "zstack-ctl start_node"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
time.sleep(40)
def stop_mn(vm_ip, tmp_file):
ssh_cmd = 'ssh -oStrictHostKeyChecking=no -oCheckHostIP=no -oUserKnownHostsFile=/dev/null %s' % vm_ip
cmd = '%s "zstack-ctl stop"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
time.sleep(40)
def stop_node(vm_ip, tmp_file):
ssh_cmd = 'ssh -oStrictHostKeyChecking=no -oCheckHostIP=no -oUserKnownHostsFile=/dev/null %s' % vm_ip
cmd = '%s "zstack-ctl stop_node"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
time.sleep(40)
def rm_old_zstack(vm_ip, zstack_home, tmp_file):
ssh_cmd = 'ssh -oStrictHostKeyChecking=no -oCheckHostIP=no -oUserKnownHostsFile=/dev/null %s' % vm_ip
cmd = '%s "rm -rf %s" ' % (ssh_cmd, zstack_home)
process_result = execute_shell_in_process(cmd, tmp_file)
def update_iso(vm_ip, tmp_file, iso_path, upgrade_script_path):
ssh_cmd = 'ssh -oStrictHostKeyChecking=no -oCheckHostIP=no -oUserKnownHostsFile=/dev/null %s' % vm_ip
vm_username = os.environ['imageUsername']
vm_password = os.environ['imagePassword']
cmd = '%s "rm -f /opt/zstack.iso"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
ssh.scp_file(iso_path, '/opt/zstack.iso', vm_ip, vm_username, vm_password)
ssh.scp_file(upgrade_script_path, '/opt/zstack-upgrade', vm_ip, vm_username, vm_password)
cmd = '%s "mkdir -p /opt/zstack-dvd"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
cmd = '%s "bash /opt/zstack-upgrade -r /opt/zstack.iso"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
#cmd = '%s "zstack-ctl stop"' % ssh_cmd
#process_result = execute_shell_in_process(cmd, tmp_file)
cmd = '%s "yum -y --disablerepo=* --enablerepo=zstack-local,qemu-kvm-ev clean all"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
cmd = '%s "yum -y clean all"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
#cmd = '%s "yum -y --disablerepo=* --enablerepo=zstack-local,qemu-kvm-ev update"' % ssh_cmd
#process_result = execute_shell_in_process(cmd, tmp_file)
def update_19_iso(vm_ip, tmp_file, iso_19_path, upgrade_script_path):
ssh_cmd = 'ssh -oStrictHostKeyChecking=no -oCheckHostIP=no -oUserKnownHostsFile=/dev/null %s' % vm_ip
vm_username = os.environ['imageUsername']
vm_password = os.environ['imagePassword']
#cmd = '%s "rm -f /opt/zstack_19.iso"' % ssh_cmd
#process_result = execute_shell_in_process(cmd, tmp_file)
ssh.scp_file(iso_19_path, '/opt/zstack_19.iso', vm_ip, vm_username, vm_password)
ssh.scp_file(upgrade_script_path, '/opt/zstack-upgrade', vm_ip, vm_username, vm_password)
cmd = '%s "mkdir -p /opt/zstack-dvd"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
cmd = '%s "bash /opt/zstack-upgrade -r /opt/zstack_19.iso"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
#cmd = '%s "zstack-ctl stop"' % ssh_cmd
#process_result = execute_shell_in_process(cmd, tmp_file)
cmd = '%s "yum -y --disablerepo=* --enablerepo=zstack-local,qemu-kvm-ev clean all"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
cmd = '%s "yum -y clean all"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
#cmd = '%s "yum -y --disablerepo=* --enablerepo=zstack-local,qemu-kvm-ev update"' % ssh_cmd
#process_result = execute_shell_in_process(cmd, tmp_file)
def update_10_iso(vm_ip, tmp_file, iso_10_path, upgrade_script_path):
ssh_cmd = 'ssh -oStrictHostKeyChecking=no -oCheckHostIP=no -oUserKnownHostsFile=/dev/null %s' % vm_ip
vm_username = os.environ['imageUsername']
vm_password = os.environ['imagePassword']
#cmd = '%s "rm -f /opt/zstack_10.iso"' % ssh_cmd
#process_result = execute_shell_in_process(cmd, tmp_file)
ssh.scp_file(iso_10_path, '/opt/zstack_10.iso', vm_ip, vm_username, vm_password)
ssh.scp_file(upgrade_script_path, '/opt/zstack-upgrade', vm_ip, vm_username, vm_password)
cmd = '%s "mkdir -p /opt/zstack-dvd"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
cmd = '%s "bash /opt/zstack-upgrade -r /opt/zstack_10.iso"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
#cmd = '%s "zstack-ctl stop"' % ssh_cmd
#process_result = execute_shell_in_process(cmd, tmp_file)
cmd = '%s "yum -y --disablerepo=* --enablerepo=zstack-local,qemu-kvm-ev clean all"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
cmd = '%s "yum -y clean all"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
#cmd = '%s "yum -y --disablerepo=* --enablerepo=zstack-local,qemu-kvm-ev update"' % ssh_cmd
#process_result = execute_shell_in_process(cmd, tmp_file)
def update_20_iso(vm_ip, tmp_file, iso_20_path, upgrade_script_path):
ssh_cmd = 'ssh -oStrictHostKeyChecking=no -oCheckHostIP=no -oUserKnownHostsFile=/dev/null %s' % vm_ip
vm_username = os.environ['imageUsername']
vm_password = os.environ['imagePassword']
#cmd = '%s "rm -f /opt/zstack_20.iso"' % ssh_cmd
#process_result = execute_shell_in_process(cmd, tmp_file)
ssh.scp_file(iso_20_path, '/opt/zstack_20.iso', vm_ip, vm_username, vm_password)
ssh.scp_file(upgrade_script_path, '/opt/zstack-upgrade', vm_ip, vm_username, vm_password)
cmd = '%s "mkdir -p /opt/zstack-dvd"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
cmd = '%s "bash /opt/zstack-upgrade -r /opt/zstack_20.iso"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
#cmd = '%s "zstack-ctl stop"' % ssh_cmd
#process_result = execute_shell_in_process(cmd, tmp_file)
cmd = '%s "yum -y --disablerepo=* --enablerepo=zstack-local,qemu-kvm-ev clean all"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
cmd = '%s "yum -y clean all"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
#cmd = '%s "yum -y --disablerepo=* --enablerepo=zstack-local,qemu-kvm-ev update"' % ssh_cmd
#process_result = execute_shell_in_process(cmd, tmp_file)
def update_21_iso(vm_ip, tmp_file, iso_21_path, upgrade_script_path):
ssh_cmd = 'ssh -oStrictHostKeyChecking=no -oCheckHostIP=no -oUserKnownHostsFile=/dev/null %s' % vm_ip
vm_username = os.environ['imageUsername']
vm_password = os.environ['imagePassword']
#cmd = '%s "rm -f /opt/zstack_20.iso"' % ssh_cmd
#process_result = execute_shell_in_process(cmd, tmp_file)
ssh.scp_file(iso_21_path, '/opt/zstack_21.iso', vm_ip, vm_username, vm_password)
ssh.scp_file(upgrade_script_path, '/opt/zstack-upgrade', vm_ip, vm_username, vm_password)
cmd = '%s "mkdir -p /opt/zstack-dvd"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
cmd = '%s "bash /opt/zstack-upgrade -r /opt/zstack_21.iso"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
#cmd = '%s "zstack-ctl stop"' % ssh_cmd
#process_result = execute_shell_in_process(cmd, tmp_file)
cmd = '%s "yum -y --disablerepo=* --enablerepo=zstack-local,qemu-kvm-ev clean all"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
cmd = '%s "yum -y clean all"' % ssh_cmd
process_result = execute_shell_in_process(cmd, tmp_file)
#cmd = '%s "yum -y --disablerepo=* --enablerepo=zstack-local,qemu-kvm-ev update"' % ssh_cmd
#process_result = execute_shell_in_process(cmd, tmp_file)
def prepare_mevoco_test_env(vm_inv):
    all_in_one_pkg = os.environ['zstackPkg']
    # resolve vm_ip before it is used by scp_file_to_vm
    vm_ip = vm_inv.vmNics[0].ip
    scp_file_to_vm(vm_ip, all_in_one_pkg, '/root/zizhu.bin')
    ssh.make_ssh_no_password(vm_ip, test_lib.lib_get_vm_username(vm_inv), \
            test_lib.lib_get_vm_password(vm_inv))
def prepare_test_env(vm_inv, aio_target):
zstack_install_script = os.environ['zstackInstallScript']
target_file = '/root/zstack_installer.sh'
vm_ip = vm_inv.vmNics[0].ip
vm_username = test_lib.lib_get_vm_username(vm_inv)
vm_password = test_lib.lib_get_vm_password(vm_inv)
scp_file_to_vm(vm_ip, zstack_install_script, target_file)
all_in_one_pkg = os.environ['zstackPkg']
scp_file_to_vm(vm_ip, all_in_one_pkg, aio_target)
ssh.make_ssh_no_password(vm_ip, vm_username, vm_password)
def prepare_upgrade_test_env(vm_inv, aio_target, upgrade_pkg):
zstack_install_script = os.environ['zstackInstallScript']
target_file = '/root/zstack_installer.sh'
vm_ip = vm_inv.vmNics[0].ip
vm_username = test_lib.lib_get_vm_username(vm_inv)
vm_password = test_lib.lib_get_vm_password(vm_inv)
scp_file_to_vm(vm_ip, zstack_install_script, target_file)
scp_file_to_vm(vm_ip, upgrade_pkg, aio_target)
ssh.make_ssh_no_password(vm_ip, vm_username, vm_password)
def prepare_yum_repo(vm_inv):
origin_file = '/etc/yum.repos.d/epel.repo'
target_file = '/etc/yum.repos.d/epel.repo'
vm_ip = vm_inv.vmNics[0].ip
vm_username = test_lib.lib_get_vm_username(vm_inv)
vm_password = test_lib.lib_get_vm_password(vm_inv)
scp_file_to_vm(vm_ip, origin_file, target_file)
ssh.make_ssh_no_password(vm_ip, vm_username, vm_password)
def upgrade_zstack(vm_ip, target_file, tmp_file):
ssh_cmd = 'ssh -oStrictHostKeyChecking=no -oCheckHostIP=no -oUserKnownHostsFile=/dev/null %s' % vm_ip
vm_username = os.environ['imageUsername']
vm_password = os.environ['imagePassword']
ssh.scp_file(target_file, '/opt/zstack_installer', vm_ip, vm_username, vm_password)
env_var = "WEBSITE='%s'" % 'localhost'
# cmd = '%s "%s bash %s -u -R aliyun"' % (ssh_cmd, env_var, target_file)
cmd = '%s "%s bash %s -u -o"' % (ssh_cmd, env_var, '/opt/zstack_installer')
process_result = execute_shell_in_process(cmd, tmp_file)
if process_result != 0:
cmd = '%s "cat /tmp/zstack_installation.log"' % ssh_cmd
execute_shell_in_process(cmd, tmp_file)
if 'no management-node-ready message received within' in open(tmp_file).read():
times = 30
cmd = '%s "zstack-ctl status"' % ssh_cmd
while (times > 0):
time.sleep(10)
process_result = execute_shell_in_process(cmd, tmp_file, 10, True)
                times -= 1
                if process_result == 0:
                    test_util.test_logger("management node started after extra %d seconds" % ((30 - times) * 10))
return 0
test_util.test_logger("mn node is still not started up, wait for another 10 seconds...")
else:
test_util.test_fail('zstack upgrade failed')
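The install helpers above and below all repeat the same management-node wait loop: when the installer exits non-zero with "no management-node-ready message received within" in its log, poll `zstack-ctl status` every 10 seconds for up to 30 attempts. A minimal sketch of that polling pattern (illustrative only; `wait_until_ready` is not part of the original test library):

```python
import time

def wait_until_ready(poll, retries=30, interval=10):
    """Poll until poll() returns 0 (success, mirroring execute_shell_in_process).

    Returns the attempt number that succeeded, or -1 when the retry
    budget is spent.
    """
    for attempt in range(1, retries + 1):
        time.sleep(interval)
        if poll() == 0:
            return attempt
    return -1
```

With such a helper, the extra wait time logged in the loops above is simply `attempt * interval` seconds, and the decrement/precedence pitfalls of the hand-rolled loops disappear.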
def execute_mevoco_aliyun_install(ssh_cmd, tmp_file):
target_file = '/root/zizhu.bin'
env_var = "ZSTACK_ALL_IN_ONE='%s' WEBSITE='%s'" % \
(target_file, 'localhost')
cmd = '%s "%s bash /root/zizhu.bin -R aliyun -m"' % (ssh_cmd, env_var)
process_result = execute_shell_in_process(cmd, tmp_file)
if process_result != 0:
cmd = '%s "cat /tmp/zstack_installation.log"' % ssh_cmd
execute_shell_in_process(cmd, tmp_file)
if 'no management-node-ready message received within' in open(tmp_file).read():
times = 30
cmd = '%s "zstack-ctl status"' % ssh_cmd
while (times > 0):
time.sleep(10)
process_result = execute_shell_in_process(cmd, tmp_file, 10, True)
                times -= 1
                if process_result == 0:
                    test_util.test_logger("management node started after extra %d seconds" % ((30 - times) * 10))
return 0
test_util.test_logger("mn node is still not started up, wait for another 10 seconds...")
else:
test_util.test_fail('zstack installation failed')
def execute_mevoco_online_install(ssh_cmd, tmp_file):
target_file = '/root/zizhu.bin'
env_var = "ZSTACK_ALL_IN_ONE='%s' WEBSITE='%s'" % \
(target_file, 'localhost')
cmd = '%s "%s bash /root/zizhu.bin -m"' % (ssh_cmd, env_var)
process_result = execute_shell_in_process(cmd, tmp_file)
if process_result != 0:
cmd = '%s "cat /tmp/zstack_installation.log"' % ssh_cmd
execute_shell_in_process(cmd, tmp_file)
if 'no management-node-ready message received within' in open(tmp_file).read():
times = 30
cmd = '%s "zstack-ctl status"' % ssh_cmd
while (times > 0):
time.sleep(10)
process_result = execute_shell_in_process(cmd, tmp_file, 10, True)
                times -= 1
                if process_result == 0:
                    test_util.test_logger("management node started after extra %d seconds" % ((30 - times) * 10))
return 0
test_util.test_logger("mn node is still not started up, wait for another 10 seconds...")
else:
test_util.test_fail('zstack installation failed')
def execute_install_with_args(ssh_cmd, args, target_file, tmp_file):
env_var = " WEBSITE='%s'" % ('localhost')
cmd = '%s "%s bash %s %s"' % (ssh_cmd, env_var, target_file, args)
process_result = execute_shell_in_process(cmd, tmp_file, 2400)
if process_result != 0:
cmd = '%s "cat /tmp/zstack_installation.log"' % ssh_cmd
execute_shell_in_process(cmd, tmp_file)
if 'no management-node-ready message received within' in open(tmp_file).read():
times = 30
cmd = '%s "zstack-ctl status"' % ssh_cmd
while (times > 0):
time.sleep(10)
process_result = execute_shell_in_process(cmd, tmp_file, 10, True)
                times -= 1
                if process_result == 0:
                    test_util.test_logger("management node started after extra %d seconds" % ((30 - times) * 10))
return 0
test_util.test_logger("mn node is still not started up, wait for another 10 seconds...")
else:
test_util.test_fail('zstack installation failed')
def execute_all_install(ssh_cmd, target_file, tmp_file):
env_var = " WEBSITE='%s'" % ('localhost')
# cmd = '%s "%s bash %s -R aliyun"' % (ssh_cmd, env_var, target_file)
cmd = '%s "%s bash %s -o"' % (ssh_cmd, env_var, target_file)
process_result = execute_shell_in_process(cmd, tmp_file, 2400)
if process_result != 0:
cmd = '%s "cat /tmp/zstack_installation.log"' % ssh_cmd
execute_shell_in_process(cmd, tmp_file)
if 'no management-node-ready message received within' in open(tmp_file).read():
times = 30
cmd = '%s "zstack-ctl status"' % ssh_cmd
while (times > 0):
time.sleep(10)
process_result = execute_shell_in_process(cmd, tmp_file, 10, True)
                times -= 1
                if process_result == 0:
                    test_util.test_logger("management node started after extra %d seconds" % ((30 - times) * 10))
return 0
test_util.test_logger("mn node is still not started up, wait for another 10 seconds...")
else:
test_util.test_fail('zstack installation failed')
def only_install_zstack(ssh_cmd, target_file, tmp_file):
env_var = "WEBSITE='%s'" % 'localhost'
cmd = '%s "%s bash %s -d -i"' % (ssh_cmd, env_var, target_file)
process_result = execute_shell_in_process(cmd, tmp_file)
if process_result != 0:
cmd = '%s "cat /tmp/zstack_installation.log"' % ssh_cmd
execute_shell_in_process(cmd, tmp_file)
test_util.test_fail('zstack installation failed')
def check_installation(vm_ip, tmp_file):
# cmd = '%s "/usr/bin/zstack-cli LogInByAccount accountName=admin password=password"' % ssh_cmd
# process_result = execute_shell_in_process(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('zstack-cli login failed')
vm_username = os.environ['imageUsername']
vm_password = os.environ['imagePassword']
bs_option = test_util.BackupStorageOption()
bs_option.name = 'bs1'
bs_option.description = 'bs'
bs_option.hostname = vm_ip
bs_option.url = '/home/bs'
bs_option.username = vm_username
bs_option.password = vm_password
bs_option.sshPort = '22'
bs = scen_ops.create_sftp_backup_storage(vm_ip, bs_option)
scen_ops.delete_backup_storage(vm_ip, bs.uuid)
# vm_passwd = test_lib.lib_get_vm_password(vm_inv)
# vm_ip = vm_ip = vm_inv.vmNics[0].ip
# cmd = '%s "/usr/bin/zstack-cli AddSftpBackupStorage name=bs1 description=bs hostname=%s username=root password=%s url=/home/bs"' % (ssh_cmd, vm_ip, vm_passwd)
# process_result = execute_shell_in_process(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('zstack-cli create Backup Storage failed')
#
# cmd = '%s "/usr/bin/zstack-cli QuerySftpBackupStorage name=bs1"' % ssh_cmd
# process_result = execute_shell_in_process(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('zstack-cli Query Backup Storage failed')
# cmd = '%s "/usr/bin/zstack-cli QuerySftpBackupStorage name=bs1 fields=uuid" | grep uuid | awk \'{print $2}\'' % ssh_cmd
# (process_result, bs_uuid) = execute_shell_in_process_stdout(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('zstack-cli Query Backup Storage failed')
#
# cmd = '%s "/usr/bin/zstack-cli DeleteBackupStorage uuid=%s"' % (ssh_cmd, bs_uuid.split('"')[1])
# process_result = execute_shell_in_process(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('zstack-cli Delete Backup Storage failed')
# check zone
# zone = scen_ops.create_zone(zstack_management_ip, zone_option)
# cmd = '%s "/usr/bin/zstack-cli CreateZone name=ZONE1"' % ssh_cmd
# process_result = execute_shell_in_process(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('zstack-cli Create Zone failed')
#
# cmd = '%s "/usr/bin/zstack-cli QueryZone name=ZONE1"' % ssh_cmd
# process_result = execute_shell_in_process(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('zstack-cli Query Zone failed')
# cmd = '%s "/usr/bin/zstack-cli QueryZone name=ZONE1 fields=uuid" | grep uuid | awk \'{print $2}\'' % ssh_cmd
# (process_result, zone_uuid) = execute_shell_in_process_stdout(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('zstack-cli Query Zone failed')
#
# cmd = '%s "/usr/bin/zstack-cli DeleteZone uuid=%s"' % (ssh_cmd, zone_uuid.split('"')[1])
# process_result = execute_shell_in_process(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('zstack-cli Delete Zone failed')
# check item
# cmd = '%s "/usr/bin/zstack-ctl status" | grep \'^version\' | awk \'{print $2}\'' % ssh_cmd
# (process_result, version_info) = execute_shell_in_process_stdout(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('zstack-ctl get version failed')
# if '1.3' in version_info or '1.2' in version_info:
# cmd = '%s "/usr/bin/zstack-ctl status" | grep \'^status\' | awk \'{print $2}\'' % ssh_cmd
# (process_result, status_info) = execute_shell_in_process_stdout(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('zstack-ctl get status failed')
# if not 'Running' in status_info:
# test_util.test_dsc('zstack is not running, try to start zstack')
# cmd = '%s "/usr/bin/zstack-ctl start"' % ssh_cmd
# process_result = process_result = execute_shell_in_process(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('zstack-ctl start failed')
# time.sleep(5)
# cmd = '%s "/usr/bin/zstack-ctl status" | grep \'^status\' | awk \'{print $2}\'' % ssh_cmd
# (process_result, status_info) = execute_shell_in_process_stdout(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('zstack-ctl get status failed')
# if not 'Running' in status_info:
# test_util.test_fail('zstack is not running, start zstack failed')
# test_util.test_dsc('check zstack status, zstack is running')
# else:
# cmd = '%s "/usr/bin/zstack-ctl status" | grep \'^MN status\' | awk \'{print $3}\'' % ssh_cmd
# (process_result, mn_status) = execute_shell_in_process_stdout(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('zstack-ctl get MN status failed')
# if not 'Running' in mn_status:
# test_util.test_dsc('management node is not running, try to start management node')
# cmd = '%s "/usr/bin/zstack-ctl start_node"' % ssh_cmd
# process_result = process_result = execute_shell_in_process(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('zstack-ctl start_node failed')
# time.sleep(5)
# cmd = '%s "/usr/bin/zstack-ctl status" | grep \'^MN status\' | awk \'{print $3}\'' % ssh_cmd
# (process_result, mn_status) = execute_shell_in_process_stdout(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('zstack-ctl get MN status failed')
# if not 'Running' in mn_status:
# test_util.test_fail('management node is not running, start management node failed')
# test_util.test_dsc('check MN, MN is running')
# cmd = '%s "/usr/bin/zstack-ctl status" | grep \'^UI status\' | awk \'{print $3}\'' % ssh_cmd
# (process_result, ui_status) = execute_shell_in_process_stdout(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('zstack-ctl get UI status failed')
# if not 'Running' in ui_status:
# test_util.test_dsc('UI is not running, try to start UI')
# cmd = '%s "/usr/bin/zstack-ctl start_ui"' % ssh_cmd
# process_result = process_result = execute_shell_in_process(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('zstack-ctl start_ui failed')
# time.sleep(5)
# cmd = '%s "/usr/bin/zstack-ctl status" | grep \'^MN status\' | awk \'{print $3}\'' % ssh_cmd
# (process_result, mn_status) = execute_shell_in_process_stdout(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('zstack-ctl get MN status failed')
# if not 'Running' in mn_status:
# test_util.test_fail('UI is not running, start UI failed')
# test_util.test_dsc('check UI, UI is running')
#
# cmd = '%s "/usr/bin/zstack-ctl status" | grep ^ZSTACK_HOME | awk \'{print $2}\'' % ssh_cmd
# (process_result, zstack_home) = execute_shell_in_process_stdout(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('zstack-ctl status get ZSTACK_HOME failed')
# zstack_home = zstack_home[:-1]
# cmd = '%s "[ -d " %s " ] && echo yes || echo no" ' % (ssh_cmd, zstack_home)
# (process_result, dir_exist) = execute_shell_in_process_stdout(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('check ZSTACK_HOME failed')
# dir_exist = dir_exist[:-1]
# if dir_exist == 'no':
# test_util.test_fail('there is no ZSTACK_HOME')
#
# cmd = '%s "/usr/bin/zstack-ctl status" | grep ^zstack.properties | awk \'{print $2}\'' % ssh_cmd
# (process_result, properties_file) = execute_shell_in_process_stdout(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('zstack-ctl status get zstack.properties failed')
# properties_file = properties_file[:-1]
# cmd = '%s "[ -f " %s " ] && echo yes || echo no" ' % (ssh_cmd, properties_file)
# (process_result, file_exist) = execute_shell_in_process_stdout(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('check zstack.properties failed')
# file_exist = file_exist[:-1]
# if file_exist == 'no':
# test_util.test_fail('there is no zstack.properties')
#
# cmd = '%s "/usr/bin/zstack-ctl status" | grep ^log4j2.xml | awk \'{print $2}\'' % ssh_cmd
# (process_result, properties_file) = execute_shell_in_process_stdout(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('zstack-ctl status get log4j2.xml failed')
# properties_file = properties_file[:-1]
# cmd = '%s "[ -f " %s " ] && echo yes || echo no" ' % (ssh_cmd, properties_file)
# (process_result, file_exist) = execute_shell_in_process_stdout(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('check log4j2.xml failed')
# file_exist = file_exist[:-1]
# if file_exist == 'no':
# test_util.test_fail('there is no log4j2.xml')
#
# cmd = '%s "/usr/bin/zstack-ctl status" | grep ^\'PID file\' | awk \'{print $3}\'' % ssh_cmd
# (process_result, properties_file) = execute_shell_in_process_stdout(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('zstack-ctl status get PID file failed')
# properties_file = properties_file[:-1]
# cmd = '%s "[ -f " %s " ] && echo yes || echo no" ' % (ssh_cmd, properties_file)
# (process_result, file_exist) = execute_shell_in_process_stdout(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('check PID file failed')
# file_exist = file_exist[:-1]
# if file_exist == 'no':
# test_util.test_fail('there is no PID file')
#
# cmd = '%s "/usr/bin/zstack-ctl status" | grep ^\'log file\' | awk \'{print $3}\'' % ssh_cmd
# (process_result, properties_file) = execute_shell_in_process_stdout(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('zstack-ctl status get log file failed')
# properties_file = properties_file[:-1]
# cmd = '%s "[ -f " %s " ] && echo yes || echo no" ' % (ssh_cmd, properties_file)
# (process_result, file_exist) = execute_shell_in_process_stdout(cmd, tmp_file)
# if process_result != 0:
# test_util.test_fail('check log file failed')
# file_exist = file_exist[:-1]
# if file_exist == 'no':
# test_util.test_fail('there is no log file')
def create_sftp_backup_storage(vm_ip, tmp_file):
vm_username = os.environ['imageUsername']
vm_password = os.environ['imagePassword']
bs_option = test_util.BackupStorageOption()
bs_option.name = 'bs1'
bs_option.description = 'bs'
bs_option.hostname = vm_ip
bs_option.url = '/home/bs'
bs_option.username = vm_username
bs_option.password = vm_password
bs_option.sshPort = '22'
bs = scen_ops.create_sftp_backup_storage(vm_ip, bs_option)
scen_ops.reconnect_backup_storage(vm_ip, bs.uuid)
def reconnect_backup_storage(vm_ip, tmp_file):
bs = scen_ops.query_backup_storage(vm_ip, tmp_file)
scen_ops.reconnect_backup_storage(vm_ip, bs.uuid)
def delete_backup_storage(vm_ip, tmp_file):
bs = scen_ops.query_backup_storage(vm_ip, tmp_file)
scen_ops.delete_backup_storage(vm_ip, bs.uuid)
def check_zstack_version(vm_ip, tmp_file, pkg_version):
ssh_cmd = 'ssh -oStrictHostKeyChecking=no -oCheckHostIP=no -oUserKnownHostsFile=/dev/null %s' % vm_ip
cmd = '%s "/usr/bin/zstack-ctl status" | grep ^version | awk \'{print $2}\'' % ssh_cmd
(process_result, version) = execute_shell_in_process_stdout(cmd, tmp_file)
if process_result != 0:
test_util.test_fail('zstack-ctl get version failed')
version = version[:-1]
test_util.test_dsc("current version number: %s" % version)
if version != pkg_version:
test_util.test_fail('try to install zstack-%s, but current version is zstack-%s' % (pkg_version, version))
def check_zstack_or_mevoco(vm_ip, tmp_file, zom):
ssh_cmd = 'ssh -oStrictHostKeyChecking=no -oCheckHostIP=no -oUserKnownHostsFile=/dev/null %s' % vm_ip
cmd = '%s "/usr/bin/zstack-ctl status" | grep ^version | awk \'{print $3}\'' % ssh_cmd
(process_result, version) = execute_shell_in_process_stdout(cmd, tmp_file)
if process_result != 0:
test_util.test_fail('zstack-ctl get version failed')
version = version[1:-1]
test_util.test_dsc("current version: %s" % version)
if version != zom:
test_util.test_fail('try to install %s, but current version is %s' % (zom, version))
# deeppavlov/agents/insults/insults_agents.py (repo: deepmipt/kpi2017, license: Apache-2.0)
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import copy
from parlai.core.agents import Agent
from . import config
from .model import InsultsModel
from .utils import create_vectorizer_selector, get_vectorizer_selector
from .embeddings_dict import EmbeddingsDict
class EnsembleInsultsAgent(Agent):
"""EnsembleInsultsAgent
Class that gets the observations from teacher and
trains several models, gives weighted predictions.
Attributes:
id: agent name
episode_done: flag is episode done
is_shared: flag is parallel computations
word_dict: dictionary of words
num_ngrams: number of considered ngrams (for sklearn models)
models: list of chosen models to ensemble
        model_coefs: list of coefficients to sum models predictions with
n_examples: number of samples
observation: gathered text observations (samples)
"""
@staticmethod
def add_cmdline_args(argparser):
"""Add arguments from command line."""
config.add_cmdline_args(argparser)
ensemble = argparser.add_argument_group('Ensemble parameters')
ensemble.add_argument('--model_files', type=str, default=None, nargs='+',
help='list of all the model files for the ensemble')
ensemble.add_argument('--model_names', type=str, default=None, nargs='+',
help='list of all the model names for the ensemble')
ensemble.add_argument('--model_coefs', type=str, default=None, nargs='+',
help='list of all the model coefs for the ensemble')
def __init__(self, opt, shared=None):
"""Initialize the class according to the given parameters in opt."""
self.id = 'InsultsAgent'
self.episode_done = True
super().__init__(opt, shared)
if shared is not None:
self.is_shared = True
return
# Set up params/logging/dicts
self.is_shared = False
self.models = []
for i, model_name in enumerate(opt.get('model_names', [])):
print('Model:', model_name)
model_file = opt.get('model_files', [])[i]
opt['pretrained_model'] = model_file
print('Model file:', model_file)
if model_name == 'cnn_word' or model_name == 'lstm_word':
self.word_dict = None
embedding_dict = EmbeddingsDict(opt, opt.get('embedding_dim'))
self.num_ngrams = None
if model_name == 'log_reg' or model_name == 'svc':
self.word_dict = None
embedding_dict = None
self.num_ngrams = 6
self.models.append(InsultsModel(model_name, self.word_dict, embedding_dict, opt))
if model_name == 'log_reg' or model_name == 'svc':
print('Reading vectorizers and selectors')
self.models[i].vectorizers, self.models[i].selectors = get_vectorizer_selector(model_file,
self.num_ngrams)
self.model_coefs = [float(coef) for coef in opt.get('model_coefs', [])]
print('model coefs:', self.model_coefs)
self.n_examples = 0
def observe(self, observation):
"""Gather obtained observation (sample) with previous observations."""
observation = copy.deepcopy(observation)
if not self.episode_done:
# if the last example wasn't the end of an episode, then we need to
# recall what was said in that example
prev_dialogue = self.observation['text']
observation['text'] = prev_dialogue + '\n' + observation['text']
self.observation = observation
self.episode_done = observation['episode_done']
return observation
def _predictions2text(self, predictions):
"""Convert float predictions to text labels."""
y = ['Insult' if ex > 0.5 else 'Non-insult' for ex in predictions]
return y
def _text2predictions(self, predictions):
"""Convert text labels to integer class labels."""
y = [1 if ex == 'Insult' else 0 for ex in predictions]
return y
def _build_ex(self, ex):
"""Find the token span of the answer in the context for this example."""
if 'text' not in ex:
return
inputs = dict()
inputs['question'] = ex['text']
if 'labels' in ex:
inputs['labels'] = ex['labels']
return inputs
def act(self):
"""Call batch act with batch of one sample."""
return self.batch_act([self.observation])[0]
def batch_act(self, observations):
"""Train model or predict for given batch of observations."""
if self.is_shared:
raise RuntimeError("Parallel act is not supported.")
batch_size = len(observations)
# initialize a table of replies with this agent's id
batch_reply = [{'id': self.getID()} for _ in range(batch_size)]
predictions = [[] for _ in range(batch_size)]
for j, model in enumerate(self.models):
examples = [model._build_ex(obs) for obs in observations]
valid_inds = [i for i in range(batch_size) if examples[i] is not None]
examples = [ex for ex in examples if ex is not None]
batch = model._batchify(examples, self.word_dict)
prediction = model.predict(batch)
for i in range(len(prediction)):
predictions[valid_inds[i]].append(prediction[i])
for i in range(batch_size):
if len(predictions[i]):
prediction = self.weighted_sum(predictions[i])
batch_reply[i]['text'] = self._predictions2text([prediction])[0]
batch_reply[i]['score'] = prediction
return batch_reply
def weighted_sum(self, predictions):
"""Train model or predict for given batch of observations."""
result = 0
for j in range(len(predictions)):
result += self.model_coefs[j] * predictions[j]
result = result / sum(self.model_coefs)
return result
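The ensemble score computed by `weighted_sum` above is a coefficient-weighted mean of the per-model predictions. As a standalone sketch (a hypothetical helper, not part of the agent API):

```python
def weighted_average(scores, coefs):
    """Coefficient-weighted mean of per-model scores, as in weighted_sum above."""
    if len(scores) != len(coefs):
        raise ValueError("one coefficient per model score is required")
    return sum(c * s for c, s in zip(coefs, scores)) / sum(coefs)
```

For example, two models scoring 0.9 and 0.3 with coefficients 2.0 and 1.0 combine to (1.8 + 0.3) / 3.0, i.e. 0.7.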
class BoostEnsembleInsultsAgent(Agent):
"""BoostEnsembleInsultsAgent
Class that gets the observations from teacher and
trains several models, gives weighted predictions.
Attributes:
id: agent name
episode_done: flag is episode done
is_shared: flag is parallel computations
word_dict: dictionary of words
num_ngrams: number of considered ngrams (for sklearn models)
models: list of chosen models to ensemble
        model_coefs: list of coefficients to sum models predictions with
n_examples: number of samples
observation: gathered text observations (samples)
"""
@staticmethod
def add_cmdline_args(argparser):
"""Add arguments from command line."""
config.add_cmdline_args(argparser)
ensemble = argparser.add_argument_group('Ensemble parameters')
ensemble.add_argument('--model_files', type=str, default=None, nargs='+',
help='list of all the model files for the ensemble')
ensemble.add_argument('--model_names', type=str, default=None, nargs='+',
help='list of all the model names for the ensemble')
ensemble.add_argument('--model_coefs', type=str, default=None, nargs='+',
help='list of all the model coefs for the ensemble')
def __init__(self, opt, shared=None):
"""Initialize the class according to the given parameters in opt."""
self.id = 'InsultsAgent'
self.episode_done = True
super().__init__(opt, shared)
if shared is not None:
self.is_shared = True
return
# Set up params/logging/dicts
self.is_shared = False
self.models = []
for i, model_name in enumerate(opt.get('model_names', [])):
print('Model:', model_name)
model_file = opt.get('model_files', [])[i]
opt['pretrained_model'] = model_file
print('Model file:', model_file)
if model_name == 'cnn_word' or model_name == 'lstm_word':
self.word_dict = None
embedding_dict = EmbeddingsDict(opt, opt.get('embedding_dim'))
self.num_ngrams = None
if model_name == 'log_reg' or model_name == 'svc':
self.word_dict = None
embedding_dict = None
self.num_ngrams = 6
self.models.append(InsultsModel(model_name, self.word_dict, embedding_dict, opt))
if model_name == 'log_reg' or model_name == 'svc':
print('Reading vectorizers and selectors')
self.models[i].vectorizers, self.models[i].selectors = get_vectorizer_selector(model_file,
self.num_ngrams)
self.model_coefs = [float(coef) for coef in opt.get('model_coefs', [])]
print('model coefs:', self.model_coefs)
self.n_examples = 0
def observe(self, observation):
"""Gather obtained observation (sample) with previous observations."""
observation = copy.deepcopy(observation)
if not self.episode_done:
# if the last example wasn't the end of an episode, then we need to
# recall what was said in that example
prev_dialogue = self.observation['text']
observation['text'] = prev_dialogue + '\n' + observation['text']
self.observation = observation
self.episode_done = observation['episode_done']
return observation
def _predictions2text(self, predictions):
"""Convert float predictions to text labels."""
y = ['Insult' if ex > 0.5 else 'Non-insult' for ex in predictions]
return y
def _text2predictions(self, predictions):
"""Convert text labels to integer class labels."""
y = [1 if ex == 'Insult' else 0 for ex in predictions]
return y
def _build_ex(self, ex):
"""Find the token span of the answer in the context for this example."""
if 'text' not in ex:
return
inputs = dict()
inputs['question'] = ex['text']
if 'labels' in ex:
inputs['labels'] = ex['labels']
return inputs
def act(self):
"""Call batch act with batch of one sample."""
return self.batch_act([self.observation])[0]
def batch_act(self, observations):
"""Train model or predict for given batch of observations."""
if self.is_shared:
raise RuntimeError("Parallel act is not supported.")
batch_size = len(observations)
# initialize a table of replies with this agent's id
batch_reply = [{'id': self.getID()} for _ in range(batch_size)]
predictions = [[] for _ in range(batch_size)]
for j, model in enumerate(self.models):
examples = [model._build_ex(obs) for obs in observations]
valid_inds = [i for i in range(batch_size) if examples[i] is not None]
examples = [ex for ex in examples if ex is not None]
batch = model._batchify(examples, self.word_dict)
prediction = model.predict(batch)
for i in range(len(prediction)):
predictions[valid_inds[i]].append(prediction[i])
for i in range(batch_size):
if len(predictions[i]):
prediction = self.weighted_sum(predictions[i])
batch_reply[i]['text'] = self._predictions2text([prediction])[0]
batch_reply[i]['score'] = prediction
return batch_reply
def weighted_sum(self, predictions):
"""Sum predictions by chosen models with given coefficients."""
result = 0
for j in range(len(predictions)):
result += self.model_coefs[j] * predictions[j]
result = result / sum(self.model_coefs)
return result
class InsultsAgent(Agent):
"""insultsAgent
Class that gets the observations from teacher and
trains model, gives predictions.
Attibutes:
id: agent name
episode_done: flag is episode done
is_shared: flag is parallel computations
model_name: name of chosen model to fit
word_dict: dictionary of words
num_ngrams: number of considered ngrams (for sklearn models)
model: chosen model to fit
n_examples: number of samples
observation: gathered text observations (samples)
"""
@staticmethod
def add_cmdline_args(argparser):
"""Add arguments from command line."""
config.add_cmdline_args(argparser)
def __init__(self, opt, shared=None):
"""Initialize the class according to given parameters from opt."""
self.id = 'InsultsAgent'
self.episode_done = True
super().__init__(opt, shared)
if shared is not None:
self.is_shared = True
return
# Set up params/logging/dicts
self.is_shared = False
self.model_name = opt['model_name']
if self.model_name == 'cnn_word' or self.model_name == 'lstm_word':
self.word_dict = None
embedding_dict = EmbeddingsDict(opt, opt.get('embedding_dim'))
self.num_ngrams = None
if self.model_name == 'log_reg' or self.model_name == 'svc':
self.word_dict = None
embedding_dict = None
self.num_ngrams = 6
print('create model', self.model_name)
self.model = InsultsModel(self.model_name, self.word_dict, embedding_dict, opt)
self.n_examples = 0
        if self.model.from_saved and self.model.model_type == 'ngrams':
            print('Reading vectorizers and selectors')
self.model.vectorizers, self.model.selectors = get_vectorizer_selector(self.opt['model_file'], self.num_ngrams)
def observe(self, observation):
"""Gather obtained observation (sample) with previous observations."""
        observation = copy.deepcopy(observation)
        if not self.episode_done:
            # if the last example wasn't the end of an episode, then we need to
            # recall what was said in that example
            prev_dialogue = self.observation['text']
            observation['text'] = prev_dialogue + '\n' + observation['text']
        self.observation = observation
        self.episode_done = observation['episode_done']
        return observation

    def act(self):
        """Call batch act with batch of one sample."""
        return self.batch_act([self.observation])[0]

    def batch_act(self, observations):
        """Train model or predict for given batch of observations."""
        if self.is_shared:
            raise RuntimeError("Parallel act is not supported.")

        batch_size = len(observations)
        # initialize a table of replies with this agent's id
        batch_reply = [{'id': self.getID()} for _ in range(batch_size)]

        examples = [self._build_ex(obs) for obs in observations]
        valid_inds = [i for i in range(batch_size) if examples[i] is not None]
        examples = [ex for ex in examples if ex is not None]

        if 'labels' in observations[0]:
            self.n_examples += len(examples)
            batch = self.model._batchify(examples)
            predictions = self.model.update(batch)
            predictions_text = self._predictions2text(predictions)
            for i in range(len(predictions)):
                batch_reply[valid_inds[i]]['text'] = predictions_text[i]
                batch_reply[valid_inds[i]]['score'] = predictions[i]
        else:
            batch = self.model._batchify(examples)
            predictions = self.model.predict(batch)
            predictions_text = self._predictions2text(predictions)
            for i in range(len(predictions)):
                batch_reply[valid_inds[i]]['text'] = predictions_text[i]
                batch_reply[valid_inds[i]]['score'] = predictions[i]
        return batch_reply

    def _build_ex(self, ex):
        """Build an example dict (question and optional labels) from an observation."""
        if 'text' not in ex:
            return
        inputs = dict()
        inputs['question'] = ex['text']
        if 'labels' in ex:
            inputs['labels'] = ex['labels']
        return inputs

    def _predictions2text(self, predictions):
        """Convert float predictions to text labels."""
        y = ['Insult' if ex > 0.5 else 'Non-insult' for ex in predictions]
        return y

    def _text2predictions(self, predictions):
        """Convert text labels to integer class labels."""
        y = [1 if ex == 'Insult' else 0 for ex in predictions]
        return y

    def report(self):
        """Return report."""
        report = dict()
        report['updates'] = self.model.updates
        report['n_examples'] = self.n_examples
        report['loss'] = self.model.train_loss
        report['accuracy'] = self.model.train_acc
        report['auc'] = self.model.train_auc
        return report

    def save(self):
        """Save trained model."""
        self.model.save()
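The label round-trip implemented by `_predictions2text` and `_text2predictions` can be sketched standalone. The 0.5 cutoff and the 'Insult'/'Non-insult' strings come from the methods above; the free functions below are illustrative, not part of the agent:

```python
# Standalone sketch of the label mapping used by the agent:
# scores strictly above 0.5 map to 'Insult', everything else to
# 'Non-insult', and text labels map back to integer classes 1/0.

def predictions2text(predictions):
    """Convert float scores to text labels (0.5 cutoff)."""
    return ['Insult' if p > 0.5 else 'Non-insult' for p in predictions]

def text2predictions(labels):
    """Convert text labels back to integer class labels."""
    return [1 if label == 'Insult' else 0 for label in labels]

scores = [0.9, 0.2, 0.5]
labels = predictions2text(scores)
print(labels)                    # ['Insult', 'Non-insult', 'Non-insult']
print(text2predictions(labels))  # [1, 0, 0]
```

Note that a score of exactly 0.5 falls on the 'Non-insult' side, since the comparison is strict.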


class OneEpochAgent(InsultsAgent):
    """OneEpochAgent

    Child class of InsultsAgent.
    Collects all the train data, vectorizes it, selects features, and
    trains n-gram models from sklearn such as SVC and LogisticRegression.

    Attributes:
        observation: the last observation
        observations_: all the train data
        model: model to fit
        opt: given parameters
    """

    def __init__(self, opt, shared=None):
        """Initialize the class."""
        super().__init__(opt, shared)
        self.observation = ''
        self.observations_ = []

    def batch_act(self, observations):
        """Collect train observations, do not train."""
        self.observations_ += observations
        if self.is_shared:
            raise RuntimeError("Parallel act is not supported.")

        batch_size = len(observations)
        # initialize a table of replies with this agent's id
        batch_reply = [{'id': self.getID()} for _ in range(batch_size)]

        examples = [self._build_ex(obs) for obs in observations]
        valid_inds = [i for i in range(batch_size) if examples[i] is not None]
        examples = [ex for ex in examples if ex is not None]

        if 'labels' in observations[0]:
            self.n_examples += len(examples)
        else:
            batch = self.model._batchify(examples)
            predictions = self.model.predict(batch).reshape(-1)
            predictions_text = self._predictions2text(predictions)
            for i in range(len(predictions)):
                batch_reply[valid_inds[i]]['text'] = predictions_text[i]
                batch_reply[valid_inds[i]]['score'] = predictions[i]
        return batch_reply

    def save(self):
        """Create features, train and save model."""
        if not self.is_shared:
            train_data = [observation['text'] for observation in self.observations_
                          if 'text' in observation.keys()]
            train_labels = self._text2predictions(
                [observation['labels'][0] for observation in self.observations_
                 if 'labels' in observation.keys()])
            self.model.num_ngrams = self.num_ngrams
            if not self.model.from_saved:
                print('Creating vectorizers and selectors')
                create_vectorizer_selector(
                    train_data, train_labels, self.opt['model_file'],
                    ngram_list=[1, 2, 3, 4, 5, 3],
                    max_num_features_list=[2000, 4000, 100, 1000, 1000, 2000],
                    analyzer_type_list=['word', 'word', 'word', 'char', 'char', 'char'])
            print('Reading vectorizers and selectors')
            self.model.vectorizers, self.model.selectors = get_vectorizer_selector(
                self.opt['model_file'], self.num_ngrams)
            print('Training model', self.model_name)
            self.model.update([train_data, train_labels])
            print('\n[model] trained loss = %.4f | acc = %.4f | auc = %.4f' %
                  (self.model.train_loss, self.model.train_acc, self.model.train_auc))
            self.model.save()
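The collect-then-featurize flow documented in `OneEpochAgent` (word and char n-grams, then feature selection) can be illustrated with a tiny pure-Python sketch. The function names and the frequency-based selection below are illustrative stand-ins for `create_vectorizer_selector`, which in the real code uses sklearn vectorizers and selectors:

```python
# Hypothetical sketch of the n-gram featurization idea: extract word or
# char n-grams over a corpus, then keep only the most frequent ones as
# the selected feature set.
from collections import Counter

def ngrams(units, n):
    """All contiguous n-grams over a sequence of tokens (or characters)."""
    return [tuple(units[i:i + n]) for i in range(len(units) - n + 1)]

def select_features(texts, n, analyzer, max_features):
    """Count n-grams over a corpus and keep the max_features most common."""
    counts = Counter()
    for text in texts:
        units = text.split() if analyzer == 'word' else list(text)
        counts.update(ngrams(units, n))
    return [gram for gram, _ in counts.most_common(max_features)]

corpus = ["you are great", "you are not great"]
print(select_features(corpus, 2, 'word', 2))
```

In the real agent this is done six times over, once per entry in `ngram_list` / `analyzer_type_list`, with `max_num_features_list` bounding each feature set.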
| 41.844622 | 149 | 0.612063 | 2,535 | 21,006 | 4.931755 | 0.113609 | 0.029515 | 0.011518 | 0.015358 | 0.824348 | 0.816189 | 0.806431 | 0.806431 | 0.799472 | 0.788594 | 0 | 0.005434 | 0.290441 | 21,006 | 501 | 150 | 41.928144 | 0.833345 | 0.215605 | 0 | 0.861199 | 0 | 0 | 0.094403 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.097792 | false | 0 | 0.018927 | 0 | 0.22082 | 0.044164 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
3f49451a3936b9ac81b6ce803aaccf193ad21e10 | 189 | py | Python | iris_sdk/models/data/calling_name.py | NumberAI/python-bandwidth-iris | 0e05f79d68b244812afb97e00fd65b3f46d00aa3 | [
"MIT"
] | 2 | 2020-04-13T13:47:59.000Z | 2022-02-23T20:32:41.000Z | iris_sdk/models/data/calling_name.py | bandwidthcom/python-bandwidth-iris | dbcb30569631395041b92917252d913166f7d3c9 | [
"MIT"
] | 5 | 2020-09-18T20:59:24.000Z | 2021-08-25T16:51:42.000Z | iris_sdk/models/data/calling_name.py | bandwidthcom/python-bandwidth-iris | dbcb30569631395041b92917252d913166f7d3c9 | [
"MIT"
] | 5 | 2018-12-12T14:39:50.000Z | 2020-11-17T21:42:29.000Z | #!/usr/bin/env python
from iris_sdk.models.base_resource import BaseData
from iris_sdk.models.maps.calling_name import CallingNameMap
class CallingName(CallingNameMap, BaseData):
pass | 27 | 60 | 0.825397 | 26 | 189 | 5.846154 | 0.730769 | 0.105263 | 0.144737 | 0.223684 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.100529 | 189 | 7 | 61 | 27 | 0.894118 | 0.10582 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.25 | 0.5 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 7 |
4500c03cb7a2e6cc31d847775f5bc92155017ab2 | 100 | py | Python | ips/ip/spi_slave_cpha0/__init__.py | zld012739/zldrepository | 5635b78a168956091676ef4dd99fa564be0e5ba0 | [
"MIT"
] | null | null | null | ips/ip/spi_slave_cpha0/__init__.py | zld012739/zldrepository | 5635b78a168956091676ef4dd99fa564be0e5ba0 | [
"MIT"
] | null | null | null | ips/ip/spi_slave_cpha0/__init__.py | zld012739/zldrepository | 5635b78a168956091676ef4dd99fa564be0e5ba0 | [
"MIT"
] | null | null | null | from spi_slave_cpha0_partial import get_ip_name
from spi_slave_cpha0_partial import SPI_SLAVE_CPHA0
| 33.333333 | 51 | 0.92 | 18 | 100 | 4.555556 | 0.5 | 0.292683 | 0.47561 | 0.414634 | 0.731707 | 0.731707 | 0 | 0 | 0 | 0 | 0 | 0.032609 | 0.08 | 100 | 2 | 52 | 50 | 0.858696 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 9 |
452387cfc5a999f1057b8214aa38d06fb0c44924 | 1,803 | py | Python | example.py | harmsm/gmx-template | 43e6f72b9948479d37c8f34da21166b4fb8b5823 | [
"MIT"
] | null | null | null | example.py | harmsm/gmx-template | 43e6f72b9948479d37c8f34da21166b4fb8b5823 | [
"MIT"
] | null | null | null | example.py | harmsm/gmx-template | 43e6f72b9948479d37c8f34da21166b4fb8b5823 | [
"MIT"
] | 3 | 2021-04-16T20:23:07.000Z | 2021-04-16T21:29:35.000Z | import setup_gromacs_md as sgm
print("*** Protein alone ***")
sgm.setup_run(output_dir="example-outputs/prot-alone",
              ff_source_dir="ff/charmm36-feb2021.ff/",
              mdp_files_dir="ff/mdp-files/",
              protein_pdb_file="example-inputs/tlr4-md2.pdb")

print("*** Custom ligand alone, not posed ***")
sgm.setup_run(output_dir="example-outputs/lig-alone",
              ff_source_dir="ff/charmm36-feb2021.ff/",
              mdp_files_dir="ff/mdp-files/",
              charmm_gui_gromacs_dir="ff/e-coli_l-i-o-o1/")

print("*** Custom ligand alone, posed ***")
sgm.setup_run(output_dir="example-outputs/lig-alone_posed",
              ff_source_dir="ff/charmm36-feb2021.ff/",
              mdp_files_dir="ff/mdp-files/",
              charmm_gui_gromacs_dir="ff/e-coli_l-i-o-o1/",
              ligand_pose_files="example-inputs/e-coli_l-i-o-o1_pose1.pdb")

print("*** Protein with single pose of custom ligand ***")
sgm.setup_run(output_dir="example-outputs/prot-one-lig_posed",
              ff_source_dir="ff/charmm36-feb2021.ff/",
              mdp_files_dir="ff/mdp-files/",
              charmm_gui_gromacs_dir="ff/e-coli_l-i-o-o1/",
              protein_pdb_file="example-inputs/tlr4-md2.pdb",
              ligand_pose_files="example-inputs/e-coli_l-i-o-o1_pose1.pdb")

print("*** Protein with two poses of custom ligand ***")
sgm.setup_run(output_dir="example-outputs/prot-two-lig_posed",
              ff_source_dir="ff/charmm36-feb2021.ff/",
              mdp_files_dir="ff/mdp-files/",
              charmm_gui_gromacs_dir="ff/e-coli_l-i-o-o1/",
              protein_pdb_file="example-inputs/tlr4-md2.pdb",
              ligand_pose_files=["example-inputs/e-coli_l-i-o-o1_pose1.pdb",
                                 "example-inputs/e-coli_l-i-o-o1_pose2.pdb"])
| 47.447368 | 77 | 0.630616 | 268 | 1,803 | 3.977612 | 0.179104 | 0.065666 | 0.093809 | 0.052533 | 0.886492 | 0.886492 | 0.886492 | 0.886492 | 0.794559 | 0.794559 | 0 | 0.033803 | 0.212424 | 1,803 | 37 | 78 | 48.72973 | 0.716901 | 0 | 0 | 0.53125 | 0 | 0 | 0.463672 | 0.280643 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.03125 | 0 | 0.03125 | 0.15625 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
452c153803cfa3ad2b3112c4c7a53f586e2a9845 | 14,888 | bzl | Python | rust/known_shas.bzl | th0br0/rules_rust | 4a9d0e0b6c66f1e98d15cbd3cccc8100a0454fc9 | [
"Apache-2.0"
] | null | null | null | rust/known_shas.bzl | th0br0/rules_rust | 4a9d0e0b6c66f1e98d15cbd3cccc8100a0454fc9 | [
"Apache-2.0"
] | null | null | null | rust/known_shas.bzl | th0br0/rules_rust | 4a9d0e0b6c66f1e98d15cbd3cccc8100a0454fc9 | [
"Apache-2.0"
] | null | null | null | # This is a generated file -- see //util:fetch_shas.sh
FILE_KEY_TO_SHA = {
"2018-07-07/rust-beta-x86_64-apple-darwin": "098a05b3c54529c03ce96bc90faefc46027063773f0ffba17931aba38b897cda",
"2018-07-07/rust-beta-x86_64-unknown-freebsd": "636744474bddeb5be9bec7954c9b252c6df4020df3edeade26931d76b69ab526",
"2018-07-07/rust-beta-x86_64-unknown-linux-gnu": "e98a60d6ec24f26f8aacfef5e588cdd7b8d75f08d2ecb73a81d9ffde2dc12095",
"2018-07-07/rustc-beta-x86_64-apple-darwin": "8bfd73443125dffde1d55d761f4fb4b318081dbdbb14ba52dea93851024dc35b",
"2018-07-07/rustc-beta-x86_64-unknown-freebsd": "7d489a6319b7bcefd0807ec1a3c5bb07fdd0a966bbc74c90fb43ecd045aa9662",
"2018-07-07/rustc-beta-x86_64-unknown-linux-gnu": "11caee3c8c73a27e0bf45217cf80c694e923baf86d5ddcc422053435e5702e01",
"2018-07-07/rustc-nightly-x86_64-apple-darwin": "b1e1a7d59596d172468604185319f0be220568deb9dbbba2d569852abdb80ec8",
"2018-07-07/rustc-nightly-x86_64-unknown-freebsd": "5f23bc9ca20fe677270229988ed3954c374b5ae3d467e1780ccda43d86df5f02",
"2018-07-07/rustc-nightly-x86_64-unknown-linux-gnu": "6780a8c6720d774e8ebc626e5e238d35857598ac490a3d2bde80bd6664b783c5",
"2018-07-07/rust-nightly-x86_64-apple-darwin": "5a4122c36e1f1f00d7ea711b0808c697ecfbb90abc08ced4340c66f298040300",
"2018-07-07/rust-nightly-x86_64-unknown-freebsd": "8b0ca59fa97f7d2af8d9823eec1ad3e3c90de027ae266dacc8d9699dbd74b66a",
"2018-07-07/rust-nightly-x86_64-unknown-linux-gnu": "eb0e6a173835994be99e0e31dfc8443543ad4217dc0cbc89fb9c82232d5f3db8",
"2018-07-07/rust-std-beta-x86_64-apple-darwin": "2d856e488e87d9ec5e66ffd77e6528d139ada2a9eac2c9ac7b05593678c97c6e",
"2018-07-07/rust-std-beta-x86_64-unknown-freebsd": "785bb39558b05c3833c5b7e78c297ce0c5819d6d36a4642b7e369660f124388e",
"2018-07-07/rust-std-beta-x86_64-unknown-linux-gnu": "90b0ea04e179ecb89a31524f66f49ada5138b3dbb8862f5586de596df1a6f15a",
"2018-07-07/rust-std-nightly-x86_64-apple-darwin": "04b4b9896c5a494a915ee06743a28f61dd63ff4ea03ebc4c6c3f1235de52ebeb",
"2018-07-07/rust-std-nightly-x86_64-unknown-freebsd": "4f891a752860679d363f972c9766b2ab8eed3d18f04ed3c38cf1f30cbf01cb64",
"2018-07-07/rust-std-nightly-x86_64-unknown-linux-gnu": "6f7d2d9958dc1a900364ea042d2b4cd421c1d4065667d6e82454905fb05bb7fb",
"2018-07-08/rustc-nightly-x86_64-apple-darwin": "82d69b7241e22dc57852f523205b397027ef7b8121054b21f9f931584b7bd53c",
"2018-07-08/rustc-nightly-x86_64-unknown-freebsd": "663e40ee11d7a618b1adcc1d1e68771ce6a31eee6b442d99fb9691c7d3d3377d",
"2018-07-08/rustc-nightly-x86_64-unknown-linux-gnu": "1becc7ef721485b08d798c772d5a2e65d657eac21333607c1200e78b8f1a5162",
"2018-07-08/rust-nightly-x86_64-apple-darwin": "3cfb0571270a5ab6c69d0fddf89ff05d960411fc345621bcee16802d9a3c5e9a",
"2018-07-08/rust-nightly-x86_64-unknown-freebsd": "64e21c105b885e4d8fa594180be2634f2c3ba9b6bfb828cc25fda42cabd5277d",
"2018-07-08/rust-nightly-x86_64-unknown-linux-gnu": "f98e9e23abd3c98f188a9aa5f67495692c77e1747fdf66324dbee5d8720189d7",
"2018-07-08/rust-std-nightly-x86_64-apple-darwin": "75066ccb0b565a5f774931201a1a680c85059b5feb9cffe5cee0bf17da012bc8",
"2018-07-08/rust-std-nightly-x86_64-unknown-freebsd": "c4ebfdaf994854fd4e55f8f294e53c914ee599738fcdf3c34c1cc94928c0c4a9",
"2018-07-08/rust-std-nightly-x86_64-unknown-linux-gnu": "4fd55a5732625cf57a9e0a7de5d10155a02a9cb0e52b23e8e3b9b0089a75acd1",
"2018-07-09/rust-beta-x86_64-apple-darwin": "88c8cc09021f60241375914a37b8741de414cbfcf1e455d18bdf1621a6caab6d",
"2018-07-09/rust-beta-x86_64-unknown-freebsd": "23ca8e71a2ecf9e2342c7df5f905558c62f6c691f6b3f000fff8f81eefb006b4",
"2018-07-09/rust-beta-x86_64-unknown-linux-gnu": "62f5b7fa5d386ccc709f4ed232169b0130de6d432ec18fe49093d3694201bc65",
"2018-07-09/rustc-beta-x86_64-apple-darwin": "6c6973c44d1b008a54ed2e27dbe06ec7847667834f6fbae085b634fc39be01f1",
"2018-07-09/rustc-beta-x86_64-unknown-freebsd": "b11018ccced1428f428a055497609446b6f8f96cd23bffd343c2fe1f96f3b723",
"2018-07-09/rustc-beta-x86_64-unknown-linux-gnu": "4d19adf1cd30203c7ea673f2b4e369f5bed61eaea0fb9d1b705003c959fdf2d3",
"2018-07-09/rustc-nightly-x86_64-apple-darwin": "82b90874be2894e1338d62a9eb167c02d83d401130f67fe72b6d5ab01bb6ca63",
"2018-07-09/rustc-nightly-x86_64-unknown-freebsd": "6788b84aeacfca583cf6e877527fc4b30f6e3cbe568276b7145e0885604b3266",
"2018-07-09/rustc-nightly-x86_64-unknown-linux-gnu": "8a69af90c216b8a0e4c2cb641f63d2998eff7743fcaf7ef1aa113365ee8b1d1c",
"2018-07-09/rust-nightly-x86_64-apple-darwin": "bfdec1085b33c1018bc64fecbfa0000bfc8c8d74c391d17ac567ec135ab3478f",
"2018-07-09/rust-nightly-x86_64-unknown-freebsd": "e72d96c471db54f136ad9ae8f6f67d19c3f6402745e587e5f344a3db5b43378c",
"2018-07-09/rust-nightly-x86_64-unknown-linux-gnu": "420a54694a714892aacb209cfa8078724ee0c345af96a2bccd85bf22a7c6580f",
"2018-07-09/rust-std-beta-x86_64-apple-darwin": "b12eafbe4c95c824d83f6bd948203bcee786cb199ec3b4a576f27b59f39b19bb",
"2018-07-09/rust-std-beta-x86_64-unknown-freebsd": "e100887a4f7720fe7160782a2bb40e30f5be0544eaf8cc9fc54db6bca9e69a10",
"2018-07-09/rust-std-beta-x86_64-unknown-linux-gnu": "244ca0c8f2b4d63d4b15a2ef65eb4aaf735ff8a563ad77c8a963ac433d4ab4d0",
"2018-07-09/rust-std-nightly-x86_64-apple-darwin": "251b40205aceb9310af15a39072b006a6ade88e7ab37a0adc355ab3a9008933e",
"2018-07-09/rust-std-nightly-x86_64-unknown-freebsd": "c7e23e9e266c4042ecb46001d87e22826fda92ffb0e3d7e425d3a9127d018f81",
"2018-07-09/rust-std-nightly-x86_64-unknown-linux-gnu": "2c1338e8c29422047481999713d8169eeca18aed26c0811f48c1960b7bfa30b9",
"2018-07-10/rust-beta-x86_64-apple-darwin": "efd11e141b5a0daedeee9ba6d05bcd4fde44ab106d768a0d3852adc799b78986",
"2018-07-10/rust-beta-x86_64-unknown-freebsd": "b43f357cff51690c2b47cf01fca7bc45653117a8928ed6aadc8bfef5bc2d6c72",
"2018-07-10/rust-beta-x86_64-unknown-linux-gnu": "22db14a5527aa9b17396753bb3c64a84c728fa1e83dbcdd527c5ba1ee057e362",
"2018-07-10/rustc-beta-x86_64-apple-darwin": "7af1fcb2c37adacb611f2ddb6d8ad9a808fad6d3c97c9bcb580847dd3d87c5a7",
"2018-07-10/rustc-beta-x86_64-unknown-freebsd": "6ed3b079aad9259337f06d49628ddb011eebeffada5037819615f40aeb9ec486",
"2018-07-10/rustc-beta-x86_64-unknown-linux-gnu": "306b18b1602985bce7578971af3904f691f197b91eab2d3126fdd679a29de9ae",
"2018-07-10/rustc-nightly-x86_64-apple-darwin": "fd5749fe18901956c712be8b6e90cfb1eedf28193c1a3f90903182bc5e28dd38",
"2018-07-10/rustc-nightly-x86_64-unknown-freebsd": "75eae623c02be572fd76403b41437460ed7025674073facffbe7ac1148023f2b",
"2018-07-10/rustc-nightly-x86_64-unknown-linux-gnu": "eaa3078e699a2afaffe881b10cc3e2d63b3744df69cb1a279a50e35a9ac2530d",
"2018-07-10/rust-nightly-x86_64-apple-darwin": "40c559e60ae785ad240c1c47c335ac5151d691a57b97b059c7fd9f803447bf78",
"2018-07-10/rust-nightly-x86_64-unknown-freebsd": "76a8be14262c3f2348b69fb695b918f13a6ba4891d010e99bd5430c8b25defd0",
"2018-07-10/rust-nightly-x86_64-unknown-linux-gnu": "bb7d3687166b0f1c768f307cd57340df342f7cd38b97ea689b90b8341dce4342",
"2018-07-10/rust-std-beta-x86_64-apple-darwin": "a523e9c8efc439519e741fc21ba028920a0e4f81fd5eeb78e6f65635775a7df9",
"2018-07-10/rust-std-beta-x86_64-unknown-freebsd": "b8f6dcdde12bd6fa82eacbc685410e6dbdd4892befd1e588a88d34e76de35ca8",
"2018-07-10/rust-std-beta-x86_64-unknown-linux-gnu": "0541ee1ace66ca18304aa5822fa42e016751b85e459372d9bd98633204d37173",
"2018-07-10/rust-std-nightly-x86_64-apple-darwin": "3d638256c2f4a7d33956a73ea141e8db6e8b5f985c487ba083cded8b02fd2e31",
"2018-07-10/rust-std-nightly-x86_64-unknown-freebsd": "056b8c12783c8b51aa86e77a0416dabd7027e8ec5677dd4f9f650134bb6a1a19",
"2018-07-10/rust-std-nightly-x86_64-unknown-linux-gnu": "8dbebab5610eba12fde192bdc6623fa70dc41cf69542fadb8fb3b1c5182c82c5",
"2018-07-11/rustc-nightly-x86_64-apple-darwin": "930c19558b54d6787edc57770699ac56f7cd9204a92fecaf308b7715de1276af",
"2018-07-11/rustc-nightly-x86_64-unknown-freebsd": "dab19e0c9da6dfae7748851b1cb86aca941d4f7ba0b48dbcbe1d47e9858b271e",
"2018-07-11/rustc-nightly-x86_64-unknown-linux-gnu": "da06b238742b46a2472dad314026c1603f7cc78a28209773c4aafb5b482fbea2",
"2018-07-11/rust-nightly-x86_64-apple-darwin": "e2f2974202542b7ee54ef1a24bebc5636142caa749f422bc0caa38834ca7269a",
"2018-07-11/rust-nightly-x86_64-unknown-freebsd": "4cde673927c0d6360c184a21d2a46911c79bb143a2ef8c3aef119004bca952e4",
"2018-07-11/rust-nightly-x86_64-unknown-linux-gnu": "c0eedcda1085adc913ad4c945a87db77d5dc5c6e789901c2dc322b76699649b9",
"2018-07-11/rust-std-nightly-x86_64-apple-darwin": "85e2404e3a00ee2e307e4b15e474e2557fa622eeac660ad531c826bc99affff2",
"2018-07-11/rust-std-nightly-x86_64-unknown-freebsd": "491f7cfc7bae506daa6d675a0f4e4c18470c6b6ebc6f9507b530dd273c327866",
"2018-07-11/rust-std-nightly-x86_64-unknown-linux-gnu": "9cdf9b6ea66bc040a3eb2b1218197ca8802f4e334e9e87c0e9b276bce4776882",
"rust-1.26.0-x86_64-apple-darwin": "38708803c3096b8f101d1919ee2d7e723b0adf1bc1bb986b060973b57d8c7c28",
"rust-1.26.0-x86_64-unknown-freebsd": "a03cbe097670042c90d18654fbc852c9d473261d61c03d0f745bbaee759780ed",
"rust-1.26.0-x86_64-unknown-linux-gnu": "13691d7782577fc9f110924b26603ade1990de0b691a3ce2dc324b4a72a64a68",
"rust-1.26.1-x86_64-apple-darwin": "ebf898b9fa7e2aafc53682a41f18af5ca6660ebe82dd78f28cd9799fe4dc189a",
"rust-1.26.1-x86_64-unknown-freebsd": "910128f60c680e175ae93722272f491c6835f27652f9f3fe415dc0d9c482e204",
"rust-1.26.1-x86_64-unknown-linux-gnu": "b7e964bace1286696d511c287b945f3ece476ba77a231f0c31f1867dfa5080e0",
"rust-1.26.2-x86_64-apple-darwin": "f193705d4c0572a358670dbacbf0ffadcd04b3989728b442f4680fa1e065fa72",
"rust-1.26.2-x86_64-unknown-freebsd": "0ad985cf36b3946f086fd3c3c6eb97b0c94b24285147a04da22c00d4d522727a",
"rust-1.26.2-x86_64-unknown-linux-gnu": "d2b4fb0c544874a73c463993bde122f031c34897bb1eeb653d2ba2b336db83e6",
"rust-1.27.0-x86_64-apple-darwin": "a1d48190992e01aac1a181bce490c80cb2c1421724b4ff0e2fb7e224a958ce0f",
"rust-1.27.0-x86_64-unknown-freebsd": "f0754434f76f261ecdfd7ea3645b251b0188e263c0c7a7466aafac1b034d20ec",
"rust-1.27.0-x86_64-unknown-linux-gnu": "235ad78e220b10a2d0267aea1e2c0f19ef5eaaff53ad6ff8b12c1d4370dec9a3",
"rust-1.27.1-x86_64-apple-darwin": "475be237962d6aef1038a2faada26fda1e0eaea5d71d6950229a027a9c2bfe08",
"rust-1.27.1-x86_64-unknown-freebsd": "739d38036c9f08c13bc7425cc5cccd3dd37860fa6e9dfc7bcd9081c8d3c5ccdd",
"rust-1.27.1-x86_64-unknown-linux-gnu": "435778a837af764da2a7a7fb4d386b7b78516c7dfc732d892858e9a8a539989b",
"rust-1.27.2-x86_64-apple-darwin": "30c5cc58759caa4efdf2ea7d8438633139c98bee3408beb29ceb26985f3f5f70",
"rust-1.27.2-x86_64-unknown-freebsd": "b114c5eebc120b360d4d3c4360421ff181cc47bb311e161d3af6971b6d3e6244",
"rust-1.27.2-x86_64-unknown-linux-gnu": "5028a18e913ef3eb53e8d8119d2cc0594442725e055a9361012f8e26f754f2bf",
"rustc-1.26.0-x86_64-apple-darwin": "5cb67314656d16cf2a1bdc84213aaaf6afdb5811825c7afba916e2d42d3d641f",
"rustc-1.26.0-x86_64-unknown-freebsd": "9499ce5b68d631f8345c387e1f59b21892d97e0acb5650deb61a34719310bd38",
"rustc-1.26.0-x86_64-unknown-linux-gnu": "7ca9a30010602aaf2244c376a3cc5baa89429d54da17b8ba1cb0cdfdc846cc61",
"rustc-1.26.1-x86_64-apple-darwin": "e5f4291c3709b170fbeb17fab7fae50fe0c626dbdc5c42ddb1f342ea03acbad4",
"rustc-1.26.1-x86_64-unknown-freebsd": "dc3dc36010d73349152e6158522e82830fda173007b9299b0a947c90769c54ff",
"rustc-1.26.1-x86_64-unknown-linux-gnu": "45bc1c30e0c473c42889f22b182ec6f0b0fc3be0825e1607c64933592486eb2a",
"rustc-1.26.2-x86_64-apple-darwin": "5b0a3d94a4fa76ed28859123e35c09a91d7eb8ff65f40ec4c50dfa56ffed8ae5",
"rustc-1.26.2-x86_64-unknown-freebsd": "48f20a8dc6bc54c90aae685d0c3fa2caf3677f1c4a4d0c53aee9d15588bd0735",
"rustc-1.26.2-x86_64-unknown-linux-gnu": "1ebdafe52b581a63cea217a036fd6e77706d2715ae9cfe10a8c715d753326004",
"rustc-1.27.0-x86_64-apple-darwin": "0b00c6971ef524f68b911f621d199e60c339c390b18e12700d55e012b62aa90c",
"rustc-1.27.0-x86_64-unknown-freebsd": "24c193213450ffacffebdd1413d77fc3c1ed00049cf1ede2d0f3f370dd86b462",
"rustc-1.27.0-x86_64-unknown-linux-gnu": "29f399a1a208ea3f27f21e57f2d832e9d801c397a986aaea17e3a2ddeded6c3c",
"rustc-1.27.1-x86_64-apple-darwin": "747f616e07e5da9323a21c1cf9d76b53bb46094a68223d461a7333f26c714f19",
"rustc-1.27.1-x86_64-unknown-freebsd": "9b199c21094f996fd9d4b620a5ff2c4bc5b8dab13e96bdf7c113291f601ec944",
"rustc-1.27.1-x86_64-unknown-linux-gnu": "a6bf6205b345b854d705d0028a4e7161a0f5b209e464130e7d135fa01a296dc1",
"rustc-1.27.2-x86_64-apple-darwin": "b5c5edd2094afd0a92ad776dbd12cb6ee37800b940437dece10229ccacd1f561",
"rustc-1.27.2-x86_64-unknown-freebsd": "66d739632574fa52e82b40aca0eb4cef7a38047ed67cd6a240d8798a3cf9b6a6",
"rustc-1.27.2-x86_64-unknown-linux-gnu": "ec3efc17ddbe6625840957049e15ebae960f447c8e8feb7da40c28dd6adf655f",
"rust-std-1.26.0-x86_64-apple-darwin": "cb5a0114e9e383aa93267868482db84f791124ee4faafdaed08ec6782d000fc2",
"rust-std-1.26.0-x86_64-unknown-freebsd": "38cd138eba2ccaff59513d154fec580b6663ca6ef38cd620c348364aa1e11a40",
"rust-std-1.26.0-x86_64-unknown-linux-gnu": "e27cb5c21541a500c8df919e15c8d3b002456ebbe573122e7b058cf5b4c3c13a",
"rust-std-1.26.1-x86_64-apple-darwin": "d43e06674e645e120af6716e6d0db5771fa8818b5a48fbee9791360086cdec4a",
"rust-std-1.26.1-x86_64-unknown-freebsd": "1d63cc1f6dc6dfa2644619cd8c264c3d1be0fe5c44c5454e8ea04bd7beb036fb",
"rust-std-1.26.1-x86_64-unknown-linux-gnu": "cc7cec9a121a97e8e23c350305a0e4cd4e3b475fd5a36fa6335a585d3c511f0d",
"rust-std-1.26.2-x86_64-apple-darwin": "712a79cd10b96c7119980e535a36595e03c69a360f1541f690c09de858d92723",
"rust-std-1.26.2-x86_64-unknown-freebsd": "f54b58bf941d794ee10ab7ee9e1c94a70012073b0ee633ec2be585b1be2e31de",
"rust-std-1.26.2-x86_64-unknown-linux-gnu": "91634f05bf2d0a20e627aed08a8450673acecb963869273221de17130540fb26",
"rust-std-1.27.0-x86_64-apple-darwin": "15ee6418f9b564618e9c81a6dcd7706a2f8ae5ca24fd1b6d7527c97563a47e57",
"rust-std-1.27.0-x86_64-unknown-freebsd": "6e307cc3798b50b37beb9ff43e88b12fb565ddaf051925fffa35bfbeb091d660",
"rust-std-1.27.0-x86_64-unknown-linux-gnu": "b8cf36922315ca792929d515327c74b873358a64be4929b2ecfbe23af21e8043",
"rust-std-1.27.1-x86_64-apple-darwin": "a521599355e564984e43a63042b1de93dd7cf96730930501f86611dd766384e8",
"rust-std-1.27.1-x86_64-unknown-freebsd": "12902b61a4897ade258217f045dfac3fe83d49dd52d1e2250bd94c3a10642b08",
"rust-std-1.27.1-x86_64-unknown-linux-gnu": "9a1830b522117d68eeec703b50692093352212e035a46baceea666bb37739c2d",
"rust-std-1.27.2-x86_64-apple-darwin": "eed3688d9f551066593b34f07e4d28846caa99624c2168387993acc6bddd003d",
"rust-std-1.27.2-x86_64-unknown-freebsd": "6051f8bacbfbd2c3dceeddab8c66274bed7ef260cf346d367c53495cd1567572",
"rust-std-1.27.2-x86_64-unknown-linux-gnu": "68984f2233853d3e9c7c56edd72a91b5822157f28fdb42023fb311af68f842dd",
}
| 114.523077 | 127 | 0.835639 | 1,304 | 14,888 | 9.440951 | 0.125 | 0.051174 | 0.081878 | 0.054585 | 0.340915 | 0.340915 | 0.313135 | 0.193404 | 0.042726 | 0 | 0 | 0.443799 | 0.051652 | 14,888 | 129 | 128 | 115.410853 | 0.428146 | 0.003493 | 0 | 0 | 1 | 0 | 0.896522 | 0.896522 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
18acc9ded43307169994f62078d8e11797137920 | 16,543 | py | Python | remo/dashboard/tests/test_models.py | glogiotatidis/remo | 1c4f55c63c8d03cbee776b60af042b8068d9f297 | [
"BSD-3-Clause"
] | null | null | null | remo/dashboard/tests/test_models.py | glogiotatidis/remo | 1c4f55c63c8d03cbee776b60af042b8068d9f297 | [
"BSD-3-Clause"
] | null | null | null | remo/dashboard/tests/test_models.py | glogiotatidis/remo | 1c4f55c63c8d03cbee776b60af042b8068d9f297 | [
"BSD-3-Clause"
] | null | null | null | from datetime import timedelta
from django.contrib.auth.models import Group
from django.contrib.contenttypes.models import ContentType
from django.utils.timezone import now
from nose.tools import eq_, ok_
from remo.base.tests import RemoTestCase
from remo.dashboard.models import ActionItem
from remo.events.models import Event
from remo.events.tasks import notify_event_owners_to_input_metrics
from remo.events.tests import EventFactory, EventMetricOutcomeFactory
from remo.profiles.tests import UserFactory
from remo.remozilla.models import Bug
from remo.remozilla.tests import BugFactory
from remo.voting.models import Poll
from remo.voting.tasks import resolve_action_items, create_poll_action_items
from remo.voting.tests import PollFactory, VoteFactory


class RemozillaActionItems(RemoTestCase):
    """Tests related to new action items created from Bugzilla."""

    def test_waiting_receipts(self):
        model = ContentType.objects.get_for_model(Bug)
        items = ActionItem.objects.filter(content_type=model)
        ok_(not items.exists())

        whiteboard = '[waiting receipts]'
        user = UserFactory.create(groups=['Rep'])
        bug = BugFactory.build(whiteboard=whiteboard, assigned_to=user)
        bug.save()

        items = ActionItem.objects.filter(content_type=model, object_id=bug.id)
        eq_(items.count(), 1)
        eq_(items[0].name, 'Add receipts for ' + bug.summary)
        eq_(items[0].user, user)
        eq_(items[0].priority, ActionItem.NORMAL)

    def test_waiting_multiple_documents(self):
        model = ContentType.objects.get_for_model(Bug)
        items = ActionItem.objects.filter(content_type=model)
        ok_(not items.exists())

        whiteboard = '[waiting receipts][waiting report][waiting photos]'
        user = UserFactory.create(groups=['Rep'])
        bug = BugFactory.build(whiteboard=whiteboard, assigned_to=user)
        bug.save()

        items = ActionItem.objects.filter(content_type=model, object_id=bug.id)
        eq_(items.count(), 3)
        namelist = ['Add receipts for ' + bug.summary,
                    'Add report for ' + bug.summary,
                    'Add photos for ' + bug.summary]
        for item in items:
            ok_(item.name in namelist)
            eq_(item.user, user)
            eq_(item.priority, ActionItem.NORMAL)

    def test_update_bug_whiteboard(self):
        model = ContentType.objects.get_for_model(Bug)
        items = ActionItem.objects.filter(content_type=model)
        ok_(not items.exists())

        whiteboard = '[waiting receipts][waiting report][waiting photos]'
        user = UserFactory.create(groups=['Rep'])
        bug = BugFactory.build(whiteboard=whiteboard, assigned_to=user)
        bug.save()

        items = ActionItem.objects.filter(content_type=model)
        eq_(items.count(), 3)

        bug.whiteboard = ''
        bug.save()

        items = ActionItem.objects.filter(content_type=model, object_id=bug.id)
        for item in items:
            ok_(item.completed)
            ok_(item.resolved)

    def test_mentor_validation(self):
        model = ContentType.objects.get_for_model(Bug)
        items = ActionItem.objects.filter(content_type=model)
        ok_(not items.exists())

        mentor = UserFactory.create(groups=['Rep', 'Mentor'])
        UserFactory.create(groups=['Rep'], userprofile__mentor=mentor)
        bug = BugFactory.build(pending_mentor_validation=True,
                               assigned_to=mentor)
        bug.save()

        items = ActionItem.objects.filter(content_type=model, object_id=bug.id)
        eq_(items.count(), 1)
        eq_(items[0].name, 'Waiting mentor validation for ' + bug.summary)
        eq_(items[0].user, mentor)
        eq_(items[0].priority, ActionItem.BLOCKER)

    def test_change_assigned_user(self):
        model = ContentType.objects.get_for_model(Bug)
        items = ActionItem.objects.filter(content_type=model)
        ok_(not items.exists())

        user_1 = UserFactory.create(groups=['Rep'])
        user_2 = UserFactory.create(groups=['Rep'])
        bug = BugFactory.build(assigned_to=user_1,
                               pending_mentor_validation=True)
        bug.save()

        item = ActionItem.objects.get(content_type=model, object_id=bug.id)
        eq_(item.user, user_1)

        bug.assigned_to = user_2
        bug.save()

        item = ActionItem.objects.get(content_type=model, object_id=bug.id)
        eq_(item.user, user_2)

    def test_resolve_mentor_validation(self):
        model = ContentType.objects.get_for_model(Bug)
        items = ActionItem.objects.filter(content_type=model)
        ok_(not items.exists())

        mentor = UserFactory.create(groups=['Rep', 'Mentor'])
        UserFactory.create(groups=['Rep'], userprofile__mentor=mentor)
        bug = BugFactory.build(pending_mentor_validation=True,
                               assigned_to=mentor)
        bug.save()

        items = ActionItem.objects.filter(content_type=model)
        eq_(items.count(), 1)
        eq_(items[0].name, 'Waiting mentor validation for ' + bug.summary)
        eq_(items[0].user, mentor)
        eq_(items[0].priority, ActionItem.BLOCKER)

        bug.pending_mentor_validation = False
        bug.save()

        items = ActionItem.objects.filter(content_type=model, object_id=bug.id)
        for item in items:
            ok_(item.completed)
            ok_(item.resolved)

    def test_needinfo(self):
        model = ContentType.objects.get_for_model(Bug)
        items = ActionItem.objects.filter(content_type=model)
        ok_(not items.exists())

        needinfo = UserFactory.create(groups=['Rep'])
        user = UserFactory.create(groups=['Rep'])
        bug = BugFactory.build(assigned_to=user)
        bug.save()
        bug.budget_needinfo.add(needinfo)
        bug.save()

        items = ActionItem.objects.filter(content_type=model, object_id=bug.id)
        eq_(items.count(), 1)
        for item in items:
            eq_(item.name, 'Need info for ' + bug.summary)
            eq_(item.user, needinfo)
            eq_(item.priority, ActionItem.MINOR)

    def test_remove_needinfo(self):
        model = ContentType.objects.get_for_model(Bug)
        items = ActionItem.objects.filter(content_type=model)
        user = UserFactory.create(groups=['Rep'])
        bug = BugFactory.create()
        bug.budget_needinfo.add(user)
        bug.save()

        bug.budget_needinfo.clear()
        bug.save()

        items = ActionItem.objects.filter(content_type=model, object_id=bug.id)
        for item in items:
            ok_(item.completed)
            ok_(item.resolved)

    def test_council_reviewer_assigned(self):
        model = ContentType.objects.get_for_model(Bug)
        items = ActionItem.objects.filter(content_type=model)
        ok_(not items.exists())

        user = UserFactory.create(groups=['Rep', 'Council'])
        bug = BugFactory.build(assigned_to=user, council_member_assigned=True)
        bug.save()

        items = ActionItem.objects.filter(content_type=model, object_id=bug.id)
        eq_(items.count(), 1)
        eq_(items[0].name, 'Review budget request ' + bug.summary)
        eq_(items[0].user, user)
        eq_(items[0].priority, ActionItem.BLOCKER)

    def test_council_reviewer_removed(self):
        model = ContentType.objects.get_for_model(Bug)
        items = ActionItem.objects.filter(content_type=model)
        ok_(not items.exists())

        user = UserFactory.create(groups=['Council'])
        bug = BugFactory.build(assigned_to=user,
                               council_member_assigned=True)
        bug.save()

        bug.council_member_assigned = False
        bug.save()

        items = ActionItem.objects.filter(content_type=model, object_id=bug.id)
        for item in items:
            ok_(item.completed)
            ok_(item.resolved)

    def test_remove_assignee(self):
        model = ContentType.objects.get_for_model(Bug)
        items = ActionItem.objects.filter(content_type=model)
        ok_(not items.exists())

        user = UserFactory.create(groups=['Rep'])
        bug = BugFactory.build(pending_mentor_validation=True,
                               assigned_to=user)
        bug.save()

        items = ActionItem.objects.filter(content_type=model)
        eq_(items.count(), 1)
        eq_(items[0].name, 'Waiting mentor validation for ' + bug.summary)
        eq_(items[0].user, user)
        eq_(items[0].priority, ActionItem.BLOCKER)

        bug.assigned_to = None
        bug.save()

        items = ActionItem.objects.filter(content_type=model, object_id=bug.id)
        for item in items:
            ok_(item.resolved)
            ok_(not item.completed)
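The whiteboard convention these tests exercise maps each `[waiting <thing>]` tag on a bug to one action item named `Add <thing> for <bug summary>`. A minimal sketch of that parsing, assuming this naming scheme; `action_item_names` is illustrative only and is not the function used by `remo.dashboard`:

```python
# Hypothetical sketch: turn Bugzilla whiteboard tags into action-item names.
import re

def action_item_names(whiteboard, summary):
    """One 'Add <thing> for <summary>' name per '[waiting <thing>]' tag."""
    tags = re.findall(r'\[waiting (\w+)\]', whiteboard)
    return ['Add {} for {}'.format(tag, summary) for tag in tags]

names = action_item_names('[waiting receipts][waiting report][waiting photos]',
                          'Budget bug')
print(names)
```

Clearing the whiteboard (as `test_update_bug_whiteboard` does) yields an empty tag list, which corresponds to the existing action items being resolved.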


class VotingActionItems(RemoTestCase):
    def test_vote_action_item(self):
        model = ContentType.objects.get_for_model(Poll)
        items = ActionItem.objects.filter(content_type=model)
        ok_(not items.exists())

        council = Group.objects.get(name='Council')
        user = UserFactory.create(groups=['Council'])
        start = now() - timedelta(hours=3)
        poll = PollFactory.create(valid_groups=council, start=start)
        create_poll_action_items()

        items = ActionItem.objects.filter(content_type=model,
                                          object_id=poll.id)
        eq_(items.count(), 1)
        for item in items:
            eq_(item.name, 'Cast your vote for ' + poll.name)
            eq_(item.user, user)
            eq_(item.priority, ActionItem.NORMAL)
            ok_(not item.completed)

    def test_budget_vote_action_item(self):
        model = ContentType.objects.get_for_model(Poll)
        items = ActionItem.objects.filter(content_type=model)
        ok_(not items.exists())

        council = Group.objects.get(name='Council')
        user = UserFactory.create(groups=['Council'])
        bug = BugFactory.create()
        start = now() - timedelta(hours=3)
        poll = PollFactory.create(valid_groups=council, automated_poll=True,
                                  bug=bug, start=start)
        create_poll_action_items()

        items = ActionItem.objects.filter(content_type=model,
                                          object_id=poll.id)
        eq_(items.count(), 1)
        for item in items:
            eq_(item.name,
                'Cast your vote for budget request ' + poll.bug.summary)
            eq_(item.user, user)
            eq_(item.priority, ActionItem.NORMAL)
            ok_(not item.completed)

    def test_future_vote_action_item(self):
        model = ContentType.objects.get_for_model(Poll)
        items = ActionItem.objects.filter(content_type=model)
        ok_(not items.exists())

        council = Group.objects.get(name='Council')
        start = now() + timedelta(hours=3)
        PollFactory.create(valid_groups=council, start=start)
        create_poll_action_items()

        items = ActionItem.objects.filter(content_type=model)
        eq_(items.count(), 0)

    def test_resolve_vote_action_item(self):
        model = ContentType.objects.get_for_model(Poll)
        items = ActionItem.objects.filter(content_type=model)
        ok_(not items.exists())

        council = Group.objects.get(name='Council')
        user = UserFactory.create(groups=['Council'])
        start = now() - timedelta(hours=3)
        poll = PollFactory.create(valid_groups=council, start=start)
        create_poll_action_items()
        VoteFactory.create(poll=poll, user=user)

        items = ActionItem.objects.filter(content_type=model,
                                          object_id=poll.id)
        eq_(items.count(), 1)
        for item in items:
            ok_(item.completed)
            ok_(item.resolved)

    def test_update_vote_due_date(self):
        model = ContentType.objects.get_for_model(Poll)
        items = ActionItem.objects.filter(content_type=model)
        ok_(not items.exists())

        council = Group.objects.get(name='Council')
        UserFactory.create(groups=['Council'])
        start = now() - timedelta(hours=3)
        poll = PollFactory.create(valid_groups=council, start=start)
        create_poll_action_items()

        poll.end = poll.end + timedelta(days=4)
        poll.save()

        items = ActionItem.objects.filter(content_type=model,
object_id=poll.id)
eq_(items.count(), 1)
for item in items:
eq_(item.due_date, poll.end.date())
def test_resolved_past_vote(self):
model = ContentType.objects.get_for_model(Poll)
items = ActionItem.objects.filter(content_type=model)
ok_(not items.exists())
council = Group.objects.get(name='Council')
UserFactory.create(groups=['Council'])
start = now() - timedelta(hours=3)
poll = PollFactory.create(valid_groups=council,
end=now() - timedelta(days=1),
start=start)
create_poll_action_items()
items = ActionItem.objects.filter(content_type=model)
eq_(items.count(), 1)
resolve_action_items()
items = ActionItem.objects.filter(content_type=model,
object_id=poll.id)
for item in items:
ok_(item.resolved)
ok_(not item.completed)
def test_update_valid_groups(self):
model = ContentType.objects.get_for_model(Poll)
items = ActionItem.objects.filter(content_type=model)
ok_(not items.exists())
council = Group.objects.get(name='Council')
reps = Group.objects.get(name='Rep')
UserFactory.create_batch(3, groups=['Council'])
UserFactory.create_batch(4, groups=['Rep'])
start = now() - timedelta(hours=3)
poll = PollFactory.create(valid_groups=council, start=start)
create_poll_action_items()
poll.valid_groups = reps
poll.save()
items = ActionItem.objects.filter(content_type=model,
object_id=poll.id)
eq_(items.count(), 4)
for user in reps.user_set.all():
ok_(items.filter(user=user).exists())
for user in council.user_set.all():
ok_(not items.filter(user=user).exists())
class EventActionItems(RemoTestCase):
def test_post_event_metrics(self):
model = ContentType.objects.get_for_model(Event)
items = ActionItem.objects.filter(content_type=model)
ok_(not items.exists())
start = now() - timedelta(days=4)
end = now() - timedelta(days=1)
user = UserFactory.create(groups=['Rep'])
event = EventFactory.create(owner=user, start=start, end=end)
notify_event_owners_to_input_metrics()
items = ActionItem.objects.filter(content_type=model,
object_id=event.id)
eq_(items.count(), 1)
def test_resolve_post_event_metrics(self):
model = ContentType.objects.get_for_model(Event)
items = ActionItem.objects.filter(content_type=model)
ok_(not items.exists())
start = now() - timedelta(days=4)
end = now() - timedelta(days=1)
user = UserFactory.create(groups=['Rep'])
event = EventFactory.create(owner=user, start=start, end=end)
notify_event_owners_to_input_metrics()
items = ActionItem.objects.filter(content_type=model,
object_id=event.id)
eq_(items.count(), 1)
EventMetricOutcomeFactory.create(event=event)
for item in items:
ok_(item.completed)
ok_(item.resolved)
def test_update_event_owner(self):
model = ContentType.objects.get_for_model(Event)
items = ActionItem.objects.filter(content_type=model)
ok_(not items.exists())
start = now() - timedelta(days=4)
end = now() - timedelta(days=1)
user = UserFactory.create(groups=['Rep'])
event = EventFactory.create(owner=user, start=start, end=end)
notify_event_owners_to_input_metrics()
items = ActionItem.objects.filter(content_type=model)
eq_(items.count(), 1)
new_owner = UserFactory.create(groups=['Rep'])
event.owner = new_owner
event.save()
items = ActionItem.objects.filter(content_type=model,
object_id=event.id)
eq_(items.count(), 1)
eq_(items[0].user, new_owner)
# abracadjabra/exceptions.py (gregdetre/abracadjabra, MIT)

class SlugAttributeError(AttributeError): pass
# tests/test_basic_math.py (timwuu/deep-learning-from-scratch-3, MIT)

import unittest
import numpy as np
from dezero import Variable
from dezero.utils import gradient_check, array_equal
import dezero.functions as F


class TestAdd(unittest.TestCase):
    def test_forward1(self):
        x0 = np.array([1, 2, 3])
        x1 = Variable(np.array([1, 2, 3]))
        y = x0 + x1
        res = y.data
        expected = np.array([2, 4, 6])
        self.assertTrue(array_equal(res, expected))

    def test_datatype(self):
        """Check that a 0-dim ndarray is returned rather than np.float64."""
        x = Variable(np.array(2.0))
        y = x ** 2
        self.assertFalse(np.isscalar(y))

    def test_backward1(self):
        x = Variable(np.random.randn(3, 3))
        y = np.random.randn(3, 3)
        f = lambda x: x + y
        self.assertTrue(gradient_check(f, x))

    def test_backward2(self):
        x = Variable(np.random.randn(3, 3))
        y = np.random.randn(3, 1)
        f = lambda x: x + y
        self.assertTrue(gradient_check(f, x))

    def test_backward3(self):
        x = np.random.randn(3, 3)
        y = np.random.randn(3, 1)
        self.assertTrue(gradient_check(F.add, x, y))


class TestMul(unittest.TestCase):
    def test_forward1(self):
        x0 = np.array([1, 2, 3])
        x1 = Variable(np.array([1, 2, 3]))
        y = x0 * x1
        res = y.data
        expected = np.array([1, 4, 9])
        self.assertTrue(array_equal(res, expected))

    def test_backward1(self):
        x = np.random.randn(3, 3)
        y = np.random.randn(3, 3)
        f = lambda x: x * y
        self.assertTrue(gradient_check(f, x))

    def test_backward2(self):
        x = np.random.randn(3, 3)
        y = np.random.randn(3, 1)
        f = lambda x: x * y
        self.assertTrue(gradient_check(f, x))

    def test_backward3(self):
        x = np.random.randn(3, 3)
        y = np.random.randn(3, 1)
        f = lambda y: x * y
        self.assertTrue(gradient_check(f, x))


class TestDiv(unittest.TestCase):
    def test_forward1(self):
        x0 = np.array([1, 2, 3])
        x1 = Variable(np.array([1, 2, 3]))
        y = x0 / x1
        res = y.data
        expected = np.array([1, 1, 1])
        self.assertTrue(array_equal(res, expected))

    def test_backward1(self):
        x = np.random.randn(3, 3)
        y = np.random.randn(3, 3)
        f = lambda x: x / y
        self.assertTrue(gradient_check(f, x))

    def test_backward2(self):
        x = np.random.randn(3, 3)
        y = np.random.randn(3, 1)
        f = lambda x: x / y
        self.assertTrue(gradient_check(f, x))

    def test_backward3(self):
        x = np.random.randn(3, 3)
        y = np.random.randn(3, 1)
        f = lambda x: x / y
        self.assertTrue(gradient_check(f, x))
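The `gradient_check` calls above compare a backpropagated gradient against a numerical estimate. As a minimal, self-contained sketch of that idea (hypothetical helper names, not dezero's actual implementation), a central-difference check for a scalar function looks like this:

```python
def numerical_grad(f, x, eps=1e-4):
    # central difference: (f(x + eps) - f(x - eps)) / (2 * eps)
    return (f(x + eps) - f(x - eps)) / (2 * eps)


def simple_gradient_check(f, analytic_grad, x, tol=1e-6):
    # compare the numerical estimate with the analytic gradient at x
    return abs(numerical_grad(f, x) - analytic_grad(x)) < tol


# f(x) = x**2 has analytic gradient 2*x
print(simple_gradient_check(lambda x: x * x, lambda x: 2 * x, 3.0))  # True
```

dezero's real utility generalizes this to ndarray inputs and perturbs each element in turn; the scalar version only conveys the comparison principle.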
# src/pyrin/api/router/services.py (wilsonGmn/pyrin, BSD-3-Clause)
"""
router services module.
"""
from pyrin.api.router import RouterPackage
from pyrin.application.decorators import route_factory
from pyrin.application.services import get_component
@route_factory()
def create_route(rule, **options):
"""
creates the appropriate route based on the input parameters.
this method acts as a route factory and it will be registered
as default route factory on application startup.
:param str rule: unique url rule to register this route for.
routes with duplicated urls and http methods will be
overwritten if `replace=True` option is provided.
otherwise an error will be raised.
:keyword bool authenticated: specifies that this route could not be accessed
if the requester has not been authenticated.
defaults to False if not provided.
:keyword str authenticator: the authenticator name to be used for this route.
if not provided, it will be get from rule based
authenticators if possible. otherwise the
`default_authenticator` config will be used.
if no default is set in `authentication` config
store, it raises an error.
it is only used if this route has `authenticated=True`.
:keyword bool fresh_auth: specifies that this route could not be accessed
if the requester has not a fresh authentication.
fresh authentication means an authentication that
has been done by providing user credentials to server.
defaults to False if not provided.
:keyword PermissionBase | tuple[PermissionBase] permissions: all required permissions
to access this route.
:keyword str | tuple[str] methods: http methods that this route could handle.
if not provided, defaults to `GET`, `HEAD`
and `OPTIONS`.
:keyword function view_function: a function to be called on accessing this route.
:keyword str endpoint: the endpoint for the registered url rule.
:keyword dict defaults: an optional dict with defaults for other rules with the
same endpoint. this is a bit tricky but useful if you
want to have unique urls.
:keyword str subdomain: the subdomain rule string for this rule. If not specified the
rule only matches for the `default_subdomain` of the map. if
the map is not bound to a subdomain this feature is disabled.
:keyword bool strict_slashes: override the `Map` setting for `strict_slashes` only for
this rule. if not specified the `Map` setting is used.
:keyword bool merge_slashes: override `Map.merge_slashes` for this rule.
:keyword bool build_only: set this to True and the rule will never match but will
create a url that can be build. this is useful if you have
resources on a subdomain or folder that are not handled by
the WSGI application (like static data)
:keyword str | callable redirect_to: if given this must be either a string
or callable. in case of a callable it's
called with the url adapter that
triggered the match and the values
of the url as keyword arguments and has
to return the target for the redirect,
otherwise it has to be a string with
placeholders in rule syntax.
:keyword bool alias: if enabled this rule serves as an alias for another rule with
the same endpoint and arguments.
:keyword str host: if provided and the url map has host matching enabled this can be
used to provide a match rule for the whole host. this also means
that the subdomain feature is disabled.
:keyword bool websocket: if set to True, this rule is only matches for
websocket (`ws://`, `wss://`) requests. by default,
rules will only match for http requests.
defaults to False if not provided.
:keyword int max_content_length: max content length that this route could handle,
in bytes. if not provided, it will be set to
`restricted_max_content_length` api config key.
note that this value should be lesser than or equal
to `max_content_length` api config key, otherwise
it will cause an error.
:keyword int status_code: status code to be returned on successful responses.
defaults to corresponding status code of request's
http method if not provided.
:note status_code: it could be a value from `InformationResponseCodeEnum`
or `SuccessfulResponseCodeEnum` or `RedirectionResponseCodeEnum`.
:keyword bool strict_status: specifies that it should only consider
the status code as processed if it is from
`InformationResponseCodeEnum` or
`SuccessfulResponseCodeEnum` or
`RedirectionResponseCodeEnum` values. otherwise
all codes from `INFORMATION_CODE_MIN`
to `INFORMATION_CODE_MAX` or from
`SUCCESS_CODE_MIN` to `SUCCESS_CODE_MAX`
or from `REDIRECTION_CODE_MIN` to
`REDIRECTION_CODE_MAX` will be considered
as processed. defaults to True if not provided.
:keyword ResultSchema | type[ResultSchema] result_schema: result schema to be used
to filter results. it could
be an instance or a type
of `ResultSchema` class.
:keyword bool indexed: specifies that list results must
include an extra field as row index.
the name of the index field and the initial value
of index could be provided by `index_name` and
`start_index` respectively. `indexed` keyword has
only effect if the returning result contains a list
of objects.
:keyword str index_name: name of the extra field to contain
the row index of each result. if not provided
defaults to `row_num` value.
:keyword int start_index: the initial value of row index. if not
provided, starts from 1.
:keyword SECURE_TRUE | SECURE_FALSE readable: specifies that any column or attribute
which has `allow_read=False` or its name
starts with underscore `_`, should not
be included in result dict. defaults to
`SECURE_TRUE` if not provided. it will be
used only for entity conversion. this
value will override the corresponding
value of `result_schema` if provided.
:keyword int depth: a value indicating the depth for conversion.
for example if entity A has a relationship with
entity B and there is a list of B in A, if `depth=0`
is provided, then just columns of A will be available
in result dict, but if `depth=1` is provided, then all
B entities in A will also be included in the result dict.
actually, `depth` specifies that relationships in an
entity should be followed by how much depth.
note that, if `columns` is also provided, it is required to
specify relationship property names in provided columns.
otherwise they won't be included even if `depth` is provided.
defaults to `default_depth` value of database config store.
please be careful on increasing `depth`, it could fail
application if set to higher values. choose it wisely.
normally the maximum acceptable `depth` would be 2 or 3.
there is a hard limit for max valid `depth` which is set
in `ConverterMixin.MAX_DEPTH` class variable. providing higher
`depth` value than this limit, will cause an error.
it will be used only for entity conversion.
this value will override the corresponding value of
`result_schema` if provided.
:keyword bool no_cache: a value indicating that the response returning from this route
must have a `Cache-Control: no-cache` header. this header will
be automatically added. defaults to False if not provided.
:keyword int request_limit: number of allowed requests to this
route before it unregisters itself.
defaults to None if not Provided.
:keyword int lifetime: number of seconds in which this route must remain
responsive after initial registration. after this
period, the route will unregister itself.
defaults to None if not provided.
:note request_limit, lifetime: if both of these values are provided, this
route will unregister itself if any of
these two conditions are met.
:keyword bool paged: specifies that this route should return paginated results.
defaults to False if not provided.
:keyword int page_size: default page size for this route.
defaults to `default_page_size` from
`database` config store if not provided.
:keyword int max_page_size: maximum page size that client is allowed
to request for this route. defaults to
`max_page_size` from `database` configs store
if not provided.
:keyword bool cors_enabled: specifies that cross origin resource sharing is enabled.
if not provided, it will be get from cors config store.
:keyword bool cors_always_send: specifies that cors headers must be included in
response even if the request does not have origin header.
if not provided, it will be get from cors config store.
:keyword list[str] cors_allowed_origins: a list of extra allowed origins to be used
in conjunction with default allowed ones.
:keyword list[str] cors_exposed_headers: extra exposed headers to be combined
with default ones.
:keyword list[str] cors_allowed_headers: extra allowed headers to be combined
with default ones.
:keyword bool cors_allow_credentials: specifies that browsers are allowed to pass
response headers to front-end javascript code
if the route is authenticated.
if not provided, it will be get from cors config
store.
:keyword int cors_max_age: maximum number of seconds to cache results.
if not provided, it will be get from cors config store.
:keyword bool swagger: specifies that this route must be exposed on swagger.
defaults to False if not provided.
:keyword bool ordered: specifies that this route provides ordered results.
this is a flag to be used by swagger package to add
`order_by` keyword into parameters.
defaults to False if not provided.
:raises InvalidCustomRouteTypeError: invalid custom route type error.
:raises RouteAuthenticationMismatchError: route authentication mismatch error.
:raises PageSizeLimitError: page size limit error.
:raises MaxContentLengthLimitMismatchError: max content length limit mismatch error.
:raises InvalidViewFunctionTypeError: invalid view function type error.
:raises InvalidResultSchemaTypeError: invalid result schema type error.
:raises InvalidResponseStatusCodeError: invalid response status code error.
:rtype: RouteBase
"""
return get_component(RouterPackage.COMPONENT_NAME).create_route(rule, **options)

def add_route(url, view_func, provide_automatic_options=None, **options):
    """
    connects a url rule. the provided view_func will be registered with the endpoint.

    if there is another rule with the same url and http methods and the `replace=True`
    option is provided, it will be replaced. otherwise an error will be raised.

    a note about endpoint:
    pyrin will handle endpoint generation on its own.
    so there is no endpoint parameter in this method's signature.
    this is required to be able to handle uniqueness of endpoints and to manage them.
    unlike flask, pyrin will not require you to define view functions with unique names.
    you could define view functions with the same name in different modules. but to
    ensure the uniqueness of endpoints, pyrin will use the fully qualified name
    of the function as the endpoint. for example: `pyrin.api.services.create_route`.
    so you could figure out the endpoint for any view function to use it in places
    like the `url_for` method.

    pyrin handles endpoints this way to achieve these two important features:

    1. dismissal of the need for view function name uniqueness.
       when an application grows, it's nonsense to be forced to have
       unique view function names at the application level.
    2. managing routes with the same url and http methods, and informing
       the developer about them to prevent accidental replacements. and also
       providing a way to replace a route by another route with the same url
       and http methods if that is what the developer actually wants.
       when an application grows, it becomes a point of failure when you have no
       idea that you've registered a similar route in multiple places and only
       one of them will get called based on registration order.

    :param str url: the url rule as string.

    :param function view_func: the function to call when serving a request to the
        provided endpoint.

    :param bool provide_automatic_options: controls whether the `OPTIONS` method should be
        added automatically.
        this can also be controlled by setting the
        `view_func.provide_automatic_options = False`
        before adding the rule.

    :keyword str | tuple[str] methods: http methods that this rule should handle.
        if not provided, defaults to `GET`.

    :keyword PermissionBase | tuple[PermissionBase] permissions: all required permissions
        for accessing this route.

    :keyword bool authenticated: specifies that this route could not be accessed
        if the requester has not been authenticated.
        defaults to True if not provided.

    :keyword str authenticator: the authenticator name to be used for this route.
        if not provided, it will be taken from rule based
        authenticators if possible. otherwise the
        `default_authenticator` config will be used.
        if no default is set in the `authentication` config
        store, it raises an error.
        it is only used if this route has `authenticated=True`.

    :keyword bool fresh_auth: specifies that this route could not be accessed
        if the requester does not have a fresh authentication.
        fresh authentication means an authentication that
        has been done by providing user credentials to the
        server. defaults to False if not provided.

    :keyword bool replace: specifies that this route must replace any existing
        route with the same url and http methods or raise
        an error if not provided. defaults to False.

    :keyword dict defaults: an optional dict with defaults for other rules with the
        same endpoint. this is a bit tricky but useful if you
        want to have unique urls.

    :keyword str subdomain: the subdomain rule string for this rule. if not specified the
        rule only matches for the `default_subdomain` of the map. if
        the map is not bound to a subdomain this feature is disabled.

    :keyword bool strict_slashes: override the `Map` setting for `strict_slashes` only for
        this rule. if not specified the `Map` setting is used.

    :keyword bool merge_slashes: override `Map.merge_slashes` for this rule.

    :keyword bool build_only: set this to True and the rule will never match but will
        create a url that can be built. this is useful if you have
        resources on a subdomain or folder that are not handled by
        the WSGI application (like static data).

    :keyword str | callable redirect_to: if given this must be either a string
        or callable. in case of a callable it's
        called with the url adapter that
        triggered the match and the values
        of the url as keyword arguments and has
        to return the target for the redirect,
        otherwise it has to be a string with
        placeholders in rule syntax.

    :keyword bool alias: if enabled this rule serves as an alias for another rule with
        the same endpoint and arguments.

    :keyword str host: if provided and the url map has host matching enabled this can be
        used to provide a match rule for the whole host. this also means
        that the subdomain feature is disabled.

    :keyword bool websocket: if set to True, this rule only matches for
        websocket (`ws://`, `wss://`) requests. by default,
        rules will only match for http requests.
        defaults to False if not provided.

    :keyword int max_content_length: max content length that this route could handle,
        in bytes. if not provided, it will be set to the
        `restricted_max_content_length` api config key.
        note that this value should be lesser than or equal
        to the `max_content_length` api config key, otherwise
        it will cause an error.

    :keyword int status_code: status code to be returned on successful responses.
        defaults to the corresponding status code of the request's
        http method if not provided.

    :note status_code: it could be a value from `InformationResponseCodeEnum`
        or `SuccessfulResponseCodeEnum` or `RedirectionResponseCodeEnum`.

    :keyword bool strict_status: specifies that it should only consider
        the status code as processed if it is from
        `InformationResponseCodeEnum` or
        `SuccessfulResponseCodeEnum` or
        `RedirectionResponseCodeEnum` values. otherwise
        all codes from `INFORMATION_CODE_MIN`
        to `INFORMATION_CODE_MAX` or from
        `SUCCESS_CODE_MIN` to `SUCCESS_CODE_MAX`
        or from `REDIRECTION_CODE_MIN` to
        `REDIRECTION_CODE_MAX` will be considered
        as processed. defaults to True if not provided.

    :keyword str | list[str] environments: a list of all environments that this
        route must be exposed on.
        the values could be from all available
        environments in the environments config store.
        for example: `production`, `development`.
        if not provided, the route will be exposed
        on all environments.

    :keyword ResultSchema | type[ResultSchema] result_schema: result schema to be used
        to filter results. it could
        be an instance or a type
        of the `ResultSchema` class.

    :keyword bool indexed: specifies that list results must
        include an extra field as row index.
        the name of the index field and the initial value
        of the index could be provided by `index_name` and
        `start_index` respectively. the `indexed` keyword has
        an effect only if the returning result contains a list
        of objects.

    :keyword str index_name: name of the extra field to contain
        the row index of each result. if not provided,
        defaults to the `row_num` value.

    :keyword int start_index: the initial value of the row index. if not
        provided, starts from 1.

    :keyword SECURE_TRUE | SECURE_FALSE readable: specifies that any column or attribute
        which has `allow_read=False` or whose name
        starts with underscore `_`, should not
        be included in the result dict. defaults to
        `SECURE_TRUE` if not provided. it will be
        used only for entity conversion. this
        value will override the corresponding
        value of `result_schema` if provided.

    :keyword int depth: a value indicating the depth for conversion.
        for example if entity A has a relationship with
        entity B and there is a list of B in A, if `depth=0`
        is provided, then just columns of A will be available
        in the result dict, but if `depth=1` is provided, then all
        B entities in A will also be included in the result dict.
        actually, `depth` specifies to how much depth relationships
        in an entity should be followed.
        note that, if `columns` is also provided, it is required to
        specify relationship property names in the provided columns.
        otherwise they won't be included even if `depth` is provided.
        defaults to the `default_depth` value of the database config store.
        please be careful on increasing `depth`, it could fail the
        application if set to higher values. choose it wisely.
        normally the maximum acceptable `depth` would be 2 or 3.
        there is a hard limit for the max valid `depth` which is set
        in the `ConverterMixin.MAX_DEPTH` class variable. providing a higher
        `depth` value than this limit will cause an error.
        it will be used only for entity conversion.
        this value will override the corresponding value of
        `result_schema` if provided.

    :keyword bool no_cache: a value indicating that the response returning from this route
        must have a `Cache-Control: no-cache` header. this header will
        be automatically added. defaults to False if not provided.

    :keyword int request_limit: number of allowed requests to this
        route before it unregisters itself.
        defaults to None if not provided.

    :keyword int lifetime: number of seconds in which this route must remain
        responsive after initial registration. after this
        period, the route will unregister itself.
        defaults to None if not provided.

    :note request_limit, lifetime: if both of these values are provided, this
        route will unregister itself if any of
        these two conditions are met.

    :keyword bool paged: specifies that this route should return paginated results.
        defaults to False if not provided.

    :keyword int page_size: default page size for this route.
        defaults to `default_page_size` from the
        `database` config store if not provided.

    :keyword int max_page_size: maximum page size that the client is allowed
        to request for this route. defaults to
        `max_page_size` from the `database` config store
        if not provided.

    :keyword bool cors_enabled: specifies that cross origin resource sharing is enabled.
        if not provided, it will be taken from the cors config store.

    :keyword bool cors_always_send: specifies that cors headers must be included in the
        response even if the request does not have an origin header.
        if not provided, it will be taken from the cors config store.

    :keyword list[str] cors_allowed_origins: a list of extra allowed origins to be used
        in conjunction with default allowed ones.

    :keyword list[str] cors_exposed_headers: extra exposed headers to be combined
        with default ones.

    :keyword list[str] cors_allowed_headers: extra allowed headers to be combined
        with default ones.

    :keyword bool cors_allow_credentials: specifies that browsers are allowed to pass
        response headers to front-end javascript code
        if the route is authenticated.
        if not provided, it will be taken from the cors
        config store.

    :keyword int cors_max_age: maximum number of seconds to cache results.
        if not provided, it will be taken from the cors config store.

    :keyword bool swagger: specifies that this route must be exposed on swagger.
        defaults to True if not provided.

    :keyword bool ordered: specifies that this route provides ordered results.
        this is a flag to be used by the swagger package to add
        the `order_by` keyword into parameters.
        defaults to False if not provided.

    :raises DuplicateRouteURLError: duplicate route url error.
    :raises OverwritingEndpointIsNotAllowedError: overwriting endpoint is not allowed error.
    :raises PageSizeLimitError: page size limit error.
    :raises MaxContentLengthLimitMismatchError: max content length limit mismatch error.
    :raises InvalidViewFunctionTypeError: invalid view function type error.
    :raises InvalidResultSchemaTypeError: invalid result schema type error.
    :raises InvalidResponseStatusCodeError: invalid response status code error.
    """

    return get_component(RouterPackage.COMPONENT_NAME).add_route(url, view_func,
                                                                 provide_automatic_options,
                                                                 **options)
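The endpoint-uniqueness scheme described in the docstring (using a function's fully qualified name as the endpoint) can be sketched independently of pyrin. The helper name below is hypothetical:

```python
def qualified_endpoint(func):
    # e.g. 'mypkg.services.create_user', which stays unique even if two
    # different modules define view functions with the same bare name
    return '{0}.{1}'.format(func.__module__, func.__name__)


def sample_view():
    pass


print(qualified_endpoint(sample_view))
```

Because the module path participates in the key, two `list_users` functions in different modules map to distinct endpoints, which is exactly what makes per-application view-function name uniqueness unnecessary.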
e17449cd3b1e8f0ec97e64781f6166fcb2ce11b6 | 922 | py | Python | indicators/tests/serializer_tests/__init__.py | Falliatcom-sa/falliatcom | 39fb926de072c296ed32d50cccfb8003ca870739 | [
"Apache-2.0"
] | null | null | null | indicators/tests/serializer_tests/__init__.py | Falliatcom-sa/falliatcom | 39fb926de072c296ed32d50cccfb8003ca870739 | [
"Apache-2.0"
] | 5 | 2021-02-08T20:42:48.000Z | 2022-03-12T00:19:38.000Z | indicators/tests/serializer_tests/__init__.py | Falliatcom-sa/falliatcom | 39fb926de072c296ed32d50cccfb8003ca870739 | [
"Apache-2.0"
from indicators.tests.serializer_tests.indicator_base_serializer import *
from indicators.tests.serializer_tests.indicator_with_measurement_serializer import *
from indicators.tests.serializer_tests.program_page_indicator_serializer import *
from indicators.tests.serializer_tests.program_base_serializer import *
from indicators.tests.serializer_tests.program_level_ordering_program_serializer import *
from indicators.tests.serializer_tests.rf_level_ordering_program_serializer import *
from indicators.tests.serializer_tests.program_page_program_serializer import *
from indicators.tests.serializer_tests.program_page_endpoint import *
from indicators.tests.serializer_tests.iptt_qs_program_serializer import *
from indicators.tests.serializer_tests.iptt_indicator_serializer import *
from indicators.tests.serializer_tests.iptt_program_serializer import *
from indicators.tests.serializer_tests.iptt_endpoints import *
'''
import fpdf
from test.conftest import assert_pdf_equal
HERE = Path(__file__).resolve().parent
class MyFPDF(fpdf.FPDF, fpdf.HTMLMixin):
pass
def test_html_toc(tmp_path):
pdf = MyFPDF()
pdf.add_page()
pdf.write_html(
"""<h1>Document title</h1>
<br><br><br>
<u>Table of content:</u>
<br>
<toc></toc>
<section><h2>Subtitle 1</h2><br>
<section><h3>Subtitle 1.1</h3>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<section>
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<section><h3>Subtitle 1.2</h3><br>
Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat.
<section>
<section>
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<section><h2>Subtitle 2</h2><br>
<section><h3>Subtitle 2.1</h3><br>
Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur.
<section>
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<section><h3>Subtitle 2.2</h3><br>
Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.
<section>
<section>"""
)
assert_pdf_equal(pdf, HERE / "test_html_toc.pdf", tmp_path)
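Behind the `<toc></toc>` placeholder is a common two-pass pattern: headings are collected with their page numbers while the body is rendered, then the reserved placeholder is filled in. A minimal stand-alone simulation of the fill-in step (not fpdf's actual implementation):

```python
def build_toc(headings):
    """Render table-of-contents lines from (level, title, page) entries,
    indenting each entry by its heading level."""
    lines = []
    for level, title, page in headings:
        indent = "  " * (level - 1)  # h1 flush left, h2 one step in, etc.
        lines.append(f"{indent}{title} .... {page}")
    return "\n".join(lines)

toc = build_toc([(1, "Document title", 1), (2, "Subtitle 1", 2), (3, "Subtitle 1.1", 2)])
```

The `pages="2"` attribute used in the next test exists because the placeholder must reserve space before the page numbers are known; if the rendered TOC needs more pages than reserved, every collected page number would shift.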
def test_html_toc_2_pages(tmp_path):
pdf = MyFPDF()
pdf.add_page()
pdf.write_html(
"""<h1>Document title</h1>
<br><br><br>
<u>Table of content:</u>
<br>
<toc pages="2"></toc>
<h2>Subtitle 0</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 1</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 2</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 3</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 4</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 5</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 6</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 7</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 8</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 9</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 10</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 11</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 12</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 13</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 14</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 15</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 16</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 17</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 18</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 19</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 20</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 21</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 22</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 23</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 24</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 25</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 26</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 27</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 28</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 29</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 30</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 31</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 32</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 33</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 34</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 35</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 36</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 37</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 38</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 39</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 40</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 41</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 42</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 43</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 44</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 45</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 46</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 47</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 48</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 49</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 50</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 51</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 52</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 53</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
<h2>Subtitle 54</h2>
Lorem ipsum dolor sit amet, consectetur adipiscing elit,
sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
"""
)
assert_pdf_equal(pdf, HERE / "test_html_toc_2_pages.pdf", tmp_path)
'''
## Aliyun ROS OSS Construct Library
This module is part of the AliCloud ROS Cloud Development Kit (ROS CDK) project.
```python
import ros_cdk_oss as oss
```
'''
import abc
import builtins
import datetime
import enum
import typing
import jsii
import publication
import typing_extensions
from ._jsii import *
import ros_cdk_core
class Bucket(
ros_cdk_core.Resource,
metaclass=jsii.JSIIMeta,
jsii_type="@alicloud/ros-cdk-oss.Bucket",
):
'''A ROS resource type: ``ALIYUN::OSS::Bucket``.'''
def __init__(
self,
scope: ros_cdk_core.Construct,
id: builtins.str,
props: "BucketProps",
enable_resource_property_constraint: typing.Optional[builtins.bool] = None,
) -> None:
'''Create a new ``ALIYUN::OSS::Bucket``.
Param scope - scope in which this resource is defined
Param id - scoped id of the resource
Param props - resource properties
:param scope: -
:param id: -
:param props: -
:param enable_resource_property_constraint: -
'''
jsii.create(self.__class__, self, [scope, id, props, enable_resource_property_constraint])
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrDomainName")
def attr_domain_name(self) -> ros_cdk_core.IResolvable:
'''Attribute DomainName: The public DNS name of the specified bucket.'''
return typing.cast(ros_cdk_core.IResolvable, jsii.get(self, "attrDomainName"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrInternalDomainName")
def attr_internal_domain_name(self) -> ros_cdk_core.IResolvable:
'''Attribute InternalDomainName: The internal DNS name of the specified bucket.'''
return typing.cast(ros_cdk_core.IResolvable, jsii.get(self, "attrInternalDomainName"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrName")
def attr_name(self) -> ros_cdk_core.IResolvable:
'''Attribute Name: The name of Bucket.'''
return typing.cast(ros_cdk_core.IResolvable, jsii.get(self, "attrName"))
@jsii.data_type(
jsii_type="@alicloud/ros-cdk-oss.BucketProps",
jsii_struct_bases=[],
name_mapping={
"bucket_name": "bucketName",
"access_control": "accessControl",
"cors_configuration": "corsConfiguration",
"deletion_force": "deletionForce",
"lifecycle_configuration": "lifecycleConfiguration",
"logging_configuration": "loggingConfiguration",
"policy": "policy",
"referer_configuration": "refererConfiguration",
"server_side_encryption_configuration": "serverSideEncryptionConfiguration",
"storage_class": "storageClass",
"tags": "tags",
"website_configuration": "websiteConfiguration",
},
)
class BucketProps:
def __init__(
self,
*,
bucket_name: typing.Union[builtins.str, ros_cdk_core.IResolvable],
access_control: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]] = None,
cors_configuration: typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.CORSConfigurationProperty"]] = None,
deletion_force: typing.Optional[typing.Union[builtins.bool, ros_cdk_core.IResolvable]] = None,
lifecycle_configuration: typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.LifecycleConfigurationProperty"]] = None,
logging_configuration: typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.LoggingConfigurationProperty"]] = None,
policy: typing.Optional[typing.Union[ros_cdk_core.IResolvable, typing.Mapping[builtins.str, typing.Any]]] = None,
referer_configuration: typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.RefererConfigurationProperty"]] = None,
server_side_encryption_configuration: typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.ServerSideEncryptionConfigurationProperty"]] = None,
storage_class: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]] = None,
tags: typing.Optional[typing.Mapping[builtins.str, typing.Any]] = None,
website_configuration: typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.WebsiteConfigurationProperty"]] = None,
) -> None:
'''Properties for defining a ``ALIYUN::OSS::Bucket``.
:param bucket_name: Property bucketName: bucket name.
:param access_control: Property accessControl: The access control list.
:param cors_configuration: Property corsConfiguration: Rules that define cross-origin resource sharing of objects in this bucket.
:param deletion_force: Property deletionForce: Whether to forcibly delete the objects in the bucket. Default value is false.
:param lifecycle_configuration: Property lifecycleConfiguration: Rules that define how the OSS bucket manages objects during their lifetime.
:param logging_configuration: Property loggingConfiguration: Settings that define where logs are stored.
:param policy: Property policy: Bucket policy.
:param referer_configuration: Property refererConfiguration: undefined.
:param server_side_encryption_configuration: Property serverSideEncryptionConfiguration: Specifies the bucket used to store the server-side encryption rule.
:param storage_class: Property storageClass: Specifies the storage class of the bucket. Default is "Standard".
:param tags: Property tags: Bucket tags in k-v pairs format.
:param website_configuration: Property websiteConfiguration: The properties of website config.
'''
self._values: typing.Dict[str, typing.Any] = {
"bucket_name": bucket_name,
}
if access_control is not None:
self._values["access_control"] = access_control
if cors_configuration is not None:
self._values["cors_configuration"] = cors_configuration
if deletion_force is not None:
self._values["deletion_force"] = deletion_force
if lifecycle_configuration is not None:
self._values["lifecycle_configuration"] = lifecycle_configuration
if logging_configuration is not None:
self._values["logging_configuration"] = logging_configuration
if policy is not None:
self._values["policy"] = policy
if referer_configuration is not None:
self._values["referer_configuration"] = referer_configuration
if server_side_encryption_configuration is not None:
self._values["server_side_encryption_configuration"] = server_side_encryption_configuration
if storage_class is not None:
self._values["storage_class"] = storage_class
if tags is not None:
self._values["tags"] = tags
if website_configuration is not None:
self._values["website_configuration"] = website_configuration
@builtins.property
def bucket_name(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''Property bucketName: bucket name.'''
result = self._values.get("bucket_name")
assert result is not None, "Required property 'bucket_name' is missing"
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], result)
@builtins.property
def access_control(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''Property accessControl: The access control list.'''
result = self._values.get("access_control")
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], result)
@builtins.property
def cors_configuration(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.CORSConfigurationProperty"]]:
'''Property corsConfiguration: Rules that define cross-origin resource sharing of objects in this bucket.'''
result = self._values.get("cors_configuration")
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.CORSConfigurationProperty"]], result)
@builtins.property
def deletion_force(
self,
) -> typing.Optional[typing.Union[builtins.bool, ros_cdk_core.IResolvable]]:
'''Property deletionForce: Whether to forcibly delete the objects in the bucket.
Default value is false.
'''
result = self._values.get("deletion_force")
return typing.cast(typing.Optional[typing.Union[builtins.bool, ros_cdk_core.IResolvable]], result)
@builtins.property
def lifecycle_configuration(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.LifecycleConfigurationProperty"]]:
'''Property lifecycleConfiguration: Rules that define how the OSS bucket manages objects during their lifetime.'''
result = self._values.get("lifecycle_configuration")
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.LifecycleConfigurationProperty"]], result)
@builtins.property
def logging_configuration(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.LoggingConfigurationProperty"]]:
'''Property loggingConfiguration: Settings that define where logs are stored.'''
result = self._values.get("logging_configuration")
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.LoggingConfigurationProperty"]], result)
@builtins.property
def policy(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, typing.Mapping[builtins.str, typing.Any]]]:
'''Property policy: Bucket policy.'''
result = self._values.get("policy")
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, typing.Mapping[builtins.str, typing.Any]]], result)
@builtins.property
def referer_configuration(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.RefererConfigurationProperty"]]:
'''Property refererConfiguration: undefined.'''
result = self._values.get("referer_configuration")
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.RefererConfigurationProperty"]], result)
@builtins.property
def server_side_encryption_configuration(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.ServerSideEncryptionConfigurationProperty"]]:
'''Property serverSideEncryptionConfiguration: Specifies the bucket used to store the server-side encryption rule.'''
result = self._values.get("server_side_encryption_configuration")
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.ServerSideEncryptionConfigurationProperty"]], result)
@builtins.property
def storage_class(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''Property storageClass: Specifies the storage class of the bucket.
Default is "Standard".
'''
result = self._values.get("storage_class")
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], result)
@builtins.property
def tags(self) -> typing.Optional[typing.Mapping[builtins.str, typing.Any]]:
'''Property tags: Bucket tags in k-v pairs format.'''
result = self._values.get("tags")
return typing.cast(typing.Optional[typing.Mapping[builtins.str, typing.Any]], result)
@builtins.property
def website_configuration(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.WebsiteConfigurationProperty"]]:
'''Property websiteConfiguration: The properties of website config.'''
result = self._values.get("website_configuration")
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.WebsiteConfigurationProperty"]], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "BucketProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
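The `BucketProps` class above follows a recurring jsii value-object pattern: required fields go into a `_values` dict, optionals are stored only when set, and equality and `repr` are derived from that dict. A minimal stand-alone analogue of the pattern (names are illustrative, not part of this library):

```python
class MiniProps:
    """Minimal analogue of the BucketProps value-object pattern."""

    def __init__(self, bucket_name, storage_class=None):
        self._values = {"bucket_name": bucket_name}
        if storage_class is not None:  # optionals are stored only when set
            self._values["storage_class"] = storage_class

    @property
    def bucket_name(self):
        return self._values["bucket_name"]

    @property
    def storage_class(self):
        # Unset optionals read back as None, mirroring typing.Optional fields.
        return self._values.get("storage_class")

    def __eq__(self, rhs):
        return isinstance(rhs, self.__class__) and rhs._values == self._values

    def __repr__(self):
        return "MiniProps(%s)" % ", ".join(
            f"{k}={v!r}" for k, v in self._values.items()
        )
```

Because equality compares `_values`, two props objects built from the same inputs compare equal even though they are distinct instances.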
class RosBucket(
ros_cdk_core.RosResource,
metaclass=jsii.JSIIMeta,
jsii_type="@alicloud/ros-cdk-oss.RosBucket",
):
'''A ROS template type: ``ALIYUN::OSS::Bucket``.'''
def __init__(
self,
scope: ros_cdk_core.Construct,
id: builtins.str,
props: "RosBucketProps",
enable_resource_property_constraint: builtins.bool,
) -> None:
'''Create a new ``ALIYUN::OSS::Bucket``.
:param scope: - scope in which this resource is defined.
:param id: - scoped id of the resource.
:param props: - resource properties.
:param enable_resource_property_constraint: -
'''
jsii.create(self.__class__, self, [scope, id, props, enable_resource_property_constraint])
@jsii.member(jsii_name="renderProperties")
def _render_properties(
self,
props: typing.Mapping[builtins.str, typing.Any],
) -> typing.Mapping[builtins.str, typing.Any]:
'''
:param props: -
'''
return typing.cast(typing.Mapping[builtins.str, typing.Any], jsii.invoke(self, "renderProperties", [props]))
@jsii.python.classproperty # type: ignore[misc]
@jsii.member(jsii_name="ROS_RESOURCE_TYPE_NAME")
def ROS_RESOURCE_TYPE_NAME(cls) -> builtins.str:
'''The resource type name for this resource class.'''
return typing.cast(builtins.str, jsii.sget(cls, "ROS_RESOURCE_TYPE_NAME"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrDomainName")
def attr_domain_name(self) -> ros_cdk_core.IResolvable:
'''
:Attribute: DomainName: The public DNS name of the specified bucket.
'''
return typing.cast(ros_cdk_core.IResolvable, jsii.get(self, "attrDomainName"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrInternalDomainName")
def attr_internal_domain_name(self) -> ros_cdk_core.IResolvable:
'''
:Attribute: InternalDomainName: The internal DNS name of the specified bucket.
'''
return typing.cast(ros_cdk_core.IResolvable, jsii.get(self, "attrInternalDomainName"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="attrName")
def attr_name(self) -> ros_cdk_core.IResolvable:
'''
:Attribute: Name: The name of Bucket
'''
return typing.cast(ros_cdk_core.IResolvable, jsii.get(self, "attrName"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="rosProperties")
def _ros_properties(self) -> typing.Mapping[builtins.str, typing.Any]:
return typing.cast(typing.Mapping[builtins.str, typing.Any], jsii.get(self, "rosProperties"))
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="bucketName")
def bucket_name(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''
:Property: bucketName: bucket name.
'''
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], jsii.get(self, "bucketName"))
@bucket_name.setter
def bucket_name(
self,
value: typing.Union[builtins.str, ros_cdk_core.IResolvable],
) -> None:
jsii.set(self, "bucketName", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="enableResourcePropertyConstraint")
def enable_resource_property_constraint(self) -> builtins.bool:
return typing.cast(builtins.bool, jsii.get(self, "enableResourcePropertyConstraint"))
@enable_resource_property_constraint.setter
def enable_resource_property_constraint(self, value: builtins.bool) -> None:
jsii.set(self, "enableResourcePropertyConstraint", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="accessControl")
def access_control(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''
:Property: accessControl: The access control list.
'''
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], jsii.get(self, "accessControl"))
@access_control.setter
def access_control(
self,
value: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]],
) -> None:
jsii.set(self, "accessControl", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="corsConfiguration")
def cors_configuration(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.CORSConfigurationProperty"]]:
'''
:Property: corsConfiguration: Rules that define cross-origin resource sharing of objects in this bucket.
'''
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.CORSConfigurationProperty"]], jsii.get(self, "corsConfiguration"))
@cors_configuration.setter
def cors_configuration(
self,
value: typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.CORSConfigurationProperty"]],
) -> None:
jsii.set(self, "corsConfiguration", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="deletionForce")
def deletion_force(
self,
) -> typing.Optional[typing.Union[builtins.bool, ros_cdk_core.IResolvable]]:
'''
:Property: deletionForce: Whether to forcibly delete the objects in the bucket. Default value is false.
'''
return typing.cast(typing.Optional[typing.Union[builtins.bool, ros_cdk_core.IResolvable]], jsii.get(self, "deletionForce"))
@deletion_force.setter
def deletion_force(
self,
value: typing.Optional[typing.Union[builtins.bool, ros_cdk_core.IResolvable]],
) -> None:
jsii.set(self, "deletionForce", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="lifecycleConfiguration")
def lifecycle_configuration(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.LifecycleConfigurationProperty"]]:
'''
:Property: lifecycleConfiguration: Rules that define how the OSS bucket manages objects during their lifetime.
'''
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.LifecycleConfigurationProperty"]], jsii.get(self, "lifecycleConfiguration"))
@lifecycle_configuration.setter
def lifecycle_configuration(
self,
value: typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.LifecycleConfigurationProperty"]],
) -> None:
jsii.set(self, "lifecycleConfiguration", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="loggingConfiguration")
def logging_configuration(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.LoggingConfigurationProperty"]]:
'''
:Property: loggingConfiguration: Settings that define where logs are stored.
'''
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.LoggingConfigurationProperty"]], jsii.get(self, "loggingConfiguration"))
@logging_configuration.setter
def logging_configuration(
self,
value: typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.LoggingConfigurationProperty"]],
) -> None:
jsii.set(self, "loggingConfiguration", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="policy")
def policy(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, typing.Mapping[builtins.str, typing.Any]]]:
'''
:Property: policy: Bucket policy
'''
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, typing.Mapping[builtins.str, typing.Any]]], jsii.get(self, "policy"))
@policy.setter
def policy(
self,
value: typing.Optional[typing.Union[ros_cdk_core.IResolvable, typing.Mapping[builtins.str, typing.Any]]],
) -> None:
jsii.set(self, "policy", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="refererConfiguration")
def referer_configuration(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.RefererConfigurationProperty"]]:
'''
:Property: refererConfiguration: undefined
'''
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.RefererConfigurationProperty"]], jsii.get(self, "refererConfiguration"))
@referer_configuration.setter
def referer_configuration(
self,
value: typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.RefererConfigurationProperty"]],
) -> None:
jsii.set(self, "refererConfiguration", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="serverSideEncryptionConfiguration")
def server_side_encryption_configuration(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.ServerSideEncryptionConfigurationProperty"]]:
'''
:Property: serverSideEncryptionConfiguration: Specifies the server-side encryption rule applied to the bucket.
'''
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.ServerSideEncryptionConfigurationProperty"]], jsii.get(self, "serverSideEncryptionConfiguration"))
@server_side_encryption_configuration.setter
def server_side_encryption_configuration(
self,
value: typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.ServerSideEncryptionConfigurationProperty"]],
) -> None:
jsii.set(self, "serverSideEncryptionConfiguration", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="storageClass")
def storage_class(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''
:Property: storageClass: Specifies the storage class of the bucket. Default is "Standard".
'''
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], jsii.get(self, "storageClass"))
@storage_class.setter
def storage_class(
self,
value: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]],
) -> None:
jsii.set(self, "storageClass", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="tags")
def tags(self) -> typing.Optional[typing.Mapping[builtins.str, typing.Any]]:
'''
:Property: tags: Bucket tags, as key-value pairs.
'''
return typing.cast(typing.Optional[typing.Mapping[builtins.str, typing.Any]], jsii.get(self, "tags"))
@tags.setter
def tags(
self,
value: typing.Optional[typing.Mapping[builtins.str, typing.Any]],
) -> None:
jsii.set(self, "tags", value)
@builtins.property # type: ignore[misc]
@jsii.member(jsii_name="websiteConfiguration")
def website_configuration(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.WebsiteConfigurationProperty"]]:
'''
:Property: websiteConfiguration: The static website configuration of the bucket.
'''
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.WebsiteConfigurationProperty"]], jsii.get(self, "websiteConfiguration"))
@website_configuration.setter
def website_configuration(
self,
value: typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.WebsiteConfigurationProperty"]],
) -> None:
jsii.set(self, "websiteConfiguration", value)
@jsii.data_type(
jsii_type="@alicloud/ros-cdk-oss.RosBucket.AbortMultipartUploadProperty",
jsii_struct_bases=[],
name_mapping={"created_before_date": "createdBeforeDate", "days": "days"},
)
class AbortMultipartUploadProperty:
def __init__(
self,
*,
created_before_date: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]] = None,
days: typing.Optional[typing.Union[jsii.Number, ros_cdk_core.IResolvable]] = None,
) -> None:
'''
:param created_before_date:
:param days:
'''
self._values: typing.Dict[str, typing.Any] = {}
if created_before_date is not None:
self._values["created_before_date"] = created_before_date
if days is not None:
self._values["days"] = days
@builtins.property
def created_before_date(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''
:Property: createdBeforeDate: undefined
'''
result = self._values.get("created_before_date")
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], result)
@builtins.property
def days(
self,
) -> typing.Optional[typing.Union[jsii.Number, ros_cdk_core.IResolvable]]:
'''
:Property: days: undefined
'''
result = self._values.get("days")
return typing.cast(typing.Optional[typing.Union[jsii.Number, ros_cdk_core.IResolvable]], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "AbortMultipartUploadProperty(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
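Every ``*Property`` class in this module follows the same generated value-object pattern: keyword-only fields are recorded in a ``_values`` dict only when supplied, and equality compares that dict, so an omitted field and an explicit ``None`` compare equal. A minimal standalone sketch of the pattern (the class name is ours, not part of the bindings):

```python
# Standalone sketch of the generated struct pattern; not part of the
# jsii bindings. Optional fields are stored only when supplied, so an
# omitted field and an explicit None produce equal structs.
class StructSketch:
    def __init__(self, *, created_before_date=None, days=None):
        self._values = {}
        if created_before_date is not None:
            self._values["created_before_date"] = created_before_date
        if days is not None:
            self._values["days"] = days

    def __eq__(self, rhs):
        return isinstance(rhs, self.__class__) and rhs._values == self._values

    def __ne__(self, rhs):
        return not (rhs == self)
```

For example, ``StructSketch(days=30) == StructSketch(days=30)`` holds, while ``StructSketch()`` carries no keys at all rather than explicit ``None`` values.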
@jsii.data_type(
jsii_type="@alicloud/ros-cdk-oss.RosBucket.CORSConfigurationProperty",
jsii_struct_bases=[],
name_mapping={"cors_rule": "corsRule"},
)
class CORSConfigurationProperty:
def __init__(
self,
*,
cors_rule: typing.Optional[typing.Union[ros_cdk_core.IResolvable, typing.Sequence[typing.Union[ros_cdk_core.IResolvable, "RosBucket.CORSRuleProperty"]]]] = None,
) -> None:
'''
:param cors_rule:
'''
self._values: typing.Dict[str, typing.Any] = {}
if cors_rule is not None:
self._values["cors_rule"] = cors_rule
@builtins.property
def cors_rule(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, typing.List[typing.Union[ros_cdk_core.IResolvable, "RosBucket.CORSRuleProperty"]]]]:
'''
:Property: corsRule: A set of origins and methods that you allow.
'''
result = self._values.get("cors_rule")
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, typing.List[typing.Union[ros_cdk_core.IResolvable, "RosBucket.CORSRuleProperty"]]]], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "CORSConfigurationProperty(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
@jsii.data_type(
jsii_type="@alicloud/ros-cdk-oss.RosBucket.CORSRuleProperty",
jsii_struct_bases=[],
name_mapping={
"allowed_header": "allowedHeader",
"allowed_method": "allowedMethod",
"allowed_origin": "allowedOrigin",
"expose_header": "exposeHeader",
"max_age_seconds": "maxAgeSeconds",
},
)
class CORSRuleProperty:
def __init__(
self,
*,
allowed_header: typing.Optional[typing.Union[ros_cdk_core.IResolvable, typing.Sequence[typing.Union[builtins.str, ros_cdk_core.IResolvable]]]] = None,
allowed_method: typing.Optional[typing.Union[ros_cdk_core.IResolvable, typing.Sequence[typing.Union[builtins.str, ros_cdk_core.IResolvable]]]] = None,
allowed_origin: typing.Optional[typing.Union[ros_cdk_core.IResolvable, typing.Sequence[typing.Union[builtins.str, ros_cdk_core.IResolvable]]]] = None,
expose_header: typing.Optional[typing.Union[ros_cdk_core.IResolvable, typing.Sequence[typing.Union[builtins.str, ros_cdk_core.IResolvable]]]] = None,
max_age_seconds: typing.Optional[typing.Union[jsii.Number, ros_cdk_core.IResolvable]] = None,
) -> None:
'''
:param allowed_header:
:param allowed_method:
:param allowed_origin:
:param expose_header:
:param max_age_seconds:
'''
self._values: typing.Dict[str, typing.Any] = {}
if allowed_header is not None:
self._values["allowed_header"] = allowed_header
if allowed_method is not None:
self._values["allowed_method"] = allowed_method
if allowed_origin is not None:
self._values["allowed_origin"] = allowed_origin
if expose_header is not None:
self._values["expose_header"] = expose_header
if max_age_seconds is not None:
self._values["max_age_seconds"] = max_age_seconds
@builtins.property
def allowed_header(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, typing.List[typing.Union[builtins.str, ros_cdk_core.IResolvable]]]]:
'''
:Property: allowedHeader: undefined
'''
result = self._values.get("allowed_header")
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, typing.List[typing.Union[builtins.str, ros_cdk_core.IResolvable]]]], result)
@builtins.property
def allowed_method(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, typing.List[typing.Union[builtins.str, ros_cdk_core.IResolvable]]]]:
'''
:Property: allowedMethod: undefined
'''
result = self._values.get("allowed_method")
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, typing.List[typing.Union[builtins.str, ros_cdk_core.IResolvable]]]], result)
@builtins.property
def allowed_origin(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, typing.List[typing.Union[builtins.str, ros_cdk_core.IResolvable]]]]:
'''
:Property: allowedOrigin: undefined
'''
result = self._values.get("allowed_origin")
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, typing.List[typing.Union[builtins.str, ros_cdk_core.IResolvable]]]], result)
@builtins.property
def expose_header(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, typing.List[typing.Union[builtins.str, ros_cdk_core.IResolvable]]]]:
'''
:Property: exposeHeader: undefined
'''
result = self._values.get("expose_header")
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, typing.List[typing.Union[builtins.str, ros_cdk_core.IResolvable]]]], result)
@builtins.property
def max_age_seconds(
self,
) -> typing.Optional[typing.Union[jsii.Number, ros_cdk_core.IResolvable]]:
'''
:Property: maxAgeSeconds: undefined
'''
result = self._values.get("max_age_seconds")
return typing.cast(typing.Optional[typing.Union[jsii.Number, ros_cdk_core.IResolvable]], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "CORSRuleProperty(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
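The ``name_mapping`` dicts above pair each snake_case Python keyword with the camelCase key used in the rendered template (``allowed_header`` to ``allowedHeader``, and so on). The relationship is purely mechanical, as this hypothetical helper (ours, not part of the bindings) illustrates:

```python
# Hypothetical helper mirroring the name_mapping entries above: the
# camelCase key is the snake_case name with each underscore removed
# and the following word capitalized.
def to_camel(snake: str) -> str:
    head, *rest = snake.split("_")
    return head + "".join(part.capitalize() for part in rest)
```

For instance, ``to_camel("max_age_seconds")`` yields ``"maxAgeSeconds"``, matching the mapping declared on ``CORSRuleProperty``.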
@jsii.data_type(
jsii_type="@alicloud/ros-cdk-oss.RosBucket.ExpirationProperty",
jsii_struct_bases=[],
name_mapping={"created_before_date": "createdBeforeDate", "days": "days"},
)
class ExpirationProperty:
def __init__(
self,
*,
created_before_date: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]] = None,
days: typing.Optional[typing.Union[jsii.Number, ros_cdk_core.IResolvable]] = None,
) -> None:
'''
:param created_before_date:
:param days:
'''
self._values: typing.Dict[str, typing.Any] = {}
if created_before_date is not None:
self._values["created_before_date"] = created_before_date
if days is not None:
self._values["days"] = days
@builtins.property
def created_before_date(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''
:Property: createdBeforeDate: undefined
'''
result = self._values.get("created_before_date")
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], result)
@builtins.property
def days(
self,
) -> typing.Optional[typing.Union[jsii.Number, ros_cdk_core.IResolvable]]:
'''
:Property: days: undefined
'''
result = self._values.get("days")
return typing.cast(typing.Optional[typing.Union[jsii.Number, ros_cdk_core.IResolvable]], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "ExpirationProperty(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
@jsii.data_type(
jsii_type="@alicloud/ros-cdk-oss.RosBucket.LifecycleConfigurationProperty",
jsii_struct_bases=[],
name_mapping={"rule": "rule"},
)
class LifecycleConfigurationProperty:
def __init__(
self,
*,
rule: typing.Union[ros_cdk_core.IResolvable, typing.Sequence[typing.Union[ros_cdk_core.IResolvable, "RosBucket.RuleProperty"]]],
) -> None:
'''
:param rule:
'''
self._values: typing.Dict[str, typing.Any] = {
"rule": rule,
}
@builtins.property
def rule(
self,
) -> typing.Union[ros_cdk_core.IResolvable, typing.List[typing.Union[ros_cdk_core.IResolvable, "RosBucket.RuleProperty"]]]:
'''
:Property: rule: The lifecycle rules that make up the bucket's lifecycle configuration.
'''
result = self._values.get("rule")
assert result is not None, "Required property 'rule' is missing"
return typing.cast(typing.Union[ros_cdk_core.IResolvable, typing.List[typing.Union[ros_cdk_core.IResolvable, "RosBucket.RuleProperty"]]], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "LifecycleConfigurationProperty(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
@jsii.data_type(
jsii_type="@alicloud/ros-cdk-oss.RosBucket.LoggingConfigurationProperty",
jsii_struct_bases=[],
name_mapping={
"target_bucket": "targetBucket",
"target_prefix": "targetPrefix",
},
)
class LoggingConfigurationProperty:
def __init__(
self,
*,
target_bucket: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]] = None,
target_prefix: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]] = None,
) -> None:
'''
:param target_bucket:
:param target_prefix:
'''
self._values: typing.Dict[str, typing.Any] = {}
if target_bucket is not None:
self._values["target_bucket"] = target_bucket
if target_prefix is not None:
self._values["target_prefix"] = target_prefix
@builtins.property
def target_bucket(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''
:Property: targetBucket: Specifies the bucket where you want Aliyun OSS to store server access logs.
'''
result = self._values.get("target_bucket")
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], result)
@builtins.property
def target_prefix(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''
:Property: targetPrefix: Specifies a key prefix for the log objects written to the target bucket.
'''
result = self._values.get("target_prefix")
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "LoggingConfigurationProperty(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
@jsii.data_type(
jsii_type="@alicloud/ros-cdk-oss.RosBucket.RefererConfigurationProperty",
jsii_struct_bases=[],
name_mapping={
"allow_empty_referer": "allowEmptyReferer",
"referer_list": "refererList",
},
)
class RefererConfigurationProperty:
def __init__(
self,
*,
allow_empty_referer: typing.Optional[typing.Union[builtins.bool, ros_cdk_core.IResolvable]] = None,
referer_list: typing.Optional[typing.Union[typing.Sequence[typing.Any], ros_cdk_core.IResolvable]] = None,
) -> None:
'''
:param allow_empty_referer:
:param referer_list:
'''
self._values: typing.Dict[str, typing.Any] = {}
if allow_empty_referer is not None:
self._values["allow_empty_referer"] = allow_empty_referer
if referer_list is not None:
self._values["referer_list"] = referer_list
@builtins.property
def allow_empty_referer(
self,
) -> typing.Optional[typing.Union[builtins.bool, ros_cdk_core.IResolvable]]:
'''
:Property: allowEmptyReferer: undefined
'''
result = self._values.get("allow_empty_referer")
return typing.cast(typing.Optional[typing.Union[builtins.bool, ros_cdk_core.IResolvable]], result)
@builtins.property
def referer_list(
self,
) -> typing.Optional[typing.Union[typing.List[typing.Any], ros_cdk_core.IResolvable]]:
'''
:Property: refererList: undefined
'''
result = self._values.get("referer_list")
return typing.cast(typing.Optional[typing.Union[typing.List[typing.Any], ros_cdk_core.IResolvable]], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "RefererConfigurationProperty(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
@jsii.data_type(
jsii_type="@alicloud/ros-cdk-oss.RosBucket.RuleProperty",
jsii_struct_bases=[],
name_mapping={
"prefix": "prefix",
"abort_multipart_upload": "abortMultipartUpload",
"expiration": "expiration",
"id": "id",
"status": "status",
},
)
class RuleProperty:
def __init__(
self,
*,
prefix: typing.Union[builtins.str, ros_cdk_core.IResolvable],
abort_multipart_upload: typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.AbortMultipartUploadProperty"]] = None,
expiration: typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.ExpirationProperty"]] = None,
id: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]] = None,
status: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]] = None,
) -> None:
'''
:param prefix:
:param abort_multipart_upload:
:param expiration:
:param id:
:param status:
'''
self._values: typing.Dict[str, typing.Any] = {
"prefix": prefix,
}
if abort_multipart_upload is not None:
self._values["abort_multipart_upload"] = abort_multipart_upload
if expiration is not None:
self._values["expiration"] = expiration
if id is not None:
self._values["id"] = id
if status is not None:
self._values["status"] = status
@builtins.property
def prefix(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''
:Property: prefix: undefined
'''
result = self._values.get("prefix")
assert result is not None, "Required property 'prefix' is missing"
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], result)
@builtins.property
def abort_multipart_upload(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.AbortMultipartUploadProperty"]]:
'''
:Property: abortMultipartUpload: undefined
'''
result = self._values.get("abort_multipart_upload")
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.AbortMultipartUploadProperty"]], result)
@builtins.property
def expiration(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.ExpirationProperty"]]:
'''
:Property: expiration: undefined
'''
result = self._values.get("expiration")
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, "RosBucket.ExpirationProperty"]], result)
@builtins.property
def id(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''
:Property: id: undefined
'''
result = self._values.get("id")
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], result)
@builtins.property
def status(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''
:Property: status: undefined
'''
result = self._values.get("status")
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "RuleProperty(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
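``RuleProperty`` makes ``prefix`` the only required field; ``expiration``, ``abort_multipart_upload``, ``id``, and ``status`` are optional and omitted when not set. A plain-dict sketch of that required/optional split (the helper name and dict shape are ours, purely illustrative):

```python
# Illustrative only: a plain-dict shape mirroring RuleProperty's
# required/optional split. prefix is mandatory; optional fields are
# added to the dict only when provided.
def make_rule(prefix, *, expiration_days=None, status=None):
    rule = {"prefix": prefix}
    if expiration_days is not None:
        rule["expiration"] = {"days": expiration_days}
    if status is not None:
        rule["status"] = status
    return rule
```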
@jsii.data_type(
jsii_type="@alicloud/ros-cdk-oss.RosBucket.ServerSideEncryptionConfigurationProperty",
jsii_struct_bases=[],
name_mapping={
"sse_algorithm": "sseAlgorithm",
"kms_master_key_id": "kmsMasterKeyId",
},
)
class ServerSideEncryptionConfigurationProperty:
def __init__(
self,
*,
sse_algorithm: typing.Union[builtins.str, ros_cdk_core.IResolvable],
kms_master_key_id: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]] = None,
) -> None:
'''
:param sse_algorithm:
:param kms_master_key_id:
'''
self._values: typing.Dict[str, typing.Any] = {
"sse_algorithm": sse_algorithm,
}
if kms_master_key_id is not None:
self._values["kms_master_key_id"] = kms_master_key_id
@builtins.property
def sse_algorithm(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''
:Property: sseAlgorithm: Specifies the default server-side encryption method.
'''
result = self._values.get("sse_algorithm")
assert result is not None, "Required property 'sse_algorithm' is missing"
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], result)
@builtins.property
def kms_master_key_id(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''
:Property: kmsMasterKeyId: Specifies the CMK ID when SSEAlgorithm is set to KMS and a specific CMK is used for encryption. If SSEAlgorithm is not KMS, this element must be null.
'''
result = self._values.get("kms_master_key_id")
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "ServerSideEncryptionConfigurationProperty(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
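The ``kmsMasterKeyId`` docstring above states a cross-field constraint: a CMK ID is only meaningful when the algorithm is ``KMS``. A small validator sketching that rule (the function is ours, not part of the bindings, and only enforces the documented constraint):

```python
# Sketch of the constraint documented on kmsMasterKeyId: a CMK ID may
# only accompany the "KMS" algorithm. Helper name is ours.
def validate_sse(sse_algorithm, kms_master_key_id=None):
    if kms_master_key_id is not None and sse_algorithm != "KMS":
        raise ValueError("kms_master_key_id requires sse_algorithm == 'KMS'")
    cfg = {"sse_algorithm": sse_algorithm}
    if kms_master_key_id is not None:
        cfg["kms_master_key_id"] = kms_master_key_id
    return cfg
```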
@jsii.data_type(
jsii_type="@alicloud/ros-cdk-oss.RosBucket.WebsiteConfigurationProperty",
jsii_struct_bases=[],
name_mapping={
"error_document": "errorDocument",
"index_document": "indexDocument",
},
)
class WebsiteConfigurationProperty:
def __init__(
self,
*,
error_document: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]] = None,
index_document: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]] = None,
) -> None:
'''
:param error_document:
:param index_document:
'''
self._values: typing.Dict[str, typing.Any] = {}
if error_document is not None:
self._values["error_document"] = error_document
if index_document is not None:
self._values["index_document"] = index_document
@builtins.property
def error_document(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''
:Property: errorDocument: The default error page.
'''
result = self._values.get("error_document")
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], result)
@builtins.property
def index_document(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''
:Property: indexDocument: The default home page.
'''
result = self._values.get("index_document")
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "WebsiteConfigurationProperty(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
@jsii.data_type(
jsii_type="@alicloud/ros-cdk-oss.RosBucketProps",
jsii_struct_bases=[],
name_mapping={
"bucket_name": "bucketName",
"access_control": "accessControl",
"cors_configuration": "corsConfiguration",
"deletion_force": "deletionForce",
"lifecycle_configuration": "lifecycleConfiguration",
"logging_configuration": "loggingConfiguration",
"policy": "policy",
"referer_configuration": "refererConfiguration",
"server_side_encryption_configuration": "serverSideEncryptionConfiguration",
"storage_class": "storageClass",
"tags": "tags",
"website_configuration": "websiteConfiguration",
},
)
class RosBucketProps:
def __init__(
self,
*,
bucket_name: typing.Union[builtins.str, ros_cdk_core.IResolvable],
access_control: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]] = None,
cors_configuration: typing.Optional[typing.Union[ros_cdk_core.IResolvable, RosBucket.CORSConfigurationProperty]] = None,
deletion_force: typing.Optional[typing.Union[builtins.bool, ros_cdk_core.IResolvable]] = None,
lifecycle_configuration: typing.Optional[typing.Union[ros_cdk_core.IResolvable, RosBucket.LifecycleConfigurationProperty]] = None,
logging_configuration: typing.Optional[typing.Union[ros_cdk_core.IResolvable, RosBucket.LoggingConfigurationProperty]] = None,
policy: typing.Optional[typing.Union[ros_cdk_core.IResolvable, typing.Mapping[builtins.str, typing.Any]]] = None,
referer_configuration: typing.Optional[typing.Union[ros_cdk_core.IResolvable, RosBucket.RefererConfigurationProperty]] = None,
server_side_encryption_configuration: typing.Optional[typing.Union[ros_cdk_core.IResolvable, RosBucket.ServerSideEncryptionConfigurationProperty]] = None,
storage_class: typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]] = None,
tags: typing.Optional[typing.Mapping[builtins.str, typing.Any]] = None,
website_configuration: typing.Optional[typing.Union[ros_cdk_core.IResolvable, RosBucket.WebsiteConfigurationProperty]] = None,
) -> None:
'''Properties for defining an ``ALIYUN::OSS::Bucket``.

:param bucket_name:
:param access_control:
:param cors_configuration:
:param deletion_force:
:param lifecycle_configuration:
:param logging_configuration:
:param policy:
:param referer_configuration:
:param server_side_encryption_configuration:
:param storage_class:
:param tags:
:param website_configuration:
'''
self._values: typing.Dict[str, typing.Any] = {
"bucket_name": bucket_name,
}
if access_control is not None:
self._values["access_control"] = access_control
if cors_configuration is not None:
self._values["cors_configuration"] = cors_configuration
if deletion_force is not None:
self._values["deletion_force"] = deletion_force
if lifecycle_configuration is not None:
self._values["lifecycle_configuration"] = lifecycle_configuration
if logging_configuration is not None:
self._values["logging_configuration"] = logging_configuration
if policy is not None:
self._values["policy"] = policy
if referer_configuration is not None:
self._values["referer_configuration"] = referer_configuration
if server_side_encryption_configuration is not None:
self._values["server_side_encryption_configuration"] = server_side_encryption_configuration
if storage_class is not None:
self._values["storage_class"] = storage_class
if tags is not None:
self._values["tags"] = tags
if website_configuration is not None:
self._values["website_configuration"] = website_configuration
@builtins.property
def bucket_name(self) -> typing.Union[builtins.str, ros_cdk_core.IResolvable]:
'''
:Property: bucketName: The name of the bucket.
'''
result = self._values.get("bucket_name")
assert result is not None, "Required property 'bucket_name' is missing"
return typing.cast(typing.Union[builtins.str, ros_cdk_core.IResolvable], result)
@builtins.property
def access_control(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''
:Property: accessControl: The access control list.
'''
result = self._values.get("access_control")
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], result)
@builtins.property
def cors_configuration(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, RosBucket.CORSConfigurationProperty]]:
'''
:Property: corsConfiguration: Rules that define cross-origin resource sharing of objects in this bucket.
'''
result = self._values.get("cors_configuration")
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, RosBucket.CORSConfigurationProperty]], result)
@builtins.property
def deletion_force(
self,
) -> typing.Optional[typing.Union[builtins.bool, ros_cdk_core.IResolvable]]:
'''
:Property: deletionForce: Whether to forcibly delete the objects in the bucket when the bucket itself is deleted. Default value is false.
'''
result = self._values.get("deletion_force")
return typing.cast(typing.Optional[typing.Union[builtins.bool, ros_cdk_core.IResolvable]], result)
@builtins.property
def lifecycle_configuration(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, RosBucket.LifecycleConfigurationProperty]]:
'''
:Property: lifecycleConfiguration: Rules that define how the OSS bucket manages objects during their lifetime.
'''
result = self._values.get("lifecycle_configuration")
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, RosBucket.LifecycleConfigurationProperty]], result)
@builtins.property
def logging_configuration(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, RosBucket.LoggingConfigurationProperty]]:
'''
:Property: loggingConfiguration: Settings that define where access logs are stored.
'''
result = self._values.get("logging_configuration")
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, RosBucket.LoggingConfigurationProperty]], result)
@builtins.property
def policy(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, typing.Mapping[builtins.str, typing.Any]]]:
'''
:Property: policy: The bucket policy.
'''
result = self._values.get("policy")
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, typing.Mapping[builtins.str, typing.Any]]], result)
@builtins.property
def referer_configuration(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, RosBucket.RefererConfigurationProperty]]:
'''
:Property: refererConfiguration: undefined
'''
result = self._values.get("referer_configuration")
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, RosBucket.RefererConfigurationProperty]], result)
@builtins.property
def server_side_encryption_configuration(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, RosBucket.ServerSideEncryptionConfigurationProperty]]:
'''
:Property: serverSideEncryptionConfiguration: Specifies the server-side encryption rule applied to the bucket.
'''
result = self._values.get("server_side_encryption_configuration")
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, RosBucket.ServerSideEncryptionConfigurationProperty]], result)
@builtins.property
def storage_class(
self,
) -> typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]]:
'''
:Property: storageClass: Specifies the storage class of the bucket. Default is "Standard".
'''
result = self._values.get("storage_class")
return typing.cast(typing.Optional[typing.Union[builtins.str, ros_cdk_core.IResolvable]], result)
@builtins.property
def tags(self) -> typing.Optional[typing.Mapping[builtins.str, typing.Any]]:
'''
:Property: tags: Bucket tags, as key-value pairs.
'''
result = self._values.get("tags")
return typing.cast(typing.Optional[typing.Mapping[builtins.str, typing.Any]], result)
@builtins.property
def website_configuration(
self,
) -> typing.Optional[typing.Union[ros_cdk_core.IResolvable, RosBucket.WebsiteConfigurationProperty]]:
'''
:Property: websiteConfiguration: The static website configuration of the bucket.
'''
result = self._values.get("website_configuration")
return typing.cast(typing.Optional[typing.Union[ros_cdk_core.IResolvable, RosBucket.WebsiteConfigurationProperty]], result)
def __eq__(self, rhs: typing.Any) -> builtins.bool:
return isinstance(rhs, self.__class__) and rhs._values == self._values
def __ne__(self, rhs: typing.Any) -> builtins.bool:
return not (rhs == self)
def __repr__(self) -> str:
return "RosBucketProps(%s)" % ", ".join(
k + "=" + repr(v) for k, v in self._values.items()
)
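Each struct's ``__repr__`` renders the stored snake_case keys and their ``repr``'d values. A standalone illustration of that output shape (the function is ours, extracted for demonstration only):

```python
# Standalone illustration of the __repr__ pattern used by every struct
# in this module: render the stored keys and repr'd values in order.
def struct_repr(name, values):
    return "%s(%s)" % (name, ", ".join(k + "=" + repr(v) for k, v in values.items()))
```

So a props object holding only ``bucket_name`` renders as ``RosBucketProps(bucket_name='my-bucket')``.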
__all__ = [
"Bucket",
"BucketProps",
"RosBucket",
"RosBucketProps",
]
publication.publish()
] | 5 | 2021-03-07T15:27:15.000Z | 2022-01-17T17:26:12.000Z | def simulator_port(vehicle_id):
    return 55000 + vehicle_id


def moos_port(vehicle_id):
return 9000 + vehicle_id
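The two helpers above derive per-vehicle port numbers by adding the vehicle id to fixed base ports. A quick sanity check that the two port ranges stay disjoint for a realistic fleet size (the fleet size here is an arbitrary example):

```python
def simulator_port(vehicle_id):
    # simulator listens at 55000 + vehicle id
    return 55000 + vehicle_id


def moos_port(vehicle_id):
    # MOOS community listens at 9000 + vehicle id
    return 9000 + vehicle_id


# the two ranges never collide for any fleet small enough
fleet = range(1, 33)
all_ports = [simulator_port(v) for v in fleet] + [moos_port(v) for v in fleet]
assert len(set(all_ports)) == len(all_ports)
```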
# File: sdk/python/pulumi_azure/analysisservices/server.py
# Repo: henriktao/pulumi-azure (licenses: ECL-2.0, Apache-2.0)

# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
from . import outputs
from ._inputs import *
__all__ = ['ServerArgs', 'Server']
@pulumi.input_type
class ServerArgs:
def __init__(__self__, *,
resource_group_name: pulumi.Input[str],
sku: pulumi.Input[str],
admin_users: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
backup_blob_container_uri: Optional[pulumi.Input[str]] = None,
enable_power_bi_service: Optional[pulumi.Input[bool]] = None,
ipv4_firewall_rules: Optional[pulumi.Input[Sequence[pulumi.Input['ServerIpv4FirewallRuleArgs']]]] = None,
location: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
querypool_connection_mode: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None):
"""
The set of arguments for constructing a Server resource.
:param pulumi.Input[str] resource_group_name: The name of the Resource Group in which the Analysis Services Server should be exist. Changing this forces a new resource to be created.
:param pulumi.Input[str] sku: SKU for the Analysis Services Server. Possible values are: `D1`, `B1`, `B2`, `S0`, `S1`, `S2`, `S4`, `S8`, `S9`, `S8v2` and `S9v2`.
:param pulumi.Input[Sequence[pulumi.Input[str]]] admin_users: List of email addresses of admin users.
:param pulumi.Input[str] backup_blob_container_uri: URI and SAS token for a blob container to store backups.
:param pulumi.Input[bool] enable_power_bi_service: Indicates if the Power BI service is allowed to access or not.
:param pulumi.Input[Sequence[pulumi.Input['ServerIpv4FirewallRuleArgs']]] ipv4_firewall_rules: One or more `ipv4_firewall_rule` block(s) as defined below.
:param pulumi.Input[str] location: The Azure location where the Analysis Services Server exists. Changing this forces a new resource to be created.
:param pulumi.Input[str] name: Specifies the name of the firewall rule.
:param pulumi.Input[str] querypool_connection_mode: Controls how the read-write server is used in the query pool. If this value is set to `All` then read-write servers are also used for queries. Otherwise with `ReadOnly` these servers do not participate in query operations.
"""
pulumi.set(__self__, "resource_group_name", resource_group_name)
pulumi.set(__self__, "sku", sku)
if admin_users is not None:
pulumi.set(__self__, "admin_users", admin_users)
if backup_blob_container_uri is not None:
pulumi.set(__self__, "backup_blob_container_uri", backup_blob_container_uri)
if enable_power_bi_service is not None:
pulumi.set(__self__, "enable_power_bi_service", enable_power_bi_service)
if ipv4_firewall_rules is not None:
pulumi.set(__self__, "ipv4_firewall_rules", ipv4_firewall_rules)
if location is not None:
pulumi.set(__self__, "location", location)
if name is not None:
pulumi.set(__self__, "name", name)
if querypool_connection_mode is not None:
pulumi.set(__self__, "querypool_connection_mode", querypool_connection_mode)
if tags is not None:
pulumi.set(__self__, "tags", tags)
@property
@pulumi.getter(name="resourceGroupName")
def resource_group_name(self) -> pulumi.Input[str]:
"""
The name of the Resource Group in which the Analysis Services Server should be exist. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "resource_group_name")
@resource_group_name.setter
def resource_group_name(self, value: pulumi.Input[str]):
pulumi.set(self, "resource_group_name", value)
@property
@pulumi.getter
def sku(self) -> pulumi.Input[str]:
"""
SKU for the Analysis Services Server. Possible values are: `D1`, `B1`, `B2`, `S0`, `S1`, `S2`, `S4`, `S8`, `S9`, `S8v2` and `S9v2`.
"""
return pulumi.get(self, "sku")
@sku.setter
def sku(self, value: pulumi.Input[str]):
pulumi.set(self, "sku", value)
@property
@pulumi.getter(name="adminUsers")
def admin_users(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
List of email addresses of admin users.
"""
return pulumi.get(self, "admin_users")
@admin_users.setter
def admin_users(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "admin_users", value)
@property
@pulumi.getter(name="backupBlobContainerUri")
def backup_blob_container_uri(self) -> Optional[pulumi.Input[str]]:
"""
URI and SAS token for a blob container to store backups.
"""
return pulumi.get(self, "backup_blob_container_uri")
@backup_blob_container_uri.setter
def backup_blob_container_uri(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "backup_blob_container_uri", value)
@property
@pulumi.getter(name="enablePowerBiService")
def enable_power_bi_service(self) -> Optional[pulumi.Input[bool]]:
"""
Indicates if the Power BI service is allowed to access or not.
"""
return pulumi.get(self, "enable_power_bi_service")
@enable_power_bi_service.setter
def enable_power_bi_service(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "enable_power_bi_service", value)
@property
@pulumi.getter(name="ipv4FirewallRules")
def ipv4_firewall_rules(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['ServerIpv4FirewallRuleArgs']]]]:
"""
One or more `ipv4_firewall_rule` block(s) as defined below.
"""
return pulumi.get(self, "ipv4_firewall_rules")
@ipv4_firewall_rules.setter
def ipv4_firewall_rules(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['ServerIpv4FirewallRuleArgs']]]]):
pulumi.set(self, "ipv4_firewall_rules", value)
@property
@pulumi.getter
def location(self) -> Optional[pulumi.Input[str]]:
"""
The Azure location where the Analysis Services Server exists. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "location")
@location.setter
def location(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "location", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the name of the firewall rule.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="querypoolConnectionMode")
def querypool_connection_mode(self) -> Optional[pulumi.Input[str]]:
"""
Controls how the read-write server is used in the query pool. If this value is set to `All` then read-write servers are also used for queries. Otherwise with `ReadOnly` these servers do not participate in query operations.
"""
return pulumi.get(self, "querypool_connection_mode")
@querypool_connection_mode.setter
def querypool_connection_mode(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "querypool_connection_mode", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
@pulumi.input_type
class _ServerState:
def __init__(__self__, *,
admin_users: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
backup_blob_container_uri: Optional[pulumi.Input[str]] = None,
enable_power_bi_service: Optional[pulumi.Input[bool]] = None,
ipv4_firewall_rules: Optional[pulumi.Input[Sequence[pulumi.Input['ServerIpv4FirewallRuleArgs']]]] = None,
location: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
querypool_connection_mode: Optional[pulumi.Input[str]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
server_full_name: Optional[pulumi.Input[str]] = None,
sku: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None):
"""
Input properties used for looking up and filtering Server resources.
:param pulumi.Input[Sequence[pulumi.Input[str]]] admin_users: List of email addresses of admin users.
:param pulumi.Input[str] backup_blob_container_uri: URI and SAS token for a blob container to store backups.
:param pulumi.Input[bool] enable_power_bi_service: Indicates if the Power BI service is allowed to access or not.
:param pulumi.Input[Sequence[pulumi.Input['ServerIpv4FirewallRuleArgs']]] ipv4_firewall_rules: One or more `ipv4_firewall_rule` block(s) as defined below.
:param pulumi.Input[str] location: The Azure location where the Analysis Services Server exists. Changing this forces a new resource to be created.
:param pulumi.Input[str] name: Specifies the name of the firewall rule.
:param pulumi.Input[str] querypool_connection_mode: Controls how the read-write server is used in the query pool. If this value is set to `All` then read-write servers are also used for queries. Otherwise with `ReadOnly` these servers do not participate in query operations.
:param pulumi.Input[str] resource_group_name: The name of the Resource Group in which the Analysis Services Server should be exist. Changing this forces a new resource to be created.
:param pulumi.Input[str] server_full_name: The full name of the Analysis Services Server.
:param pulumi.Input[str] sku: SKU for the Analysis Services Server. Possible values are: `D1`, `B1`, `B2`, `S0`, `S1`, `S2`, `S4`, `S8`, `S9`, `S8v2` and `S9v2`.
"""
if admin_users is not None:
pulumi.set(__self__, "admin_users", admin_users)
if backup_blob_container_uri is not None:
pulumi.set(__self__, "backup_blob_container_uri", backup_blob_container_uri)
if enable_power_bi_service is not None:
pulumi.set(__self__, "enable_power_bi_service", enable_power_bi_service)
if ipv4_firewall_rules is not None:
pulumi.set(__self__, "ipv4_firewall_rules", ipv4_firewall_rules)
if location is not None:
pulumi.set(__self__, "location", location)
if name is not None:
pulumi.set(__self__, "name", name)
if querypool_connection_mode is not None:
pulumi.set(__self__, "querypool_connection_mode", querypool_connection_mode)
if resource_group_name is not None:
pulumi.set(__self__, "resource_group_name", resource_group_name)
if server_full_name is not None:
pulumi.set(__self__, "server_full_name", server_full_name)
if sku is not None:
pulumi.set(__self__, "sku", sku)
if tags is not None:
pulumi.set(__self__, "tags", tags)
@property
@pulumi.getter(name="adminUsers")
def admin_users(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
List of email addresses of admin users.
"""
return pulumi.get(self, "admin_users")
@admin_users.setter
def admin_users(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "admin_users", value)
@property
@pulumi.getter(name="backupBlobContainerUri")
def backup_blob_container_uri(self) -> Optional[pulumi.Input[str]]:
"""
URI and SAS token for a blob container to store backups.
"""
return pulumi.get(self, "backup_blob_container_uri")
@backup_blob_container_uri.setter
def backup_blob_container_uri(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "backup_blob_container_uri", value)
@property
@pulumi.getter(name="enablePowerBiService")
def enable_power_bi_service(self) -> Optional[pulumi.Input[bool]]:
"""
Indicates if the Power BI service is allowed to access or not.
"""
return pulumi.get(self, "enable_power_bi_service")
@enable_power_bi_service.setter
def enable_power_bi_service(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "enable_power_bi_service", value)
@property
@pulumi.getter(name="ipv4FirewallRules")
def ipv4_firewall_rules(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['ServerIpv4FirewallRuleArgs']]]]:
"""
One or more `ipv4_firewall_rule` block(s) as defined below.
"""
return pulumi.get(self, "ipv4_firewall_rules")
@ipv4_firewall_rules.setter
def ipv4_firewall_rules(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['ServerIpv4FirewallRuleArgs']]]]):
pulumi.set(self, "ipv4_firewall_rules", value)
@property
@pulumi.getter
def location(self) -> Optional[pulumi.Input[str]]:
"""
The Azure location where the Analysis Services Server exists. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "location")
@location.setter
def location(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "location", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the name of the firewall rule.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="querypoolConnectionMode")
def querypool_connection_mode(self) -> Optional[pulumi.Input[str]]:
"""
Controls how the read-write server is used in the query pool. If this value is set to `All` then read-write servers are also used for queries. Otherwise with `ReadOnly` these servers do not participate in query operations.
"""
return pulumi.get(self, "querypool_connection_mode")
@querypool_connection_mode.setter
def querypool_connection_mode(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "querypool_connection_mode", value)
@property
@pulumi.getter(name="resourceGroupName")
def resource_group_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the Resource Group in which the Analysis Services Server should be exist. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "resource_group_name")
@resource_group_name.setter
def resource_group_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "resource_group_name", value)
@property
@pulumi.getter(name="serverFullName")
def server_full_name(self) -> Optional[pulumi.Input[str]]:
"""
The full name of the Analysis Services Server.
"""
return pulumi.get(self, "server_full_name")
@server_full_name.setter
def server_full_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "server_full_name", value)
@property
@pulumi.getter
def sku(self) -> Optional[pulumi.Input[str]]:
"""
SKU for the Analysis Services Server. Possible values are: `D1`, `B1`, `B2`, `S0`, `S1`, `S2`, `S4`, `S8`, `S9`, `S8v2` and `S9v2`.
"""
return pulumi.get(self, "sku")
@sku.setter
def sku(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "sku", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]:
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]]):
pulumi.set(self, "tags", value)
class Server(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
admin_users: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
backup_blob_container_uri: Optional[pulumi.Input[str]] = None,
enable_power_bi_service: Optional[pulumi.Input[bool]] = None,
ipv4_firewall_rules: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ServerIpv4FirewallRuleArgs']]]]] = None,
location: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
querypool_connection_mode: Optional[pulumi.Input[str]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
sku: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
__props__=None):
"""
Manages an Analysis Services Server.
## Example Usage
```python
import pulumi
import pulumi_azure as azure
rg = azure.core.ResourceGroup("rg", location="West Europe")
server = azure.analysisservices.Server("server",
location="northeurope",
resource_group_name=rg.name,
sku="S0",
admin_users=["myuser@domain.tld"],
enable_power_bi_service=True,
ipv4_firewall_rules=[azure.analysisservices.ServerIpv4FirewallRuleArgs(
name="myRule1",
range_start="210.117.252.0",
range_end="210.117.252.255",
)],
tags={
"abc": "123",
})
```
> **NOTE:** The server resource will automatically be started and stopped during an update if it is in `paused` state.
## Import
Analysis Services Server can be imported using the `resource id`, e.g.
```sh
$ pulumi import azure:analysisservices/server:Server server /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/resourcegroup1/providers/Microsoft.AnalysisServices/servers/server1
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[Sequence[pulumi.Input[str]]] admin_users: List of email addresses of admin users.
:param pulumi.Input[str] backup_blob_container_uri: URI and SAS token for a blob container to store backups.
:param pulumi.Input[bool] enable_power_bi_service: Indicates if the Power BI service is allowed to access or not.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ServerIpv4FirewallRuleArgs']]]] ipv4_firewall_rules: One or more `ipv4_firewall_rule` block(s) as defined below.
:param pulumi.Input[str] location: The Azure location where the Analysis Services Server exists. Changing this forces a new resource to be created.
:param pulumi.Input[str] name: Specifies the name of the firewall rule.
:param pulumi.Input[str] querypool_connection_mode: Controls how the read-write server is used in the query pool. If this value is set to `All` then read-write servers are also used for queries. Otherwise with `ReadOnly` these servers do not participate in query operations.
:param pulumi.Input[str] resource_group_name: The name of the Resource Group in which the Analysis Services Server should be exist. Changing this forces a new resource to be created.
:param pulumi.Input[str] sku: SKU for the Analysis Services Server. Possible values are: `D1`, `B1`, `B2`, `S0`, `S1`, `S2`, `S4`, `S8`, `S9`, `S8v2` and `S9v2`.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: ServerArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Manages an Analysis Services Server.
## Example Usage
```python
import pulumi
import pulumi_azure as azure
rg = azure.core.ResourceGroup("rg", location="West Europe")
server = azure.analysisservices.Server("server",
location="northeurope",
resource_group_name=rg.name,
sku="S0",
admin_users=["myuser@domain.tld"],
enable_power_bi_service=True,
ipv4_firewall_rules=[azure.analysisservices.ServerIpv4FirewallRuleArgs(
name="myRule1",
range_start="210.117.252.0",
range_end="210.117.252.255",
)],
tags={
"abc": "123",
})
```
> **NOTE:** The server resource will automatically be started and stopped during an update if it is in `paused` state.
## Import
Analysis Services Server can be imported using the `resource id`, e.g.
```sh
$ pulumi import azure:analysisservices/server:Server server /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/resourcegroup1/providers/Microsoft.AnalysisServices/servers/server1
```
:param str resource_name: The name of the resource.
:param ServerArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(ServerArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
admin_users: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
backup_blob_container_uri: Optional[pulumi.Input[str]] = None,
enable_power_bi_service: Optional[pulumi.Input[bool]] = None,
ipv4_firewall_rules: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ServerIpv4FirewallRuleArgs']]]]] = None,
location: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
querypool_connection_mode: Optional[pulumi.Input[str]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
sku: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = ServerArgs.__new__(ServerArgs)
__props__.__dict__["admin_users"] = admin_users
__props__.__dict__["backup_blob_container_uri"] = backup_blob_container_uri
__props__.__dict__["enable_power_bi_service"] = enable_power_bi_service
__props__.__dict__["ipv4_firewall_rules"] = ipv4_firewall_rules
__props__.__dict__["location"] = location
__props__.__dict__["name"] = name
__props__.__dict__["querypool_connection_mode"] = querypool_connection_mode
if resource_group_name is None and not opts.urn:
raise TypeError("Missing required property 'resource_group_name'")
__props__.__dict__["resource_group_name"] = resource_group_name
if sku is None and not opts.urn:
raise TypeError("Missing required property 'sku'")
__props__.__dict__["sku"] = sku
__props__.__dict__["tags"] = tags
__props__.__dict__["server_full_name"] = None
super(Server, __self__).__init__(
'azure:analysisservices/server:Server',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
admin_users: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
backup_blob_container_uri: Optional[pulumi.Input[str]] = None,
enable_power_bi_service: Optional[pulumi.Input[bool]] = None,
ipv4_firewall_rules: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ServerIpv4FirewallRuleArgs']]]]] = None,
location: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
querypool_connection_mode: Optional[pulumi.Input[str]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
server_full_name: Optional[pulumi.Input[str]] = None,
sku: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None) -> 'Server':
"""
Get an existing Server resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[Sequence[pulumi.Input[str]]] admin_users: List of email addresses of admin users.
:param pulumi.Input[str] backup_blob_container_uri: URI and SAS token for a blob container to store backups.
:param pulumi.Input[bool] enable_power_bi_service: Indicates if the Power BI service is allowed to access or not.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['ServerIpv4FirewallRuleArgs']]]] ipv4_firewall_rules: One or more `ipv4_firewall_rule` block(s) as defined below.
:param pulumi.Input[str] location: The Azure location where the Analysis Services Server exists. Changing this forces a new resource to be created.
:param pulumi.Input[str] name: Specifies the name of the firewall rule.
:param pulumi.Input[str] querypool_connection_mode: Controls how the read-write server is used in the query pool. If this value is set to `All` then read-write servers are also used for queries. Otherwise with `ReadOnly` these servers do not participate in query operations.
:param pulumi.Input[str] resource_group_name: The name of the Resource Group in which the Analysis Services Server should be exist. Changing this forces a new resource to be created.
:param pulumi.Input[str] server_full_name: The full name of the Analysis Services Server.
:param pulumi.Input[str] sku: SKU for the Analysis Services Server. Possible values are: `D1`, `B1`, `B2`, `S0`, `S1`, `S2`, `S4`, `S8`, `S9`, `S8v2` and `S9v2`.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _ServerState.__new__(_ServerState)
__props__.__dict__["admin_users"] = admin_users
__props__.__dict__["backup_blob_container_uri"] = backup_blob_container_uri
__props__.__dict__["enable_power_bi_service"] = enable_power_bi_service
__props__.__dict__["ipv4_firewall_rules"] = ipv4_firewall_rules
__props__.__dict__["location"] = location
__props__.__dict__["name"] = name
__props__.__dict__["querypool_connection_mode"] = querypool_connection_mode
__props__.__dict__["resource_group_name"] = resource_group_name
__props__.__dict__["server_full_name"] = server_full_name
__props__.__dict__["sku"] = sku
__props__.__dict__["tags"] = tags
return Server(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="adminUsers")
def admin_users(self) -> pulumi.Output[Optional[Sequence[str]]]:
"""
List of email addresses of admin users.
"""
return pulumi.get(self, "admin_users")
@property
@pulumi.getter(name="backupBlobContainerUri")
def backup_blob_container_uri(self) -> pulumi.Output[Optional[str]]:
"""
URI and SAS token for a blob container to store backups.
"""
return pulumi.get(self, "backup_blob_container_uri")
@property
@pulumi.getter(name="enablePowerBiService")
def enable_power_bi_service(self) -> pulumi.Output[Optional[bool]]:
"""
Indicates if the Power BI service is allowed to access or not.
"""
return pulumi.get(self, "enable_power_bi_service")
@property
@pulumi.getter(name="ipv4FirewallRules")
def ipv4_firewall_rules(self) -> pulumi.Output[Optional[Sequence['outputs.ServerIpv4FirewallRule']]]:
"""
One or more `ipv4_firewall_rule` block(s) as defined below.
"""
return pulumi.get(self, "ipv4_firewall_rules")
@property
@pulumi.getter
def location(self) -> pulumi.Output[str]:
"""
The Azure location where the Analysis Services Server exists. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "location")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
Specifies the name of the firewall rule.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter(name="querypoolConnectionMode")
def querypool_connection_mode(self) -> pulumi.Output[str]:
"""
Controls how the read-write server is used in the query pool. If this value is set to `All` then read-write servers are also used for queries. Otherwise with `ReadOnly` these servers do not participate in query operations.
"""
return pulumi.get(self, "querypool_connection_mode")
@property
@pulumi.getter(name="resourceGroupName")
def resource_group_name(self) -> pulumi.Output[str]:
"""
The name of the Resource Group in which the Analysis Services Server should be exist. Changing this forces a new resource to be created.
"""
return pulumi.get(self, "resource_group_name")
@property
@pulumi.getter(name="serverFullName")
def server_full_name(self) -> pulumi.Output[str]:
"""
The full name of the Analysis Services Server.
"""
return pulumi.get(self, "server_full_name")
@property
@pulumi.getter
def sku(self) -> pulumi.Output[str]:
"""
SKU for the Analysis Services Server. Possible values are: `D1`, `B1`, `B2`, `S0`, `S1`, `S2`, `S4`, `S8`, `S9`, `S8v2` and `S9v2`.
"""
return pulumi.get(self, "sku")
@property
@pulumi.getter
def tags(self) -> pulumi.Output[Optional[Mapping[str, str]]]:
return pulumi.get(self, "tags")
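Throughout the resource above, `@pulumi.getter(name=...)` maps each Python snake_case property to the provider's camelCase wire name (e.g. `resource_group_name` to `resourceGroupName`). A minimal sketch of that naming convention (the helper is illustrative, not part of the Pulumi SDK):

```python
def snake_to_camel(name: str) -> str:
    """Convert a snake_case property name to its camelCase wire name,
    mirroring the @pulumi.getter(name=...) mappings above (illustrative only)."""
    head, *rest = name.split("_")
    # keep the first component lowercase, capitalize each following one
    return head + "".join(part.title() for part in rest)
```

Applied to the properties above, this reproduces mappings such as `enable_power_bi_service` to `enablePowerBiService`; single-word names like `sku` pass through unchanged.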
# File: compute.py
# Repo: ZhangMenghao/FloodShield (license: Apache-2.0)

import sys
print(sys.argv[2])
print(float(sys.argv[1]) / float(sys.argv[2]))
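The script divides its first argument by its second with no guard against a zero denominator. A slightly more defensive sketch of the same computation (the `ratio` helper is hypothetical, not in the original script):

```python
def ratio(numerator: str, denominator: str) -> float:
    """Parse two CLI-style string arguments and return numerator/denominator,
    raising ValueError instead of ZeroDivisionError on a zero denominator."""
    denom = float(denominator)
    if denom == 0.0:
        raise ValueError("denominator must be non-zero")
    return float(numerator) / denom
```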
# File: geofree/models/transformers/geogpt.py
# Repo: phygitalism/geometry-free-view-synthesis (license: MIT)

import torch
import time
import torch.nn.functional as F
import pytorch_lightning as pl
from torch.optim.lr_scheduler import LambdaLR
from einops import rearrange
from geofree.main import instantiate_from_config
from geofree.modules.warp.midas import Midas


def disabled_train(self, mode=True):
"""Overwrite model.train with this function to make sure train/eval mode
does not change anymore."""
return self


class GeoTransformer(pl.LightningModule):
    """Provides camera, depth and src as conditioning to the transformer
    and makes sure that camera translation and depth are compatible.

    By setting merge_channels != None, it merges depth and src into a single
    embedding to reduce/keep the overall code length.

    Four stages:
      - the first stage encodes dst;
      - the cond stage encodes src;
      - then a scaled depth is inferred from src, and this depth and the
        camera translation t are normalized to make them consistent;
      - the normalized depth is encoded and the normalized camera
        parameters are embedded.
    """

    def __init__(self,
transformer_config,
first_stage_config,
cond_stage_config,
depth_stage_config,
merge_channels=None,
use_depth=True,
ckpt_path=None,
ignore_keys=[],
first_stage_key="image",
cond_stage_key="depth",
use_scheduler=False,
scheduler_config=None,
monitor="val/loss",
plot_cond_stage=False,
log_det_sample=False,
manipulate_latents=False,
emb_stage_config=None,
emb_stage_key="camera",
emb_stage_trainable=True,
top_k=None
):
super().__init__()
if monitor is not None:
self.monitor = monitor
self.log_det_sample = log_det_sample
self.manipulate_latents = manipulate_latents
self.init_first_stage_from_ckpt(first_stage_config)
self.init_cond_stage_from_ckpt(cond_stage_config)
self.init_depth_stage_from_ckpt(depth_stage_config)
self.transformer = instantiate_from_config(config=transformer_config)
self.merge_channels = merge_channels
if self.merge_channels is not None:
self.merge_conv = torch.nn.Conv2d(self.merge_channels,
self.transformer.config.n_embd,
kernel_size=1,
padding=0,
bias=False)
self.use_depth = use_depth
if not self.use_depth:
assert self.merge_channels is None
self.first_stage_key = first_stage_key
self.cond_stage_key = cond_stage_key
self.use_scheduler = use_scheduler
if use_scheduler:
assert scheduler_config is not None
self.scheduler_config = scheduler_config
self.plot_cond_stage = plot_cond_stage
self.emb_stage_key = emb_stage_key
self.emb_stage_trainable = emb_stage_trainable and emb_stage_config is not None
self.init_emb_stage_from_ckpt(emb_stage_config)
self.top_k = top_k if top_k is not None else 100
if ckpt_path is not None:
self.init_from_ckpt(ckpt_path, ignore_keys=ignore_keys)
self._midas = Midas()
self._midas.eval()
self._midas.train = disabled_train
def init_from_ckpt(self, path, ignore_keys=list()):
sd = torch.load(path, map_location="cpu")["state_dict"]
        for k in list(sd.keys()):
for ik in ignore_keys:
if k.startswith(ik):
self.print("Deleting key {} from state_dict.".format(k))
del sd[k]
missing, unexpected = self.load_state_dict(sd, strict=False)
print(f"Restored from {path} with {len(missing)} missing keys and {len(unexpected)} unexpected keys.")
def init_first_stage_from_ckpt(self, config):
model = instantiate_from_config(config)
self.first_stage_model = model.eval()
self.first_stage_model.train = disabled_train
def init_cond_stage_from_ckpt(self, config):
if config == "__is_first_stage__":
print("Using first stage also as cond stage.")
self.cond_stage_model = self.first_stage_model
else:
model = instantiate_from_config(config)
self.cond_stage_model = model.eval()
self.cond_stage_model.train = disabled_train
def init_depth_stage_from_ckpt(self, config):
model = instantiate_from_config(config)
self.depth_stage_model = model.eval()
self.depth_stage_model.train = disabled_train
def init_emb_stage_from_ckpt(self, config):
if config is None:
self.emb_stage_model = None
else:
model = instantiate_from_config(config)
self.emb_stage_model = model
if not self.emb_stage_trainable:
self.emb_stage_model.eval()
self.emb_stage_model.train = disabled_train
@torch.no_grad()
def encode_to_z(self, x):
quant_z, _, info = self.first_stage_model.encode(x)
indices = info[2].view(quant_z.shape[0], -1)
return quant_z, indices
@torch.no_grad()
def encode_to_c(self, c):
quant_c, _, info = self.cond_stage_model.encode(c)
indices = info[2].view(quant_c.shape[0], -1)
return quant_c, indices
@torch.no_grad()
def encode_to_d(self, x):
quant_z, _, info = self.depth_stage_model.encode(x)
indices = info[2].view(quant_z.shape[0], -1)
return quant_z, indices
def encode_to_e(self, **kwargs):
return self.emb_stage_model(**kwargs)
@torch.no_grad()
def get_layers(self, batch, xce=None):
# interface for layer probing
if xce is None:
x, c, e = self.get_xce(batch)
else:
x, c, e = xce
quant_z, z_indices = self.encode_to_z(**x)
_, _, dc_indices, embeddings = self.get_normalized_c(c, e)
cz_indices = torch.cat((dc_indices, z_indices), dim=1)
trafo_layers = self.transformer(cz_indices[:, :-1],
embeddings=embeddings,
return_layers=True)
layers = [quant_z] + trafo_layers
return layers
def get_normalized_d_indices(self, batch, to_device=False):
# interface for layer probing
_, cdict, edict = self.get_xce(batch)
if to_device:
for k in cdict:
cdict[k] = cdict[k].to(device=self.device)
for k in edict:
edict[k] = edict[k].to(device=self.device)
return self.get_normalized_c(cdict, edict, return_depth_only=True)
def get_normalized_c(self, cdict, edict, return_depth_only=False,
fixed_scale=False, scale=None):
with torch.no_grad():
quant_c, c_indices = self.encode_to_c(**cdict)
if not fixed_scale:
scaled_idepth = self._midas.scaled_depth(cdict["c"],
edict.pop("points"),
return_inverse_depth=True)
else:
                scale = [0.18577382, 0.93059154] if scale is None else scale
                scaled_idepth = self._midas.fixed_scale_depth(cdict["c"],
                                                              return_inverse_depth=True,
                                                              scale=scale)
alpha = scaled_idepth.amax(dim=(1,2))
scaled_idepth = scaled_idepth/alpha[:,None,None]
edict["t"] = edict["t"]*alpha[:,None]
quant_d, d_indices = self.encode_to_d(scaled_idepth[:,None,:,:]*2.0-1.0)
if return_depth_only:
return d_indices, quant_d, scaled_idepth[:,None,:,:]*2.0-1.0
embeddings = self.encode_to_e(**edict)
if self.merge_channels is None:
# concat depth and src indices into 2*h*w conditioning indices
if self.use_depth:
dc_indices = torch.cat((d_indices, c_indices), dim=1)
else:
dc_indices = c_indices
else:
# use empty conditioning indices and compute h*w conditioning
# embeddings
dc_indices = torch.zeros_like(d_indices)[:,[]]
merge = torch.cat((quant_d, quant_c), dim=1)
merge = self.merge_conv(merge)
merge = merge.permute(0,2,3,1) # to b,h,w,c
merge = merge.reshape(merge.shape[0],
merge.shape[1]*merge.shape[2],
merge.shape[3]) # to b,hw,c
embeddings = torch.cat((embeddings,merge), dim=1)
# check that unmasking is correct
total_cond_length = embeddings.shape[1] + dc_indices.shape[1]
assert total_cond_length == self.transformer.config.n_unmasked, (
embeddings.shape[1], dc_indices.shape[1], self.transformer.config.n_unmasked)
return quant_d, quant_c, dc_indices, embeddings
def forward(self, xdict, cdict, edict):
# one step to produce the logits
_, z_indices = self.encode_to_z(**xdict)
_, _, dc_indices, embeddings = self.get_normalized_c(cdict, edict)
cz_indices = torch.cat((dc_indices, z_indices), dim=1)
# target includes all sequence elements (no need to handle first one
# differently because we are conditioning)
target = z_indices
# make the prediction
logits, _ = self.transformer(cz_indices[:, :-1], embeddings=embeddings)
# cut off conditioning outputs - output i corresponds to p(z_i | z_{<i}, c)
logits = logits[:, embeddings.shape[1]+dc_indices.shape[1]-1:]
return logits, target
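A toy illustration of the logits slicing in `forward` above, using sequence lengths only (no real tensors): with total conditioning length `C` (embeddings plus `dc_indices`), output position `C-1` is the first one that predicts a target token, so the logits are cut starting at index `C-1` and exactly `target_len` predictions remain.

```python
# Lengths-only sketch of the conditioning cut-off in `forward` above.
def logits_slice(cond_len, target_len):
    seq_len = cond_len + target_len - 1  # input is cz_indices[:, :-1]
    start = cond_len - 1                 # first position predicting z_0
    return seq_len - start               # number of logits kept

kept = logits_slice(cond_len=4, target_len=3)
```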
def top_k_logits(self, logits, k):
v, ix = torch.topk(logits, k)
out = logits.clone()
out[out < v[..., [-1]]] = -float('Inf')
return out
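A torch-free sketch of the filtering performed by `top_k_logits` above: every logit below the k-th largest value is set to `-inf`, so a subsequent softmax assigns those tokens zero probability. The helper works on a plain list instead of a tensor.

```python
# Pure-Python analogue of top-k logit filtering.
def top_k_filter(logits, k):
    kth = sorted(logits, reverse=True)[k - 1]  # k-th largest value
    return [v if v >= kth else float("-inf") for v in logits]

filtered = top_k_filter([2.0, 0.5, 1.0, -1.0], k=2)
```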
@torch.no_grad()
def sample(self, x, c, steps, temperature=1.0, sample=False, top_k=None,
callback=lambda k: None, embeddings=None, **kwargs):
# in the current variant we always use embeddings for camera
assert embeddings is not None
# check n_unmasked and conditioning length
total_cond_length = embeddings.shape[1] + c.shape[1]
assert total_cond_length == self.transformer.config.n_unmasked, (
embeddings.shape[1], c.shape[1], self.transformer.config.n_unmasked)
x = torch.cat((c,x),dim=1)
block_size = self.transformer.get_block_size()
assert not self.transformer.training
for k in range(steps):
callback(k)
assert x.size(1) <= block_size # make sure model can see conditioning
# do not crop as this messes with n_unmasked
#x_cond = x if x.size(1) <= block_size else x[:, -block_size:] # crop context if needed
x_cond = x
logits, _ = self.transformer(x_cond, embeddings=embeddings)
# pluck the logits at the final step and scale by temperature
logits = logits[:, -1, :] / temperature
# optionally crop probabilities to only the top k options
if top_k is not None:
logits = self.top_k_logits(logits, top_k)
# apply softmax to convert to probabilities
probs = F.softmax(logits, dim=-1)
# sample from the distribution or take the most likely
if sample:
ix = torch.multinomial(probs, num_samples=1)
else:
_, ix = torch.topk(probs, k=1, dim=-1)
# append to the sequence and continue
x = torch.cat((x, ix), dim=1)
# cut off conditioning
x = x[:, c.shape[1]:]
return x
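A minimal sketch of the autoregressive loop inside `sample` above, with greedy (non-sampling) decoding: at each step the model scores the sequence so far and the argmax token is appended. `next_logits` is an invented stand-in for the transformer call.

```python
# Greedy autoregressive decoding sketch; `next_logits` stands in for the model.
def greedy_decode(seq, steps, next_logits):
    for _ in range(steps):
        logits = next_logits(seq)
        seq = seq + [max(range(len(logits)), key=logits.__getitem__)]  # argmax
    return seq

# toy "model" that always favors token (len(seq) % 3)
out = greedy_decode([0], 3, lambda s: [1.0 if i == len(s) % 3 else 0.0 for i in range(3)])
```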
@torch.no_grad()
def decode_to_img(self, index, zshape):
bhwc = (zshape[0],zshape[2],zshape[3],zshape[1])
quant_z = self.first_stage_model.quantize.get_codebook_entry(
index.reshape(-1), shape=bhwc)
x = self.first_stage_model.decode(quant_z)
return x
@torch.no_grad()
def log_images(self,
batch,
temperature=None,
top_k=None,
callback=None,
N=4,
half_sample=True,
sample=True,
det_sample=None,
entropy=False,
**kwargs):
det_sample = det_sample if det_sample is not None else self.log_det_sample
log = dict()
xdict, cdict, edict = self.get_xce(batch, N)
for k in xdict:
xdict[k] = xdict[k].to(device=self.device)
for k in cdict:
cdict[k] = cdict[k].to(device=self.device)
for k in edict:
edict[k] = edict[k].to(device=self.device)
log["inputs"] = xdict["x"]
log["conditioning"] = cdict["c"]
quant_z, z_indices = self.encode_to_z(**xdict)
quant_d, quant_c, dc_indices, embeddings = self.get_normalized_c(cdict,edict)
if half_sample:
# create a "half"" sample
z_start_indices = z_indices[:,:z_indices.shape[1]//2]
index_sample = self.sample(z_start_indices, dc_indices,
steps=z_indices.shape[1]-z_start_indices.shape[1],
temperature=temperature if temperature is not None else 1.0,
sample=True,
top_k=top_k if top_k is not None else self.top_k,
callback=callback if callback is not None else lambda k: None,
embeddings=embeddings)
x_sample = self.decode_to_img(index_sample, quant_z.shape)
log["samples_half"] = x_sample
if sample:
# sample
z_start_indices = z_indices[:, :0]
t1 = time.time()
index_sample = self.sample(z_start_indices, dc_indices,
steps=z_indices.shape[1],
temperature=temperature if temperature is not None else 1.0,
sample=True,
top_k=top_k if top_k is not None else 100,
callback=callback if callback is not None else lambda k: None,
embeddings=embeddings)
if not hasattr(self, "sampling_time"):
self.sampling_time = time.time() - t1
print(f"Full sampling takes about {self.sampling_time:.2f} seconds.")
x_sample_nopix = self.decode_to_img(index_sample, quant_z.shape)
log["samples_nopix"] = x_sample_nopix
if det_sample:
# det sample
z_start_indices = z_indices[:, :0]
index_sample = self.sample(z_start_indices, dc_indices,
steps=z_indices.shape[1],
sample=False,
callback=callback if callback is not None else lambda k: None,
embeddings=embeddings)
x_sample_det = self.decode_to_img(index_sample, quant_z.shape)
log["samples_det"] = x_sample_det
if entropy:
assert sample
H, W = x_sample_nopix.shape[2], x_sample_nopix.shape[3]
# plot entropy and spatial loss, (i) on datapoints and (ii) on sample
# on data first
targets = z_indices
cz_indices = torch.cat((dc_indices, z_indices), dim=1)
logits, _ = self.transformer(cz_indices[:, :-1], embeddings=embeddings)
# cut off conditioning outputs - output i corresponds to p(z_i | z_{<i}, c)
logits = logits[:, embeddings.shape[1] + dc_indices.shape[1] - 1:]
h, w = quant_z.shape[2], quant_z.shape[3]
# to spatial
logits = rearrange(logits, 'b (h w) d -> b d h w', h=h)
targets = rearrange(targets, 'b (h w) -> b h w', h=h)
spatial_loss = F.cross_entropy(logits, targets, reduction="none")#[:, None, ...]
log["spatial_loss_data"] = spatial_loss
probs = F.softmax(logits, dim=1)
entropy = -(probs * torch.log(probs + 1e-10)).sum(1)
entropy_up = F.interpolate(entropy[:,None,...], size=(H, W), mode="bicubic")
log["spatial_entropy_data"] = entropy
log["spatial_entropy_data_upsampled"] = entropy_up[:,0,...]
# now on sample
targets = index_sample
cz_indices = torch.cat((dc_indices, index_sample), dim=1)
logits, _ = self.transformer(cz_indices[:, :-1], embeddings=embeddings)
# cut off conditioning outputs - output i corresponds to p(z_i | z_{<i}, c)
logits = logits[:, embeddings.shape[1] + dc_indices.shape[1] - 1:]
h, w = quant_z.shape[2], quant_z.shape[3]
# to spatial
logits = rearrange(logits, 'b (h w) d -> b d h w', h=h)
targets = rearrange(targets, 'b (h w) -> b h w', h=h)
spatial_loss = F.cross_entropy(logits, targets, reduction="none")#[:, None, ...]
log["spatial_loss_sample"] = spatial_loss
probs = F.softmax(logits, dim=1)
entropy = -(probs * torch.log(probs + 1e-10)).sum(1)
entropy_up = F.interpolate(entropy[:,None,...], size=(H, W), mode="bicubic")
log["spatial_entropy_sample"] = entropy
log["spatial_entropy_sample_upsampled"] = entropy_up[:, 0, ...]
# reconstruction
x_rec = self.decode_to_img(z_indices, quant_z.shape)
log["reconstructions"] = x_rec
if self.plot_cond_stage:
cond_rec = self.cond_stage_model.decode(quant_c)
log["conditioning_rec"] = cond_rec
depth_rec = self.depth_stage_model.decode(quant_d)
log["depth_rec"] = depth_rec
return log
def get_input(self, key, batch, heuristics=True):
x = batch[key]
if heuristics:
if len(x.shape) == 3:
x = x[..., None]
x = x.permute(0, 3, 1, 2).to(memory_format=torch.contiguous_format)
if x.dtype == torch.double:
x = x.float()
return x
def get_xce(self, batch, N=None):
xdict = dict()
for k, v in self.first_stage_key.items():
xdict[k] = self.get_input(v, batch, heuristics=k=="x")[:N]
cdict = dict()
for k, v in self.cond_stage_key.items():
cdict[k] = self.get_input(v, batch, heuristics=k=="c")[:N]
edict = dict()
for k, v in self.emb_stage_key.items():
edict[k] = self.get_input(v, batch, heuristics=False)[:N]
return xdict, cdict, edict
def compute_loss(self, logits, targets, split="train"):
loss = F.cross_entropy(logits.reshape(-1, logits.size(-1)), targets.reshape(-1))
return loss, {f"{split}/loss": loss.detach()}
def shared_step(self, batch, batch_idx):
x, c, e = self.get_xce(batch)
logits, target = self(x, c, e)
return logits, target
def training_step(self, batch, batch_idx):
logits, target = self.shared_step(batch, batch_idx)
loss, log_dict = self.compute_loss(logits, target, split="train")
self.log("train/loss", loss,
prog_bar=True, logger=True, on_step=True, on_epoch=True)
self.log("global_step", self.global_step,
prog_bar=True, logger=True, on_step=True, on_epoch=False)
return loss
def validation_step(self, batch, batch_idx):
logits, target = self.shared_step(batch, batch_idx)
loss, log_dict = self.compute_loss(logits, target, split="val")
self.log("val/loss", loss,
prog_bar=True, logger=True, on_step=False, on_epoch=True)
return log_dict
def configure_optimizers(self):
# separate out all parameters to those that will and won't experience regularizing weight decay
decay = set()
no_decay = set()
whitelist_weight_modules = (torch.nn.Linear, )
blacklist_weight_modules = (torch.nn.LayerNorm, torch.nn.Embedding)
for mn, m in self.transformer.named_modules():
for pn, p in m.named_parameters():
fpn = '%s.%s' % (mn, pn) if mn else pn # full param name
if pn.endswith('bias'):
# all biases will not be decayed
no_decay.add(fpn)
elif pn.endswith('weight') and isinstance(m, whitelist_weight_modules):
# weights of whitelist modules will be weight decayed
decay.add(fpn)
elif pn.endswith('weight') and isinstance(m, blacklist_weight_modules):
# weights of blacklist modules will NOT be weight decayed
no_decay.add(fpn)
# special case the position embedding parameter in the root GPT module as not decayed
no_decay.add('pos_emb')
# validate that we considered every parameter
param_dict = {pn: p for pn, p in self.transformer.named_parameters()}
inter_params = decay & no_decay
union_params = decay | no_decay
assert len(inter_params) == 0, "parameters %s made it into both decay/no_decay sets!" % (str(inter_params), )
assert len(param_dict.keys() - union_params) == 0, "parameters %s were not separated into either decay/no_decay set!" \
% (str(param_dict.keys() - union_params), )
# create the pytorch optimizer object
optim_groups = [
{"params": [param_dict[pn] for pn in sorted(list(decay))], "weight_decay": 0.01},
{"params": [param_dict[pn] for pn in sorted(list(no_decay))], "weight_decay": 0.0},
]
extra_parameters = list()
if self.emb_stage_trainable:
extra_parameters += list(self.emb_stage_model.parameters())
if hasattr(self, "merge_conv"):
extra_parameters += list(self.merge_conv.parameters())
else:
assert self.merge_channels is None
optim_groups.append({"params": extra_parameters, "weight_decay": 0.0})
print(f"Optimizing {len(extra_parameters)} extra parameters.")
optimizer = torch.optim.AdamW(optim_groups, lr=self.learning_rate, betas=(0.9, 0.95))
if self.use_scheduler:
scheduler = instantiate_from_config(self.scheduler_config)
print("Setting up LambdaLR scheduler...")
scheduler = [
{
'scheduler': LambdaLR(optimizer, lr_lambda=scheduler.schedule),
'interval': 'step',
'frequency': 1
}]
return [optimizer], scheduler
return optimizer
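A simplified sketch of the decay/no-decay partition used in `configure_optimizers` above: biases and weights of blacklisted module types (LayerNorm, Embedding) skip weight decay, while weights of whitelisted types (Linear) receive it. Module names and the string-typed module kinds here are invented for illustration.

```python
# Partition parameter names into decay / no-decay sets by suffix and module type.
def partition_params(named, whitelist_types, blacklist_types):
    decay, no_decay = set(), set()
    for name, module_type in named:
        if name.endswith("bias"):
            no_decay.add(name)  # biases are never decayed
        elif name.endswith("weight") and module_type in whitelist_types:
            decay.add(name)
        elif name.endswith("weight") and module_type in blacklist_types:
            no_decay.add(name)
    return decay, no_decay

d, nd = partition_params(
    [("fc.weight", "Linear"), ("fc.bias", "Linear"), ("ln.weight", "LayerNorm")],
    {"Linear"}, {"LayerNorm"})
```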
class WarpGeoTransformer(pl.LightningModule):
"""GeoTransformer that also uses a warper in the transformer."""
def __init__(self,
transformer_config,
first_stage_config,
cond_stage_config,
depth_stage_config,
merge_channels=None,
ckpt_path=None,
ignore_keys=[],
first_stage_key="image",
cond_stage_key="depth",
use_scheduler=False,
scheduler_config=None,
monitor="val/loss",
plot_cond_stage=False,
log_det_sample=False,
manipulate_latents=False,
emb_stage_config=None,
emb_stage_key="camera",
emb_stage_trainable=True,
top_k=None
):
super().__init__()
if monitor is not None:
self.monitor = monitor
self.log_det_sample = log_det_sample
self.manipulate_latents = manipulate_latents
self.init_first_stage_from_ckpt(first_stage_config)
self.init_cond_stage_from_ckpt(cond_stage_config)
self.init_depth_stage_from_ckpt(depth_stage_config)
self.transformer = instantiate_from_config(config=transformer_config)
self.merge_channels = merge_channels
if self.merge_channels is not None:
self.merge_conv = torch.nn.Conv2d(self.merge_channels,
self.transformer.config.n_embd,
kernel_size=1,
padding=0,
bias=False)
self.first_stage_key = first_stage_key
self.cond_stage_key = cond_stage_key
self.use_scheduler = use_scheduler
if use_scheduler:
assert scheduler_config is not None
self.scheduler_config = scheduler_config
self.plot_cond_stage = plot_cond_stage
self.emb_stage_key = emb_stage_key
self.emb_stage_trainable = emb_stage_trainable and emb_stage_config is not None
if self.emb_stage_trainable:
print("### TRAINING EMB STAGE!!!")
self.init_emb_stage_from_ckpt(emb_stage_config)
self.top_k = top_k if top_k is not None else 100
if ckpt_path is not None:
self.init_from_ckpt(ckpt_path, ignore_keys=ignore_keys)
self._midas = Midas()
self._midas.eval()
self._midas.train = disabled_train
self.warpkwargs_keys = {
"x": "src_img",
"points": "src_points",
"R": "R_rel",
"t": "t_rel",
"K_dst": "K",
"K_src_inv": "K_inv",
}
def init_from_ckpt(self, path, ignore_keys=list()):
sd = torch.load(path, map_location="cpu")["state_dict"]
        for k in list(sd.keys()):
for ik in ignore_keys:
if k.startswith(ik):
self.print("Deleting key {} from state_dict.".format(k))
del sd[k]
missing, unexpected = self.load_state_dict(sd, strict=False)
print(f"Restored from {path} with {len(missing)} missing keys and {len(unexpected)} unexpected keys.")
def init_first_stage_from_ckpt(self, config):
model = instantiate_from_config(config)
self.first_stage_model = model.eval()
self.first_stage_model.train = disabled_train
def init_cond_stage_from_ckpt(self, config):
if config == "__is_first_stage__":
print("Using first stage also as cond stage.")
self.cond_stage_model = self.first_stage_model
else:
model = instantiate_from_config(config)
self.cond_stage_model = model.eval()
self.cond_stage_model.train = disabled_train
def init_depth_stage_from_ckpt(self, config):
model = instantiate_from_config(config)
self.depth_stage_model = model.eval()
self.depth_stage_model.train = disabled_train
def init_emb_stage_from_ckpt(self, config):
if config is None:
self.emb_stage_model = None
else:
model = instantiate_from_config(config)
self.emb_stage_model = model
if not self.emb_stage_trainable:
self.emb_stage_model.eval()
self.emb_stage_model.train = disabled_train
@torch.no_grad()
def encode_to_z(self, x):
quant_z, _, info = self.first_stage_model.encode(x)
indices = info[2].view(quant_z.shape[0], -1)
return quant_z, indices
@torch.no_grad()
def encode_to_c(self, c):
quant_c, _, info = self.cond_stage_model.encode(c)
indices = info[2].view(quant_c.shape[0], -1)
return quant_c, indices
@torch.no_grad()
def encode_to_d(self, x):
quant_z, _, info = self.depth_stage_model.encode(x)
indices = info[2].view(quant_z.shape[0], -1)
return quant_z, indices
def encode_to_e(self, **kwargs):
return self.emb_stage_model(**kwargs)
def get_normalized_c(self, cdict, edict):
with torch.no_grad():
quant_c, c_indices = self.encode_to_c(**cdict)
scaled_idepth = self._midas.scaled_depth(cdict["c"],
edict.pop("points"),
return_inverse_depth=True)
alpha = scaled_idepth.amax(dim=(1,2))
scaled_idepth = scaled_idepth/alpha[:,None,None]
edict["t"] = edict["t"]*alpha[:,None]
quant_d, d_indices = self.encode_to_d(scaled_idepth[:,None,:,:]*2.0-1.0)
embeddings = self.encode_to_e(**edict)
if self.merge_channels is None:
# concat depth and src indices into 2*h*w conditioning indices
dc_indices = torch.cat((d_indices, c_indices), dim=1)
else:
# use empty conditioning indices and compute h*w conditioning
# embeddings
dc_indices = torch.zeros_like(d_indices)[:,[]]
merge = torch.cat((quant_d, quant_c), dim=1)
merge = self.merge_conv(merge)
merge = merge.permute(0,2,3,1) # to b,h,w,c
merge = merge.reshape(merge.shape[0],
merge.shape[1]*merge.shape[2],
merge.shape[3]) # to b,hw,c
embeddings = torch.cat((embeddings,merge), dim=1)
# check that unmasking is correct
total_cond_length = embeddings.shape[1] + dc_indices.shape[1]
assert total_cond_length == self.transformer.config.n_unmasked, (
embeddings.shape[1], dc_indices.shape[1], self.transformer.config.n_unmasked)
return quant_d, quant_c, dc_indices, embeddings
def forward(self, xdict, cdict, edict, warpkwargs):
# one step to produce the logits
_, z_indices = self.encode_to_z(**xdict)
_, _, dc_indices, embeddings = self.get_normalized_c(cdict, edict)
cz_indices = torch.cat((dc_indices, z_indices), dim=1)
# target includes all sequence elements (no need to handle first one
# differently because we are conditioning)
target = z_indices
# make the prediction
logits, _ = self.transformer(cz_indices[:, :-1], embeddings=embeddings,
warpkwargs=warpkwargs)
# cut off conditioning outputs - output i corresponds to p(z_i | z_{<i}, c)
logits = logits[:, embeddings.shape[1]+dc_indices.shape[1]-1:]
return logits, target
def top_k_logits(self, logits, k):
v, ix = torch.topk(logits, k)
out = logits.clone()
out[out < v[..., [-1]]] = -float('Inf')
return out
@torch.no_grad()
def sample(self, x, c, steps, temperature=1.0, sample=False, top_k=None,
callback=lambda k: None, embeddings=None, warpkwargs=None, **kwargs):
# in the current variant we always use embeddings for camera
assert embeddings is not None
# check n_unmasked and conditioning length
total_cond_length = embeddings.shape[1] + c.shape[1]
assert total_cond_length == self.transformer.config.n_unmasked, (
embeddings.shape[1], c.shape[1], self.transformer.config.n_unmasked)
x = torch.cat((c,x),dim=1)
block_size = self.transformer.get_block_size()
assert not self.transformer.training
for k in range(steps):
callback(k)
assert x.size(1) <= block_size # make sure model can see conditioning
x_cond = x
logits, _ = self.transformer(x_cond, embeddings=embeddings,
warpkwargs=warpkwargs)
# for the next steps, reuse precomputed embeddings for conditioning
self.transformer.warper.set_cache(True)
# pluck the logits at the final step and scale by temperature
logits = logits[:, -1, :] / temperature
# optionally crop probabilities to only the top k options
if top_k is not None:
logits = self.top_k_logits(logits, top_k)
# apply softmax to convert to probabilities
probs = F.softmax(logits, dim=-1)
# sample from the distribution or take the most likely
if sample:
ix = torch.multinomial(probs, num_samples=1)
else:
_, ix = torch.topk(probs, k=1, dim=-1)
# append to the sequence and continue
x = torch.cat((x, ix), dim=1)
# disable caching again
self.transformer.warper.set_cache(False)
# cut off conditioning
x = x[:, c.shape[1]:]
return x
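A toy sketch of the cache toggle pattern used in this `sample` method, where `set_cache(True)` after the first transformer call lets every later step reuse the conditioning work. The `Warper` class here is an invented stand-in, not the real geofree warper.

```python
# Invented stand-in demonstrating compute-once-then-reuse via a cache flag.
class Warper:
    def __init__(self):
        self.use_cache = False
        self._cached = None
        self.compute_calls = 0

    def set_cache(self, flag):
        self.use_cache = flag

    def warp(self, value):
        if self.use_cache and self._cached is not None:
            return self._cached      # cheap: reuse cached result
        self.compute_calls += 1      # expensive path runs only once
        self._cached = value * 2
        return self._cached

w = Warper()
first = w.warp(3)    # computes
w.set_cache(True)
second = w.warp(3)   # reuses the cache
w.set_cache(False)   # disable again, as sample() does before returning
```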
@torch.no_grad()
def decode_to_img(self, index, zshape):
bhwc = (zshape[0],zshape[2],zshape[3],zshape[1])
quant_z = self.first_stage_model.quantize.get_codebook_entry(
index.reshape(-1), shape=bhwc)
x = self.first_stage_model.decode(quant_z)
return x
@torch.no_grad()
def log_images(self,
batch,
temperature=None,
top_k=None,
callback=None,
N=4,
half_sample=True,
sample=True,
det_sample=None,
**kwargs):
det_sample = det_sample if det_sample is not None else self.log_det_sample
log = dict()
xdict, cdict, edict = self.get_xce(batch, N)
for k in xdict:
xdict[k] = xdict[k].to(device=self.device)
for k in cdict:
cdict[k] = cdict[k].to(device=self.device)
for k in edict:
edict[k] = edict[k].to(device=self.device)
warpkwargs = self.get_warpkwargs(batch, N)
for k in warpkwargs:
warpkwargs[k] = warpkwargs[k].to(device=self.device)
log["inputs"] = xdict["x"]
log["conditioning"] = cdict["c"]
quant_z, z_indices = self.encode_to_z(**xdict)
quant_d, quant_c, dc_indices, embeddings = self.get_normalized_c(cdict,
edict)
if half_sample:
# create a "half"" sample
z_start_indices = z_indices[:,:z_indices.shape[1]//2]
index_sample = self.sample(z_start_indices, dc_indices,
steps=z_indices.shape[1]-z_start_indices.shape[1],
temperature=temperature if temperature is not None else 1.0,
sample=True,
top_k=top_k if top_k is not None else self.top_k,
callback=callback if callback is not None else lambda k: None,
embeddings=embeddings,
warpkwargs=warpkwargs)
x_sample = self.decode_to_img(index_sample, quant_z.shape)
log["samples_half"] = x_sample
if sample:
# sample
z_start_indices = z_indices[:, :0]
t1 = time.time()
index_sample = self.sample(z_start_indices, dc_indices,
steps=z_indices.shape[1],
temperature=temperature if temperature is not None else 1.0,
sample=True,
top_k=top_k if top_k is not None else 100,
callback=callback if callback is not None else lambda k: None,
embeddings=embeddings,
warpkwargs=warpkwargs)
if not hasattr(self, "sampling_time"):
self.sampling_time = time.time() - t1
print(f"Full sampling takes about {self.sampling_time:.2f} seconds.")
x_sample_nopix = self.decode_to_img(index_sample, quant_z.shape)
log["samples_nopix"] = x_sample_nopix
if det_sample:
# det sample
z_start_indices = z_indices[:, :0]
index_sample = self.sample(z_start_indices, dc_indices,
steps=z_indices.shape[1],
sample=False,
callback=callback if callback is not None else lambda k: None,
embeddings=embeddings,
warpkwargs=warpkwargs)
x_sample_det = self.decode_to_img(index_sample, quant_z.shape)
log["samples_det"] = x_sample_det
# reconstruction
x_rec = self.decode_to_img(z_indices, quant_z.shape)
log["reconstructions"] = x_rec
if self.plot_cond_stage:
cond_rec = self.cond_stage_model.decode(quant_c)
log["conditioning_rec"] = cond_rec
depth_rec = self.depth_stage_model.decode(quant_d)
log["depth_rec"] = depth_rec
return log
def get_input(self, key, batch, heuristics=True):
x = batch[key]
if heuristics:
if len(x.shape) == 3:
x = x[..., None]
x = x.permute(0, 3, 1, 2).to(memory_format=torch.contiguous_format)
if x.dtype == torch.double:
x = x.float()
return x
def get_xce(self, batch, N=None):
xdict = dict()
for k, v in self.first_stage_key.items():
xdict[k] = self.get_input(v, batch, heuristics=k=="x")[:N]
cdict = dict()
for k, v in self.cond_stage_key.items():
cdict[k] = self.get_input(v, batch, heuristics=k=="c")[:N]
edict = dict()
for k, v in self.emb_stage_key.items():
edict[k] = self.get_input(v, batch, heuristics=False)[:N]
return xdict, cdict, edict
def get_warpkwargs(self, batch, N=None):
kwargs = dict()
for k, v in self.warpkwargs_keys.items():
kwargs[k] = self.get_input(v, batch, heuristics=k=="x")[:N]
return kwargs
def compute_loss(self, logits, targets, split="train"):
loss = F.cross_entropy(logits.reshape(-1, logits.size(-1)), targets.reshape(-1))
return loss, {f"{split}/loss": loss.detach()}
def shared_step(self, batch, batch_idx):
x, c, e = self.get_xce(batch)
warpkwargs = self.get_warpkwargs(batch)
logits, target = self(x, c, e, warpkwargs)
return logits, target
def training_step(self, batch, batch_idx):
logits, target = self.shared_step(batch, batch_idx)
loss, log_dict = self.compute_loss(logits, target, split="train")
self.log("train/loss", loss,
prog_bar=True, logger=True, on_step=True, on_epoch=True)
self.log("global_step", self.global_step,
prog_bar=True, logger=True, on_step=True, on_epoch=False)
return loss
def validation_step(self, batch, batch_idx):
logits, target = self.shared_step(batch, batch_idx)
loss, log_dict = self.compute_loss(logits, target, split="val")
self.log("val/loss", loss,
prog_bar=True, logger=True, on_step=False, on_epoch=True)
return log_dict
def configure_optimizers(self):
# separate out all parameters to those that will and won't experience regularizing weight decay
decay = set()
no_decay = set()
whitelist_weight_modules = (torch.nn.Linear, torch.nn.Conv2d)
blacklist_weight_modules = (torch.nn.LayerNorm, torch.nn.Embedding)
for mn, m in self.transformer.named_modules():
for pn, p in m.named_parameters():
fpn = '%s.%s' % (mn, pn) if mn else pn # full param name
if fpn.startswith("warper._midas"):
continue
if pn.endswith('bias'):
# all biases will not be decayed
no_decay.add(fpn)
elif pn.endswith('weight') and isinstance(m, whitelist_weight_modules):
# weights of whitelist modules will be weight decayed
decay.add(fpn)
elif pn.endswith('weight') and isinstance(m, blacklist_weight_modules):
# weights of blacklist modules will NOT be weight decayed
no_decay.add(fpn)
# special case the position embedding parameter in the root GPT module as not decayed
no_decay.add('pos_emb')
if hasattr(self.transformer.warper, "pos_emb"):
no_decay.add('warper.pos_emb')
# handle meta positional embeddings
for pn, p in self.transformer.warper.named_parameters():
if pn.endswith("pos_emb"):
no_decay.add(f"warper.{pn}")
# validate that we considered every parameter
param_dict = {pn: p for pn, p in self.transformer.named_parameters() if not pn.startswith("warper._midas")}
inter_params = decay & no_decay
union_params = decay | no_decay
assert len(inter_params) == 0, "parameters %s made it into both decay/no_decay sets!" % (str(inter_params), )
assert len(param_dict.keys() - union_params) == 0, "parameters %s were not separated into either decay/no_decay set!" \
% (str(param_dict.keys() - union_params), )
# create the pytorch optimizer object
optim_groups = [
{"params": [param_dict[pn] for pn in sorted(list(decay))], "weight_decay": 0.01},
{"params": [param_dict[pn] for pn in sorted(list(no_decay))], "weight_decay": 0.0},
]
extra_parameters = list()
if self.emb_stage_trainable:
extra_parameters += list(self.emb_stage_model.parameters())
if hasattr(self, "merge_conv"):
extra_parameters += list(self.merge_conv.parameters())
else:
assert self.merge_channels is None
optim_groups.append({"params": extra_parameters, "weight_decay": 0.0})
print(f"Optimizing {len(extra_parameters)} extra parameters.")
optimizer = torch.optim.AdamW(optim_groups, lr=self.learning_rate, betas=(0.9, 0.95))
if self.use_scheduler:
scheduler = instantiate_from_config(self.scheduler_config)
print("Setting up LambdaLR scheduler...")
scheduler = [
{
'scheduler': LambdaLR(optimizer, lr_lambda=scheduler.schedule),
'interval': 'step',
'frequency': 1
}]
return [optimizer], scheduler
return optimizer
# File: LLVMHelper.py (sunbohong/lldb_tool, MIT license)

import re

import lldb


def llvm_raw_ostream_enable(debugger, command, result, a, internal_dict):
    interpreter = debugger.GetCommandInterpreter()
    returnObject = lldb.SBCommandReturnObject()
    interpreter.HandleCommand('br enable llvm_raw_ostream_enable' + command,
                              returnObject)
    output = returnObject.GetOutput()
    # The "br enable" output starts with the number of breakpoints affected;
    # if any already exist, just re-enable them and stop here.
    match = re.match(r'\d+', output, re.M)
    if match and int(match.group(0)) > 0:
        print(output)
        return

    # Set a named breakpoint on every llvm::raw_ostream::operator<< overload
    # so all output written through the stream can be observed.
    signatures = [
        "llvm::raw_ostream::operator<<(llvm::raw_ostream::Colors)",
        "llvm::raw_ostream::operator<<(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> > const&)",
        "llvm::raw_ostream::operator<<(char)",
        "llvm::raw_ostream::operator<<(llvm::StringRef)",
        "llvm::raw_ostream::operator<<(unsigned char)",
        "llvm::raw_ostream::operator<<(llvm::SmallVectorImpl<char> const&)",
        "llvm::raw_ostream::operator<<(unsigned long)",
        "llvm::raw_ostream::operator<<(long)",
        "llvm::raw_ostream::operator<<(unsigned long long)",
        "llvm::raw_ostream::operator<<(long long)",
        "llvm::raw_ostream::operator<<(llvm::format_object_base const&)",
        "llvm::raw_ostream::operator<<(void const*)",
        "llvm::raw_ostream::operator<<(double)",
        "llvm::raw_ostream::operator<<(llvm::formatv_object_base const&)",
        "llvm::raw_ostream::operator<<(llvm::FormattedString const&)",
        "llvm::raw_ostream::operator<<(llvm::FormattedNumber const&)",
        "llvm::raw_ostream::operator<<(llvm::FormattedBytes const&)",
        "llvm::raw_ostream::operator<<(signed char)",
    ]
    for signature in signatures:
        interpreter.HandleCommand(
            'br set -F "%s" -C "v " -N "llvm_raw_ostream_enable"' % signature
            + command, returnObject)
    print("%d breakpoints enabled." % len(signatures))


def llvm_raw_ostream_disable(debugger, command, result, a, internal_dict):
    interpreter = debugger.GetCommandInterpreter()
    returnObject = lldb.SBCommandReturnObject()
    interpreter.HandleCommand('br disable llvm_raw_ostream_enable' + command,
                              returnObject)
    print(returnObject.GetOutput())


def __lldb_init_module(debugger, internal_dict):
    debugger.HandleCommand(
        'command script add llvm_raw_ostream_enable -f LLVMHelper.llvm_raw_ostream_enable')
    debugger.HandleCommand(
        'command script add llvm_raw_ostream_disable -f LLVMHelper.llvm_raw_ostream_disable')
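The early-return check in `llvm_raw_ostream_enable` hinges on parsing the leading breakpoint count out of lldb's command output. That parsing can be exercised without lldb; this is a standalone sketch of the same guard (the sample output strings are illustrative, not captured from lldb):

```python
import re


def count_enabled_breakpoints(output):
    # lldb's "br enable" output leads with the number of breakpoints
    # affected, e.g. "18 breakpoints enabled." — anchor a digits-only match
    # at the start and treat "no leading number" as zero.
    match = re.match(r'\d+', output)
    return int(match.group(0)) if match else 0
```

Using `r'\d+'` with a `match`-is-`None` guard avoids the crash that `int(re.match(r'\d*', output).group(0))` would hit on non-numeric output, since `\d*` happily matches the empty string.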
# File: libs/src/test/python/dlpx/virtualization/test_libs.py
# (vr-delphix/virtualization-sdk, Apache-2.0 license)

#
# Copyright (c) 2019, 2020 by Delphix. All rights reserved.
#
import mock
import pytest
from dlpx.virtualization.api import libs_pb2
from dlpx.virtualization import libs
from dlpx.virtualization.libs.exceptions import (
    IncorrectArgumentTypeError, LibraryError, PluginScriptError)
class TestLibsRunBash:
    @staticmethod
    def test_run_bash(remote_connection):
        expected_run_bash_response = libs_pb2.RunBashResponse()
        expected_run_bash_response.return_value.exit_code = 0
        expected_run_bash_response.return_value.stdout = 'stdout'
        expected_run_bash_response.return_value.stderr = 'stderr'

        expected_command = 'command'
        expected_variables = None
        expected_use_login_shell = False

        def mock_run_bash(actual_run_bash_request):
            assert actual_run_bash_request.command == expected_command
            assert (actual_run_bash_request.use_login_shell ==
                    expected_use_login_shell)
            actual_environment = (
                actual_run_bash_request.remote_connection.environment)
            assert (actual_environment.name ==
                    remote_connection.environment.name)
            assert (actual_environment.reference ==
                    remote_connection.environment.reference)
            return expected_run_bash_response

        with mock.patch('dlpx.virtualization._engine.libs.run_bash',
                        side_effect=mock_run_bash, create=True):
            actual_run_bash_result = libs.run_bash(
                remote_connection,
                expected_command,
                expected_variables,
                expected_use_login_shell)

        expected = expected_run_bash_response.return_value
        assert actual_run_bash_result.exit_code == expected.exit_code
        assert actual_run_bash_result.stdout == expected.stdout
        assert actual_run_bash_result.stderr == expected.stderr

    @staticmethod
    def test_run_bash_check_true_success_exitcode(remote_connection):
        expected_run_bash_response = libs_pb2.RunBashResponse()
        expected_run_bash_response.return_value.exit_code = 0
        expected_run_bash_response.return_value.stdout = "stdout"
        expected_run_bash_response.return_value.stderr = "stderr"

        expected_command = "command"
        expected_variables = None
        expected_use_login_shell = False

        def mock_run_bash(actual_run_bash_request):
            assert actual_run_bash_request.command == expected_command
            assert actual_run_bash_request.use_login_shell == expected_use_login_shell
            assert (
                actual_run_bash_request.remote_connection.environment.name
                == remote_connection.environment.name
            )
            assert (
                actual_run_bash_request.remote_connection.environment.reference
                == remote_connection.environment.reference
            )
            return expected_run_bash_response

        with mock.patch("dlpx.virtualization._engine.libs.run_bash",
                        side_effect=mock_run_bash, create=True):
            actual_run_bash_result = libs.run_bash(remote_connection,
                                                   expected_command,
                                                   expected_variables,
                                                   expected_use_login_shell,
                                                   check=True)

        assert actual_run_bash_result.exit_code == expected_run_bash_response.return_value.exit_code
        assert actual_run_bash_result.stdout == expected_run_bash_response.return_value.stdout
        assert actual_run_bash_result.stderr == expected_run_bash_response.return_value.stderr

    @staticmethod
    def test_run_bash_with_check_true_failed_exitcode(remote_connection):
        expected_message = (
            'The script failed with exit code 1.'
            ' stdout : stdout and stderr : stderr'
        )

        response = libs_pb2.RunBashResponse()
        response.return_value.exit_code = 1
        response.return_value.stdout = "stdout"
        response.return_value.stderr = "stderr"

        with mock.patch("dlpx.virtualization._engine.libs.run_bash",
                        return_value=response, create=True):
            with pytest.raises(PluginScriptError) as info:
                libs.run_bash(remote_connection, "test_command", check=True)

        assert info.value.message == expected_message

    @staticmethod
    def test_run_bash_with_actionable_error(remote_connection):
        expected_id = 15
        expected_message = 'Some message'

        response = libs_pb2.RunBashResponse()
        response.error.actionable_error.id = expected_id
        response.error.actionable_error.message = expected_message

        with mock.patch('dlpx.virtualization._engine.libs.run_bash',
                        return_value=response, create=True):
            with pytest.raises(LibraryError) as err_info:
                libs.run_bash(remote_connection, 'command')

        assert err_info.value._id == expected_id
        assert err_info.value.message == expected_message

    @staticmethod
    def test_run_bash_with_nonactionable_error(remote_connection):
        response = libs_pb2.RunBashResponse()
        na_error = libs_pb2.NonActionableLibraryError()
        response.error.non_actionable_error.CopyFrom(na_error)

        with mock.patch('dlpx.virtualization._engine.libs.run_bash',
                        return_value=response, create=True):
            with pytest.raises(SystemExit):
                libs.run_bash(remote_connection, 'command')

    @staticmethod
    def test_run_bash_bad_remote_connection():
        # Set the connection to be a string instead of a RemoteConnection.
        connection = 'BadRemoteConnection'
        command = 'command'
        variables = None
        use_login_shell = False

        with pytest.raises(IncorrectArgumentTypeError) as err_info:
            libs.run_bash(connection, command, variables, use_login_shell)

        assert err_info.value.message == (
            "The function run_bash's argument 'remote_connection' was"
            " type 'str' but should be of"
            " class 'dlpx.virtualization.common._common_classes.RemoteConnection'.")

    @staticmethod
    def test_run_bash_bad_command(remote_connection):
        # Set the command to be an int instead of a string.
        command = 10
        variables = None
        use_login_shell = False

        with pytest.raises(IncorrectArgumentTypeError) as err_info:
            libs.run_bash(remote_connection, command, variables, use_login_shell)

        assert err_info.value.message == (
            "The function run_bash's argument 'command' was"
            " type 'int' but should be of type 'basestring'.")

    @staticmethod
    def test_run_bash_variables_not_dict(remote_connection):
        command = 'command'
        # Set the variables to be a string instead of a dict.
        variables = 'not a dict'
        use_login_shell = False

        with pytest.raises(IncorrectArgumentTypeError) as err_info:
            libs.run_bash(remote_connection, command, variables, use_login_shell)

        assert err_info.value.message == (
            "The function run_bash's argument 'variables' was"
            " type 'str' but should be of"
            " type 'dict of basestring:basestring' if defined.")

    @staticmethod
    def test_run_bash_bad_variables(remote_connection):
        command = 'command'
        #
        # Set a value inside the variables dict to be an int instead of a
        # string.
        #
        variables = {'test0': 'yes', 'test1': 10}
        use_login_shell = False

        with pytest.raises(IncorrectArgumentTypeError) as err_info:
            libs.run_bash(remote_connection, command, variables, use_login_shell)

        message = ("The function run_bash's argument 'variables' was"
                   " a dict of {{type 'str':type '{}', type 'str':type '{}'}}"
                   " but should be of"
                   " type 'dict of basestring:basestring' if defined.")
        assert (err_info.value.message == message.format('int', 'str') or
                err_info.value.message == message.format('str', 'int'))

    @staticmethod
    def test_run_bash_bad_use_login_shell(remote_connection):
        command = 'command'
        variables = None
        # Set use_login_shell to be a string instead of a bool.
        use_login_shell = 'False'

        with pytest.raises(IncorrectArgumentTypeError) as err_info:
            libs.run_bash(remote_connection, command, variables, use_login_shell)

        assert err_info.value.message == (
            "The function run_bash's argument 'use_login_shell' was"
            " type 'str' but should be of type 'bool' if defined.")
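Every test in this suite leans on the same pattern: `mock.patch(..., create=True)` installs a function that does not yet exist on the target module, and a `side_effect` callable validates the request before returning a canned response. A stripped-down, stdlib-only version of that pattern (the module and function names here are hypothetical stand-ins, not part of the SDK):

```python
import sys
import types
from unittest import mock

# Hypothetical stand-in for a module (like dlpx.virtualization._engine) whose
# attribute only exists at runtime; create=True lets mock.patch add it.
fake_engine = types.ModuleType("fake_engine")
sys.modules["fake_engine"] = fake_engine


def mock_run(request):
    # A side_effect can assert on the incoming request, then return the
    # canned value the caller should see.
    assert request == "command"
    return {"exit_code": 0, "stdout": "stdout"}


with mock.patch("fake_engine.run", side_effect=mock_run, create=True):
    result = sys.modules["fake_engine"].run("command")
```

Without `create=True`, `mock.patch` would refuse to patch an attribute the target module does not define, which is exactly the situation these tests are in: the engine injects `run_bash` and friends only inside a live Delphix engine.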
class TestLibsRunSync:
    @staticmethod
    def test_run_sync(remote_connection):
        expected_run_sync_response = libs_pb2.RunSyncResponse()

        expected_source_directory = 'sourceDirectory'
        expected_rsync_user = 'rsyncUser'
        expected_exclude_paths = ['/path1', '/path2']
        expected_sym_links_to_follow = ['/path3', '/path4']

        def mock_run_sync(actual_run_sync_request):
            assert (actual_run_sync_request.source_directory
                    == expected_source_directory)
            actual_environment = (
                actual_run_sync_request.remote_connection.environment)
            assert (actual_environment.name ==
                    remote_connection.environment.name)
            assert (actual_environment.reference ==
                    remote_connection.environment.reference)
            assert actual_run_sync_request.rsync_user == expected_rsync_user
            assert (actual_run_sync_request.exclude_paths ==
                    expected_exclude_paths)
            assert (actual_run_sync_request.sym_links_to_follow ==
                    expected_sym_links_to_follow)
            return expected_run_sync_response

        with mock.patch('dlpx.virtualization._engine.libs.run_sync',
                        side_effect=mock_run_sync, create=True):
            actual_runsync_response = libs.run_sync(
                remote_connection,
                expected_source_directory,
                expected_rsync_user,
                expected_exclude_paths,
                expected_sym_links_to_follow)

        assert actual_runsync_response is None

    @staticmethod
    def test_run_sync_with_actionable_error(remote_connection):
        expected_id = 15
        expected_message = 'Some message'

        response = libs_pb2.RunSyncResponse()
        response.error.actionable_error.id = expected_id
        response.error.actionable_error.message = expected_message

        with mock.patch('dlpx.virtualization._engine.libs.run_sync',
                        return_value=response, create=True):
            with pytest.raises(LibraryError) as err_info:
                libs.run_sync(remote_connection, 'dir')

        assert err_info.value._id == expected_id
        assert err_info.value.message == expected_message

    @staticmethod
    def test_run_sync_with_nonactionable_error(remote_connection):
        response = libs_pb2.RunSyncResponse()
        na_error = libs_pb2.NonActionableLibraryError()
        response.error.non_actionable_error.CopyFrom(na_error)

        with mock.patch('dlpx.virtualization._engine.libs.run_sync',
                        return_value=response, create=True):
            with pytest.raises(SystemExit):
                libs.run_sync(remote_connection, "dir")

    @staticmethod
    def test_run_sync_bad_remote_connection():
        # Set the connection to be a string instead of a RemoteConnection.
        connection = 'BadRemoteConnection'
        source_directory = 'sourceDirectory'
        rsync_user = 'rsyncUser'
        exclude_paths = ['/path1', '/path2']
        sym_links_to_follow = ['/path3', '/path4']

        with pytest.raises(IncorrectArgumentTypeError) as err_info:
            libs.run_sync(
                connection,
                source_directory,
                rsync_user,
                exclude_paths,
                sym_links_to_follow)

        assert err_info.value.message == (
            "The function run_sync's argument 'remote_connection' was"
            " type 'str' but should be of"
            " class 'dlpx.virtualization.common._common_classes.RemoteConnection'.")

    @staticmethod
    def test_run_sync_bad_source_directory(remote_connection):
        # Set the source_directory to be an int instead of a string.
        source_directory = 10
        rsync_user = 'rsyncUser'
        exclude_paths = ['/path1', '/path2']
        sym_links_to_follow = ['/path3', '/path4']

        with pytest.raises(IncorrectArgumentTypeError) as err_info:
            libs.run_sync(
                remote_connection,
                source_directory,
                rsync_user,
                exclude_paths,
                sym_links_to_follow)

        assert err_info.value.message == (
            "The function run_sync's argument 'source_directory' was"
            " type 'int' but should be of type 'basestring'.")

    @staticmethod
    def test_run_sync_bad_rsync_user(remote_connection):
        source_directory = 'sourceDirectory'
        # Set the rsync_user to be an int instead of a string.
        rsync_user = 10
        exclude_paths = ['/path1', '/path2']
        sym_links_to_follow = ['/path3', '/path4']

        with pytest.raises(IncorrectArgumentTypeError) as err_info:
            libs.run_sync(
                remote_connection,
                source_directory,
                rsync_user,
                exclude_paths,
                sym_links_to_follow)

        assert err_info.value.message == (
            "The function run_sync's argument 'rsync_user' was"
            " type 'int' but should be of type 'basestring' if defined.")

    @staticmethod
    def test_run_sync_exclude_paths_not_list(remote_connection):
        source_directory = 'sourceDirectory'
        rsync_user = 'rsyncUser'
        # Set exclude_paths to be a string instead of a list.
        exclude_paths = '/path'
        sym_links_to_follow = ['/path3', '/path4']

        with pytest.raises(IncorrectArgumentTypeError) as err_info:
            libs.run_sync(
                remote_connection,
                source_directory,
                rsync_user,
                exclude_paths,
                sym_links_to_follow)

        assert err_info.value.message == (
            "The function run_sync's argument 'exclude_paths' was"
            " type 'str' but should be of"
            " type 'list of basestring' if defined.")

    @staticmethod
    def test_run_sync_bad_exclude_paths(remote_connection):
        source_directory = 'sourceDirectory'
        rsync_user = 'rsyncUser'
        # Set an element of the exclude_paths list to be an int instead of a
        # string.
        exclude_paths = ['/path1', 10]
        sym_links_to_follow = ['/path3', '/path4']

        with pytest.raises(IncorrectArgumentTypeError) as err_info:
            libs.run_sync(
                remote_connection,
                source_directory,
                rsync_user,
                exclude_paths,
                sym_links_to_follow)

        assert err_info.value.message == (
            "The function run_sync's argument 'exclude_paths' was a list of"
            " [type 'str', type 'int'] but should be of"
            " type 'list of basestring' if defined.")

    @staticmethod
    def test_run_sync_sym_links_to_follow_not_list(remote_connection):
        source_directory = 'sourceDirectory'
        rsync_user = 'rsyncUser'
        exclude_paths = ['/path1', '/path2']
        # Set sym_links_to_follow to be a string instead of a list.
        sym_links_to_follow = '/path'

        with pytest.raises(IncorrectArgumentTypeError) as err_info:
            libs.run_sync(
                remote_connection,
                source_directory,
                rsync_user,
                exclude_paths,
                sym_links_to_follow)

        assert err_info.value.message == (
            "The function run_sync's argument 'sym_links_to_follow' was"
            " type 'str' but should be of"
            " type 'list of basestring' if defined.")

    @staticmethod
    def test_run_sync_bad_sym_links_to_follow(remote_connection):
        source_directory = 'sourceDirectory'
        rsync_user = 'rsyncUser'
        exclude_paths = ['/path1', '/path2']
        # Set an element of the sym_links_to_follow list to be an int instead
        # of a string.
        sym_links_to_follow = ['/path3', 10]

        with pytest.raises(IncorrectArgumentTypeError) as err_info:
            libs.run_sync(
                remote_connection,
                source_directory,
                rsync_user,
                exclude_paths,
                sym_links_to_follow)

        assert err_info.value.message == (
            "The function run_sync's argument 'sym_links_to_follow' was"
            " a list of [type 'str', type 'int'] but should be of"
            " type 'list of basestring' if defined.")
class TestLibsRunPowershell:
    @staticmethod
    def test_run_powershell(remote_connection):
        expected_run_powershell_response = libs_pb2.RunPowerShellResponse()
        expected_run_powershell_response.return_value.exit_code = 0
        expected_run_powershell_response.return_value.stdout = 'stdout'
        expected_run_powershell_response.return_value.stderr = 'stderr'

        expected_command = 'command'
        expected_variables = None

        def mock_run_powershell(actual_run_powershell_request):
            assert actual_run_powershell_request.command == expected_command
            actual_environment = (
                actual_run_powershell_request.remote_connection.environment)
            assert (actual_environment.name ==
                    remote_connection.environment.name)
            assert (actual_environment.reference ==
                    remote_connection.environment.reference)
            return expected_run_powershell_response

        with mock.patch('dlpx.virtualization._engine.libs.run_powershell',
                        side_effect=mock_run_powershell, create=True):
            actual_run_powershell_result = libs.run_powershell(
                remote_connection,
                expected_command,
                expected_variables)

        expected = expected_run_powershell_response.return_value
        assert actual_run_powershell_result.exit_code == expected.exit_code
        assert actual_run_powershell_result.stdout == expected.stdout
        assert actual_run_powershell_result.stderr == expected.stderr

    @staticmethod
    def test_run_powershell_check_true_exitcode_success(remote_connection):
        expected_run_powershell_response = libs_pb2.RunPowerShellResponse()
        expected_run_powershell_response.return_value.exit_code = 0
        expected_run_powershell_response.return_value.stdout = "stdout"
        expected_run_powershell_response.return_value.stderr = "stderr"

        expected_command = "command"
        expected_variables = None

        def mock_run_powershell(actual_run_powershell_request):
            assert actual_run_powershell_request.command == expected_command
            assert (
                actual_run_powershell_request.remote_connection.environment.name
                == remote_connection.environment.name
            )
            assert (
                actual_run_powershell_request.remote_connection.environment.reference
                == remote_connection.environment.reference
            )
            return expected_run_powershell_response

        with mock.patch("dlpx.virtualization._engine.libs.run_powershell",
                        side_effect=mock_run_powershell, create=True):
            actual_run_powershell_result = libs.run_powershell(
                remote_connection,
                expected_command, expected_variables, check=True)

        assert actual_run_powershell_result.exit_code == expected_run_powershell_response.return_value.exit_code
        assert actual_run_powershell_result.stdout == expected_run_powershell_response.return_value.stdout
        assert actual_run_powershell_result.stderr == expected_run_powershell_response.return_value.stderr

    @staticmethod
    def test_run_powershell_check_true_exitcode_failed(remote_connection):
        expected_message = (
            'The script failed with exit code 1.'
            ' stdout : stdout and stderr : stderr'
        )

        response = libs_pb2.RunPowerShellResponse()
        response.return_value.exit_code = 1
        response.return_value.stdout = "stdout"
        response.return_value.stderr = "stderr"

        with mock.patch("dlpx.virtualization._engine.libs.run_powershell",
                        return_value=response, create=True):
            with pytest.raises(PluginScriptError) as info:
                libs.run_powershell(remote_connection, "test_command",
                                    check=True)

        assert info.value.message == expected_message

    @staticmethod
    def test_run_powershell_with_actionable_error(remote_connection):
        expected_id = 15
        expected_message = 'Some message'

        response = libs_pb2.RunPowerShellResponse()
        response.error.actionable_error.id = expected_id
        response.error.actionable_error.message = expected_message

        with mock.patch('dlpx.virtualization._engine.libs.run_powershell',
                        return_value=response, create=True):
            with pytest.raises(LibraryError) as err_info:
                libs.run_powershell(remote_connection, 'command')

        assert err_info.value._id == expected_id
        assert err_info.value.message == expected_message

    @staticmethod
    def test_run_powershell_with_nonactionable_error(remote_connection):
        response = libs_pb2.RunPowerShellResponse()
        na_error = libs_pb2.NonActionableLibraryError()
        response.error.non_actionable_error.CopyFrom(na_error)

        with mock.patch('dlpx.virtualization._engine.libs.run_powershell',
                        return_value=response, create=True):
            with pytest.raises(SystemExit):
                libs.run_powershell(remote_connection, 'command')

    @staticmethod
    def test_run_powershell_bad_remote_connection():
        # Set the connection to be a string instead of a RemoteConnection.
        connection = 'BadRemoteConnection'
        command = 'command'
        variables = None

        with pytest.raises(IncorrectArgumentTypeError) as err_info:
            libs.run_powershell(connection, command, variables)

        assert err_info.value.message == (
            "The function run_powershell's argument 'remote_connection' was"
            " type 'str' but should be of"
            " class 'dlpx.virtualization.common._common_classes.RemoteConnection'.")

    @staticmethod
    def test_run_powershell_bad_command(remote_connection):
        # Set the command to be an int instead of a string.
        command = 10
        variables = None

        with pytest.raises(IncorrectArgumentTypeError) as err_info:
            libs.run_powershell(remote_connection, command, variables)

        assert err_info.value.message == (
            "The function run_powershell's argument 'command' was"
            " type 'int' but should be of type 'basestring'.")

    @staticmethod
    def test_run_powershell_variables_not_dict(remote_connection):
        command = 'command'
        # Set the variables to be a string instead of a dict.
        variables = 'not a dict'

        with pytest.raises(IncorrectArgumentTypeError) as err_info:
            libs.run_powershell(remote_connection, command, variables)

        assert err_info.value.message == (
            "The function run_powershell's argument 'variables' was"
            " type 'str' but should be of"
            " type 'dict of basestring:basestring' if defined.")

    @staticmethod
    def test_run_powershell_bad_variables(remote_connection):
        command = 'command'
        #
        # Set a value inside the variables dict to be an int instead of a
        # string.
        #
        variables = {'test0': 'yes', 'test1': 10}

        with pytest.raises(IncorrectArgumentTypeError) as err_info:
            libs.run_powershell(remote_connection, command, variables)

        message = ("The function run_powershell's argument 'variables' was"
                   " a dict of {{type 'str':type '{}', type 'str':type '{}'}}"
                   " but should be of"
                   " type 'dict of basestring:basestring' if defined.")
        assert (err_info.value.message == message.format('int', 'str') or
                err_info.value.message == message.format('str', 'int'))
class TestLibsRunExpect:
    @staticmethod
    def test_run_expect(remote_connection):
        expected_run_expect_response = libs_pb2.RunExpectResponse()
        expected_run_expect_response.return_value.exit_code = 0
        expected_run_expect_response.return_value.stdout = 'stdout'
        expected_run_expect_response.return_value.stderr = 'stderr'

        expected_command = 'command'
        expected_variables = None

        def mock_run_expect(actual_run_expect_request):
            assert actual_run_expect_request.command == expected_command
            actual_environment = (
                actual_run_expect_request.remote_connection.environment)
            assert (actual_environment.name ==
                    remote_connection.environment.name)
            assert (actual_environment.reference ==
                    remote_connection.environment.reference)
            return expected_run_expect_response

        with mock.patch('dlpx.virtualization._engine.libs.run_expect',
                        side_effect=mock_run_expect, create=True):
            actual_run_expect_result = libs.run_expect(
                remote_connection,
                expected_command,
                expected_variables)

        expected = expected_run_expect_response.return_value
        assert actual_run_expect_result.exit_code == expected.exit_code
        assert actual_run_expect_result.stdout == expected.stdout
        assert actual_run_expect_result.stderr == expected.stderr

    @staticmethod
    def test_run_expect_check_true_exitcode_success(remote_connection):
        expected_run_expect_response = libs_pb2.RunExpectResponse()
        expected_run_expect_response.return_value.exit_code = 0
        expected_run_expect_response.return_value.stdout = "stdout"
        expected_run_expect_response.return_value.stderr = "stderr"

        expected_command = "command"
        expected_variables = None

        def mock_run_expect(actual_run_expect_request):
            assert actual_run_expect_request.command == expected_command
            assert (
                actual_run_expect_request.remote_connection.environment.name
                == remote_connection.environment.name
            )
            assert (
                actual_run_expect_request.remote_connection.environment.reference
                == remote_connection.environment.reference
            )
            return expected_run_expect_response

        with mock.patch("dlpx.virtualization._engine.libs.run_expect",
                        side_effect=mock_run_expect, create=True):
            actual_run_expect_result = libs.run_expect(
                remote_connection,
                expected_command, expected_variables, check=True)

        assert actual_run_expect_result.exit_code == expected_run_expect_response.return_value.exit_code
        assert actual_run_expect_result.stdout == expected_run_expect_response.return_value.stdout
        assert actual_run_expect_result.stderr == expected_run_expect_response.return_value.stderr

    @staticmethod
    def test_run_expect_check_true_exitcode_failed(remote_connection):
        expected_message = (
            'The script failed with exit code 1.'
            ' stdout : stdout and stderr : stderr'
        )

        response = libs_pb2.RunExpectResponse()
        response.return_value.exit_code = 1
        response.return_value.stdout = "stdout"
        response.return_value.stderr = "stderr"

        with mock.patch("dlpx.virtualization._engine.libs.run_expect",
                        return_value=response, create=True):
            with pytest.raises(PluginScriptError) as info:
                libs.run_expect(remote_connection, "test_command", check=True)

        assert info.value.message == expected_message

    @staticmethod
    def test_run_expect_with_actionable_error(remote_connection):
        expected_id = 15
        expected_message = 'Some message'

        response = libs_pb2.RunExpectResponse()
        response.error.actionable_error.id = expected_id
        response.error.actionable_error.message = expected_message

        with mock.patch('dlpx.virtualization._engine.libs.run_expect',
                        return_value=response, create=True):
            with pytest.raises(LibraryError) as err_info:
                libs.run_expect(remote_connection, 'command')

        assert err_info.value._id == expected_id
        assert err_info.value.message == expected_message

    @staticmethod
    def test_run_expect_with_nonactionable_error(remote_connection):
        response = libs_pb2.RunExpectResponse()
        na_error = libs_pb2.NonActionableLibraryError()
        response.error.non_actionable_error.CopyFrom(na_error)

        with mock.patch('dlpx.virtualization._engine.libs.run_expect',
                        return_value=response, create=True):
            with pytest.raises(SystemExit):
                libs.run_expect(remote_connection, "command")

    @staticmethod
    def test_run_expect_bad_remote_connection():
        # Set the connection to be a string instead of a RemoteConnection.
        connection = 'BadRemoteConnection'
        command = 'command'
        variables = None

        with pytest.raises(IncorrectArgumentTypeError) as err_info:
            libs.run_expect(connection, command, variables)

        assert err_info.value.message == (
            "The function run_expect's argument 'remote_connection' was"
            " type 'str' but should be of"
            " class 'dlpx.virtualization.common._common_classes.RemoteConnection'.")

    @staticmethod
    def test_run_expect_bad_command(remote_connection):
        # Set the command to be an int instead of a string.
        command = 10
        variables = None

        with pytest.raises(IncorrectArgumentTypeError) as err_info:
            libs.run_expect(remote_connection, command, variables)

        assert err_info.value.message == (
            "The function run_expect's argument 'command' was"
            " type 'int' but should be of type 'basestring'.")

    @staticmethod
    def test_run_expect_variables_not_dict(remote_connection):
        command = 'command'
        # Set the variables to be a string instead of a dict.
        variables = 'not a dict'

        with pytest.raises(IncorrectArgumentTypeError) as err_info:
            libs.run_expect(remote_connection, command, variables)

        assert err_info.value.message == (
            "The function run_expect's argument 'variables' was"
            " type 'str' but should be of"
            " type 'dict of basestring:basestring' if defined.")

    @staticmethod
    def test_run_expect_bad_variables(remote_connection):
        command = 'command'
        #
        # Set a value inside the variables dict to be an int instead of a
        # string.
        #
        variables = {'test0': 'yes', 'test1': 10}

        with pytest.raises(IncorrectArgumentTypeError) as err_info:
            libs.run_expect(remote_connection, command, variables)

        message = ("The function run_expect's argument 'variables' was"
                   " a dict of {{type 'str':type '{}', type 'str':type '{}'}}"
                   " but should be of"
                   " type 'dict of basestring:basestring' if defined.")
        assert (err_info.value.message == message.format('int', 'str') or
                err_info.value.message == message.format('str', 'int'))
# File: MSVNPackages/Algo.py (Magdk01/MSVNPackages, MIT license)

"""Queue and Stack data structure built with linked lists """

class Node:
    def __init__(self, value):
        self.value = value
        self.next = None


''' LinkedList_Queue class represents the linked list '''
class LinkedList_Queue():
    def __init__(self):
        self.head = None
        self.tail = None

    ''' Check if the list (queue) is empty so returns True (self.head is None)
    or not so returns False (self.head is not None) '''
    def is_empty(self):
        return not self.head

    '''Add a new node to the end of the linked list (queue)'''
    def enqueue(self, key):
        new_node = Node(key)
        if self.is_empty():
            self.head = new_node
            self.tail = new_node
        else:
            self.tail.next = new_node
            self.tail = new_node

    ''' Remove the first node of the linked list (queue).
    If the linked list (queue) is empty returns a warning message '''
    def dequeue(self):
        if self.is_empty():
            return "Warning: The queue is empty"
        deleted_element = self.head.value
        self.head = self.head.next
        return deleted_element

    ''' Return the first node of the linked list.
    If the linked list (queue) is empty returns a warning message '''
    def peek(self):
        if self.is_empty():
            return "Warning: The queue is empty"
        return self.head.value
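For a quick cross-check, the FIFO behavior that `LinkedList_Queue` implements matches the standard library's `collections.deque` — a minimal sketch of the correspondence:

```python
from collections import deque

# deque mirrors LinkedList_Queue:
# append == enqueue, popleft == dequeue, q[0] == peek.
q = deque()
for key in (1, 2, 3):
    q.append(key)          # enqueue

first = q[0]               # peek: oldest element, not removed
removed = q.popleft()      # dequeue: removes the oldest element
```

The linked-list version trades `deque`'s C-level speed for an explicit head/tail structure, but the observable semantics are the same.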
class LinkedList_Stack():
    def __init__(self):
        self.head = None
        self.tail = None

    ''' Check if the list (stack) is empty so returns True (self.head is None)
    or not so returns False (self.head is not None) '''
    def is_empty(self):
        return not self.head

    '''Add a new node to the top of the linked list (stack)'''
    def push(self, key):
        new_node = Node(key)
        if self.is_empty():
            self.head = new_node
            self.tail = new_node
        else:
            new_node.next = self.head
            self.head = new_node

    ''' Remove the first node of the linked list (stack).
    If the linked list (stack) is empty returns a warning message '''
    def pop(self):
        if self.is_empty():
            return "Warning: The stack is empty"
        deleted_element = self.head.value
        self.head = self.head.next
        return deleted_element

    ''' Return the first node of the linked list.
    If the linked list (stack) is empty returns a warning message '''
    def peek(self):
        if self.is_empty():
            return "Warning: The stack is empty"
        return self.head.value
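Likewise, the LIFO behavior of `LinkedList_Stack` matches a plain Python list used as a stack:

```python
# A list mirrors LinkedList_Stack:
# append == push, pop() == pop, s[-1] == peek.
s = []
for key in (1, 2, 3):
    s.append(key)    # push

top = s[-1]          # peek: most recent element, not removed
removed = s.pop()    # pop: removes the most recent element
```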
class max_Heap():
    def __init__(self):
        self.heap = [None]  # index 0 unused; children of i live at 2*i and 2*i + 1
        self.len = 0

    def is_empty(self):
        return self.len == 0

    def insert(self, key):
        self.len += 1
        self.heap.append(key)
        self.bubble_up(self.len)

    def swap(self, i, j):
        self.heap[i], self.heap[j] = self.heap[j], self.heap[i]

    def bubble_up(self, i):
        if i < 2:
            return
        parent = i // 2
        if self.heap[i] > self.heap[parent]:
            self.swap(i, parent)
            self.bubble_up(parent)

    def bubble_down(self, i):
        if i * 2 > self.len:
            return
        # Pick the larger child (right child only if it exists).
        child = 2 * i + 1 if 2 * i + 1 <= self.len and self.heap[2 * i + 1] > self.heap[2 * i] else 2 * i
        if self.heap[child] > self.heap[i]:
            self.swap(child, i)
            self.bubble_down(child)

    def extract_max(self):
        if self.is_empty():
            return
        maxi = self.heap[1]
        self.heap[1] = self.heap[self.len]
        self.len -= 1
        self.bubble_down(1)
        del self.heap[-1]
        return maxi

    def __str__(self):
        return ' '.join([str(elem) for elem in self.heap[1:self.len + 1]])
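The extract order of a max-heap can be cross-checked against the standard library's `heapq` (a min-heap) by negating keys — pushing `-k` and negating on pop reproduces the descending sequence that repeated `extract_max` calls yield:

```python
import heapq

keys = [5, 1, 9, 3, 7]

# heapq is a min-heap; negated keys turn it into a max-heap.
h = []
for k in keys:
    heapq.heappush(h, -k)            # plays the role of max_Heap.insert(k)

order = []
while h:
    order.append(-heapq.heappop(h))  # plays the role of extract_max()
```

`order` comes out in descending order, i.e. equal to `sorted(keys, reverse=True)`.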
class min_Heap():
    def __init__(self):
        self.heap = [None]  # index 0 unused; children of i live at 2*i and 2*i + 1
        self.len = 0

    def is_empty(self):
        return self.len == 0

    def insert(self, key):
        self.len += 1
        self.heap.append(key)
        self.bubble_up(self.len)

    def swap(self, i, j):
        self.heap[i], self.heap[j] = self.heap[j], self.heap[i]

    def bubble_up(self, i):
        if i < 2:
            return
        parent = i // 2
        if self.heap[i] < self.heap[parent]:
            self.swap(i, parent)
            self.bubble_up(parent)

    def bubble_down(self, i):
        if i * 2 > self.len:
            return
        # Pick the smaller child (right child only if it exists).
        child = 2 * i + 1 if 2 * i + 1 <= self.len and self.heap[2 * i + 1] < self.heap[2 * i] else 2 * i
        if self.heap[child] < self.heap[i]:
            self.swap(child, i)
            self.bubble_down(child)

    def extract_min(self):
        if self.is_empty():
            return
        mini = self.heap[1]
        self.heap[1] = self.heap[self.len]
        self.len -= 1
        self.bubble_down(1)
        del self.heap[-1]
        return mini

    def __str__(self):
        return ' '.join([str(elem) for elem in self.heap[1:self.len + 1]])
class Unionfind():
    def __init__(self, length):
        self.id_list = [x for x in range(length)]
        self.size = [1] * length

    def find(self, i):
        while i != self.id_list[i]:
            i = self.id_list[i]
        return i

    def union(self, c1, c2):
        r1 = self.find(c1)
        r2 = self.find(c2)
        if r1 == r2:
            return  # already in the same component; avoid double-counting size
        if self.size[r1] < self.size[r2]:
            r1, r2 = r2, r1
        self.id_list[r2] = r1
        self.size[r1] += self.size[r2]
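A short connectivity check shows how `Unionfind` is used: two elements are connected exactly when `find` returns the same root for both. The class is repeated here verbatim so the snippet runs standalone:

```python
class Unionfind:
    def __init__(self, length):
        self.id_list = [x for x in range(length)]
        self.size = [1] * length

    def find(self, i):
        while i != self.id_list[i]:
            i = self.id_list[i]
        return i

    def union(self, c1, c2):
        r1 = self.find(c1)
        r2 = self.find(c2)
        if r1 == r2:
            return  # same component already
        if self.size[r1] < self.size[r2]:
            r1, r2 = r2, r1
        self.id_list[r2] = r1
        self.size[r1] += self.size[r2]

# Elements 0-1 and 3-4 form two separate components.
uf = Unionfind(5)
uf.union(0, 1)
uf.union(3, 4)
connected_01 = uf.find(0) == uf.find(1)   # same root
connected_03 = uf.find(0) == uf.find(3)   # different roots
```

Union by size keeps the trees shallow; adding path compression inside `find` would flatten them further, but is not required for correctness.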

# --- pmacs/modules/std.py (bartlbrown/pmacs, MIT license) ---
from PyQt4 import (QtGui, QtCore)
def exit(self):
    self.closed.emit()
    self.close()

# Mark Mode
def toggle_mark_mode(self):
    self.current_widget.mark_mode = not self.current_widget.mark_mode
    if not self.current_widget.mark_mode:
        self.current_widget.unset_mark_mode()

def backspace(self):
    self.current_widget.backspace()

def enter(self):
    self.current_widget.enter()

# Cursor Movement
def move_cursor_char_right(self):
    if self.current_widget.mark_mode:
        self.current_widget.moveCursor(QtGui.QTextCursor.Right, QtGui.QTextCursor.KeepAnchor)
    else:
        self.current_widget.moveCursor(QtGui.QTextCursor.Right)

def move_cursor_char_left(self):
    if self.current_widget.mark_mode:
        self.current_widget.moveCursor(QtGui.QTextCursor.Left, QtGui.QTextCursor.KeepAnchor)
    else:
        self.current_widget.moveCursor(QtGui.QTextCursor.Left)

def move_cursor_line_up(self):
    if self.current_widget.mark_mode:
        self.current_widget.moveCursor(QtGui.QTextCursor.Up, QtGui.QTextCursor.KeepAnchor)
    else:
        self.current_widget.moveCursor(QtGui.QTextCursor.Up)

def move_cursor_line_down(self):
    if self.current_widget.mark_mode:
        self.current_widget.moveCursor(QtGui.QTextCursor.Down, QtGui.QTextCursor.KeepAnchor)
    else:
        self.current_widget.moveCursor(QtGui.QTextCursor.Down)

def move_cursor_word_right(self):
    if self.current_widget.mark_mode:
        self.current_widget.moveCursor(QtGui.QTextCursor.WordRight, QtGui.QTextCursor.KeepAnchor)
    else:
        self.current_widget.moveCursor(QtGui.QTextCursor.WordRight)

def move_cursor_word_left(self):
    if self.current_widget.mark_mode:
        self.current_widget.moveCursor(QtGui.QTextCursor.WordLeft, QtGui.QTextCursor.KeepAnchor)
    else:
        self.current_widget.moveCursor(QtGui.QTextCursor.WordLeft)

def move_cursor_block_up(self):
    if self.current_widget.mark_mode:
        self.current_widget.moveCursor(QtGui.QTextCursor.PreviousBlock, QtGui.QTextCursor.KeepAnchor)
        self.current_widget.moveCursor(QtGui.QTextCursor.StartOfBlock, QtGui.QTextCursor.KeepAnchor)
    else:
        self.current_widget.moveCursor(QtGui.QTextCursor.PreviousBlock)
        self.current_widget.moveCursor(QtGui.QTextCursor.StartOfBlock)

def move_cursor_block_down(self):
    if self.current_widget.mark_mode:
        self.current_widget.moveCursor(QtGui.QTextCursor.NextBlock, QtGui.QTextCursor.KeepAnchor)
        self.current_widget.moveCursor(QtGui.QTextCursor.EndOfBlock, QtGui.QTextCursor.KeepAnchor)
    else:
        self.current_widget.moveCursor(QtGui.QTextCursor.NextBlock)
        self.current_widget.moveCursor(QtGui.QTextCursor.EndOfBlock)

# Mark mode
def move_cursor_char_right_mark(self):
    self.current_widget.moveCursor(QtGui.QTextCursor.Right, QtGui.QTextCursor.KeepAnchor)

def move_cursor_char_left_mark(self):
    self.current_widget.moveCursor(QtGui.QTextCursor.Left, QtGui.QTextCursor.KeepAnchor)

def move_cursor_line_up_mark(self):
    self.current_widget.moveCursor(QtGui.QTextCursor.Up, QtGui.QTextCursor.KeepAnchor)

def move_cursor_line_down_mark(self):
    self.current_widget.moveCursor(QtGui.QTextCursor.Down, QtGui.QTextCursor.KeepAnchor)

def move_cursor_word_right_mark(self):
    self.current_widget.moveCursor(QtGui.QTextCursor.WordRight, QtGui.QTextCursor.KeepAnchor)

def move_cursor_word_left_mark(self):
    self.current_widget.moveCursor(QtGui.QTextCursor.WordLeft, QtGui.QTextCursor.KeepAnchor)

def move_cursor_block_up_mark(self):
    self.current_widget.moveCursor(QtGui.QTextCursor.PreviousBlock, QtGui.QTextCursor.KeepAnchor)

def move_cursor_block_down_mark(self):
    self.current_widget.moveCursor(QtGui.QTextCursor.NextBlock, QtGui.QTextCursor.KeepAnchor)
    self.current_widget.moveCursor(QtGui.QTextCursor.EndOfBlock, QtGui.QTextCursor.KeepAnchor)

def move_cursor_line_end(self):
    if self.current_widget.mark_mode:
        self.current_widget.moveCursor(QtGui.QTextCursor.EndOfLine, QtGui.QTextCursor.KeepAnchor)
    else:
        self.current_widget.moveCursor(QtGui.QTextCursor.EndOfLine)

def move_cursor_line_start(self):
    if self.current_widget.mark_mode:
        self.current_widget.moveCursor(QtGui.QTextCursor.StartOfLine, QtGui.QTextCursor.KeepAnchor)
    else:
        self.current_widget.moveCursor(QtGui.QTextCursor.StartOfLine)

def move_cursor_line_end_mark(self):
    self.current_widget.moveCursor(QtGui.QTextCursor.EndOfLine, QtGui.QTextCursor.KeepAnchor)

def move_cursor_line_start_mark(self):
    self.current_widget.moveCursor(QtGui.QTextCursor.StartOfLine, QtGui.QTextCursor.KeepAnchor)
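Although these functions are defined at module level, each one takes `self`, which suggests they are bound onto an editor window object at runtime. A minimal sketch of that binding pattern — the `Widget` and `Editor` stand-ins here are hypothetical, not part of pmacs:

```python
import types

# Hypothetical stand-ins; pmacs supplies the real editor window and widget.
class Widget:
    def __init__(self):
        self.log = []

    def backspace(self):
        self.log.append('backspace')

class Editor:
    def __init__(self):
        self.current_widget = Widget()

# A module-level command written like the ones above: it takes `self`.
def backspace(self):
    self.current_widget.backspace()

editor = Editor()
# Bind the free function to the instance so `self` resolves to `editor`.
bound = types.MethodType(backspace, editor)
bound()
```

After the bound call, the command has acted on `editor.current_widget` exactly as an ordinary method would.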

# --- lib/interactiveBrokers/__init__.py (cmorgan/trading-with-python, BSD-3-Clause license) ---
from extra import createContract
from tickLogger import logTicks
from extra import *

# --- services/getForums.py (Luclujan7198/Ac8_aplic_Distribuidas, Unlicense) ---
from models.Forum import forum

def listarForuns():
    return forum

# --- huaweicloud-sdk-bcs/huaweicloudsdkbcs/v2/bcs_async_client.py (huaweicloud/huaweicloud-sdk-python-v3, Apache-2.0 license) ---
# coding: utf-8
from __future__ import absolute_import

import datetime
import re
import importlib

import six

from huaweicloudsdkcore.client import Client, ClientBuilder
from huaweicloudsdkcore.exceptions import exceptions
from huaweicloudsdkcore.utils import http_utils
from huaweicloudsdkcore.sdk_stream_request import SdkStreamRequest


class BcsAsyncClient(Client):
    """
    :param configuration: .Configuration object for this client
    :param pool_threads: The number of threads to use for async requests
        to the API. More threads means more concurrent API requests.
    """

    PRIMITIVE_TYPES = (float, bool, bytes, six.text_type) + six.integer_types
    NATIVE_TYPES_MAPPING = {
        'int': int,
        'long': int if six.PY3 else long,
        'float': float,
        'str': str,
        'bool': bool,
        'date': datetime.date,
        'datetime': datetime.datetime,
        'object': object,
    }

    def __init__(self):
        super(BcsAsyncClient, self).__init__()
        self.model_package = importlib.import_module("huaweicloudsdkbcs.v2.model")
        self.preset_headers = {'User-Agent': 'HuaweiCloud-SDK-Python'}

    @classmethod
    def new_builder(cls, clazz=None):
        if clazz is None:
            return ClientBuilder(cls)

        if clazz.__name__ != "BcsClient":
            raise TypeError("client type error, support client type is BcsClient")

        return ClientBuilder(clazz)
    def batch_add_peers_to_channel_async(self, request):
        """Add peers to a channel

        Adds peer nodes to a channel; currently peers can only be added to one channel at a time.

        :param BatchAddPeersToChannelRequest request
        :return: BatchAddPeersToChannelResponse
        """
        return self.batch_add_peers_to_channel_with_http_info(request)

    def batch_add_peers_to_channel_with_http_info(self, request):
        """Add peers to a channel

        Adds peer nodes to a channel; currently peers can only be added to one channel at a time.

        :param BatchAddPeersToChannelRequest request
        :return: BatchAddPeersToChannelResponse
        """

        all_params = ['blockchain_id', 'batch_add_peers_to_channel_request_body']
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)

        collection_formats = {}

        path_params = {}
        if 'blockchain_id' in local_var_params:
            path_params['blockchain_id'] = local_var_params['blockchain_id']

        query_params = []

        header_params = {}

        form_params = {}

        body_params = None
        if 'body' in local_var_params:
            body_params = local_var_params['body']
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()

        response_headers = []

        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json;charset=UTF-8'])

        auth_settings = []

        return self.call_api(
            resource_path='/v2/{project_id}/blockchains/{blockchain_id}/channels/peers',
            method='POST',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='BatchAddPeersToChannelResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)
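Every `*_with_http_info` method starts with the same step: copying the attributes the caller actually set on the request model into `local_var_params` via `attribute_map`, then routing them into path, query, or body parameters. The pattern can be seen in isolation with a small stand-in request object (`FakeRequest` is a hypothetical stand-in, not an SDK class):

```python
class FakeRequest:
    """Hypothetical stand-in mimicking an SDK request model."""
    attribute_map = {'blockchain_id': 'blockchain_id', 'body': 'body'}

    def __init__(self, blockchain_id=None, body=None):
        self.blockchain_id = blockchain_id
        if body is not None:
            self.body = body   # unset attributes are simply absent

request = FakeRequest(blockchain_id='bcs-1234')

# The same extraction loop used by every *_with_http_info method:
local_var_params = {}
for attr in request.attribute_map:
    if hasattr(request, attr):
        local_var_params[attr] = getattr(request, attr)

# Attributes named in the URL template become path parameters.
path_params = {}
if 'blockchain_id' in local_var_params:
    path_params['blockchain_id'] = local_var_params['blockchain_id']
```

Because the loop is driven by `hasattr`, the unset `body` attribute never reaches `local_var_params`, so only the parameters the caller provided are serialized into the HTTP request.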
    def batch_create_channels_async(self, request):
        """Create channels

        Creates channels.

        :param BatchCreateChannelsRequest request
        :return: BatchCreateChannelsResponse
        """
        return self.batch_create_channels_with_http_info(request)

    def batch_create_channels_with_http_info(self, request):
        """Create channels

        Creates channels.

        :param BatchCreateChannelsRequest request
        :return: BatchCreateChannelsResponse
        """

        all_params = ['blockchain_id', 'batch_create_channels_request_body']
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)

        collection_formats = {}

        path_params = {}
        if 'blockchain_id' in local_var_params:
            path_params['blockchain_id'] = local_var_params['blockchain_id']

        query_params = []

        header_params = {}

        form_params = {}

        body_params = None
        if 'body' in local_var_params:
            body_params = local_var_params['body']
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()

        response_headers = []

        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json;charset=UTF-8'])

        auth_settings = []

        return self.call_api(
            resource_path='/v2/{project_id}/blockchains/{blockchain_id}/channels',
            method='POST',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='BatchCreateChannelsResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)

    def batch_invite_members_to_channel_async(self, request):
        """Invite consortium members

        Batch-invites consortium members to join a channel; an invitation notification is sent to each invitee.

        :param BatchInviteMembersToChannelRequest request
        :return: BatchInviteMembersToChannelResponse
        """
        return self.batch_invite_members_to_channel_with_http_info(request)

    def batch_invite_members_to_channel_with_http_info(self, request):
        """Invite consortium members

        Batch-invites consortium members to join a channel; an invitation notification is sent to each invitee.

        :param BatchInviteMembersToChannelRequest request
        :return: BatchInviteMembersToChannelResponse
        """

        all_params = ['batch_invite_members_to_channel_request_body']
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}

        form_params = {}

        body_params = None
        if 'body' in local_var_params:
            body_params = local_var_params['body']
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()

        response_headers = []

        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json;charset=UTF-8'])

        auth_settings = []

        return self.call_api(
            resource_path='/v2/{project_id}/members/invitations',
            method='POST',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='BatchInviteMembersToChannelResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)
    def batch_remove_orgs_from_channel_async(self, request):
        """Remove BCS organizations from a channel

        This API removes BCS organizations from the given channel.

        :param BatchRemoveOrgsFromChannelRequest request
        :return: BatchRemoveOrgsFromChannelResponse
        """
        return self.batch_remove_orgs_from_channel_with_http_info(request)

    def batch_remove_orgs_from_channel_with_http_info(self, request):
        """Remove BCS organizations from a channel

        This API removes BCS organizations from the given channel.

        :param BatchRemoveOrgsFromChannelRequest request
        :return: BatchRemoveOrgsFromChannelResponse
        """

        all_params = ['blockchain_id', 'channel_id', 'batch_remove_orgs_from_channel_request_body']
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)

        collection_formats = {}

        path_params = {}
        if 'blockchain_id' in local_var_params:
            path_params['blockchain_id'] = local_var_params['blockchain_id']
        if 'channel_id' in local_var_params:
            path_params['channel_id'] = local_var_params['channel_id']

        query_params = []

        header_params = {}

        form_params = {}

        body_params = None
        if 'body' in local_var_params:
            body_params = local_var_params['body']
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()

        response_headers = []

        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json;charset=UTF-8'])

        auth_settings = []

        return self.call_api(
            resource_path='/v2/{project_id}/blockchains/{blockchain_id}/{channel_id}/orgs/quit',
            method='PUT',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='BatchRemoveOrgsFromChannelResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)

    def create_new_blockchain_async(self, request):
        """Create a service instance

        Creates a BCS service instance; only pay-per-use creation is supported.

        :param CreateNewBlockchainRequest request
        :return: CreateNewBlockchainResponse
        """
        return self.create_new_blockchain_with_http_info(request)

    def create_new_blockchain_with_http_info(self, request):
        """Create a service instance

        Creates a BCS service instance; only pay-per-use creation is supported.

        :param CreateNewBlockchainRequest request
        :return: CreateNewBlockchainResponse
        """

        all_params = ['create_new_blockchain_request_body']
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}

        form_params = {}

        body_params = None
        if 'body' in local_var_params:
            body_params = local_var_params['body']
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()

        response_headers = []

        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json;charset=UTF-8'])

        auth_settings = []

        return self.call_api(
            resource_path='/v2/{project_id}/blockchains',
            method='POST',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='CreateNewBlockchainResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)

    def delete_blockchain_async(self, request):
        """Delete a service instance

        Deletes a BCS instance. Yearly/monthly (prepaid) instances are not supported.

        :param DeleteBlockchainRequest request
        :return: DeleteBlockchainResponse
        """
        return self.delete_blockchain_with_http_info(request)

    def delete_blockchain_with_http_info(self, request):
        """Delete a service instance

        Deletes a BCS instance. Yearly/monthly (prepaid) instances are not supported.

        :param DeleteBlockchainRequest request
        :return: DeleteBlockchainResponse
        """

        all_params = ['blockchain_id', 'is_delete_storage', 'is_delete_obs', 'is_delete_resource']
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)

        collection_formats = {}

        path_params = {}
        if 'blockchain_id' in local_var_params:
            path_params['blockchain_id'] = local_var_params['blockchain_id']

        query_params = []
        if 'is_delete_storage' in local_var_params:
            query_params.append(('is_delete_storage', local_var_params['is_delete_storage']))
        if 'is_delete_obs' in local_var_params:
            query_params.append(('is_delete_obs', local_var_params['is_delete_obs']))
        if 'is_delete_resource' in local_var_params:
            query_params.append(('is_delete_resource', local_var_params['is_delete_resource']))

        header_params = {}

        form_params = {}

        body_params = None
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()

        response_headers = []

        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json'])

        auth_settings = []

        return self.call_api(
            resource_path='/v2/{project_id}/blockchains/{blockchain_id}',
            method='DELETE',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='DeleteBlockchainResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)
    def download_blockchain_cert_async(self, request):
        """Download certificates

        Downloads the certificates of the specified service instance.

        :param DownloadBlockchainCertRequest request
        :return: DownloadBlockchainCertResponse
        """
        return self.download_blockchain_cert_with_http_info(request)

    def download_blockchain_cert_with_http_info(self, request):
        """Download certificates

        Downloads the certificates of the specified service instance.

        :param DownloadBlockchainCertRequest request
        :return: DownloadBlockchainCertResponse
        """

        all_params = ['blockchain_id', 'org_name', 'cert_type']
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)

        collection_formats = {}

        path_params = {}
        if 'blockchain_id' in local_var_params:
            path_params['blockchain_id'] = local_var_params['blockchain_id']

        query_params = []
        if 'org_name' in local_var_params:
            query_params.append(('org_name', local_var_params['org_name']))
        if 'cert_type' in local_var_params:
            query_params.append(('cert_type', local_var_params['cert_type']))

        header_params = {}

        form_params = {}

        body_params = None
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()

        response_headers = []

        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json'])

        auth_settings = []

        return self.call_api(
            resource_path='/v2/{project_id}/blockchains/{blockchain_id}/cert',
            method='GET',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='DownloadBlockchainCertResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)

    def download_blockchain_sdk_config_async(self, request):
        """Download the SDK configuration

        Downloads the SDK configuration file of the specified service instance.

        :param DownloadBlockchainSdkConfigRequest request
        :return: DownloadBlockchainSdkConfigResponse
        """
        return self.download_blockchain_sdk_config_with_http_info(request)

    def download_blockchain_sdk_config_with_http_info(self, request):
        """Download the SDK configuration

        Downloads the SDK configuration file of the specified service instance.

        :param DownloadBlockchainSdkConfigRequest request
        :return: DownloadBlockchainSdkConfigResponse
        """

        all_params = ['blockchain_id', 'download_blockchain_sdk_config_request_body']
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)

        collection_formats = {}

        path_params = {}
        if 'blockchain_id' in local_var_params:
            path_params['blockchain_id'] = local_var_params['blockchain_id']

        query_params = []

        header_params = {}

        form_params = {}

        body_params = None
        if 'body' in local_var_params:
            body_params = local_var_params['body']
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()

        response_headers = []

        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json;charset=UTF-8'])

        auth_settings = []

        return self.call_api(
            resource_path='/v2/{project_id}/blockchains/{blockchain_id}/sdk-cfg',
            method='POST',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='DownloadBlockchainSdkConfigResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)

    def handle_notification_async(self, request):
        """Handle a consortium invitation

        Handles a consortium invitation.

        :param HandleNotificationRequest request
        :return: HandleNotificationResponse
        """
        return self.handle_notification_with_http_info(request)

    def handle_notification_with_http_info(self, request):
        """Handle a consortium invitation

        Handles a consortium invitation.

        :param HandleNotificationRequest request
        :return: HandleNotificationResponse
        """

        all_params = ['handle_notification_request_body']
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}

        form_params = {}

        body_params = None
        if 'body' in local_var_params:
            body_params = local_var_params['body']
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()

        response_headers = []

        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json;charset=UTF-8'])

        auth_settings = []

        return self.call_api(
            resource_path='/v2/{project_id}/notification/handle',
            method='POST',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='HandleNotificationResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)

    def list_bcs_metric_async(self, request):
        """Query monitoring data of a BCS service instance

        This API queries monitoring data of the BCS service; the metric names to query can be specified. [IEF nodes are currently not supported](tag:hasief)

        :param ListBcsMetricRequest request
        :return: ListBcsMetricResponse
        """
        return self.list_bcs_metric_with_http_info(request)

    def list_bcs_metric_with_http_info(self, request):
        """Query monitoring data of a BCS service instance

        This API queries monitoring data of the BCS service; the metric names to query can be specified. [IEF nodes are currently not supported](tag:hasief)

        :param ListBcsMetricRequest request
        :return: ListBcsMetricResponse
        """

        all_params = ['blockchain_id', 'list_bcs_metric_request_body']
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)

        collection_formats = {}

        path_params = {}
        if 'blockchain_id' in local_var_params:
            path_params['blockchain_id'] = local_var_params['blockchain_id']

        query_params = []

        header_params = {}

        form_params = {}

        body_params = None
        if 'body' in local_var_params:
            body_params = local_var_params['body']
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()

        response_headers = []

        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json;charset=UTF-8'])

        auth_settings = []

        return self.call_api(
            resource_path='/v2/{project_id}/blockchains/{blockchain_id}/metric/list',
            method='POST',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='ListBcsMetricResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)
def list_blockchain_channels_async(self, request):
"""查询通道信息
查询指定服务实例通道信息
:param ListBlockchainChannelsRequest request
:return: ListBlockchainChannelsResponse
"""
return self.list_blockchain_channels_with_http_info(request)
def list_blockchain_channels_with_http_info(self, request):
"""查询通道信息
查询指定服务实例通道信息
:param ListBlockchainChannelsRequest request
:return: ListBlockchainChannelsResponse
"""
all_params = ['blockchain_id']
local_var_params = {}
for attr in request.attribute_map:
if hasattr(request, attr):
local_var_params[attr] = getattr(request, attr)
collection_formats = {}
path_params = {}
if 'blockchain_id' in local_var_params:
path_params['blockchain_id'] = local_var_params['blockchain_id']
query_params = []
header_params = {}
form_params = {}
body_params = None
if isinstance(request, SdkStreamRequest):
body_params = request.get_file_stream()
response_headers = []
header_params['Content-Type'] = http_utils.select_header_content_type(
['application/json'])
auth_settings = []
return self.call_api(
resource_path='/v2/{project_id}/blockchains/{blockchain_id}/channels',
method='GET',
path_params=path_params,
query_params=query_params,
header_params=header_params,
body=body_params,
post_params=form_params,
response_type='ListBlockchainChannelsResponse',
response_headers=response_headers,
auth_settings=auth_settings,
collection_formats=collection_formats,
request_type=request.__class__.__name__)
    def list_blockchains_async(self, request):
        """List service instances

        Query brief information about all service instances under the current project.

        :param ListBlockchainsRequest request
        :return: ListBlockchainsResponse
        """
        return self.list_blockchains_with_http_info(request)

    def list_blockchains_with_http_info(self, request):
        """List service instances

        Query brief information about all service instances under the current project.

        :param ListBlockchainsRequest request
        :return: ListBlockchainsResponse
        """
        all_params = []
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)
        collection_formats = {}
        path_params = {}
        query_params = []
        header_params = {}
        form_params = {}
        body_params = None
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()
        response_headers = []
        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json'])
        auth_settings = []
        return self.call_api(
            resource_path='/v2/{project_id}/blockchains',
            method='GET',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='ListBlockchainsResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)
    def list_entity_metric_async(self, request):
        """List BCS organization monitoring data

        This API is used to query the monitoring data list of BCS organizations.

        :param ListEntityMetricRequest request
        :return: ListEntityMetricResponse
        """
        return self.list_entity_metric_with_http_info(request)

    def list_entity_metric_with_http_info(self, request):
        """List BCS organization monitoring data

        This API is used to query the monitoring data list of BCS organizations.

        :param ListEntityMetricRequest request
        :return: ListEntityMetricResponse
        """
        all_params = ['blockchain_id', 'list_entity_metric_request_body']
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)
        collection_formats = {}
        path_params = {}
        if 'blockchain_id' in local_var_params:
            path_params['blockchain_id'] = local_var_params['blockchain_id']
        query_params = []
        header_params = {}
        form_params = {}
        body_params = None
        if 'body' in local_var_params:
            body_params = local_var_params['body']
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()
        response_headers = []
        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json;charset=UTF-8'])
        auth_settings = []
        return self.call_api(
            resource_path='/v2/{project_id}/blockchains/{blockchain_id}/entity/metric/list',
            method='POST',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='ListEntityMetricResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)
    def list_instance_metric_async(self, request):
        """Query BCS organization instance monitoring data details

        This API is used to query the monitoring data details of BCS organization instances.

        :param ListInstanceMetricRequest request
        :return: ListInstanceMetricResponse
        """
        return self.list_instance_metric_with_http_info(request)

    def list_instance_metric_with_http_info(self, request):
        """Query BCS organization instance monitoring data details

        This API is used to query the monitoring data details of BCS organization instances.

        :param ListInstanceMetricRequest request
        :return: ListInstanceMetricResponse
        """
        all_params = ['blockchain_id', 'list_instance_metric_request_body']
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)
        collection_formats = {}
        path_params = {}
        if 'blockchain_id' in local_var_params:
            path_params['blockchain_id'] = local_var_params['blockchain_id']
        query_params = []
        header_params = {}
        form_params = {}
        body_params = None
        if 'body' in local_var_params:
            body_params = local_var_params['body']
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()
        response_headers = []
        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json;charset=UTF-8'])
        auth_settings = []
        return self.call_api(
            resource_path='/v2/{project_id}/blockchains/{blockchain_id}/entity/instance/metric/list',
            method='POST',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='ListInstanceMetricResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)
    def list_members_async(self, request):
        """List consortium members

        Obtain the list of consortium members.

        :param ListMembersRequest request
        :return: ListMembersResponse
        """
        return self.list_members_with_http_info(request)

    def list_members_with_http_info(self, request):
        """List consortium members

        Obtain the list of consortium members.

        :param ListMembersRequest request
        :return: ListMembersResponse
        """
        all_params = []
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)
        collection_formats = {}
        path_params = {}
        query_params = []
        header_params = {}
        form_params = {}
        body_params = None
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()
        response_headers = []
        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json'])
        auth_settings = []
        return self.call_api(
            resource_path='/v2/{project_id}/members',
            method='GET',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='ListMembersResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)
    def list_notifications_async(self, request):
        """List all notifications

        Obtain all notifications.

        :param ListNotificationsRequest request
        :return: ListNotificationsResponse
        """
        return self.list_notifications_with_http_info(request)

    def list_notifications_with_http_info(self, request):
        """List all notifications

        Obtain all notifications.

        :param ListNotificationsRequest request
        :return: ListNotificationsResponse
        """
        all_params = []
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)
        collection_formats = {}
        path_params = {}
        query_params = []
        header_params = {}
        form_params = {}
        body_params = None
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()
        response_headers = []
        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json'])
        auth_settings = []
        return self.call_api(
            resource_path='/v2/{project_id}/notifications',
            method='GET',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='ListNotificationsResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)
    def list_op_record_async(self, request):
        """Query asynchronous operation results

        Query the results of asynchronous operations.

        :param ListOpRecordRequest request
        :return: ListOpRecordResponse
        """
        return self.list_op_record_with_http_info(request)

    def list_op_record_with_http_info(self, request):
        """Query asynchronous operation results

        Query the results of asynchronous operations.

        :param ListOpRecordRequest request
        :return: ListOpRecordResponse
        """
        all_params = ['blockchain_id', 'operation_status', 'resource_type', 'operation_type', 'operation_id']
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)
        collection_formats = {}
        path_params = {}
        query_params = []
        if 'blockchain_id' in local_var_params:
            query_params.append(('blockchain_id', local_var_params['blockchain_id']))
        if 'operation_status' in local_var_params:
            query_params.append(('operation_status', local_var_params['operation_status']))
        if 'resource_type' in local_var_params:
            query_params.append(('resource_type', local_var_params['resource_type']))
        if 'operation_type' in local_var_params:
            query_params.append(('operation_type', local_var_params['operation_type']))
        if 'operation_id' in local_var_params:
            query_params.append(('operation_id', local_var_params['operation_id']))
        header_params = {}
        form_params = {}
        body_params = None
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()
        response_headers = []
        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json'])
        auth_settings = []
        return self.call_api(
            resource_path='/v2/{project_id}/operation/record',
            method='GET',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='ListOpRecordResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)
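# The method above shows the pattern every GET endpoint in this client uses for
# optional filters: each parameter that was actually set on the request becomes a
# (name, value) tuple in `query_params`. A minimal standalone sketch of that logic
# (illustrative helper name, not part of the SDK):

```python
def build_query_params(local_var_params, optional_names):
    """Collect (name, value) tuples for every optional parameter that was set."""
    query_params = []
    for name in optional_names:
        if name in local_var_params:
            query_params.append((name, local_var_params[name]))
    return query_params


# Only the parameters present in the dict appear in the result, in declaration order.
params = {'blockchain_id': 'bc-1234', 'operation_status': 'success'}
names = ['blockchain_id', 'operation_status', 'resource_type',
         'operation_type', 'operation_id']
print(build_query_params(params, names))
```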
    def list_quotas_async(self, request):
        """Query quotas

        Query the quota information of all BCS resources under the current project.

        :param ListQuotasRequest request
        :return: ListQuotasResponse
        """
        return self.list_quotas_with_http_info(request)

    def list_quotas_with_http_info(self, request):
        """Query quotas

        Query the quota information of all BCS resources under the current project.

        :param ListQuotasRequest request
        :return: ListQuotasResponse
        """
        all_params = []
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)
        collection_formats = {}
        path_params = {}
        query_params = []
        header_params = {}
        form_params = {}
        body_params = None
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()
        response_headers = []
        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json'])
        auth_settings = []
        return self.call_api(
            resource_path='/v2/{project_id}/quotas',
            method='GET',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='ListQuotasResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)
    def show_blockchain_detail_async(self, request):
        """Query instance information

        Query detailed information about the specified service instance.

        :param ShowBlockchainDetailRequest request
        :return: ShowBlockchainDetailResponse
        """
        return self.show_blockchain_detail_with_http_info(request)

    def show_blockchain_detail_with_http_info(self, request):
        """Query instance information

        Query detailed information about the specified service instance.

        :param ShowBlockchainDetailRequest request
        :return: ShowBlockchainDetailResponse
        """
        all_params = ['blockchain_id']
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)
        collection_formats = {}
        path_params = {}
        if 'blockchain_id' in local_var_params:
            path_params['blockchain_id'] = local_var_params['blockchain_id']
        query_params = []
        header_params = {}
        form_params = {}
        body_params = None
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()
        response_headers = []
        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json'])
        auth_settings = []
        return self.call_api(
            resource_path='/v2/{project_id}/blockchains/{blockchain_id}',
            method='GET',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='ShowBlockchainDetailResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)
    def show_blockchain_flavors_async(self, request):
        """Query flavors

        Query the flavor information of the consortium blockchain service.

        :param ShowBlockchainFlavorsRequest request
        :return: ShowBlockchainFlavorsResponse
        """
        return self.show_blockchain_flavors_with_http_info(request)

    def show_blockchain_flavors_with_http_info(self, request):
        """Query flavors

        Query the flavor information of the consortium blockchain service.

        :param ShowBlockchainFlavorsRequest request
        :return: ShowBlockchainFlavorsResponse
        """
        all_params = []
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)
        collection_formats = {}
        path_params = {}
        query_params = []
        header_params = {}
        form_params = {}
        body_params = None
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()
        response_headers = []
        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json'])
        auth_settings = []
        return self.call_api(
            resource_path='/v2/{project_id}/blockchains/flavors',
            method='GET',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='ShowBlockchainFlavorsResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)
    def show_blockchain_nodes_async(self, request):
        """Query node information

        Query the node information of the specified service instance.

        :param ShowBlockchainNodesRequest request
        :return: ShowBlockchainNodesResponse
        """
        return self.show_blockchain_nodes_with_http_info(request)

    def show_blockchain_nodes_with_http_info(self, request):
        """Query node information

        Query the node information of the specified service instance.

        :param ShowBlockchainNodesRequest request
        :return: ShowBlockchainNodesResponse
        """
        all_params = ['blockchain_id']
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)
        collection_formats = {}
        path_params = {}
        if 'blockchain_id' in local_var_params:
            path_params['blockchain_id'] = local_var_params['blockchain_id']
        query_params = []
        header_params = {}
        form_params = {}
        body_params = None
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()
        response_headers = []
        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json'])
        auth_settings = []
        return self.call_api(
            resource_path='/v2/{project_id}/blockchains/{blockchain_id}/nodes',
            method='GET',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='ShowBlockchainNodesResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)
    def show_blockchain_status_async(self, request):
        """Query creation status

        Query the creation status of the specified service instance.

        :param ShowBlockchainStatusRequest request
        :return: ShowBlockchainStatusResponse
        """
        return self.show_blockchain_status_with_http_info(request)

    def show_blockchain_status_with_http_info(self, request):
        """Query creation status

        Query the creation status of the specified service instance.

        :param ShowBlockchainStatusRequest request
        :return: ShowBlockchainStatusResponse
        """
        all_params = ['blockchain_id']
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)
        collection_formats = {}
        path_params = {}
        if 'blockchain_id' in local_var_params:
            path_params['blockchain_id'] = local_var_params['blockchain_id']
        query_params = []
        header_params = {}
        form_params = {}
        body_params = None
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()
        response_headers = []
        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json'])
        auth_settings = []
        return self.call_api(
            resource_path='/v2/{project_id}/blockchains/{blockchain_id}/status',
            method='GET',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='ShowBlockchainStatusResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)
    def update_instance_async(self, request):
        """Update a service instance

        Modify the nodes or organizations of an instance. Currently four operation types are supported: adding and removing nodes (not supported in IEF mode), and adding and removing organizations; each call can perform only one type of operation. This API does not support the yearly/monthly billing mode. Note: when registering IEF nodes, the IEF node name must be 4 to 24 characters long.

        :param UpdateInstanceRequest request
        :return: UpdateInstanceResponse
        """
        return self.update_instance_with_http_info(request)

    def update_instance_with_http_info(self, request):
        """Update a service instance

        Modify the nodes or organizations of an instance. Currently four operation types are supported: adding and removing nodes (not supported in IEF mode), and adding and removing organizations; each call can perform only one type of operation. This API does not support the yearly/monthly billing mode. Note: when registering IEF nodes, the IEF node name must be 4 to 24 characters long.

        :param UpdateInstanceRequest request
        :return: UpdateInstanceResponse
        """
        all_params = ['blockchain_id', 'update_instance_request_body']
        local_var_params = {}
        for attr in request.attribute_map:
            if hasattr(request, attr):
                local_var_params[attr] = getattr(request, attr)
        collection_formats = {}
        path_params = {}
        if 'blockchain_id' in local_var_params:
            path_params['blockchain_id'] = local_var_params['blockchain_id']
        query_params = []
        header_params = {}
        form_params = {}
        body_params = None
        if 'body' in local_var_params:
            body_params = local_var_params['body']
        if isinstance(request, SdkStreamRequest):
            body_params = request.get_file_stream()
        response_headers = []
        header_params['Content-Type'] = http_utils.select_header_content_type(
            ['application/json;charset=UTF-8'])
        auth_settings = []
        return self.call_api(
            resource_path='/v2/{project_id}/blockchains/{blockchain_id}',
            method='PUT',
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body_params,
            post_params=form_params,
            response_type='UpdateInstanceResponse',
            response_headers=response_headers,
            auth_settings=auth_settings,
            collection_formats=collection_formats,
            request_type=request.__class__.__name__)
    def call_api(self, resource_path, method, path_params=None, query_params=None, header_params=None, body=None,
                 post_params=None, response_type=None, response_headers=None, auth_settings=None,
                 collection_formats=None, request_type=None):
        """Makes the HTTP request and returns deserialized data.

        :param resource_path: Path to method endpoint.
        :param method: Method to call.
        :param path_params: Path parameters in the url.
        :param query_params: Query parameters in the url.
        :param header_params: Header parameters to be
            placed in the request header.
        :param body: Request body.
        :param post_params: dict of request post form parameters,
            for `application/x-www-form-urlencoded` and `multipart/form-data`.
        :param auth_settings: list of auth settings names for the request.
        :param response_type: Response data type.
        :param response_headers: Headers that should be added to the response data.
        :param collection_formats: dict of collection formats for path, query,
            header, and post parameters.
        :param request_type: Request data type.
        :return: The response directly.
        """
        return self.do_http_request(
            method=method,
            resource_path=resource_path,
            path_params=path_params,
            query_params=query_params,
            header_params=header_params,
            body=body,
            post_params=post_params,
            response_type=response_type,
            response_headers=response_headers,
            collection_formats=collection_formats,
            request_type=request_type,
            async_request=True)
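# Every *_with_http_info method in this client starts with the same loop that copies
# the attributes declared in the request's `attribute_map` into a plain dict before
# building the HTTP call. A self-contained sketch of that step, using a dummy request
# class (the class and helper names here are illustrative, not the real SDK types):

```python
class DummyRequest:
    """Stand-in for a generated SDK request object."""
    # Maps Python attribute names to wire names, mirroring the SDK pattern.
    attribute_map = {'blockchain_id': 'blockchain_id', 'body': 'body'}

    def __init__(self, blockchain_id=None):
        self.blockchain_id = blockchain_id  # note: 'body' is deliberately never set


def collect_local_var_params(request):
    """Copy every attribute_map entry that exists on the request into a dict."""
    local_var_params = {}
    for attr in request.attribute_map:
        if hasattr(request, attr):
            local_var_params[attr] = getattr(request, attr)
    return local_var_params


# 'body' was never assigned, so hasattr() is False and it is skipped.
print(collect_local_var_params(DummyRequest(blockchain_id='bc-1')))
```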
# === interp_e2e_driving/networks/sequential_latent_network.py ===
# From the repository liuyuqi123/interp-e2e-driving (MIT license).
] | null | null | null | # Copyright (c) 2020: Jianyu Chen (jianyuchen@berkeley.edu).
#
# This file is modified from <https://github.com/alexlee-gk/slac>:
# Copyright (c) 2019 alexlee-gk
#
# This work is licensed under the terms of the MIT license.
# For a copy, see <https://opensource.org/licenses/MIT>.
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import functools
import gin
import numpy as np
import tensorflow as tf
import tensorflow_probability as tfp
from tf_agents.trajectories import time_step as ts
from interp_e2e_driving.utils import nest_utils
tfd = tfp.distributions
class MultivariateNormalDiag(tf.Module):

  def __init__(self, base_depth, latent_size, name=None):
    super(MultivariateNormalDiag, self).__init__(name=name)
    self.latent_size = latent_size
    self.dense1 = tf.keras.layers.Dense(base_depth, activation=tf.nn.leaky_relu)
    self.dense2 = tf.keras.layers.Dense(base_depth, activation=tf.nn.leaky_relu)
    self.output_layer = tf.keras.layers.Dense(2 * latent_size)

  def __call__(self, *inputs):
    if len(inputs) > 1:
      inputs = tf.concat(inputs, axis=-1)
    else:
      inputs, = inputs
    out = self.dense1(inputs)
    out = self.dense2(out)
    out = self.output_layer(out)
    loc = out[..., :self.latent_size]
    scale_diag = tf.nn.softplus(out[..., self.latent_size:]) + 1e-5
    return tfd.MultivariateNormalDiag(loc=loc, scale_diag=scale_diag)
class ConstantMultivariateNormalDiag(tf.Module):

  def __init__(self, latent_size, name=None):
    super(ConstantMultivariateNormalDiag, self).__init__(name=name)
    self.latent_size = latent_size

  def __call__(self, *inputs):
    # first input should not have any dimensions after the batch_shape, step_type
    batch_shape = tf.shape(inputs[0])  # input is only used to infer batch_shape
    shape = tf.concat([batch_shape, [self.latent_size]], axis=0)
    loc = tf.zeros(shape)
    scale_diag = tf.ones(shape)
    return tfd.MultivariateNormalDiag(loc=loc, scale_diag=scale_diag)
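# The learned distribution module above splits one dense layer's output of width
# 2*latent_size into a mean and a strictly positive diagonal scale (softplus plus a
# small floor). A minimal NumPy sketch of that parameterization, independent of
# TensorFlow (the helper name is illustrative only):

```python
import numpy as np


def split_gaussian_params(out, latent_size):
    """Split a (..., 2*latent_size) array into (loc, scale_diag).

    Mirrors MultivariateNormalDiag.__call__ above: the first half is the mean,
    the second half passes through softplus with a 1e-5 floor so scales stay
    strictly positive even for very negative pre-activations.
    """
    loc = out[..., :latent_size]
    scale_diag = np.log1p(np.exp(out[..., latent_size:])) + 1e-5  # softplus + floor
    return loc, scale_diag


loc, scale = split_gaussian_params(np.array([[0.0, -2.0, -10.0, 3.0]]), latent_size=2)
print(loc.shape, (scale > 0).all())
```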
class Decoder64(tf.Module):
  """Probabilistic decoder for `p(x_t | z_t)`, with input image size of 64*64.
  """

  def __init__(self, base_depth, channels=3, scale=1.0, name=None):
    super(Decoder64, self).__init__(name=name)
    self.scale = scale
    conv_transpose = functools.partial(
        tf.keras.layers.Conv2DTranspose, padding="SAME", activation=tf.nn.leaky_relu)
    self.conv_transpose1 = conv_transpose(8 * base_depth, 4, padding="VALID")
    self.conv_transpose2 = conv_transpose(4 * base_depth, 3, 2)
    self.conv_transpose3 = conv_transpose(2 * base_depth, 3, 2)
    self.conv_transpose4 = conv_transpose(base_depth, 3, 2)
    self.conv_transpose5 = conv_transpose(channels, 5, 2)

  def __call__(self, *inputs):
    if len(inputs) > 1:
      latent = tf.concat(inputs, axis=-1)
    else:
      latent, = inputs
    # (sample, N, T, latent)
    collapsed_shape = tf.stack([-1, 1, 1, tf.shape(latent)[-1]], axis=0)
    out = tf.reshape(latent, collapsed_shape)
    out = self.conv_transpose1(out)
    out = self.conv_transpose2(out)
    out = self.conv_transpose3(out)
    out = self.conv_transpose4(out)
    out = self.conv_transpose5(out)  # (sample*N*T, h, w, c)
    expanded_shape = tf.concat(
        [tf.shape(latent)[:-1], tf.shape(out)[1:]], axis=0)
    out = tf.reshape(out, expanded_shape)  # (sample, N, T, h, w, c)
    return tfd.Independent(
        distribution=tfd.Normal(loc=out, scale=self.scale),
        reinterpreted_batch_ndims=3)  # wrap (h, w, c)
class Decoder128(tf.Module):
  """Probabilistic decoder for `p(x_t | z_t)`, with input image size of 128*128.
  """

  def __init__(self, base_depth, channels=3, scale=1.0, name=None):
    super(Decoder128, self).__init__(name=name)
    self.scale = scale
    conv_transpose = functools.partial(
        tf.keras.layers.Conv2DTranspose, padding="SAME", activation=tf.nn.leaky_relu)
    self.conv_transpose1 = conv_transpose(8 * base_depth, 4, padding="VALID")
    self.conv_transpose2 = conv_transpose(8 * base_depth, 3, 2)
    self.conv_transpose3 = conv_transpose(4 * base_depth, 3, 2)
    self.conv_transpose4 = conv_transpose(2 * base_depth, 3, 2)
    self.conv_transpose5 = conv_transpose(base_depth, 3, 2)
    self.conv_transpose6 = conv_transpose(channels, 5, 2)

  def __call__(self, *inputs):
    if len(inputs) > 1:
      latent = tf.concat(inputs, axis=-1)
    else:
      latent, = inputs
    # (sample, N, T, latent)
    collapsed_shape = tf.stack([-1, 1, 1, tf.shape(latent)[-1]], axis=0)
    out = tf.reshape(latent, collapsed_shape)
    out = self.conv_transpose1(out)
    out = self.conv_transpose2(out)
    out = self.conv_transpose3(out)
    out = self.conv_transpose4(out)
    out = self.conv_transpose5(out)
    out = self.conv_transpose6(out)  # (sample*N*T, h, w, c)
    expanded_shape = tf.concat(
        [tf.shape(latent)[:-1], tf.shape(out)[1:]], axis=0)
    out = tf.reshape(out, expanded_shape)  # (sample, N, T, h, w, c)
    return tfd.Independent(
        distribution=tfd.Normal(loc=out, scale=self.scale),
        reinterpreted_batch_ndims=3)  # wrap (h, w, c)
class Decoder256(tf.Module):
  """Probabilistic decoder for `p(x_t | z_t)`, with input image size of 256*256.
  """

  def __init__(self, base_depth, channels=3, scale=1.0, name=None):
    super(Decoder256, self).__init__(name=name)
    self.scale = scale
    conv_transpose = functools.partial(
        tf.keras.layers.Conv2DTranspose, padding="SAME", activation=tf.nn.leaky_relu)
    self.conv_transpose1 = conv_transpose(8 * base_depth, 4, padding="VALID")
    self.conv_transpose2 = conv_transpose(8 * base_depth, 3, 2)
    self.conv_transpose3 = conv_transpose(8 * base_depth, 3, 2)
    self.conv_transpose4 = conv_transpose(4 * base_depth, 3, 2)
    self.conv_transpose5 = conv_transpose(2 * base_depth, 3, 2)
    self.conv_transpose6 = conv_transpose(base_depth, 3, 2)
    self.conv_transpose7 = conv_transpose(channels, 5, 2)

  def __call__(self, *inputs):
    if len(inputs) > 1:
      latent = tf.concat(inputs, axis=-1)
    else:
      latent, = inputs
    # (sample, N, T, latent)
    collapsed_shape = tf.stack([-1, 1, 1, tf.shape(latent)[-1]], axis=0)
    out = tf.reshape(latent, collapsed_shape)
    out = self.conv_transpose1(out)
    out = self.conv_transpose2(out)
    out = self.conv_transpose3(out)
    out = self.conv_transpose4(out)
    out = self.conv_transpose5(out)
    out = self.conv_transpose6(out)
    out = self.conv_transpose7(out)  # (sample*N*T, h, w, c)
    expanded_shape = tf.concat(
        [tf.shape(latent)[:-1], tf.shape(out)[1:]], axis=0)
    out = tf.reshape(out, expanded_shape)  # (sample, N, T, h, w, c)
    return tfd.Independent(
        distribution=tfd.Normal(loc=out, scale=self.scale),
        reinterpreted_batch_ndims=3)  # wrap (h, w, c)
class Encoder64(tf.Module):
  """Feature extractor for input image size 64*64.
  """

  def __init__(self, base_depth, feature_size, name=None):
    super(Encoder64, self).__init__(name=name)
    self.feature_size = feature_size
    conv = functools.partial(
        tf.keras.layers.Conv2D, padding="SAME", activation=tf.nn.leaky_relu)
    self.conv1 = conv(base_depth, 5, 2)
    self.conv2 = conv(2 * base_depth, 3, 2)
    self.conv3 = conv(4 * base_depth, 3, 2)
    self.conv4 = conv(8 * base_depth, 3, 2)
    self.conv5 = conv(8 * base_depth, 4, padding="VALID")

  def __call__(self, image):
    image_shape = tf.shape(image)[-3:]
    collapsed_shape = tf.concat(([-1], image_shape), axis=0)
    out = tf.reshape(image, collapsed_shape)  # (sample*N*T, h, w, c)
    out = self.conv1(out)
    out = self.conv2(out)
    out = self.conv3(out)
    out = self.conv4(out)
    out = self.conv5(out)
    expanded_shape = tf.concat((tf.shape(image)[:-3], [self.feature_size]), axis=0)
    return tf.reshape(out, expanded_shape)  # (sample, N, T, feature)
class Encoder128(tf.Module):
  """Feature extractor for input image size 128*128.
  """

  def __init__(self, base_depth, feature_size, name=None):
    super(Encoder128, self).__init__(name=name)
    self.feature_size = feature_size
    conv = functools.partial(
        tf.keras.layers.Conv2D, padding="SAME", activation=tf.nn.leaky_relu)
    self.conv1 = conv(base_depth, 5, 2)
    self.conv2 = conv(2 * base_depth, 3, 2)
    self.conv3 = conv(4 * base_depth, 3, 2)
    self.conv4 = conv(8 * base_depth, 3, 2)
    self.conv5 = conv(8 * base_depth, 3, 2)
    self.conv6 = conv(8 * base_depth, 4, padding="VALID")

  def __call__(self, image):
    image_shape = tf.shape(image)[-3:]
    collapsed_shape = tf.concat(([-1], image_shape), axis=0)
    out = tf.reshape(image, collapsed_shape)  # (sample*N*T, h, w, c)
    out = self.conv1(out)
    out = self.conv2(out)
    out = self.conv3(out)
    out = self.conv4(out)
    out = self.conv5(out)
    out = self.conv6(out)
    expanded_shape = tf.concat((tf.shape(image)[:-3], [self.feature_size]), axis=0)
    return tf.reshape(out, expanded_shape)  # (sample, N, T, feature)
class Encoder256(tf.Module):
  """Feature extractor for input image size 256*256.
  """

  def __init__(self, base_depth, feature_size, name=None):
    super(Encoder256, self).__init__(name=name)
    self.feature_size = feature_size
    conv = functools.partial(
        tf.keras.layers.Conv2D, padding="SAME", activation=tf.nn.leaky_relu)
    self.conv1 = conv(base_depth, 5, 2)
    self.conv2 = conv(2 * base_depth, 3, 2)
    self.conv3 = conv(4 * base_depth, 3, 2)
    self.conv4 = conv(8 * base_depth, 3, 2)
    self.conv5 = conv(8 * base_depth, 3, 2)
    self.conv6 = conv(8 * base_depth, 3, 2)
    self.conv7 = conv(8 * base_depth, 4, padding="VALID")

  def __call__(self, image):
    image_shape = tf.shape(image)[-3:]
    collapsed_shape = tf.concat(([-1], image_shape), axis=0)
    out = tf.reshape(image, collapsed_shape)  # (sample*N*T, h, w, c)
    out = self.conv1(out)
    out = self.conv2(out)
    out = self.conv3(out)
    out = self.conv4(out)
    out = self.conv5(out)
    out = self.conv6(out)
    out = self.conv7(out)
    expanded_shape = tf.concat((tf.shape(image)[:-3], [self.feature_size]), axis=0)
    return tf.reshape(out, expanded_shape)  # (sample, N, T, feature)
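# Each encoder halves the spatial resolution at every SAME-padded stride-2
# convolution (output size = ceil(input / 2)), and the final VALID 4x4 convolution
# then collapses the remaining 4x4 map to 1x1 so only the channel dimension
# survives as the feature vector. A quick arithmetic check in plain Python
# (illustrative helper names, not part of the model code):

```python
import math


def same_conv_out(size, stride=2):
    """Output size of a SAME-padded convolution: ceil(size / stride)."""
    return math.ceil(size / stride)


def encoder_spatial_sizes(input_size, num_stride2_convs):
    """Trace the spatial size through a stack of stride-2 SAME convolutions."""
    sizes = [input_size]
    for _ in range(num_stride2_convs):
        sizes.append(same_conv_out(sizes[-1]))
    return sizes


# Encoder256 applies six stride-2 convolutions before the final VALID 4x4 conv:
print(encoder_spatial_sizes(256, 6))  # → [256, 128, 64, 32, 16, 8, 4]
```

The same arithmetic explains why Encoder64 needs four stride-2 layers and Encoder128 needs five: each must reach a 4x4 map before the 4x4 VALID convolution.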
@gin.configurable
class SequentialLatentModelHierarchical(tf.Module):
"""The hierarchical sequential latent model. See https://arxiv.org/abs/1907.00953.
"""
def __init__(self,
input_names,
reconstruct_names,
obs_size=64,
base_depth=32,
latent1_size=32,
latent2_size=256,
kl_analytic=True,
decoder_stddev=np.sqrt(0.1, dtype=np.float32),
name=None):
"""Creates an instance of `SequentialLatentModelHierarchical`.
Args:
input_names: the names of the observation inputs (e.g, 'camera', 'lidar').
reconstruct_names: names of the outputs to reconstruct (e.g, 'mask').
obs_size: the pixel size of the observation inputs. Here we assume
the image inputs have same width and height.
base_depth: base depth of the convolutional layers.
latent1_size: size of the first latent of the hierarchical latent model.
latent2_size: size of the second latent of the hierarchical latent model.
kl_analytic: whether to use analytical KL divergence.
decoder_stddev: standard deviation of the decoder.
name: A string representing name of the network.
"""
super(SequentialLatentModelHierarchical, self).__init__(name=name)
self.input_names = input_names
self.reconstruct_names = reconstruct_names
self.latent1_size = latent1_size
self.latent2_size = latent2_size
self.kl_analytic = kl_analytic
self.obs_size = obs_size
latent1_first_prior_distribution_ctor = ConstantMultivariateNormalDiag
latent1_distribution_ctor = MultivariateNormalDiag
latent2_distribution_ctor = MultivariateNormalDiag
# p(z_1^1)
self.latent1_first_prior = latent1_first_prior_distribution_ctor(latent1_size)
# p(z_1^2 | z_1^1)
self.latent2_first_prior = latent2_distribution_ctor(8 * base_depth, latent2_size)
# p(z_{t+1}^1 | z_t^2, a_t)
self.latent1_prior = latent1_distribution_ctor(8 * base_depth, latent1_size)
# p(z_{t+1}^2 | z_{t+1}^1, z_t^2, a_t)
self.latent2_prior = latent2_distribution_ctor(8 * base_depth, latent2_size)
# q(z_1^1 | f_1)
self.latent1_first_posterior = latent1_distribution_ctor(8 * base_depth, latent1_size)
# q(z_1^2 | z_1^1) = p(z_1^2 | z_1^1)
self.latent2_first_posterior = self.latent2_first_prior
# q(z_{t+1}^1 | f_{t+1}, z_t^2, a_t)
self.latent1_posterior = latent1_distribution_ctor(8 * base_depth, latent1_size)
# q(z_{t+1}^2 | z_{t+1}^1, z_t^2, a_t) = p(z_{t+1}^2 | z_{t+1}^1, z_t^2, a_t)
self.latent2_posterior = self.latent2_prior
# Create encoders q(f_t|x_t)
self.encoders = {}
for name in input_names:
if obs_size == 64:
self.encoders[name] = Encoder64(base_depth, 8 * base_depth)
elif obs_size == 128:
self.encoders[name] = Encoder128(base_depth, 8 * base_depth)
elif obs_size == 256:
self.encoders[name] = Encoder256(base_depth, 8 * base_depth)
else:
raise NotImplementedError
# Create decoders q(x_t|z_t)
self.decoders = {}
for name in self.reconstruct_names:
if obs_size == 64:
self.decoders[name] = Decoder64(base_depth, scale=decoder_stddev)
elif obs_size == 128:
self.decoders[name] = Decoder128(base_depth, scale=decoder_stddev)
elif obs_size == 256:
self.decoders[name] = Decoder256(base_depth, scale=decoder_stddev)
else:
raise NotImplementedError
@property
def latent_size(self):
"""Size of the latent state."""
return self.latent1_size + self.latent2_size
def sample_prior(self, batch_size):
"""Sample the prior latent state."""
latent1 = self.latent1_first_prior(tf.zeros(batch_size)).sample()
latent2 = self.latent2_first_prior(latent1).sample()
latent = tf.concat([latent1, latent2], axis=-1)
return latent
def filter(self, image, last_latent, last_action):
"""Apply recursive filter to obtain posterior estimation of latent
q(z_{t+1}|z_t,a_t,x_{t+1}).
"""
feature = self.get_features(image)
last_latent2 = tf.gather(last_latent, tf.range(self.latent1_size, self.latent_size), axis=-1)
latent1 = self.latent1_posterior(feature, last_latent2, last_action).sample()
latent2 = self.latent2_posterior(latent1, last_latent2, last_action).sample()
latent = tf.concat([latent1, latent2], axis=-1)
return latent
def first_filter(self, image):
"""Obtain the posterior of the latent at the first timestep q(z_1|x_1)."""
feature = self.get_features(image)
latent1 = self.latent1_first_posterior(feature).sample()
latent2 = self.latent2_first_posterior(latent1).sample()
latent = tf.concat([latent1, latent2], axis=-1)
return latent
def get_features(self, images):
"""Get low dimensional features from images q(f_t|x_t)"""
features = {}
for name in self.input_names:
images_tmp = tf.image.convert_image_dtype(images[name], tf.float32)
features[name] = self.encoders[name](images_tmp)
features = sum(features.values())
return features
def reconstruct(self, latent):
"""Reconstruct the images in reconstruct_names given the latent state."""
latent1 = tf.gather(latent, tf.range(0, self.latent1_size), axis=-1)
latent2 = tf.gather(latent, tf.range(self.latent1_size, self.latent_size), axis=-1)
posterior_images = {}
for name in self.reconstruct_names:
posterior_images[name] = self.decoders[name](latent1, latent2).mean()
posterior_images = tf.concat(list(posterior_images.values()), axis=-2)
return posterior_images
def compute_latents(self, images, actions, step_types, latent_posterior_samples_and_dists=None, num_first_image=5):
"""Compute the latent states of the sequential latent model."""
sequence_length = step_types.shape[1] - 1
# Get posterior and prior samples of latents
if latent_posterior_samples_and_dists is None:
latent_posterior_samples_and_dists = self.sample_posterior(images, actions, step_types)
(latent1_posterior_samples, latent2_posterior_samples), (latent1_posterior_dists, latent2_posterior_dists) = (
latent_posterior_samples_and_dists)
(latent1_prior_samples, latent2_prior_samples), _ = self.sample_prior_or_posterior(actions, step_types) # for visualization
# Get prior samples of latents conditioned on the initial inputs
first_image = {}
for k, v in images.items():
first_image[k] = v[:, :num_first_image]
(latent1_conditional_prior_samples, latent2_conditional_prior_samples), _ = self.sample_prior_or_posterior(
actions, step_types, images=first_image) # for visualization; conditioned on the first num_first_image frames only
# Reset the initial steps of an episode to first prior latents
def where_and_concat(reset_masks, first_prior_tensors, after_first_prior_tensors):
after_first_prior_tensors = tf.where(reset_masks[:, 1:], first_prior_tensors[:, 1:], after_first_prior_tensors)
prior_tensors = tf.concat([first_prior_tensors[:, 0:1], after_first_prior_tensors], axis=1)
return prior_tensors
reset_masks = tf.concat([tf.ones_like(step_types[:, 0:1], dtype=tf.bool),
tf.equal(step_types[:, 1:], ts.StepType.FIRST)], axis=1)
latent1_reset_masks = tf.tile(reset_masks[:, :, None], [1, 1, self.latent1_size])
latent1_first_prior_dists = self.latent1_first_prior(step_types)
latent1_after_first_prior_dists = self.latent1_prior(
latent2_posterior_samples[:, :sequence_length], actions[:, :sequence_length])
latent1_prior_dists = nest_utils.map_distribution_structure(
functools.partial(where_and_concat, latent1_reset_masks),
latent1_first_prior_dists,
latent1_after_first_prior_dists)
latent2_reset_masks = tf.tile(reset_masks[:, :, None], [1, 1, self.latent2_size])
latent2_first_prior_dists = self.latent2_first_prior(latent1_posterior_samples)
latent2_after_first_prior_dists = self.latent2_prior(
latent1_posterior_samples[:, 1:sequence_length+1],
latent2_posterior_samples[:, :sequence_length],
actions[:, :sequence_length])
latent2_prior_dists = nest_utils.map_distribution_structure(
functools.partial(where_and_concat, latent2_reset_masks),
latent2_first_prior_dists,
latent2_after_first_prior_dists)
return (latent1_posterior_dists, latent1_prior_dists), (latent2_posterior_dists,
latent2_prior_dists), (latent1_posterior_samples, latent1_prior_samples,
latent1_conditional_prior_samples), (latent2_posterior_samples, latent2_prior_samples,
latent2_conditional_prior_samples)
def compute_loss(self, images, actions, step_types, latent_posterior_samples_and_dists=None, num_first_image=5):
# Compute the latents
latent1_dists, latent2_dists, latent1_samples, latent2_samples = \
self.compute_latents(images, actions, step_types, latent_posterior_samples_and_dists, num_first_image)
latent1_posterior_dists, latent1_prior_dists = latent1_dists
latent2_posterior_dists, latent2_prior_dists = latent2_dists
latent1_posterior_samples, latent1_prior_samples, \
latent1_conditional_prior_samples = latent1_samples
latent2_posterior_samples, latent2_prior_samples, \
latent2_conditional_prior_samples = latent2_samples
# Compute the KL divergence part of the ELBO
outputs = {}
if self.kl_analytic:
latent1_kl_divergences = tfd.kl_divergence(latent1_posterior_dists, latent1_prior_dists)
else:
latent1_kl_divergences = (latent1_posterior_dists.log_prob(latent1_posterior_samples)
- latent1_prior_dists.log_prob(latent1_posterior_samples))
latent1_kl_divergences = tf.reduce_sum(latent1_kl_divergences, axis=1)
outputs.update({
'latent1_kl_divergence': tf.reduce_mean(latent1_kl_divergences),
})
if self.kl_analytic:
latent2_kl_divergences = tfd.kl_divergence(latent2_posterior_dists, latent2_prior_dists)
else:
latent2_kl_divergences = (latent2_posterior_dists.log_prob(latent2_posterior_samples)
- latent2_prior_dists.log_prob(latent2_posterior_samples))
latent2_kl_divergences = tf.reduce_sum(latent2_kl_divergences, axis=1)
outputs.update({
'latent2_kl_divergence': tf.reduce_mean(latent2_kl_divergences),
})
outputs.update({
'kl_divergence': tf.reduce_mean(latent1_kl_divergences + latent2_kl_divergences),
})
elbo = - latent1_kl_divergences - latent2_kl_divergences
# Compute the reconstruction part of the ELBO
likelihood_dists = {}
likelihood_log_probs = {}
reconstruction_error = {}
for name in self.reconstruct_names:
likelihood_dists[name] = self.decoders[name](latent1_posterior_samples, latent2_posterior_samples)
images_tmp = tf.image.convert_image_dtype(images[name], tf.float32)
likelihood_log_probs[name] = likelihood_dists[name].log_prob(images_tmp)
likelihood_log_probs[name] = tf.reduce_sum(likelihood_log_probs[name], axis=1)
reconstruction_error[name] = tf.reduce_sum(tf.square(images_tmp - likelihood_dists[name].distribution.loc),
axis=list(range(-len(likelihood_dists[name].event_shape), 0)))
reconstruction_error[name] = tf.reduce_sum(reconstruction_error[name], axis=1)
outputs.update({
'log_likelihood_'+name: tf.reduce_mean(likelihood_log_probs[name]),
'reconstruction_error_'+name: tf.reduce_mean(reconstruction_error[name]),
})
elbo += likelihood_log_probs[name]
# Compute the loss
loss = -tf.reduce_mean(elbo)
# Generate the images
posterior_images = {}
prior_images = {}
conditional_prior_images = {}
for name in self.reconstruct_names:
posterior_images[name] = likelihood_dists[name].mean()
prior_images[name] = self.decoders[name](latent1_prior_samples, latent2_prior_samples).mean()
conditional_prior_images[name] = self.decoders[name](latent1_conditional_prior_samples, latent2_conditional_prior_samples).mean()
images = tf.concat([tf.image.convert_image_dtype(images[k], tf.float32)
for k in dict.fromkeys(self.input_names + self.reconstruct_names)], axis=-2) # ordered de-duplication for a deterministic concat order
posterior_images = tf.concat(list(posterior_images.values()), axis=-2)
prior_images = tf.concat(list(prior_images.values()), axis=-2)
conditional_prior_images = tf.concat(list(conditional_prior_images.values()), axis=-2)
outputs.update({
'elbo': tf.reduce_mean(elbo),
'images': images,
'posterior_images': posterior_images,
'prior_images': prior_images,
'conditional_prior_images': conditional_prior_images,
})
return loss, outputs
def sample_prior_or_posterior(self, actions, step_types=None, images=None):
"""Samples from the prior latents, or latents conditioned on input images."""
if step_types is None:
batch_size = tf.shape(actions)[0]
sequence_length = actions.shape[1] # should be statically defined
step_types = tf.fill(
[batch_size, sequence_length + 1], ts.StepType.MID)
else:
sequence_length = step_types.shape[1] - 1
actions = actions[:, :sequence_length]
if images is not None:
features = self.get_features(images)
# Swap batch and time axes
actions = tf.transpose(actions, [1, 0, 2])
step_types = tf.transpose(step_types, [1, 0])
if images is not None:
features = tf.transpose(features, [1, 0, 2])
# Get latent distributions and samples
latent1_dists = []
latent1_samples = []
latent2_dists = []
latent2_samples = []
for t in range(sequence_length + 1):
is_conditional = images is not None and (t < list(images.values())[0].shape[1])
if t == 0:
if is_conditional:
latent1_dist = self.latent1_first_posterior(features[t])
else:
latent1_dist = self.latent1_first_prior(step_types[t]) # step_types is only used to infer batch_size
latent1_sample = latent1_dist.sample()
if is_conditional:
latent2_dist = self.latent2_first_posterior(latent1_sample)
else:
latent2_dist = self.latent2_first_prior(latent1_sample)
latent2_sample = latent2_dist.sample()
else:
reset_mask = tf.equal(step_types[t], ts.StepType.FIRST)
reset_mask = tf.expand_dims(reset_mask, 1)
if is_conditional:
latent1_first_dist = self.latent1_first_posterior(features[t])
latent1_dist = self.latent1_posterior(features[t], latent2_samples[t-1], actions[t-1])
else:
latent1_first_dist = self.latent1_first_prior(step_types[t])
latent1_dist = self.latent1_prior(latent2_samples[t-1], actions[t-1])
latent1_dist = nest_utils.map_distribution_structure(
functools.partial(tf.where, reset_mask), latent1_first_dist, latent1_dist)
latent1_sample = latent1_dist.sample()
if is_conditional:
latent2_first_dist = self.latent2_first_posterior(latent1_sample)
latent2_dist = self.latent2_posterior(latent1_sample, latent2_samples[t-1], actions[t-1])
else:
latent2_first_dist = self.latent2_first_prior(latent1_sample)
latent2_dist = self.latent2_prior(latent1_sample, latent2_samples[t-1], actions[t-1])
latent2_dist = nest_utils.map_distribution_structure(
functools.partial(tf.where, reset_mask), latent2_first_dist, latent2_dist)
latent2_sample = latent2_dist.sample()
latent1_dists.append(latent1_dist)
latent1_samples.append(latent1_sample)
latent2_dists.append(latent2_dist)
latent2_samples.append(latent2_sample)
latent1_dists = nest_utils.map_distribution_structure(lambda *x: tf.stack(x, axis=1), *latent1_dists)
latent1_samples = tf.stack(latent1_samples, axis=1)
latent2_dists = nest_utils.map_distribution_structure(lambda *x: tf.stack(x, axis=1), *latent2_dists)
latent2_samples = tf.stack(latent2_samples, axis=1)
return (latent1_samples, latent2_samples), (latent1_dists, latent2_dists)
def sample_posterior(self, images, actions, step_types, features=None):
"""Sample posterior latents conditioned on input images."""
sequence_length = step_types.shape[1] - 1
actions = actions[:, :sequence_length]
if features is None:
features = self.get_features(images)
# Swap batch and time axes
features = tf.transpose(features, [1, 0, 2])
actions = tf.transpose(actions, [1, 0, 2])
step_types = tf.transpose(step_types, [1, 0])
# Get latent distributions and samples
latent1_dists = []
latent1_samples = []
latent2_dists = []
latent2_samples = []
for t in range(sequence_length + 1):
if t == 0:
latent1_dist = self.latent1_first_posterior(features[t])
latent1_sample = latent1_dist.sample()
latent2_dist = self.latent2_first_posterior(latent1_sample)
latent2_sample = latent2_dist.sample()
else:
reset_mask = tf.equal(step_types[t], ts.StepType.FIRST)
reset_mask = tf.expand_dims(reset_mask, 1)
latent1_first_dist = self.latent1_first_posterior(features[t])
latent1_dist = self.latent1_posterior(features[t], latent2_samples[t-1], actions[t-1])
latent1_dist = nest_utils.map_distribution_structure(
functools.partial(tf.where, reset_mask), latent1_first_dist, latent1_dist)
latent1_sample = latent1_dist.sample()
latent2_first_dist = self.latent2_first_posterior(latent1_sample)
latent2_dist = self.latent2_posterior(latent1_sample, latent2_samples[t-1], actions[t-1])
latent2_dist = nest_utils.map_distribution_structure(
functools.partial(tf.where, reset_mask), latent2_first_dist, latent2_dist)
latent2_sample = latent2_dist.sample()
latent1_dists.append(latent1_dist)
latent1_samples.append(latent1_sample)
latent2_dists.append(latent2_dist)
latent2_samples.append(latent2_sample)
latent1_dists = nest_utils.map_distribution_structure(lambda *x: tf.stack(x, axis=1), *latent1_dists)
latent1_samples = tf.stack(latent1_samples, axis=1)
latent2_dists = nest_utils.map_distribution_structure(lambda *x: tf.stack(x, axis=1), *latent2_dists)
latent2_samples = tf.stack(latent2_samples, axis=1)
return (latent1_samples, latent2_samples), (latent1_dists, latent2_dists)
@gin.configurable
class SequentialLatentModelNonHierarchical(tf.Module):
def __init__(self,
input_names,
reconstruct_names,
obs_size=64,
base_depth=32,
latent_size=64,
kl_analytic=True,
decoder_stddev=np.sqrt(0.1, dtype=np.float32),
name=None):
"""
Read codes:
- sss
Creates an instance of `SequentialLatentModelNonHierarchical`.
Args:
input_names: the names of the observation inputs (e.g., 'camera', 'lidar').
reconstruct_names: names of the outputs to reconstruct (e.g., 'mask').
obs_size: the pixel size of the observation inputs. Here we assume
the image inputs have the same width and height.
base_depth: base depth of the convolutional layers.
latent_size: size of the latent state.
kl_analytic: whether to use analytical KL divergence.
decoder_stddev: standard deviation of the decoder.
name: A string representing name of the network.
"""
super(SequentialLatentModelNonHierarchical, self).__init__(name=name)
self.input_names = input_names
self.reconstruct_names = reconstruct_names
self.latent_size = latent_size
self.kl_analytic = kl_analytic
self.obs_size = obs_size
latent_first_prior_distribution_ctor = ConstantMultivariateNormalDiag
latent_distribution_ctor = MultivariateNormalDiag
# p(z_1)
self.latent_first_prior = latent_first_prior_distribution_ctor(latent_size)
# p(z_{t+1} | z_t, a_t)
self.latent_prior = latent_distribution_ctor(8 * base_depth, latent_size)
# q(z_1 | f_1)
self.latent_first_posterior = latent_distribution_ctor(8 * base_depth, latent_size)
# q(z_{t+1} | f_{t+1}, z_t, a_t)
self.latent_posterior = latent_distribution_ctor(8 * base_depth, latent_size)
# Create encoders q(f_t|x_t)
self.encoders = {}
for name in input_names:
if obs_size == 64:
self.encoders[name] = Encoder64(base_depth, 8 * base_depth)
elif obs_size == 128:
self.encoders[name] = Encoder128(base_depth, 8 * base_depth)
elif obs_size == 256:
self.encoders[name] = Encoder256(base_depth, 8 * base_depth)
else:
raise NotImplementedError(f"Unsupported obs_size: {obs_size}")
# Create decoders p(x_t|z_t)
self.decoders = {}
for name in self.reconstruct_names:
if obs_size == 64:
self.decoders[name] = Decoder64(base_depth, scale=decoder_stddev)
elif obs_size == 128:
self.decoders[name] = Decoder128(base_depth, scale=decoder_stddev)
elif obs_size == 256:
self.decoders[name] = Decoder256(base_depth, scale=decoder_stddev)
else:
raise NotImplementedError(f"Unsupported obs_size: {obs_size}")
def sample_prior(self, batch_size):
"""Sample the prior latent state."""
latent = self.latent_first_prior(tf.zeros(batch_size)).sample()
return latent
def filter(self, image, last_latent, last_action):
"""Apply recursive filter to obtain posterior estimation of latent
q(z_{t+1}|z_t,a_t,x_{t+1}).
"""
feature = self.get_features(image)
latent = self.latent_posterior(feature, last_latent, last_action).sample()
return latent
def first_filter(self, image):
"""Obtain the posterior of the latent at the first timestep q(z_1|x_1)."""
feature = self.get_features(image)
latent = self.latent_first_posterior(feature).sample()
return latent
def get_features(self, images):
"""Get low dimensional features from images q(f_t|x_t)"""
features = {}
for name in self.input_names:
images_tmp = tf.image.convert_image_dtype(images[name], tf.float32)
features[name] = self.encoders[name](images_tmp)
features = sum(features.values())
return features
def reconstruct(self, latent):
"""Reconstruct the images in reconstruct_names given the latent state."""
posterior_images = {}
for name in self.reconstruct_names:
posterior_images[name] = self.decoders[name](latent).mean()
posterior_images = tf.concat(list(posterior_images.values()), axis=-2)
return posterior_images
def compute_latents(self, images, actions, step_types, latent_posterior_samples_and_dists=None):
"""Compute the latent states of the sequential latent model."""
sequence_length = step_types.shape[1] - 1
# Get posterior and prior samples of latents
if latent_posterior_samples_and_dists is None:
latent_posterior_samples_and_dists = self.sample_posterior(images, actions, step_types)
latent_posterior_samples, latent_posterior_dists = latent_posterior_samples_and_dists
latent_prior_samples, _ = self.sample_prior_or_posterior(actions, step_types) # for visualization
# Get prior samples of latents conditioned on the initial inputs
first_image = {}
num_first_image = 3
for k, v in images.items():
first_image[k] = v[:, :num_first_image]
latent_conditional_prior_samples, _ = self.sample_prior_or_posterior(
actions, step_types, images=first_image) # for visualization; conditioned on the first num_first_image frames only
# Reset the initial steps of an episode to first prior latents
def where_and_concat(reset_masks, first_prior_tensors, after_first_prior_tensors):
after_first_prior_tensors = tf.where(reset_masks[:, 1:], first_prior_tensors[:, 1:], after_first_prior_tensors)
prior_tensors = tf.concat([first_prior_tensors[:, 0:1], after_first_prior_tensors], axis=1)
return prior_tensors
reset_masks = tf.concat([tf.ones_like(step_types[:, 0:1], dtype=tf.bool),
tf.equal(step_types[:, 1:], ts.StepType.FIRST)], axis=1)
latent_reset_masks = tf.tile(reset_masks[:, :, None], [1, 1, self.latent_size])
latent_first_prior_dists = self.latent_first_prior(step_types)
# these distributions start at t=1 and the inputs are from t-1
latent_after_first_prior_dists = self.latent_prior(
latent_posterior_samples[:, :sequence_length], actions[:, :sequence_length])
latent_prior_dists = nest_utils.map_distribution_structure(
functools.partial(where_and_concat, latent_reset_masks),
latent_first_prior_dists,
latent_after_first_prior_dists)
return (latent_posterior_dists, latent_prior_dists), (latent_posterior_samples,
latent_prior_samples, latent_conditional_prior_samples)
def compute_loss(self, images, actions, step_types, latent_posterior_samples_and_dists=None):
# Compute the latents
latent_dists, latent_samples = self.compute_latents(images, actions, step_types, latent_posterior_samples_and_dists)
latent_posterior_dists, latent_prior_dists = latent_dists
latent_posterior_samples, latent_prior_samples, latent_conditional_prior_samples = latent_samples
# Compute the KL divergence part of the ELBO
outputs = {}
if self.kl_analytic:
latent_kl_divergences = tfd.kl_divergence(latent_posterior_dists, latent_prior_dists)
else:
latent_kl_divergences = (latent_posterior_dists.log_prob(latent_posterior_samples)
- latent_prior_dists.log_prob(latent_posterior_samples))
latent_kl_divergences = tf.reduce_sum(latent_kl_divergences, axis=1)
outputs.update({
'latent_kl_divergence': tf.reduce_mean(latent_kl_divergences),
})
elbo = - latent_kl_divergences
# Compute the reconstruction part of the ELBO
likelihood_dists = {}
likelihood_log_probs = {}
reconstruction_error = {}
for name in self.reconstruct_names:
likelihood_dists[name] = self.decoders[name](latent_posterior_samples)
images_tmp = tf.image.convert_image_dtype(images[name], tf.float32)
likelihood_log_probs[name] = likelihood_dists[name].log_prob(images_tmp)
likelihood_log_probs[name] = tf.reduce_sum(likelihood_log_probs[name], axis=1)
reconstruction_error[name] = tf.reduce_sum(tf.square(images_tmp - likelihood_dists[name].distribution.loc),
axis=list(range(-len(likelihood_dists[name].event_shape), 0)))
reconstruction_error[name] = tf.reduce_sum(reconstruction_error[name], axis=1)
outputs.update({
'log_likelihood_'+name: tf.reduce_mean(likelihood_log_probs[name]),
'reconstruction_error_'+name: tf.reduce_mean(reconstruction_error[name]),
})
elbo += likelihood_log_probs[name]
# average over the batch dimension
loss = -tf.reduce_mean(elbo)
# Generate the images
posterior_images = {}
prior_images = {}
conditional_prior_images = {}
for name in self.reconstruct_names:
posterior_images[name] = likelihood_dists[name].mean()
prior_images[name] = self.decoders[name](latent_prior_samples).mean()
conditional_prior_images[name] = self.decoders[name](latent_conditional_prior_samples).mean()
images = tf.concat([tf.image.convert_image_dtype(images[k], tf.float32)
for k in dict.fromkeys(self.input_names + self.reconstruct_names)], axis=-2) # ordered de-duplication for a deterministic concat order
posterior_images = tf.concat(list(posterior_images.values()), axis=-2)
prior_images = tf.concat(list(prior_images.values()), axis=-2)
conditional_prior_images = tf.concat(list(conditional_prior_images.values()), axis=-2)
outputs.update({
'elbo': tf.reduce_mean(elbo),
'images': images,
'posterior_images': posterior_images,
'prior_images': prior_images,
'conditional_prior_images': conditional_prior_images,
})
return loss, outputs
def sample_prior_or_posterior(self, actions, step_types=None, images=None):
"""Samples from the prior latents, or latents conditioned on input images."""
if step_types is None:
batch_size = tf.shape(actions)[0]
sequence_length = actions.shape[1] # should be statically defined
step_types = tf.fill(
[batch_size, sequence_length + 1], ts.StepType.MID)
else:
sequence_length = step_types.shape[1] - 1
actions = actions[:, :sequence_length]
if images is not None:
features = self.get_features(images)
# Swap batch and time axes
actions = tf.transpose(actions, [1, 0, 2])
step_types = tf.transpose(step_types, [1, 0])
if images is not None:
features = tf.transpose(features, [1, 0, 2])
# Get latent distributions and samples
latent_dists = []
latent_samples = []
for t in range(sequence_length + 1):
is_conditional = images is not None and (t < list(images.values())[0].shape[1])
if t == 0:
if is_conditional:
latent_dist = self.latent_first_posterior(features[t])
else:
latent_dist = self.latent_first_prior(step_types[t]) # step_types is only used to infer batch_size
latent_sample = latent_dist.sample()
else:
reset_mask = tf.equal(step_types[t], ts.StepType.FIRST)
reset_mask = tf.expand_dims(reset_mask, 1)
if is_conditional:
latent_first_dist = self.latent_first_posterior(features[t])
latent_dist = self.latent_posterior(features[t], latent_samples[t-1], actions[t-1])
else:
latent_first_dist = self.latent_first_prior(step_types[t])
latent_dist = self.latent_prior(latent_samples[t-1], actions[t-1])
latent_dist = nest_utils.map_distribution_structure(
functools.partial(tf.where, reset_mask), latent_first_dist, latent_dist)
latent_sample = latent_dist.sample()
latent_dists.append(latent_dist)
latent_samples.append(latent_sample)
latent_dists = nest_utils.map_distribution_structure(lambda *x: tf.stack(x, axis=1), *latent_dists)
latent_samples = tf.stack(latent_samples, axis=1)
return latent_samples, latent_dists
def sample_posterior(self, images, actions, step_types, features=None):
"""Sample posterior latents conditioned on input images."""
sequence_length = step_types.shape[1] - 1
actions = actions[:, :sequence_length]
if features is None:
features = self.get_features(images)
# Swap batch and time axes
features = tf.transpose(features, [1, 0, 2])
actions = tf.transpose(actions, [1, 0, 2])
step_types = tf.transpose(step_types, [1, 0])
# Get latent distributions and samples
latent_dists = []
latent_samples = []
for t in range(sequence_length + 1):
if t == 0:
latent_dist = self.latent_first_posterior(features[t])
latent_sample = latent_dist.sample()
else:
reset_mask = tf.equal(step_types[t], ts.StepType.FIRST)
reset_mask = tf.expand_dims(reset_mask, 1)
latent_first_dist = self.latent_first_posterior(features[t])
latent_dist = self.latent_posterior(features[t], latent_samples[t-1], actions[t-1])
latent_dist = nest_utils.map_distribution_structure(
functools.partial(tf.where, reset_mask), latent_first_dist, latent_dist)
latent_sample = latent_dist.sample()
latent_dists.append(latent_dist)
latent_samples.append(latent_sample)
latent_dists = nest_utils.map_distribution_structure(lambda *x: tf.stack(x, axis=1), *latent_dists)
latent_samples = tf.stack(latent_samples, axis=1)
return latent_samples, latent_dists | 43.76698 | 135 | 0.703784 | 5,694 | 41,885 | 4.894275 | 0.060063 | 0.023898 | 0.011124 | 0.009473 | 0.893247 | 0.855246 | 0.823202 | 0.798479 | 0.763097 | 0.738123 | 0 | 0.027268 | 0.188349 | 41,885 | 957 | 136 | 43.76698 | 0.792476 | 0.12477 | 0 | 0.730823 | 0 | 0 | 0.008947 | 0.003634 | 0 | 0 | 0 | 0 | 0 | 1 | 0.054393 | false | 0 | 0.013947 | 0 | 0.122734 | 0.001395 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
] | null | null | null | #coding:utf-8
#! /usr/bin/env python
#Author:Sh4d0w_小白
from tkinter import messagebox
from os import makedirs
from time import sleep
from subprocess import run
flag = 0
def system(command):
run(command, shell=True, encoding="utf-8")
checkButtonText = [
'Disable network sharing', 'Disable telnet', 'Disable guest account', 'USB write protection', 'Disable autoplay', 'Disable remote services', 'Turn off Bing', 'Auto-close unresponsive apps', 'Remove fake optimizations', 'Improve SuperFetch', 'Optimize search indexing', 'Disable extra visual effects', 'Disable system hibernation', 'Optimize system services', 'Office security hardening', 'Disable fullscreen optimization'
]
treasureBoxText = [
'Flush ARP cache', 'Reset Winsock', 'Re-register DLLs', 'Disable IE security alerts', 'Repair Windows Installer', 'Repair Group Policy startup', 'Flush DNS and renew IP', 'Unblock CMD', 'Unblock Registry Editor', 'Unblock firewall (Win10)', 'Reset routing table', 'Reset system apps (Win10)', 'Clear desktop icon cache (Win10)', 'Fix ESENT EDB.log errors', 'SRU repair (1909)'
]
def IsOk():
messagebox.showinfo(message="Optimization succeeded")
def UserCancel():
global flag
flag += 1
messagebox.showerror(message='The user canceled the operation')
def NoShare():
if messagebox.askyesno(message='Note: disabling network sharing will prevent file sharing'):
system('net share C$ /del & net share D$ /del & net share ADMIN$ /del & net share E$ /del & net share F$ /del & net share G$ /del')
system('reg add "HKEY_LOCAL_MACHINE\\system\\CurrentControlSet\\Control\\Lsa" /v restrictanonymous /t REG_DWORD /d 1 /f')
system('reg add "HKEY_LOCAL_MACHINE\\system\\CurrentControlSet\\Services\\LanmanServer\\Parameters" /v AutoShareServer /t REG_DWORD /d 0 /f')
system('reg add "HKEY_LOCAL_MACHINE\\system\\CurrentControlSet\\services\\LanmanServer\\Parameters" /v AutoShareWks /t REG_DWORD /d 0 /f')
system('reg add "HKEY_LOCAL_MACHINE\\system\\CurrentControlSet\\Services\\LanmanServer" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\system\\CurrentControlSet\\Services\\LanmanWorkstation" /v Start /t REG_DWORD /d 4 /f')
system('net use * /del /y')
system('netsh wlan set allowexplicitcreds no')
else:
UserCancel()
def NoTelnet():
system('reg add "HKEY_LOCAL_MACHINE\\system\\CurrentControlSet\\Services\\tlntsvr" /v Start /t REG_DWORD /d 4 /f')
def NoGuest():
system('net user Guest /active:no')
def ProtectUSB():
if messagebox.askyesno(message='Note: USB write protection will prevent writing files to USB drives'):
system('reg add "HKEY_LOCAL_MACHINE\\system\\CurrentControlSet\\Control\\StorageDevicePolicies" /v WriteProtect /t REG_DWORD /d 1 /f')
else:
UserCancel()
def NoAutoRun():
system('reg add "HKEY_CURRENT_USER\\SOFTWARE\\Microsoft\\Windows\\CurrentVersion\\Policies\\Explorer" /v NoDriveTypeAutoRun /t REG_DWORD /d 255 /f')
system('reg add "HKEY_CURRENT_USER\\SOFTWARE\\Policies\\Microsoft\\Windows\\Explorer" /v NoAutoplayfornonVolume /t REG_DWORD /d 1 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Microsoft\\Windows\\CurrentVersion\\Policies\\Explorer" /v NoDriveTypeAutoRun /t REG_DWORD /d 255 /f')
system('reg add "HKEY_LOCAL_MACHINE\\system\\ControlSet001\\Services\\cdrom" /v Autorun /t REG_DWORD /d 0 /f')
system('reg add "HKEY_LOCAL_MACHINE\\system\\CurrentControlSet\\Services\\cdrom" /v Autorun /t REG_DWORD /d 0 /f')
def NoRemote():
system('reg add "HKEY_LOCAL_MACHINE\\system\\CurrentControlSet\\Services\\RemoteRegistry" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\system\\CurrentControlSet\\Services\\RemoteRegAccess" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\system\\CurrentControlSet\\Services\\UmRdpService" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\system\\CurrentControlSet\\Services\\TermService" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\system\\CurrentControlSet\\Services\\SessionEnv" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\system\\CurrentControlSet\\Services\\RasMan" /v Start /t REG_DWORD /d 3 /f')
system('reg add "HKEY_LOCAL_MACHINE\\system\\CurrentControlSet\\Services\\RasAuto" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\system\\CurrentControlSet\\Services\\Secondary Logon" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\SharedAccess" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\HomeGroupListener" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\HomeGroupProvider" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\WPCSvc" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\WerSvc" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\WMPNetworkSvc" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\WinRM" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\wscsvc" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\NetTcpPortSharing" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\QWAVE" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\RpcLocator" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\lmhosts" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\NetBT" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\NetBIOS" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\RDSessMgr" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\DiagTrack" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\SSDPSRV" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\upnphost" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\lfsvc" /v Start /t REG_DWORD /d 4 /f')
if messagebox.askyesno(message='Disable the built-in Windows 10 updates?'):
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\WindowsUpdate" /v ExcludeWUDriversInQualityUpdate /t REG_DWORD /d 1 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\WindowsUpdate" /v TargetReleaseVersion /t REG_SZ /d 1909 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\WindowsUpdate" /v TargetReleaseVersion /t REG_DWORD /d 1 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\WindowsUpdate\\AU" /v AutoInstallMinorUpdates /t REG_DWORD /d 0 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\WindowsUpdate\\AU" /v EnableFeaturedSoftware /t REG_DWORD /d 0 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\WindowsUpdate\\AU" /v NoAutoUpdate /t REG_DWORD /d 0 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\wuauserv" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\UsoSvc" /v Start /t REG_DWORD /d 4 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\WaaSMedicSvc" /v ExcludeWUDriversInQualityUpdate /t REG_DWORD /d 4 /f')
system('Schtasks /Change /DISABLE /TN "\\Microsoft\\Windows\\WindowsUpdate\\Scheduled Start"')
else:
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\wuauserv" /v Start /t REG_DWORD /d 3 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\UsoSvc" /v Start /t REG_DWORD /d 3 /f')
system('reg add "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\WaaSMedicSvc" /v ExcludeWUDriversInQualityUpdate /t REG_DWORD /d 3 /f')
system('Schtasks /Change /ENABLE /TN "\\Microsoft\\Windows\\WindowsUpdate\\Scheduled Start"')
system('reg delete "HKEY_LOCAL_MACHINE\\SOFTWARE\\Policies\\Microsoft\\Windows\\WindowsUpdate" /v TargetReleaseVersion /f')
    if messagebox.askyesno(message='Disable the Windows 10 mobile hotspot?'):
        system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\icssvc" /v Start /t REG_DWORD /d 4 /f')
    else:
        system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\icssvc" /v Start /t REG_DWORD /d 3 /f')
    if messagebox.askyesno(message='Disable ADSL dial-up?'):
        system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\TapiSrv" /v Start /t REG_DWORD /d 4 /f')
    else:
        system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\TapiSrv" /v Start /t REG_DWORD /d 3 /f')
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\p2pimsvc" /v Start /t REG_DWORD /d 3 /f')
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\p2psvc" /v Start /t REG_DWORD /d 3 /f')
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\PNRPsvc" /v Start /t REG_DWORD /d 3 /f')
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\sppsvc" /v Start /t REG_DWORD /d 3 /f')
def NoBing():
    system('reg add "HKEY_CURRENT_USER\\SOFTWARE\\Policies\\Microsoft\\Windows\\Explorer" /v DisableSearchBoxSuggestions /t REG_DWORD /d 1 /f')
def AutoEndTask():
    system('reg add "HKEY_CURRENT_USER\\Control Panel\\Desktop" /v AutoEndTasks /t REG_SZ /d 1 /f')
def ErrorOptimizationClean():
    system('reg add "HKEY_CURRENT_USER\\Control Panel\\Desktop" /v MenuShowDelay /t REG_SZ /d 400 /f')
    system('reg add "HKEY_CURRENT_USER\\Control Panel\\Desktop" /v WaitToKillAppTimeout /t REG_SZ /d 5000 /f')
    system('reg add "HKEY_CURRENT_USER\\Control Panel\\Desktop" /v HungAppTimeout /t REG_SZ /d 5000 /f')
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Control" /v WaitToKillServiceTimeout /t REG_SZ /d 5000 /f')
    system('reg delete "HKEY_CURRENT_USER\\SOFTWARE\\Microsoft\\Windows\\CurrentVersion\\Explorer\\Advanced" /v ThumbnailLivePreviewHoverTime /f')
def OptimizeSuperfetch():
    if messagebox.askyesno(message='Boost Superfetch prefetching? "Yes" boosts it, "No" keeps the system default'):
        system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Control\\Session Manager\\Memory Management\\PrefetchParameters" /v EnablePrefetcher /t REG_DWORD /d 4 /f')
    else:
        system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Control\\Session Manager\\Memory Management\\PrefetchParameters" /v EnablePrefetcher /t REG_DWORD /d 3 /f')
def OptimizeSearch():
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\WSearch" /v Start /t REG_DWORD /d 2 /f')
def NoVFX():
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\UxSms" /v Start /t REG_DWORD /d 4 /f')
    system('reg add "HKEY_CURRENT_USER\\SOFTWARE\\Microsoft\\Windows\\CurrentVersion\\Themes\\Personalize" /v EnableBlurBehind /t REG_DWORD /d 0 /f')
    system('reg add "HKEY_CURRENT_USER\\SOFTWARE\\Microsoft\\Windows\\DWM" /v EnableAeroPeek /t REG_DWORD /d 0 /f')
    system('reg add "HKEY_CURRENT_USER\\SOFTWARE\\Microsoft\\Windows\\DWM" /v ColorizationGlassAttribute /t REG_DWORD /d 0 /f')
def NoDormancy():
    # Turns hibernation off (this also removes hiberfil.sys)
    system('powercfg -h off')
def OptimizeService():
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\WpnService" /v Start /t REG_DWORD /d 4 /f')
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\OneSyncSvc" /v Start /t REG_DWORD /d 4 /f')
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\PushToInstall" /v Start /t REG_DWORD /d 4 /f')
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\vmicrdv" /v Start /t REG_DWORD /d 4 /f')
    if messagebox.askyesno(message='Disable the Windows Store?'):
        system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\InstallService" /v Start /t REG_DWORD /d 4 /f')
        system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\LicenseManager" /v Start /t REG_DWORD /d 4 /f')
        system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\NcbService" /v Start /t REG_DWORD /d 4 /f')
    else:
        system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\InstallService" /v Start /t REG_DWORD /d 3 /f')
        system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\LicenseManager" /v Start /t REG_DWORD /d 3 /f')
        system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\NcbService" /v Start /t REG_DWORD /d 3 /f')
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\stisvc" /v Start /t REG_DWORD /d 3 /f')
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\wmiApSrv" /v Start /t REG_DWORD /d 3 /f')
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\XLServicePlatform" /v Start /t REG_DWORD /d 3 /f')
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\TabletInputService" /v Start /t REG_DWORD /d 3 /f')
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\MapsBroker" /v Start /t REG_DWORD /d 3 /f')
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\DsmSvc" /v Start /t REG_DWORD /d 3 /f')
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\WbioSrvc" /v Start /t REG_DWORD /d 3 /f')
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\DPS" /v Start /t REG_DWORD /d 3 /f')
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\Spooler" /v Start /t REG_DWORD /d 3 /f')
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\DusmSvc" /v Start /t REG_DWORD /d 3 /f')
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\iphlpsvc" /v Start /t REG_DWORD /d 3 /f')
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\CDPSvc" /v Start /t REG_DWORD /d 3 /f')
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\CryptSvc" /v Start /t REG_DWORD /d 3 /f')
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\SysMain" /v DelayedAutostart /t REG_DWORD /d 1 /f')
def OfficeSecurity():
    system('reg add "HKEY_CURRENT_USER\\Software\\Microsoft\\Office\\16.0\\Word\\Security" /v RequireAddinSig /t REG_DWORD /d 1 /f')
    system('reg add "HKEY_CURRENT_USER\\Software\\Microsoft\\Office\\16.0\\Word\\Security" /v NoTBPromptUnsignedAddin /t REG_DWORD /d 1 /f')
    system('reg add "HKEY_CURRENT_USER\\Software\\Microsoft\\Office\\16.0\\Word\\Security\\ProtectedView" /v DisableAttachmentsInPV /t REG_DWORD /d 0 /f')
    system('reg add "HKEY_CURRENT_USER\\Software\\Microsoft\\Office\\16.0\\Word\\Security\\ProtectedView" /v DisableInternetFilesInPV /t REG_DWORD /d 0 /f')
    system('reg add "HKEY_CURRENT_USER\\Software\\Microsoft\\Office\\16.0\\Word\\Security\\ProtectedView" /v DisableUnsafeLocationsInPV /t REG_DWORD /d 0 /f')
    system('reg add "HKEY_CURRENT_USER\\Software\\Microsoft\\Office\\16.0\\Word\\Security\\Trusted Documents" /v DisableTrustedDocuments /t REG_DWORD /d 1 /f')
    system('reg add "HKEY_CURRENT_USER\\Software\\Microsoft\\Office\\16.0\\Word\\Security\\Trusted Documents" /v DisableNetworkTrustedDocuments /t REG_DWORD /d 1 /f')
    system('reg add "HKEY_CURRENT_USER\\Software\\Microsoft\\Office\\16.0\\Word\\Security\\Trusted Locations" /v AllLocationsDisabled /t REG_DWORD /d 1 /f')
    system('reg add "HKEY_CURRENT_USER\\Software\\Microsoft\\Office\\16.0\\Excel\\Security" /v RequireAddinSig /t REG_DWORD /d 1 /f')
    system('reg add "HKEY_CURRENT_USER\\Software\\Microsoft\\Office\\16.0\\Excel\\Security" /v NoTBPromptUnsignedAddin /t REG_DWORD /d 1 /f')
    system('reg add "HKEY_CURRENT_USER\\Software\\Microsoft\\Office\\16.0\\Excel\\Security\\ProtectedView" /v DisableAttachmentsInPV /t REG_DWORD /d 0 /f')
    system('reg add "HKEY_CURRENT_USER\\Software\\Microsoft\\Office\\16.0\\Excel\\Security\\ProtectedView" /v DisableInternetFilesInPV /t REG_DWORD /d 0 /f')
    system('reg add "HKEY_CURRENT_USER\\Software\\Microsoft\\Office\\16.0\\Excel\\Security\\ProtectedView" /v DisableUnsafeLocationsInPV /t REG_DWORD /d 0 /f')
    system('reg add "HKEY_CURRENT_USER\\Software\\Microsoft\\Office\\16.0\\Excel\\Security\\Trusted Documents" /v DisableTrustedDocuments /t REG_DWORD /d 1 /f')
    system('reg add "HKEY_CURRENT_USER\\Software\\Microsoft\\Office\\16.0\\Excel\\Security\\Trusted Locations" /v AllLocationsDisabled /t REG_DWORD /d 1 /f')
    system('reg add "HKEY_CURRENT_USER\\Software\\Microsoft\\Office\\16.0\\PowerPoint\\Security" /v RequireAddinSig /t REG_DWORD /d 1 /f')
    system('reg add "HKEY_CURRENT_USER\\Software\\Microsoft\\Office\\16.0\\PowerPoint\\Security" /v NoTBPromptUnsignedAddin /t REG_DWORD /d 1 /f')
    system('reg add "HKEY_CURRENT_USER\\Software\\Microsoft\\Office\\16.0\\PowerPoint\\Security\\Trusted Documents" /v DisableTrustedDocuments /t REG_DWORD /d 1 /f')
    system('reg add "HKEY_CURRENT_USER\\Software\\Microsoft\\Office\\16.0\\PowerPoint\\Security\\Trusted Locations" /v AllLocationsDisabled /t REG_DWORD /d 1 /f')
    system('reg add "HKEY_CURRENT_USER\\Software\\Microsoft\\Office\\Common\\Security" /v UFIControls /t REG_DWORD /d 4 /f')
def ARPClean():
    system('arp -d *')
    IsOk()
def WinsockReset():
    system('netsh winsock reset')
    IsOk()
def DllReload():
    # Re-register every DLL under system32; %i is the command-line FOR variable syntax
    system('for %i in (%windir%\\system32\\*.dll) do regsvr32.exe /s %i')
    IsOk()
def NoWarn():
    system('reg add "HKEY_CURRENT_USER\\Software\\Microsoft\\Windows\\CurrentVersion\\Internet Settings" /v WarnOnHTTPSToHTTPRedirect /t REG_DWORD /d 0 /f')
    IsOk()
def FixMsiServer():
    system('reg add "HKEY_LOCAL_MACHINE\\SYSTEM\\CurrentControlSet\\Services\\msiserver" /v Start /t REG_DWORD /d 3 /f')
    IsOk()
def FixGpedit():
    system('net start gpsvc & start gpedit.msc')
    IsOk()
def DnsFlush():
    system('ipconfig /flushdns & ipconfig /registerdns & ipconfig /release WLAN & ipconfig /renew WLAN')
    IsOk()
def EnableCMD():
    system('reg add "HKEY_CURRENT_USER\\Software\\Policies\\Microsoft\\Windows\\System" /v DisableCMD /t REG_DWORD /d 0 /f')
    IsOk()
def EnableREG():
    system('reg add "HKEY_CURRENT_USER\\Software\\Microsoft\\Windows\\CurrentVersion\\Policies\\System" /v DisableRegistryTools /t REG_DWORD /d 0 /f')
    system('reg add "HKEY_LOCAL_MACHINE\\Software\\Classes\\.reg" /ve /t REG_SZ /d regfile /f')
    system('reg add "HKEY_LOCAL_MACHINE\\Software\\Classes\\.inf" /ve /t REG_SZ /d inffile /f')
    IsOk()
def EnableFireWall():
    system('netsh advfirewall set allprofiles state on')
    IsOk()
def RouteClean():
    system('route -f')
    IsOk()
def Win10APPReset():
    # The Base64 payload decodes (UTF-16LE) to:
    # Get-AppXPackage -AllUsers | Foreach {Add-AppxPackage -DisableDevelopmentMode -Register "$($_.InstallLocation)\AppXManifest.xml"}
    system('powershell -e RwBlAHQALQBBAHAAcABYAFAAYQBjAGsAYQBnAGUAIAAtAEEAbABsAFUAcwBlAHIAcwAgAHwAIABGAG8AcgBlAGEAYwBoACAAewBBAGQAZAAtAEEAcABwAHgAUABhAGMAawBhAGcAZQAgAC0ARABpAHMAYQBiAGwAZQBEAGUAdgBlAGwAbwBwAG0AZQBuAHQATQBvAGQAZQAgAC0AUgBlAGcAaQBzAHQAZQByACAAIgAkACgAJABfAC4ASQBuAHMAdABhAGwAbABMAG8AYwBhAHQAaQBvAG4AKQBcAEEAcABwAFgATQBhAG4AaQBmAGUAcwB0AC4AeABtAGwAIgB9ACAA')
    IsOk()
def DesktopCacheClean():
    system('ie4uinit -show')
    IsOk()
def FixEDB():
    makedirs("C:\\WINDOWS\\system32\\config\\systemprofile\\AppData\\Local\\TileDataLayer\\Database")
    IsOk()
def FixSRU():
    system('del /F /Q "C:\\Windows\\system32\\sru"')
    system('icacls "C:\\Windows\\system32\\sru" /grant:r Everyone:(OI)(CI)(R,W,D,WDAC,DC)')
    IsOk()
def NoFullOpt():
    system('reg add "HKEY_CURRENT_USER\\System\\GameConfigStore" /v GameDVR_FSEBehaviorMode /t REG_DWORD /d 2 /f')
    system('reg add "HKEY_CURRENT_USER\\System\\GameConfigStore" /v GameDVR_HonorUserFSEBehaviorMode /t REG_DWORD /d 1 /f')
    system('reg add "HKEY_CURRENT_USER\\System\\GameConfigStore" /v GameDVR_DXGIHonorFSEWindowsCompatible /t REG_DWORD /d 1 /f')
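# These reg/Schtasks calls silently fail without elevation; a minimal
# pre-flight check could look like the sketch below (Windows only, standard
# library; IsUserAnAdmin is a real shell32 export, the error text is ours):
#   import ctypes
#   if not ctypes.windll.shell32.IsUserAnAdmin():
#       messagebox.showerror(message='Please run this tool as administrator')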
checkButtonFunction = [
    NoShare, NoTelnet, NoGuest, ProtectUSB, NoAutoRun, NoRemote, NoBing,
    AutoEndTask, ErrorOptimizationClean, OptimizeSuperfetch, OptimizeSearch,
    NoVFX, NoDormancy, OptimizeService, OfficeSecurity, NoFullOpt
]
treasureBoxFunction = [
    ARPClean, WinsockReset, DllReload, NoWarn, FixMsiServer, FixGpedit,
    DnsFlush, EnableCMD, EnableREG, EnableFireWall, RouteClean,
    Win10APPReset, DesktopCacheClean, FixEDB, FixSRU
]
# ============================================================
# File: EvalData/migrations/0024_auto_20171006_1311.py
# Repo: amalinovskiy/Appraise (BSD-3-Clause)
# ============================================================
# -*- coding: utf-8 -*-
# Generated by Django 1.11 on 2017-10-06 20:11
from __future__ import unicode_literals
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('EvalData', '0023_auto_20171006_1307'),
]
operations = [
migrations.AlterField(
model_name='directassessmentresult',
name='activatedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_directassessmentresult_activated_by', to=settings.AUTH_USER_MODEL, verbose_name='Activated by'),
),
migrations.AlterField(
model_name='directassessmentresult',
name='completedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_directassessmentresult_completed_by', to=settings.AUTH_USER_MODEL, verbose_name='Completed by'),
),
migrations.AlterField(
model_name='directassessmentresult',
name='createdBy',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_directassessmentresult_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created by'),
),
migrations.AlterField(
model_name='directassessmentresult',
name='item',
field=models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_directassessmentresult_items', to='EvalData.TextPair', verbose_name='Item'),
),
migrations.AlterField(
model_name='directassessmentresult',
name='modifiedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_directassessmentresult_modified_by', to=settings.AUTH_USER_MODEL, verbose_name='Modified by'),
),
migrations.AlterField(
model_name='directassessmentresult',
name='retiredBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_directassessmentresult_retired_by', to=settings.AUTH_USER_MODEL, verbose_name='Retired by'),
),
migrations.AlterField(
model_name='directassessmentresult',
name='task',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_directassessmentresult_tasks', to='EvalData.DirectAssessmentTask', verbose_name='Task'),
),
migrations.AlterField(
model_name='directassessmenttask',
name='activatedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_directassessmenttask_activated_by', to=settings.AUTH_USER_MODEL, verbose_name='Activated by'),
),
migrations.AlterField(
model_name='directassessmenttask',
name='completedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_directassessmenttask_completed_by', to=settings.AUTH_USER_MODEL, verbose_name='Completed by'),
),
migrations.AlterField(
model_name='directassessmenttask',
name='createdBy',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_directassessmenttask_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created by'),
),
migrations.AlterField(
model_name='directassessmenttask',
name='modifiedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_directassessmenttask_modified_by', to=settings.AUTH_USER_MODEL, verbose_name='Modified by'),
),
migrations.AlterField(
model_name='directassessmenttask',
name='retiredBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_directassessmenttask_retired_by', to=settings.AUTH_USER_MODEL, verbose_name='Retired by'),
),
migrations.AlterField(
model_name='market',
name='activatedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_market_activated_by', to=settings.AUTH_USER_MODEL, verbose_name='Activated by'),
),
migrations.AlterField(
model_name='market',
name='completedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_market_completed_by', to=settings.AUTH_USER_MODEL, verbose_name='Completed by'),
),
migrations.AlterField(
model_name='market',
name='createdBy',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_market_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created by'),
),
migrations.AlterField(
model_name='market',
name='modifiedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_market_modified_by', to=settings.AUTH_USER_MODEL, verbose_name='Modified by'),
),
migrations.AlterField(
model_name='market',
name='retiredBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_market_retired_by', to=settings.AUTH_USER_MODEL, verbose_name='Retired by'),
),
migrations.AlterField(
model_name='metadata',
name='activatedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_metadata_activated_by', to=settings.AUTH_USER_MODEL, verbose_name='Activated by'),
),
migrations.AlterField(
model_name='metadata',
name='completedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_metadata_completed_by', to=settings.AUTH_USER_MODEL, verbose_name='Completed by'),
),
migrations.AlterField(
model_name='metadata',
name='createdBy',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_metadata_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created by'),
),
migrations.AlterField(
model_name='metadata',
name='modifiedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_metadata_modified_by', to=settings.AUTH_USER_MODEL, verbose_name='Modified by'),
),
migrations.AlterField(
model_name='metadata',
name='retiredBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_metadata_retired_by', to=settings.AUTH_USER_MODEL, verbose_name='Retired by'),
),
migrations.AlterField(
model_name='multimodalassessmentresult',
name='activatedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_multimodalassessmentresult_activated_by', to=settings.AUTH_USER_MODEL, verbose_name='Activated by'),
),
migrations.AlterField(
model_name='multimodalassessmentresult',
name='completedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_multimodalassessmentresult_completed_by', to=settings.AUTH_USER_MODEL, verbose_name='Completed by'),
),
migrations.AlterField(
model_name='multimodalassessmentresult',
name='createdBy',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_multimodalassessmentresult_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created by'),
),
migrations.AlterField(
model_name='multimodalassessmentresult',
name='modifiedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_multimodalassessmentresult_modified_by', to=settings.AUTH_USER_MODEL, verbose_name='Modified by'),
),
migrations.AlterField(
model_name='multimodalassessmentresult',
name='retiredBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_multimodalassessmentresult_retired_by', to=settings.AUTH_USER_MODEL, verbose_name='Retired by'),
),
migrations.AlterField(
model_name='multimodalassessmenttask',
name='activatedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_multimodalassessmenttask_activated_by', to=settings.AUTH_USER_MODEL, verbose_name='Activated by'),
),
migrations.AlterField(
model_name='multimodalassessmenttask',
name='completedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_multimodalassessmenttask_completed_by', to=settings.AUTH_USER_MODEL, verbose_name='Completed by'),
),
migrations.AlterField(
model_name='multimodalassessmenttask',
name='createdBy',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_multimodalassessmenttask_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created by'),
),
migrations.AlterField(
model_name='multimodalassessmenttask',
name='modifiedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_multimodalassessmenttask_modified_by', to=settings.AUTH_USER_MODEL, verbose_name='Modified by'),
),
migrations.AlterField(
model_name='multimodalassessmenttask',
name='retiredBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_multimodalassessmenttask_retired_by', to=settings.AUTH_USER_MODEL, verbose_name='Retired by'),
),
migrations.AlterField(
model_name='textpair',
name='activatedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_textpair_activated_by', to=settings.AUTH_USER_MODEL, verbose_name='Activated by'),
),
migrations.AlterField(
model_name='textpair',
name='completedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_textpair_completed_by', to=settings.AUTH_USER_MODEL, verbose_name='Completed by'),
),
migrations.AlterField(
model_name='textpair',
name='createdBy',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_textpair_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created by'),
),
migrations.AlterField(
model_name='textpair',
name='modifiedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_textpair_modified_by', to=settings.AUTH_USER_MODEL, verbose_name='Modified by'),
),
migrations.AlterField(
model_name='textpair',
name='retiredBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_textpair_retired_by', to=settings.AUTH_USER_MODEL, verbose_name='Retired by'),
),
migrations.AlterField(
model_name='textpairwithimage',
name='activatedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_textpairwithimage_activated_by', to=settings.AUTH_USER_MODEL, verbose_name='Activated by'),
),
migrations.AlterField(
model_name='textpairwithimage',
name='completedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_textpairwithimage_completed_by', to=settings.AUTH_USER_MODEL, verbose_name='Completed by'),
),
migrations.AlterField(
model_name='textpairwithimage',
name='createdBy',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_textpairwithimage_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created by'),
),
migrations.AlterField(
model_name='textpairwithimage',
name='modifiedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_textpairwithimage_modified_by', to=settings.AUTH_USER_MODEL, verbose_name='Modified by'),
),
migrations.AlterField(
model_name='textpairwithimage',
name='retiredBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_textpairwithimage_retired_by', to=settings.AUTH_USER_MODEL, verbose_name='Retired by'),
),
migrations.AlterField(
model_name='textsegment',
name='activatedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_textsegment_activated_by', to=settings.AUTH_USER_MODEL, verbose_name='Activated by'),
),
migrations.AlterField(
model_name='textsegment',
name='completedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_textsegment_completed_by', to=settings.AUTH_USER_MODEL, verbose_name='Completed by'),
),
migrations.AlterField(
model_name='textsegment',
name='createdBy',
field=models.ForeignKey(editable=False, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_textsegment_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created by'),
),
migrations.AlterField(
model_name='textsegment',
name='modifiedBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_textsegment_modified_by', to=settings.AUTH_USER_MODEL, verbose_name='Modified by'),
),
migrations.AlterField(
model_name='textsegment',
name='retiredBy',
field=models.ForeignKey(blank=True, editable=False, null=True, on_delete=django.db.models.deletion.PROTECT, related_name='evaldata_textsegment_retired_by', to=settings.AUTH_USER_MODEL, verbose_name='Retired by'),
),
]
# ============================================================
# File: sys/controller/AuthorHelper.py
# Repo: frankdede/CMPUT410W14T07 (Apache-2.0)
# ============================================================
import mysql.connector
from mysql.connector.errors import Error
from DatabaseAdapter import *
import sys
sys.path.append("sys/model")
from author import *
import Utility
import json
class AuthorHelper:
    """Helper for authenticating authors and loading Author records."""
    def __init__(self, dbAdapter):
        self.localSid = 'cs410.cs.ualberta.ca:41070'
        self.dbAdapter = dbAdapter
    def authorAuthenticate(self, authorName, password):
        """Return a JSON string {"aid": ...} if the username and password are
        correct, "NO_CONFIRMED" for an unconfirmed account, and False otherwise."""
        cur = self.dbAdapter.getcursor()
        # Refactored: Author_name is changed to name
        query = "SELECT aid,valid FROM author WHERE name='%s' AND pwd='%s' AND sid='%s'" % (authorName, password, self.localSid)
        print(query)
        try:
            cur.execute(query)
            row = cur.fetchone()
            if row is None:
                cur.close()
                return False
            else:
                valid_bit = str(row[1])
                if valid_bit == '0':
                    return "NO_CONFIRMED"
                re_aid = row[0]
                cur.close()
                return json.dumps({"aid": re_aid})
        except mysql.connector.Error as err:
            print("****************************************")
            print("SQLException from authorAuthenticate():")
            print("Error code:", err.errno)
            print("SQLSTATE value:", err.sqlstate)
            print("Error message:", err.msg)
            print("Might be query issue:", query)
            print("****************************************")
            return False
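        # Note: interpolating user input into SQL (as above) is open to SQL
        # injection; a parameterized form of the same query (sketch, using the
        # same mysql.connector cursor API) would be:
        #   query = "SELECT aid,valid FROM author WHERE name=%s AND pwd=%s AND sid=%s"
        #   cur.execute(query, (authorName, password, self.localSid))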
    def getAuthorObjectByAid(self, aid):
        # DO NOT DELETE THE COMMENT
        # TODO:
        # [Success] return an Author object
        # [Exception] return None
        # [Failed] return None
        cur = self.dbAdapter.getcursor()
        query = "SELECT aid,name,nick_name,sid,email,gender,city,birthday,img_path FROM author WHERE aid='%s' AND valid=1" % (aid)
        print(query)
        try:
            cur.execute(query)
        except mysql.connector.Error as err:
            print("****************************************")
            print("SQLException from getAuthorObjectByAid():")
            print("Error code:", err.errno)
            print("SQLSTATE value:", err.sqlstate)
            print("Error message:", err.msg)
            print("Query:", query)
            print("****************************************")
            return None
        row = cur.fetchone()
        if row is None:
            return None
        else:
            friend = Author(row[0], row[1], row[2], row[3], row[4], row[5], row[6], row[7], row[8])
            print(friend)
            return friend
def getAllAuthorObjectsForLocalServer(self):
"""
DO NOT DELETE THE COMMENT
TODO:
[SUCCESS] return an array of author objects for local server
e.g. {{'aid':xxxxx,'name':xxxxxxx ...},{'aid':xxxxx,'name':xxxxxxx..}}
[Exception] return null
[Failed] return null
"""
result = []
cur = self.dbAdapter.getcursor()
query = ("SELECT aid,name,nick_name,sid,email,gender,city,birthday,img_path from author WHERE sid = '%s' AND valid=1")%(self.localSid)
try:
cur.execute(query)
except mysql.connector.Error as err:
print("****************************************")
            print("SQLException from getAllAuthorObjectsForLocalServer():")
print("Error code:", err.errno)
print("SQLSTATE value:", err.sqlstate)
print("Error message:", err.msg)
print("Query:",query)
print("****************************************")
return None
for row in cur:
friend = Author(row[0],row[1],row[2],row[3],row[4],row[5],row[6],row[7],row[8])
result.append(friend)
return result
def getAllAuthorObjectsForRemoteServer(self):
# DO NOT DELETE THE COMMENT
# TODO:
        # [SUCCESS] return a list of Author objects for remote servers
        # e.g. [{'aid': xxxxx, 'name': xxxxxxx, ...}, {'aid': xxxxx, 'name': xxxxxxx, ...}]
        # [Exception] return None
        # [Failed] return None
result = []
cur = self.dbAdapter.getcursor()
query = ("SELECT aid,name,nick_name,sid,email,gender,city,birthday,img_path from author WHERE sid != '%s'")%(self.localSid)
try:
cur.execute(query)
except mysql.connector.Error as err:
print("****************************************")
            print("SQLException from getAllAuthorObjectsForRemoteServer():")
print("Error code:", err.errno)
print("SQLSTATE value:", err.sqlstate)
print("Error message:", err.msg)
print("Query:",query)
print("****************************************")
return None
for row in cur:
author = Author(row[0],row[1],row[2],row[3],row[4],row[5],row[6],row[7],row[8])
result.append(author)
return result
def getAllTmpAuthorObjects(self):
# DO NOT DELETE THE COMMENT
# TODO:
        # [SUCCESS] return a list of unconfirmed (tmp) Author objects
        # e.g. [{'aid': xxxxx, 'name': xxxxxxx, ...}, {'aid': xxxxx, 'name': xxxxxxx, ...}]
        # [Exception] return None
        # [Failed] return None
result = []
cur = self.dbAdapter.getcursor()
query = ("SELECT aid,name,nick_name,sid,email,gender,city,birthday,img_path from author WHERE valid = 0")
try:
cur.execute(query)
except mysql.connector.Error as err:
print("****************************************")
            print("SQLException from getAllTmpAuthorObjects():")
print("Error code:", err.errno)
print("SQLSTATE value:", err.sqlstate)
print("Error message:", err.msg)
print("Query:",query)
print("****************************************")
return None
for row in cur:
author = Author(row[0],row[1],row[2],row[3],row[4],row[5],row[6],row[7],row[8])
result.append(author)
return result
def addLocalAuthor(self,authorName,nickName,password):
# DO NOT DELETE THE COMMENT
# TODO:
        # [Success] return {'aid': xxxxx} (JSON string)
        # [Exception Caught] return False
        # [Failed] return False
cur = self.dbAdapter.getcursor()
aid = Utility.getid()
query = ("INSERT INTO author values('%s','%s','%s','%s','%s','','','','','',1)"%(aid,authorName,nickName,password,self.localSid))
try:
cur.execute(query)
except mysql.connector.Error as err:
print("****************************************")
            print("SQLException from addLocalAuthor():")
print("Error code:", err.errno)
print("SQLSTATE value:", err.sqlstate)
print("Error message:", err.msg)
print("Query:",query)
            print("****************************************")
            return False
return json.dumps({'aid':aid})
def addLocalTmpAuthor(self,authorName,password,nickName):
"""
# TODO:
    # [Success] return {'aid': xxxxx} (JSON string)
    # [Exception Caught] return False
    # [Failed] return False
"""
cur = self.dbAdapter.getcursor()
aid = Utility.getid()
query = ("INSERT INTO author values('%s','%s','%s','%s','%s','','','','','',0)"%(aid,authorName,nickName,password,self.localSid))
try:
cur.execute(query)
except mysql.connector.Error as err:
print("****************************************")
            print("SQLException from addLocalTmpAuthor():")
print("Error code:", err.errno)
print("SQLSTATE value:", err.sqlstate)
print("Error message:", err.msg)
print("Query:",query)
            print("****************************************")
return False
return json.dumps({'aid':aid})
def confirmAuthor(self,aid):
"""
        Confirm the author by setting its valid bit to 1.
        Returns a boolean.
"""
cur = self.dbAdapter.getcursor()
query = "UPDATE author SET valid=1 WHERE aid = '%s'"%(aid)
try:
cur.execute(query)
except mysql.connector.Error as err:
print("****************************************")
            print("SQLException from confirmAuthor():")
print("Error code:", err.errno)
print("SQLSTATE value:", err.sqlstate)
print("Error message:", err.msg)
print("Might be query issue:",query)
print("****************************************")
return False
return cur.rowcount>0
def updateAuthorInfo(self,aid,email,gender,city,birthday,img_path):
# DO NOT DELETE THE COMMENT
# TODO:
# [Success] return true
# [Exception] return false
# [Failed] return false
cur = self.dbAdapter.getcursor()
query = ""
if img_path=="" and gender=="":
query = "UPDATE author SET email = '%s',city = '%s',birthday = '%s' WHERE aid = '%s'"%(email,city,birthday,aid)
elif img_path=="":
query = "UPDATE author SET email = '%s', gender = '%s',city = '%s',birthday = '%s' WHERE aid = '%s'"%(email,gender,city,birthday,aid)
elif gender=="":
query = "UPDATE author SET email = '%s',city = '%s',birthday = '%s', img_path = '%s' WHERE aid = '%s'"%(email,city,birthday,img_path,aid)
else:
query = "UPDATE author SET email = '%s', gender = '%s',city = '%s',birthday = '%s', img_path = '%s' WHERE aid = '%s'"%(email,gender,city,birthday,img_path,aid)
try:
cur.execute(query)
return True
except mysql.connector.Error as err:
print("****************************************")
            print("SQLException from updateAuthorInfo():")
print("Error code:", err.errno)
print("SQLSTATE value:", err.sqlstate)
print("Error message:", err.msg)
print("Might be query issue:",query)
print("****************************************")
return False
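updateAuthorInfo enumerates every combination of empty fields by hand; the same result can come from building the SET clause from only the non-empty fields. A hedged sketch (the helper name and its dict argument are inventions for illustration; column names are taken from the queries above), producing a query/params pair suitable for `cur.execute(query, params)`:

```python
def build_update_query(aid, fields):
    """Build a parameterized UPDATE from the non-empty entries of `fields`.

    `fields` maps column name -> value; empty strings are skipped, matching
    the branching above. Returns (query, params).
    """
    updates = {col: val for col, val in fields.items() if val != ""}
    set_clause = ", ".join("%s = %%s" % col for col in updates)
    query = "UPDATE author SET %s WHERE aid = %%s" % set_clause
    return query, tuple(updates.values()) + (aid,)

q, params = build_update_query("42", {"email": "a@b.c", "gender": "", "city": "X"})
# q      -> UPDATE author SET email = %s, city = %s WHERE aid = %s
# params -> ('a@b.c', 'X', '42')
```

Only values go through placeholders; column names come from a fixed whitelist in the caller's dict, so no string-formatted user input reaches the SQL.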
def updateNickNameByAid(self,aid,newNickName):
cur = self.dbAdapter.getcursor()
query = "UPDATE author SET nick_name = '%s' WHERE aid = '%s'"%(newNickName,aid)
try:
cur.execute(query)
except mysql.connector.Error as err:
print("****************************************")
print("SQLException from updateNickNameByAid():")
print("Error code:", err.errno)
print("SQLSTATE value:", err.sqlstate)
print("Error message:", err.msg)
print("Might be query issue:",query)
print("****************************************")
return False
return cur.rowcount>0
def updatePasswordByAid(self,aid,newPassword):
cur = self.dbAdapter.getcursor()
query = "UPDATE author SET pwd = '%s' WHERE aid='%s'"%(newPassword,aid)
try:
cur.execute(query)
except mysql.connector.Error as err:
print("****************************************")
            print("SQLException from updatePasswordByAid():")
print("Error code:", err.errno)
print("SQLSTATE value:", err.sqlstate)
print("Error message:", err.msg)
print("Might be query issue:",query)
print("****************************************")
return False
return cur.rowcount>0
def setTmpAuthorToOfficialAuthor(self,aid):
"""
To set author to official author by its aid
"""
cur = self.dbAdapter.getcursor()
query = "UPDATE author SET valid=1 WHERE aid='%s'"%(aid)
try:
cur.execute(query)
except mysql.connector.Error as err:
print("****************************************")
            print("SQLException from setTmpAuthorToOfficialAuthor():")
print("Error code:", err.errno)
print("SQLSTATE value:", err.sqlstate)
print("Error message:", err.msg)
print("Might be query issue:",query)
print("****************************************")
return False
return cur.rowcount>0
def setAllTmpAuthorToOfficialAuthor(self,aid):
"""
To set author to official author by its aid
"""
cur = self.dbAdapter.getcursor()
query = "UPDATE author SET valid=1 WHERE aid='%s'"%(aid)
try:
cur.execute(query)
except mysql.connector.Error as err:
print("****************************************")
            print("SQLException from setAllTmpAuthorToOfficialAuthor():")
print("Error code:", err.errno)
print("SQLSTATE value:", err.sqlstate)
print("Error message:", err.msg)
print("Might be query issue:",query)
print("****************************************")
return False
return cur.rowcount>0
    # To add an author to the database; the server id defaults to 'cs410.cs.ualberta.ca:41070' if sid is not given (see addAuthor below).
def deleteAuthor(self,aid):
cur = self.dbAdapter.getcursor()
query = ("DELETE FROM author "
"WHERE aid = '%s'") %(aid)
try:
cur.execute(query)
except mysql.connector.Error as err:
print("****************************************")
print("SQLException from deleteAuthor():")
print("Error code:", err.errno)
print("SQLSTATE value:", err.sqlstate)
print("Error message:", err.msg)
print("Might be query issue:",query)
print("****************************************")
return False
return cur.rowcount > 0
def addAuthor(self,name,password,nickName,sid='cs410.cs.ualberta.ca:41070'):
cur = self.dbAdapter.getcursor()
aid =Utility.getid()
query = ("INSERT INTO author(aid,name,nick_name,pwd,sid) "
"VALUES('%s','%s','%s','%s','%s')")%(aid,name,nickName,password,sid)
try:
cur.execute(query)
except mysql.connector.Error as err:
print("****************************************")
print("SQLException from addAuthor():")
print("Error code:", err.errno)
print("SQLSTATE value:", err.sqlstate)
print("Error message:", err.msg)
print("Might be query issue:",query)
print("****************************************")
return False
return json.dumps({"aid":aid})
def addRemoteAuthor(self,aid,displayName,sid):
cur = self.dbAdapter.getcursor()
query = ("INSERT INTO author(aid,name,nick_name,pwd,sid,valid) "
"VALUES('%s','%s','%s','%s','%s',1)")%(aid,aid,aid,"",sid)
try:
cur.execute(query)
except mysql.connector.Error as err:
print("****************************************")
print("SQLException from addRemoteAuthor():")
print("Error code:", err.errno)
print("SQLSTATE value:", err.sqlstate)
print("Error message:", err.msg)
print("Might be query issue:",query)
print("****************************************")
return False
return True
def getRecommendedAuthorList(self,aid):
cur = self.dbAdapter.getcursor()
query = ("SELECT c2.aid2,a.name ,count(*)as num FROM circle c1,circle c2,author a WHERE c1.aid2 = c2.aid1 and c1.aid1 !=c2.aid2 and c1.aid1 ='%s' and a.aid=c2.aid2 group by c1.aid1,c2.aid2 order by num desc;")%(aid)
try:
cur.execute(query)
except mysql.connector.Error as err:
print("****************************************")
            print("SQLException from getRecommendedAuthorList():")
print("Error code:", err.errno)
print("SQLSTATE value:", err.sqlstate)
print("Error message:", err.msg)
print("Might be query issue:",query)
print("****************************************")
return False
re = []
for row in cur:
re.append({"aid":row[0],"name":row[1]})
return re
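The self-join above ranks friends-of-friends by how many connecting links reach them. The same counting logic, done in Python over (follower, followed) pairs with made-up data — a sketch assuming each `circle` row means aid1 follows aid2, excluding the requesting author just as the SQL does:

```python
from collections import Counter

def recommend(aid, circle):
    """Rank friends-of-friends of `aid` by number of connecting paths."""
    friends = [b for a, b in circle if a == aid]
    counts = Counter(c for f in friends
                     for b, c in circle if b == f and c != aid)
    return [who for who, _ in counts.most_common()]

circle = [("u1", "u2"), ("u1", "u3"), ("u2", "u4"), ("u3", "u4"), ("u3", "u5")]
print(recommend("u1", circle))  # ['u4', 'u5'] -- u4 is reachable via two friends
```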
def searchAuthor(self,single_key):
result=[]
cur = self.dbAdapter.getcursor()
        query = "SELECT aid,name,nick_name,sid,email,gender,city,birthday,img_path FROM author WHERE valid=1 AND (name LIKE '%"+single_key+"%' OR nick_name LIKE '%"+single_key+"%');"
try:
cur.execute(query)
except mysql.connector.Error as err:
print("****************************************")
print("SQLException from searchAuthor():")
print("Error code:", err.errno)
print("SQLSTATE value:", err.sqlstate)
print("Error message:", err.msg)
print("Might be query issue:",query)
print("****************************************")
return False
for row in cur:
author = Author(row[0],row[1],row[2],row[3],row[4],row[5],row[6],row[7],row[8])
result.append(author)
return result
def isRemoteAuthor(self,aid):
cur = self.dbAdapter.getcursor()
query = ("SELECT * FROM author WHERE aid = '%s' AND sid <> '%s';")%(aid,self.localSid)
try:
cur.execute(query)
except mysql.connector.Error as err:
print("****************************************")
            print("SQLException from isRemoteAuthor():")
print("Error code:", err.errno)
print("SQLSTATE value:", err.sqlstate)
print("Error message:", err.msg)
print("Might be query issue:",query)
print("****************************************")
return None
return len(cur.fetchall()) > 0
def getGlobalAuthors(self):
cur = self.dbAdapter.getcursor()
query =("SELECT A.aid,A.nick_name FROM author A,servers S WHERE A.valid = 1 AND S.sid = A.sid AND S.local = 1;")
try:
cur.execute(query)
except mysql.connector.Error as err:
print("****************************************")
print("SQLException from getGlobalAuthors():")
print("Error code:", err.errno)
print("SQLSTATE value:", err.sqlstate)
print("Error message:", err.msg)
print("Might be query issue:",query)
print("****************************************")
return None
return cur.fetchall()
def doesAuthorExists(self,aid):
cur = self.dbAdapter.getcursor()
query = ("SELECT * FROM author A WHERE A.aid = '%s' ")%(aid)
try:
cur.execute(query)
except mysql.connector.Error as err:
print("****************************************")
print("SQLException from doesAuthorExists():")
print("Error code:", err.errno)
print("SQLSTATE value:", err.sqlstate)
print("Error message:", err.msg)
print("Might be query issue:",query)
print("****************************************")
return None
return len(cur.fetchall()) > 0
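Every method above repeats the same except block with a hand-edited — and frequently wrong — function name. A sketch of a shared helper that derives the caller's name automatically; `_log_sql_error` and the fake error class are names invented here, not part of the original module:

```python
import inspect

def _log_sql_error(err, query):
    """Print a mysql.connector-style error block, naming the calling function."""
    caller = inspect.stack()[1].function
    print("*" * 40)
    print("SQLException from %s():" % caller)
    print("Error code:", getattr(err, "errno", None))
    print("SQLSTATE value:", getattr(err, "sqlstate", None))
    print("Error message:", getattr(err, "msg", None))
    print("Might be query issue:", query)
    print("*" * 40)
    return caller  # returned so callers/tests can check the detected name

class _FakeErr:  # stand-in for mysql.connector.Error, for demonstration only
    errno, sqlstate, msg = 1064, "42000", "syntax error"

def deleteAuthor_demo():
    return _log_sql_error(_FakeErr(), "DELETE FROM author WHERE aid='x'")

caller_name = deleteAuthor_demo()  # logs "SQLException from deleteAuthor_demo():"
```

Each except block would then shrink to `_log_sql_error(err, query)` followed by its return value, and the copy-paste naming mistakes disappear.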
from dbml.core import Environment
class RunConfiguration:
@property
def environment(self) -> Environment:
return self._environment
@environment.setter
def environment(self, environment: Environment):
        self._environment = environment
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
from time import sleep
from . import CommandBase
from .Keys import Button, Hat, KeyPress
# Single button command
class UnitCommand(CommandBase.Command):
def __init__(self):
super().__init__()
def start(self, ser, postProcess=None):
self.isRunning = True
self.key = KeyPress(ser)
def end(self, ser):
pass
# do nothing at wait time(s)
def wait(self, wait):
sleep(wait)
def press(self, btn):
self.key.input([btn])
self.wait(0.1)
self.key.inputEnd([btn])
self.isRunning = False
self.key = None
class A(UnitCommand):
def __init__(self):
super().__init__()
def start(self, ser):
super().start(ser)
self.press(Button.A)
class B(UnitCommand):
def __init__(self):
super().__init__()
def start(self, ser):
super().start(ser)
self.press(Button.B)
class X(UnitCommand):
def __init__(self):
super().__init__()
def start(self, ser):
super().start(ser)
self.press(Button.X)
class Y(UnitCommand):
def __init__(self):
super().__init__()
def start(self, ser):
super().start(ser)
self.press(Button.Y)
class L(UnitCommand):
def __init__(self):
super().__init__()
def start(self, ser):
super().start(ser)
self.press(Button.L)
class R(UnitCommand):
def __init__(self):
super().__init__()
def start(self, ser):
super().start(ser)
self.press(Button.R)
class ZL(UnitCommand):
def __init__(self):
super().__init__()
def start(self, ser):
super().start(ser)
self.press(Button.ZL)
class ZR(UnitCommand):
def __init__(self):
super().__init__()
def start(self, ser):
super().start(ser)
self.press(Button.ZR)
class MINUS(UnitCommand):
def __init__(self):
super().__init__()
def start(self, ser):
super().start(ser)
self.press(Button.MINUS)
class PLUS(UnitCommand):
def __init__(self):
super().__init__()
def start(self, ser):
super().start(ser)
self.press(Button.PLUS)
class LCLICK(UnitCommand):
def __init__(self):
super().__init__()
def start(self, ser):
super().start(ser)
self.press(Button.LCLICK)
class RCLICK(UnitCommand):
def __init__(self):
super().__init__()
def start(self, ser):
super().start(ser)
self.press(Button.RCLICK)
class HOME(UnitCommand):
def __init__(self):
super().__init__()
def start(self, ser):
super().start(ser)
self.press(Button.HOME)
class CAPTURE(UnitCommand):
def __init__(self):
super().__init__()
def start(self, ser):
super().start(ser)
self.press(Button.CAPTURE)
class UP(UnitCommand):
def __init__(self):
super().__init__()
def start(self, ser):
super().start(ser)
self.key.input(Hat.TOP)
self.wait(0.1)
self.key.input(Hat.CENTER)
self.key = None
class UP_RIGHT(UnitCommand):
def __init__(self):
super().__init__()
def start(self, ser):
super().start(ser)
self.key.input(Hat.TOP_RIGHT)
self.wait(0.1)
self.key.input(Hat.CENTER)
self.key = None
class RIGHT(UnitCommand):
def __init__(self):
super().__init__()
def start(self, ser):
super().start(ser)
self.key.input(Hat.RIGHT)
self.wait(0.1)
self.key.input(Hat.CENTER)
self.key = None
class DOWN_RIGHT(UnitCommand):
def __init__(self):
super().__init__()
def start(self, ser):
super().start(ser)
self.key.input(Hat.BTM_RIGHT)
self.wait(0.1)
self.key.input(Hat.CENTER)
self.key = None
class DOWN(UnitCommand):
def __init__(self):
super().__init__()
def start(self, ser):
super().start(ser)
self.key.input(Hat.BTM)
self.wait(0.1)
self.key.input(Hat.CENTER)
self.key = None
class DOWN_LEFT(UnitCommand):
def __init__(self):
super().__init__()
def start(self, ser):
super().start(ser)
self.key.input(Hat.BTM_LEFT)
self.wait(0.1)
self.key.input(Hat.CENTER)
self.key = None
class LEFT(UnitCommand):
def __init__(self):
super().__init__()
def start(self, ser):
super().start(ser)
self.key.input(Hat.LEFT)
self.wait(0.1)
self.key.input(Hat.CENTER)
self.key = None
class UP_LEFT(UnitCommand):
def __init__(self):
super().__init__()
def start(self, ser):
super().start(ser)
self.key.input(Hat.TOP_LEFT)
self.wait(0.1)
self.key.input(Hat.CENTER)
self.key = None
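The fourteen button classes above differ only in the Button constant passed to press(), so they could be generated in a loop with type(). A sketch with a stand-in base class (the names and the fake base here are illustrative, not the real Keys API):

```python
def make_button_command(base, name, button):
    """Create a UnitCommand-style subclass whose start() presses `button`."""
    def start(self, ser):
        base.start(self, ser)
        self.press(button)
    return type(name, (base,), {"start": start})

# Stand-in base that records what was pressed, for demonstration only.
class FakeBase:
    def start(self, ser):
        self.started = ser
    def press(self, btn):
        self.pressed = btn

A = make_button_command(FakeBase, "A", "BTN_A")
cmd = A()
cmd.start("serial0")
print(cmd.pressed)  # BTN_A
```

A dict comprehension over a name-to-Button mapping would then replace the fourteen near-identical class definitions in one pass.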
import parallel_utilities
def test_version():
assert parallel_utilities.__version__ == "0.1.0"
1862f570098bf9777d078a89b69ddd8cec2d7b28 | 866 | py | Python | vulture-whitelist.py | lschmelzeisen/wallcrop | e6ab84154d2135f27609f9d29cec67c265b2744a | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | vulture-whitelist.py | lschmelzeisen/wallcrop | e6ab84154d2135f27609f9d29cec67c265b2744a | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | vulture-whitelist.py | lschmelzeisen/wallcrop | e6ab84154d2135f27609f9d29cec67c265b2744a | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | _.error_on_external_run # unused attribute (/home/lschmelzeisen/Repositories/wallcrop/noxfile.py:20)
_.reuse_existing_virtualenvs # unused attribute (/home/lschmelzeisen/Repositories/wallcrop/noxfile.py:21)
_.stop_on_first_error # unused attribute (/home/lschmelzeisen/Repositories/wallcrop/noxfile.py:22)
test # unused function (/home/lschmelzeisen/Repositories/wallcrop/noxfile.py:25)
Config # unused class (/home/lschmelzeisen/Repositories/wallcrop/src/wallcrop/_cli.py:34)
search_path # unused variable (/home/lschmelzeisen/Repositories/wallcrop/src/wallcrop/_cli.py:35)
Config # unused class (/home/lschmelzeisen/Repositories/wallcrop/src/wallcrop/_cli.py:39)
version # unused variable (/home/lschmelzeisen/Repositories/wallcrop/src/wallcrop/_cli.py:41)
description # unused variable (/home/lschmelzeisen/Repositories/wallcrop/src/wallcrop/_cli.py:42)
| 86.6 | 106 | 0.818707 | 109 | 866 | 6.348624 | 0.33945 | 0.221098 | 0.377168 | 0.481214 | 0.823699 | 0.823699 | 0.757225 | 0.757225 | 0.492775 | 0.492775 | 0 | 0.022167 | 0.062356 | 866 | 9 | 107 | 96.222222 | 0.830049 | 0.822171 | 0 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
a16e1e53fc1f658e42d775e2c1107de21c460eb4 | 1,877 | py | Python | Estructura de control selectivas/Ejercicio15.py | Argenta47/talleres_de_algoritmos | cfca1984c69702d48fb165070eb0c60b16977e51 | [
"MIT"
] | null | null | null | Estructura de control selectivas/Ejercicio15.py | Argenta47/talleres_de_algoritmos | cfca1984c69702d48fb165070eb0c60b16977e51 | [
"MIT"
] | null | null | null | Estructura de control selectivas/Ejercicio15.py | Argenta47/talleres_de_algoritmos | cfca1984c69702d48fb165070eb0c60b16977e51 | [
"MIT"
] | null | null | null | """
Input data
hemoglobin level --> NH --> float
age --> E --> int
sex --> S --> str
Output data
anemia result --> R --> str
"""
NH=float(input("Ingrese el nivel de hemoglobina: "))
TE=float(input("Edad en días digite 1, edad en meses digite 2, edad en años digite 3: "))
E=int(input("Ingrese la edad del paciente: "))
S=str(input("Ingrese el sexo en minusculas: "))
if TE==1 and E>0 and E<=60:
    if NH<=13:
        print("Resultado para anemia: Positivo")
    else:
        print("Resultado para anemia: Negativo")
elif TE==2 and E>1 and E<=6:
    if NH<=10:
        print("Resultado para anemia: Positivo")
    else:
        print("Resultado para anemia: Negativo")
elif TE==2 and E>6 and E<=12:
    if NH<=11:
        print("Resultado para anemia: Positivo")
    else:
        print("Resultado para anemia: Negativo")
elif TE==3 and E>1 and E<=5:
    if NH<=11.5:
        print("Resultado para anemia: Positivo")
    else:
        print("Resultado para anemia: Negativo")
elif TE==3 and E>5 and E<=10:
    if NH<=12.6:
        print("Resultado para anemia: Positivo")
    else:
        print("Resultado para anemia: Negativo")
elif TE==3 and E>10 and E<=15:
    if NH<=13:
        print("Resultado para anemia: Positivo")
    else:
        print("Resultado para anemia: Negativo")
elif TE==3 and S=="femenino" and E>15:
    if NH<=12:
        print("Resultado para anemia: Positivo")
    else:
        print("Resultado para anemia: Negativo")
elif TE==3 and S=="masculino" and E>15:
    if NH<=14:
        print("Resultado para anemia: Positivo")
    else:
        print("Resultado para anemia: Negativo")
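The hand-written branches encode a threshold table; a data-driven version keeps the ranges in one place. A sketch with thresholds copied from the conditions above (the table layout and helper name are inventions for illustration):

```python
# Rows: (unit, min_age, max_age, sex, threshold); unit: 1=days, 2=months, 3=years.
THRESHOLDS = [
    (1, 0, 60, None, 13.0),
    (2, 1, 6, None, 10.0),
    (2, 6, 12, None, 11.0),
    (3, 1, 5, None, 11.5),
    (3, 5, 10, None, 12.6),
    (3, 10, 15, None, 13.0),
    (3, 15, None, "femenino", 12.0),
    (3, 15, None, "masculino", 14.0),
]

def anemia_result(unit, age, sex, hb):
    """Return 'Positivo' or 'Negativo', or None if no range matches."""
    for u, lo, hi, s, limit in THRESHOLDS:
        in_range = age > lo and (hi is None or age <= hi)
        if u == unit and in_range and (s is None or s == sex):
            return "Positivo" if hb <= limit else "Negativo"
    return None

print(anemia_result(3, 7, "femenino", 12.0))  # Positivo (5 < 7 <= 10, 12.0 <= 12.6)
```

Adding or correcting an age bracket then means editing one row of data instead of a five-line branch.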
FIREFOX_USER_AGENT = r'Mozilla/5.0 (Macintosh; Intel Mac OS X 10.14; rv:75.0) Gecko/20100101 Firefox/75.0'
USER_AGENT = [
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.121 Safari/537.36",
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.86 Safari/537.36",
"Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:65.0) Gecko/20100101 Firefox/65.0",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.121 Safari/537.36",
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.119 Safari/537.36",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_3) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/12.0.3 Safari/605.1.15",
"Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.121 Safari/537.36",
"Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:66.0) Gecko/20100101 Firefox/66.0",
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/64.0.3282.140 Safari/537.36 Edge/17.17134",
"Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:65.0) Gecko/20100101 Firefox/65.0",
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.75 Safari/537.36",
"Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:65.0) Gecko/20100101 Firefox/65.0",
"Mozilla/5.0 (X11; Linux x86_64; rv:60.0) Gecko/20100101 Firefox/60.0",
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/64.0.3282.140 Safari/537.36 Edge/18.17763",
"Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.86 Safari/537.36",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.86 Safari/537.36",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.119 Safari/537.36",
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.109 Safari/537.36",
"Mozilla/5.0 (Windows NT 6.1; rv:60.0) Gecko/20100101 Firefox/60.0",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_13_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.121 Safari/537.36",
"Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko",
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/71.0.3578.98 Safari/537.36",
"Mozilla/5.0 (Windows NT 10.0; WOW64; Trident/7.0; rv:11.0) like Gecko",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10.14; rv:65.0) Gecko/20100101 Firefox/65.0",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_13_6) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/12.0.3 Safari/605.1.15",
"Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.119 Safari/537.36",
"Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:66.0) Gecko/20100101 Firefox/66.0",
"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.121 Safari/537.36",
"Mozilla/5.0 (X11; Linux x86_64; rv:65.0) Gecko/20100101 Firefox/65.0",
"Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:66.0) Gecko/20100101 Firefox/66.0",
"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.86 Safari/537.36",
"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.75 Safari/537.36",
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/71.0.3578.98 Safari/537.36 OPR/58.0.3135.79",
"Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.121 Safari/537.36",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.86 Safari/537.36",
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/71.0.3578.98 Safari/537.36 OPR/58.0.3135.107",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.121 Safari/537.36",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_13_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.86 Safari/537.36",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_13_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.119 Safari/537.36",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.75 Safari/537.36",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10.13; rv:65.0) Gecko/20100101 Firefox/65.0",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.121 Safari/537.36",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10.14; rv:66.0) Gecko/20100101 Firefox/66.0",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.121 Safari/537.36",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_2) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/12.0.2 Safari/605.1.15",
"Mozilla/5.0 (Windows NT 6.1; WOW64; rv:65.0) Gecko/20100101 Firefox/65.0",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_4) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/12.1 Safari/605.1.15",
"Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.109 YaBrowser/19.3.0.2485 Yowser/2.5 Safari/537.36",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/11.1.2 Safari/605.1.15",
"Mozilla/5.0 (iPad; CPU OS 12_1_4 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/12.0 Mobile/15E148 Safari/604.1",
"Mozilla/5.0 (X11; Linux x86_64; rv:66.0) Gecko/20100101 Firefox/66.0",
"Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.86 Safari/537.36",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/12.0 Safari/605.1.15",
"Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:60.0) Gecko/20100101 Firefox/60.0",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.109 Safari/537.36",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.119 Safari/537.36",
"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/72.0.3626.121 Chrome/72.0.3626.121 Safari/537.36",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_6) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/12.0.3 Safari/605.1.15",
"Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.121 YaBrowser/19.3.1.828 Yowser/2.5 Safari/537.36",
"Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.109 YaBrowser/19.3.0.3022 Yowser/2.5 Safari/537.36",
"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.119 Safari/537.36",
"Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.121 Safari/537.36",
"Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.121 Safari/537.36",
"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.106 Safari/537.36",
"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.77 Safari/537.36",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10.11; rv:65.0) Gecko/20100101 Firefox/65.0",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.121 Safari/537.36",
"Mozilla/5.0 (Windows NT 10.0; WOW64; Trident/7.0; Touch; rv:11.0) like Gecko",
"Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.75 Safari/537.36",
"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/73.0.3683.75 Chrome/73.0.3683.75 Safari/537.36",
"Mozilla/5.0 (Windows NT 6.3; Win64; x64; rv:65.0) Gecko/20100101 Firefox/65.0",
"Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/71.0.3578.98 Safari/537.36",
"Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.109 Safari/537.36",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10.12; rv:65.0) Gecko/20100101 Firefox/65.0",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.86 Safari/537.36",
"Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:67.0) Gecko/20100101 Firefox/67.0",
"Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.86 Safari/537.36",
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/71.0.3578.98 Safari/537.36 OPR/58.0.3135.118",
"Mozilla/5.0 (Windows NT 10.0; WOW64; rv:65.0) Gecko/20100101 Firefox/65.0",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10.13; rv:66.0) Gecko/20100101 Firefox/66.0",
"Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:60.0) Gecko/20100101 Firefox/60.0",
"Mozilla/5.0 (Windows NT 6.1; Trident/7.0; rv:11.0) like Gecko",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_1) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/12.0.1 Safari/605.1.15",
"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.109 YaBrowser/19.3.0.2485 Yowser/2.5 Safari/537.36",
"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.109 Safari/537.36",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.86 Safari/537.36",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_13_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.109 Safari/537.36",
"Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:64.0) Gecko/20100101 Firefox/64.0",
"Mozilla/5.0 (Windows NT 10.0; WOW64; rv:56.0) Gecko/20100101 Firefox/56.0",
"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.96 Safari/537.36",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.119 Safari/537.36",
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.77 Safari/537.36",
"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/71.0.3578.98 Safari/537.36 OPR/58.0.3135.127",
"Mozilla/5.0 (X11; Linux x86_64; rv:52.0) Gecko/20100101 Firefox/52.0",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/81.0.4044.122 Safari/537.36",
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10.14; rv:75.0) Gecko/20100101 Firefox/75.0",
]
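Rotating user-agent lists like the one above are typically consumed by sampling one entry per request; a minimal sketch, assuming the list is bound to a name such as `USER_AGENTS` (that name, and the helper below, are my own, not from this file):

```python
import random

# Hypothetical binding; in the source the full list is assigned elsewhere.
USER_AGENTS = [
    "Mozilla/5.0 (X11; Linux x86_64; rv:66.0) Gecko/20100101 Firefox/66.0",
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/71.0.3578.98 Safari/537.36",
]

def random_headers():
    """Build request headers with a randomly chosen User-Agent."""
    return {"User-Agent": random.choice(USER_AGENTS)}

headers = random_headers()
print(headers["User-Agent"])
```

Each call draws independently, so repeated requests cycle through the pool without any bookkeeping.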
| 109.93 | 150 | 0.699354 | 2,145 | 10,993 | 3.54965 | 0.047552 | 0.074862 | 0.114657 | 0.15721 | 0.965196 | 0.965196 | 0.965196 | 0.965196 | 0.950749 | 0.934069 | 0 | 0.261267 | 0.134085 | 10,993 | 99 | 151 | 111.040404 | 0.538607 | 0 | 0 | 0 | 0 | 0.979798 | 0.926317 | 0.009643 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
a1dab426db4aa7699bfb154c476add14361cbe6c | 174 | py | Python | twarc/version.py | mariosanz92/twarc | 469c23abc7b661f9841eeed6fd7d735b81b1c240 | [
"MIT"
] | null | null | null | twarc/version.py | mariosanz92/twarc | 469c23abc7b661f9841eeed6fd7d735b81b1c240 | [
"MIT"
] | null | null | null | twarc/version.py | mariosanz92/twarc | 469c23abc7b661f9841eeed6fd7d735b81b1c240 | [
"MIT"
] | null | null | null |
import platform
version = "2.9.5"
user_agent = f"twarc/{version} ({platform.system()} {platform.machine()}) {platform.python_implementation()}/{platform.python_version()}"
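The f-string above yields a string of the form `twarc/2.9.5 (<system> <machine>) <implementation>/<python-version>`; a quick sanity check of that shape (the regex is my own, not part of twarc):

```python
import platform
import re

version = "2.9.5"
user_agent = (
    f"twarc/{version} ({platform.system()} {platform.machine()}) "
    f"{platform.python_implementation()}/{platform.python_version()}"
)

# e.g. "twarc/2.9.5 (Linux x86_64) CPython/3.10.12"
assert re.match(r"twarc/[\d.]+ \(.*\) \w+/[\d.]+", user_agent)
print(user_agent)
```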
| 29 | 137 | 0.729885 | 21 | 174 | 5.904762 | 0.666667 | 0.225806 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018634 | 0.074713 | 174 | 5 | 138 | 34.8 | 0.751553 | 0 | 0 | 0 | 0 | 0.333333 | 0.724138 | 0.477011 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
6299e43cbbcf507dd111a48fa8d104779033bb06 | 9,707 | py | Python | mars/tests/test_kvstore.py | tomzhang/mars-1 | 6f1d85e37eb1b383251314cb0ba13e06288af03d | [
"Apache-2.0"
] | 2 | 2019-03-29T04:11:10.000Z | 2020-07-08T10:19:54.000Z | mars/tests/test_kvstore.py | tomzhang/mars-1 | 6f1d85e37eb1b383251314cb0ba13e06288af03d | [
"Apache-2.0"
] | null | null | null | mars/tests/test_kvstore.py | tomzhang/mars-1 | 6f1d85e37eb1b383251314cb0ba13e06288af03d | [
"Apache-2.0"
] | null | null | null |
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# Copyright 1999-2020 Alibaba Group Holding Ltd.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import os
import sys
import unittest

import gevent

from mars.tests.core import EtcdProcessHelper
from mars.kvstore import get, PathResult


class Test(unittest.TestCase):
    def testLocalPathStore(self):
        kvstore = get(':inproc:')
        kvstore.write('/node/subnode/v1', 'value1')
        kvstore.write('/node/v2', 'value2')

        res = kvstore.read('/node', sort=True)
        expected = PathResult(key='/node', dir=True, children=[
            PathResult(key='/node/subnode', dir=True),
            PathResult(key='/node/v2', value='value2'),
        ])
        self.assertEqual(repr(res), repr(expected))

        res = kvstore.read('/node', recursive=True, sort=True)
        expected = PathResult(key='/node', dir=True, children=[
            PathResult(key='/node/subnode/v1', value='value1'),
            PathResult(key='/node/v2', value='value2'),
        ])
        self.assertEqual(repr(res), repr(expected))

        kvstore.write('/node/v3', 'value3')
        with self.assertRaises(KeyError):
            kvstore.write('/node/v2/invalid_value', value='invalid')

        res = kvstore.read('/', recursive=False, sort=True)
        expected = PathResult(key='/', dir=True, children=[
            PathResult(key='/node', dir=True),
        ])
        self.assertEqual(repr(res), repr(expected))

        res = kvstore.read('/', recursive=True, sort=True)
        expected = PathResult(key='/', dir=True, children=[
            PathResult(key='/node/subnode/v1', value='value1'),
            PathResult(key='/node/v2', value='value2'),
            PathResult(key='/node/v3', value='value3'),
        ])
        self.assertEqual(repr(res), repr(expected))

        kvstore.write('/node/subnode2/v4', 'value4')
        with self.assertRaises(KeyError):
            kvstore.delete('/node/subnode', dir=True)
        kvstore.delete('/node/subnode/v1')

        res = kvstore.read('/', recursive=True, sort=True)
        expected = PathResult(key='/', dir=True, children=[
            PathResult(key='/node/subnode', dir=True),
            PathResult(key='/node/subnode2/v4', value='value4'),
            PathResult(key='/node/v2', value='value2'),
            PathResult(key='/node/v3', value='value3'),
        ])
        self.assertEqual(repr(res), repr(expected))

        kvstore.delete('/node/subnode2', dir=True, recursive=True)
        res = kvstore.read('/', recursive=True, sort=True)
        expected = PathResult(key='/', dir=True, children=[
            PathResult(key='/node/subnode', dir=True),
            PathResult(key='/node/v2', value='value2'),
            PathResult(key='/node/v3', value='value3')
        ])
        self.assertEqual(repr(res), repr(expected))

    @unittest.skipIf(sys.platform == 'win32', 'does not run in windows')
    @unittest.skipIf('CI' not in os.environ and not EtcdProcessHelper().is_installed(),
                     'does not run without etcd')
    def testEtcdPathStore(self):
        with EtcdProcessHelper(port_range_start=51342).run():
            kvstore = get(u'etcd://localhost:51342')
            kvstore.write(u'/node/subnode/v1', u'value1')
            kvstore.write(u'/node/v2', u'value2')

            res = kvstore.read(u'/node', sort=True)
            expected = PathResult(key=u'/node', dir=True, children=[
                PathResult(key=u'/node/subnode', dir=True),
                PathResult(key=u'/node/v2', value=u'value2'),
            ])
            self.assertEqual(repr(res), repr(expected))

            res = kvstore.read(u'/node', recursive=True, sort=True)
            expected = PathResult(key=u'/node', dir=True, children=[
                PathResult(key=u'/node/subnode/v1', value=u'value1'),
                PathResult(key=u'/node/v2', value=u'value2'),
            ])
            self.assertEqual(repr(res), repr(expected))

            kvstore.write(u'/node/v3', u'value3')
            with self.assertRaises(KeyError):
                kvstore.write(u'/node/v2/invalid_value', value=u'invalid')

            res = kvstore.read('/', recursive=False, sort=True)
            expected = PathResult(key='/', dir=True, children=[
                PathResult(key=u'/node', dir=True),
            ])
            self.assertEqual(repr(res), repr(expected))

            res = kvstore.read('/', recursive=True, sort=True)
            expected = PathResult(key='/', dir=True, children=[
                PathResult(key=u'/node/subnode/v1', value=u'value1'),
                PathResult(key=u'/node/v2', value=u'value2'),
                PathResult(key=u'/node/v3', value=u'value3'),
            ])
            self.assertEqual(repr(res), repr(expected))

            kvstore.write(u'/node/subnode2/v4', u'value4')
            with self.assertRaises(KeyError):
                kvstore.delete(u'/node/subnode', dir=True)
            kvstore.delete(u'/node/subnode/v1')

            res = kvstore.read('/', recursive=True, sort=True)
            expected = PathResult(key='/', dir=True, children=[
                PathResult(key=u'/node/subnode', dir=True),
                PathResult(key=u'/node/subnode2/v4', value=u'value4'),
                PathResult(key=u'/node/v2', value=u'value2'),
                PathResult(key=u'/node/v3', value=u'value3'),
            ])
            self.assertEqual(repr(res), repr(expected))

            kvstore.delete(u'/node', recursive=True, dir=True)

    def testLocalWatch(self):
        kvstore = get(':inproc:')
        kvstore.write('/node/subnode/v1', 'value1')
        kvstore.write('/node/v2', 'value2')

        def watcher():
            return kvstore.watch('/node/v2', timeout=10)

        def writer():
            gevent.sleep(0.5)
            kvstore.write('/node/v2', 'value2\'')

        g1 = gevent.spawn(writer)
        g2 = gevent.spawn(watcher)
        gevent.joinall([g1, g2])
        self.assertEqual(g2.value.value, 'value2\'')
        kvstore.delete('/node/v2')

        def watcher():
            return kvstore.watch('/node/subnode', timeout=10, recursive=True)

        def writer():
            gevent.sleep(0.5)
            kvstore.write('/node/subnode/v1', 'value1\'')

        g1 = gevent.spawn(writer)
        g2 = gevent.spawn(watcher)
        gevent.joinall([g1, g2])
        self.assertEqual(g2.value.children[0].value, 'value1\'')

        kvstore.write('/node/subnode/v3', '-1')

        def watcher():
            results = []
            for idx, result in enumerate(kvstore.eternal_watch('/node/subnode/v3')):
                results.append(int(result.value))
                if idx == 4:
                    break
            return results

        def writer():
            gevent.sleep(0.1)
            for v in range(5):
                kvstore.write('/node/subnode/v3', str(v))
                gevent.sleep(0.1)

        g1 = gevent.spawn(writer)
        g2 = gevent.spawn(watcher)
        gevent.joinall([g1, g2])
        self.assertEqual(g2.value, list(range(5)))

        kvstore.delete('/node', dir=True, recursive=True)

    @unittest.skipIf(sys.platform == 'win32', 'does not run in windows')
    @unittest.skipIf('CI' not in os.environ and not EtcdProcessHelper().is_installed(),
                     'does not run without etcd')
    def testEtcdWatch(self):
        with EtcdProcessHelper(port_range_start=51342).run():
            kvstore = get('etcd://localhost:51342')
            kvstore.write('/node/subnode/v1', 'value1')
            kvstore.write('/node/v2', 'value2')

            def watcher():
                return kvstore.watch('/node/v2', timeout=10)

            def writer():
                gevent.sleep(1)
                kvstore.write('/node/v2', 'value2\'')

            g1 = gevent.spawn(writer)
            g2 = gevent.spawn(watcher)
            gevent.joinall([g1, g2])
            self.assertEqual(g2.value.value, 'value2\'')
            kvstore.delete('/node/v2')

            def watcher():
                return kvstore.watch('/node/subnode', timeout=10, recursive=True)

            def writer():
                gevent.sleep(1)
                kvstore.write('/node/subnode/v1', 'value1\'')

            g1 = gevent.spawn(writer)
            g2 = gevent.spawn(watcher)
            gevent.joinall([g1, g2])
            self.assertEqual(g2.value.children[0].value, 'value1\'')

            kvstore.write('/node/subnode/v3', '-1')

            def watcher():
                results = []
                for idx, result in enumerate(kvstore.eternal_watch('/node/subnode/v3')):
                    results.append(int(result.value))
                    if idx == 4:
                        break
                return results

            def writer():
                gevent.sleep(0.1)
                for v in range(5):
                    kvstore.write('/node/subnode/v3', str(v))
                    gevent.sleep(0.1)

            g1 = gevent.spawn(writer)
            g2 = gevent.spawn(watcher)
            gevent.joinall([g1, g2])
            self.assertEqual(g2.value, list(range(5)))

            kvstore.delete('/node', dir=True, recursive=True)
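The local path-store semantics exercised by these tests (hierarchical keys, prefix-aware reads) can be sketched with a flat dict keyed by full path. This toy version is my own illustration, not the mars implementation:

```python
class ToyPathStore:
    """Minimal flat-dict sketch of a hierarchical key-value store."""

    def __init__(self):
        self._data = {}  # full path -> value

    def write(self, key, value):
        self._data[key] = value

    def read(self, prefix):
        """Return sorted (key, value) pairs at or under a path prefix."""
        prefix = prefix.rstrip('/')
        return sorted(
            (k, v) for k, v in self._data.items()
            if k == prefix or k.startswith(prefix + '/')
        )

store = ToyPathStore()
store.write('/node/subnode/v1', 'value1')
store.write('/node/v2', 'value2')
print(store.read('/node'))  # → [('/node/subnode/v1', 'value1'), ('/node/v2', 'value2')]
```

The real store additionally tracks directory nodes and supports watches; this sketch only captures the path-prefix read behavior.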
| 37.624031 | 88 | 0.56176 | 1,099 | 9,707 | 4.952684 | 0.151956 | 0.090759 | 0.049972 | 0.046298 | 0.841999 | 0.817748 | 0.804887 | 0.770715 | 0.764101 | 0.756568 | 0 | 0.02793 | 0.284434 | 9,707 | 257 | 89 | 37.770428 | 0.755687 | 0.063047 | 0 | 0.8125 | 0 | 0 | 0.138106 | 0.009692 | 0 | 0 | 0 | 0 | 0.109375 | 1 | 0.083333 | false | 0 | 0.03125 | 0.020833 | 0.151042 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c519d3208f7da5e18d9337f70e9aa6906f90584e | 163 | py | Python | mantra/util/data/__init__.py | durandtibo/mantra-python | a35dfd93f92f7f510a212ee5356ae4d776a27849 | [
"MIT"
] | 1 | 2019-02-22T09:48:04.000Z | 2019-02-22T09:48:04.000Z | mantra/util/data/__init__.py | durandtibo/mantra-python | a35dfd93f92f7f510a212ee5356ae4d776a27849 | [
"MIT"
] | null | null | null | mantra/util/data/__init__.py | durandtibo/mantra-python | a35dfd93f92f7f510a212ee5356ae4d776a27849 | [
"MIT"
] | null | null | null |
from mantra.util.data.labeled_object import LabeledObject
from mantra.util.data.preprocessing import Preprocessing
from mantra.util.data.bag import Bag, BagReader
| 40.75 | 57 | 0.858896 | 23 | 163 | 6.043478 | 0.478261 | 0.215827 | 0.302158 | 0.388489 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.079755 | 163 | 3 | 58 | 54.333333 | 0.926667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
c54a3efbba18d1a70e820fc01627674734ea989c | 117,718 | py | Python | mp/settings_lookup.py | Ecotrust/COMPASS | 42ee113e4d66767300cfab0d6ce1f35847f447ed | [
"Apache-2.0"
] | 1 | 2017-09-06T14:05:48.000Z | 2017-09-06T14:05:48.000Z | mp/settings_lookup.py | Ecotrust/COMPASS | 42ee113e4d66767300cfab0d6ce1f35847f447ed | [
"Apache-2.0"
] | 118 | 2015-01-05T19:52:11.000Z | 2021-11-30T18:33:35.000Z | mp/settings_lookup.py | Ecotrust/COMPASS | 42ee113e4d66767300cfab0d6ce1f35847f447ed | [
"Apache-2.0"
] | null | null | null |
ECOREGION_LOOKUP = {
'blue mountains': "<a href='https://oregonconservationstrategy.org/ecoregion/blue-mountains/' class='ocs-species-link' target='_blank'>Blue Mountains</a>",
'coast range': "<a href='https://oregonconservationstrategy.org/ecoregion/coast-range/' class='ocs-species-link' target='_blank'>Coast Range</a>",
'columbia plateau': "<a href='https://oregonconservationstrategy.org/ecoregion/columbia-plateau/' class='ocs-species-link' target='_blank'>Columbia Plateau</a>",
'east cascades': "<a href='https://oregonconservationstrategy.org/ecoregion/east-cascades/' class='ocs-species-link' target='_blank'>East Cascades</a>",
'klamath mountains': "<a href='https://oregonconservationstrategy.org/ecoregion/klamath-mountains/' class='ocs-species-link' target='_blank'>Klamath Mountains</a>",
'northern basin and range': "<a href='https://oregonconservationstrategy.org/ecoregion/northern-basin-and-range/' class='ocs-species-link' target='_blank'>Northern Basin and Range</a>",
'west cascades': "<a href='https://oregonconservationstrategy.org/ecoregion/west-cascades/' class='ocs-species-link' target='_blank'>West Cascades</a>",
'willamette valley': "<a href='https://oregonconservationstrategy.org/ecoregion/willamette-valley/' class='ocs-species-link' target='_blank'>Willamette Valley</a>",
'nearshore': "<a href='https://oregonconservationstrategy.org/ecoregion/nearshore/' class='ocs-species-link' target='_blank'>Nearshore</a>",
}
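Keys in this table are lowercase, so callers presumably normalize names before lookup; a small helper along those lines (the function name and sample table are my own, not from this module):

```python
def ecoregion_link(name, lookup):
    """Return the anchor tag for an ecoregion name, matching case-insensitively;
    fall back to the plain name when the table has no entry."""
    return lookup.get(name.strip().lower(), name)

# Hypothetical usage against a table shaped like ECOREGION_LOOKUP:
sample = {'coast range': "<a href='...'>Coast Range</a>"}
print(ecoregion_link('Coast Range', sample))  # → <a href='...'>Coast Range</a>
print(ecoregion_link('nearshore', sample))    # → nearshore
```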
SPECIES_LOOKUP = {
"1":u"<a href='https://oregonconservationstrategy.org/strategy-species/cascade-torrent-salamander/' class='ocs-species-link' target='_blank'>Cascade Torrent Salamander</a>",
"2":u"<a href='https://oregonconservationstrategy.org/strategy-species/cascade-torrent-salamander/' class='ocs-species-link' target='_blank'>Cascade Torrent Salamander</a>",
"3":u"<a href='https://oregonconservationstrategy.org/strategy-species/cascades-frog/' class='ocs-species-link' target='_blank'>Cascades Frog</a>",
"4":u"<a href='https://oregonconservationstrategy.org/strategy-species/cascades-frog/' class='ocs-species-link' target='_blank'>Cascades Frog</a>",
"5":u"<a href='https://oregonconservationstrategy.org/strategy-species/clouded-salamander/' class='ocs-species-link' target='_blank'>Clouded Salamander</a>",
"6":u"<a href='https://oregonconservationstrategy.org/strategy-species/clouded-salamander/' class='ocs-species-link' target='_blank'>Clouded Salamander</a>",
"7":u"<a href='https://oregonconservationstrategy.org/strategy-species/coastal-tailed-frog/' class='ocs-species-link' target='_blank'>Coastal Tailed Frog</a>",
"8":u"<a href='https://oregonconservationstrategy.org/strategy-species/coastal-tailed-frog/' class='ocs-species-link' target='_blank'>Coastal Tailed Frog</a>",
"9":u"<a href='https://oregonconservationstrategy.org/strategy-species/columbia-spotted-frog/' class='ocs-species-link' target='_blank'>Columbia Spotted Frog</a>",
"10":u"<a href='https://oregonconservationstrategy.org/strategy-species/columbia-spotted-frog/' class='ocs-species-link' target='_blank'>Columbia Spotted Frog</a>",
"11":u"<a href='https://oregonconservationstrategy.org/strategy-species/columbia-torrent-salamander/' class='ocs-species-link' target='_blank'>Columbia Torrent Salamander</a>",
"12":u"<a href='https://oregonconservationstrategy.org/strategy-species/columbia-torrent-salamander/' class='ocs-species-link' target='_blank'>Columbia Torrent Salamander</a>",
"13":u"<a href='https://oregonconservationstrategy.org/strategy-species/copes-giant-salamander/' class='ocs-species-link' target='_blank'>Cope's Giant Salamander</a>",
"14":u"<a href='https://oregonconservationstrategy.org/strategy-species/copes-giant-salamander/' class='ocs-species-link' target='_blank'>Cope's Giant Salamander</a>",
"15":u"<a href='https://oregonconservationstrategy.org/strategy-species/del-norte-salamander/' class='ocs-species-link' target='_blank'>Del Norte Salamander</a>",
"16":u"<a href='https://oregonconservationstrategy.org/strategy-species/del-norte-salamander/' class='ocs-species-link' target='_blank'>Del Norte Salamander</a>",
"17":u"<a href='https://oregonconservationstrategy.org/strategy-species/foothill-yellow-legged-frog/' class='ocs-species-link' target='_blank'>Foothill Yellow-legged Frog</a>",
"18":u"<a href='https://oregonconservationstrategy.org/strategy-species/foothill-yellow-legged-frog/' class='ocs-species-link' target='_blank'>Foothill Yellow-legged Frog</a>",
"19":u"<a href='https://oregonconservationstrategy.org/strategy-species/larch-mountain-salamander/' class='ocs-species-link' target='_blank'>Larch Mountain Salamander</a>",
"20":u"<a href='https://oregonconservationstrategy.org/strategy-species/larch-mountain-salamander/' class='ocs-species-link' target='_blank'>Larch Mountain Salamander</a>",
"21":u"<a href='https://oregonconservationstrategy.org/strategy-species/northern-red-legged-frog/' class='ocs-species-link' target='_blank'>Northern Red-legged Frog</a>",
"22":u"<a href='https://oregonconservationstrategy.org/strategy-species/northern-red-legged-frog/' class='ocs-species-link' target='_blank'>Northern Red-legged Frog</a>",
"23":u"<a href='https://oregonconservationstrategy.org/strategy-species/oregon-slender-salamander/' class='ocs-species-link' target='_blank'>Oregon Slender Salamander</a>",
"24":u"<a href='https://oregonconservationstrategy.org/strategy-species/oregon-slender-salamander/' class='ocs-species-link' target='_blank'>Oregon Slender Salamander</a>",
"25":u"<a href='https://oregonconservationstrategy.org/strategy-species/oregon-spotted-frog/' class='ocs-species-link' target='_blank'>Oregon Spotted Frog</a>",
"26":u"<a href='https://oregonconservationstrategy.org/strategy-species/oregon-spotted-frog/' class='ocs-species-link' target='_blank'>Oregon Spotted Frog</a>",
"27":u"<a href='https://oregonconservationstrategy.org/strategy-species/rocky-mountain-tailed-frog/' class='ocs-species-link' target='_blank'>Rocky Mountain Tailed Frog</a>",
"28":u"<a href='https://oregonconservationstrategy.org/strategy-species/rocky-mountain-tailed-frog/' class='ocs-species-link' target='_blank'>Rocky Mountain Tailed Frog</a>",
"29":u"<a href='https://oregonconservationstrategy.org/strategy-species/siskiyou-mountain-salamander/' class='ocs-species-link' target='_blank'>Siskiyou Mountain Salamander</a>",
"30":u"<a href='https://oregonconservationstrategy.org/strategy-species/siskiyou-mountain-salamander/' class='ocs-species-link' target='_blank'>Siskiyou Mountain Salamander</a>",
"31":u"<a href='https://oregonconservationstrategy.org/strategy-species/southern-torrent-salamander/' class='ocs-species-link' target='_blank'>Southern Torrent Salamander</a>",
"32":u"<a href='https://oregonconservationstrategy.org/strategy-species/southern-torrent-salamander/' class='ocs-species-link' target='_blank'>Southern Torrent Salamander</a>",
"33":u"<a href='https://oregonconservationstrategy.org/strategy-species/western-toad/' class='ocs-species-link' target='_blank'>Western Toad</a>",
"34":u"<a href='https://oregonconservationstrategy.org/strategy-species/western-toad/' class='ocs-species-link' target='_blank'>Western Toad</a>",
"35":u"<a href='https://oregonconservationstrategy.org/strategy-species/acorn-woodpecker/' class='ocs-species-link' target='_blank'>Acorn Woodpecker</a>",
"36":u"<a href='https://oregonconservationstrategy.org/strategy-species/acorn-woodpecker/' class='ocs-species-link' target='_blank'>Acorn Woodpecker</a>",
"37":u"<a href='https://oregonconservationstrategy.org/strategy-species/american-peregrine-falcon/' class='ocs-species-link' target='_blank'>Peregrine Falcon (American)</a>",
"38":u"<a href='https://oregonconservationstrategy.org/strategy-species/american-peregrine-falcon/' class='ocs-species-link' target='_blank'>Peregrine Falcon (American)</a>",
"39":u"<a href='https://oregonconservationstrategy.org/strategy-species/american-three-toed-woodpecker/' class='ocs-species-link' target='_blank'>American Three-toed Woodpecker</a>",
"40":u"<a href='https://oregonconservationstrategy.org/strategy-species/american-three-toed-woodpecker/' class='ocs-species-link' target='_blank'>American Three-toed Woodpecker</a>",
"41":u"<a href='https://oregonconservationstrategy.org/strategy-species/american-white-pelican/' class='ocs-species-link' target='_blank'>American White Pelican</a>",
"42":u"<a href='https://oregonconservationstrategy.org/strategy-species/american-white-pelican/' class='ocs-species-link' target='_blank'>American White Pelican</a>",
"43":u"<a href='https://oregonconservationstrategy.org/strategy-species/black-backed-woodpecker/' class='ocs-species-link' target='_blank'>Black-backed Woodpecker</a>",
"44":u"<a href='https://oregonconservationstrategy.org/strategy-species/black-backed-woodpecker/' class='ocs-species-link' target='_blank'>Black-backed Woodpecker</a>",
"45":u"<a href='https://oregonconservationstrategy.org/strategy-species/black-necked-stilt/' class='ocs-species-link' target='_blank'>Black-necked Stilt</a>",
"46":u"<a href='https://oregonconservationstrategy.org/strategy-species/black-necked-stilt/' class='ocs-species-link' target='_blank'>Black-necked Stilt</a>",
"47":u"<a href='https://oregonconservationstrategy.org/strategy-species/black-brant/' class='ocs-species-link' target='_blank'>Black Brant</a>",
"48":u"<a href='https://oregonconservationstrategy.org/strategy-species/black-brant/' class='ocs-species-link' target='_blank'>Black Brant</a>",
"49":u"<a href='https://oregonconservationstrategy.org/strategy-species/black-oystercatcher/' class='ocs-species-link' target='_blank'>Black Oystercatcher</a>",
"50":u"<a href='https://oregonconservationstrategy.org/strategy-species/black-oystercatcher/' class='ocs-species-link' target='_blank'>Black Oystercatcher</a>",
"51":u"<a href='https://oregonconservationstrategy.org/strategy-species/black-swift/' class='ocs-species-link' target='_blank'>Black Swift</a>",
"52":u"<a href='https://oregonconservationstrategy.org/strategy-species/black-swift/' class='ocs-species-link' target='_blank'>Black Swift</a>",
"53":u"<a href='https://oregonconservationstrategy.org/strategy-species/bobolink/' class='ocs-species-link' target='_blank'>Bobolink</a>",
"54":u"<a href='https://oregonconservationstrategy.org/strategy-species/bobolink/' class='ocs-species-link' target='_blank'>Bobolink</a>",
"55":u"<a href='https://oregonconservationstrategy.org/strategy-species/brewers-sparrow/' class='ocs-species-link' target='_blank'>Brewer's Sparrow</a>",
"56":u"<a href='https://oregonconservationstrategy.org/strategy-species/brewers-sparrow/' class='ocs-species-link' target='_blank'>Brewer's Sparrow</a>",
"57":u"<a href='https://oregonconservationstrategy.org/strategy-species/california-brown-pelican/' class='ocs-species-link' target='_blank'>Brown Pelican (California)</a>",
"58":u"<a href='https://oregonconservationstrategy.org/strategy-species/california-brown-pelican/' class='ocs-species-link' target='_blank'>Brown Pelican (California)</a>",
"59":u"<a href='https://oregonconservationstrategy.org/strategy-species/caspian-tern/' class='ocs-species-link' target='_blank'>Caspian Tern</a>",
"60":u"<a href='https://oregonconservationstrategy.org/strategy-species/caspian-tern/' class='ocs-species-link' target='_blank'>Caspian Tern</a>",
"61":u"<a href='https://oregonconservationstrategy.org/strategy-species/chipping-sparrow/' class='ocs-species-link' target='_blank'>Chipping Sparrow</a>",
"62":u"<a href='https://oregonconservationstrategy.org/strategy-species/chipping-sparrow/' class='ocs-species-link' target='_blank'>Chipping Sparrow</a>",
"63":u"<a href='https://oregonconservationstrategy.org/strategy-species/columbian-sharp-tailed-grouse/' class='ocs-species-link' target='_blank'>Columbian Sharp-Tailed Grouse</a>",
"64":u"<a href='https://oregonconservationstrategy.org/strategy-species/columbian-sharp-tailed-grouse/' class='ocs-species-link' target='_blank'>Columbian Sharp-Tailed Grouse</a>",
"65":u"<a href='https://oregonconservationstrategy.org/strategy-species/common-nighthawk/' class='ocs-species-link' target='_blank'>Common Nighthawk</a>",
"66":u"<a href='https://oregonconservationstrategy.org/strategy-species/common-nighthawk/' class='ocs-species-link' target='_blank'>Common Nighthawk</a>",
"67":u"<a href='https://oregonconservationstrategy.org/strategy-species/dusky-canada-goose/' class='ocs-species-link' target='_blank'>Dusky Canada Goose</a>",
"68":u"<a href='https://oregonconservationstrategy.org/strategy-species/dusky-canada-goose/' class='ocs-species-link' target='_blank'>Dusky Canada Goose</a>",
"69":u"<a href='https://oregonconservationstrategy.org/strategy-species/ferruginous-hawk/' class='ocs-species-link' target='_blank'>Ferruginous Hawk</a>",
"70":u"<a href='https://oregonconservationstrategy.org/strategy-species/ferruginous-hawk/' class='ocs-species-link' target='_blank'>Ferruginous Hawk</a>",
"71":u"<a href='https://oregonconservationstrategy.org/strategy-species/flammulated-owl/' class='ocs-species-link' target='_blank'>Flammulated Owl</a>",
"72":u"<a href='https://oregonconservationstrategy.org/strategy-species/flammulated-owl/' class='ocs-species-link' target='_blank'>Flammulated Owl</a>",
"73":u"<a href='https://oregonconservationstrategy.org/strategy-species/fork-tailed-storm-petrel/' class='ocs-species-link' target='_blank'>Fork-tailed Storm-Petrel</a>",
"74":u"<a href='https://oregonconservationstrategy.org/strategy-species/fork-tailed-storm-petrel/' class='ocs-species-link' target='_blank'>Fork-tailed Storm-Petrel</a>",
"75":u"<a href='https://oregonconservationstrategy.org/strategy-species/franklins-gull/' class='ocs-species-link' target='_blank'>Franklin's Gull</a>",
"76":u"<a href='https://oregonconservationstrategy.org/strategy-species/franklins-gull/' class='ocs-species-link' target='_blank'>Franklin's Gull</a>",
"77":u"<a href='https://oregonconservationstrategy.org/strategy-species/grasshopper-sparrow/' class='ocs-species-link' target='_blank'>Grasshopper Sparrow</a>",
"78":u"<a href='https://oregonconservationstrategy.org/strategy-species/grasshopper-sparrow/' class='ocs-species-link' target='_blank'>Grasshopper Sparrow</a>",
"79":u"<a href='https://oregonconservationstrategy.org/strategy-species/great-gray-owl/' class='ocs-species-link' target='_blank'>Great Gray Owl</a>",
"80":u"<a href='https://oregonconservationstrategy.org/strategy-species/great-gray-owl/' class='ocs-species-link' target='_blank'>Great Gray Owl</a>",
"81":u"<a href='https://oregonconservationstrategy.org/strategy-species/greater-sage-grouse/' class='ocs-species-link' target='_blank'>Greater Sage-Grouse</a>",
"82":u"<a href='https://oregonconservationstrategy.org/strategy-species/greater-sage-grouse/' class='ocs-species-link' target='_blank'>Greater Sage-Grouse</a>",
"83":u"<a href='https://oregonconservationstrategy.org/strategy-species/greater-sandhill-crane/' class='ocs-species-link' target='_blank'>Greater Sandhill Crane</a>",
"84":u"<a href='https://oregonconservationstrategy.org/strategy-species/greater-sandhill-crane/' class='ocs-species-link' target='_blank'>Greater Sandhill Crane</a>",
"85":u"<a href='https://oregonconservationstrategy.org/strategy-species/harlequin-duck/' class='ocs-species-link' target='_blank'>Harlequin Duck</a>",
"86":u"<a href='https://oregonconservationstrategy.org/strategy-species/harlequin-duck/' class='ocs-species-link' target='_blank'>Harlequin Duck</a>",
"87":u"<a href='https://oregonconservationstrategy.org/strategy-species/juniper-titmouse/' class='ocs-species-link' target='_blank'>Juniper Titmouse</a>",
"88":u"<a href='https://oregonconservationstrategy.org/strategy-species/juniper-titmouse/' class='ocs-species-link' target='_blank'>Juniper Titmouse</a>",
"89":u"<a href='https://oregonconservationstrategy.org/strategy-species/leachs-storm-petrel/' class='ocs-species-link' target='_blank'>Leach's Storm-Petrel</a>",
"90":u"<a href='https://oregonconservationstrategy.org/strategy-species/leachs-storm-petrel/' class='ocs-species-link' target='_blank'>Leach's Storm-Petrel</a>",
"91":u"<a href='https://oregonconservationstrategy.org/strategy-species/lewiss-woodpecker/' class='ocs-species-link' target='_blank'>Lewis's Woodpecker</a>",
"92":u"<a href='https://oregonconservationstrategy.org/strategy-species/lewiss-woodpecker/' class='ocs-species-link' target='_blank'>Lewis's Woodpecker</a>",
"93":u"<a href='https://oregonconservationstrategy.org/strategy-species/loggerhead-shrike/' class='ocs-species-link' target='_blank'>Loggerhead Shrike</a>",
"94":u"<a href='https://oregonconservationstrategy.org/strategy-species/loggerhead-shrike/' class='ocs-species-link' target='_blank'>Loggerhead Shrike</a>",
"95":u"<a href='https://oregonconservationstrategy.org/strategy-species/long-billed-curlew/' class='ocs-species-link' target='_blank'>Long-billed Curlew</a>",
"96":u"<a href='https://oregonconservationstrategy.org/strategy-species/long-billed-curlew/' class='ocs-species-link' target='_blank'>Long-billed Curlew</a>",
"97":u"<a href='https://oregonconservationstrategy.org/strategy-species/marbled-murrelet/' class='ocs-species-link' target='_blank'>Marbled Murrelet</a>",
"98":u"<a href='https://oregonconservationstrategy.org/strategy-species/marbled-murrelet/' class='ocs-species-link' target='_blank'>Marbled Murrelet</a>",
"99":u"<a href='https://oregonconservationstrategy.org/strategy-species/mountain-quail/' class='ocs-species-link' target='_blank'>Mountain Quail</a>",
"100":u"<a href='https://oregonconservationstrategy.org/strategy-species/mountain-quail/' class='ocs-species-link' target='_blank'>Mountain Quail</a>",
"101":u"<a href='https://oregonconservationstrategy.org/strategy-species/northern-goshawk/' class='ocs-species-link' target='_blank'>Northern Goshawk</a>",
"102":u"<a href='https://oregonconservationstrategy.org/strategy-species/northern-goshawk/' class='ocs-species-link' target='_blank'>Northern Goshawk</a>",
"103":u"<a href='https://oregonconservationstrategy.org/strategy-species/northern-spotted-owl/' class='ocs-species-link' target='_blank'>Northern Spotted Owl</a>",
"104":u"<a href='https://oregonconservationstrategy.org/strategy-species/northern-spotted-owl/' class='ocs-species-link' target='_blank'>Northern Spotted Owl</a>",
"105":u"<a href='https://oregonconservationstrategy.org/strategy-species/olive-sided-flycatcher/' class='ocs-species-link' target='_blank'>Olive-sided Flycatcher</a>",
"106":u"<a href='https://oregonconservationstrategy.org/strategy-species/olive-sided-flycatcher/' class='ocs-species-link' target='_blank'>Olive-sided Flycatcher</a>",
"107":u"<a href='https://oregonconservationstrategy.org/strategy-species/oregon-vesper-sparrow/' class='ocs-species-link' target='_blank'>Oregon Vesper Sparrow</a>",
"108":u"<a href='https://oregonconservationstrategy.org/strategy-species/oregon-vesper-sparrow/' class='ocs-species-link' target='_blank'>Oregon Vesper Sparrow</a>",
"109":u"<a href='https://oregonconservationstrategy.org/strategy-species/pileated-woodpecker/' class='ocs-species-link' target='_blank'>Pileated Woodpecker</a>",
"110":u"<a href='https://oregonconservationstrategy.org/strategy-species/pileated-woodpecker/' class='ocs-species-link' target='_blank'>Pileated Woodpecker</a>",
"111":u"<a href='https://oregonconservationstrategy.org/strategy-species/red-necked-grebe/' class='ocs-species-link' target='_blank'>Red-necked Grebe</a>",
"112":u"<a href='https://oregonconservationstrategy.org/strategy-species/red-necked-grebe/' class='ocs-species-link' target='_blank'>Red-necked Grebe</a>",
"113":u"<a href='https://oregonconservationstrategy.org/strategy-species/rock-sandpiper/' class='ocs-species-link' target='_blank'>Rock Sandpiper</a>",
"114":u"<a href='https://oregonconservationstrategy.org/strategy-species/rock-sandpiper/' class='ocs-species-link' target='_blank'>Rock Sandpiper</a>",
"115":u"<a href='https://oregonconservationstrategy.org/strategy-species/short-eared-owl/' class='ocs-species-link' target='_blank'>Short-Eared Owl</a>",
"116":u"<a href='https://oregonconservationstrategy.org/strategy-species/short-eared-owl/' class='ocs-species-link' target='_blank'>Short-Eared Owl</a>",
"117":u"<a href='https://oregonconservationstrategy.org/strategy-species/slender-billed-white-breasted-nuthatch/' class='ocs-species-link' target='_blank'>White-breasted Nuthatch (Slender-billed)</a>",
"118":u"<a href='https://oregonconservationstrategy.org/strategy-species/slender-billed-white-breasted-nuthatch/' class='ocs-species-link' target='_blank'>White-breasted Nuthatch (Slender-billed)</a>",
"119":u"<a href='https://oregonconservationstrategy.org/strategy-species/snowy-egret/' class='ocs-species-link' target='_blank'>Snowy Egret</a>",
"120":u"<a href='https://oregonconservationstrategy.org/strategy-species/snowy-egret/' class='ocs-species-link' target='_blank'>Snowy Egret</a>",
"121":u"<a href='https://oregonconservationstrategy.org/strategy-species/streaked-horned-lark/' class='ocs-species-link' target='_blank'>Streaked Horned Lark</a>",
"122":u"<a href='https://oregonconservationstrategy.org/strategy-species/streaked-horned-lark/' class='ocs-species-link' target='_blank'>Streaked Horned Lark</a>",
"123":u"<a href='https://oregonconservationstrategy.org/strategy-species/swainsons-hawk/' class='ocs-species-link' target='_blank'>Swainson's Hawk</a>",
"124":u"<a href='https://oregonconservationstrategy.org/strategy-species/swainsons-hawk/' class='ocs-species-link' target='_blank'>Swainson's Hawk</a>",
"125":u"<a href='https://oregonconservationstrategy.org/strategy-species/tufted-puffin/' class='ocs-species-link' target='_blank'>Tufted Puffin</a>",
"126":u"<a href='https://oregonconservationstrategy.org/strategy-species/tufted-puffin/' class='ocs-species-link' target='_blank'>Tufted Puffin</a>",
"127":u"<a href='https://oregonconservationstrategy.org/strategy-species/upland-sandpiper/' class='ocs-species-link' target='_blank'>Upland Sandpiper</a>",
"128":u"<a href='https://oregonconservationstrategy.org/strategy-species/upland-sandpiper/' class='ocs-species-link' target='_blank'>Upland Sandpiper</a>",
"129":u"<a href='https://oregonconservationstrategy.org/strategy-species/western-bluebird/' class='ocs-species-link' target='_blank'>Western Bluebird</a>",
"130":u"<a href='https://oregonconservationstrategy.org/strategy-species/western-bluebird/' class='ocs-species-link' target='_blank'>Western Bluebird</a>",
"131":u"<a href='https://oregonconservationstrategy.org/strategy-species/western-burrowing-owl/' class='ocs-species-link' target='_blank'>Burrowing Owl (Western)</a>",
"132":u"<a href='https://oregonconservationstrategy.org/strategy-species/western-burrowing-owl/' class='ocs-species-link' target='_blank'>Burrowing Owl (Western)</a>",
"133":u"<a href='https://oregonconservationstrategy.org/strategy-species/western-meadowlark/' class='ocs-species-link' target='_blank'>Western Meadowlark</a>",
"134":u"<a href='https://oregonconservationstrategy.org/strategy-species/western-meadowlark/' class='ocs-species-link' target='_blank'>Western Meadowlark</a>",
"135":u"<a href='https://oregonconservationstrategy.org/strategy-species/western-purple-martin/' class='ocs-species-link' target='_blank'>Purple Martin</a>",
"136":u"<a href='https://oregonconservationstrategy.org/strategy-species/western-purple-martin/' class='ocs-species-link' target='_blank'>Purple Martin</a>",
"137":u"<a href='https://oregonconservationstrategy.org/strategy-species/western-snowy-plover/' class='ocs-species-link' target='_blank'>Western Snowy Plover</a>",
"138":u"<a href='https://oregonconservationstrategy.org/strategy-species/western-snowy-plover/' class='ocs-species-link' target='_blank'>Western Snowy Plover</a>",
"139":u"<a href='https://oregonconservationstrategy.org/strategy-species/white-headed-woodpecker/' class='ocs-species-link' target='_blank'>White-headed Woodpecker</a>",
"140":u"<a href='https://oregonconservationstrategy.org/strategy-species/white-headed-woodpecker/' class='ocs-species-link' target='_blank'>White-headed Woodpecker</a>",
"141":u"<a href='https://oregonconservationstrategy.org/strategy-species/willow-flycatcher/' class='ocs-species-link' target='_blank'>Willow Flycatcher</a>",
"142":u"<a href='https://oregonconservationstrategy.org/strategy-species/willow-flycatcher/' class='ocs-species-link' target='_blank'>Willow Flycatcher</a>",
"143":u"<a href='https://oregonconservationstrategy.org/strategy-species/yellow-breasted-chat/' class='ocs-species-link' target='_blank'>Yellow-Breasted Chat</a>",
"144":u"<a href='https://oregonconservationstrategy.org/strategy-species/yellow-breasted-chat/' class='ocs-species-link' target='_blank'>Yellow-Breasted Chat</a>",
"145":u"<a href='https://oregonconservationstrategy.org/strategy-species/yellow-rail/' class='ocs-species-link' target='_blank'>Yellow Rail</a>",
"146":u"<a href='https://oregonconservationstrategy.org/strategy-species/yellow-rail/' class='ocs-species-link' target='_blank'>Yellow Rail</a>",
"147":u"<a href='https://oregonconservationstrategy.org/strategy-species/american-marten/' class='ocs-species-link' target='_blank'>American Marten</a>",
"148":u"<a href='https://oregonconservationstrategy.org/strategy-species/american-marten/' class='ocs-species-link' target='_blank'>American Marten</a>",
"149":u"<a href='https://oregonconservationstrategy.org/strategy-species/california-myotis/' class='ocs-species-link' target='_blank'>California Myotis</a>",
"150":u"<a href='https://oregonconservationstrategy.org/strategy-species/california-myotis/' class='ocs-species-link' target='_blank'>California Myotis</a>",
"151":u"<a href='https://oregonconservationstrategy.org/strategy-species/columbia-white-tailed-deer/' class='ocs-species-link' target='_blank'>Columbian White-tailed Deer</a>",
"152":u"<a href='https://oregonconservationstrategy.org/strategy-species/columbia-white-tailed-deer/' class='ocs-species-link' target='_blank'>Columbian White-tailed Deer</a>",
"153":u"<a href='https://oregonconservationstrategy.org/strategy-species/fisher/' class='ocs-species-link' target='_blank'>Fisher</a>",
"154":u"<a href='https://oregonconservationstrategy.org/strategy-species/fisher/' class='ocs-species-link' target='_blank'>Fisher</a>",
"155":u"<a href='https://oregonconservationstrategy.org/strategy-species/fringed-myotis/' class='ocs-species-link' target='_blank'>Fringed Myotis</a>",
"156":u"<a href='https://oregonconservationstrategy.org/strategy-species/fringed-myotis/' class='ocs-species-link' target='_blank'>Fringed Myotis</a>",
"157":u"<a href='https://oregonconservationstrategy.org/strategy-species/hoary-bat/' class='ocs-species-link' target='_blank'>Hoary Bat</a>",
"158":u"<a href='https://oregonconservationstrategy.org/strategy-species/hoary-bat/' class='ocs-species-link' target='_blank'>Hoary Bat</a>",
"159":u"<a href='https://oregonconservationstrategy.org/strategy-species/kit-fox/' class='ocs-species-link' target='_blank'>Kit Fox</a>",
"160":u"<a href='https://oregonconservationstrategy.org/strategy-species/kit-fox/' class='ocs-species-link' target='_blank'>Kit Fox</a>",
"161":u"<a href='https://oregonconservationstrategy.org/strategy-species/long-legged-myotis/' class='ocs-species-link' target='_blank'>Long-legged Myotis</a>",
"162":u"<a href='https://oregonconservationstrategy.org/strategy-species/long-legged-myotis/' class='ocs-species-link' target='_blank'>Long-legged Myotis</a>",
"163":u"<a href='https://oregonconservationstrategy.org/strategy-species/pallid-bat/' class='ocs-species-link' target='_blank'>Pallid Bat</a>",
"164":u"<a href='https://oregonconservationstrategy.org/strategy-species/pallid-bat/' class='ocs-species-link' target='_blank'>Pallid Bat</a>",
"165":u"<a href='https://oregonconservationstrategy.org/strategy-species/american-pika/' class='ocs-species-link' target='_blank'>American Pika</a>",
"166":u"<a href='https://oregonconservationstrategy.org/strategy-species/american-pika/' class='ocs-species-link' target='_blank'>American Pika</a>",
"167":u"<a href='https://oregonconservationstrategy.org/strategy-species/pygmy-rabbit/' class='ocs-species-link' target='_blank'>Pygmy Rabbit</a>",
"168":u"<a href='https://oregonconservationstrategy.org/strategy-species/pygmy-rabbit/' class='ocs-species-link' target='_blank'>Pygmy Rabbit</a>",
"169":u"<a href='https://oregonconservationstrategy.org/strategy-species/red-tree-vole/' class='ocs-species-link' target='_blank'>Red Tree Vole</a>",
"170":u"<a href='https://oregonconservationstrategy.org/strategy-species/red-tree-vole/' class='ocs-species-link' target='_blank'>Red Tree Vole</a>",
"171":u"<a href='https://oregonconservationstrategy.org/strategy-species/ringtail/' class='ocs-species-link' target='_blank'>Ringtail</a>",
"172":u"<a href='https://oregonconservationstrategy.org/strategy-species/ringtail/' class='ocs-species-link' target='_blank'>Ringtail</a>",
"173":u"<a href='https://oregonconservationstrategy.org/strategy-species/silver-haired-bat/' class='ocs-species-link' target='_blank'>Silver-haired Bat</a>",
"174":u"<a href='https://oregonconservationstrategy.org/strategy-species/silver-haired-bat/' class='ocs-species-link' target='_blank'>Silver-haired Bat</a>",
"175":u"<a href='https://oregonconservationstrategy.org/strategy-species/spotted-bat/' class='ocs-species-link' target='_blank'>Spotted Bat</a>",
"176":u"<a href='https://oregonconservationstrategy.org/strategy-species/spotted-bat/' class='ocs-species-link' target='_blank'>Spotted Bat</a>",
"177":u"<a href='https://oregonconservationstrategy.org/strategy-species/townsends-big-eared-bat/' class='ocs-species-link' target='_blank'>Townsend's Big-eared Bat</a>",
"178":u"<a href='https://oregonconservationstrategy.org/strategy-species/townsends-big-eared-bat/' class='ocs-species-link' target='_blank'>Townsend's Big-eared Bat</a>",
"179":u"<a href='https://oregonconservationstrategy.org/strategy-species/washington-ground-squirrel/' class='ocs-species-link' target='_blank'>Washington Ground Squirrel</a>",
"180":u"<a href='https://oregonconservationstrategy.org/strategy-species/'washington-ground-squirrel/ class='ocs-species-link' target='_blank'>Washington Ground Squirrel</a>",
"181":u"<a href='https://oregonconservationstrategy.org/strategy-species/western-gray-squirrel/' class='ocs-species-link' target='_blank'>Western Gray Squirrel</a>",
"182":u"<a href='https://oregonconservationstrategy.org/strategy-species/western-gray-squirrel/' class='ocs-species-link' target='_blank'>Western Gray Squirrel</a>",
"183":u"<a href='https://oregonconservationstrategy.org/strategy-species/wolverine/' class='ocs-species-link' target='_blank'>Wolverine</a>",
"184":u"<a href='https://oregonconservationstrategy.org/strategy-species/wolverine/' class='ocs-species-link' target='_blank'>Wolverine</a>",
"185":u"<a href='https://oregonconservationstrategy.org/strategy-species/california-mountain-kingsnake/' class='ocs-species-link' target='_blank'>California Mountain Kingsnake</a>",
"186":u"<a href='https://oregonconservationstrategy.org/strategy-species/california-mountain-kingsnake/' class='ocs-species-link' target='_blank'>California Mountain Kingsnake</a>",
"187":u"<a href='https://oregonconservationstrategy.org/strategy-species/western-pond-turtle/' class='ocs-species-link' target='_blank'>Western Pond Turtle</a>",
"188":u"<a href='https://oregonconservationstrategy.org/strategy-species/western-pond-turtle/' class='ocs-species-link' target='_blank'>Western Pond Turtle</a>",
"189":u"<a href='https://oregonconservationstrategy.org/strategy-species/northern-sagebrush-lizard/' class='ocs-species-link' target='_blank'>Northern Sagebrush Lizard</a>",
"190":u"<a href='https://oregonconservationstrategy.org/strategy-species/northern-sagebrush-lizard/' class='ocs-species-link' target='_blank'>Northern Sagebrush Lizard</a>",
"191":u"<a href='https://oregonconservationstrategy.org/strategy-species/western-painted-turtle/' class='ocs-species-link' target='_blank'>Western Painted Turtle</a>",
"192":u"<a href='https://oregonconservationstrategy.org/strategy-species/western-painted-turtle/' class='ocs-species-link' target='_blank'>Western Painted Turtle</a>",
"193":u"<a href='https://oregonconservationstrategy.org/strategy-species/western-rattlesnake/' class='ocs-species-link' target='_blank'>Western Rattlesnake</a>",
"194":u"<a href='https://oregonconservationstrategy.org/strategy-species/western-rattlesnake/' class='ocs-species-link' target='_blank'>Western Rattlesnake</a>",
"195":u"<a href='https://oregonconservationstrategy.org/strategy-species/sage-sparrow/' class='ocs-species-link' target='_blank'>Sagebrush Sparrow</a>",
"196":u"<a href='https://oregonconservationstrategy.org/strategy-species/sage-sparrow/' class='ocs-species-link' target='_blank'>Sagebrush Sparrow</a>",
"197":u"<a href='https://oregonconservationstrategy.org/strategy-species/trumpeter-swan/' class='ocs-species-link' target='_blank'>Trumpeter Swan</a>",
"198":u"<a href='https://oregonconservationstrategy.org/strategy-species/trumpeter-swan/' class='ocs-species-link' target='_blank'>Trumpeter Swan</a>",
"199":u"<a href='https://oregonconservationstrategy.org/strategy-species/gray-whale/' class='ocs-species-link' target='_blank'>Gray Whale</a>",
"200":u"<a href='https://oregonconservationstrategy.org/strategy-species/gray-whale/' class='ocs-species-link' target='_blank'>Gray Whale</a>",
"201":u"<a href='https://oregonconservationstrategy.org/strategy-species/gray-wolf/' class='ocs-species-link' target='_blank'>Gray Wolf</a>",
"202":u"<a href='https://oregonconservationstrategy.org/strategy-species/gray-wolf/' class='ocs-species-link' target='_blank'>Gray Wolf</a>",
"203":u"<a href='https://oregonconservationstrategy.org/strategy-species/harbor-porpoise/' class='ocs-species-link' target='_blank'>Harbor Porpoise</a>",
"204":u"<a href='https://oregonconservationstrategy.org/strategy-species/harbor-porpoise/' class='ocs-species-link' target='_blank'>Harbor Porpoise</a>",
"205":u"<a href='https://oregonconservationstrategy.org/strategy-species/southern-resident-killer-whale-dps/' class='ocs-species-link' target='_blank'>Killer Whale</a>",
"206":u"<a href='https://oregonconservationstrategy.org/strategy-species/southern-resident-killer-whale-dps/' class='ocs-species-link' target='_blank'>Killer Whale</a>",
"207":u"<a href='https://oregonconservationstrategy.org/strategy-species/northern-elephant-seal/' class='ocs-species-link' target='_blank'>Northern Elephant Seal</a>",
"208":u"<a href='https://oregonconservationstrategy.org/strategy-species/northern-elephant-seal/' class='ocs-species-link' target='_blank'>Northern Elephant Seal</a>",
"209":u"<a href='https://oregonconservationstrategy.org/strategy-species/pacific-harbor-seal/' class='ocs-species-link' target='_blank'>Pacific Harbor Seal</a>",
"210":u"<a href='https://oregonconservationstrategy.org/strategy-species/pacific-harbor-seal/' class='ocs-species-link' target='_blank'>Pacific Harbor Seal</a>",
"211":u"<a href='https://oregonconservationstrategy.org/strategy-species/rocky-mountain-bighorn-sheep/' class='ocs-species-link' target='_blank'>Rocky Mountain Bighorn Sheep</a>",
"212":u"<a href='https://oregonconservationstrategy.org/strategy-species/rocky-mountain-bighorn-sheep/' class='ocs-species-link' target='_blank'>Rocky Mountain Bighorn Sheep</a>",
"213":u"<a href='https://oregonconservationstrategy.org/strategy-species/sierra-nevada-red-fox/' class='ocs-species-link' target='_blank'>Sierra Nevada Red Fox</a>",
"214":u"<a href='https://oregonconservationstrategy.org/strategy-species/sierra-nevada-red-fox/' class='ocs-species-link' target='_blank'>Sierra Nevada Red Fox</a>",
"215":u"<a href='https://oregonconservationstrategy.org/strategy-species/steller-sea-lion/' class='ocs-species-link' target='_blank'>Steller Sea Lion</a>",
"216":u"<a href='https://oregonconservationstrategy.org/strategy-species/steller-sea-lion/' class='ocs-species-link' target='_blank'>Steller Sea Lion</a>",
"217":u"<a href='https://oregonconservationstrategy.org/strategy-species/white-tailed-jackrabbit/' class='ocs-species-link' target='_blank'>White-tailed Jackrabbit</a>",
"218":u"<a href='https://oregonconservationstrategy.org/strategy-species/white-tailed-jackrabbit/' class='ocs-species-link' target='_blank'>White-tailed Jackrabbit</a>",
"219":u"<a href='https://oregonconservationstrategy.org/strategy-species/archimedes-springsnail/' class='ocs-species-link' target='_blank'>Archimedes Springsnail</a>",
"220":u"<a href='https://oregonconservationstrategy.org/strategy-species/bellers-ground-beetle/' class='ocs-species-link' target='_blank'>Beller's Ground Beetle</a>",
"221":u"<a href='https://oregonconservationstrategy.org/strategy-species/black-petaltail/' class='ocs-species-link' target='_blank'>Black Petaltail</a>",
"222":u"<a href='https://oregonconservationstrategy.org/strategy-species/blue-mud-shrimp/' class='ocs-species-link' target='_blank'>Blue Mud Shrimp</a>",
"223":u"<a href='https://oregonconservationstrategy.org/strategy-species/borax-lake-ramshorn/' class='ocs-species-link' target='_blank'>Borax Lake Ramshorn</a>",
"224":u"<a href='https://oregonconservationstrategy.org/strategy-species/bulb-juga/' class='ocs-species-link' target='_blank'>Bulb Juga</a>",
"225":u"<a href='https://oregonconservationstrategy.org/strategy-species/california-floater-freshwater-mussel/' class='ocs-species-link' target='_blank'>California Floater Freshwater Mussel</a>",
"226":u"<a href='https://oregonconservationstrategy.org/strategy-species/california-mussel/' class='ocs-species-link' target='_blank'>California Mussel</a>",
"227":u"<a href='https://oregonconservationstrategy.org/strategy-species/columbia-clubtail/' class='ocs-species-link' target='_blank'>Columbia Clubtail</a>",
"228":u"<a href='https://oregonconservationstrategy.org/strategy-species/columbia-gorge-caddisfly/' class='ocs-species-link' target='_blank'>Columbia Gorge Caddisfly</a>",
"229":u"<a href='https://oregonconservationstrategy.org/strategy-species/columbia-gorge-hesperian/' class='ocs-species-link' target='_blank'>Columbia Gorge Hesperian</a>",
"230":u"<a href='https://oregonconservationstrategy.org/strategy-species/crater-lake-tightcoil/' class='ocs-species-link' target='_blank'>Crater Lake Tightcoil</a>",
"231":u"<a href='https://oregonconservationstrategy.org/strategy-species/dalls-ramshorn/' class='ocs-species-link' target='_blank'>Dall's Ramshorn</a>",
"232":u"<a href='https://oregonconservationstrategy.org/strategy-species/dalles-mountainsnail/' class='ocs-species-link' target='_blank'>Dalles Mountainsnail</a>",
"233":u"<a href='https://oregonconservationstrategy.org/strategy-species/dungeness-crab/' class='ocs-species-link' target='_blank'>Dungeness Crab</a>",
"234":u"<a href='https://oregonconservationstrategy.org/strategy-species/fenders-blue-butterfly/' class='ocs-species-link' target='_blank'>Fender's Blue Butterfly</a>",
"235":u"<a href='https://oregonconservationstrategy.org/strategy-species/flat-abalone/' class='ocs-species-link' target='_blank'>Flat Abalone</a>",
"236":u"<a href='https://oregonconservationstrategy.org/strategy-species/franklins-bumble-bee/' class='ocs-species-link' target='_blank'>Franklin's Bumble Bee</a>",
"237":u"<a href='https://oregonconservationstrategy.org/strategy-species/great-basin-ramshorn/' class='ocs-species-link' target='_blank'>Great Basin Ramshorn</a>",
"238":u"<a href='https://oregonconservationstrategy.org/strategy-species/great-spangled-fritillary/' class='ocs-species-link' target='_blank'>Great Spangled Fritillary</a>",
"239":u"<a href='https://oregonconservationstrategy.org/strategy-species/highcap-lanx/' class='ocs-species-link' target='_blank'>Highcap Lanx</a>",
"240":u"<a href='https://oregonconservationstrategy.org/strategy-species/hoary-elfin-butterfly/' class='ocs-species-link' target='_blank'>Hoary Elfin Butterfly</a>",
"241":u"<a href='https://oregonconservationstrategy.org/strategy-species/insular-blue-butterfly/' class='ocs-species-link' target='_blank'>Insular Blue Butterfly</a>",
"242":u"<a href='https://oregonconservationstrategy.org/strategy-species/klamath-ramshorn/' class='ocs-species-link' target='_blank'>Klamath Ramshorn</a>",
"243":u"<a href='https://oregonconservationstrategy.org/strategy-species/leonas-little-blue-butterfly/' class='ocs-species-link' target='_blank'>Leona's Little Blue Butterfly</a>",
"244":u"<a href='https://oregonconservationstrategy.org/strategy-species/lined-ramshorn/' class='ocs-species-link' target='_blank'>Lined Ramshorn</a>",
"245":u"<a href='https://oregonconservationstrategy.org/strategy-species/malheur-cave-amphipod/' class='ocs-species-link' target='_blank'>Malheur Cave Amphipod</a>",
"246":u"<a href='https://oregonconservationstrategy.org/strategy-species/malheur-cave-flatworm/' class='ocs-species-link' target='_blank'>Malheur Cave Flatworm </a>",
"247":u"<a href='https://oregonconservationstrategy.org/strategy-species/malheur-cave-springtail/' class='ocs-species-link' target='_blank'>Malheur Cave Springtail</a>",
"248":u"<a href='https://oregonconservationstrategy.org/strategy-species/malheur-isopod/' class='ocs-species-link' target='_blank'>Malheur Isopod</a>",
"249":u"<a href='https://oregonconservationstrategy.org/strategy-species/malheur-pseudoscorpian/' class='ocs-species-link' target='_blank'>Malheur Pseudoscorpian</a>",
"250":u"<a href='https://oregonconservationstrategy.org/strategy-species/mardon-skipper-butterfly/' class='ocs-species-link' target='_blank'>Mardon Skipper Butterfly</a>",
"251":u"<a href='https://oregonconservationstrategy.org/strategy-species/monarch-butterfly/' class='ocs-species-link' target='_blank'>Monarch Butterfly</a>",
"252":u"<a href='https://oregonconservationstrategy.org/strategy-species/native-littleneck-clam/' class='ocs-species-link' target='_blank'>Native Littleneck Clam</a>",
"253":u"<a href='https://oregonconservationstrategy.org/strategy-species/ochre-sea-star/' class='ocs-species-link' target='_blank'>Ochre Sea Star</a>",
"254":u"<a href='https://oregonconservationstrategy.org/strategy-species/olympia-oyster/' class='ocs-species-link' target='_blank'>Olympia Oyster</a>",
"255":u"<a href='https://oregonconservationstrategy.org/strategy-species/oregon-shoulderband/' class='ocs-species-link' target='_blank'>Oregon Shoulderband</a>",
"256":u"<a href='https://oregonconservationstrategy.org/strategy-species/oregon-silverspot-butterfly/' class='ocs-species-link' target='_blank'>Oregon Silverspot Butterfly</a>",
"257":u"<a href='https://oregonconservationstrategy.org/strategy-species/pacific-giant-octopus/' class='ocs-species-link' target='_blank'>Pacific Giant Octopus</a>",
"258":u"<a href='https://oregonconservationstrategy.org/strategy-species/pacific-walker/' class='ocs-species-link' target='_blank'>Pacific Walker</a>",
"259":u"<a href='https://oregonconservationstrategy.org/strategy-species/purple-lipped-juga/' class='ocs-species-link' target='_blank'>Purple-lipped Juga</a>",
"260":u"<a href='https://oregonconservationstrategy.org/strategy-species/purple-sea-urchin/' class='ocs-species-link' target='_blank'>Purple Sea Urchin</a>",
"261":u"<a href='https://oregonconservationstrategy.org/strategy-species/razor-clam/' class='ocs-species-link' target='_blank'>Razor Clam</a>",
"262":u"<a href='https://oregonconservationstrategy.org/strategy-species/red-abalone/' class='ocs-species-link' target='_blank'>Red Abalone</a>",
"263":u"<a href='https://oregonconservationstrategy.org/strategy-species/red-sea-urchin/' class='ocs-species-link' target='_blank'>Red Sea Urchin</a>",
"264":u"<a href='https://oregonconservationstrategy.org/strategy-species/robust-walker/' class='ocs-species-link' target='_blank'>Robust Walker</a>",
"265":u"<a href='https://oregonconservationstrategy.org/strategy-species/rock-scallop/' class='ocs-species-link' target='_blank'>Rock Scallop</a>",
"266":u"<a href='https://oregonconservationstrategy.org/strategy-species/rotund-lanx/' class='ocs-species-link' target='_blank'>Rotund Lanx</a>",
"267":u"<a href='https://oregonconservationstrategy.org/strategy-species/scale-lanx/' class='ocs-species-link' target='_blank'>Scale Lanx</a>",
"268":u"<a href='https://oregonconservationstrategy.org/strategy-species/scalloped-juga/' class='ocs-species-link' target='_blank'>Scalloped Juga</a>",
"269":u"<a href='https://oregonconservationstrategy.org/strategy-species/shortface-lanx/' class='ocs-species-link' target='_blank'>Shortface Lanx</a>",
"270":u"<a href='https://oregonconservationstrategy.org/strategy-species/sinitsin-ramshorn/' class='ocs-species-link' target='_blank'>Sinitsin Ramshorn</a>",
"271":u"<a href='https://oregonconservationstrategy.org/strategy-species/siskiyou-hesperian/' class='ocs-species-link' target='_blank'>Siskiyou Hesperian</a>",
"272":u"<a href='https://oregonconservationstrategy.org/strategy-species/sisters-hesperian/' class='ocs-species-link' target='_blank'>Sister's Hesperian</a>",
"273":u"<a href='https://oregonconservationstrategy.org/strategy-species/stonefly/' class='ocs-species-link' target='_blank'>Stonefly</a>",
"274":u"<a href='https://oregonconservationstrategy.org/strategy-species/sunflower-star/' class='ocs-species-link' target='_blank'>Sunflower Star</a>",
"275":u"<a href='https://oregonconservationstrategy.org/strategy-species/taylors-checkerspot-butterfly/' class='ocs-species-link' target='_blank'>Taylor's Checkerspot Butterfly</a>",
"276":u"<a href='https://oregonconservationstrategy.org/strategy-species/turban-pebblesnail/' class='ocs-species-link' target='_blank'>Turban Pebblesnail</a>",
"277":u"<a href='https://oregonconservationstrategy.org/strategy-species/vernal-pool-fairy-shrimp/' class='ocs-species-link' target='_blank'>Vernal Pool Fairy Shrimp</a>",
"278":u"<a href='https://oregonconservationstrategy.org/strategy-species/western-bumble-bee/' class='ocs-species-link' target='_blank'>Western Bumble Bee</a>",
"279":u"<a href='https://oregonconservationstrategy.org/strategy-species/western-ridged-mussel/' class='ocs-species-link' target='_blank'>Western Ridged Mussel</a>",
"280":u"<a href='https://oregonconservationstrategy.org/strategy-species/winged-floater-freshwater-mussel/' class='ocs-species-link' target='_blank'>Winged Floater Freshwater Mussel</a>",
"281":u"<a href='https://oregonconservationstrategy.org/strategy-species/applegates-milkvetch/' class='ocs-species-link' target='_blank'>Applegate's Milkvetch</a>",
"282":u"<a href='https://oregonconservationstrategy.org/strategy-species/arrow-leaf-thelypody/' class='ocs-species-link' target='_blank'>Arrow-leaf Thelypody</a>",
"283":u"<a href='https://oregonconservationstrategy.org/strategy-species/big-flowered-wooly-meadowfoam/' class='ocs-species-link' target='_blank'>Big-flowered Wooly Meadowfoam</a>",
"284":u"<a href='https://oregonconservationstrategy.org/strategy-species/boggs-lake-hedge-hyssop/' class='ocs-species-link' target='_blank'>Boggs Lake Hedge Hyssop</a>",
"285":u"<a href='https://oregonconservationstrategy.org/strategy-species/bradshaws-desert-parsley/' class='ocs-species-link' target='_blank'>Bradshaw's Desert Parsley</a>",
"286":u"<a href='https://oregonconservationstrategy.org/strategy-species/bull-kelp/' class='ocs-species-link' target='_blank'>Bull Kelp</a>",
"287":u"<a href='https://oregonconservationstrategy.org/strategy-species/cascade-head-catchfly/' class='ocs-species-link' target='_blank'>Cascade Head Catchfly</a>",
"288":u"<a href='https://oregonconservationstrategy.org/strategy-species/coast-range-fawn-lily/' class='ocs-species-link' target='_blank'>Coast Range Fawn Lily</a>",
"289":u"<a href='https://oregonconservationstrategy.org/strategy-species/cooks-desert-parsley/' class='ocs-species-link' target='_blank'>Cook's Desert Parsley</a>",
"290":u"<a href='https://oregonconservationstrategy.org/strategy-species/crinite-mariposa-lily/' class='ocs-species-link' target='_blank'>Crinite Mariposa Lily</a>",
"291":u"<a href='https://oregonconservationstrategy.org/strategy-species/cronquists-stickseed/' class='ocs-species-link' target='_blank'>Cronquist's Stickseed</a>",
"292":u"<a href='https://oregonconservationstrategy.org/strategy-species/crosbys-buckwheat/' class='ocs-species-link' target='_blank'>Crosby's Buckwheat</a>",
"293":u"<a href='https://oregonconservationstrategy.org/strategy-species/cusicks-lupine/' class='ocs-species-link' target='_blank'>Cusick's Lupine</a>",
"294":u"<a href='https://oregonconservationstrategy.org/strategy-species/davis-peppergrass/' class='ocs-species-link' target='_blank'>Davis' Peppergrass</a>",
"295":u"<a href='https://oregonconservationstrategy.org/strategy-species/dwarf-meadowfoam/' class='ocs-species-link' target='_blank'>Dwarf Meadowfoam</a>",
"296":u"<a href='https://oregonconservationstrategy.org/strategy-species/gentners-fritillary/' class='ocs-species-link' target='_blank'>Gentner's Fritillary</a>",
"297":u"<a href='https://oregonconservationstrategy.org/strategy-species/golden-buckwheat/' class='ocs-species-link' target='_blank'>Golden Buckwheat</a>",
"298":u"<a href='https://oregonconservationstrategy.org/strategy-species/golden-paintbrush/' class='ocs-species-link' target='_blank'>Golden Paintbrush</a>",
"299":u"<a href='https://oregonconservationstrategy.org/strategy-species/greenmans-desert-parsley/' class='ocs-species-link' target='_blank'>Greenman's Desert Parsley</a>",
"300":u"<a href='https://oregonconservationstrategy.org/strategy-species/grimy-ivesia/' class='ocs-species-link' target='_blank'>Grimy Ivesia</a>",
"301":u"<a href='https://oregonconservationstrategy.org/strategy-species/howells-mariposa-lily/' class='ocs-species-link' target='_blank'>Howell's Mariposa Lily</a>",
"302":u"<a href='https://oregonconservationstrategy.org/strategy-species/howells-microseris/' class='ocs-species-link' target='_blank'>Howell's Microseris</a>",
"303":u"<a href='https://oregonconservationstrategy.org/strategy-species/howells-spectacular-thelypody/' class='ocs-species-link' target='_blank'>Howell's Spectacular Thelypody</a>",
"304":u"<a href='https://oregonconservationstrategy.org/strategy-species/howellia/' class='ocs-species-link' target='_blank'>Howellia</a>",
"305":u"<a href='https://oregonconservationstrategy.org/strategy-species/kincaids-lupine/' class='ocs-species-link' target='_blank'>Kincaid's Lupine</a>",
"306":u"<a href='https://oregonconservationstrategy.org/strategy-species/large-flowered-rush-lily/' class='ocs-species-link' target='_blank'>Large-flowered Rush Lily</a>",
"307":u"<a href='https://oregonconservationstrategy.org/strategy-species/lawrences-milkvetch/' class='ocs-species-link' target='_blank'>Lawrence's Milkvetch</a>",
"308":u"<a href='https://oregonconservationstrategy.org/strategy-species/macfarlanes-four-oclock/' class='ocs-species-link' target='_blank'>Macfarlane's Four o'Clock</a>",
"309":u"<a href='https://oregonconservationstrategy.org/strategy-species/malheur-valley-fiddleneck/' class='ocs-species-link' target='_blank'>Malheur Valley Fiddleneck</a>",
"310":u"<a href='https://oregonconservationstrategy.org/strategy-species/malheur-wire-lettuce/' class='ocs-species-link' target='_blank'>Malheur Wire-lettuce</a>",
"311":u"<a href='https://oregonconservationstrategy.org/strategy-species/mcdonalds-rockcress/' class='ocs-species-link' target='_blank'>McDonald's Rockcress</a>",
"312":u"<a href='https://oregonconservationstrategy.org/strategy-species/mulfords-milkvetch/' class='ocs-species-link' target='_blank'>Mulford's Milkvetch</a>",
"313":u"<a href='https://oregonconservationstrategy.org/strategy-species/native-eelgrass/' class='ocs-species-link' target='_blank'>Native Eelgrass</a>",
"314":u"<a href='https://oregonconservationstrategy.org/strategy-species/nelsons-checkermallow/' class='ocs-species-link' target='_blank'>Nelson's Checkermallow</a>",
"315":u"<a href='https://oregonconservationstrategy.org/strategy-species/northern-wormwood/' class='ocs-species-link' target='_blank'>Northern Wormwood</a>",
"316":u"<a href='https://oregonconservationstrategy.org/strategy-species/oregon-semaphore-grass/' class='ocs-species-link' target='_blank'>Oregon Semaphore Grass</a>",
"317":u"<a href='https://oregonconservationstrategy.org/strategy-species/owyhee-clover/' class='ocs-species-link' target='_blank'>Owyhee Clover</a>",
"318":u"<a href='https://oregonconservationstrategy.org/strategy-species/packards-mentzelia/' class='ocs-species-link' target='_blank'>Packard's Mentzelia</a>",
"319":u"<a href='https://oregonconservationstrategy.org/strategy-species/peacock-larkspur/' class='ocs-species-link' target='_blank'>Peacock Larkspur</a>",
"320":u"<a href='https://oregonconservationstrategy.org/strategy-species/pecks-milkvetch/' class='ocs-species-link' target='_blank'>Peck's Milkvetch</a>",
"321":u"<a href='https://oregonconservationstrategy.org/strategy-species/pink-sandverbena/' class='ocs-species-link' target='_blank'>Pink Sandverbena</a>",
"322":u"<a href='https://oregonconservationstrategy.org/strategy-species/point-reyes-birds-beak/' class='ocs-species-link' target='_blank'>Point Reyes Bird's-beak</a>",
"323":u"<a href='https://oregonconservationstrategy.org/strategy-species/pumice-grape-fern/' class='ocs-species-link' target='_blank'>Pumice Grape-fern</a>",
"324":u"<a href='https://oregonconservationstrategy.org/strategy-species/red-fruited-lomatium/' class='ocs-species-link' target='_blank'>Red-fruited Lomatium</a>",
"325":u"<a href='https://oregonconservationstrategy.org/strategy-species/rough-popcornflower/' class='ocs-species-link' target='_blank'>Rough Popcornflower</a>",
"326":u"<a href='https://oregonconservationstrategy.org/strategy-species/sea-palm/' class='ocs-species-link' target='_blank'>Sea Palm</a>",
"327":u"<a href='https://oregonconservationstrategy.org/strategy-species/sexton-mountain-mariposa-lily/' class='ocs-species-link' target='_blank'>Sexton Mountain Mariposa Lily</a>",
"328":u"<a href='https://oregonconservationstrategy.org/strategy-species/shiny-fruited-allocarya/' class='ocs-species-link' target='_blank'>Shiny-fruited Allocarya</a>",
"329":u"<a href='https://oregonconservationstrategy.org/strategy-species/silvery-phacelia/' class='ocs-species-link' target='_blank'>Silvery Phacelia</a>",
"330":u"<a href='https://oregonconservationstrategy.org/strategy-species/smooth-mentzelia/' class='ocs-species-link' target='_blank'>Smooth Mentzelia</a>",
"331":u"<a href='https://oregonconservationstrategy.org/strategy-species/snake-river-goldenweed/' class='ocs-species-link' target='_blank'>Snake River Goldenweed</a>",
"332":u"<a href='https://oregonconservationstrategy.org/strategy-species/south-fork-john-day-milkvetch/' class='ocs-species-link' target='_blank'>South Fork John Day Milkvetch</a>",
"333":u"<a href='https://oregonconservationstrategy.org/strategy-species/spaldings-campion/' class='ocs-species-link' target='_blank'>Spalding's Campion</a>",
"334":u"<a href='https://oregonconservationstrategy.org/strategy-species/sterile-milkvetch/' class='ocs-species-link' target='_blank'>Sterile Milkvetch</a>",
"335":u"<a href='https://oregonconservationstrategy.org/strategy-species/surf-grass/' class='ocs-species-link' target='_blank'>Surf Grass</a>",
"336":u"<a href='https://oregonconservationstrategy.org/strategy-species/tygh-valley-milkvetch/' class='ocs-species-link' target='_blank'>Tygh Valley Milkvetch</a>",
"337":u"<a href='https://oregonconservationstrategy.org/strategy-species/umpqua-mariposa-lily/' class='ocs-species-link' target='_blank'>Umpqua Mariposa Lily</a>",
"338":u"<a href='https://oregonconservationstrategy.org/strategy-species/wayside-aster/' class='ocs-species-link' target='_blank'>Wayside Aster</a>",
"339":u"<a href='https://oregonconservationstrategy.org/strategy-species/western-lily/' class='ocs-species-link' target='_blank'>Western Lily</a>",
"340":u"<a href='https://oregonconservationstrategy.org/strategy-species/white-topped-aster/' class='ocs-species-link' target='_blank'>White-topped Aster</a>",
"341":u"<a href='https://oregonconservationstrategy.org/strategy-species/white-rock-larkspur/' class='ocs-species-link' target='_blank'>White Rock Larkspur</a>",
"342":u"<a href='https://oregonconservationstrategy.org/strategy-species/willamette-daisy/' class='ocs-species-link' target='_blank'>Willamette Daisy</a>",
"343":u"<a href='https://oregonconservationstrategy.org/strategy-species/wolfs-evening-primrose/' class='ocs-species-link' target='_blank'>Wolf's Evening Primrose</a>",
"344":u"<a href='https://oregonconservationstrategy.org/strategy-species/alvord-lake-chub/' class='ocs-species-link' target='_blank'>Alvord Lake Chub</a>",
"345":u"<a href='https://oregonconservationstrategy.org/strategy-species/borax-lake-chub/' class='ocs-species-link' target='_blank'>Borax Lake Chub</a>",
"346":u"<a href='https://oregonconservationstrategy.org/strategy-species/bull-trout/' class='ocs-species-link' target='_blank'>Bull Trout</a>",
"347":u"<a href='https://oregonconservationstrategy.org/strategy-species/chinook-salmon/' class='ocs-species-link' target='_blank'>Chinook Salmon - Fall Run</a>",
"348":u"<a href='https://oregonconservationstrategy.org/strategy-species/chinook-salmon/' class='ocs-species-link' target='_blank'>Chinook Salmon - Spring Run</a>",
"349":u"<a href='https://oregonconservationstrategy.org/strategy-species/chum-salmon/' class='ocs-species-link' target='_blank'>Chum Salmon</a>",
"350":u"<a href='https://oregonconservationstrategy.org/strategy-species/coastal-cutthroat-trout/' class='ocs-species-link' target='_blank'>Coastal Cutthroat Trout</a>",
"351":u"<a href='https://oregonconservationstrategy.org/strategy-species/coho-salmon/' class='ocs-species-link' target='_blank'>Coho Salmon</a>",
"352":u"<a href='https://oregonconservationstrategy.org/strategy-species/foskett-spring-speckled-dace/' class='ocs-species-link' target='_blank'>Foskett Spring Speckled Dace</a>",
"353":u"<a href='https://oregonconservationstrategy.org/strategy-species/goose-lake-sucker/' class='ocs-species-link' target='_blank'>Goose Lake Sucker</a>",
"354":u"<a href='https://oregonconservationstrategy.org/strategy-species/hutton-spring-tui-chub-group/' class='ocs-species-link' target='_blank'>Hutton Springs Tui Chub</a>",
"355":u"<a href='https://oregonconservationstrategy.org/strategy-species/lahontan-cutthroat-trout/' class='ocs-species-link' target='_blank'>Lahontan Cutthroat Trout</a>",
"356":u"<a href='https://oregonconservationstrategy.org/strategy-species/lost-river-sucker/' class='ocs-species-link' target='_blank'>Lost River Sucker</a>",
"357":u"<a href='https://oregonconservationstrategy.org/strategy-species/miller-lake-lamprey/' class='ocs-species-link' target='_blank'>Miller Lake Lamprey</a>",
"358":u"<a href='https://oregonconservationstrategy.org/strategy-species/modoc-sucker/' class='ocs-species-link' target='_blank'>Modoc Sucker</a>",
"359":u"<a href='https://oregonconservationstrategy.org/strategy-species/oregon-chub/' class='ocs-species-link' target='_blank'>Oregon Chub</a>",
"360":u"<a href='https://oregonconservationstrategy.org/strategy-species/pit-sculpin/' class='ocs-species-link' target='_blank'>Pit Sculpin</a>",
"361":u"<a href='https://oregonconservationstrategy.org/strategy-species/shortnose-sucker/' class='ocs-species-link' target='_blank'>Shortnose Sucker</a>",
"362":u"<a href='https://oregonconservationstrategy.org/strategy-species/steelhead-rainbow-redband-trout/' class='ocs-species-link' target='_blank'>Summer Steelhead / Coastal Rainbow Trout / Columbia Basin Redband Trout</a>",
"363":u"<a href='https://oregonconservationstrategy.org/strategy-species/' class='ocs-species-link' target='_blank'>Oregon Basin Redband Trout</a>", # species page slug unconfirmed; great-basin-redband-trout/ ?
"364":u"<a href='https://oregonconservationstrategy.org/strategy-species/steelhead-rainbow-redband-trout/' class='ocs-species-link' target='_blank'>Winter Steelhead / Coastal Rainbow Trout</a>",
"365":u"<a href='https://oregonconservationstrategy.org/strategy-species/warner-sucker/' class='ocs-species-link' target='_blank'>Warner Sucker</a>",
"366":u"<a href='https://oregonconservationstrategy.org/strategy-species/westslope-cutthroat-trout/' class='ocs-species-link' target='_blank'>Westslope Cutthroat Trout</a>",
"406":u"<a href='https://oregonconservationstrategy.org/strategy-habitat/aspen-woodlands/' class='ocs-species-link' target='_blank'>Aspen Woodlands</a>",
"407":u"<a href='https://oregonconservationstrategy.org/strategy-habitat/coastal-dunes/' class='ocs-species-link' target='_blank'>Coastal Dunes</a>",
"408":u"<a href='https://oregonconservationstrategy.org/strategy-habitat/estuaries/' class='ocs-species-link' target='_blank'>Estuaries</a>",
"409":u"<a href='https://oregonconservationstrategy.org/strategy-habitat/riparian-habitats-and-flowing-water/' class='ocs-species-link' target='_blank'>Flowing Water and Riparian Habitats</a>",
"410":u"<a href='https://oregonconservationstrategy.org/strategy-habitat/grasslands/' class='ocs-species-link' target='_blank'>Grasslands</a>",
"411":u"<a href='https://oregonconservationstrategy.org/strategy-habitat/late-successional-mixed-conifer-forests/' class='ocs-species-link' target='_blank'>Late Successional Mixed Conifer Forests</a>",
"412":u"<a href='https://oregonconservationstrategy.org/strategy-habitat/natural-lakes/' class='ocs-species-link' target='_blank'>Natural Lakes</a>",
"413":u"<a href='https://oregonconservationstrategy.org/strategy-habitat/oak-woodlands/' class='ocs-species-link' target='_blank'>Oak Woodlands</a>",
"414":u"<a href='https://oregonconservationstrategy.org/strategy-habitat/ponderosa-pine-woodlands/' class='ocs-species-link' target='_blank'>Ponderosa Pine Woodlands</a>",
"415":u"<a href='https://oregonconservationstrategy.org/strategy-habitat/sagebrush-habitats/' class='ocs-species-link' target='_blank'>Sagebrush Habitats</a>",
"416":u"<a href='https://oregonconservationstrategy.org/strategy-habitat/wetlands/' class='ocs-species-link' target='_blank'>Wetlands</a>",
"367":u"<a href='https://oregonconservationstrategy.org/strategy-species/big-skate/' class='ocs-species-link' target='_blank'>Big Skate</a>",
"368":u"<a href='https://oregonconservationstrategy.org/strategy-species/black-rockfish/' class='ocs-species-link' target='_blank'>Black Rockfish</a>",
"369":u"<a href='https://oregonconservationstrategy.org/strategy-species/blue-rockfish/' class='ocs-species-link' target='_blank'>Blue Rockfish</a>",
"370":u"<a href='https://oregonconservationstrategy.org/strategy-species/brown-rockfish/' class='ocs-species-link' target='_blank'>Brown Rockfish</a>",
"371":u"<a href='https://oregonconservationstrategy.org/strategy-species/cabezon/' class='ocs-species-link' target='_blank'>Cabezon</a>",
"372":u"<a href='https://oregonconservationstrategy.org/strategy-species/canary-rockfish/' class='ocs-species-link' target='_blank'>Canary Rockfish</a>",
"373":u"<a href='https://oregonconservationstrategy.org/strategy-species/china-rockfish/' class='ocs-species-link' target='_blank'>China Rockfish</a>",
"374":u"<a href='https://oregonconservationstrategy.org/strategy-species/copper-rockfish/' class='ocs-species-link' target='_blank'>Copper Rockfish</a>",
"375":u"<a href='https://oregonconservationstrategy.org/strategy-species/deacon-rockfish/' class='ocs-species-link' target='_blank'>Deacon Rockfish</a>",
"376":u"<a href='https://oregonconservationstrategy.org/strategy-species/eulachon-southern-dps/' class='ocs-species-link' target='_blank'>Eulachon</a>",
"377":u"<a href='https://oregonconservationstrategy.org/strategy-species/grass-rockfish/' class='ocs-species-link' target='_blank'>Grass Rockfish</a>",
"378":u"<a href='https://oregonconservationstrategy.org/strategy-species/green-sturgeon/' class='ocs-species-link' target='_blank'>Green Sturgeon</a>",
"379":u"<a href='https://oregonconservationstrategy.org/strategy-species/kelp-greenling-2/' class='ocs-species-link' target='_blank'>Kelp Greenling</a>",
"380":u"<a href='https://oregonconservationstrategy.org/strategy-species/lingcod/' class='ocs-species-link' target='_blank'>Lingcod</a>",
"381":u"<a href='https://oregonconservationstrategy.org/strategy-species/longfin-smelt/' class='ocs-species-link' target='_blank'>Longfin Smelt</a>",
"382":u"<a href='https://oregonconservationstrategy.org/strategy-species/millicoma-dace/' class='ocs-species-link' target='_blank'>Millicoma Dace</a>",
"383":u"<a href='https://oregonconservationstrategy.org/strategy-species/northern-anchovy/' class='ocs-species-link' target='_blank'>Northern Anchovy</a>",
"384":u"<a href='https://oregonconservationstrategy.org/strategy-species/pacific-herring/' class='ocs-species-link' target='_blank'>Pacific Herring</a>",
"385":u"<a href='https://oregonconservationstrategy.org/strategy-species/pacific-lamprey/' class='ocs-species-link' target='_blank'>Pacific Lamprey</a>",
"386":u"<a href='https://oregonconservationstrategy.org/strategy-species/pacific-sand-lance/' class='ocs-species-link' target='_blank'>Pacific Sand Lance</a>",
"387":u"<a href='https://oregonconservationstrategy.org/strategy-species/pile-perch/' class='ocs-species-link' target='_blank'>Pile Perch</a>",
"388":u"<a href='https://oregonconservationstrategy.org/strategy-species/quillback-rockfish/' class='ocs-species-link' target='_blank'>Quillback Rockfish</a>",
"389":u"<a href='https://oregonconservationstrategy.org/strategy-species/redtail-surfperch/' class='ocs-species-link' target='_blank'>Redtail Surfperch</a>",
"390":u"<a href='https://oregonconservationstrategy.org/strategy-species/rock-greenling/' class='ocs-species-link' target='_blank'>Rock Greenling</a>",
"391":u"<a href='https://oregonconservationstrategy.org/strategy-species/shiner-perch/' class='ocs-species-link' target='_blank'>Shiner Perch</a>",
"392":u"<a href='https://oregonconservationstrategy.org/strategy-species/spiny-dogfish/' class='ocs-species-link' target='_blank'>Spiny Dogfish</a>",
"393":u"<a href='https://oregonconservationstrategy.org/strategy-species/starry-flounder/' class='ocs-species-link' target='_blank'>Starry Flounder</a>",
"394":u"<a href='https://oregonconservationstrategy.org/strategy-species/striped-perch/' class='ocs-species-link' target='_blank'>Striped Perch</a>",
"395":u"<a href='https://oregonconservationstrategy.org/strategy-species/surf-smelt/' class='ocs-species-link' target='_blank'>Surf Smelt</a>",
"396":u"<a href='https://oregonconservationstrategy.org/strategy-species/tiger-rockfish/' class='ocs-species-link' target='_blank'>Tiger Rockfish</a>",
"397":u"<a href='https://oregonconservationstrategy.org/strategy-species/topsmelt/' class='ocs-species-link' target='_blank'>Topsmelt</a>",
"398":u"<a href='https://oregonconservationstrategy.org/strategy-species/umpqua-chub/' class='ocs-species-link' target='_blank'>Umpqua Chub</a>",
"399":u"<a href='https://oregonconservationstrategy.org/strategy-species/vermilion-rockfish/' class='ocs-species-link' target='_blank'>Vermilion Rockfish</a>",
"400":u"<a href='https://oregonconservationstrategy.org/strategy-species/western-brook-lamprey/' class='ocs-species-link' target='_blank'>Western Brook Lamprey</a>",
"401":u"<a href='https://oregonconservationstrategy.org/strategy-species/western-river-lamprey/' class='ocs-species-link' target='_blank'>Western River Lamprey</a>",
"402":u"<a href='https://oregonconservationstrategy.org/strategy-species/white-sturgeon/' class='ocs-species-link' target='_blank'>White Sturgeon</a>",
"403":u"<a href='https://oregonconservationstrategy.org/strategy-species/wolf-eel/' class='ocs-species-link' target='_blank'>Wolf-eel</a>",
"404":u"<a href='https://oregonconservationstrategy.org/strategy-species/yelloweye-rockfish/' class='ocs-species-link' target='_blank'>Yelloweye Rockfish</a>",
"405":u"<a href='https://oregonconservationstrategy.org/strategy-species/yellowtail-rockfish/' class='ocs-species-link' target='_blank'>Yellowtail Rockfish</a>",
}
COA_LOOKUP = {
"Alkali Lake, COA 190": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/alkali-lake/' target='_blank' class='ocs-coa-link'>Alkali Lake, COA 190</a>",
"Alsea Estuary-Alsea River, COA 029": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/alsea-estuary-alsea-river/' target='_blank' class='ocs-coa-link'>Alsea Estuary-Alsea River, COA 029</a>",
"Alvord Lake Basin, COA 194": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/alvord-lake-basin/' target='_blank' class='ocs-coa-link'>Alvord Lake Basin, COA 194</a>",
"Anderson Butte, COA 103": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/anderson-butte/' target='_blank' class='ocs-coa-link'>Anderson Butte, COA 103</a>",
"Antelope Creek-Paynes Cliffs, COA 099": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/antelope-creek-paynes-cliffs/' target='_blank' class='ocs-coa-link'>Antelope Creek-Paynes Cliffs, COA 099</a>",
"Bakeoven Creek-Buckhollow Creek, COA 151": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/bakeoven-creek-buckhollow-creek/' target='_blank' class='ocs-coa-link'>Bakeoven Creek-Buckhollow Creek, COA 151</a>",
"Baker Valley Wetlands, COA 165": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/baker-valley-wetlands/' target='_blank' class='ocs-coa-link'>Baker Valley Wetlands, COA 165</a>",
"Banks Swamp, COA 062": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/banks-swamp/' target='_blank' class='ocs-coa-link'>Banks Swamp, COA 062</a>",
# Current COA data misspells "Baskett Butte" as "Basket Butte". This entry masks that error; the next handles the data once it is fixed upstream. (RDH 20190626)
"Basket Butte, COA 072": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/baskett-butte/' target='_blank' class='ocs-coa-link'>Baskett Butte, COA 072</a>",
"Baskett Butte, COA 072": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/baskett-butte/' target='_blank' class='ocs-coa-link'>Baskett Butte, COA 072</a>",
"Basque Hills Area Plains, COA 202": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/basque-hills-area-plains/' target='_blank' class='ocs-coa-link'>Basque Hills Area Plains, COA 202</a>",
"Bear Valley, COA 177": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/bear-valley/' target='_blank' class='ocs-coa-link'>Bear Valley, COA 177</a>",
"Beaver Creek, COA 027": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/beaver-creek/' target='_blank' class='ocs-coa-link'>Beaver Creek, COA 027</a>",
"Big Butte Area, COA 122": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/big-butte-area/' target='_blank' class='ocs-coa-link'>Big Butte Area, COA 122</a>",
"Big Marsh Creek, COA 133": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/big-marsh-creek/' target='_blank' class='ocs-coa-link'>Big Marsh Creek, COA 133</a>",
"Boardman Area, COA 154": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/boardman-area/' target='_blank' class='ocs-coa-link'>Boardman Area, COA 154</a>",
"Breitenbush River, COA 110": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/breitenbush-river/' target='_blank' class='ocs-coa-link'>Breitenbush River, COA 110</a>",
"Brothers-North Wagontire, COA 184": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/brothers-north-wagontire/' target='_blank' class='ocs-coa-link'>Brothers-North Wagontire, COA 184</a>",
"Bull of the Woods, North, COA 108": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/bull-of-the-woods-north/' target='_blank' class='ocs-coa-link'>Bull of the Woods, North, COA 108</a>",
"Bull Run-Sandy Rivers, COA 105": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/bull-run-sandy-rivers/' target='_blank' class='ocs-coa-link'>Bull Run-Sandy Rivers, COA 105</a>",
"Bully Creek Area, COA 183": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/bully-creek-area/' target='_blank' class='ocs-coa-link'>Bully Creek Area, COA 183</a>",
"Burnt River, COA 166": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/burnt-river/' target='_blank' class='ocs-coa-link'>Burnt River, COA 166</a>",
"Calapooia River, COA 082": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/calapooia-river/' target='_blank' class='ocs-coa-link'>Calapooia River, COA 082</a>",
"Cape Ferrelo, COA 051": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/cape-ferrelo/' target='_blank' class='ocs-coa-link'>Cape Ferrelo, COA 051</a>",
"Central Cascades Crest, Southeast, COA 116": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/central-cascades-crest-southeast/' target='_blank' class='ocs-coa-link'>Central Cascades Crest, Southeast, COA 116</a>",
# Current COA data has a period after "Crest" where a comma belongs. This entry masks that error; the next handles the data once it is fixed upstream. (RDH 20190626)
"Central Cascades Crest. West, COA 113": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/central-cascades-crest-west/' target='_blank' class='ocs-coa-link'>Central Cascades Crest, West, COA 113</a>",
"Central Cascades Crest, West, COA 113": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/central-cascades-crest-west/' target='_blank' class='ocs-coa-link'>Central Cascades Crest, West, COA 113</a>",
# Current COA data has the typo "Cetco" (for "Chetco"). This entry masks that error; the next handles the data once it is fixed upstream. (RDH 20190626)
"Cetco River-Winhchuck River Estuaries, COA 052": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/chetco-river-winhchuck-river-estuaries/' target='_blank' class='ocs-coa-link'>Chetco River-Winhchuck River Estuaries, COA 052</a>",
"Chetco River-Winhchuck River Estuaries, COA 052": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/chetco-river-winhchuck-river-estuaries/' target='_blank' class='ocs-coa-link'>Chetco River-Winhchuck River Estuaries, COA 052</a>",
"Chewaucan River, COA 141": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/chewaucan-river/' target='_blank' class='ocs-coa-link'>Chewaucan River, COA 141</a>",
"Clackamas River and Tributaries, COA 065": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/clackamas-river-and-tributaries/' target='_blank' class='ocs-coa-link'>Clackamas River and Tributaries, COA 065</a>",
"Clatskanie River, COA 008": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/clatskanie-river/' target='_blank' class='ocs-coa-link'>Clatskanie River, COA 008</a>",
"Clatsop Plains, COA 001": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/clatsop-plains/' target='_blank' class='ocs-coa-link'>Clatsop Plains, COA 001</a>",
"Clatsop State Forest-Jewel Meadows Area, COA 007": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/clatsop-state-forest-jewel-meadows-area/' target='_blank' class='ocs-coa-link'>Clatsop State Forest-Jewel Meadows Area, COA 007</a>",
"Coburg Ridge, COA 087": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/coburg-ridge/' target='_blank' class='ocs-coa-link'>Coburg Ridge, COA 087</a>",
"Cold Springs National Wildlife Refuge Area, COA 156": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/cold-springs-national-wildlife-refuge-area/' target='_blank' class='ocs-coa-link'>Cold Springs National Wildlife Refuge Area, COA 156</a>",
"Columbia River-Blind Slough Swamp, COA 006": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/columbia-river-blind-slough-swamp/' target='_blank' class='ocs-coa-link'>Columbia River-Blind Slough Swamp, COA 006</a>",
"Coos Bay, COA 043": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/coos-bay/' target='_blank' class='ocs-coa-link'>Coos Bay, COA 043</a>",
"Coos Mountain-Middle Creek, COA 044": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/coos-mountain-middle-creek/' target='_blank' class='ocs-coa-link'>Coos Mountain-Middle Creek, COA 044</a>",
"Corvallis Area Forests and Balds, COA 081": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/corvallis-area-forests-and-balds/' target='_blank' class='ocs-coa-link'>Corvallis Area Forests and Balds, COA 081</a>",
"Crater Lake, COA 121": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/crater-lake/' target='_blank' class='ocs-coa-link'>Crater Lake, COA 121</a>",
"Crawfordsville Oak-Washburn Butte, COA 085": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/crawfordsville-oak-washburn-butte/' target='_blank' class='ocs-coa-link'>Crawfordsville Oak-Washburn Butte, COA 085</a>",
"Crowley, COA 185": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/crowley/' target='_blank' class='ocs-coa-link'>Crowley, COA 185</a>",
"Deer Island, COA 053": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/deer-island/' target='_blank' class='ocs-coa-link'>Deer Island, COA 053</a>",
# Current COA data has the typo "Depot" (for "Depoe"). This entry masks that error; the next handles the data once it is fixed upstream. (RDH 20190626)
"Depot Bay Area, COA 023": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/depoe-bay-area/' target='_blank' class='ocs-coa-link'>Depoe Bay Area, COA 023</a>",
"Depoe Bay Area, COA 023": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/depoe-bay-area/' target='_blank' class='ocs-coa-link'>Depoe Bay Area, COA 023</a>",
"Deschutes River, COA 149": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/deschutes-river/' target='_blank' class='ocs-coa-link'>Deschutes River, COA 149</a>",
"Devil's Lake, COA 020": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/devils-lake/' target='_blank' class='ocs-coa-link'>Devil's Lake, COA 020</a>",
"Dry Valley, COA 144": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/dry-valley/' target='_blank' class='ocs-coa-link'>Dry Valley, COA 144</a>",
"Dundee Oaks, COA 066": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/dundee-oaks/' target='_blank' class='ocs-coa-link'>Dundee Oaks, COA 066</a>",
"East Fork Illinois River, COA 101": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/east-fork-illinois-river/' target='_blank' class='ocs-coa-link'>East Fork Illinois River, COA 101</a>",
"East Madras-Trout Creek Sagebrush and Grassland Area, COA 172": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/east-madras-trout-creek-sagebrush-and-grassland-area/' target='_blank' class='ocs-coa-link'>East Madras-Trout Creek Sagebrush and Grassland Area, COA 172</a>",
"Elliot State Forest, COA 041": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/elliot-state-forest/' target='_blank' class='ocs-coa-link'>Elliot State Forest, COA 041</a>",
"Eola Hills, COA 073": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/eola-hills/' target='_blank' class='ocs-coa-link'>Eola Hills, COA 073</a>",
"Fields Peak, COA 176": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/fields-peak/' target='_blank' class='ocs-coa-link'>Fields Peak, COA 176</a>",
"Fifteenmile Creek, COA 147": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/fifteenmile-creek/' target='_blank' class='ocs-coa-link'>Fifteenmile Creek, COA 147</a>",
"Finley-Muddy Creek Area, COA 084": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/finley-muddy-creek-area/' target='_blank' class='ocs-coa-link'>Finley-Muddy Creek Area, COA 084</a>",
"Forest Park, COA 058": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/forest-park/' target='_blank' class='ocs-coa-link'>Forest Park, COA 058</a>",
"Foster Flat-Black Rim Sagebrush Area, COA 191": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/foster-flat-black-rim-sagebrush-area/' target='_blank' class='ocs-coa-link'>Foster Flat-Black Rim Sagebrush Area, COA 191</a>",
"Gales Creek, COA 013": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/gales-creek/' target='_blank' class='ocs-coa-link'>Gales Creek, COA 013</a>",
"Gearhart Mountain-North Fork Sprague, COA 140": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/gearhart-mountain-north-fork-sprague/' target='_blank' class='ocs-coa-link'>Gearhart Mountain-North Fork Sprague, COA 140</a>",
"Grande Ronde Valley, COA 159": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/grande-ronde-valley/' target='_blank' class='ocs-coa-link'>Grande Ronde Valley, COA 159</a>",
"Grave Creek, COA 094": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/grave-creek/' target='_blank' class='ocs-coa-link'>Grave Creek, COA 094</a>",
"Habeck Oaks, COA 074": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/habeck-oaks/' target='_blank' class='ocs-coa-link'>Habeck Oaks, COA 074</a>",
"Harney-Malheur Area, COA 187": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/harney-malheur-area/' target='_blank' class='ocs-coa-link'>Harney-Malheur Area, COA 187</a>",
"Hart Mountain Area, COA 200": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/hart-mountain-area/' target='_blank' class='ocs-coa-link'>Hart Mountain Area, COA 200</a>",
"Heceta Head, COA 031": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/heceta-head/' target='_blank' class='ocs-coa-link'>Heceta Head, COA 031</a>",
"Illinois River-Silver Creek, COA 096": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/illinois-river-silver-creek/' target='_blank' class='ocs-coa-link'>Illinois River-Silver Creek, COA 096</a>",
"Imnaha, COA 161": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/imnaha/' target='_blank' class='ocs-coa-link'>Imnaha, COA 161</a>",
"Jordan Creek Wetlands, COA 195": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/jordan-creek-wetlands/' target='_blank' class='ocs-coa-link'>Jordan Creek Wetlands, COA 195</a>",
"Kalmiopsis Area, COA 100": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/kalmiopsis-area/' target='_blank' class='ocs-coa-link'>Kalmiopsis Area, COA 100</a>",
"King Mountain Area, COA 095": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/king-mountain-area/' target='_blank' class='ocs-coa-link'>King Mountain Area, COA 095</a>",
"Kings Valley-Woods Creek Oak Woodlands, COA 080": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/kings-valley-woods-creek-oak-woodlands/' target='_blank' class='ocs-coa-link'>Kings Valley-Woods Creek Oak Woodlands, COA 080</a>",
"Kingston Prairie-Scio Oak Pine Savanna, COA 079": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/kingston-prairie-scio-oak-pine-savanna/' target='_blank' class='ocs-coa-link'>Kingston Prairie-Scio Oak Pine Savanna, COA 079</a>",
"Klamath Marsh-Williamson River, COA 134": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/klamath-marsh-williamson-river/' target='_blank' class='ocs-coa-link'>Klamath Marsh-Williamson River, COA 134</a>",
"Klamath River Canyon, COA 142": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/klamath-river-canyon/' target='_blank' class='ocs-coa-link'>Klamath River Canyon, COA 142</a>",
"Lake Abert, COA 197": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/lake-abert/' target='_blank' class='ocs-coa-link'>Lake Abert, COA 197</a>",
"Lawrence Grasslands, COA 152": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/lawrence-grasslands/' target='_blank' class='ocs-coa-link'>Lawrence Grasslands, COA 152</a>",
# Current COA data has typo "Deschuttes". This entry masks that error; the next will handle the data once it is fixed. (RDH 20190626)
"Little Deschuttes River, COA 132": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/little-deschutes-river/' target='_blank' class='ocs-coa-link'>Little Deschutes River, COA 132</a>",
"Little Deschutes River, COA 132": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/little-deschutes-river/' target='_blank' class='ocs-coa-link'>Little Deschutes River, COA 132</a>",
"Little Louse Canyon, COA 205": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/little-louse-canyon/' target='_blank' class='ocs-coa-link'>Little Louse Canyon, COA 205</a>",
"Little North Santiam River Area, COA 109": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/little-north-santiam-river-area/' target='_blank' class='ocs-coa-link'>Little North Santiam River Area, COA 109</a>",
"Logan Valley-John Day River Headwaters, COA 179": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/logan-valley-john-day-river-headwaters/' target='_blank' class='ocs-coa-link'>Logan Valley-John Day River Headwaters, COA 179</a>",
"Long Creek-Coyote Creek-Silver Creek, COA 135": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/long-creek-coyote-creek-silver-creek/' target='_blank' class='ocs-coa-link'>Long Creek-Coyote Creek-Silver Creek, COA 135</a>",
"Lost River Area, COA 143": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/lost-river-area/' target='_blank' class='ocs-coa-link'>Lost River Area, COA 143</a>",
"Lower Coquille River, COA 045": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/lower-coquille-river/' target='_blank' class='ocs-coa-link'>Lower Coquille River, COA 045</a>",
# Current COA data has typo "Deschuttes". This entry masks that error; the next will handle the data once it is fixed. (RDH 20190626)
"Lower Deschuttes River, COA 148": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/lower-deschutes-river/' target='_blank' class='ocs-coa-link'>Lower Deschutes River, COA 148</a>",
"Lower Deschutes River, COA 148": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/lower-deschutes-river/' target='_blank' class='ocs-coa-link'>Lower Deschutes River, COA 148</a>",
"Lower Grande Ronde, COA 158": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/lower-grande-ronde/' target='_blank' class='ocs-coa-link'>Lower Grande Ronde, COA 158</a>",
"Lower John Day River, COA 153": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/lower-john-day-river/' target='_blank' class='ocs-coa-link'>Lower John Day River, COA 153</a>",
"Lower Rogue River and Estuary, COA 049": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/lower-rogue-river-and-estuary/' target='_blank' class='ocs-coa-link'>Lower Rogue River and Estuary, COA 049</a>",
"Luckiamute River and Tributaries, COA 075": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/luckiamute-river-and-tributaries/' target='_blank' class='ocs-coa-link'>Luckiamute River and Tributaries, COA 075</a>",
"Malheur River Headwaters, COA 180": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/malheur-river-headwaters/' target='_blank' class='ocs-coa-link'>Malheur River Headwaters, COA 180</a>",
"Mary's Peak, COA 028": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/marys-peak/' target='_blank' class='ocs-coa-link'>Mary's Peak, COA 028</a>",
"McKenzie River Area, COA 114": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/mckenzie-river-area/' target='_blank' class='ocs-coa-link'>McKenzie River Area, COA 114</a>",
"McTimmons Valley - Airlie Savanna, COA 076": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/mctimmons-valley-airlie-savanna/' target='_blank' class='ocs-coa-link'>McTimmons Valley - Airlie Savanna, COA 076</a>",
"Metolius Bench-Mutton Mountains Wildlife Movement Corridor, COA 150": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/metolius-bench-mutton-mountains-wildlife-movement-corridor/' target='_blank' class='ocs-coa-link'>Metolius Bench-Mutton Mountains Wildlife Movement Corridor, COA 150</a>",
"Metolius River Area, COA 127": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/metolius-river-area/' target='_blank' class='ocs-coa-link'>Metolius River Area, COA 127</a>",
"Middle Fork John Day River, COA 170": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/middle-fork-john-day-river/' target='_blank' class='ocs-coa-link'>Middle Fork John Day River, COA 170</a>",
"Middle Fork Willamette River, COA 115": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/middle-fork-willamette-river/' target='_blank' class='ocs-coa-link'>Middle Fork Willamette River, COA 115</a>",
"Middle Owyhee River Area, COA 186": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/middle-owyhee-river-area/' target='_blank' class='ocs-coa-link'>Middle Owyhee River Area, COA 186</a>",
"Middlefork Willamette River Headwaters, COA 118": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/middlefork-willamette-river-headwaters/' target='_blank' class='ocs-coa-link'>Middlefork Willamette River Headwaters, COA 118</a>",
"Mill Creek, COA 024": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/mill-creek/' target='_blank' class='ocs-coa-link'>Mill Creek, COA 024</a>",
"Missouri Ridge, COA 070": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/missouri-ridge/' target='_blank' class='ocs-coa-link'>Missouri Ridge, COA 070</a>",
"Mohawk River, COA 088": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/mohawk-river/' target='_blank' class='ocs-coa-link'>Mohawk River, COA 088</a>",
"Molalla River, COA 069": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/molalla-river/' target='_blank' class='ocs-coa-link'>Molalla River, COA 069</a>",
"Mt Hood Area, COA 107": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/mt-hood-area/' target='_blank' class='ocs-coa-link'>Mt Hood Area, COA 107</a>",
"Mt Jefferson Wilderness, North, COA 111": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/mt-jefferson-wilderness-north/' target='_blank' class='ocs-coa-link'>Mt Jefferson Wilderness, North, COA 111</a>",
"Necanicum Estuary, COA 002": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/necanicum-estuary/' target='_blank' class='ocs-coa-link'>Necanicum Estuary, COA 002</a>",
"Necanicum River, COA 004": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/necanicum-river/' target='_blank' class='ocs-coa-link'>Necanicum River, COA 004</a>",
"Nehalem and Salmonberry River Headwaters, COA 012": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/nehalem-and-salmonberry-river-headwaters/' target='_blank' class='ocs-coa-link'>Nehalem and Salmonberry River Headwaters, COA 012</a>",
"Nehalem River Estuary, COA 009": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/nehalem-river-estuary/' target='_blank' class='ocs-coa-link'>Nehalem River Estuary, COA 009</a>",
"Nestucca Bay, COA 016": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/nestucca-bay/' target='_blank' class='ocs-coa-link'>Nestucca Bay, COA 016</a>",
"Nestucca River Watershed, COA 017": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/nestucca-river-watershed/' target='_blank' class='ocs-coa-link'>Nestucca River Watershed, COA 017</a>",
"Netarts Bay, COA 014": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/netarts-bay/' target='_blank' class='ocs-coa-link'>Netarts Bay, COA 014</a>",
"New River Area, COA 047": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/new-river-area/' target='_blank' class='ocs-coa-link'>New River Area, COA 047</a>",
"Newberry Crater, COA 130": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/newberry-crater/' target='_blank' class='ocs-coa-link'>Newberry Crater, COA 130</a>",
"North Fork John Day River 1, COA 168": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/north-fork-john-day-river-1/' target='_blank' class='ocs-coa-link'>North Fork John Day River 1, COA 168</a>",
"North Fork John Day River 2, COA 169": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/north-fork-john-day-river-2/' target='_blank' class='ocs-coa-link'>North Fork John Day River 2, COA 169</a>",
"North Fork Malheur-Monument Rock Area, COA 182": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/north-fork-malheur-monument-rock-area/' target='_blank' class='ocs-coa-link'>North Fork Malheur-Monument Rock Area, COA 182</a>",
# Current COA data has typo "Reiver". This entry masks that error; the next will handle the data once it is fixed. (RDH 20190626)
"North Fork Nehalem Reiver, COA 010": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/north-fork-nehalem-river/' target='_blank' class='ocs-coa-link'>North Fork Nehalem River, COA 010</a>",
"North Fork Nehalem River, COA 010": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/north-fork-nehalem-river/' target='_blank' class='ocs-coa-link'>North Fork Nehalem River, COA 010</a>",
"North Fork Siuslaw River, COA 032": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/north-fork-siuslaw-river/' target='_blank' class='ocs-coa-link'>North Fork Siuslaw River, COA 032</a>",
"North Fork Smith River, COA 037": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/north-fork-smith-river/' target='_blank' class='ocs-coa-link'>North Fork Smith River, COA 037</a>",
"North Medford Area, COA 097": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/north-medford-area/' target='_blank' class='ocs-coa-link'>North Medford Area, COA 097</a>",
"North Umpqua River Area, COA 090": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/north-umpqua-river-area/' target='_blank' class='ocs-coa-link'>North Umpqua River Area, COA 090</a>",
"Ochoco Mountains, COA 173": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/ochoco-mountains/' target='_blank' class='ocs-coa-link'>Ochoco Mountains, COA 173</a>",
"Odell Lake-Davis Lake, COA 117": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/odell-lake-davis-lake/' target='_blank' class='ocs-coa-link'>Odell Lake-Davis Lake, COA 117</a>",
"One Horse Slough-Beaver Creek, COA 083": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/one-horse-slough-beaver-creek/' target='_blank' class='ocs-coa-link'>One Horse Slough-Beaver Creek, COA 083</a>",
"Oregon Caves-Applegate Area, COA 102": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/oregon-caves-applegate-area/' target='_blank' class='ocs-coa-link'>Oregon Caves-Applegate Area, COA 102</a>",
"Pelican Butte-Sky Lakes Area, COA 123": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/pelican-butte-sky-lakes-area/' target='_blank' class='ocs-coa-link'>Pelican Butte-Sky Lakes Area, COA 123</a>",
"Pistol River Estuary, COA 050": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/pistol-river-estuary/' target='_blank' class='ocs-coa-link'>Pistol River Estuary, COA 050</a>",
"Powder River Sage-Grouse Core Area, COA 164": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/powder-river-sage-grouse-core-area/' target='_blank' class='ocs-coa-link'>Powder River Sage-Grouse Core Area, COA 164</a>",
"Pudding River, COA 068": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/pudding-river/' target='_blank' class='ocs-coa-link'>Pudding River, COA 068</a>",
"Pueblo Mountain, COA 203": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/pueblo-mountain/' target='_blank' class='ocs-coa-link'>Pueblo Mountain, COA 203</a>",
"Quartz Mountain, COA 131": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/quartz-mountain/' target='_blank' class='ocs-coa-link'>Quartz Mountain, COA 131</a>",
"Quartzville Creek Area, COA 112": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/quartzville-creek-area/' target='_blank' class='ocs-coa-link'>Quartzville Creek Area, COA 112</a>",
"Rattlesnake Creek-Calamity Creek Area, COA 181": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/rattlesnake-creek-calamity-creek-area/' target='_blank' class='ocs-coa-link'>Rattlesnake Creek-Calamity Creek Area, COA 181</a>",
"Red Prairie-Mill Creek-Willamina Oaks South, COA 071": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/red-prairie-mill-creek-willamina-oaks-south/' target='_blank' class='ocs-coa-link'>Red Prairie-Mill Creek-Willamina Oaks South, COA 071</a>",
"Rickreall Creek and Little Luckiamute River Headwaters, COA 025": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/rickreall-creek-and-little-luckiamute-river-headwaters/' target='_blank' class='ocs-coa-link'>Rickreall Creek and Little Luckiamute River Headwaters, COA 025</a>",
"Rock Creek, COA 120": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/rock-creek/' target='_blank' class='ocs-coa-link'>Rock Creek, COA 120</a>",
"Rock Creek-Butter Creek Grasslands, COA 155": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/rock-creek-butter-creek-grasslands/' target='_blank' class='ocs-coa-link'>Rock Creek-Butter Creek Grasslands, COA 155</a>",
"Rogue River, COA 093": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/rogue-river/' target='_blank' class='ocs-coa-link'>Rogue River, COA 093</a>",
"Saddle Mountain, COA 005": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/saddle-mountain/' target='_blank' class='ocs-coa-link'>Saddle Mountain, COA 005</a>",
"Sage Hen Creek, COA 201": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/sage-hen-creek/' target='_blank' class='ocs-coa-link'>Sage Hen Creek, COA 201</a>",
"Salem Hills-Ankeny NWR, COA 077": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/salem-hills-ankeny-nwr/' target='_blank' class='ocs-coa-link'>Salem Hills-Ankeny NWR, COA 077</a>",
"Salmon River Estuary-Cascade Head, COA 019": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/salmon-river-estuary-cascade-head/' target='_blank' class='ocs-coa-link'>Salmon River Estuary-Cascade Head, COA 019</a>",
"Sand Lake Area, COA 015": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/sand-lake-area/' target='_blank' class='ocs-coa-link'>Sand Lake Area, COA 015</a>",
"Santiam Confluences, COA 078": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/santiam-confluences/' target='_blank' class='ocs-coa-link'>Santiam Confluences, COA 078</a>",
"Sauvie Island-Scappoose, COA 054": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/sauvie-island-scappoose/' target='_blank' class='ocs-coa-link'>Sauvie Island-Scappoose, COA 054</a>",
"Scoggins Valley-Mount Richmond, COA 063": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/scoggins-valley-mount-richmond/' target='_blank' class='ocs-coa-link'>Scoggins Valley-Mount Richmond, COA 063</a>",
"Shady Cove Foothills, COA 098": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/shady-cove-foothills/' target='_blank' class='ocs-coa-link'>Shady Cove Foothills, COA 098</a>",
"Siletz Bay, COA 021": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/siletz-bay/' target='_blank' class='ocs-coa-link'>Siletz Bay, COA 021</a>",
"Siletz River, COA 022": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/siletz-river/' target='_blank' class='ocs-coa-link'>Siletz River, COA 022</a>",
"Silver Creek Area, COA 175": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/silver-creek-area/' target='_blank' class='ocs-coa-link'>Silver Creek Area, COA 175</a>",
# Current COA data has typo "Siskoyou". This entry masks that error; the next will handle the data once it is fixed. (RDH 20190626)
"Siskoyou Crest Area, COA 104": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/siskiyou-crest-area/' target='_blank' class='ocs-coa-link'>Siskiyou Crest Area, COA 104</a>",
"Siskiyou Crest Area, COA 104": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/siskiyou-crest-area/' target='_blank' class='ocs-coa-link'>Siskiyou Crest Area, COA 104</a>",
"Siuslaw River, COA 035": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/siuslaw-river/' target='_blank' class='ocs-coa-link'>Siuslaw River, COA 035</a>",
"Siuslaw River Estuary, COA 034": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/siuslaw-river-estuary/' target='_blank' class='ocs-coa-link'>Siuslaw River Estuary, COA 034</a>",
# Current COA data has "Elk Creek", not "Elk River". This entry masks that error; the next will handle the data once it is fixed. (RDH 20190626)
"Sixes River-Elk Creek, COA 048": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/sixes-river-elk-river/' target='_blank' class='ocs-coa-link'>Sixes River-Elk River, COA 048</a>",
"Sixes River-Elk River, COA 048": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/sixes-river-elk-river/' target='_blank' class='ocs-coa-link'>Sixes River-Elk River, COA 048</a>",
"Soda Mountain Area, COA 124": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/soda-mountain-area/' target='_blank' class='ocs-coa-link'>Soda Mountain Area, COA 124</a>",
"Soldier Creek-Upper Owyhee River, COA 196": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/soldier-creek-upper-owyhee-river/' target='_blank' class='ocs-coa-link'>Soldier Creek-Upper Owyhee River, COA 196</a>",
"South Fork Coquille, COA 046": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/south-fork-coquille/' target='_blank' class='ocs-coa-link'>South Fork Coquille, COA 046</a>",
"South Fork Crooked River Area, COA 174": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/south-fork-crooked-river-area/' target='_blank' class='ocs-coa-link'>South Fork Crooked River Area, COA 174</a>",
"South Fork Umpqua River and Tributaries, COA 091": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/south-fork-umpqua-river-and-tributaries/' target='_blank' class='ocs-coa-link'>South Fork Umpqua River and Tributaries, COA 091</a>",
"South John Day River, COA 171": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/south-john-day-river/' target='_blank' class='ocs-coa-link'>South John Day River, COA 171</a>",
"Sprague River, COA 139": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/sprague-river/' target='_blank' class='ocs-coa-link'>Sprague River, COA 139</a>",
"Steens Mountain, COA 192": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/steens-mountain/' target='_blank' class='ocs-coa-link'>Steens Mountain, COA 192</a>",
"Summer Lake Area, COA 189": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/summer-lake-area/' target='_blank' class='ocs-coa-link'>Summer Lake Area, COA 189</a>",
"Sutton Lake Area, COA 033": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/sutton-lake-area/' target='_blank' class='ocs-coa-link'>Sutton Lake Area, COA 033</a>",
"Sycan Marsh, COA 136": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/sycan-marsh/' target='_blank' class='ocs-coa-link'>Sycan Marsh, COA 136</a>",
"Sycan River, COA 137": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/sycan-river/' target='_blank' class='ocs-coa-link'>Sycan River, COA 137</a>",
"Tahkenitch-Siltcoos Lakes, COA 036": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/tahkenitch-siltcoos-lakes/' target='_blank' class='ocs-coa-link'>Tahkenitch-Siltcoos Lakes, COA 036</a>",
"Ten Cent Lake-Juniper Lake Area, COA 193": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/ten-cent-lake-juniper-lake-area/' target='_blank' class='ocs-coa-link'>Ten Cent Lake-Juniper Lake Area, COA 193</a>",
"Tenmile Area, COA 092": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/tenmile-area/' target='_blank' class='ocs-coa-link'>Tenmile Area, COA 092</a>",
"Tenmile Lake, COA 040": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/tenmile-lake/' target='_blank' class='ocs-coa-link'>Tenmile Lake, COA 040</a>",
"Thomas Creek-Goose Lake, COA 145": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/thomas-creek-goose-lake/' target='_blank' class='ocs-coa-link'>Thomas Creek-Goose Lake, COA 145</a>",
"Three Forks, COA 206": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/three-forks/' target='_blank' class='ocs-coa-link'>Three Forks, COA 206</a>",
"Tillamook Bay and Tributaries, COA 011": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/tillamook-bay-and-tributaries/' target='_blank' class='ocs-coa-link'>Tillamook Bay and Tributaries, COA 011</a>",
"Tillamook Head, COA 003": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/tillamook-head/' target='_blank' class='ocs-coa-link'>Tillamook Head, COA 003</a>",
"Trask Mountain, COA 018": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/trask-mountain/' target='_blank' class='ocs-coa-link'>Trask Mountain, COA 018</a>",
"Trout Creek Mountains, COA 204": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/trout-creek-mountains/' target='_blank' class='ocs-coa-link'>Trout Creek Mountains, COA 204</a>",
"Umpqua Headwaters, COA 119": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/umpqua-headwaters/' target='_blank' class='ocs-coa-link'>Umpqua Headwaters, COA 119</a>",
"Umpqua River, COA 042": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/umpqua-river/' target='_blank' class='ocs-coa-link'>Umpqua River, COA 042</a>",
"Umpqua River Estuary, COA 038": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/umpqua-river-estuary/' target='_blank' class='ocs-coa-link'>Umpqua River Estuary, COA 038</a>",
# Current COA data has typo "Deschuttes". This entry masks that error; the next will handle the data once it is fixed. (RDH 20190626)
"Upper Deschuttes River, COA 129": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/upper-deschutes-river/' target='_blank' class='ocs-coa-link'>Upper Deschutes River, COA 129</a>",
"Upper Deschutes River, COA 129": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/upper-deschutes-river/' target='_blank' class='ocs-coa-link'>Upper Deschutes River, COA 129</a>",
"Upper Grande Ronde River Area, COA 160": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/upper-grande-ronde-river-area/' target='_blank' class='ocs-coa-link'>Upper Grande Ronde River Area, COA 160</a>",
"Upper Klamath Lake Area, COA 138": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/upper-klamath-lake-area/' target='_blank' class='ocs-coa-link'>Upper Klamath Lake Area, COA 138</a>",
"Upper Silvies River, COA 178": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/upper-silvies-river/' target='_blank' class='ocs-coa-link'>Upper Silvies River, COA 178</a>",
"Upper Siuslaw, COA 089": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/upper-siuslaw/' target='_blank' class='ocs-coa-link'>Upper Siuslaw, COA 089</a>",
"Upper South Fork Malheur Area, COA 188": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/upper-south-fork-malheur-area/' target='_blank' class='ocs-coa-link'>Upper South Fork Malheur Area, COA 188</a>",
"Walla Walla Headwaters, COA 157": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/walla-walla-headwaters/' target='_blank' class='ocs-coa-link'>Walla Walla Headwaters, COA 157</a>",
"Wallowa Mountains, COA 163": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/wallowa-mountains/' target='_blank' class='ocs-coa-link'>Wallowa Mountains, COA 163</a>",
"Warm Springs River, COA 126": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/warm-springs-river/' target='_blank' class='ocs-coa-link'>Warm Springs River, COA 126</a>",
"Warner Basin Wetlands, COA 199": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/warner-basin-wetlands/' target='_blank' class='ocs-coa-link'>Warner Basin Wetlands, COA 199</a>",
"Warner Mountains, COA 146": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/warner-mountains/' target='_blank' class='ocs-coa-link'>Warner Mountains, COA 146</a>",
"Warner West, COA 198": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/warner-west/' target='_blank' class='ocs-coa-link'>Warner West, COA 198</a>",
"Wasco Oaks, COA 125": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/wasco-oaks/' target='_blank' class='ocs-coa-link'>Wasco Oaks, COA 125</a>",
"Wassen Creek, COA 039": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/wassen-creek/' target='_blank' class='ocs-coa-link'>Wassen Creek, COA 039</a>",
"West Eugene Area, COA 086": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/west-eugene-area/' target='_blank' class='ocs-coa-link'>West Eugene Area, COA 086</a>",
"Whychus Creek, COA 128": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/whychus-creek/' target='_blank' class='ocs-coa-link'>Whychus Creek, COA 128</a>",
"Willow Creek-Birch Creek Area, COA 167": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/willow-creek-birch-creek-area/' target='_blank' class='ocs-coa-link'>Willow Creek-Birch Creek Area, COA 167</a>",
"Yachats River Area, COA 030": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/yachats-river-area/' target='_blank' class='ocs-coa-link'>Yachats River Area, COA 030</a>",
"Yamhill Oaks-Willamina Oaks North, COA 067": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/yamhill-oaks-willamina-oaks-north/' target='_blank' class='ocs-coa-link'>Yamhill Oaks-Willamina Oaks North, COA 067</a>",
"Yaquina Bay, COA 026": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/yaquina-bay/' target='_blank' class='ocs-coa-link'>Yaquina Bay, COA 026</a>",
"Zumwalt Prairie Plateau, COA 162": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/zumwalt-prairie-plateau/' target='_blank' class='ocs-coa-link'>Zumwalt Prairie Plateau, COA 162</a>",
"Lower Sandy River, COA 057": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/lower-sandy-river-2/' target='_blank' class='ocs-coa-link'>Lower Sandy River, COA 057</a>",
"Hayden Island-Government Island, COA 055": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/hayden-island-government-island/' target='_blank' class='ocs-coa-link'>Hayden Island-Government Island, COA 055</a>",
"Smith-Bybee Lakes and Columbia Slough, COA 056": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/smith-bybee-lakes-and-columbia-slough/' target='_blank' class='ocs-coa-link'>Smith-Bybee Lakes and Columbia Slough, COA 056</a>",
"Lower Willamette River Floodplain, COA 059": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/lower-willamette-river-floodplain/' target='_blank' class='ocs-coa-link'>Lower Willamette River Floodplain, COA 059</a>",
"Tualatin River, COA 064": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/tualatin-river/' target='_blank' class='ocs-coa-link'>Tualatin River, COA 064</a>",
"Middle Willamette River Floodplain, COA 060": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/middle-willamette-river-floodplain/' target='_blank' class='ocs-coa-link'>Middle Willamette River Floodplain, COA 060</a>",
"Upper Willamette River Floodplain, COA 061": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/upper-willamette-river-floodplain/' target='_blank' class='ocs-coa-link'>Upper Willamette River Floodplain, COA 061</a>",
"Hood River, COA 106": u"<a href='https://oregonconservationstrategy.org/conservation-opportunity-area/hood-river/' target='_blank' class='ocs-coa-link'>Hood River, COA 106</a>",
}
# Modified & Written by CVML
import numpy as np
import matplotlib.pyplot as plt
import cv2
import re
import math
from numpy.linalg import inv
from numpy import linalg
from numpy import matrix
import xml.etree.ElementTree as ET
def transformer_layer(input_fmap, angle, box, output_tr_flag, out_dims=None, **kwargs):
B = np.shape(input_fmap)[0]
H = np.shape(input_fmap)[1]
W = np.shape(input_fmap)[2]
C = np.shape(input_fmap)[3]
cntr = np.asarray([(box[1]+box[3])/2, (box[0]+box[2])/2])
T1 = [[1,0,0],[0,1,0],[-cntr[1],-cntr[0],1]]
T2 = np.asarray([[np.cos(np.deg2rad(int(angle))), np.sin(np.deg2rad(int(angle))), 0],[-np.sin(np.deg2rad(int(angle))),np.cos(np.deg2rad(int(angle))),0],[0,0,1]])
T3 = [[1,0,0],[0,1,0],[cntr[1],cntr[0],1]]
T = np.dot(np.transpose(T3),np.dot(np.transpose(T2),np.transpose(T1)))
corner_pts = [[0,W-1,0,W-1],[0,0,H-1,H-1],[1,1,1,1]]
trans_cpoints = np.dot(T[0:2,:],corner_pts)
xymin = trans_cpoints.min(1)
xymax = trans_cpoints.max(1)
out_H = np.int32(xymax[1] - xymin[1] + 1)
out_W = np.int32(xymax[0] - xymin[0] + 1)
out_fmap_size = [0,0,out_W, out_H]
T4 = [[1,0,0],[0,1,0],[-xymin[0], -xymin[1],1]];
T_final = np.dot(np.transpose(T4),T);
rect_pts = [ [box[0],box[2], box[0], box[2]] ,[box[1],box[1],box[3],box[3]],[1,1,1,1]]
trans_rpoints = np.dot(T_final[0:2,:],rect_pts)
rxymin = trans_rpoints.min(1)
rxymax = trans_rpoints.max(1)
cropped_box = [np.int32(np.floor(rxymin[0])), np.int32(np.floor(rxymin[1])), np.int32(np.floor(rxymax[0])), np.int32(np.floor(rxymax[1]))]
intSec1, intSec2 = LinesIntersectionForLargestBox(trans_rpoints, np.array(rect_pts), angle)
height_deltas = [intSec1[1]-cropped_box[1], cropped_box[3]-intSec2[1]]
if output_tr_flag == True:
return cropped_box
else:
batch_grids = affine_grid_generator(out_H, out_W, T_final)
x_s = batch_grids[:,0, :, :]
y_s = batch_grids[:,1, :, :]
out_fmap = bilinear_sampler_Interpol(input_fmap, x_s, y_s) # Interpolation with in bbox and extend outside using Ia
negative_flag = False
        if cropped_box[0] < 0:
            # shift so the box starts at column 0
            cropped_box[2] = int(cropped_box[2]) - int(cropped_box[0])
            cropped_box[0] = 0
        if cropped_box[1] < 0:
            # shift so the box starts at row 0
            cropped_box[3] = int(cropped_box[3]) - int(cropped_box[1])
            cropped_box[1] = 0
f_map = out_fmap[:, cropped_box[1]:cropped_box[3] , cropped_box[0]:cropped_box[2] ,:]
return f_map, T_final, cropped_box, negative_flag, trans_rpoints, trans_cpoints, out_fmap_size, height_deltas
def affine_grid_generator(H, W, theta):
# create normalized 2D grid
x = np.arange(W)
y = np.arange(H)
x_t, y_t = np.meshgrid(x, y)
# flatten
x_t_flat = np.reshape(x_t, (-1))
y_t_flat = np.reshape(y_t, (-1))
# reshape to [x_t, y_t , 1] - (homogeneous form)
ones = np.ones((np.shape(x_t_flat)[0]))
sampling_grid = np.stack([x_t_flat, y_t_flat, ones])
# transform the sampling grid - batch multiply
theta_inv = np.linalg.inv(theta)
out_sampGrid = np.dot(theta_inv[0:2,:], sampling_grid)
# batch grid has shape (num_batch, 2, H*W)
# reshape to (num_batch, 2, H, W)
batch_grids = out_sampGrid.reshape((1, 2, H, W))
return batch_grids
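# A minimal, self-contained sanity check (a sketch, not part of the original
# pipeline): every transform in this file is built as T = T3.T @ T2.T @ T1.T,
# a rotation about a chosen centre expressed for column vectors [x, y, 1].
# `rotation_about_center` is a hypothetical helper that rebuilds that product.

```python
import numpy as np

def rotation_about_center(angle_deg, cx, cy):
    """Rebuild the T1/T2/T3 composition used above for centre (cx, cy)."""
    T1 = [[1, 0, 0], [0, 1, 0], [-cx, -cy, 1]]
    a = np.deg2rad(angle_deg)
    T2 = np.asarray([[np.cos(a), np.sin(a), 0],
                     [-np.sin(a), np.cos(a), 0],
                     [0, 0, 1]])
    T3 = [[1, 0, 0], [0, 1, 0], [cx, cy, 1]]
    return np.dot(np.transpose(T3), np.dot(np.transpose(T2), np.transpose(T1)))

# The centre of rotation is a fixed point of the transform.
T = rotation_about_center(30, 10.0, 20.0)
assert np.allclose(np.dot(T, [10.0, 20.0, 1.0]), [10.0, 20.0, 1.0])
```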
def bilinear_sampler(input_fmap, x, y):
# prepare useful params
B = np.shape(input_fmap)[0]
H = np.shape(input_fmap)[1]
W = np.shape(input_fmap)[2]
C = np.shape(input_fmap)[3]
max_y = np.int32(H - 1)
max_x = np.int32(W - 1)
zero = np.zeros([], dtype='int32')
# grab 4 nearest corner points for each (x_i, y_i)
# i.e. we need a rectangle around the point of interest
x0 = np.int32(np.floor(x))
x1 = np.int32(x0 + 1)
y0 = np.int32(np.floor(y))
y1 = np.int32(y0 + 1)
# clip to range [0, H/W] to not violate img boundaries
x0 = np.clip(x0, zero, max_x)
x1 = np.clip(x1, zero, max_x)
y0 = np.clip(y0, zero, max_y)
y1 = np.clip(y1, zero, max_y)
# get pixel value at corner coords
Ia = input_fmap[0,y0,x0,:]
Ib = input_fmap[0,y1,x0,:]
Ic = input_fmap[0,y0,x1,:]
Id = input_fmap[0,y1,x1,:]
# recast as float for delta calculation
x0 = np.float32(x0)
x1 = np.float32(x1)
y0 = np.float32(y0)
y1 = np.float32(y1)
# calculate deltas
wa = (x1-x) * (y1-y)
wb = (x1-x) * (y-y0)
wc = (x-x0) * (y1-y)
wd = (x-x0) * (y-y0)
# add dimension for addition
wa = np.expand_dims(wa,axis=3)
wb = np.expand_dims(wb,axis=3)
wc = np.expand_dims(wc,axis=3)
wd = np.expand_dims(wd,axis=3)
wa[np.where(wa<0)]=0
wb[np.where(wb<0)]=0
wc[np.where(wc<0)]=0
wd[np.where(wd<0)]=0
    # compute output as the weighted sum of the four corner values
    output_fmap = wa*Ia + wb*Ib + wc*Ic + wd*Id
    return output_fmap
def bilinear_sampler_Interpol(input_fmap, x, y):
# prepare useful params
B = np.shape(input_fmap)[0]
H = np.shape(input_fmap)[1]
W = np.shape(input_fmap)[2]
C = np.shape(input_fmap)[3]
max_y = np.int32(H - 1)
max_x = np.int32(W - 1)
zero = np.zeros([], dtype='int32')
# grab 4 nearest corner points for each (x_i, y_i)
# i.e. we need a rectangle around the point of interest
x0 = np.int32(np.floor(x))
x1 = np.int32(x0 + 1)
y0 = np.int32(np.floor(y))
y1 = np.int32(y0 + 1)
ad1 = 1*(x0>=0)
ad2 = 1*(x0<=max_x)
ad3 = 1*(y0>=0)
ad4 = 1*(y0<=max_y)
maskx = 1*(ad1[0,:,:] * ad2[0,:,:])
masky = 1*(ad3[0,:,:] * ad4[0,:,:])
mask = maskx*masky
mask = mask.reshape(1,mask.shape[0], mask.shape[1], 1)
# clip to range [0, H/W] to not violate img boundaries
x0 = np.clip(x0, zero, max_x)
x1 = np.clip(x1, zero, max_x)
y0 = np.clip(y0, zero, max_y)
y1 = np.clip(y1, zero, max_y)
# get pixel value at corner coords
Ia = input_fmap[0,y0,x0,:]
Ib = input_fmap[0,y1,x0,:]
Ic = input_fmap[0,y0,x1,:]
Id = input_fmap[0,y1,x1,:]
# recast as float for delta calculation
x0 = np.float32(x0)
x1 = np.float32(x1)
y0 = np.float32(y0)
y1 = np.float32(y1)
# calculate deltas
wa = (x1-x) * (y1-y)
wb = (x1-x) * (y-y0)
wc = (x-x0) * (y1-y)
wd = (x-x0) * (y-y0)
# add dimension for addition
wa = np.expand_dims(wa,axis=3)
wb = np.expand_dims(wb,axis=3)
wc = np.expand_dims(wc,axis=3)
wd = np.expand_dims(wd,axis=3)
wa[np.where(wa<0)]=0
wb[np.where(wb<0)]=0
wc[np.where(wc<0)]=0
wd[np.where(wd<0)]=0
# compute output
output_fmap = (wa*Ia + wb*Ib + wc*Ic + wd*Id)
#output_fmap = Ib
outM = output_fmap.copy()
outM = outM * mask
mask0 = 1*(mask==0)
conVals = Ia * mask0
    # interpolated values inside the valid mask; the nearest corner value (Ia)
    # fills coordinates that fall outside the source map
    output_fmap = conVals + outM
    return output_fmap
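# The two samplers above apply the same bilinear rule channel-wise over a
# batched map. A standalone single-channel sketch of that rule (the name
# `sample_bilinear` is illustrative, not part of the original file):

```python
import numpy as np

def sample_bilinear(img, x, y):
    """Bilinearly sample a (H, W) array at float coordinates (x, y)."""
    h, w = img.shape
    x0 = int(np.clip(np.floor(x), 0, w - 1))
    x1 = int(np.clip(x0 + 1, 0, w - 1))
    y0 = int(np.clip(np.floor(y), 0, h - 1))
    y1 = int(np.clip(y0 + 1, 0, h - 1))
    # each corner is weighted by the area of the opposite sub-rectangle
    wa = (x1 - x) * (y1 - y)
    wb = (x1 - x) * (y - y0)
    wc = (x - x0) * (y1 - y)
    wd = (x - x0) * (y - y0)
    return wa * img[y0, x0] + wb * img[y1, x0] + wc * img[y0, x1] + wd * img[y1, x1]

img = np.arange(9, dtype=float).reshape(3, 3)
```

At integer coordinates the sample reduces to the pixel itself; at the midpoint
of four pixels it is their mean.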
def LinesIntersectionForLargestBox(trans_rpoints, rect_pts, theta):
def line(p1, p2):
A = (p1[1] - p2[1])
B = (p2[0] - p1[0])
C = (p1[0]*p2[1] - p2[0]*p1[1])
return A, B, -C
def intersection(L1, L2):
D = L1[0] * L2[1] - L1[1] * L2[0]
Dx = L1[2] * L2[1] - L1[1] * L2[2]
Dy = L1[0] * L2[2] - L1[2] * L2[0]
if D != 0:
x = Dx / D
y = Dy / D
return x,y
else:
return False
rxymin = trans_rpoints.min(1)
rxymax = trans_rpoints.max(1)
widN = rxymax[0] - rxymin[0]
higN = rxymax[1] - rxymin[1]
rxyminOrig = rect_pts.min(1)
rxymaxOrig = rect_pts.max(1)
wid = rxymaxOrig[0] - rxyminOrig[0]
hig = rxymaxOrig[1] - rxyminOrig[1]
# positive angle smaller width
if wid<=hig and theta>=0:
L1 = line([trans_rpoints[0][0], trans_rpoints[1][0]], [trans_rpoints[0][2], trans_rpoints[1][2]])
L2 = line([trans_rpoints[0][1], trans_rpoints[1][1]], [trans_rpoints[0][3], trans_rpoints[1][3]])
L3 = line([rxymin[0], rxymin[1]], [rxymax[0], rxymax[1]])
#print('Condition 01 executed')
# positive angle greater width
elif wid>hig and theta>=0:
L1 = line([trans_rpoints[0][0], trans_rpoints[1][0]], [trans_rpoints[0][1], trans_rpoints[1][1]])
L2 = line([trans_rpoints[0][2], trans_rpoints[1][2]], [trans_rpoints[0][3], trans_rpoints[1][3]])
L3 = line([rxymax[0], rxymin[1]], [rxymin[0], rxymax[1]])
#print('Condition 02 executed')
# negative angle greater width
elif wid>hig and theta<0:# and (widOrig>higOrig):
L1 = line([trans_rpoints[0][0], trans_rpoints[1][0]], [trans_rpoints[0][1], trans_rpoints[1][1]])
L2 = line([trans_rpoints[0][2], trans_rpoints[1][2]], [trans_rpoints[0][3], trans_rpoints[1][3]])
L3 = line([rxymin[0], rxymin[1]], [rxymax[0], rxymax[1]])
#print('Condition 03 executed')
# negative angle smaller width
elif (wid<=hig and theta<0): #or (widOrig<=higOrig):
L1 = line([trans_rpoints[0][0], trans_rpoints[1][0]], [trans_rpoints[0][2], trans_rpoints[1][2]])
L2 = line([trans_rpoints[0][1], trans_rpoints[1][1]], [trans_rpoints[0][3], trans_rpoints[1][3]])
L3 = line([rxymax[0], rxymin[1]], [rxymin[0], rxymax[1]])
#print('Condition 04 executed')
if L1 and L3:
intSec1 = intersection(L1, L3)
if L2 and L3:
intSec2 = intersection(L2, L3)
if not intSec1 or not intSec2:
intSec1 = [0, 0]
intSec2 = [widN, higN]
return intSec1, intSec2
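# The nested `line`/`intersection` helpers above store each line in implicit
# form A*x + B*y = C and intersect pairs with Cramer's rule. A standalone
# sketch with the same sign convention (`line_coeffs` and `intersect` are
# hypothetical free-function names, for illustration only):

```python
def line_coeffs(p1, p2):
    """Return (A, B, C) with A*x + B*y = C passing through p1 and p2."""
    a = p1[1] - p2[1]
    b = p2[0] - p1[0]
    c = -(p1[0] * p2[1] - p2[0] * p1[1])
    return a, b, c

def intersect(l1, l2):
    """Cramer's rule; returns None for parallel (or identical) lines."""
    d = l1[0] * l2[1] - l1[1] * l2[0]
    if d == 0:
        return None
    dx = l1[2] * l2[1] - l1[1] * l2[2]
    dy = l1[0] * l2[2] - l1[2] * l2[0]
    return dx / d, dy / d
```

For example, the diagonal through (0, 0) and (1, 1) meets the vertical line
x = 1 at (1, 1), while two segments of the same diagonal yield no unique
intersection.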
def transformer_layer_fMap(input_fmap, angle, rpn_boxes, out_dims=None, **kwargs):
B = np.shape(input_fmap)[0]
H = np.shape(input_fmap)[1]
W = np.shape(input_fmap)[2]
C = np.shape(input_fmap)[3]
#cntr = np.asarray([(box[1]+box[3])/2, (box[0]+box[2])/2])
cntr = np.asarray([H/2, W/2])
#print('Original RPN :', box)
T1 = [[1,0,0],[0,1,0],[-cntr[1],-cntr[0],1]]
T2 = np.asarray([[np.cos(np.deg2rad(int(angle))), np.sin(np.deg2rad(int(angle))), 0],[-np.sin(np.deg2rad(int(angle))),np.cos(np.deg2rad(int(angle))),0],[0,0,1]])
T3 = [[1,0,0],[0,1,0],[cntr[1],cntr[0],1]]
T = np.dot(np.transpose(T3),np.dot(np.transpose(T2),np.transpose(T1)))
#T = np.dot(np.transpose(T3),np.dot(np.transpose(T2),T1))
corner_pts = [[0,W-1,0,W-1],[0,0,H-1,H-1],[1,1,1,1]]
#print('corner_pts', corner_pts)
trans_cpoints = np.dot(T[0:2,:],corner_pts)
xymin = trans_cpoints.min(1)
xymax = trans_cpoints.max(1)
out_H = np.int32(xymax[1] - xymin[1] + 1)
out_W = np.int32(xymax[0] - xymin[0] + 1)
#print('out_W', out_W, out_H, W,H)
out_fmap_size = [0,0,out_W, out_H]
T4 = [[1,0,0],[0,1,0],[-xymin[0], -xymin[1],1]];
#print T
#print T4
T_final = np.dot(np.transpose(T4),T);
#print T_final
tr_rotated_box_all = []
for idx in range(0, len(rpn_boxes)):
box = rpn_boxes[idx,1:]/16
#print('box', box)
rect_pts = [ [box[0],box[2], box[0], box[2]] ,[box[1],box[1],box[3],box[3]],[1,1,1,1]]
trans_rpoints = np.dot(T_final[0:2,:],rect_pts)
rxymin = trans_rpoints.min(1)
rxymax = trans_rpoints.max(1)
cropped_box = [np.int32(np.floor(rxymin[0])), np.int32(np.floor(rxymin[1])), np.int32(np.floor(rxymax[0])), np.int32(np.floor(rxymax[1]))]
#cropped_box = [rxymin[0],rxymin[1],rxymax[0],rxymax[1]]
# find coordinates for maximum area inscribed rectangle
intSec1, intSec2 = LinesIntersectionForLargestBox(trans_rpoints, np.array(rect_pts), angle)
height_deltas = [intSec1[1]-cropped_box[1], cropped_box[3]-intSec2[1]]
#print('height_deltas: ', height_deltas)
tr_rotated_box = [rxymin[0], rxymin[1]+height_deltas[0], rxymax[0], rxymax[1]-height_deltas[1]]
#print 'tr_cropped_box: ', tr_cropped_box
tr_rotated_box = [ik * 16 for ik in tr_rotated_box]
tr_rotated_box_all.append(tr_rotated_box)
batch_grids = affine_grid_generator(out_H, out_W, T_final)
x_s = batch_grids[:,0, :, :]
y_s = batch_grids[:,1, :, :]
out_fmap = bilinear_sampler(input_fmap, x_s, y_s)
#print 'out_fmap ', out_fmap.shape
#print 'input_fmap ', input_fmap.shape
#tr_rotated_box_all = np.array(tr_rotated_box_all)
return out_fmap, T_final, tr_rotated_box_all
def transformer_layer_fMapSep(input_fmap, orient_scores, rpn_boxes, out_dims=None, **kwargs):
theta = [0, 90, 135, 45, 157.5, 112.5, 67.5, 22.5]
B = np.shape(input_fmap)[0]
H = np.shape(input_fmap)[1]
W = np.shape(input_fmap)[2]
C = np.shape(input_fmap)[3]
print('widHig', B, H, W, C)
outMap = np.zeros((len(rpn_boxes), 72,72, input_fmap.shape[3]), dtype = float)
#outMap = np.zeros((len(rpn_boxes), 50,50, input_fmap.shape[3]))
tr_rotated_box_all = []
transApplied = []
#ang1 = np.array(np.argmax(orient_scores, axis = 1))
#print('Im here :', 1*(ang1==0), 1*(ang1==1))
#idx0 = np.where(np.logical_or((ang1 == 0)*1,(ang1 == 1)*1))[0]
#print(idx0)
for idx in range(0, len(rpn_boxes)):
transCurrent = []
angle = theta[np.argmax(orient_scores[idx, :], axis = 0)]
if angle==0 or angle==90 :
#print ("input_fmap.shape",input_fmap.shape)
outMap[idx, 0:input_fmap.shape[1], 0:input_fmap.shape[2], 0:input_fmap.shape[3]] = input_fmap
#print rpn_boxes[idx,1:5], [idx]+[rpn_boxes[idx,1],rpn_boxes[idx,2], rpn_boxes[idx,3], rpn_boxes[idx,4]]
tr_rotated_box_all.append([idx]+[rpn_boxes[idx,1],rpn_boxes[idx,2], rpn_boxes[idx,3], rpn_boxes[idx,4]])
T11 = [[1,0,0],[0,1,0],[0,0,1]]
transCurrent.append(T11)
transCurrent.append(T11)
transApplied.append(transCurrent)
box = rpn_boxes[idx,1:5]/16
sz = [box[3]-box[1], box[2]-box[0]]
cntr = np.asarray([(box[1]+box[3])/2, (box[0]+box[2])/2])
T1 = [[1,0,0],[0,1,0],[-cntr[1],-cntr[0],1]]
T2 = np.asarray([[np.cos(np.deg2rad(int(angle))), np.sin(np.deg2rad(int(angle))), 0],[-np.sin(np.deg2rad(int(angle))),np.cos(np.deg2rad(int(angle))),0],[0,0,1]])
T3 = [[1,0,0],[0,1,0],[cntr[1],cntr[0],1]]
T = np.dot(np.transpose(T3),np.dot(np.transpose(T2),np.transpose(T1)))
corner_pts = [[0,W-1,0,W-1],[0,0,H-1,H-1],[1,1,1,1]]
trans_cpoints = np.dot(T[0:2,:],corner_pts)
xymin = trans_cpoints.min(1)
xymax = trans_cpoints.max(1)
out_H = np.int32(xymax[1] - xymin[1] + 1)
out_W = np.int32(xymax[0] - xymin[0] + 1)
out_fmap_size = [0,0,out_W, out_H]
T4 = [[1,0,0],[0,1,0],[-xymin[0], -xymin[1],1]];
T_final = np.dot(np.transpose(T4),T);
else:
if angle>90:
angle = angle-180
box = rpn_boxes[idx,1:5]/16
#print('angle', angle)
sz = [box[3]-box[1], box[2]-box[0]]
cntr = np.asarray([(box[1]+box[3])/2, (box[0]+box[2])/2])
#print('Original RPN :', box)
T1 = [[1,0,0],[0,1,0],[-cntr[1],-cntr[0],1]]
T2 = np.asarray([[np.cos(np.deg2rad(int(angle))), np.sin(np.deg2rad(int(angle))), 0],[-np.sin(np.deg2rad(int(angle))),np.cos(np.deg2rad(int(angle))),0],[0,0,1]])
T3 = [[1,0,0],[0,1,0],[cntr[1],cntr[0],1]]
T = np.dot(np.transpose(T3),np.dot(np.transpose(T2),np.transpose(T1)))
#T = np.dot(np.transpose(T3),np.dot(np.transpose(T2),T1))
corner_pts = [[0,W-1,0,W-1],[0,0,H-1,H-1],[1,1,1,1]]
#print('corner_pts', corner_pts)
trans_cpoints = np.dot(T[0:2,:],corner_pts)
xymin = trans_cpoints.min(1)
xymax = trans_cpoints.max(1)
out_H = np.int32(xymax[1] - xymin[1] + 1)
out_W = np.int32(xymax[0] - xymin[0] + 1)
#print('out_W', out_W, out_H, W,H)
out_fmap_size = [0,0,out_W, out_H]
T4 = [[1,0,0],[0,1,0],[-xymin[0], -xymin[1],1]];
#print T
#print T4
T_final = np.dot(np.transpose(T4),T);
#print T_final
rect_pts = [ [box[0],box[2], box[0], box[2]] ,[box[1],box[1],box[3],box[3]],[1,1,1,1]]
trans_rpoints = np.dot(T_final[0:2,:],rect_pts)
rxymin = trans_rpoints.min(1)
rxymax = trans_rpoints.max(1)
cropped_box = [np.int32(np.floor(rxymin[0])), np.int32(np.floor(rxymin[1])), np.int32(np.floor(rxymax[0])), np.int32(np.floor(rxymax[1]))]
#print('cropped_box :', cropped_box)
#print('trans_rpoints :', trans_rpoints)
# find coordinates for maximum area inscribed rectangle
intSec1, intSec2 = LinesIntersectionForLargestBox(trans_rpoints, np.array(rect_pts), angle)
height_deltas = [intSec1[1]-cropped_box[1], cropped_box[3]-intSec2[1]]
#print('height_deltas: ', height_deltas)
T11 = [[1,0,0],[0,1,0],[-rxymin[0],-rxymin[1],1]]
T11 = np.transpose(T11)
rect_pts1 = [trans_rpoints[0], trans_rpoints[1],[1,1,1,1]]
trans_rpoints = np.dot(T11[0:2,:],rect_pts1)
rxymin1 = trans_rpoints.min(1)
rxymax1 = trans_rpoints.max(1)
#print ('trans_rpoints00 : ', trans_rpoints)
tr_rotated_box = [rxymin1[0], rxymin1[1]+height_deltas[0], rxymax1[0], rxymax1[1]-height_deltas[1]]
#print ('box : ', box)
#print ('tr_rotated_box : ', tr_rotated_box)
tr_rotated_box = [ik * 16 for ik in tr_rotated_box]
#ross = [[0]+ il for il in rotated_rpns]
tr_rotated_box_all.append([idx]+tr_rotated_box)
transCurrent.append(T_final)
transCurrent.append(T11)
batch_grids = affine_grid_generator(out_H, out_W, T_final)
x_s = batch_grids[:,0, :, :]
y_s = batch_grids[:,1, :, :]
out_fmap = bilinear_sampler_Interpol(input_fmap.copy(), x_s, y_s)
if cropped_box[0] < 0:
cropped_box[2] = int(cropped_box[2] - cropped_box[0])
cropped_box[0] = int(cropped_box[0] - cropped_box[0])
if cropped_box[1] < 0:
cropped_box[3] = int(cropped_box[3] - cropped_box[1])
cropped_box[1] = int(cropped_box[1] - cropped_box[1])
f_map = out_fmap[:, cropped_box[1]:cropped_box[3] , cropped_box[0]:cropped_box[2] ,:]
#print('output_fmap', (f_map[0,:,:,:]).sum())
outMap[idx, 0:f_map.shape[1], 0:f_map.shape[2], 0:f_map.shape[3]] = f_map
#print('output_fmap1', (outMap[idx,:,:,:]).sum())
#tr_rotated_box_all = np.array(tr_rotated_box_all)
#print('featureMap size : ', outMap.shape)
transApplied.append(transCurrent)
return outMap, tr_rotated_box_all, transApplied, T_final
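# transformer_layer_fMapSep maps predicted orientations above 90 degrees into
# (-90, 90] before building the rotation (`angle = angle - 180`). A sketch of
# that canonicalisation applied to the theta table (`canonical_angle` is a
# hypothetical helper, for illustration only):

```python
def canonical_angle(angle):
    """Fold an orientation in [0, 180) into the (-90, 90] range used above."""
    return angle - 180 if angle > 90 else angle

theta = [0, 90, 135, 45, 157.5, 112.5, 67.5, 22.5]
canon = [canonical_angle(a) for a in theta]
```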
###### backward ######
#def transformer_layer_fMapSep_backward(input_fmap, orient_scores, rpn_boxes, out_dims=None, **kwargs):
def transformer_layer_fMapSep_backward(input_grad, orient_scores, in_rpn_boxes, out_dims=None, **kwargs):
theta = [0, 90, 135, 45, 157.5, 112.5, 67.5, 22.5]
B = np.shape(input_grad)[0]
H = np.shape(input_grad)[1]
W = np.shape(input_grad)[2]
C = np.shape(input_grad)[3]
print('widHig', B, H, W, C)
outMap = np.zeros((len(in_rpn_boxes), 102,102, input_grad.shape[3]), dtype = float)
#outMap = np.zeros((len(rpn_boxes), 50,50, input_fmap.shape[3]))
tr_rotated_box_all = []
transApplied = []
#ang1 = np.array(np.argmax(orient_scores, axis = 1))
#print('Im here :', 1*(ang1==0), 1*(ang1==1))
#idx0 = np.where(np.logical_or((ang1 == 0)*1,(ang1 == 1)*1))[0]
#print("len",len(in_rpn_boxes))
for idx in range(0, len(in_rpn_boxes)):
transCurrent = []
angle = theta[np.argmax(orient_scores[idx, :], axis = 0)]
#print ("angle", angle)
if angle==0 or angle==90 :
#print ("input_grad.shape",input_grad.shape)
outMap[idx, 0:input_grad.shape[1], 0:input_grad.shape[2], 0:input_grad.shape[3]] = input_grad[idx, :, :, :]
#print rpn_boxes[idx,1:5], [idx]+[rpn_boxes[idx,1],rpn_boxes[idx,2], rpn_boxes[idx,3], rpn_boxes[idx,4]]
tr_rotated_box_all.append([idx]+[in_rpn_boxes[idx,1],in_rpn_boxes[idx,2], in_rpn_boxes[idx,3], in_rpn_boxes[idx,4]])
T11 = [[1,0,0],[0,1,0],[0,0,1]]
transCurrent.append(T11)
transCurrent.append(T11)
transApplied.append(transCurrent)
box = in_rpn_boxes[idx,1:5]/16
sz = [box[3]-box[1], box[2]-box[0]]
cntr = np.asarray([(box[1]+box[3])/2, (box[0]+box[2])/2])
T1 = [[1,0,0],[0,1,0],[-cntr[1],-cntr[0],1]]
T2 = np.asarray([[np.cos(np.deg2rad(int(angle))), np.sin(np.deg2rad(int(angle))), 0],[-np.sin(np.deg2rad(int(angle))),np.cos(np.deg2rad(int(angle))),0],[0,0,1]])
T3 = [[1,0,0],[0,1,0],[cntr[1],cntr[0],1]]
T = np.dot(np.transpose(T3),np.dot(np.transpose(T2),np.transpose(T1)))
corner_pts = [[0,W-1,0,W-1],[0,0,H-1,H-1],[1,1,1,1]]
trans_cpoints = np.dot(T[0:2,:],corner_pts)
xymin = trans_cpoints.min(1)
xymax = trans_cpoints.max(1)
out_H = np.int32(xymax[1] - xymin[1] + 1)
out_W = np.int32(xymax[0] - xymin[0] + 1)
out_fmap_size = [0,0,out_W, out_H]
T4 = [[1,0,0],[0,1,0],[-xymin[0], -xymin[1],1]];
T_final = np.dot(np.transpose(T4),T);
T_final_inv = inv(T_final)
else:
if angle>90:
angle = angle-180
box = in_rpn_boxes[idx,1:5]/16
#print('angle', angle)
sz = [box[3]-box[1], box[2]-box[0]]
cntr = np.asarray([(box[1]+box[3])/2, (box[0]+box[2])/2])
#print('Original RPN :', box)
T1 = [[1,0,0],[0,1,0],[-cntr[1],-cntr[0],1]]
T2 = np.asarray([[np.cos(np.deg2rad(int(angle))), np.sin(np.deg2rad(int(angle))), 0],[-np.sin(np.deg2rad(int(angle))),np.cos(np.deg2rad(int(angle))),0],[0,0,1]])
T3 = [[1,0,0],[0,1,0],[cntr[1],cntr[0],1]]
T = np.dot(np.transpose(T3),np.dot(np.transpose(T2),np.transpose(T1)))
#T = np.dot(np.transpose(T3),np.dot(np.transpose(T2),T1))
corner_pts = [[0,W-1,0,W-1],[0,0,H-1,H-1],[1,1,1,1]]
#print('corner_pts', corner_pts)
trans_cpoints = np.dot(T[0:2,:],corner_pts)
xymin = trans_cpoints.min(1)
xymax = trans_cpoints.max(1)
out_H = np.int32(xymax[1] - xymin[1] + 1)
out_W = np.int32(xymax[0] - xymin[0] + 1)
#print('out_W', out_W, out_H, W,H)
out_fmap_size = [0,0,out_W, out_H]
T4 = [[1,0,0],[0,1,0],[-xymin[0], -xymin[1],1]];
#print T
#print T4
T_final = np.dot(np.transpose(T4),T);
T_final_inv = inv(T_final)
#print T_final
rect_pts = [ [box[0],box[2], box[0], box[2]] ,[box[1],box[1],box[3],box[3]],[1,1,1,1]]
trans_rpoints = np.dot(T_final_inv[0:2,:],rect_pts)
rxymin = trans_rpoints.min(1)
rxymax = trans_rpoints.max(1)
cropped_box = [np.int32(np.floor(rxymin[0])), np.int32(np.floor(rxymin[1])), np.int32(np.floor(rxymax[0])), np.int32(np.floor(rxymax[1]))]
#print('cropped_box :', cropped_box)
#print('trans_rpoints :', trans_rpoints)
# find coordinates for maximum area inscribed rectangle
#intSec1, intSec2 = LinesIntersectionForLargestBox(trans_rpoints, np.array(rect_pts), angle)
#height_deltas = [intSec1[1]-cropped_box[1], cropped_box[3]-intSec2[1]]
#print('height_deltas: ', height_deltas)
T11 = [[1,0,0],[0,1,0],[-rxymin[0],-rxymin[1],1]]
T11 = np.transpose(T11)
rect_pts1 = [trans_rpoints[0], trans_rpoints[1],[1,1,1,1]]
trans_rpoints = np.dot(T11[0:2,:],rect_pts1)
rxymin1 = trans_rpoints.min(1)
rxymax1 = trans_rpoints.max(1)
#print ('trans_rpoints00 : ', trans_rpoints)
#tr_rotated_box = [rxymin1[0], rxymin1[1]+height_deltas[0], rxymax1[0], rxymax1[1]-height_deltas[1]]
tr_rotated_box = [rxymin1[0], rxymin1[1], rxymax1[0], rxymax1[1]]
#print ('box : ', box)
#print ('tr_rotated_box : ', tr_rotated_box)
tr_rotated_box = [ik * 16 for ik in tr_rotated_box]
#ross = [[0]+ il for il in rotated_rpns]
tr_rotated_box_all.append([idx]+tr_rotated_box)
transCurrent.append(T_final_inv)
transCurrent.append(T11)
batch_grids = affine_grid_generator(out_H, out_W, T_final_inv)
x_s = batch_grids[:,0, :, :]
y_s = batch_grids[:,1, :, :]
tup = np.reshape(input_grad[idx, :, :, :], (1,np.shape(input_grad)[1],np.shape(input_grad)[2],np.shape(input_grad)[3]))
out_fmap = bilinear_sampler_Interpol(tup.copy(), x_s, y_s)
#print ("tup.shape",tup.shape)
#print (xyz)
if cropped_box[0] < 0:
cropped_box[2] = int(cropped_box[2] - cropped_box[0])
cropped_box[0] = int(cropped_box[0] - cropped_box[0])
if cropped_box[1] < 0:
cropped_box[3] = int(cropped_box[3] - cropped_box[1])
cropped_box[1] = int(cropped_box[1] - cropped_box[1])
f_map = out_fmap[:, cropped_box[1]:cropped_box[3] , cropped_box[0]:cropped_box[2] ,:]
#print('output_fmap', (f_map[0,:,:,:]).sum())
outMap[idx, 0:f_map.shape[1], 0:f_map.shape[2], 0:f_map.shape[3]] = f_map
#print('output_fmap1', (outMap[idx,:,:,:]).sum())
#tr_rotated_box_all = np.array(tr_rotated_box_all)
#print('featureMap size : ', outMap.shape)
transApplied.append(transCurrent)
return outMap, tr_rotated_box_all, transApplied
# -*- coding: utf-8 -*-
"""
Test for feedback API endpoints
"""
import logging
from clarifai.rest import ClarifaiApp
from clarifai.rest import FeedbackInfo, FeedbackType
from clarifai.rest import Region, RegionInfo, BoundingBox, Concept
from clarifai.rest import Face, FaceAgeAppearance, FaceIdentity, FaceGenderAppearance, \
FaceMulticulturalAppearance
import unittest
class TestFeedback(unittest.TestCase):
_multiprocess_can_split_ = True
to_cleanup = []
@classmethod
def setUpClass(cls):
cls.app = ClarifaiApp(log_level=logging.WARN)
def test_send_concept_feedback(self):
""" send concept feedback to models """
feedback_info = FeedbackInfo(end_user_id='robert_python_test_key', session_id='sessionSS',
event_type='annotation',
output_id='oooooooid')
m = self.app.models.get('general-v1.3')
ret = m.send_concept_feedback(input_id='bb', url='https://samples.clarifai.com/dog2.jpeg',
concepts=['dog', 'puppy'], not_concepts=['cat', 'tiger'],
feedback_info=feedback_info)
self.assertEqual(ret['status']['code'], 10000)
ret = m.send_concept_feedback(input_id='bb', url='https://samples.clarifai.com/dog2.jpeg',
concepts=['dog', 'puppy'],
feedback_info=feedback_info)
self.assertEqual(ret['status']['code'], 10000)
ret = m.send_concept_feedback(input_id='bb', url='https://samples.clarifai.com/dog2.jpeg',
not_concepts=['cat'],
feedback_info=feedback_info)
self.assertEqual(ret['status']['code'], 10000)
def test_send_region_feedback(self):
""" send region feedback with concepts """
feedback_info = FeedbackInfo(end_user_id='robert_python_test_key',
session_id='from_your_browser', event_type='annotation',
output_id='oooooooid')
m = self.app.models.get('general-v1.3')
concepts = [Concept(concept_id='ab', value=False)]
regions = [
Region(RegionInfo(bbox=BoundingBox(0.3, 0.2, 0.7, 0.8), feedback_type='accurate'),
concepts=concepts)]
ret = m.send_region_feedback(input_id='xyz', url='https://samples.clarifai.com/dog.tiff',
concepts=['matid2'], not_concepts=['lambo'],
regions=regions, feedback_info=feedback_info)
self.assertEqual(ret['status']['code'], 10000)
concepts = [Concept(concept_id='ab', value=False),
Concept(concept_id='bb', value=True)]
regions = [Region(RegionInfo(bbox=BoundingBox(0.3, 0.2, 0.7, 0.8)), concepts=concepts),
Region(RegionInfo(bbox=BoundingBox(0.1, 0.2, 0.8, 0.85)), concepts=concepts)]
ret = m.send_region_feedback(input_id='xyz', url='https://samples.clarifai.com/dog.tiff',
concepts=['matid2'], not_concepts=['lambo'],
regions=regions, feedback_info=feedback_info)
self.assertEqual(ret['status']['code'], 10000)
def test_send_different_region_feedbacks(self):
""" send different kind of region feedbacks
- accurate
- misplaced
- not_detected
- false_positive
"""
feedback_info = FeedbackInfo(end_user_id='robert_python_test_key',
session_id='from_your_browser', event_type='annotation',
output_id='oooooooid')
m = self.app.models.get('general-v1.3')
# positive
concepts = [Concept(concept_id='ab', value=False)]
regions = [
Region(RegionInfo(bbox=BoundingBox(0.3, 0.2, 0.7, 0.8), feedback_type='accurate'),
concepts=concepts)]
ret = m.send_region_feedback(input_id='xyz', url='https://samples.clarifai.com/dog.tiff',
concepts=['matid2'], not_concepts=['lambo'],
regions=regions, feedback_info=feedback_info)
self.assertEqual(ret['status']['code'], 10000)
# misplaced
regions = [
Region(RegionInfo(bbox=BoundingBox(0.3, 0.2, 0.8, 0.9), feedback_type='misplaced'),
concepts=concepts)]
ret = m.send_region_feedback(input_id='xyz', url='https://samples.clarifai.com/dog.tiff',
concepts=['matid2'], not_concepts=['lambo'],
regions=regions, feedback_info=feedback_info)
self.assertEqual(ret['status']['code'], 10000)
    # not_detected
regions = [
Region(RegionInfo(bbox=BoundingBox(0.1, 0.2, 0.4, 0.5), feedback_type='not_detected'),
concepts=concepts)]
ret = m.send_region_feedback(input_id='xyz', url='https://samples.clarifai.com/dog.tiff',
concepts=['matid2'], not_concepts=['lambo'],
regions=regions, feedback_info=feedback_info)
self.assertEqual(ret['status']['code'], 10000)
# false_positive
regions = [
Region(RegionInfo(bbox=BoundingBox(0.1, 0.2, 0.4, 0.5), feedback_type='false_positive'),
concepts=concepts)]
ret = m.send_region_feedback(input_id='xyz', url='https://samples.clarifai.com/dog.tiff',
concepts=['matid2'], not_concepts=['lambo'],
regions=regions, feedback_info=feedback_info)
self.assertEqual(ret['status']['code'], 10000)
    def test_send_different_region_feedbacks_with_enum_vars(self):
        """ send different kinds of region feedback with enums
            - accurate
            - misplaced
            - not_detected
            - false_positive
        """
        feedback_info = FeedbackInfo(end_user_id='robert_python_test_key',
                                     session_id='from_your_browser', event_type='annotation',
                                     output_id='oooooooid')
        m = self.app.models.get('general-v1.3')
        # accurate
        concepts = [Concept(concept_id='ab', value=False)]
        regions = [Region(
            RegionInfo(bbox=BoundingBox(0.3, 0.2, 0.7, 0.8), feedback_type=FeedbackType.accurate),
            concepts=concepts)]
        ret = m.send_region_feedback(input_id='xyz', url='https://samples.clarifai.com/dog.tiff',
                                     concepts=['matid2'], not_concepts=['lambo'],
                                     regions=regions, feedback_info=feedback_info)
        self.assertEqual(ret['status']['code'], 10000)
        # misplaced
        regions = [Region(
            RegionInfo(bbox=BoundingBox(0.3, 0.2, 0.8, 0.9), feedback_type=FeedbackType.misplaced),
            concepts=concepts)]
        ret = m.send_region_feedback(input_id='xyz', url='https://samples.clarifai.com/dog.tiff',
                                     concepts=['matid2'], not_concepts=['lambo'],
                                     regions=regions, feedback_info=feedback_info)
        self.assertEqual(ret['status']['code'], 10000)
        # not_detected
        regions = [Region(RegionInfo(bbox=BoundingBox(0.1, 0.2, 0.4, 0.5),
                                     feedback_type=FeedbackType.not_detected), concepts=concepts)]
        ret = m.send_region_feedback(input_id='xyz', url='https://samples.clarifai.com/dog.tiff',
                                     concepts=['matid2'], not_concepts=['lambo'],
                                     regions=regions, feedback_info=feedback_info)
        self.assertEqual(ret['status']['code'], 10000)
        # false_positive
        regions = [Region(RegionInfo(bbox=BoundingBox(0.1, 0.2, 0.4, 0.5),
                                     feedback_type=FeedbackType.false_positive), concepts=concepts)]
        ret = m.send_region_feedback(input_id='xyz', url='https://samples.clarifai.com/dog.tiff',
                                     concepts=['matid2'], not_concepts=['lambo'],
                                     regions=regions, feedback_info=feedback_info)
        self.assertEqual(ret['status']['code'], 10000)
    def test_send_age_feedback(self):
        """ send face age-appearance feedback """
        feedback_info = FeedbackInfo(end_user_id='robert_python_test_key',
                                     session_id='from_your_browser', event_type='annotation',
                                     output_id='oooooooid')
        m = self.app.models.get('general-v1.3')
        identities = [Concept(concept_id='xx', value=True), Concept(concept_id='x2', value=False)]
        ages = [Concept(concept_id='1', value=True)]
        face = Face(identity=FaceIdentity(identities), age_appearance=FaceAgeAppearance(ages))
        regions = [Region(RegionInfo(bbox=BoundingBox(0.3, 0.2, 0.7, 0.8)), face=face)]
        ret = m.send_region_feedback(input_id='xyz', url='https://samples.clarifai.com/dog.tiff',
                                     regions=regions, feedback_info=feedback_info)
        self.assertEqual(ret['status']['code'], 10000)
        # send multiple items
        identities = [Concept(concept_id='xx', value=True), Concept(concept_id='x2', value=False),
                      Concept(concept_id='x4', value=True)]
        ages = [Concept(concept_id='1', value=True), Concept(concept_id='2', value=False)]
        face = Face(identity=FaceIdentity(identities), age_appearance=FaceAgeAppearance(ages))
        regions = [Region(RegionInfo(bbox=BoundingBox(0.3, 0.2, 0.7, 0.8)), face=face)]
        ret = m.send_region_feedback(input_id='xyz', url='https://samples.clarifai.com/dog.tiff',
                                     regions=regions, feedback_info=feedback_info)
        self.assertEqual(ret['status']['code'], 10000)
    def test_send_gender_feedback(self):
        """ send face gender-appearance feedback """
        feedback_info = FeedbackInfo(end_user_id='robert_python_test_key',
                                     session_id='from_your_browser', event_type='annotation',
                                     output_id='oooooooid')
        m = self.app.models.get('general-v1.3')
        identities = [Concept(concept_id='xx', value=True), Concept(concept_id='x2', value=False)]
        genders = [Concept(concept_id='male', value=True),
                   Concept(concept_id='female', value=False)]
        face = Face(identity=FaceIdentity(identities),
                    gender_appearance=FaceGenderAppearance(genders))
        regions = [Region(RegionInfo(bbox=BoundingBox(0.3, 0.2, 0.7, 0.8)), face=face)]
        ret = m.send_region_feedback(input_id='xyz', url='https://samples.clarifai.com/dog.tiff',
                                     regions=regions, feedback_info=feedback_info)
        self.assertEqual(ret['status']['code'], 10000)
    def test_send_multicultural_feedback(self):
        """ send face multicultural-appearance feedback """
        feedback_info = FeedbackInfo(end_user_id='robert_python_test_key',
                                     session_id='from_your_browser', event_type='annotation',
                                     output_id='oooooooid')
        m = self.app.models.get('general-v1.3')
        identities = [Concept(concept_id='xx', value=True), Concept(concept_id='x2', value=False)]
        cultures = [Concept(concept_id='american', value=True),
                    Concept(concept_id='asian', value=False)]
        face = Face(identity=FaceIdentity(identities),
                    multicultural_appearance=FaceMulticulturalAppearance(cultures))
        regions = [Region(RegionInfo(bbox=BoundingBox(0.3, 0.2, 0.7, 0.8)), face=face)]
        ret = m.send_region_feedback(input_id='xyz', url='https://samples.clarifai.com/dog.tiff',
                                     regions=regions, feedback_info=feedback_info)
        self.assertEqual(ret['status']['code'], 10000)
    def test_send_complete_multicultural_feedback(self):
        """ send combined face feedback (age, gender, and multicultural appearance) """
        feedback_info = FeedbackInfo(end_user_id='robert_python_test_key',
                                     session_id='from_your_browser', event_type='annotation',
                                     output_id='oooooooid')
        m = self.app.models.get('general-v1.3')
        identities = [Concept(concept_id='xx', value=True), Concept(concept_id='x2', value=False)]
        ages = [Concept(concept_id='1', value=True), Concept(concept_id='2', value=False)]
        genders = [Concept(concept_id='male', value=True),
                   Concept(concept_id='female', value=False)]
        cultures = [Concept(concept_id='american', value=True),
                    Concept(concept_id='asian', value=False)]
        face = Face(identity=FaceIdentity(identities),
                    age_appearance=FaceAgeAppearance(ages),
                    gender_appearance=FaceGenderAppearance(genders),
                    multicultural_appearance=FaceMulticulturalAppearance(cultures))
        regions = [Region(RegionInfo(bbox=BoundingBox(0.3, 0.2, 0.7, 0.8)), face=face)]
        ret = m.send_region_feedback(input_id='xyz', url='https://samples.clarifai.com/dog.tiff',
                                     regions=regions, feedback_info=feedback_info)
        self.assertEqual(ret['status']['code'], 10000)
    def test_send_search_feedback(self):
        """ send search feedback """
        feedback_info = FeedbackInfo(end_user_id='robert_python_test_key',
                                     session_id='from_your_browser', event_type='search_click',
                                     search_id='searchID')
        ret = self.app.inputs.send_search_feedback(input_id='xyz', feedback_info=feedback_info)
        self.assertEqual(ret['status']['code'], 10000)


if __name__ == '__main__':
    unittest.main()
# nnc/helpers/torch_utils/nn_architectures/fully_connected.py
import torch
from torch.nn import Linear
from nnc.helpers.torch_utils.indexing import multi_unsqueeze
class StackedDenseTimeControl(torch.nn.Module):
    """
    Very simple ELU architecture for control of linear systems.
    """

    def __init__(self, n_nodes, n_drivers, n_hidden=1, hidden_size=10,
                 activation=torch.nn.functional.elu, use_bias=True):
        super().__init__()
        self.input_layer = torch.nn.Linear(1, hidden_size, bias=use_bias)
        hidden_layers = []
        for i in range(n_hidden):
            hidden_layers.append(torch.nn.Linear(hidden_size, hidden_size, bias=use_bias))
        self.hidden_layers = torch.nn.ModuleList(hidden_layers)
        self.output_layer = torch.nn.Linear(hidden_size, n_drivers, bias=use_bias)
        self.activation = activation

    def forward(self, t, x):
        """
        :param t: a scalar or a batch of scalars
        :param x: input states for all nodes
        :return: control signal for the driver nodes
        """
        tshape = len(t.shape)
        if tshape < 2:
            t = multi_unsqueeze(t, 2 - tshape)
        u = self.input_layer(t)
        u = self.activation(u)
        for module in self.hidden_layers:
            u = module(u)
            u = self.activation(u)
        u = self.output_layer(u)
        return u


class StackedDenseFeedbackControl(torch.nn.Module):
    """
    Very simple ELU architecture for control of systems with feedback.
    """

    def __init__(self, n_nodes, n_drivers, n_hidden=1, hidden_size=10,
                 activation=torch.nn.functional.elu, use_bias=True):
        super().__init__()
        self.input_layer = torch.nn.Linear(n_nodes, hidden_size, bias=use_bias)
        hidden_layers = []
        for i in range(n_hidden):
            hidden_layers.append(torch.nn.Linear(hidden_size, hidden_size, bias=use_bias))
        self.hidden_layers = torch.nn.ModuleList(hidden_layers)
        self.output_layer = torch.nn.Linear(hidden_size, n_drivers, bias=use_bias)
        self.activation = activation

    def forward(self, t, x):
        """
        :param t: a scalar or a batch of scalars (unused; kept for the ODE-solver interface)
        :param x: input states for all nodes
        :return: control signal for the driver nodes
        """
        u = self.input_layer(x)
        u = self.activation(u)
        for module in self.hidden_layers:
            u = module(u)
            u = self.activation(u)
        u = self.output_layer(u)
        return u
# tests/test_qerf.py
import pytest
from itmlogic.qerf import qerf
def test_qerf():
    assert round(qerf(-1), 4) == 0.8413
    assert round(qerf(1), 4) == 0.1587
    assert round(qerf(10), 4) == 0
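The expected values above are those of the Gaussian Q-function, i.e. the upper-tail probability of a standard normal variable, which is what itmlogic's `qerf` approximates. A standard-library sketch of that reference relation (the `qerf_reference` name is ours, not part of itmlogic):

```python
import math

def qerf_reference(z):
    # Upper-tail probability of a standard normal variable:
    # Q(z) = 0.5 * erfc(z / sqrt(2)).
    return 0.5 * math.erfc(z / math.sqrt(2.0))

print(round(qerf_reference(-1), 4))  # 0.8413
print(round(qerf_reference(1), 4))   # 0.1587
print(round(qerf_reference(10), 4))  # 0.0
```

Note that Q(-z) = 1 - Q(z), which is why the first two expected values sum to 1.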
# terrascript/scaleway/r.py
import terrascript
class scaleway_bucket(terrascript.Resource):
    pass

class scaleway_compute_instance_ip(terrascript.Resource):
    pass

class scaleway_compute_instance_volume(terrascript.Resource):
    pass

class scaleway_compute_instance_security_group(terrascript.Resource):
    pass

class scaleway_compute_instance_server(terrascript.Resource):
    pass

class scaleway_compute_instance_placement_group(terrascript.Resource):
    pass

class scaleway_storage_object_bucket(terrascript.Resource):
    pass

class scaleway_user_data(terrascript.Resource):
    pass

class scaleway_server(terrascript.Resource):
    pass

class scaleway_token(terrascript.Resource):
    pass

class scaleway_ssh_key(terrascript.Resource):
    pass

class scaleway_ip(terrascript.Resource):
    pass

class scaleway_ip_reverse_dns(terrascript.Resource):
    pass

class scaleway_security_group(terrascript.Resource):
    pass

class scaleway_security_group_rule(terrascript.Resource):
    pass

class scaleway_volume(terrascript.Resource):
    pass

class scaleway_volume_attachment(terrascript.Resource):
    pass
# coding: utf-8
"""
tweak-api
Tweak API to integrate with all the Tweak services. You can find out more about Tweak at <a href='https://www.tweak.com'>https://www.tweak.com</a>, #tweak.
OpenAPI spec version: 1.0.8-beta.0
Generated by: https://github.com/swagger-api/swagger-codegen.git
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
from __future__ import absolute_import
import sys
import os
import re
# python 2 and python 3 compatibility library
from six import iteritems
from ..configuration import Configuration
from ..api_client import ApiClient
class ProductMaterialApi(object):
"""
NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
config = Configuration()
if api_client:
self.api_client = api_client
else:
if not config.api_client:
config.api_client = ApiClient()
self.api_client = config.api_client
def product_materials_change_stream_get(self, **kwargs):
"""
Create a change stream.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.product_materials_change_stream_get(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str options:
:return: file
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.product_materials_change_stream_get_with_http_info(**kwargs)
else:
(data) = self.product_materials_change_stream_get_with_http_info(**kwargs)
return data
def product_materials_change_stream_get_with_http_info(self, **kwargs):
"""
Create a change stream.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.product_materials_change_stream_get_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str options:
:return: file
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['options']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method product_materials_change_stream_get" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
resource_path = '/ProductMaterials/change-stream'.replace('{format}', 'json')
path_params = {}
query_params = {}
if 'options' in params:
query_params['options'] = params['options']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/xml', 'text/xml', 'application/javascript', 'text/javascript'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json', 'application/x-www-form-urlencoded', 'application/xml', 'text/xml'])
# Authentication setting
auth_settings = ['access_token']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='file',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)
def product_materials_change_stream_post(self, **kwargs):
"""
Create a change stream.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.product_materials_change_stream_post(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str options:
:return: file
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.product_materials_change_stream_post_with_http_info(**kwargs)
else:
(data) = self.product_materials_change_stream_post_with_http_info(**kwargs)
return data
def product_materials_change_stream_post_with_http_info(self, **kwargs):
"""
Create a change stream.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.product_materials_change_stream_post_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str options:
:return: file
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['options']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method product_materials_change_stream_post" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
resource_path = '/ProductMaterials/change-stream'.replace('{format}', 'json')
path_params = {}
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
if 'options' in params:
form_params.append(('options', params['options']))
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/xml', 'text/xml', 'application/javascript', 'text/javascript'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json', 'application/x-www-form-urlencoded', 'application/xml', 'text/xml'])
# Authentication setting
auth_settings = ['access_token']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='file',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)
def product_materials_count_get(self, **kwargs):
"""
Count instances of the model matched by where from the data source.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.product_materials_count_get(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str where: Criteria to match model instances
:return: InlineResponse2001
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.product_materials_count_get_with_http_info(**kwargs)
else:
(data) = self.product_materials_count_get_with_http_info(**kwargs)
return data
def product_materials_count_get_with_http_info(self, **kwargs):
"""
Count instances of the model matched by where from the data source.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.product_materials_count_get_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str where: Criteria to match model instances
:return: InlineResponse2001
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['where']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method product_materials_count_get" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
resource_path = '/ProductMaterials/count'.replace('{format}', 'json')
path_params = {}
query_params = {}
if 'where' in params:
query_params['where'] = params['where']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/xml', 'text/xml', 'application/javascript', 'text/javascript'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json', 'application/x-www-form-urlencoded', 'application/xml', 'text/xml'])
# Authentication setting
auth_settings = ['access_token']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='InlineResponse2001',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)
def product_materials_find_one_get(self, **kwargs):
"""
Find first instance of the model matched by filter from the data source.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.product_materials_find_one_get(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str filter: Filter defining fields, where, include, order, offset, and limit - must be a JSON-encoded string ({\"something\":\"value\"})
:return: ProductMaterial
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.product_materials_find_one_get_with_http_info(**kwargs)
else:
(data) = self.product_materials_find_one_get_with_http_info(**kwargs)
return data
def product_materials_find_one_get_with_http_info(self, **kwargs):
"""
Find first instance of the model matched by filter from the data source.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.product_materials_find_one_get_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str filter: Filter defining fields, where, include, order, offset, and limit - must be a JSON-encoded string ({\"something\":\"value\"})
:return: ProductMaterial
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['filter']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method product_materials_find_one_get" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
resource_path = '/ProductMaterials/findOne'.replace('{format}', 'json')
path_params = {}
query_params = {}
if 'filter' in params:
query_params['filter'] = params['filter']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/xml', 'text/xml', 'application/javascript', 'text/javascript'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json', 'application/x-www-form-urlencoded', 'application/xml', 'text/xml'])
# Authentication setting
auth_settings = ['access_token']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ProductMaterial',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)
def product_materials_get(self, **kwargs):
"""
Find all instances of the model matched by filter from the data source.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.product_materials_get(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str filter: Filter defining fields, where, include, order, offset, and limit - must be a JSON-encoded string ({\"something\":\"value\"})
:return: list[ProductMaterial]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.product_materials_get_with_http_info(**kwargs)
else:
(data) = self.product_materials_get_with_http_info(**kwargs)
return data
def product_materials_get_with_http_info(self, **kwargs):
"""
Find all instances of the model matched by filter from the data source.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.product_materials_get_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str filter: Filter defining fields, where, include, order, offset, and limit - must be a JSON-encoded string ({\"something\":\"value\"})
:return: list[ProductMaterial]
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['filter']
all_params.append('callback')
all_params.append('_return_http_data_only')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method product_materials_get" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
resource_path = '/ProductMaterials'.replace('{format}', 'json')
path_params = {}
query_params = {}
if 'filter' in params:
query_params['filter'] = params['filter']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json', 'application/xml', 'text/xml', 'application/javascript', 'text/javascript'])
if not header_params['Accept']:
del header_params['Accept']
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json', 'application/x-www-form-urlencoded', 'application/xml', 'text/xml'])
# Authentication setting
auth_settings = ['access_token']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[ProductMaterial]',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
collection_formats=collection_formats)
    def product_materials_id_delete(self, id, **kwargs):
        """
        Delete a model instance by {id} from the data source.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.product_materials_id_delete(id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str id: Model id (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.product_materials_id_delete_with_http_info(id, **kwargs)
        else:
            (data) = self.product_materials_id_delete_with_http_info(id, **kwargs)
            return data

    def product_materials_id_delete_with_http_info(self, id, **kwargs):
        """
        Delete a model instance by {id} from the data source.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.product_materials_id_delete_with_http_info(id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str id: Model id (required)
        :return: object
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id']
        all_params.append('callback')
        all_params.append('_return_http_data_only')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method product_materials_id_delete" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params) or (params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `product_materials_id_delete`")

        collection_formats = {}

        resource_path = '/ProductMaterials/{id}'.replace('{format}', 'json')
        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']

        query_params = {}

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json', 'application/xml', 'text/xml', 'application/javascript', 'text/javascript'])
        if not header_params['Accept']:
            del header_params['Accept']

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['application/json', 'application/x-www-form-urlencoded', 'application/xml', 'text/xml'])

        # Authentication setting
        auth_settings = ['access_token']

        return self.api_client.call_api(resource_path, 'DELETE',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='object',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        collection_formats=collection_formats)

    def product_materials_id_exists_get(self, id, **kwargs):
        """
        Check whether a model instance exists in the data source.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.product_materials_id_exists_get(id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str id: Model id (required)
        :return: InlineResponse2002
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.product_materials_id_exists_get_with_http_info(id, **kwargs)
        else:
            (data) = self.product_materials_id_exists_get_with_http_info(id, **kwargs)
            return data

    def product_materials_id_exists_get_with_http_info(self, id, **kwargs):
        """
        Check whether a model instance exists in the data source.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.product_materials_id_exists_get_with_http_info(id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str id: Model id (required)
        :return: InlineResponse2002
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id']
        all_params.append('callback')
        all_params.append('_return_http_data_only')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method product_materials_id_exists_get" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params) or (params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `product_materials_id_exists_get`")

        collection_formats = {}

        resource_path = '/ProductMaterials/{id}/exists'.replace('{format}', 'json')
        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']

        query_params = {}

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json', 'application/xml', 'text/xml', 'application/javascript', 'text/javascript'])
        if not header_params['Accept']:
            del header_params['Accept']

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['application/json', 'application/x-www-form-urlencoded', 'application/xml', 'text/xml'])

        # Authentication setting
        auth_settings = ['access_token']

        return self.api_client.call_api(resource_path, 'GET',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='InlineResponse2002',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        collection_formats=collection_formats)

    def product_materials_id_get(self, id, **kwargs):
        """
        Find a model instance by {id} from the data source.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.product_materials_id_get(id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str id: Model id (required)
        :param str filter: Filter defining fields and include - must be a JSON-encoded string ({\"something\":\"value\"})
        :return: ProductMaterial
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.product_materials_id_get_with_http_info(id, **kwargs)
        else:
            (data) = self.product_materials_id_get_with_http_info(id, **kwargs)
            return data

    def product_materials_id_get_with_http_info(self, id, **kwargs):
        """
        Find a model instance by {id} from the data source.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.product_materials_id_get_with_http_info(id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str id: Model id (required)
        :param str filter: Filter defining fields and include - must be a JSON-encoded string ({\"something\":\"value\"})
        :return: ProductMaterial
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'filter']
        all_params.append('callback')
        all_params.append('_return_http_data_only')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method product_materials_id_get" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params) or (params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `product_materials_id_get`")

        collection_formats = {}

        resource_path = '/ProductMaterials/{id}'.replace('{format}', 'json')
        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']

        query_params = {}
        if 'filter' in params:
            query_params['filter'] = params['filter']

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json', 'application/xml', 'text/xml', 'application/javascript', 'text/javascript'])
        if not header_params['Accept']:
            del header_params['Accept']

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['application/json', 'application/x-www-form-urlencoded', 'application/xml', 'text/xml'])

        # Authentication setting
        auth_settings = ['access_token']

        return self.api_client.call_api(resource_path, 'GET',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='ProductMaterial',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        collection_formats=collection_formats)

    def product_materials_id_head(self, id, **kwargs):
        """
        Check whether a model instance exists in the data source.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.product_materials_id_head(id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str id: Model id (required)
        :return: InlineResponse2002
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.product_materials_id_head_with_http_info(id, **kwargs)
        else:
            (data) = self.product_materials_id_head_with_http_info(id, **kwargs)
            return data

    def product_materials_id_head_with_http_info(self, id, **kwargs):
        """
        Check whether a model instance exists in the data source.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.product_materials_id_head_with_http_info(id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str id: Model id (required)
        :return: InlineResponse2002
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id']
        all_params.append('callback')
        all_params.append('_return_http_data_only')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method product_materials_id_head" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params) or (params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `product_materials_id_head`")

        collection_formats = {}

        resource_path = '/ProductMaterials/{id}'.replace('{format}', 'json')
        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']

        query_params = {}

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json', 'application/xml', 'text/xml', 'application/javascript', 'text/javascript'])
        if not header_params['Accept']:
            del header_params['Accept']

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['application/json', 'application/x-www-form-urlencoded', 'application/xml', 'text/xml'])

        # Authentication setting
        auth_settings = ['access_token']

        return self.api_client.call_api(resource_path, 'HEAD',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='InlineResponse2002',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        collection_formats=collection_formats)

    def product_materials_id_patch(self, id, **kwargs):
        """
        Patch attributes for a model instance and persist it into the data source.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.product_materials_id_patch(id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str id: ProductMaterial id (required)
        :param ProductMaterial data: An object of model property name/value pairs
        :return: ProductMaterial
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.product_materials_id_patch_with_http_info(id, **kwargs)
        else:
            (data) = self.product_materials_id_patch_with_http_info(id, **kwargs)
            return data

    def product_materials_id_patch_with_http_info(self, id, **kwargs):
        """
        Patch attributes for a model instance and persist it into the data source.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.product_materials_id_patch_with_http_info(id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str id: ProductMaterial id (required)
        :param ProductMaterial data: An object of model property name/value pairs
        :return: ProductMaterial
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'data']
        all_params.append('callback')
        all_params.append('_return_http_data_only')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method product_materials_id_patch" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params) or (params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `product_materials_id_patch`")

        collection_formats = {}

        resource_path = '/ProductMaterials/{id}'.replace('{format}', 'json')
        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']

        query_params = {}

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'data' in params:
            body_params = params['data']

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json', 'application/xml', 'text/xml', 'application/javascript', 'text/javascript'])
        if not header_params['Accept']:
            del header_params['Accept']

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['application/json', 'application/x-www-form-urlencoded', 'application/xml', 'text/xml'])

        # Authentication setting
        auth_settings = ['access_token']

        return self.api_client.call_api(resource_path, 'PATCH',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='ProductMaterial',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        collection_formats=collection_formats)

    def product_materials_id_put(self, id, **kwargs):
        """
        Replace attributes for a model instance and persist it into the data source.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.product_materials_id_put(id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str id: Model id (required)
        :param ProductMaterial data: Model instance data
        :return: ProductMaterial
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.product_materials_id_put_with_http_info(id, **kwargs)
        else:
            (data) = self.product_materials_id_put_with_http_info(id, **kwargs)
            return data

    def product_materials_id_put_with_http_info(self, id, **kwargs):
        """
        Replace attributes for a model instance and persist it into the data source.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.product_materials_id_put_with_http_info(id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str id: Model id (required)
        :param ProductMaterial data: Model instance data
        :return: ProductMaterial
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'data']
        all_params.append('callback')
        all_params.append('_return_http_data_only')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method product_materials_id_put" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params) or (params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `product_materials_id_put`")

        collection_formats = {}

        resource_path = '/ProductMaterials/{id}'.replace('{format}', 'json')
        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']

        query_params = {}

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'data' in params:
            body_params = params['data']

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json', 'application/xml', 'text/xml', 'application/javascript', 'text/javascript'])
        if not header_params['Accept']:
            del header_params['Accept']

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['application/json', 'application/x-www-form-urlencoded', 'application/xml', 'text/xml'])

        # Authentication setting
        auth_settings = ['access_token']

        return self.api_client.call_api(resource_path, 'PUT',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='ProductMaterial',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        collection_formats=collection_formats)

    def product_materials_id_replace_post(self, id, **kwargs):
        """
        Replace attributes for a model instance and persist it into the data source.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.product_materials_id_replace_post(id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str id: Model id (required)
        :param ProductMaterial data: Model instance data
        :return: ProductMaterial
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.product_materials_id_replace_post_with_http_info(id, **kwargs)
        else:
            (data) = self.product_materials_id_replace_post_with_http_info(id, **kwargs)
            return data

    def product_materials_id_replace_post_with_http_info(self, id, **kwargs):
        """
        Replace attributes for a model instance and persist it into the data source.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.product_materials_id_replace_post_with_http_info(id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str id: Model id (required)
        :param ProductMaterial data: Model instance data
        :return: ProductMaterial
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'data']
        all_params.append('callback')
        all_params.append('_return_http_data_only')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method product_materials_id_replace_post" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params) or (params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `product_materials_id_replace_post`")

        collection_formats = {}

        resource_path = '/ProductMaterials/{id}/replace'.replace('{format}', 'json')
        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']

        query_params = {}

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'data' in params:
            body_params = params['data']

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json', 'application/xml', 'text/xml', 'application/javascript', 'text/javascript'])
        if not header_params['Accept']:
            del header_params['Accept']

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['application/json', 'application/x-www-form-urlencoded', 'application/xml', 'text/xml'])

        # Authentication setting
        auth_settings = ['access_token']

        return self.api_client.call_api(resource_path, 'POST',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='ProductMaterial',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        collection_formats=collection_formats)

    def product_materials_id_team_get(self, id, **kwargs):
        """
        Fetches belongsTo relation team.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.product_materials_id_team_get(id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str id: ProductMaterial id (required)
        :param bool refresh:
        :return: Team
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.product_materials_id_team_get_with_http_info(id, **kwargs)
        else:
            (data) = self.product_materials_id_team_get_with_http_info(id, **kwargs)
            return data

    def product_materials_id_team_get_with_http_info(self, id, **kwargs):
        """
        Fetches belongsTo relation team.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.product_materials_id_team_get_with_http_info(id, callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param str id: ProductMaterial id (required)
        :param bool refresh:
        :return: Team
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['id', 'refresh']
        all_params.append('callback')
        all_params.append('_return_http_data_only')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method product_materials_id_team_get" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'id' is set
        if ('id' not in params) or (params['id'] is None):
            raise ValueError("Missing the required parameter `id` when calling `product_materials_id_team_get`")

        collection_formats = {}

        resource_path = '/ProductMaterials/{id}/team'.replace('{format}', 'json')
        path_params = {}
        if 'id' in params:
            path_params['id'] = params['id']

        query_params = {}
        if 'refresh' in params:
            query_params['refresh'] = params['refresh']

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json', 'application/xml', 'text/xml', 'application/javascript', 'text/javascript'])
        if not header_params['Accept']:
            del header_params['Accept']

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['application/json', 'application/x-www-form-urlencoded', 'application/xml', 'text/xml'])

        # Authentication setting
        auth_settings = ['access_token']

        return self.api_client.call_api(resource_path, 'GET',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='Team',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        collection_formats=collection_formats)

    def product_materials_post(self, **kwargs):
        """
        Create a new instance of the model and persist it into the data source.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.product_materials_post(callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param ProductMaterial data: Model instance data
        :return: ProductMaterial
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('callback'):
            return self.product_materials_post_with_http_info(**kwargs)
        else:
            (data) = self.product_materials_post_with_http_info(**kwargs)
            return data

    def product_materials_post_with_http_info(self, **kwargs):
        """
        Create a new instance of the model and persist it into the data source.
        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please define a `callback` function
        to be invoked when receiving the response.
        >>> def callback_function(response):
        >>>     pprint(response)
        >>>
        >>> thread = api.product_materials_post_with_http_info(callback=callback_function)

        :param callback function: The callback function
            for asynchronous request. (optional)
        :param ProductMaterial data: Model instance data
        :return: ProductMaterial
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['data']
        all_params.append('callback')
        all_params.append('_return_http_data_only')

        params = locals()
        for key, val in iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method product_materials_post" % key
                )
            params[key] = val
        del params['kwargs']

        collection_formats = {}

        resource_path = '/ProductMaterials'.replace('{format}', 'json')
        path_params = {}

        query_params = {}

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'data' in params:
            body_params = params['data']

        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.\
            select_header_accept(['application/json', 'application/xml', 'text/xml', 'application/javascript', 'text/javascript'])
        if not header_params['Accept']:
            del header_params['Accept']

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.\
            select_header_content_type(['application/json', 'application/x-www-form-urlencoded', 'application/xml', 'text/xml'])

        # Authentication setting
        auth_settings = ['access_token']

        return self.api_client.call_api(resource_path, 'POST',
                                        path_params,
                                        query_params,
                                        header_params,
                                        body=body_params,
                                        post_params=form_params,
                                        files=local_var_files,
                                        response_type='ProductMaterial',
                                        auth_settings=auth_settings,
                                        callback=params.get('callback'),
                                        _return_http_data_only=params.get('_return_http_data_only'),
                                        collection_formats=collection_formats)
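All of the generated methods above share one calling convention: without a `callback` keyword they block and return the response data; with `callback=`, the request runs on a worker thread, the callback receives the response, and the thread object is returned. A minimal sketch of that pattern, assuming a stand-in class (`DummyApi` and `get_thing` are illustrative, not part of the generated client):

```python
import threading

class DummyApi:
    # Stand-in mimicking the generated client's sync/async convention.
    def get_thing(self, id, callback=None):
        if callback:
            # Async path: invoke the callback on a worker thread,
            # return the thread so the caller can join() it.
            t = threading.Thread(target=lambda: callback({"id": id}))
            t.start()
            return t
        # Sync path: return the response data directly.
        return {"id": id}

api = DummyApi()
sync_result = api.get_thing("1")               # blocks, returns the data

results = []
thread = api.get_thing("2", callback=results.append)
thread.join()                                  # wait for the async call
```

The same shape applies to every `product_materials_*` method: join (or otherwise synchronize with) the returned thread before reading anything the callback produced.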
# 2021_2022/Training_4/RSA?/solution.py — from 0awawa0/DonNU_CTF (MIT license)
e = 272414194933637281923243626018746257179
n = 13670506216901788900238185227045760852421219896313098788199072116881425524359200284687555066101912589723168953443730772810703817875772731187254144516380993437534912174358374447142855693260343833577685119439469506982589836901610725831430885692785115275423964861646442271089904949981421849065181654030536671988981847864850153468049248513773673117189574205217862924602631082738370424268795872461059536588514805198415021502289845385850972211188875723505385162714997820068582476377023638386408259619096438614752647095380624595065108527159008861606957473197057318079082757369746459130277509479082078971082083309372859055189
c = 14960792561056535796247944855688100594661509413289901669705081061545389519668380417123374499218241309806730416180265312877338407882718213679

# The "ciphertext" is linear in the flag (c = flag * e mod n), so one
# modular inverse of e undoes it — no RSA private key needed.
flag = (c * pow(e, -1, n)) % n
print(bytes.fromhex(hex(flag)[2:]))
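The identity the solver relies on can be checked with toy parameters (chosen here for illustration, not the challenge values): if `c = m * e (mod n)` and `e` is invertible mod `n`, then `m = c * pow(e, -1, n) % n`.

```python
# Toy self-check of the decryption identity (n prime, so e is invertible).
n = 101
e = 7
m = 42                                # the "flag" as an integer
c = (m * e) % n                       # "encryption" is just multiplication mod n
recovered = (c * pow(e, -1, n)) % n   # pow(e, -1, n): modular inverse, Python 3.8+
assert recovered == m
```

`pow(base, -1, mod)` raises `ValueError` when the inverse does not exist, so the approach silently assumes `gcd(e, n) == 1`.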